Web Analytics and User Experience Design
02 Apr 2009

Without user experience design to ground and inform it, trying to make sense of web analytics results in conjecture. On the other hand, user experience design without analytical testing to validate and fine-tune it can only be informed guesswork.

Analytics data is useful when it is used to measure the success of goals and to understand performance issues. The tools of the user experience design field, such as user-centered design and usability principles, help make sense of that data. And while data can measure a lack of success, it cannot provide solutions; it takes a user experience specialist with training and experience in optimizing such systems to offer them.

Just as analytics needs user experience design, the UX field also needs analytics. Otherwise, how can we tell that a redesign is effective? Perhaps the new design results in fewer sales, a shorter user lifetime, or a higher bounce rate. There is no way to know such things without quantitative testing. I cannot begin to tell you how many times I’ve witnessed seasoned experience designers being shocked by the unanticipated shortfalls or successes of their work once it was put to the test with analytics or other data-driven methods.
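A quantitative check does not have to be elaborate. Here is a minimal sketch in Python, with hypothetical numbers, of comparing conversion rates before and after a redesign using a simple two-proportion z-test; the function name and figures are my own illustration, not output from any particular analytics tool.

    # Minimal sketch (hypothetical numbers): did the redesign change the
    # conversion rate, or is the difference just noise?
    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(conv_a, visitors_a, conv_b, visitors_b):
        """Return the z statistic and two-sided p-value for the
        difference between two conversion rates."""
        p_a = conv_a / visitors_a
        p_b = conv_b / visitors_b
        pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
        se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical example: the old design converted 200 of 10,000
    # visitors; the redesign converted 180 of 10,000.
    z, p = two_proportion_z_test(200, 10000, 180, 10000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests a real change

A result like this tells you whether a drop in conversions is worth worrying about before anyone starts debating the visual design.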

ClickZ, a popular information website for digital marketers, has an interesting post about how analytics and user experience design can work together. In the post, the author interviews ClickZ’s own associate director of user experience, Aaron Louie. In the interview, Louie states:

[User experience design and analytics] are subservient to higher-level goals. In performance marketing, what drives both analytics and user experience are the business goals and user goals. We ask the fundamental questions: “Why does the site exist?” “What do you want users to do?” and so on. The answers to these questions determine what we design and how we measure the performance of that design….

During discovery, we review the baseline analytics to look for potential problem issues. We then collaborate with the analytics team to conduct the goals analysis, connecting high-level user and business goals to measurable user behaviors. During design, we collaborate with the optimization team to identify and generate design variants for A/B and multivariate testing. And then post-launch, we supplement analytics data with user surveys and usability testing, providing the “why” for the “what.” Then we repeat steps one through four.
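To make the A/B testing step concrete: one common way to serve design variants (a generic approach, not necessarily the tooling described in the interview) is to hash a stable visitor ID into a bucket, so each visitor sees the same variant on every visit and the measurements stay clean. A minimal Python sketch with illustrative names:

    # Illustrative sketch: deterministic variant assignment for an A/B test.
    # The experiment and variant names here are made up for the example.
    import hashlib

    def assign_variant(user_id, experiment, variants):
        """Map a user to one of the experiment's variants, the same way
        every time, by hashing the user and experiment identifiers."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    # Hypothetical usage: a two-variant test of a redesigned checkout page.
    print(assign_variant("visitor-42", "checkout-redesign", ["control", "new-design"]))

Because the assignment is deterministic, the analytics team can later segment conversions by variant without worrying that a visitor bounced between designs mid-test.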

I encourage you to read the full interview.
