Good user research requires two types of data: qualitative and quantitative.
Both quantitative and qualitative data provide specific insights into a product’s overall usability. However, without a proper understanding of what kind of information each type of data can give you, you won’t reap the benefits. Worse, you may misread the data and come to incorrect conclusions.
The most common mistake is to collect only one or the other. Typically, qualitative data is the one sidelined. Without numerical data and user insights to back each other up, you end up with an incomplete picture of the issues surrounding your product. So what exactly does this data look like, and what information can be gleaned from it?
What is Quantitative Data?
Quantitative data is numerical data used to gauge the user’s experience. It comes in the form of analytics, questionnaires, heatmaps, and other forms of testing that generate quantifiable answers. It is considered an indirect way to test usability.
These metrics help you find the “where” and “what” of usability issues.
What is Qualitative Data?
Qualitative data is direct data derived from user testing. This type of data gives you the “how” and “why” of usability issues. By observing a test group, you can directly ask users why they have issues with the product.
By combining both sets of data, you can get a complete picture of the issues getting in the way of users’ experience on your platform. For example, suppose heatmaps show 60 percent of users completing a task with a new feature. With user testing, you can then observe users attempting that same task.
While the majority of your test subjects complete the task, they unanimously report confusing directions that leave them unsure of their next steps. The two sets of data in tandem give you the complete picture of the issue, and a starting point for solving the problem.
Getting your data to work together
The bounce rate of a page is the rate at which users leave it without taking an action, such as navigating to a new page or making a purchase. Analytics trackers can flag a page that should be converting users but is not.
Analytics can show you where the problem pages are. Meanwhile, user tests and interviews can illuminate why they aren’t converting.
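To make the metric concrete, here is a minimal sketch of how bounce rate could be computed from per-session action counts. The function name and the sample data are hypothetical; in practice these numbers would come from your analytics tracker’s export.

```python
def bounce_rate(sessions):
    """Share of sessions with no meaningful action beyond landing.

    `sessions` is a list where each entry is the number of actions
    (clicks, navigations, purchases) recorded in one session.
    A session with zero actions counts as a bounce.
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for actions in sessions if actions == 0)
    return bounces / len(sessions)

# Hypothetical sample: 8 sessions, 5 of which took no action.
sessions = [0, 3, 0, 1, 0, 2, 0, 0]
print(bounce_rate(sessions))  # 5 bounces out of 8 sessions = 0.625
```

A high value here tells you *where* the problem is; the interviews and user tests described above are what tell you *why* visitors leave.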
Because of the nature of user testing, researchers can only test a limited pool of users at a time. With such a small sample, results can vary widely, with no clear correlations. For complex tasks, such as a full user flow with several possible paths through it, user testing may only give you a sense of where to start.
Detailed heatmaps and analytics tracking users through their journey, however, provide a snapshot of a much larger pool. Because quantitative data draws on a broader, more representative sample, it can capture details that qualitative data cannot efficiently surface.