Transforming raw data into actionable knowledge:
Insight building

Analysing data collected from sources such as user interviews, NPS surveys and user testing to extract meaningful insights.
The goal of insight building? To translate these insights into informed decisions and strategic actions that drive positive outcomes for our products.

Sole researcher - Ongoing project

Tools used:

Dovetail

Notion

Zoom

Forms

Excel

The why.

By emphasising the importance of insight building, our team and company can create products that not only meet user needs but also drive strategic business outcomes, ensuring sustained growth and success.
Below are three of the areas we singled out as reasons to analyse feedback and build a repository of insights for continuous improvement.


Objectives

Systematic and methodical tagging

Utilising Dovetail, I created a database of "tags" tied to the feedback we acquired.

This process entailed group brainstorming sessions to decide tag names and how they would be used and allocated, along with constant collaboration on the appropriate tags for different pieces of feedback.

Allocation and grouping tags

Once these tags had been acquired and refined, we analysed them and decided on specific groups for them (a rough sketch of the resulting taxonomy follows the list). These included:


Client segmentation: tags like age group, location, and type of company.
Quality: the four groups on which we based the majority of reports: usability, features, performance and suggestions. Alongside these we included product-specific tags.
User and client metrics: skill level, number of users, team sizes and other general numerical and statistical tags.
User testing and journeys: purely related to user tests conducted, how the user felt and personal feedback received from an individual.
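
To make the grouping concrete, here is a minimal sketch of the taxonomy as a plain data structure; the group names mirror the list above, while the individual tag names and the structure itself are illustrative assumptions rather than how Dovetail actually stores tags.

```python
# Illustrative sketch only: group names come from the list above,
# individual tag names are hypothetical examples.
TAG_TAXONOMY = {
    "Client segmentation": ["age_group", "location", "company_type"],
    "Quality": ["usability", "features", "performance", "suggestions"],
    "User and client metrics": ["skill_level", "number_of_users", "team_size"],
    "User testing and journeys": ["user_test", "user_sentiment", "personal_feedback"],
}

# Reverse lookup: which group a given tag belongs to.
TAG_TO_GROUP = {tag: group for group, tags in TAG_TAXONOMY.items() for tag in tags}
```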


The how.

Analysing the research target

Prior to producing any analytical report, we, as researchers and designers, needed to confirm that we were thoroughly acquainted with what our product provided to our users.

This encompassed making use of our education platform, evaluating demos, and reviewing internal help documentation.


Discussions with Product

A constructive dialogue and continual exchange of ideas with our product department created an environment conducive to open inquiry.

Product managers maintained direct correspondence with our clientele, offering us crucial insight into what customer feedback might indicate in cases of uncertainty.

FigJam collaboration

As researchers, we collaborated on the information we had gathered in the previous steps.

Utilising FigJam, we brought our knowledge together and created a structure for how each report should drive home the overall summary of our feedback.

Product-specific tagging

To further enhance our repository of research, we also included a product-specific tagging system.

This involved separating out features associated with only one of our products, enabling us to target precise areas of our feedback.
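
As a hedged illustration of the idea (the records, tag names and helper below are hypothetical examples, not Dovetail's API), a product-specific tag lets us narrow the repository down to one product before analysing its feedback:

```python
# Hypothetical feedback records mixing quality tags with product-specific tags.
feedback = [
    {"note": "Export is slow on large files", "tags": ["performance", "product_a"]},
    {"note": "Love the new dashboard", "tags": ["features", "product_b"]},
    {"note": "Onboarding was confusing", "tags": ["usability", "product_a"]},
]

def by_product(items, product_tag):
    """Return only the feedback items carrying the given product tag."""
    return [item for item in items if product_tag in item["tags"]]

print(by_product(feedback, "product_a"))  # a precise, per-product slice of feedback
```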

Results

After building our research repository of tags and feedback, the resulting reports were classified into three separate categories:


Outcome

13 reports generated

Over a three-month period, 13 reports were generated, falling under the headers above.

Over 50 insights gathered

Within the reports, over 50 insights were gathered, helping to improve design decisions.

Collaborative environment

Improved collaboration amongst departments

Research repository

A repository of feedback, reports, insights and tags was developed for all departments to view and refer back to for future decisions.

User testing

These reports and the feedback repository opened the door to some of the first user testing sessions conducted at the company.

User satisfaction

Being able to act on what our users were saying about our product helped us connect with them and make improvements that increased user satisfaction overall.


Reflections / Learnings

Drivers for change

The reports generated helped drive changes for our clients. The feedback we gathered built insights that targeted specific areas we could modify.

Improved collaboration

Results were presented to other departments and discussions were held to determine whether the proposed changes were feasible; where they were, the changes were implemented.

Importance of qualitative and quantitative analysis

Recognised the value of both qualitative data (e.g., user interviews) and quantitative data (e.g., NPS surveys) in forming a comprehensive understanding of user experiences.

Balancing these data types enabled me to develop well-rounded insights that drive informed decisions.
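
As a small example of the quantitative side, an NPS score boils a set of 0-10 survey responses down to one number (the percentage of promoters, scores 9-10, minus the percentage of detractors, scores 0-6), which can then be read alongside the qualitative interview tags; the scores below are made up.

```python
# Made-up survey responses on the standard 0-10 NPS scale.
scores = [10, 9, 8, 7, 9, 3, 10, 6, 9, 2]

promoters = sum(1 for s in scores if s >= 9)   # scores 9-10
detractors = sum(1 for s in scores if s <= 6)  # scores 0-6
nps = (promoters - detractors) / len(scores) * 100

print(f"NPS: {nps:.0f}")  # 20 for the sample above
```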

Prioritising actions based on impact

Learned to prioritise actions based on how significantly they would enhance user experience and product success.