Understanding the user: User testing

Conducted user testing sessions to help set the direction for the redesign of the events workflow in our corporate actions software.

Background

Our qualitative investigation of the product surfaced considerable criticism, highlighting the need for further exploration to fully understand and address the root of these problems.

Our product was being redesigned, and we needed additional feedback before completing the work.

We committed to proactively engaging our customers for user testing. Before moving forward, however, reviewing the previously collected data was a priority.


Findings & Results

Our team gathered results, responses, notes, and client validations using a structured and continuously improved tagging system via Dovetail.

Low NPS score

While a relatively small sample size was gathered for the survey, the NPS score was low and reflected badly on the product. It was a stark indication the service needed improving.


For more information on what an NPS score is, check out our evergreen report on the topic here.
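As a quick illustration, the NPS calculation itself is simple: the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6). The sketch below uses made-up survey responses, not our actual data:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses: a small sample can easily skew negative.
print(nps([9, 10, 7, 6, 5, 3, 8, 4]))  # 2 promoters, 4 detractors of 8 -> -25
```

This also shows why a small sample size matters: a single detractor shifts the score heavily when only a handful of responses are collected.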

User interview feedback

After conducting numerous user calls with our clients, we listened back to the recordings and gathered the quotes we received. Some were positive, some negative, but all were incredibly valuable.

This feedback was then grouped under the following categories:

  • Performance and reliability feedback

  • Features feedback

  • Usability feedback

  • Feature requests
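In code terms, the grouping step amounts to bucketing tagged quotes by category. The quotes below are hypothetical examples, standing in for the kind of tagged data we held in Dovetail:

```python
from collections import defaultdict

# Hypothetical (quote, tag) pairs, mirroring our tagging categories.
quotes = [
    ("The grid takes ages to load", "Performance and reliability feedback"),
    ("I can never find the display settings", "Usability feedback"),
    ("The event timeline view is useful", "Features feedback"),
    ("Could we export events to Excel?", "Feature requests"),
]

# Bucket quotes by tag so each category can be reviewed as a group.
grouped = defaultdict(list)
for quote, tag in quotes:
    grouped[tag].append(quote)

for tag, items in grouped.items():
    print(f"{tag}: {len(items)} quote(s)")
```

Grouping this way makes it easy to see which categories attract the most feedback before moving on to insight building.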

Insight building

Once we had classified the feedback and dissected the underlying problems, we built insights that gave us a holistic, focused understanding of our users and their need for better tools to support their everyday responsibilities.

Next steps

Presentation of insights

After producing insight reports based on the feedback we received, I presented these to our designers to help direct their design decisions.

Handoff to designers

After presenting, we came together and pinpointed the areas we needed to tackle, and I left the rest up to the designers.


Ideate and concept

The designers worked their magic, developing low- and high-fidelity sketches and a prototype, and we were ready to start testing with users.

Reach-out email

The aim was to make our clients aware of the importance of user testing to us. The feedback we received from these calls was going to help shape the future user experience of our product.



We made them aware that this was not a requirements gathering session but rather an opportunity for our users to speak freely about anything they would like to share regarding the prototype.

User test tasks

In collaboration with our design team, we shaped a series of steps for our clients to participate in while on the call.

These steps needed to be clear and concise, yet should not overly guide the client through the process.

To ensure absolute transparency about these steps, we asked various departments to review our list within the prototype and provide their perspectives on potential improvements.

Schedule calls

Here we scheduled the calls with the client making sure it was at a time and date they were comfortable with.

This step was easier than anticipated with us receiving a hugely positive response to the prospect of testing a new prototype version of the screens our users use on a daily basis.

Discuss and rehearse

The lead designer on this project and I came together to discuss our call script and rehearse how we hoped the calls would go.

We decided a moderated user test was the best approach for these user tests as we were targeting specific areas in our task list.

The beauty of user testing is that we also came across some nuggets of information we did not anticipate beforehand.

Perform calls

Each user took part in their own testing session, giving us six distinct calls in total.

Our approach remained unbiased throughout, curbing the constant temptation to assist participants when they faced obstacles.

This strategy unearthed a number of significant problems that we could address later on.

Research

After the calls, I generated a report outlining where we could improve going forward.

The report captured the following details for each issue:

- Number of users experiencing issues.
- Nature of issues.
- Action needed.
- Overall score.
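The report structure above can be sketched as a simple record per finding. The fields match the list; the example findings and score scale (1-5) are hypothetical, chosen to illustrate sorting the worst-performing areas to the top:

```python
from dataclasses import dataclass

# One finding in the post-call report; fields mirror the report structure.
@dataclass
class Finding:
    area: str
    users_affected: int   # number of users experiencing the issue
    nature: str           # nature of the issue
    action: str           # action needed
    score: int            # overall score for the area (hypothetical 1-5 scale)

# Hypothetical findings for illustration only.
findings = [
    Finding("Tooltips", 4, "Tooltips missing or unclear",
            "Add contextual tooltips", 2),
    Finding("Sidebar", 3, "Sidebar not recognised as clickable",
            "Increase click affordance", 1),
]

# Worst-scoring areas first, so the team tackles the biggest problems early.
for f in sorted(findings, key=lambda f: f.score):
    print(f"{f.area}: {f.users_affected} users affected - {f.action}")
```

Keeping the findings structured this way made it straightforward to prioritise which of the targeted areas needed attention first.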

After presenting our findings we began building an appendix of early ideas.

Outcome

Overview of outcomes

Of the 14 specific areas we targeted in the user testing sessions:

- 4 were satisfactory and presented an easy user-value opportunity.
- 3 needed minor improvements.
- 6 performed poorly and needed improvement.
- 1 had some opportunity where we could do more for our users.

Changes

- Improved tooltip availability and clarity.
- Improved the clarity of the "display settings" function.
- Improved the visibility of operational icons.
- Created a complete distinction between two operational areas that users repeatedly found issue with.
- Altered the hierarchy of views to better match users' mental models.
- Made the left sidebar more visible and easier to recognise as clickable.
- Overhauled the grid area, creating a simpler container view colour-wise.