Understanding the user:
User testing

Conducted user testing sessions to inform the redesign of the events workflow in our corporate actions software.

Researcher and designer - 1.5-month duration

Tools used:

Dovetail

Figma

Zoom

Teams

The why.

Our qualitative investigation of the product surfaced a considerable amount of criticism, highlighting the need for further research to fully understand and eliminate the root of these problems.

We committed to proactively engaging our customers for user testing. First, however, we needed to review the previously collected data.

I gathered results, responses, notes, and client validations using a structured, continuously refined tagging system in Dovetail.


Objectives

Identify issues

Pinpoint specific usability challenges and pain points experienced by users within the redesigned event workflow.

By understanding where users encounter difficulties, we can make targeted improvements.

Validate workflow changes

Confirm that the changes made during the redesign effectively address the criticisms and issues identified in the initial qualitative research.


This validation will help us determine if the redesign has improved user satisfaction and usability.

Gather insights & enhance

Collect feedback from users on the redesigned workflow to inform further design decisions.


Engaging with users will help us create a more user-centric and effective workflow.


The how.

Presentation of insights

After producing insight reports based on the feedback we received, I presented these to our designers to help direct their design decisions.

Handoff to designers

After presenting, we came together and pinpointed the areas we needed to tackle, and I left the rest up to the designers.


Ideate and concept

The designers worked their magic, developing low- and high-fidelity sketches and a prototype, and we were ready to start testing it with users.

Reach-out email

The aim was to make our clients aware of how important user testing was to us: the feedback we received from these calls would help shape the future user experience of our product.



We made them aware that this was not a requirements-gathering session, but rather an opportunity for our users to speak freely about anything they would like to share regarding the prototype.

User test tasks

In collaboration with our design team, we shaped a series of steps for our clients to participate in while on the call.

These steps needed to be clear and unambiguous, yet should not overly guide the client through the process.

To ensure these steps were as transparent as possible, we asked various departments to review our list within the prototype and suggest potential improvements.

Schedule calls

We scheduled the calls with each client, making sure the time and date suited them.

This step was easier than anticipated: we received a hugely positive response to the prospect of testing a new prototype version of the screens our users work with daily.

Discuss and rehearse

The lead designer on this project and I came together to discuss our call script and rehearse how we hoped the calls would go.

We decided a moderated approach was best for these user tests, as we were targeting specific areas in our task list.

The beauty of user testing is that we also came across some nuggets of information we had not anticipated.

Perform calls

Each user took part in an individual user testing session, for a total of six distinct calls.

Our approach remained unbiased throughout, as we resisted the constant temptation to assist participants when they faced obstacles.

This strategy unearthed a number of significant problems that we could address later on.

Research

After the calls, I generated a report outlining where we could improve going forward.

Each entry in the report captured the following details (a minimal sketch of this structure follows the list):

- Number of users experiencing issues.
- Nature of issues.
- Action needed.
- Overall score.
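
To make this structure concrete, here is a minimal sketch of how a report entry could be modelled. The interface and its field names are hypothetical illustrations, not the actual template we used.

```typescript
// Hypothetical model of a single report entry; the interface and its
// field names are illustrative, not the actual template from the study.
type Severity = "low" | "medium" | "high";

interface ReportEntry {
  area: string;           // workflow area under test
  usersAffected: number;  // how many of the six participants hit the issue
  issueNature: string;    // short description of the problem observed
  actionNeeded: string;   // recommended next step for the design team
  overallScore: Severity; // summary rating used to prioritise fixes
}

// Illustrative example entry (values are made up for demonstration).
const sidebarEntry: ReportEntry = {
  area: "Left sidebar",
  usersAffected: 4,
  issueNature: "Sidebar not recognised as clickable",
  actionNeeded: "Increase visual affordance (border, hover state)",
  overallScore: "high",
};
```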

After presenting our findings, we began building an appendix of early ideas.

Findings

We targeted 14 specific areas in total for our user testing sessions. For each area, we developed a ranking based on the number of times an issue occurred across the tests.
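
As a rough illustration of how such a tally could work (the helper names and data are hypothetical, not our actual ranking system), consider this sketch:

```typescript
// Hypothetical tally of how often each targeted area surfaced an issue
// across the calls; helper names and counts are illustrative only.
const occurrences = new Map<string, number>();

// Record one observed issue against an area during a session.
function recordIssue(area: string): void {
  occurrences.set(area, (occurrences.get(area) ?? 0) + 1);
}

// Rank areas from most to least frequently problematic.
function rankAreas(counts: Map<string, number>): [string, number][] {
  return Array.from(counts.entries()).sort((a, b) => b[1] - a[1]);
}

recordIssue("Tooltips");
recordIssue("Tooltips");
recordIssue("Operational icons");
console.log(rankAreas(occurrences)); // [["Tooltips", 2], ["Operational icons", 1]]
```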


Outcome

Improved tooltip availability and clarity

Created an overlay menu on hover and a click-to-copy function for data.
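
For context, a click-to-copy interaction like this can be built on the browser's standard Clipboard API. The sketch below is a generic illustration, not the product's actual implementation.

```typescript
// Generic click-to-copy handler using the standard Clipboard API;
// a simplified illustration, not the product's actual implementation.
async function copyCellValue(cell: HTMLElement): Promise<void> {
  const value = cell.textContent ?? "";
  await navigator.clipboard.writeText(value); // requires a secure context
  // Brief visual confirmation so users know the copy succeeded.
  cell.classList.add("copied");
  setTimeout(() => cell.classList.remove("copied"), 1200);
}
```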

Created distinction between two areas

Created a complete distinction between two operational areas that users had repeatedly found issue with.

Improved visibility of operational icons

Added notifications to the operational panel icon.

Sidebar adjustment

Made the left sidebar more visible and easier for the user to recognise as clickable. Implemented a border around the header to make it more distinguishable.

Altered hierarchy

Altered the hierarchy of views to align more closely with the user's mental model.

Grid overhaul

Overhauled the grid area, creating a simpler container view in terms of colour.


Reflections / Learnings

Open to change

Contrary to the belief that "change is bad", we found that all users were hugely open to the idea of changing things for the better. The response we received to our reach-out emails was one of excitement.

Client communication

As this was new not only to me but to the company as a whole, it created great communication between us and the clients, all of whom stated they would be open to further discussions whenever possible.

Battling bias / challenges

Throughout the user testing sessions there was a battle to avoid becoming biased or influencing the user.

A stakeholder involved in these tests insisted on joining one of the calls, which ultimately voided some of our feedback.

Importance of early engagement

Prioritising user involvement at an early stage of development ensures a user-centric final solution.