UX Design Reviews: The Ultimate Guide to Success

UX design reviews offer a valuable opportunity to gain fresh insights and pinpoint high-impact UX improvements, all without breaking the bank.

This post will take you through the ins-and-outs of conducting a UX design review. But before we dive into the process, let’s explore why these reviews are so useful.

So, why conduct a UX design review?

The question really should be, “Why wouldn’t you?”. In order of importance, they’re really useful for:

  • Users
    Identify UX flaws, strengths and accessibility concerns, and capitalise on quick wins early on.

  • Time
    Knowing what does and doesn’t work in the early stages can save precious time during both the design and development phases… and sanity.

  • Cost
    A UX review is often a cheaper and quicker alternative to extensive user testing. That said, user testing remains invaluable if you can fit it into your budget too!

In essence, conducting a UX design review can uncover usability challenges and identify the strengths of your product or service. It can be performed during any stage of the product’s life cycle, as long as the prototype has sufficient detail for review.

Whatever stage you’re at, you’ll almost certainly find it a valuable investment.

Who should conduct a UX design review?

Opting for external evaluators is often the wiser choice, as they haven’t been personally involved in the project’s strategic or design decisions up to that point. They bring an impartial perspective, untainted by internal politics or prior design influences.

Aurora Harley says it best in her UX expert review article:

A fresh perspective is more likely to be unbiased and deliver candid feedback: an outsider is not emotionally invested in the design, is oblivious to any internal team politics, and can easily spot glaring issues that may stay hidden to someone who’s been staring at the same design for too long

Ideally, your reviewer would possess:

  • Deep knowledge of UX best practices
    The ability to assess designs against a set of usability guidelines, usability-related psychology and human-computer interaction principles.

  • User research experience
    Past experience conducting user research is useful, as they’ll know which patterns of user behaviour to look out for.

  • Domain knowledge
    This is a bonus rather than a requirement, but having experience in the product sector can be useful.

I should stress: if you’re looking to build up experience, get out there and start conducting your own UX design reviews. Don’t be put off by a checklist in a Medium article; start practising!

How many evaluators should conduct a UX review?

The NN Group wrote a fantastic piece on heuristic evaluations and found that involving multiple evaluators improves overall effectiveness in catching usability problems.

Below is a visualisation of a study in which 19 evaluators were tasked with finding 16 usability problems. While the more successful evaluators identified more problems overall, some of the less successful evaluators still caught harder problems that others missed.

Read the full study by the NN Group

So what does this show? In general, conducting a UX review is a challenging task for a single individual.

While it’s better than not conducting one at all, one person is unlikely to catch everything. This isn’t a reflection of their abilities; different evaluators excel in spotting different issues on different products — we’re all human, after all!

Based on the above study, the NN Group recommends using 3–5 evaluators; beyond that, you don’t get much extra bang for your buck.

What benchmarks should I use in a UX review?

While design is subjective, we use a set of standardised criteria to help identify potential UX pitfalls. Below are some of the criteria I’ve used previously, based on the NN Group’s usability heuristics:

  • System feedback
    Is it clear to users what action they’ve taken and what they’ve achieved?

  • Match system & real-world
    Does the product use language and conventions familiar to users?

  • Control & freedom
    Can users recover easily from errors or wrong turns?

  • Consistency & standards
    Avoid guesswork. Users should not have to wonder whether different words, situations or actions mean the same thing.

  • Error handling & prevention
    Effective error messages and proactive design reduce mistakes.

  • Recognition rather than recall
    Minimise cognitive load by making objects, actions and information easily accessible.

  • Flexibility and efficiency of use
    Provide a smooth experience for both new and existing users.

  • Aesthetics
    Surface only relevant information. Key actions and dialogues should not compete with those of less importance.

  • Accessibility & inclusion
    Humans are diverse. Accommodating different abilities and perspectives gives everyone a sense of belonging.

Prioritise, prioritise, prioritise…

Findings should be prioritised and a set of recommendations outlined on how to proceed next. This guides teams on where to allocate time, budget, and energy, and highlights quick wins!

We use a standard traffic light system, which Simon Jones’ article about delivering impact covers well:

  • Green: Positive
    Reflects strong characteristics that appear to be working well and should be maintained in future phases.

  • Amber: Caution
    Reflects aspects that have a detrimental effect on the experience and should be addressed as soon as is practical.

  • Red: Negative
    Reflects aspects that will seriously harm the experience and should be addressed at the earliest opportunity.

Starting a UX design review

Before you begin, it’s essential to understand who will use the product and define specific tasks.

Talk to the project stakeholders to determine:

  1. Core users

  2. Common tasks

  3. What information/guidance users need for each task

  4. What they should already know (syntax, etc.)

Assuming we have the above, we now know the types of people using the product and their most common tasks. Imagine tasks as stories, divided into a beginning, middle, and end.

Joe’s story

Imagine Joe from HR needs to add an employee to a new CRM system that’s just launched.

Beginning: What does Joe need to know in order to start the task?

Ok, a new employee has come on board. I need to add Jill to the new company CRM system.

Middle: How easy is it to get from the beginning to the end?

Huh, I can’t find where the Team is. There’s a ‘Contacts’ menu item so I’ll click on that first.

Ah, there they are!

I’ll click on the ‘Add team member’ button to add Jill to the system…

Joe continues on his journey until the end…

End: What does completion look like?

Great! A success message has appeared indicating Jill has been added to the system.

It’s a crude example, but based on our earlier benchmark criteria we’d note the following:

  • Red: Terminology (Match system & real-world)
    HR users would use the terminology ‘Team’ rather than ‘Contacts’ — this should be reflected in the navigation menu and the success messaging.

  • Amber: Naming consistency (Consistency & standards)
    Terminology inconsistency between the menu item ‘Contacts’ and the main Call-To-Action button ‘Add team member’. Whatever you go with, keep terminology consistent!

  • Green: Success message (System feedback)
    It’s good that users get feedback when completing an action; this should be maintained throughout the app.
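If you record findings like these as structured data, the traffic-light ratings also give you a natural sort order, so the most severe issues surface at the top of your report. Here’s a minimal Python sketch under that assumption (the field names and findings are illustrative, not a prescribed format):

```python
# Each finding pairs a benchmark criterion with a traffic-light rating.
# Lower numbers sort first, so red (most severe) leads the report.
RATING_ORDER = {"red": 0, "amber": 1, "green": 2}

findings = [
    {"criterion": "System feedback", "rating": "green",
     "note": "Success message confirms the action"},
    {"criterion": "Match system & real-world", "rating": "red",
     "note": "'Contacts' should be 'Team' for HR users"},
    {"criterion": "Consistency & standards", "rating": "amber",
     "note": "'Contacts' menu item vs 'Add team member' button"},
]

# Sort so the most severe issues (red) appear first.
prioritised = sorted(findings, key=lambda f: RATING_ORDER[f["rating"]])

for f in prioritised:
    print(f"[{f['rating'].upper()}] {f['criterion']}: {f['note']}")
```

Even a simple structure like this makes it easy to hand stakeholders a prioritised list rather than a wall of notes.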

Small adjustments like these can greatly enhance the user experience.

Creating tasks based on core user needs helps identify potential issues systematically rather than stumbling upon usability problems by chance. You’ll likely find general usability issues as you go, so make sure to note those down as they emerge too.

What next?

UX design reviews are a valuable method for identifying usability wins and uncovering potential issues.

As with all things UX, it’s an ever-evolving process. The benchmarks above are by no means an exhaustive list to review against, but they serve as a good set of starting guidelines, and they’ve certainly helped our team provide focused, prioritised reviews.

Coupled with user testing, you’ll have a rock-solid UX process and your product will be better for it.


©2024 Caboodle
