How to get started with user testing at your company

4 min read
Lee Munroe • May 25, 2017

UX research is a core component of a successful product design process. It helps you build empathy with your users and understand their behavior and problems. And then it enables you to test out your solutions and confirm hypotheses.

One of the key methods for a successful UX research process is user testing.

“The goal of user testing is to identify any usability problems, collect qualitative and quantitative data, and determine the participant’s satisfaction with the product. The earlier issues are identified and fixed, the less expensive the fixes will be.” –Usability.gov

Related: User testing gone wild—a guide to course correction

We all agree that user testing is a valuable practice. Yet few of us actually do it. Why is that? Some of the reasons I hear include:

  1. Time. The time it takes to recruit, schedule, and run the tests.
  2. Money. Paying participants incentives or paying 3rd parties to run tests for you.
  3. Fear. Fear of being wrong or of introducing changes late in the dev cycle.

User testing doesn’t have to cost a lot of money, and you shouldn’t be afraid of hearing the truth about your designs. But there’s no doubt it can take up a lot of time. The thing that has deterred me in the past is the time and effort it takes to recruit and schedule.

“We all know user testing is valuable. So then why do so few of us do it?”

I’ve been an early designer at several startups, so I have a lot of experience championing regular user testing. When I joined Mesosphere, I worked with the team on a system that would help automate and encourage our design team to run user tests every 2 weeks.

For some context, Mesosphere is a B2B software company. We have an enterprise product called DC/OS (Data Center Operating System) that helps software companies build and run modern apps. We also have several open source products with several thousand users.

This is how we set up our user testing program.

Recruiting users

Instead of waiting until you’re ready to start conducting user tests, start collecting interested users now. This way you have a bucket of people to call when you’re ready.

The first thing we set up was a form to capture leads for UX research. If they opt in to our list, we have permission to email them about upcoming user testing sessions. We can also reach out to them for surveys, prototype feedback, interviews, and field visits.

Related: How to quickly create a powerful survey

We initially used Google Forms to set up the form, as it was quick and easy. Then we shifted to Pardot so we could integrate our list with Salesforce (which other teams are using). More recently we’ve been using Ethnio, a service made specifically to handle user research recruiting and scheduling.

The tool you use here isn’t important. The goal is to create a list of people who are interested in user testing. Then you have them to call upon when you have something to test.

There’s a balance in how much information you ask for in your opt-in form. We want to keep it short so users complete it, but we also want to learn enough about each person to qualify them. The questions below, and the rough sketch that follows them, show what we landed on.

Some of the questions we ask:

  • Phone number. This comes in handy when you need to confirm with the person that they still intend to show up for user testing.
  • Technologies used. This helps us filter our list down to the type of persona we’re looking for in certain tests.
  • Local to San Francisco. We use this to help us find good candidates who can attend user tests in person instead of remote.
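
If you end up rolling your own list instead of using a tool like Ethnio, the record behind the form might look something like this minimal sketch. The field names and the filtering helper are hypothetical; they aren’t what Google Forms, Pardot, or Ethnio actually store.

```python
# Hypothetical sketch of a participant record captured by the opt-in form.
from dataclasses import dataclass, field


@dataclass
class Participant:
    name: str
    email: str
    phone: str                    # handy for confirming they still plan to show up
    technologies_used: list[str] = field(default_factory=list)  # e.g. ["Docker", "Kubernetes"]
    local_to_sf: bool = False     # can attend in-person sessions
    opted_in: bool = True         # permission to email about upcoming research


def matches_persona(p: Participant, required_tech: set[str]) -> bool:
    """Filter the list down to the persona a given test calls for."""
    return p.opted_in and required_tech.issubset(p.technologies_used)
```

In practice the tool you choose handles this for you; the point is that a few qualifying fields make it easy to pull the right people for a given test.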

Once we had the form set up, we created the subdomain uxresearch.mesosphere.com so we could easily point people there. Then we set up some inbound channels to start collecting these people.

  • Twitter—periodic tweets
  • Facebook—periodic posts
  • Intercom—automatically emailing new users after they had used the product 5 times
  • GitHub—links from our readme
  • Support—referral links in the footer of Zendesk emails
  • Meetups—collecting emails at meetups that we run
  • Public Slack groups—promoting the form in our Slack channel
  • Manual referrals—having sales, product, marketing, support teams refer people directly to our form

This worked great and got us a list of a couple of hundred people in the first few weeks. As I mentioned, we already have a good user base, and I realize many readers may not have that yet. Some other channels I’ve found useful when you don’t have users yet:

  • Advertise on weekly newsletters
  • Craigslist
  • TaskRabbit
  • Related Slack groups
  • Facebook Groups
  • UserTesting.com

Plan your tests

We set aside every other Thursday for user testing, which forces us to test something. Each user testing day has a designated test lead: one of our designers, with the role rotating each time. It’s their job to facilitate and run the tests that week.

Typically, our tests fall into 3 categories:

  1. What are they doing today? Interview style research to help us understand their current workflow or problems.
  2. Would they use this? Hybrid of interview and design artifacts to get their reaction to something we’re working on.
  3. Can they use this? Usability test on a clickable prototype or coded solution. We make heavy use of InVision prototypes here to make sure we’re confident about our solutions before we develop anything.

The test lead works with the broader team to align on what we want to test and to prioritize the key things to focus on. We’ll typically meet the Monday before a test day to go through the list. It usually comes from design and product, but it can also include things from the engineering and docs teams.

Once we understand the key things we want to test, we’ll work together on a high-level script and tasks for how best to test them. The person who requested the testing then writes a more detailed script for the test lead.

When we’re testing clickable prototypes, we use Sketch and InVision, and we use Abstract to manage our design files.

We’ll typically branch off from the main project and create any additional screens or content needed to make sense in the context of the task we set. Then we’ll create a separate InVision prototype so we don’t confuse other team members with screens made specifically for the user tests.

Scheduling

Our goal for scheduling is to have 5 user tests in a day. We have 5 slots, each lasting one hour: 10am, 11:30am, 1pm, 2:30pm, and 4pm, with a 30-minute buffer between each to prepare for the next.
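
To make the timing concrete, here’s a tiny sketch that lays out those five slots: one-hour sessions starting at 10am with a 30-minute buffer between each. The script is just an illustration of the arithmetic, not part of our process.

```python
# Illustrative only: derive the five daily slot times from the session length and buffer.
from datetime import datetime, timedelta

SESSION = timedelta(hours=1)
BUFFER = timedelta(minutes=30)

slot = datetime(2017, 5, 25, 10, 0)   # first session at 10am on test day
for _ in range(5):
    print(slot.strftime("%I:%M %p"))  # 10:00 AM, 11:30 AM, 01:00 PM, 02:30 PM, 04:00 PM
    slot += SESSION + BUFFER
```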

We believe in Nielsen’s law of diminishing returns: five user tests should be enough to highlight the main issues we need to fix. Running more isn’t time well spent.

When we plan for 5, we also line up 1-2 backups. Usually we’ll have someone internal on standby so that when someone cancels (there’s always one), that person can step in. We still get good feedback—and our time doesn’t go to waste.

There’s a ton of back and forth involved in trying to find a time that suits everyone, so we use scheduling tools to help with this.

These tools let us create a page with time slots, and then our users choose a time that works for them. They also help us manage messaging and reminders for the appointments. This is a big help.

Using a research coordinator

Recruiting and scheduling take up a lot of time. After our designers had done this a few times, we asked for help from one of our executive assistants. Thankfully they were able to allocate some of their time to running this for us (thanks RJ!), and they even helped optimize the process.

“Make sure your user testing participants know that nothing they do will be wrong.”

I can’t recommend this enough, as it means less for our team to think about in terms of operations—and we get to spend more time working on the designs and script.

The research coordinator takes care of:

  • Finding new recruiting channels and working with other teams to fill the pipeline
  • Scheduling each of the available slots every 2 weeks
  • Confirming participants still plan to show up
  • Welcoming them on the day
  • Following up afterward with gift cards
  • Making sure our user testing lab (fancy word for a meeting room) is reserved

Running the tests

On the day we actually do the user test, we greet the user, have them sign a waiver, and bring them into the testing lab.

In the lab, we’ve already set up a MacBook that has tabs open for everything we plan to test.

“Invite devs to watch user testing sessions so they can see problems in person.”

We use ScreenFlow to record the sessions and hook it up to a big screen so we can watch. We have no more than 2 people in the room: the test lead and another designer, PM, or developer who takes notes. Personally, I love when developers join because they get to see problems in person—and then they’re sold on fixing them.

To make the person feel comfortable, we start off with the Google homepage on the screen. Some things we’ll say at this point:

  • We’re not testing you, we’re testing our designs—so nothing you do is wrong
  • Think out loud as much as possible so we can understand your thought process
  • What websites do you like to visit daily? What news did you read this morning?

Starting with the Google homepage ensures they don’t get immediately distracted by cool new features and start clicking around. Instead, they’ll listen carefully to what you have to say and to what the task is. Remember, most people haven’t done this before and don’t know what to expect. You’re playing host, so it’s important to make them feel comfortable and get to a point where the conversation flows.

Don’t ask leading questions during user testing. Some examples of non-leading questions:

  • What would you expect to happen if you did that?
  • What do you think this means?
  • Can you explain what just happened?
  • Tell me about the last time you had to perform a certain task.

Related: How to really understand your users’ motivations

The script we put together serves as a guideline. It’s important not to stick exactly to the script. If the user goes down another path, it might be worthwhile continuing down that path then circling back to where you left off later.

Finish by asking whether they have any questions, and what feature they’d pick if they could wave a magic wand and have it today. This helps us build a list of feature requests, and it shows the person that we’re listening to their feedback.

“At a user testing session, you’re the host. It’s your job to make people comfortable.”

After the test, express your thanks and send a gift card (we typically offer Amazon gift cards as incentives). This is also a good time to ask whether any of their teammates would be interested in coming next time, or to schedule a field visit at their office.

A note on remote tests

While we prefer in-person tests, remote tests are also valuable. They’re easier to organize, since the person doesn’t have to travel and you’re not limited by location.

Some things we’ll do for remote tests:

  • Send a waiver to sign via RightSignature
  • Conduct the test via Google Hangouts

It’s a good idea to remind the person beforehand that they should be in a quiet room with a good internet connection. Things get awkward when they dial in from a busy coffee shop.

Report your findings

Take your raw notes, turn them into a report, keep it brief, and highlight the main issues you discovered. Our reports outline the top issues, and we assign each a priority from 1 to 4.

We share our reports in our wiki. The report includes:

  • Who was tested (photos, names, titles, companies)
  • What was tested (links to prototypes/designs)
  • List of issues including topic, description of issue, recommendation on how to fix, priority
  • Other observations and feature requests
  • Links to videos (stored in Google Drive)

It’s important that this report is digestible by others in the company, so we spend time cleaning it up and making sure the key issues are called out.
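
If you keep your reports structured enough to tag and query later, the contents above map roughly onto something like this sketch. The class and field names are hypothetical; our actual reports are wiki pages, not code, and the priority convention (1 as most severe) is an assumption.

```python
# Hypothetical sketch of the report structure described above.
from dataclasses import dataclass, field


@dataclass
class Issue:
    topic: str
    description: str
    recommendation: str   # how we suggest fixing it
    priority: int         # 1-4; this sketch assumes 1 is the most severe


@dataclass
class UserTestReport:
    participants: list[str]   # names, titles, companies (photos live on the wiki)
    tested: list[str]         # links to prototypes/designs
    issues: list[Issue] = field(default_factory=list)
    observations: list[str] = field(default_factory=list)
    feature_requests: list[str] = field(default_factory=list)
    video_links: list[str] = field(default_factory=list)   # recordings stored in Google Drive
    tags: list[str] = field(default_factory=list)          # so reports are easy to search later
```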

Related: The simple guide to product testing

Share the report with anyone who’s interested. We share it with engineers, product team members, and designers across the company via Slack and email. Discussion happens through comments on the wiki between designers, developers, and PMs. We also add tags to our reports so that we can easily search them and pull them into future project wikis.

Action items

The most important thing: take action on what you discover during user testing. There’s no point doing all this if you don’t actually make changes that improve the product. It sounds obvious, but this part often gets skipped because it’s where you need buy-in across the broader team.

If you can show clips of users struggling, it’s an easier sell to developers, stakeholders, or whoever needs convincing. And as I mentioned before, having developers and PMs sit in on a test helps create the empathy needed to fix some of the issues you discover.

Conclusion

User testing isn’t straightforward. It takes time, effort, and patience. But every session we run brings so much value.

Follow Mesosphere Design on Dribbble and Twitter to learn more about what we’re working on. We’re also hiring communication and product designers.

Want to try out user testing? InVision has partnered with leading user research platform UserTesting to give InVision users a faster, more seamless way to capture insight into their app, website, or prototype.
