Here at Engine Digital, getting to the “right” concept quickly is critical for our clients and our agency. It’s important because it reduces budgetary risk and it allows the team to focus on the right things. It also increases the probability that the project will meet business and user objectives.
To increase our concepting speed, we started using the Google Ventures (GV) sprint process. A typical GV sprint consists of 5 stages: Map, Sketch, Decide, Prototype, and Test.
This post is about our “Test Day” experience. We’ll review what we found most useful in the context of remote moderated testing.
Let’s imagine that you and your team have just finished creating a prototype. It looks and feels like the “real thing.” It represents your team’s best efforts to solve a client’s problem or need. Your team has a lot of confidence in the solution but your client has some doubts.
Who’s right? Will it work?
The only reliable way to test a concept: put it in front of the people who will actually use it. Testing your concept with real users immediately reveals what works and what doesn’t. Test data helps you and your client cut through subjective opinions and trivial feedback.
Real-world test results keep agencies and clients honest and goal-focused. Tests take time and effort, so we try to squeeze as much insight as possible out of each session. We do this by involving the entire team on test day.
Test day is all about giving up control and letting concepts stand on their own. Having our entire multidisciplinary team take part on test day creates a shared foundation. The team draws upon this understanding again and again throughout a project. It also allows the team to step away from their work, and look at it objectively.
The steps on test day are pretty simple:
- Prepare the observation room
- Gather the team
- Contact the participant and start the test
- Listen, observe, take notes
- Wrap up test session and discuss as a team
On test day, we step away from our usual work routines. There’s anticipation, surprise, validation, high-fiving—and sometimes facepalms. Here are some comments from a recent test session:
“Nobody found the testimonials important. They didn’t even look at them. Unexpected!”
–David, Senior Content Strategist
“Most of the instructors seem to use 13-inch Windows laptops.”
–Bryan, Lead Developer
“The book cover is very important for instructors. Our client’s hunch was right.”
–Ryan, VP of User Experience
“It was interesting that he couldn’t find the NEXT button… It’s right there!”
We definitely learn a lot by watching as a group. But sometimes observations aren’t enough to understand what’s wrong. When it isn’t clear why users are having difficulty, the moderator asks clarifying questions.
Communicating with participants
With remote testing, asking users probing questions can be challenging.
“Can you click the NEXT button?”
“What NEXT button?”
“It’s towards the right of screen, beside the book image.”
“I don’t see it.”
“It’s on the right side… it’s right there, beside the book.”
“Hmmm…. I don’t see it. Am I doing something wrong?”
To avoid situations like this, we often use InVision LiveShare when sharing prototypes. LiveShare is a team collaboration tool, but the features work well for remote testing. For instance, with LiveShare, participants can see our mouse and we can see their mouse.
“Can you click the NEXT button? I’m circling it with my mouse.”
“I see it. I didn’t think it was clickable!”
“Good point. We’ll make sure our designer makes it more obvious.”
LiveShare also has presenter control, which lets the moderator orchestrate the test session. For voice communication, the tool offers a choice between VoIP and a conference line. To record sessions we use Lookback, but we try to make most of our observations in real time.
During the test session everyone observes and takes notes on Post-it notes. An observation can be something good, bad, or simply interesting. Magic happens when these notes are placed on the observation wall.
The observation wall
Taking and organizing notes as a group is hard—unless you use the observation wall. The observation wall is a grid consisting of 5 columns and a handful of rows. The columns represent test participants, and the rows represent tasks. Ideally, the grid is drawn on a large whiteboard.
As tests progress, the wall fills in with Post-it observations taken by the entire team, one column at a time. After each test, we review the notes as a group and identify patterns and show-stoppers. At the end of the day, we review everything together and identify the primary problems and opportunities. Analysis and next steps are quick because everyone understands the issues as they happen.
It doesn’t take much effort to get useful design feedback. In just a day, you can validate concepts with real users and see what works and what doesn’t.
Including all core team members in test sessions is critical. Seeing something firsthand is far more powerful than being told about it. Testing removes subjectivity, allowing the design team and client to make informed, goal-oriented decisions.
by Eugene Huh
Eugene Huh is a Senior UX Strategist at Engine Digital. He leads research, planning, and strategy for many of the agency’s clients across financial services, technology, media & entertainment, and B2B verticals.