6 important lessons for user testing copy and language

4 min read
Kai Rikhye • Jul 11, 2018

Not all types of user testing are created equal.

Certainly, no research is easy. But it’s often less difficult to get immediate feedback on the way a web page or product looks or feels. Soliciting feedback on copy, and finding out what works and what doesn’t, is much slower.

Traditionally, most design teams approve copy like this: an editor makes sure it follows the tone of voice, then two versions are A/B tested to see which works better. Sometimes a brand team gets involved.


There’s nothing… wrong with that per se. But it doesn’t necessarily give you specific feedback on which types of copy work, and why or why not. A/B testing copy gives you a binary outcome: one version works better than the other. But it can miss blind spots where copy confuses users, or where it’s only doing a good-enough job.

You might even miss points where that copy is actively damaging your brand.

That’s where user testing language comes into play. Instead of an afterthought, copy questions and tasks should be designed from the very beginning—and they ought to be an intrinsic part of your testing process.

This is how you can do it.

The right words at the right time

One of the problems with A/B testing copy is that, while you get an overall result, you don’t get specific feedback on what works, and what doesn’t.

For basic landing pages with very little content, like a form, this isn’t necessarily too big of a problem. But when you start developing pages that are designed to show detail and explain complex topics, like a product page or a feature page, understanding what phrasing works and why it works becomes so much more crucial.
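To make that limitation concrete, here’s a minimal sketch (in Python, with made-up visitor and conversion numbers) of what a classic A/B copy test actually hands you: a single significance verdict on conversion rate, and nothing about why one variant reads better.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test on conversion counts -- the entire
    output of a typical A/B copy test, reduced to one number."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF: Phi(x) = 0.5*(1+erf(x/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B's copy converts slightly better...
z, p = two_proportion_z(conv_a=48, n_a=1000, conv_b=62, n_b=1000)
print(f"z = {z:.2f}, p = {p:.2f}")
# ...but win, lose, or inconclusive, the test says nothing about
# which sentence confused readers or why one variant felt clearer.
```

Whatever the verdict, the output is a single pass/fail signal; the qualitative feedback this article argues for has to come from somewhere else.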

“Not all types of user testing are created equal.”

Yet it’s also difficult. Users will often tell you explicitly why they don’t like something on a visual or interactive level, but most people aren’t writers. They can’t tell you why they don’t understand something.

So they’ll generally say things like, “I don’t get it,” or “I don’t understand what this feature does.” That’s vague, but extremely valuable: it tells you your copy isn’t doing its job.

Now, if you had just run an A/B test? You’d never get that specific feedback. For many designers, copy testing isn’t a priority. They focus on the visual, on whether the CTAs are in the right place, but ignore the copy that would make users want to click the CTA in the first place.

That’s a mistake.

How to run a quality test

In our experience at MYOB, user testing with language can yield valuable results. Our software is pitched at people who are just beginning to understand accounting and how it works, so we have to do a lot of education, particularly about complex topics like tax.

Related: How Netflix does A/B testing

Recently we’ve been creating new pages for our products. As part of that we’ve conducted user testing on some different prototypes, and we’ve picked up feedback we never would have otherwise. In some cases, we weren’t doing enough to explain how certain features worked. Other times the testers interpreted phrases differently than we thought they would—and not necessarily in a way that would increase their propensity to buy.

So without A/B testing, we’ve already been able to obtain:

  • Feedback on how our tone of voice differs from our competitors
  • Evidence that certain phrases or copy dissuade the user
  • Critique on how those phrases affect our brand image

You know what’s even more interesting? Few of these things would affect the user’s likelihood to take out a trial, or purchase the product.

From the perspective of an A/B test, we wouldn’t see anything different. But the customer’s sentiment differs from one version of copy to the next. How they feel about us going into the trial or purchase changes, which affects not only their loyalty but also their propensity to churn later down the line.

Without copy testing? We’d have none of that.

Principles for user testing your copy

If you’re really serious about user testing your copy—and you should be—there are some principles you need to keep in mind. Following these will make your life easier, but more importantly it’ll give you tangible results.

You need full copy, not placeholder text

We all love a bit of lorem ipsum, but it’s not going to do you any favors with users. As part of your testing you need to provide users with full text to get the entire experience. It doesn’t need to be perfect, but it needs to give an indication of what type of copy goes where.

Of course, that means…

The copywriter needs to be involved from the beginning

This isn’t about getting a wireframe out the door as fast as you can. This is about creating an entire experience, and that means copy too.

Never show the copy on its own

For some reason, some testers show copy on its own, without placing it in a design. That’s always a mistake. Users can’t project that copy into a visual context without a reference, so they’ll give criticisms that might not even make sense. A word that works on its own may very well fall flat in a design.

Do the work—don’t make the testers do it.

Understand specifically what you need from the user

It’s not good enough to just get a general “feeling” about whether the copy works. And remember, even though a user might say, “the copy was fine,” it may very well be because they didn’t even read it.

So just like you create tasks for your visual designs, you need to create them for copy too. Here are a few specific topics that you can use to get started:

  • Does the user understand the text? What are they confused about, if anything?
  • Do they actually understand what the text is trying to tell them? Get them to explain it back in their own words
  • What do they actually read? Do they skim over anything?
  • Does anything stand out as offensive, or shocking?
  • Does any specific phrasing stand out as either positive, or negative?

If there is a particular piece of copy you believe is important, create a task around it and call it out specifically.

Avoid asking about specifics

Your users aren’t going to be able to pinpoint specific words they don’t like. (Some might, but most won’t.) Asking them about verbs or adjectives isn’t going to get you anywhere. Instead, you need to get a general vibe from them about why something is or isn’t working. Don’t go in expecting more than a general direction; the rest is up to you to figure out.

As always, observe behavior over opinion

Even though a user might object to specific pieces of copy, always keep their behavior in mind. What they do will often contradict what they say, and in some cases you want your copy to be bold and edgy. Just be aware of when that’s appropriate and when it’s not.

“Bring copywriters into the process as early as possible.”

Testing copy means understanding the details

Bringing in copywriters from the very beginning of the testing process means you’re going to get much more detail on what works and what doesn’t. Remember, it’s better to have a design that works and delights the user than one that works but leaves them only half satisfied.

And you can only do that by bringing in writers to the process as early as possible.
