100 anecdotes don’t make a data set.
…Or do they?
This is just one of the topics we discussed with Diego Rodriguez, Chief Product and Design Officer at Intuit, on the Design Better Podcast.
We asked Diego how they measure the impact of design at Intuit and whether they go beyond frameworks like Net Promoter Score (NPS), which has its own set of potential problems.
Addressing NPS’s challenges
Diego agreed with the challenges that NPS poses and went on to describe a few ways that they address them. He spoke about the way they instrument their products—tracking anonymized data collected from customers—so that they can understand when someone starts a task and where they might get stuck.
He gave the example of a plumber sending an invoice, and how they can tell if someone starts to create an invoice and can’t make it through the whole process. For a small business entrepreneur, “that’s a big deal because that means they’re not going to get paid, which means that tuition payment for their kid in college may be in jeopardy. So we take that really, really seriously.”
Measuring the impact of design
Another way Intuit measures the impact of design is through a product recommendation score, which they use as a complement to NPS by gathering feedback about specific parts of the product workflow. They ask people about the experience while they are in the product flow, so the feedback can be tied to a specific part of the product experience rather than the overall brand experience.
Intuit complements their qualitative data with quantitative research. Diego related a story from his IDEO days—where he climbed the ranks to eventually become Global Managing Director—when one of his clients, who had been shown some of the qualitative research they had done, said “we [conducted] this great study in the city and talked with ten people, but all we have are anecdotes…I’m concerned that 100 anecdotes don’t make a data set.”
This is often the kind of feedback given by business-minded stakeholders keen to understand the validity of qualitative research. But Diego pushed back, saying “by the time you’ve talked with ten people, you would know what’s going on with a specific product or an area. And when you talk to 100, it’s even richer.” And he went on to describe how at Intuit, they use the power of AI and natural language processing to get qualitative data at scale.
For example, he shared an anecdote about the millions of customer calls they get during tax preparation season. They can assess these calls using AI for sentiment analysis, reading the emotional tenor of the conversations and tying it to potential problems with specific features: “Oh, wow, we made a change to that workflow two weeks ago. Maybe it’s not working the way we thought it was going to work…that extra button we put in isn’t having the effect we thought it was. It’s actually confusing people.”
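The kind of sentiment pass Diego describes can be sketched in heavily simplified form: score each call transcript against a word lexicon, group the scores by the feature the call mentions, and flag features that trend negative. Everything below—the word lists, transcripts, feature tags, and threshold—is invented for illustration; a production pipeline would use trained NLP models rather than a hand-built lexicon.

```python
# Toy lexicon-based sentiment pass over call transcripts, grouped by
# the product feature each call is tagged with. All data is hypothetical.
from collections import defaultdict

NEGATIVE = {"confusing", "stuck", "frustrated", "broken", "lost"}
POSITIVE = {"easy", "great", "clear", "helpful", "smooth"}

def sentiment_score(transcript: str) -> int:
    """Positive word hits minus negative word hits; > 0 leans positive."""
    words = transcript.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def flag_features(calls, threshold=-1):
    """Average the score per feature and flag features at or below threshold."""
    scores = defaultdict(list)
    for feature, transcript in calls:
        scores[feature].append(sentiment_score(transcript))
    return {
        feature: sum(s) / len(s)
        for feature, s in scores.items()
        if sum(s) / len(s) <= threshold
    }

calls = [
    ("invoice_button", "that new button is confusing and I got stuck"),
    ("invoice_button", "I am frustrated because the flow is broken"),
    ("tax_import", "the import was easy and the summary was clear"),
]
print(flag_features(calls))  # flags invoice_button, not tax_import
```

The point of the grouping step is the same one Diego makes: an individual unhappy call is an anecdote, but average sentiment per feature, tracked over time, is a data set that can point at a specific workflow change.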
They can then go on to fix the problem. Diego finds that this is a great example of balancing qualitative and quantitative data. Too many data points, without clarity about what you are measuring, create confusion. Too few, like relying on NPS alone, are hard to make actionable. Finding the right balance between quantitative metrics and qualitative research is what makes insights actionable for a design team.
If you’d like to learn more about measuring design, watch our discussion with Leanne Waldal and Kerry Rodden on Communicating the Value of Design.