Embracing measurement puts you and your team out there. It openly admits that we aren’t perfect and that perhaps we could do our jobs better.
Through my experience of building the design team at Percolate, my perspective on measurement has changed. I’ve gone from thinking measurement is all about understanding the performance of a solution to measurement being a tool that can be used to improve our teams, our design process, and the business we’re building.
Measuring design hasn’t come naturally to me, though. There have been many occasions when I brushed requests for measurement under the carpet. This was partly due to my natural instinct to focus on the work, and partly because I didn’t want to expose our imperfections.
As disciplines like marketing, engineering, sales, and finance have evolved at our company, I’ve been exposed to the measurement methods they use to understand their impact on our business. Because of this, I started looking more critically at how we could measure the contribution of design.
This post shares the exercises we’ve explored as a design team. Through these exercises, I’ve gained a deeper understanding of how our success metrics aren’t just affected by the design solutions we deliver, but also how well we operate as a team.
Success, struggle, surprise
It’s important for design leads to feel confident that the less experienced designers working alongside them are making progress. To monitor these developments, we started simple diaries that asked leads to note three observations each week about the designer working alongside them: something the designer was successful with, something they struggled with, and something that surprised them. These diary entries surfaced a range of learnings, including how designers documented their work, what they could do better when presenting work, and when they went above and beyond on projects. As a result of these diaries, I had conversations with design leads around coaching tactics, team collaboration, and project development. At times updating these diaries slipped, but that was okay. We always came back to them because we valued the view they gave us of our team’s development.
KPIs for managers
If you haven’t come across KPIs (Key Performance Indicators), they are “performance measurements that evaluate the success of an organization or of a particular activity in which it engages.” They can be applied to projects, teams, and individuals. For example, a KPI for a writer on our marketing team is to publish 3 blog posts a month. We hadn’t set goals like this on the design team until we started fleshing out the design manager responsibilities. We found that breaking down the manager responsibilities by specific goals helped them get a clear picture of what success looks like. They knew how many user research reports we were targeting, how many usability design workshops to hold every quarter, etc.
These goals gave managers a target to work towards and helped them be more proactive on the things that mattered. It wasn’t easy to put hard goals against all responsibilities, though. For example, the effectiveness of a manager’s mentorship style is best discussed qualitatively rather than tying specific metrics to it. Through this exercise I learned to use goals where they made most sense, and not to force them across all responsibilities.
Onboarding survey
The onboarding plan for new members of the design team is a constant work in progress. You want to give people enough info on the company, the teams they work with, and an introduction to our product. Then you want to get them working on a project. For designers to be successful, you want them to feel ramped up, meaning they have all the information they need to do their job.
At the end of last year we did a survey with some of the new members on our team to figure out what worked and what didn’t work during their onboarding. We wanted to know at what stage they felt they had a clear understanding of our product, and what onboarding sessions were most valuable in getting them there.
The biggest takeaway from the survey was that our onboarding for new designers was too broad. During their first two weeks at the company, designers wanted the sessions to focus on helping them learn the product inside and out. Once they felt comfortable with our product, they saw the value in learning about the work of other teams. While it was frustrating to learn that some sessions weren’t as effective as we’d hoped, the fact that we could be open with our learnings and make adjustments was more important to our team.
Mapping a project
Redesigning the page templates we use across our marketing website was a project that brought together people from our marketing, design, and engineering teams. From the outset, we chose to use this project as a test to see if we could record how successfully we worked together to solve the brief and launch the new templates. At each stage of the project (IA, wireframing, visual design, etc.) we recorded the time our design team members spent, what was working, what could have gone better, notes on project status, and next steps. We used a spreadsheet layout and color system that made our productivity patterns visible across project stages.
Once the project was complete, we were able to see a map of the project from start to finish. From here, it was clear at which stages collaboration was successful and where it broke down. Make no mistake, completing the spreadsheet each week was tedious at times. But it wasn’t long before we could see value in the exercise. Coming out of the project we had a clear view of the investment we had made. We could also see what practices and assets we could create to help us improve.
Usability backlog
In pursuit of improving the user experience of our software product, we’ve started to record a backlog of usability enhancements. An example of an enhancement is making our secondary navigation consistent across applications. The backlog is reviewed with product management and engineering during planning sessions and a selection of the enhancements are incorporated into our development sprints. This is a big deal, as up until now roadmaps have largely focused on new features, not enhancements.
It hasn’t been as easy as I’ve made it sound, to be honest; we had a lot of false starts with this project. What made it difficult was that we had a group of passionate people who wanted change immediately, while on the other side of the table we had a group of people who were balancing a number of development needs. We needed to find common ground so we could plan and prioritize change together. To help explain the value of the enhancements, we identified a set of usability principles. The principles provide a good articulation of the usability problems we’re trying to solve, and they’re helping get all disciplines on the same page about how we can push our product forward.
Customer voice report
This is a project we’re working on at the moment. It started from a conversation I had with Emmet. We were talking about research best practices our teams had been using to help inform the products we’re building. One of the inputs we are both interested in is customer feedback.
Customer feedback comes in many forms at Percolate: advisory boards, field conversations, sales analysis, research projects, beta programs, and in-app support tools. What Emmet shared was that the product team at Intercom reviews and filters customer feedback submitted through in-app support tools to create “Customer Voice Reports.” These reports are delivered to the product teams on a quarterly basis for consideration when planning their roadmaps. This idea resonated with me as a smart way to inject the customer voice into our planning process. Once we establish this baseline for the Customer Voice Report, we can look to incorporate other customer inputs to give further context and definition to the reports.
My biggest learning from doing these exercises is that my goals needed to be bigger than making sure something gets done. This mindset wasn’t going to inspire our teams to grow.
Instead, asking questions like, “What can design do to help make our customers successful?” and, “What can we do to work better with the teams around us?” has helped me find ways to go beyond design solutions. We’ve been able to surface valuable insights on our activity from across the design journey.
Beyond the impact these experiences have had on my approach as a manager, they’ve created a range of opportunities for everyone on our design team to share their ideas and feedback. In some cases, this has brought clarity to areas that were unclear, making us more productive and accountable for the way we work. Other exercises have created consensus on the projects we should be working on, where previously we were struggling to find a way forward. The most effective exercises have given the team assurance that their voice is heard and that they can have an impact.
These are benefits that not only directly impact how we operate as a design team, but also affect the success of the teams around us. I’ve used our learnings from these exercises to better communicate the value of design across the organization, and to inform new tools and processes that improve collaboration between teams.
I hope you can apply some of our learnings to your team. We’re much better for these experiences and I believe your team could be, too. As we look to increase the impact of design moving forward, I know we’ll be bringing new kinds of analytical thinking to the table.
So despite my doubts, it turns out measuring design isn’t a terrible idea after all.
This was originally published on Design.blog.