Originally published: May 23, 2012
[Note: KPIs are an effective organization's critical secret sauce. Therefore, while everything in the following story is true and really happened, it didn't happen to the same people during a single project. Instead, to protect important business secrets, I've distorted some details and melded several teams' experiences together.]
It took the customer-service representative what seemed like forever to explain this one field. In fact, it was just eight minutes – we timed it. He explained what the field did and how it worked, but most importantly, that the customer should never, ever, under any circumstances, change the value of this field in his customer profile to any value other than what it was currently set to. The field's label was a very uninformative pound sign ('#') and the value that should never change was 1.
The customer on the other end of the phone will never get those eight minutes back. Nor will the customer-service representative. Nor will the three developers watching the recording.
But that eight-minute description of the # field and its never-to-change value was critical to the success of the product. It helped the team identify something that could easily be fixed, with ripple effects through the entire product.
The eight-minute description of the inscrutable # field was a perfect example of a problem that plagued this team's product. Three of the developers observed it at one of their regularly scheduled product training reviews. The purpose of each review (which every developer does for about 90 minutes every two weeks) was to identify the percentage of tool time that occurs during a training session.
Tool time is any time where the user (in this case, the customer-service representative) focuses on the tool instead of their goal of using the product. It's busy work that computers should do on behalf of the user. It's an activity without added value to the user's objective.
Teams that reduce tool time see a huge improvement in the quality of their users' experiences. The users get more time with those things that make the product great.
The organization had decided that the tool time in the product training sessions was a great candidate for a User Experience Key Performance Indicator (UX KPI). The developers were measuring the amount of tool time in each one-hour session and recording the reasons it was occurring. From this, they could make improvements to the product that would remove the necessity for customer-service representatives to spend that time, giving the reps opportunities to do other things during the call.
To do this, the developers needed a solid metric to track. The 48 developers split themselves into three-person teams. Each team would spend 90 minutes every fortnight watching the recording of a previous day's training session. While watching the session, they recorded the length and nature of all the tool time they saw.
Identifying a tool time incident is a subjective call. To make it official, all three team members had to agree. If, after a short discussion, anyone objected, then they didn't record it. With a little up-front discussion, it became easy for teams to decide on their criteria for tool time. (When we compared the different teams' criteria, we didn't see any meaningful differences. Each team had arrived at basically the same rules for inclusion.)
They created a database to track each tool time instance. The database then produced an ongoing average which everyone in the organization could track over time. This became the UX KPI.
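The mechanics described above (a subjective call becoming an official instance only on a unanimous three-person vote, then an ongoing average computed across reviewed sessions) could be sketched roughly like this. All names and structures here are illustrative, not the team's actual database:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class ToolTimeInstance:
    session_id: str    # which training-session recording was being reviewed
    description: str   # what the reviewers saw (e.g. "explaining the # field")
    minutes: float     # how long the tool time lasted

@dataclass
class ToolTimeLog:
    instances: list = field(default_factory=list)

    def record(self, instance: ToolTimeInstance, votes: list) -> None:
        # An instance becomes official only if all three reviewers agree.
        if len(votes) == 3 and all(votes):
            self.instances.append(instance)

    def kpi(self) -> float:
        # The ongoing average: mean tool-time minutes per reviewed session.
        per_session: dict = {}
        for inst in self.instances:
            per_session[inst.session_id] = (
                per_session.get(inst.session_id, 0.0) + inst.minutes
            )
        return mean(per_session.values()) if per_session else 0.0

log = ToolTimeLog()
log.record(ToolTimeInstance("s1", "# field explanation", 8.0), [True, True, True])
log.record(ToolTimeInstance("s1", "manual data re-entry", 4.0), [True, True, True])
log.record(ToolTimeInstance("s2", "slow lookup", 6.0), [True, True, False])  # one objection: not recorded
log.record(ToolTimeInstance("s2", "copy-paste workaround", 5.0), [True, True, True])
print(log.kpi())  # (12 + 5) / 2 sessions = 8.5 minutes per session
```

The point of the sketch is the shape of the metric, not the storage: any spreadsheet or database that keeps per-session totals and a running mean would serve the same role as the UX KPI.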
The developers, having directly witnessed the tool time incidents, started to talk about them in their regular standup meetings. They came up with easy fixes for some of the "low-hanging fruit" issues. More complex issues started to come up in sprint planning and product direction discussions.
Almost immediately, the teams snuck some quick changes into new releases. Those changes simplified critical parts of the interface for getting new customers on board quickly. The customer-service representatives now skipped past that part, which let them use that time for something more valuable with the client.
This company's customers are small business owners who start a business around their passions. Many became business folks by accident and need as much help as possible when it comes to bookkeeping, inventory, and marketing.
With the tool time out of the way, the customer-service representatives now work with the customers on those new business skills. In turn, this makes the product more valuable, since those customers are now taking better advantage of advanced features. Customers are more likely to stick with their subscriptions and tell their friends about the service.
For this organization, tool time measurement has all the properties of an effective UX KPI:
Behavior based: The team was measuring the customer-service representatives' behavior with the product. As they change the design, they get new behaviors. If those new behaviors reduce tool time, then the team can tell they've improved the design.
Behavior is critical for a UX KPI. Many organizations try to use non-behavioral metrics, like customer satisfaction or Net Promoter referral attitudes, but those don't work well as a UX KPI. Without the behaviors, you can't tell what's happening in the experience of the users.
Key to the business: The first word in KPI is key. There are five basic areas that are easy to tie KPIs to: (1) increasing revenues, (2) decreasing costs, (3) increasing market share, (4) increasing revenue from existing customers, and (5) increasing shareholder value. An effective KPI is tied to one or more of these.
In this case, the tool time is tied to the costs of product training (decreasing costs). It also influenced subscription renewals (increasing revenue from existing customers) and new customers through word-of-mouth (increasing market share). Each of these makes this metric a great way to tell how the design team is enhancing the business.
Performance indicator: A good KPI predicts an important change in the business, hopefully with enough of a lead time to react if necessary.
The tool time metric is a good performance indicator because it predicts the way customer-service representatives use their training session time. If they can reduce the length of the sessions or get into more soft-skill training, it shows improvement in the bottom line.
However, if the designers do something that increases tool time, they'll see costs go up and possibly more subscription cancellations down the road. If they catch it fast enough, they can correct it before it does too much damage.
Unique to how the business or industry runs: I'd love to tell you that, for your business, measuring the tool time in training sessions would be an equally effective UX KPI, but I can't. Unfortunately, while it makes sense for this company, it may not for yours. You'll have to find your own UX KPIs.
Generic KPIs produce generic results. If we really want something that touches the core of what makes our business special, it should be a metric that only applies to what we're doing.
Easy to measure: By using the unanimous voting mechanism of a three-person review team, they turned a subjective measure into something quantitative. Even though it takes 90 minutes to review the one-hour training session video, the exposure to their users and a glimpse into how the product is used is priceless.
The organization collects a lot of data by having each developer take turns doing the review over a two-week period. And the repetition from multiple reviews helps point out patterns that might otherwise go unnoticed.
Diagnostic: Because the developers are seeing their users futz in these tool time incidents, they not only know the frequency and duration, but also the specific instances when it happened. The teams can compare notes and see what keeps reappearing across sessions.
Teams can only start to think about UX KPIs when they have their basic UX design process under control. It's a sign that the organization has integrated experience design into its culture and is willing to make the investment, because a proper UX KPI program isn't cheap. Attempts that are too early are likely to fail (and possibly leave a sour taste for KPIs and UX in general in everyone's mouth).
However, when a team manages to get a KPI that sticks, the power it brings to the organization is remarkable. It helps everyone focus around the experience, giving a common language and understanding to how great design makes a great business.
What have you used for your organization's UX KPIs? How did you go about deciding that would be a great choice? We'd love to hear your experiences on our UIE Brain Sparks blog.