Published: Oct 06, 2008
For more than seven years, we've studied how great user experience teams succeed. We've examined a variety of variables to isolate what it takes: management structure, methodologies, best practices, and hiring qualifications. We've looked at team communication techniques, requirements-gathering techniques, the target industry, and the geographic location. All told, we've inspected about 250 different variables across dozens of organizations in a wide variety of industries, educational institutions, and government agencies.
As with most things, the majority of the variables don't play a role. However, we found three to be critically important: vision, feedback, and culture. Using these three variables, we've created corresponding questions to help us quickly rate a team's experience design prowess. Teams that answer these questions well are far more likely to create great experiences than the rest of the pack.
Here's the first question we ask: "Does everyone on your team know what the experience of interacting with your offerings will be like five years from now?"
When the answer is affirmative, any team member can describe what the user's experience will be like in five years. They'll tell us a story, like this real one from a century-old insurance company:
"An insured home and car owner, having just had a tree fall on their garage, will log into the site, explain the damage, upload pictures, and get initial claim approval to start temporary repairs and get a rental car—all within a few minutes. Within the next 24 hours, inspection appointments and a detailed damage assessment are scheduled and reviewed, and the repairs are underway within 48 hours. All the payments are handled electronically from the insurance company, with a single NET-60 bill sent to the policy holder for the deductibles."
This story is an experience vision. It outlines how the person, in this case someone who insures both their home and car with the company, can make a joint claim and quickly start the recovery process. Notice that the story doesn't describe the specifics of the design or the system -- that's not important. What's important is understanding the experience of the policy holder.
While this particular story may not sound especially interesting or difficult to an outsider, for this organization it's a radical departure from today's experience. Their business units currently don't talk to each other and pretend that customers don't exist beyond their own individual products. This integrated vision eliminates much of the frustration caused by today's organizational reality. For this organization, five years is an aggressive timeline for the substantial, under-the-covers changes the vision will require.
We like looking five years ahead because it gets us beyond the immediate reactive requirements and starts us considering what a great experience could be. If we only looked one year ahead, we'd be stuck with the current realities. If we looked too far out, we'd be in the realm of science fiction.
Because everyone on the team shares the same vision, they are all on the same page about what it takes to succeed. Think of it as a stake in the sand on the horizon. Everyone can see the stake and knows when they are taking baby steps toward it and when they are moving away from it. The stake can move at any time (and, for some organizations, does frequently), but that's OK, since everyone can see the change and start moving in the new direction.
Struggling teams can't answer this question affirmatively. Either they have never looked beyond the problems of the day, or everyone holds a different vision. Working toward a solid vision that everyone shares will go a long way to help these teams.
While the first question deals with where the team is going, the second question deals with where the design has been: "In the last six weeks, have your team members spent at least two hours watching people experience your product or service?"
We're looking for teams that can answer affirmatively no matter when we ask. That means they are regularly watching the users and learning from them.
These observation sessions can happen in a variety of ways (and in the best organizations, the variety is wide). They can be usability tests or field studies. Whatever the method, each team member spends a minimum of two hours observing the current experience.
Note that we're not talking about surveys or satisfaction measures. Those instruments are often flawed and give only a small piece of the picture. At best, they can tell us whether users are frustrated or delighted, but they can't tell us why. The team needs to observe the experience in detail to get the information required to make the critical decisions.
Six weeks is an important interval. In our research, the average team member works on an experience design project for twenty-four months. This means they'll take part in a minimum of 16 separate observation sessions during their tenure, working out to an average of 48 observations for a four-member team over that period. All of that detailed information can't help but produce better-informed decisions in the design process.
Go longer than six weeks, and the exposure to users starts to wear off. It's far less likely that a team member will say, "What about when we saw Fred have problems accessing multiple policies?" when Fred's experience happened months before.
Many struggling teams have never had a single member observe the experience of using their design, even though, in some cases, millions of users interact with the design every day. Other teams only get data from indirect sources, or have had limited exposure during their tenure. When this happens, each member of the team can only speak to their own experience of using the design, which is very likely at odds with how real users experience it.
The first two questions are straightforward and make sense from a strategic perspective. You have to know where you're going, and you have to know what you've already built. The last question, on the other hand, can seem counter-intuitive: "In the last six weeks, has your senior management held a celebration of a recently introduced design problem?"
In most organizations, problems are not cause for celebration. However, in a culture that pushes for frequent small changes, problems become opportunities for improvement. Teams that answer affirmatively have established a culture that not only accepts failure, but relishes it as a way to learn about the users and their needs.
At a major software corporation, the CEO regularly holds parties to give out a valued award, shaped like a full-size life preserver, to individuals who have created "learning opportunities" by introducing a problem into the design. Of course, the CEO acknowledges that the problem wasn't introduced intentionally. But because it made it into the design, the organization learned important lessons it can use going forward. Receiving the life preserver award from the CEO has become a high honor within the company.
For example, a technology company recently experienced a massive server outage when, upon the release of a highly desired new feature, millions of users tried to upgrade simultaneously. While the server outage was a major embarrassment (reflected in the press and on Wall Street), it happened because of a successful marketing and design campaign for highly desired functionality. Despite the momentary crisis, the organization simultaneously learned how to create desirable enhancements and discovered the impact such demand has on its infrastructure -- both valuable lessons they'll refer to for years to come.
The best organizations hold these celebrations frequently, because they are constantly learning from their mistakes. By making the learning process explicit, through their acknowledgement and reward, the culture starts to look for it. As the old saying goes, "That which is measured gets done and that which is rewarded gets done well."
Struggling organizations do not hold celebrations of what they perceive to be design problems. Instead, they punish the "culprits" and put new product-preventing policies in place to stop the problem from recurring. Soon, the original stimuli for these policies are forgotten and the organization is doomed to repeat its mistakes.
The neat thing about these three questions is their applicability to constant improvement. Teams can self-assess and look for opportunities to answer the questions better.
A good team may have the start of a vision, but hasn't communicated it to everyone who has influence over the design. The team may occasionally get feedback on the current experience, but hasn't observed anyone recently. And there are always opportunities to highlight the latest things they've learned, even if the learning process was a difficult one.
While further research could show there are other factors that influence a team's success, it's clear to us that these three factors are critically important. Fortunately, improving them has little downside, making them a serious candidate for any amount of investment the organization can afford.
How is your team dealing with the vision, feedback, and culture factors? We'd love to hear! Pop us a note at our Brain Sparks Blog.