Hertz Brings Wanganui to the US

Jared Spool

July 26th, 2005

A database containing airports and their corresponding addresses. Not too difficult a thing to implement. Probably not something you would need to worry about the usability of, right?

Well, not quite. At least, not for Hertz.com.

Here’s the scenario: An experienced traveler comes to the site, hoping to reserve a car for pickup at the Seattle/Tacoma airport. They enter SEA, exactly the right code for SeaTac airport, into the “Renting City or Airport Code” field.

Screenshot: the Hertz.com page for choosing a city, with SEA typed into the Airport Code field.

To their surprise (and ours, I might add), our traveler was presented with a pop-up declaring that they had to choose which Seattle they wanted, offering three intriguing choices: the conventional Seattle, Washington US; the interesting Seattle, Western Australia US; and the amazing Seattle, Wanganui US! Who knew there were so many Seattles to choose from?

This is obviously a problem with Hertz’s database. A similar request for SFO (San Francisco) or BOS (Boston) results in only one choice of airport. (Though it is a little strange that they also do a text match on the letters, so the pop-up for SFO also asks about Chelmsford, Massachusetts; Gosford, New South Wales; and Amersfoort, Netherlands. I wonder how many people search for a city by typing in three letters from the middle of its name.)
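From the outside there’s no way to know exactly how Hertz’s lookup is built, but the behavior is consistent with a plain case-insensitive substring match over city names. Here’s a minimal sketch of that guess in Python; the city list and the non-SFO codes are made up for illustration:

    # Hypothetical sketch of a lookup that matches the query against
    # airport codes and against substrings of city names, which is the
    # behavior Hertz.com appears to exhibit. The data is illustrative.
    CITIES = [
        ("SFO", "San Francisco, California US"),
        ("???", "Chelmsford, Massachusetts US"),  # made-up code
        ("???", "Gosford, New South Wales AU"),   # made-up code
        ("???", "Amersfoort, Netherlands NL"),    # made-up code
    ]

    def lookup(query):
        q = query.lower()
        # Exact code match OR substring match anywhere in the city name.
        return [name for code, name in CITIES
                if q == code.lower() or q in name.lower()]

    print(lookup("SFO"))
    # San Francisco matches by code; Chelmsford, Gosford, and Amersfoort
    # match only because "sfo" happens to appear inside each name.

If that guess is right, anchoring the match to the start of the name (say, name.lower().startswith(q)) would keep a three-letter code from dredging up cities that merely contain those letters.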

Screenshot: three results for SEA, including two new US states: Western Australia and Wanganui!

This interesting database integrity problem requires the user to pause, figure out what is up, and confidently choose the right city. At least, that’s how it seems. Turns out, if they choose the SeaTac in Wanganui, they still get the real one. Whew!

But, it does highlight an important issue: are we, as usability evaluators, supposed to report on problems we find with the database? If so, this creates a whole new dimension to usability: data integrity.

As we can see, data integrity issues impact the user experience. But, it was just a coincidence that our user entered SEA into the form. Virtually any other city would have produced “expected” results, thereby causing us to miss the problem entirely. If we need to find these problems, how do we create test scenarios to ensure we uncover them?

I’d be curious to know if you are running into problems like this and, if you are, how you’ve enhanced your usability process to tackle this in a thorough and thoughtful way.

5 Responses to “Hertz Brings Wanganui to the US”

  1. Doug Anderson Says:

    Yes, any design or implementation flaw that negatively affects the user’s experience with the site/application poses a “usability” problem. As you noted, this was a problem only for some subset of possible data values entered into a form. That you encountered the flaw was fortuitous, and any such chance encounter ought to be listed as a usability bug, at the very least.

    However, you ask how we might be addressing such problems thoroughly. To do so is the province of the development and quality assurance teams. They are the only ones with the mandate to catch such flaws. To attempt to be thorough in finding such problems during usability testing would consume more than the available usability resources.

    Design reviews, code walkthroughs, and careful designing of QA tests based upon a thorough understanding of the software & database designs are more cost-effective ways to catch such issues.

    OK, so maybe I have a not-so-firm grasp of the obvious.

  2. Jared Says:

    Doug,

    Interesting thought. But would a design review, code walkthrough, or careful QA testing actually have predictably caught this problem?

    From my outside perspective, it looks like the problem is purely a data coding issue. Other cities, such as SFO or CHI, don’t produce these problems.

    It’s been a while since I attended a design review or code walkthrough, but I’ve never seen one that inspected the data feeds.

    I think there’s more to this problem than our standard design practices can accommodate.

  3. Laura Johnson Says:

    A design review *might* have caught this problem, if the reviewers were thinking creatively about what could go wrong. Careful QA *might* have found this problem, but there would be a certain amount of luck involved … just as there was a lot of luck involved when the usability test caught it.

    In my organization, the usability testing and QA testing are both coordinated from within the R&D team. So we realized early on that we might (in fact, we would) uncover code/data problems during the usability testing. One of our process steps after every walkthrough and usability test is to enter defects found into the defect database. In this case, entering the “Wanganui” bug would (hopefully!) cause the software engineers to think about what other problems could occur involving the data encoding.

  4. Thomas Mantooth Says:

    Whether a design review or QA would have caught this problem would depend on the true nature of the underlying problem. What you’ve shown are the results of the problem, not the problem itself. Are there really three different entries for Seattle, or is there one entry for Seattle, WA and three possible translations for “WA”? If there are really three different entries for Seattle, then it would be a database problem, and it’s unlikely that a review of anything but the data itself would have shown such a problem.

    However, if there’s only one Seattle, WA in the database and three different possible translations of WA, then that should have been caught in a design review. In this case, there should be logic (and additional data) that associates a province/state code with a specific country. So, if the country is US, the only translation for “WA” would be the correct one: Washington; if the country were Australia, then Western Australia would be the correct translation; etc. (There’s a minimal sketch of this idea at the end of this comment.)

    I’ve seen things like this happen before, and there are several possible causes. One is that no review of the design was actually done. Many times there is so much pressure to get a product deployed that shortcuts are taken that directly affect quality and usability. Peer reviews, walkthroughs, and field testing are the areas that typically get slighted, much to the detriment of the product’s users – regardless of whether those users are internal or external customers. Unfortunately, those are also the places where usability problems are most likely to be identified.

    However, the more common problem in development organizations is one that no amount of peer reviews will solve, and that problem is having too limited a scope of who will be using the product and how it will be used. In the Hertz case, if the problem is that there are three “WA” possibilities, then the cause is thinking that only certain users would ever use the software: only people in the US. If only US airports are involved, then there’s only one translation for any of the standard USPS state codes. A design that works just fine for a single country, however, might not work so well in a global setting, which is a likely scenario for the Wanganui problem.

    My company produces software for a specific vertical market, and most of the designers and developers expect the software to be used one way: the way they intended it to be used. What they have failed to realize is that the software may be used for purposes other than those intended purposes. Why? Because our customers have business functions that they need to accomplish, and some of them have been very creative in the ways they’ve used our software to do those functions. Sometimes it works well for them, sometimes it doesn’t.

    So, to bridge that gap between the designers and the users, we are getting the users involved earlier in the software design process. Usability is also one of the criteria we use when reviewing a design. We are also putting more of an emphasis on developing good use cases and doing more prototyping than ever before. We seem to be getting better at usability, but our customers will be the ultimate judges of that.
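
    A minimal sketch of that country-aware translation in Python, with hypothetical data (whether any regional coding scheme actually abbreviates Wanganui as “WA” is an assumption here):

        # Keying the translation on (country, region code) removes the
        # ambiguity that a bare "WA" creates. The data is illustrative.
        REGIONS = {
            ("US", "WA"): "Washington",
            ("AU", "WA"): "Western Australia",
            ("NZ", "WA"): "Wanganui",  # assumed code, for illustration
        }

        def region_name(country, code):
            # Fall back to the raw code if the pair is unknown.
            return REGIONS.get((country, code), code)

        # A record stored as ("Seattle", "WA", "US") now has exactly one
        # translation, instead of three candidates for the bare "WA".
        print(region_name("US", "WA"))  # Washington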

  5. Avis: Trying Too Hard? » UIE Brain Sparks Says:

    [...] not to be outdone by their competition, Hertz, Avis.com has chosen the rebellious route and decided to defy standard convention. [...]
