19 Lessons from United Airlines on How To Build A Crappy Survey

Jared Spool

December 26th, 2010

As I was boarding my fourth airplane for the week, I noticed a wifi decal on the fuselage. I’ve used wi-fi on planes before (primarily Virgin America, my favorite airline), but this was the first time I saw it on a United flight (and I’ve flown 74 United flights so far this year).

While I actually like the solace that a non-internet-connected airplane flight gives me, I decided to give this a try. Once we were airborne, I fired up my browser and connected to the network.

United In-Flight wi-fi Home Page

On the other flights I’ve taken, there’s just been a log-in page for Gogo, the inflight internet service. It has the usual information about paying for the service and a place to log in if you’re a frequent customer. (They store your billing info, so you can just charge your account with an email address and password. It’s pretty simple.)

While United uses Gogo, it’s completely hidden behind a home page that I pretty much ignored. All I wanted was internet access. There are two not-so-easy-to-find buttons to hit (obscured because they look identical to lots of stuff I don’t want), so I chose one, expecting to be told what the outrageous access fee would be.

Crappy Survey Lesson #1: Don’t let people opt out of your survey.

Survey Request Page

Instead of a pricing and log-in page, I get a simple screen that says “Before you access the Internet, please take a few minutes to complete a short survey. Your responses will help us improve United in-flight Wi-Fi.”

There’s no option here to skip the survey. I must fill it out. I watched other passengers encounter this page and it’s there for everyone. I’m guessing it’ll be there for a while, so I’ll get to fill it out on every wi-fi flight I take until they stop the survey.

Of course, they want everyone’s opinion. However, do they want everyone’s opinion multiple times? How does that help them?

Given no choice, I started up the survey. That’s when it got really amusing.

Crappy Survey Lesson #2: Ask a multiple choice question with the wrong answers.

First survey page: Why are you going online today?

The first question of the survey asks why I’m logging in. Frankly, I was logging in to see what the experience was like. I hadn’t any other agenda, but I imagine I’d probably use it to keep in touch with my office and friends, along with checking my email. I had plenty of work to do, but it had been a long week and I might do something else. However, I hadn’t given it any thought until they asked.

There are four answers: working, checking email, leisure/entertainment, and all of the above. I would’ve liked to say I was just checking out United’s internet offer, to see if it’s something I’d be willing to pay for. That wasn’t a choice.

I’m not sure what they meant by leisure/entertainment. I find the Twitters entertaining, but I didn’t think they meant that.

I get that they are trying to find out what people are doing with their service, but I don’t understand what they are going to do with the information from this question.

Let’s say 30% of the users choose “Check email”. What could the United team possibly do with that information? How would they change the service or set the price? I can’t see how these answers help them. I wonder if an open text box, where people describe their plans in their own words, would be more helpful.

Of course, there’s another way to get this question: watch what people do. Their analytics will tell them how many people check email (either using POP3, IMAP, or SMTP ports, or visiting one of the web-based email providers, such as Gmail or Yahoo mail). The analytics will also tell them which sites the users visit. It’s pretty easy to separate out work sites from pleasure sites. VPN usage is a good clue too.

Why ask a silly question when you can observe the behavior?
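To make the point concrete, here’s a purely hypothetical sketch of that kind of traffic bucketing — not anything Gogo or United actually runs; the host names and port lists are my own assumptions for illustration:

```python
# Hypothetical sketch: bucket a connection as "email", "work", or "other"
# from its destination host and port, the way in-flight analytics could.
EMAIL_PORTS = {25, 110, 143, 465, 587, 993, 995}   # SMTP, POP3, IMAP (plain and TLS)
WEBMAIL_HOSTS = {"mail.google.com", "mail.yahoo.com"}  # assumed web-based providers
VPN_PORTS = {500, 1194, 1723}                      # IKE, OpenVPN, PPTP -- a "work" clue

def classify(host: str, port: int) -> str:
    """Return a coarse usage category for one observed connection."""
    if port in EMAIL_PORTS or host in WEBMAIL_HOSTS:
        return "email"
    if port in VPN_PORTS:
        return "work"
    return "other"

print(classify("mail.google.com", 443))   # email
print(classify("vpn.example.com", 1194))  # work
```

A few lines like these, run over the access logs they already have, would answer the “why are you online” question more honestly than any forced multiple choice.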

Crappy Survey Lesson #3: Use radio buttons when you mean check boxes.

I found it interesting that United thinks checking email is neither work nor entertainment. I’m curious what the United team uses their email for.

If they really wanted answers that get them closer to something meaningful, check boxes are probably a better way to do this.

Crappy Survey Lesson #4: Don’t tell me how many questions we have left.

This was just the first question and I didn’t have any clue how many of these were left. Were we almost done or was my entire six-hour flight a long survey experience?

Fortunately for United, I had no choice other than to continue or not use the Internet. There was no button to say I was no longer interested in participating.

At what point do people start choosing a random answer, just to get through the survey so they can work, check email, and/or be entertained? (I’m betting on the second or third time they have to fill out this survey, that’s pretty common.) I wonder how the United team will tell the difference between the legitimate answers and the just-get-me-online answers.

Crappy Survey Lesson #5: Use words that mean different things.

Survey question 2: Did you use any features?

United wants to know if I used any “features” on the “United Wifi Home Page.” Are we talking about the page that first popped up when I connected? Or is this a page that describes the service that I haven’t seen? What does “home page” really mean to an average user? Earlier that day, I went to united.com to check in for my flight by clicking on a link in an email. Is that the United home page? A picture of the United Wifi Home Page might help here, assuming it was big enough I could recognize it.

What do they mean by features? On the page that came up when I connected, I clicked on a blue button that said “Full Internet Access”. Is that a feature?

Crappy Survey Lesson #6: Use Yes and No as answers.

The only two answers are yes and no. What if I don’t know? United is forcing me into answering a question I don’t understand. If I could answer “I’m not sure”, at least they’d discover their question is confusing to people.

Let’s say 50% of respondents answer no. What does the United team do with that answer?

Crappy Survey Lesson #7: Ask about satisfaction.

My thinking went like this: I am using the wi-fi to connect to the internet. I clicked on the “Full Internet Access” button to try to get it. (Hopefully, I will get it once I’ve finished this survey.) Therefore, that button must be a feature, so I answered yes.

Question #3: How satisfied were you?

What then appeared was Question 3: How satisfied would I say I am about the features on the United Wifi Home Page?

First, let’s just look at the word “satisfied.” I think United is trying to find out if somehow this page made me happy, or, at a minimum, pleased me in some way. Yet, the dictionary says it means content, completely paid, or convinced. Neither happiness nor pleasure is a condition of being satisfied.

If I were to pick one of the dictionary definitions that could be useful, I’d pick content. Yet, content is a neutral term. Being content or satisfied is like finding a restaurant’s food edible. It’s not a particularly positive term.

Crappy Survey Lesson #8: Pick a Poor Neutral.

If United’s team really wanted to know if I was content with their home page, they could make the scale go from the extremes of “Extremely Contented” to “Extremely Discontented”. But what would the middle be? What’s it like to be neither content or discontent?

That’s the problem with this scale, which has a wacky state of “neither satisfied or dissatisfied.” If 30% say they are satisfied or extremely satisfied, 30% say they are dissatisfied or extremely dissatisfied, and the remaining 40% say they are “neither satisfied or dissatisfied”, what does that tell the team? What will they do with that information? How would it be different if everyone says they are dissatisfied?

Crappy Survey Lesson #9: Ask about things people haven’t used.

I still don’t know what United thinks a feature is, but if it’s the other buttons on the home page, those are things I haven’t used at this point. How can I indicate my satisfaction with things I haven’t experienced? Frankly, without looking at the screenshot I took of the page, I don’t even know what they are at this point.

That raises the question: what are people actually answering when they choose a number between one and five for this question? Are they predicting their satisfaction? Or are they only to answer for the features they’ve used?

Crappy Survey Lesson #10: Ask users to rank things they don’t care about.

Question #4: Rank your favorite features

I’m sure the list of items, from movies to inflight food ordering capabilities, is a well-researched collection, not something the team just brainstormed over a beer. (Ok, maybe I’m not so sure.) However, just because something is on the list doesn’t mean it’s something the respondent wants.

At least, not this respondent. I couldn’t care less about United providing me more movies, games, books, or shopping. I carry an arsenal of electronics that already house what I want in those areas. I don’t know what “destination content” is, but I’m guessing it’s tourist stuff about the city I’m landing in. If the plane actually served food, I might want to order it online (assuming it works like Virgin where they actually bring it to you within a few moments of ordering). And I might like “More United Information”, if it was useful.

Question 4 after ranking.

That’s two things out of seven that I’m interested in. How do I rank that? Well, I tried giving a 7 to everything I didn’t care about, but the survey software didn’t like that at all. Now I have to spend effort ranking things I don’t want.

The problem is now the item I’ve ranked third (Shopping site/deals) is really something I don’t want. How does the United team know that? They would likely get the wrong impression that I would be happy with that offering.

Crappy Survey Lesson #11: Don’t give a space to learn from your users.

The item I like most on this list is More United Information. Anything United can tell me about my trip, especially connection information, could be useful to me. I’d like to know how far and fast I’ll need to trudge across the terminal to make my connection. I’d love to know if my bag is on the flight. I’d like to know what the best ground transportation options are, once I land, and where I find them in the terminal. And I’d enjoy knowing what the food options are in the terminal (and perhaps even placing an order when rushed).

Of course, I just made all that up and don’t know if the United team thinks like me. I’d love a way to tell them these great ideas (in a form other than a blog post criticizing their survey). There’s no place for me to enter my ideas, great or not.

Crappy Survey Lesson #12: Don’t give users an out.

Have you used another airline's in-flight Wi-Fi service?

This is definitely the best-formed question so far. Yes or no. Good choices. Unless I’m not sure. Like, is the stuff on JetBlue or Virgin America, where I can chat and play games with people on other flights, part of their wi-fi service?

As we learned in Crappy Survey Lesson #6, if someone isn’t really sure, their only option is to answer yes or no. If the team is going to do something with this data, it might be nice to separate out the people who are sure of their answers from the people who aren’t. An “I’m not sure” answer would get them there.

Crappy Survey Lesson #13: Make the survey pages load REALLY SLOW.

Had you been sitting with me on the plane, you would’ve noted that we’re not more than 10 minutes into this exercise. That’s because the survey pages were loading really slowly, apparently being summoned from a server that was buried deep beneath the earth’s mantle.

Had there been an option to quit the survey at this point, I likely would have. No such option existed (the entire page is what I’ve been showing you). It took more than 15 minutes of my life to finish the entire survey. (Interestingly, the performance of the wi-fi service matched the survey’s performance.)

Crappy Survey Lesson #14: Don’t bother with good English.

On which flight did you last use a Wi-Fi service onboard?

“On which airline did you last use a Wi-Fi service onboard?”

It’s a one-sentence question. Convoluted grammar doesn’t help the experience.

Crappy Survey Lesson #15: Ask the wrong question.

What possible use is it to know which airline I last used wi-fi service on? I have used wi-fi on American, Continental, Delta, and Virgin America. I believe the last one I used it on was Virgin America, but I am not completely sure as it was a while ago.

However, why does it matter? For one thing, they all use the same underlying service: Gogo Inflight, just like United. The experience (other than this silly survey) is pretty uniform. It’s my understanding that I can buy an unlimited-use monthly pass from Gogo that works on any of the airlines they serve.

So why is the last airline I used it on useful? Well, it isn’t to help with the next question.

How satisfied was I with other airlines’ service?

If the previous question had been checkboxes where I could indicate every airline I’d used it on, this question might be more helpful. But now, I’m rating all my previous experiences.

Also notice the scale has changed from the previous satisfaction question: I’m not allowed to indicate “Extremely Satisfied” with the competitor’s service. Only “Very satisfied”. Curious.

Crappy Survey Lesson #16: Ask how much I’d pay.

Would I be likely to pay $5 for a 3 hour flight?

Oh, where do we start on this question? We can start with how it’s a five-point scale where 3 points will do. What is the difference between “Extremely Likely” and “Likely” when asking if someone will pay a price for a service?

The neutral point (3) is poorly labeled. What does “Neither likely nor unlikely” mean? “Can’t decide” or “Don’t care” or “It’s complicated” would be much better answers to choose from.

But those are minor nits that we’ve already covered here. The real problems come from the question itself.

First, you’re asking me about a three-hour flight when I’m sitting on a six-hour flight. United doesn’t have very many three-hour flights. They have a ton of flights that are less than three hours, like the frequent flights I make from Boston to Washington, DC. They have a ton of flights that are more than four hours, like the flights I make from Boston to Denver. However, a flight that is exactly three hours, no more, no less, is very hard to come by.

Second, you’re asking me about the future. If you really want to know what I’m likely to do, you probably want to know my history (beyond what my last flight was). Have I ever paid for wi-fi service on a flight less than three hours? (Yes, but will often pass.) Have I ever paid more than $5 for wi-fi service on a flight of that length? (Yes, but it makes me think if I really need it.) How often have I done those things? The answers to these questions are probably more predictive of my future behavior than any answer I could give to this question.

Lastly (and most importantly), I know how this question works. If I say yes, you charge me $5. If I say no, you consider not charging me $5. Guess which outcome I’d prefer? For what reason should I tell the United team that I’d likely (or EXTREMELY likely) pay that price? None that I can think of. So, EXTREMELY unlikely it is.

Crappy Survey Lesson #17: Focus on the present or the future.

Do you book flights on United.com?

I have booked flights on United.com. It’s clumsy and I prefer not to do it. Recently, I’ve only done it to book a flight with points. I much prefer Orbitz.com for my flight booking.

How do I answer this question? I don’t want to book my flights here. I haven’t in a while and have been thankful.

Had this question asked me what I’d done in the past, I’d have known what to answer. However, it asked me in this weird future tense. I’m not booking a flight now. I have no plans to book my future flights at United.com. Unless they pull a move like American (who has apparently decided to stop selling its flights through Orbitz), I would be happy to not book on United.

Never is a long time. However, out of fatigue of this survey (now clocking in at 12 minutes), I decided it was the right answer here. That probably prevented another few questions.

Crappy Survey Lesson #18: Don’t ask for contact information.

Any other thoughts/comments?

What thoughts or comments should I have at this point? I could talk about how I believe this survey is a waste of time, since I had to fudge answers to make them fit. I could talk about how the connection speed was slow.

At this point, it’s still not clear if United is planning to charge me for the internet usage. I could talk about that.

Of course, being a long-time United customer, I have LOTS to talk to them about. I’d love to talk about how messed up United.com is. I’d love to talk about their crappy upsell process. I’d love to talk about how I feel I’m always being scolded by United employees, even though I spend thousands of dollars on them every year. I’d love to talk about how their employees often give the impression that it would be much easier to make the airline work if they didn’t have to deal with all these pesky passengers.

I don’t think that’s what they wanted in this box. However, I don’t really know what they wanted in this box, as, once again, it’s not clear. It would’ve helped if they said, “Is there anything else you’d like us to know about how to make the best inflight Wi-Fi experience?” (I would’ve mentioned having power at the seats would make a huge difference for six-hour cross country flights.)

As is common with other United surveys I’ve encountered, they don’t ask for my contact information. If they had my contact information, they could match my frequent flyer information up with the answers I gave, which would, of course, give them tremendous insight into where I’m coming from. Even my flight and seat number would be valuable.

Anonymous surveys have their place. This isn’t one of them. There was nothing here that I’d change if I thought the United team could figure out who I was.

United could make the contact information field optional. I’m betting most folks would fill it in regardless, as that’s been my experience for these types of things. Especially if you make a compelling case for how it’s valuable to your research and how they’ll benefit from it. (However, you do need to be clear about how you’ll use that contact info from a marketing standpoint. People don’t want to sign up for more spam.)

Crappy Survey Lesson #19: Waste your users’ time.

Survey Thank You Page

I had just come from a client workshop where we went through two weeks of site visit data. With the client in tow, we visited 14 of their customers, asking a lot of questions and observing how their own businesses worked.

At the end of each visit, we thanked our research participants. However, we also took the extra effort to express what we learned that was valuable to us.

It’s nice that United thanked me for filling out the survey. What would’ve been nicer is giving me a sense that it was more than busy work.

To make this worse, I still had no idea if my internet usage was free. I didn’t know if the next click would bring up a Pay-to-Log-In screen of some sort, looking for my credit card information.

At a minimum, the last screen could’ve said, “We value your input. We’re going to study it carefully and come up with the best inflight wifi experience. As a small token of our appreciation, we want you to use the service for free on this flight.” That would’ve been cool.

Instead, I felt like I had just wasted my time. (Vowing to have it not be a complete waste, I promised myself I’d document it in this post.)

My biggest worry is the next flight I’ll get on with wi-fi service will have the exact same survey. If that’s the case, I’ll probably answer all the questions differently, just to mess with their heads. After all, if they’re going to waste my time…

30 Responses to “19 Lessons from United Airlines on How To Build A Crappy Survey”

  1. Joshua Muskovitz Says:

    Another nit to pick. Would it be too much to ask for them to match the direction you are actually traveling to the graphics on their “home page”? It sure *looks* like you are heading to the west coast (in the tiny “view flight map” button) or at least westward (the giant picture of a plane to the right). And yet it bothers to tell you that you are heading East-Northeast. Why torture customers this way?

A question for you, though: What psychic power allowed you to know to grab screenshots along the way? I would have been on the third survey question before it occurred to me.

  2. Livia Labate Says:

    Next flight they’ll have “back to previous page”, “take the survey” and “Jared, click here”. Oh wait, they won’t. Because they don’t pay enough attention. It’s amazing that you still try to point out the error of their ways. Oh well, at least we all learn something.

  3. Rob S. Says:

    Great article Jared, and not just because I’m designing surveys at the moment 🙂 Shameless plug: Actually, I have an entire site dedicated to constructive criticism about ‘surprising’ interfaces (to put it kindly) – http://www.allaboutbalance.com/.

@Livia – Companies do pay attention to this sort of thing, thanks to Twitter and Google Alerts. I wrote an article about a Comcast UI and someone from their team responded almost immediately. I wouldn’t be surprised if someone from United has already read this.

  4. All This ChittahChattah | ChittahChattah Quickies Says:

    […] [from steve_portigal] 19 Lessons from United Airlines on How To Build A Crappy Survey [UIE Brain Spa… – [Jared's detailed deconstruction of a badly written and entirely inappropriate survey – on board a United flight before he can get to the WiFi login screen – let alone find out if there's even a charge for the onboard WiFi – reveals the tragic limitations of badly written surveys and puts the lie to people who shrug off bad questions with "Well, at least you learn *something*". Even more this blog post reveals the emotional and intellectual state of someone who is taking a survey; the external orientation most surveys lack or deny. Required reading.] My biggest worry is the next flight I’ll get on with wifi service will have the exact same survey. If that’s the case, I’ll probably answer all the questions differently, just to mess with their heads. After all, if they’re going to waste my time… […]

  5. Mark Salsberry Says:

    Nice article. It’s another good example of how more and more people think that they can design an effective survey just because there are do-it-yourself survey tools available. Similarly, it amazes me that United has their customer experience survey (ualsurvey.com) on every boarding pass and yet when you go to this website from a smartphone, it’s a terrible, cumbersome experience. What a perfect opportunity to allow their flyers to take a survey immediately after the flight – maybe while walking through the terminal – by optimizing their surveys for mobile phones. There are services out there that make it really easy, like http://www.jetjaw.com and http://www.mobiode.com.

    [Editor’s note: Mark apparently works for jetjaw.com.]

  6. Emanuel Says:

    I’d send this to United, at least they might read it then. It would also make a bigger difference than just ranting, I think United should know about this…

  7. WC Says:

    I’ve been on the other side of creating something like this. The programmers don’t know or care what’s being done with the information being gathered (it isn’t for them!) but management didn’t give them a clear idea of where they were going and only a partial list of information to ask for. So likely, the questions were put together in a few minutes by their lowest employee… Or whoever drew the short straw.

    And then management objected to the questions and asked to have them changed. Multiple times.

The end result is the Frankenstein user-experience nightmare above. It’s not of any use to anyone and only serves to annoy the customer.

    Of course, that’s only 1 scenario where the above is the result. I’ve also seen it where management gave the job of creating the questions to some random employee that has no experience or training in creating a survey. The person did their best on it, but it just isn’t their thing. It’s possible there wasn’t even a programmer involved here as an off-the-shelf survey system would work just fine with Random Employee entering the questions.

  8. Kitty Says:

    Did you get free wifi in the end?!

  9. Peej Says:

    Quite entertaining. But what about the wifi, was it free?

  10. Vicky Says:

    Also: since when do surveys go from 5 to 1?

  11. ArtSpot Says:

    I think we’ve got a winner!

    WC hit the nail on the head.

  12. Jared Spool Says:

    @Joshua: I’m currently researching surveys and satisfaction instruments for a project, so taking screen shots of any surveys I come across is now second nature.

    @Kitty, Peej: Depends on your definition of “free.” I didn’t have to hand over a credit card # to charge. However, United is very good at demonstrating they don’t value my time. (Just look at their attitude towards flight delays.) So, from their perspective, it was “free”. I’m not sure that’s my perspective.

    @WC: I think you’re right on. When I was explaining the post to my girlfriend, I mused that the survey was probably designed by a staff intern.

    @Emanuel: I’ve given up trying to get United to pay attention to me. They know where to find me. (Probably in seat 13D.)

  13. When An Intern Designs Your Surveys | Martin Research Says:

    […] assuredly has people aboard who can do better. Jared Spool, in a terrific blogpost entitled “19 Lessons from United Airlines on How to Build a Crappy Survey“, takes United to school on how to make sure your Customer Satisfaction Surveys do nothing […]

  14. Frank Martin Says:

    Excellent post Jared! I just had to blog about it and link to your site!



  15. Bart Hilhorst Says:

    Fantastic post! Lesson #14 could also mention the multiple usage of the word “niether”.

    Happy to know I’m not the only one taking screenshots of surveys.

  16. axplock med copywriting och nya medier 2010-12-28 | axbom Says:

[…] » 19 Lessons from United Airlines on How To Build A Crappy Survey » UIE Brain Sparks A real horror example! United Airlines doesn’t seem to get much right in the digital arena. […]

  17. Kelly Watson Says:

    Here’s a good one: tell people you’re doing a survey because you care about their opinions, when really you just want to gather user data to sell more ads. Seriously, do you really need to know my household income because you care about my opinion?

  18. Chia Lopez Says:

    A lot of the things you are nitpicking about this survey are part of statistically accurate evaluation techniques. While you make some salient points there are several errors in your posts that appear to reveal a bit of ignorance about statistical data collection. Seems unusual for you to be so right and so wrong at the same time. Are you only looking at this from a UI aspect? Did you hastily err on the side of attempted witticism?

  19. Jared Spool Says:


I know a little bit about statistical accuracy (and precision, for that matter). What about my nitpicking demonstrates ignorance?

    I can only guess what United’s goal for this survey was, but if it was my team using something like this to improve the product, I think I can tell whether the possible answers will provide meaningful, actionable results.

    I’d love to know what I got wrong.

  20. Sweetwater Tom Says:

    I have stopped doing corporate surveys whenever possible. As you noted, they force wrong answers, they don’t have my answers, and there is no free-form area to express my thoughts. ( I think they only care about what else they can sell me or charge me.) If I am forced to complete one, I enter all lowest answers. Hopefully their analysis software will catch the abnormal response.

  21. James V Reagan Says:

    Nice article, seems that most companies follow this exact formula and have been since the beginning of time.

    Related to #13 (slow pages) why do surveys have to put questions on different pages? That’s extra clicks that add no value to user or surveyor. Pet peeve #1 with me.

    BTW, I love the way Netflix does their surveys. Simple question, 1-click and I’m done, they get useful information.

  22. Rob Jones Says:

    They also spelled neither wrong every time, and they used neither/or instead of neither/nor. Can they really not check the spelling and grammar first?

  23. Jared Spool Says:

    @James: Netflix’s 1-question survey is brilliant.

@Rob: I looked that up. Neither/Or is in fact grammatically correct according to my favorite source: Grammar Girl.

  24. Warren Jokinen Says:

    Jared —
    In your copy you refer to wifi, wi-fi and Wifi. Are these all different services? 😉

  25. Jared Spool Says:

@Warren: No, just insufficient editing.

  26. How To Build A Crappy Survey | nomBat | The Blog Says:

    […] United Airline’s drawn-out, unclear, down-right useless (for the customer and the company) wifi internet access survey and turns it into an amazing learning experience. Moral of the story: making your customer feel […]

  27. Todd toler Says:

    Jared, this is fantastic. I’ve put it to immediate use. Especially interesting to me is point #10, as I’m still uncertain on whether these prioritization (or conjoint) questions are a valid way of learning about the appeal of current or proposed website features. I suspect that the ambiguity of language is a problem (for instance, one user’s “search” might be another’s “browse”), but your simple notion of not forcing people to rank the items they don’t care about or understand would certainly help.

    P.S. I feel sorry for the entire travel industry sometimes, stuck as they are under the weight of your relentless and pointed critiques of their services.

  28. All This ChittahChattah | Don’t Bother, Braun Says:

    […] appeared on my Facebook page. (Recent notable additions to the oeuvre include Jared Spool’s 19 Lessons from United Airlines on How To Build a Crappy Survey and Steve’s imagined reaction to a Netflix survey, Effective Concept Testing: Getting the […]

  29. A Simple Customer Satisfaction Survey | Martin Research Says:

    […] One thing that absolutely, positively does NOT make customers happy? Long, boring and repetitive customer satisfaction surveys. […]

  30. Dante Says:

I know there’s a long time span from the original posting in 2010, but United has a much bigger problem than their UI: customer service.

    On June 30, 2012, United didn’t chaperone a 10 y.o. minor traveling alone to her connecting flight despite the unaccompanied minor service fee that her parents paid. What follows is a story of broken customer service, frantic parents, institutional apathy, and zero accountability. http://www.nbcnews.com/travel/10-year-old-girl-flying-alone-united-left-stranded-chicago-942140

Not only did they succeed with a crappy interface; United’s customer experience is utterly dysfunctional to non-existent.
