Steve Portigal – Immersive Field Research Techniques

Sean Carmichael

August 25th, 2011


You can’t ask people what they want. They can’t tell you. The answer is almost always narrow in focus, concerned with the here and now rather than the future. How do you get them to give you the observations you need to design what they will want? Conducting field research to actually learn about your users can lead to innovative new ideas. Going out into the field provides real opportunities to see what the world surrounding your product is like.

Steve Portigal, Principal of Portigal Consulting, is an expert in conducting field research. He understands the value and unique insights that can come from observing your users actually using your product.

Creating a great user experience starts with field research. That’s why it’s one of the 8 workshops at this year’s User Interface 16 Conference in Boston, November 7-9. And luckily, Steve Portigal is presenting the workshop. Get the details on Steve’s and the other 7 workshops at UICONF.com.

Here’s an excerpt from the podcast.

“… If you are trying to change the game in a certain space, that’s well entrenched, you’d better have a more interesting approach to the field than to say, ‘well what would you want to see different?’

You have to be looking more broadly at people’s behaviors and their needs. What are educated people trying to do, and how are people solving problems? What are the entrenched challenges there? You need to use techniques to gather that information and make sense of that information.

It’s not a ladle that you dip into the soup, right? Scoop. Oh, here’s what people said they want. We’re going to go off and do it. That’s never a way to do breakthrough stuff…”

Tune in to the podcast to hear Steve cover these points in depth.

Do you have experience conducting field research? Share your thoughts in our comments section.

Recorded: August 2011


Full Transcript.


Jared Spool: Welcome everyone to another episode of the SpoolCast. I am very excited today because we get a chance to talk with one of my favoritest people on the planet, Mr. Steve Portigal. Steve, how are you doing here?

Steve Portigal: Great. Who can not be great when they’re framed in such a glorious fashion? So, thank you.

Jared: Awesome. Steve, you probably know this, but everyone else might not: you are going to be teaching a full-day workshop at UI16 this year. I hope you know this, because if you don’t, it’s quite a shock, I’m sure.

Steve: DOING. Yes, I know it.

Jared: On immersive field research techniques. I’m very excited about this because we haven’t had a field research topic in many years and I know you and I have been talking about this for a long time. It’s really exciting to see people really interested in field research these days.

The amount of interest has been growing radically. And I was just thinking about this the other day: I think one of the reasons is that no longer can we just depend on incremental improvements, you know, just fixing this feature a little here or running the usability test and cleaning up this dialog box.

Now, for a lot of organizations, in order to really have a full burst into the marketplace, or to really have people pay attention, they have to have something that is completely ahead of what their competition is doing. And the way to do that is to go into the field and see what is being missed by the current products and offerings, and what opportunities are there. I don’t know if that’s what you’re seeing in your work.

Steve: Yeah, very much so. I think that, as you say, incremental improvements, they’re table stakes. As you say, the competitors are going to be doing that at least. Lots of spaces are getting crowded, new spaces are opening up. I think what’s interesting is those are hard problems to solve. It’s not necessarily obvious what to go do in the “white space”.

So companies have these massive trajectories, they have momentum, they’re succeeding, many of them are doing really, really well and using design to kind of optimize within that. I think people know from experience, from recent trends, that that can be a short-lived advantage, that they have to keep working along the “innovation”.

I’m going to put quotes around every six words I say here. Along the “innovation” vector, to try to continue to open up new spaces and stay ahead, or to get ahead.

Jared: Right, yeah. I mean, one of the things that I’ve been seeing is that, you know, someone the other day on Twitter (I don’t remember who it was) posted that if you are always trying to catch up to your competition, you’re always staring at their ass. And I think that if you want to get ahead you’ve got to do some of this research that really gets you out there, really gets you with your customer, really gets you a chance to see the total game that is happening: where the products today are working, where they’re just completely missing the boat, and then designing for that space, as you put it.

Steve: There’s a great sort of, what I think of as a great myth around that, that maybe we can talk about for a sec. Just the oft-repeated idea that you can’t ask people what they want because they can’t tell you, so that if you’re in the kind of business and design challenge that we’re talking about, where you want to break through and innovate and reinvent something, you shouldn’t ask people what they want, because they can only talk about what is going on today.

I love hearing that because I feel like I have a good response to it. It’s a conflation of a few things. One is, let’s just say, looking more largely, doing field research to learn about people; the other is asking people what they want. I think if this is not an area that you’re experienced in, you think those are the same thing. You think the only thing you can do in fieldwork is to say, “well, what do you want?” and then go off and build it.

And most people would say that’s not an effective technique for learning new things. I agree with that on the face of it. If you, you know, are trying to change the game in a certain space that’s well entrenched you’d better have a more interesting approach to the field than to say, “well what would you want to see different?”

You have to be looking more broadly at people’s behaviors and their needs and, you know, what are kind of educated people trying to do and how are people solving problems? What are the entrenched kind of challenges there? And so you need to use techniques to gather that information and make sense of that information.

It’s not a ladle that you dip into the soup, right? Scoop. Oh, here’s what people said they want. We’re going to go off and do it. That’s never a way to do breakthrough stuff. So yeah, I agree when people say don’t ask what people want because they can’t tell you but I don’t agree with the implication of that which is don’t do research to try to innovate.

Jared: Well it’s interesting you put it that way because I took a team out on a bunch of field studies. This is a company that had been in business for six years, they had a very popular product, the customers loved them but they’d never actually been out to watch their customers use their product.

They handle support calls all the time, so they know the things people call in for, and they use it themselves all the time, so they know how they use it. But they had never gone into the field and seen someone use it.

And I asked the head of development, I said, what are you hoping to learn on this? He had a very interesting response. He said, “I’m really hoping to see all the ways that people hack our product, all the ways that they use it in a way that we never intended, possibly because we’ve left some big, gaping hole out there or possibly because there’s a use out there we never thought of. I want to see all of that.”

And I thought that was really open minded for him. It was really sort of out there. And you’re right that sort of, you know… was it Henry Ford? I thought it was Bill Gates who said don’t ask people what they want because if I asked people what they wanted they all would have told me they wanted a start button. Wasn’t that the quote? I think that’s the quote isn’t it?

But it seems to me that there are real opportunities to get out there and just see what the world around your product is like. It’s not so much that you’re asking people what do they want as much as how have they molded their world around the products that are there?

Steve: That’s right. That’s right. In that example, and in your story about your client, you’re kind of talking about: what’s the research question? What do we want to know?

I think implicit in your story is the business question which is: “what do we want to do?” So I can imagine your client saying we want to change, you know… direct resources toward changing the type of engagement we have with users or redesign the platform to take us 10 years ahead.

There’s obviously some strategic question that’s driving that and then the research question which you created with them or they created in their brief to you is a really helpful one. Then I can imagine your method just falls right out of that. Once you understand that there’s an interesting continuum there.

But you need to surface the business question. You need to surface the research question: what do we want to learn that’s going to help us answer that? It’s really important, I think, to draw the thread between those before you get to methodology. I think that takes a lot of expertise, and so asking people what they want, I think, is just the most naive version of that.

Jared: Right, right. And I think, it’s a cop-out type of business work, right? That sort of, “well we’ll just ask them.” It’s up there with putting 1,000 little knobs and customization things into your design. It’s this way of saying I don’t want to take the effort to learn what these people want so I’m just going to put it out there and let them, in essence tell me.

And then whatever they say, I’m going to decide if it’s a good idea or not and then do it. If enough people say it, it must be a good idea.

Steve: Isn’t that called A/B testing?

Jared: Yeah, yeah. I think A/B testing falls into that a lot. Actually, A/B testing has the downside that you never get to hear any why. So all you know is that design 37 beat out designs 36 and 34, so obviously that might be what people want.

The other downside of A/B testing is that people often use the absolute wrong measure to determine what success is. So they do something clever like, “Well, we got that person to sign up,” without ever asking if that person will use or value the service in any way that will be meaningful long-term to either us or the user.

But we got them to sign up. And you know, we promised them money and their best sexual fantasies and they signed up. Guess what? So I guess our dishwasher repair service is now going to be a hit.

Steve: Unfortunately, you know, year-over-year returns or loyalty, sometimes those fall to a different team. That are, you know, trying to… I’m sure you’ve encountered this all the time. We have our loyalty team working on a loyalty product and we have our conversion team working on sign-ups for the product.

And this sort of silos their design efforts, based on kind of slicing and dicing, so to speak, the way that “the consumer” is using their products. Of course, no real person segments their experience in that way at all.

They don’t have a difference between interest, conversion, usage, loyalty. So these companies, I think, are trying to divide up hugely complex problems and you know, apply resources to them to try to own every facet of the experience.

I understand that effort, but certainly when you look at the whys, as you bring up, the why question applies across all these different parts of the experience, and you can’t think of these different aspects of the experience in that vacuum. It becomes very challenging to gain any insight about loyalty without gaining insight into conversion.

So A/B testing sort of proves how to get people to do a thing you think you want them to do. We really like it when we get to triangulate across methods.

Obviously there’s no one method that’s good for everything so you can find great clues, as you say, if you’re asking the right questions in looking at log data or other sort of very observational, kind of objective measurement things. Then you can get some narratives from people through different types of methods that help you understand why.

You’re not asking one research method to solve the other problem but if you put together a whole series of explorations in an ongoing way, you know, then you’re sort of doing intelligence gathering and you can tie that into insights. Then I think you can really start pushing your designs to solve the problems because you have a more broad-based understanding of them.

Jared: Yeah. I think that’s true. I think that’s true. Now, I’m interested in your thoughts around this idea. So a team goes out and they do research and, like you said, they triangulate their research. So they’re doing field stuff and they’re doing some stuff with their analytics and some other methods too.

All of a sudden they’re producing all this data. Observations and analytical numbers and all sorts of things are coming in. And it’s really easy for folks to just say, “Oh look, people who see this screen are more likely to click on this button, so we should design that way,” or, “When we went out in the field we noticed that people kept asking us for this, so we should just build that.”

And I’m wondering if you’ve encountered this sort of immediate jump from observation to design solution without taking time in-between and what you do about that when you see it?

Steve: Yeah. I think that’s a very serious concern and I think so many times we’re setting up an engagement and we’re kind of warned. Our internal gatekeepers say we have to think about how we tell so-and-so what we’ve seen because we don’t want them to go off and start doing things, that there’s kind of trains in the station that are charging up and ready to… I’m butchering a metaphor here, but…

Jared: [laughs]

Steve: They’re ready to burst out of the gates (there I go, I killed a metaphor). They sense that around them, and they’ve seen knee-jerk reactions. I think what we do at the outset of a research project is identify what the milestones are, what the output is, and what they’ll be able to do with that, and try to have that.

Either it’s explicit in the proposal, or it’s part of the conversation that we keep having, because people are hungry. They’re hungry for something. We try to keep engaging them in being in the field and sharing fieldwork stories and sharing early kind of thematic things. But we very deliberately do not say, “And maybe you should do X as part of that.”

So in terms of our structuring of our communications and our reporting and so on, we’re trying to set that frame and set that expectation clearly. I don’t mean that’s sufficient to help structure that. It’s funny. I think of actually a counter example where people did act very quickly on…it’s the low-hanging fruit stuff.

And then when we’re out in the field with something (this is a detour on the way to answering your question, which I think is an important one), they were out in the field watching someone set up a piece of…I guess it was sort of an audio/video/computer hardware product.

The instruction manual explained how they should insert the, it was spelled like L-I-O-N, the lithium-ion battery. But I think it’s actually “Li-Ion,” capital L, lowercase i, capital I. It’s like a very weird word. It’s a technology word. The person that we were observing just kept talking about the “Lion battery,” and of course had no idea what the “Lion” battery was.

Jared: Right. [laughs]

Steve: The person from our client side who was out in the field with us, I think he owned the documentation process. He went back and he took the word “lion” out of it, so it said, “Insert the battery.” It was awesome. I mean, I was just so glad to see that, because they had huge, huge, huge, huge problems that we uncovered through this longer project that required a lot of design effort to solve. But I was just really gratified to see it.

To me that was just low-hanging fruit. It was very obvious what the problem was. There was no sort of ripple-out effect to making changes. It was “Let’s just take that word out.” It was just a nice little edit.

So one thing is that that could be done very, very quickly, like for him to open up a documentation management change order, so that guy owned it. The change was very, very quick. I think it could be rolled out fairly quickly into the next printing or the next run that they were doing of that documentation. So it was an isolated solution to an isolated problem.

Now, the fact is that they had used that kind of language in their documentation, and of course, you know, just having worked on these kinds of projects, there were many, many, many things like that that were much more complicated and twisted, that didn’t necessarily have such easy resolutions.

So I’m not of course excited about people, you know, jumping the gun on everything, but where there were some very clearly actionable pieces that didn’t require the report from the vendor of the research process, fantastic. It was really great. We were clearly being actionable and having impact and giving them examples at a detailed level in a more user-centered way.

That all being said, this was very, very complicated, and it took a lot of work with us all together to kind of unpack the research and make sure that we understood what exactly was going on.

It’s the difference between going from stories and anecdotes and feature requests to understanding something that looks more like a model or a framework or a continuum or a diagram or segmentation. Some way that you can visually or kind of informationally, if you will, include all this stuff together.

I think that’s analysis and synthesis, and that’s a thing that doesn’t necessarily come naturally to designers, who are trained to make a translation between an observed need and a design solution. I love that role the designer’s playing, and I always just want to slow it down. I think we often run into the consequences of the failure to do that, where, you know, these teams are sitting on a multifaceted compost heap of anecdotes and mythologies.

Jared: I’m writing that down. “Multifaceted compost heap.” That is like the best phrase of the day. [laughs]

Steve: They are looking at all this stuff. Depending on what angle you’re looking at, it’s like, “Oh, people want this” or “So-and-so said this.” These myths get created, and it’s hard for anyone in the organization, let alone as a group, to have a coherent sense of what to do. They’re just looking at this pile and seeing a glimpse in the sun at that moment.

People will have these pet stories that get kind of retold. Sometimes lurking within this heap are escalated larger-than-life anecdotes. The name of the difficult angry customer that gets repeated over and over again in every conversation. “Well, so-and-so-and-so’s going to…you know how they’re going to react to that.”

We worked with a company that had circulated a video. It was a kind of transactional tool that was often being used in high-pressure market-changing conditions. There was a video circulating of the hands of this person who was using this thing at a rate that you just wouldn’t believe. You’d think the video was sped up.

And you know, so I think on the face of it they’re doing this great thing, right? They’re circulating a challenging use case, but it reaches this kind of status inside, that it’s larger than life. You know, you’re trying to design for Superman, and part of the story of the Superman is that they can’t be designed for because they are so over the top in their performance.

So, peering into this compost heap, people can kind of, in groups and cultures, create these legends that are trapping them more than anything else. And so you have kind of a divergent mess with these spikes sticking out.

None of it is representative. None of it gives you an integrated, holistic view of what the different types of users are, what their problems are, what the different design strategies are, how to prioritize.

It’s never been folded into anything and kind of elevated up a level where it has structures to it and you can say well there are these different types of folks, here’s how they interrelate, here’s the kinds of problems we’re having, here’s the kinds of issues we’re dealing with.

This is not about… sometimes you’re just trying to reframe what the challenge is. We had a client that was dealing with an easy-to-install product, and it was all about reducing time to install. But when we talked with people about that value proposition in their use context, they talked about smart.

Kind of in the smart technology, smart home, smartphone use of smart. They kind of pushed back on our story: it was about smart. It wasn’t about reducing time. It was about reducing errors and saving me having to go back and fix the install. That was kind of an elevated framework about what is the benefit?

How does this thing that you’re doing fit into the way these people are thinking about their work and what they care about? So, and this is kind of a long ramble, Jared, but it’s about the need to get from this compost heap, this sprawling mess of stuff where people are grabbing onto individual pieces, to something that is more holistic, unified, and, you know, has kind of action items coming out of it.

I’m waving my hands in the air, not that that would help anybody anyway. Doing that aggregation and translation is why (this is a long answer to your question) we do not want people to jump off and start designing things right away.

It’s because we need to put it together into this larger, generative framework, which sounds really smart and hard and time consuming, but it is very doable. It just means you need to allow time in that process for that to happen, and train your brain for that to happen, and defer or parking-lot that jump-to-solution impulse.

Jared: So to defer that, do you use exercises and group activities to get people to take apart the compost heap, as it were, and start to look at the different things and start to measure: OK, we’ve got the guy whose hands are really fast.

But from a bigger picture what are the other users like and are they like that or are they something different and do we have to design for a continuum of things?

Steve: Yes. I mean, the short answer to that is yes. It’s an exercise activity. So, you know, I think about doing research with users. There’s sort of these big chunks of activities. One is planning. We kind of already talked about trying to figure out what your research question is, what your business question is, what your methodology is.

There’s all the planning. There’s all the fieldwork. So this is, you know, whatever kinds of methods you’re using, doing that, and that is very immersive. That gets your brain going and gets you thinking about… You know, maybe you’re thinking about solutions. I think what we try to do is think about people and get stories about people and new perspectives on people kind of in the mix constantly.

And then the next step is analysis and synthesis. So there’s a phase of work that’s OK so we’ve finished being in the field. We have artifacts, we have experiences, we have video, we have transcript logs, we have reports to go into a new activity that says let’s disassemble all that.

What are even the axes along which people are engaging? What are the factors? What are the extremes? To dump all that and then to start to collate and organize and structure and prototype frameworks. Oh, it seems like there’s a relationship between the maturity of the user and the feature sets that they’re using.

That’s not a very good hypothesis, but it seems like there’s something here where there’s a relationship between these different factors. And it looks like, oh yeah, here’s how it breaks down. Well, no, that’s not actually true. Maybe it’s a relationship between these factors.

You’re sort of experimenting in a guided way. Your gut is driving you, your experience is driving you to look for meaningful ways to organize and structure this stuff. That’s the synthesis part. The analysis is sort of decomposing these stories and these fieldwork experiences into these elements and the synthesis part is putting them back in a new way and building up this framework.

Jared: I think I have an example of this. I was talking to Kate Brigham the other day from PatientsLikeMe, and she was telling me that one of the things they’ve noticed was that when people first start using PatientsLikeMe (you know, these are folks typically with chronic illnesses, or they’re a caregiver to someone with a chronic illness), they’re very much about just trying to discover who else out there is like them.

And they’re just, you know, putting in their data about how they’re feeling, and they’re getting back data about how what they’re feeling relates to how other people have felt in similar situations. They’re very much focused on that data and sort of initial connection sort of stuff.

Then they learned that as people use it more and more their attention shifts away from the data stuff and more into the community aspects. They start to have friends who they communicate with on a regular basis through their messaging capabilities. And those people, it’s less about their disease and their complications and it’s more about having these connections and these friendships so the functionality has to shift to this other stuff.

That was something that they observed in studying their user population that there was in fact a change in functionality as people sort of matured with the service.

Steve: That’s a great example, and you can imagine folks not as talented as Kate sort of looking at, you know, some — it’s the compost heap again — some set of data, some set of feedback, and not understanding what it is that differentiates user input A and user complaint C to understand what to do about it. The logic is, half of our customers want this kind of change, and half of our customers aren’t using this other piece.

Jared: Oh, yeah, well their community members are very, very vocal about the design of the site and tend to discount those things that are aimed towards new members, because they’re long past that. So the types of complaints they get in are very biased towards those more mature feature sets.

Steve: And now that they’ve built this — I’m going to call it a framework — there’s a journey of a user through their relationship with us. As you say, they have different expectations and different interests, and so there are different designs. The design solution isn’t “put a feature in for this use case or that use case.” It’s creating a product that changes with you, or that evolves, or that you can find your way through.

I mean, there are a number of specific types of design solutions that can obviously be built from that, but you’re trying to take those observations and those inputs and build that larger story, at which point there’s some strategies. We’re doing a lot of — in our hand waving conversation here — presumptively on Kate’s behalf…

Jared: Right. [laughs]

Steve: We’re still far from creating design solutions. We’re building what the problem is. I mean, it’s not “we.” Kate did this, but, you know, I’m advocating for what she’s done. Her team has identified what the design problem is, which is completely different than where they might have started if they were looking at the different types of inputs they were getting — the feedback, the observations, things that weren’t being clicked on. “We have to make this button bigger because people aren’t clicking on it enough.” Those are naive types of design solutions, because they don’t reflect the deep understanding that you’re relating that Kate and her team have produced.

Jared: Yeah, I think that this idea of slowing down and really trying to understand the problem before you jump to the obvious solution, so that you can get a deeper perspective, is really a valuable thing that separates teams that are really good at what they do from those teams that are just trying to be reactive to the world.

Steve: Agreed.

Jared: So one of the things that I hear from folks when we start to talk about field research is that they get really anxious about having to spend large amounts of time in a customer’s home or in their workspace, actually talking to them, versus from behind the safety of a double-paned, one-way mirror with acoustic tiling, so that they can’t hear you giggle. That idea of being right there — that’s hard for a lot of folks. Do you find that?

Steve: I find it for myself, sure. I think the first interview or first sort of thing that we’re doing at the beginning of any study — I’m just incredibly nervous. You know? And I think this has a lot to do with confidence.

It has a lot to do with personality type. I’m an introvert. I think this puts me into an uncomfortable situation. The first one is very scary for me. I actually remember having breathing problems a year ago — not that long ago, and I’ve been doing this for a long time — kind of like going up to the door.

But you know, by the end, I mean, it’s kind of like riding a bike for me at this point, but definitely I know what that nervousness at the beginning is.

Jared: See, I thought it was just me.

Steve: We definitely can relate to people that are having their nervousness, right? It is — and as much as you plan for it, it’s always going to go in some way that you don’t expect. You know, I hate to be flip, but I think there is a “just do it” aspect to this.

What’s the worst that can happen in these kinds of situations? If the worst that can happen is you get murdered, well, there’s probably some reasonable planning that you can do in terms of screening your participants to keep that from happening. I guess the other worst thing is that you piss off a valued customer. I have worked with financial institutions where we’re not just going to consumers, we’re going to their customers, and it was really charming and eye-opening to see how they kind of brought a customer service mentality to research, and I think it was really effective.

It actually helped — it gave them a framework to work within. We’re not going out there to fix their car or something. We’re not performing a service but we’re learning about them in order to advocate for them, in order to do a great job for them, we want to represent them to the institution, we want to represent the institution to them, and that I think was a big part of their culture. I think that was really kind of helpful.

So, you know, they didn’t want to piss off customers and have them leave that relationship. So those are worst-case scenarios that you can be prepared for.

I think you can get no information or get misleading information, and that’s kind of a risk that you’re taking. And this, I think, is where practicing, and pulling down the information that’s available out there about best practices, which you and I are making available and which are everywhere in our field, are ways to mitigate that.

Trying it and reflecting on it — there’s lots of learning that you can do. Watching videos of yourself doing interviews (I just cringe every time I do that), or reading transcripts and seeing the stupid things that I say or the bad ways that I ask questions.

Doing it and just having the experience and then reflecting on it, debriefing with your colleagues, talking about what you’d do differently. Treating it like a learning process, you know, I think it’s something — and also, I’ll just say that being uncomfortable or scared or out of your zone — it’s not the worst thing in the world. If it is, it’s 90 minutes in somebody’s living room; you can, you know, just bear down and say, “OK. In 91 minutes I’ll be done, and I’ll see what I’ve learned from that.”

Because I can think of any number of times that I’ve just been uncomfortable or confused or had my own view of the world pushed on. Some people like to go on rides or go see horror movies. Those are, you know, getting a thrill by taking yourself out of your physical comfort zone or your emotional comfort zone. This, I think, you can look at it that way. I mean, it’s more deliberate and more meaningful, and it’s not for entertainment, but we do have analogs in pushing ourselves that you might look at this in that way.

Jared: Yeah, I think that there’s definitely something to that, and I think that also planning and practice just make it a little easier, I think, for folks. I’ve found that having a good plan as to what we’re going to ask helps, so you’re not feeling like you’re improvising from nothing the entire time you’re out there. Of course, you want the conversation with the participant to be natural, but knowing what the goal of the session is and where you want to hit and what points you want to touch on helps a lot.

Steve: I totally agree. I mean, preparing a plan, writing up a plan, creating kind of the archetypal interview. This section takes 20 minutes. This section takes 30 minutes. We’re going to have these props or these activities for 40 minutes here. Checklists — I mean, this, again, depends on your personality type, but what does it take for you to feel confident?

Even if you don’t do anything like that, you’ve at least mentally prototyped what you think the session is going to look like. You’ve got buy-in, so you know that your colleagues are confident, because they’ve had input into this. Some people I know do a pilot — because you’re basically hypothesizing that you can have an in-depth conversation with these exercises and these topics in this amount of time, and that you know how to ask about that stuff. So, whether it’s a colleague or a friend or family member, before you go out into the field, do kind of a participant number zero, and try it out.

Jared: My former colleague, a guy who helped us with a lot of our statistics and things in the early days, Will Schroeder, always used to say, “If you don’t do a pilot or a rehearsal, your first session becomes your pilot or your rehearsal.”

Steve: That’s so true.

Jared: I found that to be very much the case, so there’s a lot of value, particularly if it’s a really important set of sessions and if you’re really concerned that every session go really well, because either each participant’s really important or the people who are observing are really important or you just have so few that you have to make every one work. That’s when a pilot or rehearsal really, really plays an important role, I would think.

Steve: And sometimes, you know, these are hard people to get to, so the first one has to be your pilot. But if you build in iteration — I mean, we were meeting with bankers recently and had lots of aspirational methods, and we had Post-It notes and we were going to do timelines and get people to rank things in one or two interviews, and we basically didn’t even succeed in deploying any of our plan in the first interview. By the second interview, we were like, “OK. We’ve got to throw a lot of this out,” but we kind of sat down and talked through, “Well, what does the iteration of our plan look like?”

So, it’s great to have a plan, and we weren’t — you treat it like a hypothesis that you’re testing. I think those first two or three sessions were extremely valuable because we learned what the topics were when we kind of went with it and we weren’t trying to force our guide on them, but we had an overarching architecture, I think, to try to work within. We then were able to just rebuild very quickly after those.

Jared: Yeah, I find that in our research that we’ve done to be the case. Tell me if you find it the same way, that as you do more sessions, the sessions mutate because you’re seeing some patterns and you want to explore them more, because they’re slightly different than the ones you saw before. You’re more attuned to things that are new and different, and maybe they’re things you haven’t covered before that you want to see if you can invoke in the session to see if you got any new responses to it. So there is a metamorphosis that happens.

Steve: Absolutely, yeah. There are questions that you can’t quite get answers to, so you’re trying a few different ways and you’re re-purposing bits of improv from one into the next. If there’s sort of an issue here, it’s that this is challenging for people, and how to help them feel more like, hey, they can go ahead and do this. This isn’t so scary. Maybe just acknowledging that it is iterative, evolving, “metamorphisizing.”

Jared: [laughs] “Metamorphisical.”

Steve: Yeah. [laughter] And that you want to allow for that. You want to debrief. If you had five minutes to debrief about every interview, it would be, I think, two questions. One, what did we learn that surprised us now? And how would we handle the next interview differently? So that you’re debriefing on content — what are we learning? — and on process — how are we doing this? — as much as possible. Those are the two big things to really think about at the highest level.

Jared: That makes perfect sense. Well, everyone’s going to learn about how to do this stuff and get more comfortable with it in your full day workshop. I mean, you’re all going to go out and actually do some field research and then come back and analyze and synthesize the results, and I think it’s going to be a lot of fun. It’s going to be really a very cool day.

Steve: I think it will be great. I think people will be, hopefully, excited and surprised by how far we can get in a day in terms of playing with many, many aspects of this process.

Jared: So if you all want to come and hear Steve, what you need to do is go to the User Interface 16 Conference website, which is uiconf.com. The conference itself is going to be in Boston November 7th through 9th, and we’re very excited about it. It’s going to be a lot of fun. Steve, thanks for taking the time today to talk about all this stuff. This was a lot of fun.

Steve: Thank you.

Jared: And I want to thank our audience for listening, once again. It’s great to have you along with us. And as always, I want to thank you for encouraging our behavior. Take care. We’ll talk to you next time.
