Ep 12

Don't Trust Your Product Survey

Learn why new product survey questions can lead you in the wrong direction for your product, and what you should be doing instead for better customer research methods. Tune in to Exploring Product with Ryan Hatch and Robert Kaminski of the Headway Product Strategy Team as they break down the most common questions teams ask on surveys to figure out what's next for their product, and the reasons those questions can be misleading.

Presented by
Host
Ryan Hatch
Head of Product Strategy & Innovation
Host
Robert Kaminski
Senior Product Strategist
Guest
Andrew Verboncouer
Partner & CEO
Transcript

Rob Kaminski: Hey everybody. Welcome to Exploring Product. Ryan, we are getting to the end of the year. It's December, we're on our 12th Exploring Product podcast episode, and today we're talking about surveys.

Ryan Hatch: That's right. Well, I can't believe it's already been a year, our 12th episode of Exploring Product.

One of our goals was to do one every month, and we're here. I can't believe it's been a year already. We've had some great guests on, and we'll continue to do so. We're excited for this. Thank you so much for joining us today. Merry Christmas, everyone. It's December, and today we're talking about surveys.

Why are we talking about surveys? 

Rob Kaminski: That's a good question. I think we got onto surveys because a couple of things came to mind here. We get questions on surveys all the time. Sometimes people actually bring us surveys, and I know you're going to tell us a story about that as well. People are using surveys to understand their customers.

They're using them in product environments. And so we get asked, how should I structure my survey? Or, well, my survey told me this. So we felt it was a good opportunity to chat through some of the implications of using surveys. And you can tell by the title we used, you can almost get a sense of our opinion.

Don't trust your survey. There's some caution coming in this topic for sure. And in addition to getting questions, we see what happens when surveys get misused, a lot in the startup world, but especially when launching new products. Wouldn't you say, Ryan?

Ryan Hatch: Yeah, I think, you know, one thing we're going to talk about a lot today is right tool, right time.

And one theme you'll hear us talk about is that while we're not against surveys in the right context, there's a large temptation to grab the things that are easy to grab, the things that are right in front of you. And a survey is often one of the easiest things you can do.

Hey, I've got everyone telling me different things. Sales is telling me one thing, customer support is telling me another, I have my internal team. It's like, well, how do I filter through that? And I have my own take, but, oh, I know: voice of the customer. Right? Heard that before. Let's just have the customers tell us. Let's just go ask them.

And I think there's this temptation to just reach for a survey, whatever platform, SurveyMonkey comes to mind, just send it out and we'll get our magic answer. And I think...

Rob Kaminski: I was gonna say, when you lay out that scenario, the intent is good, right? Go to the customer, voice of the customer.

We need to learn from them to make some of our decisions. So nothing you said there was bad except surveys. We talk about them being almost too easy, which is what you hit on: are you getting the right information from the customer in the right way to make that decision? I think that's something that's often missing.

Because you're doing it digitally, right? With a survey you're going from a human thing into something that becomes quantitative and, to some extent, binary, and I think there's a real risk of missing the why behind customers' behavior. And I know we're going to unpack that a bit today.

Yeah. 

Ryan Hatch: Yeah. Abstractly, what you're doing is you're abstracting the market out into some kind of data structure, so it's you, the data structure, then the customer, and you're putting this thing in between you and the people you really care about. I mean, the intent is good, but oftentimes we see that it's the wrong tool for the job.

It's the wrong thing to grab for, even though it might be easy to do. And it might be like, hey, maybe you already have a gut feel for what we need to do, and you just need evidence to back it up. I just want a big slide to prove to my CEO or whoever that, hey, we should do this.

See? It's just evidence building, and yeah, maybe it's the right tool for that job. If you've already made up your mind about what you're going to do, a CYA, something to push your agenda forward, then maybe it makes sense. But a lot of times, if you actually have trade-offs to make, if you're actually trying to figure out, what do we go do?

What do we prioritize? Where should we play in the market? Any of those questions, it's often the wrong tool for the job. And what we don't want to have happen, one thing we talk about a lot inside Headway, Rob, is that bad data is worse than no data at all. Right? Because bad data is going to give you this false sense of confidence, this false security that I'm headed in the right direction, when really you're walking on really thin ice.

It looks like there's a safe place to walk there, but really, surveys can often lead you in the wrong direction. We've seen that.

Rob Kaminski: We have. First, I actually want to scare some people a little bit with some of these horror stories. There are two that came to mind as we were putting this together for how we think about surveys, stories that stood out to us and shape when we use them,

and if we use them. One of the tools we actually use is a great video by David James that goes deep into, really, research in general. He gets into methods, methodologies, and ontologies, and how they all fit together. We use it in our onboarding process, Ryan, when we bring on new folks, to understand the different approaches to research and how they fit together, because there's a ton of nuance to it.

At the highest level, he brings up a great example of two research approaches to the same exact research question. There are these two researchers, I believe it was at a university, I'm forgetting the exact context of the study. One of the researchers went and dove into surveys, right?

They went right into a quant approach to how they were going to capture feelings and sentiment. I think it was about students working through coursework with their teachers, how they interact and learn. So they went to surveys: how much are they learning, how fast is the course going, all those things. And a separate researcher with the same question immersed themselves in

the habitat of their research study. They went and spent time in the classroom, they interacted with the students and the teachers, and it was more of a qualitative approach. And what David James went on to show is the two completely different conclusions that were reached at the end of these studies, simply based on the methods they used, one being surveys and one being this qualitative approach. And it's kind of mind-blowing,

because the analogy to that for us is that we're trying to make good product decisions. And if the research question is the same but the methods, the methodology you're using, are clearly different, surveys being one of those, it could take you down a terrible path. So you say no data versus bad data.

Bad data is terrible because you're further along the path and you've already bitten on it. That's a big one that hit home for us, and how we teach against it is to make sure you know what you're learning against, what you're trying to understand. And the why, when it comes to human behavior, ends up being one of the more important aspects, beyond just the what. We'll get into that a little bit more

when we break out surveys.

Ryan Hatch: Yeah. I mean, the thing you're really talking about is, we believe that anytime, as a CEO of an existing company, as a CPO, a product person, or an idea-stage founder, anytime you're running product, you're trying to figure out, well, what do I bring to market?

And we've talked about, Robert, how product strategy, product management, is really all about asking a series of questions. Every day you're just answering a different question; the question changes. And so the question is: what's the research question today? What question are you trying to answer today? Let that be the focus. But to Robert's point, how are you going to go answer that question?

There are many, many ways to go answer that question, which brings you to this David James example, which is: what do you believe? It's actually a belief system. What do you believe the right approach is to get to reliable data?

Reliable customer insights, reliable market insights. So the first thing is, are you asking the right question? That's the very first thing. The second thing is how you go about answering that question, and your paradigm,

which is the stuff we're talking about now with this David James video. He talks about the paradigm, how what you believe about reliability and how evidence can be gathered changes your research approach. And we've just seen that surveys can lead you to a totally different conclusion than, say, living with these master's degree students to see what their experience actually is.

The researcher went to class with them, went to lunch with them, hung out with them for like 30 days to really immerse in that experience. And it's a totally different conclusion to the same question. Well, how could that be? How could you end up with two completely different conclusions and understandings when you're asking the same question?

Shouldn't the same question lead you to one answer, one truth? Well, no. The key thing to realize here is that reality exists. Those master's degree students on campus exist. But how we sample for that, how we collect and gather insights, really matters and is going to change your conclusions entirely.

Which is what you have to call into question, like, gee, am I learning? Are my insights correct? Right. Yeah.

Rob Kaminski: When you talk about that, it makes me think, how close are you to the epicenter of your research question? And I think that's something where surveys seem so easy to do,

but a lot of times they're so far from the core of what's happening and what you're trying to answer. So you're trading understanding and effectiveness for speed and efficiency. It's kind of like speed versus velocity. You might go fast, but for all you know, you just ran in a big circle and you're in the same spot, whereas picking the right tool could move you along the path with velocity, to actually get you somewhere you're trying to go.

And so we're going to talk about this a little bit as we unpack what's good and what's bad about a survey and why this happens. Before we do that, Ryan, I know you have a great experience around a project we worked on where surveys were a big part of the initial impetus for taking a certain direction.

So maybe you could share as much as you can on what that looked like and why they ran into those challenges.

Ryan Hatch: Yeah. What we're trying to do here is compel you to rethink: maybe I shouldn't actually trust the data I'm getting back. And that's what we're showing with this first example, same research question.

One is actually living in the market, one's doing a survey, totally different answers, which would lead you in completely different product directions. Whereas when you have no data, it actually makes you question yourself, and I think we should be continuously questioning where we're getting our insights from and how we're coming to conclusions.

The other story, Robert, that we're talking about here, I'll tell you an example of when you blindly follow a survey, the catastrophes that can ensue when you're just taking it for what it is. We've seen product people, we've seen entrepreneurs,

come to us with surveys they'd already done before they came to us. Hey, we had this idea, so we hired a marketing firm and we spent 50 grand doing a market research survey. Okay, great. In this case it was a phone survey,

which doesn't matter, phone or not. They're collecting all this data and sampling the market, asking these questions. It's a survey over the phone, right? So yes, no, this category, that category, more satisfied, less satisfied, whatever buckets are being presented in this thing.

And the core conclusion that came out of this, because the founder was really asking themselves, should we take this direction with the product or that direction, there were multiple directions they could go, was, well, what does the market say? So the phone survey comes back with this conclusion that, oh, people really want B

and not A. I won't get into the specifics, but the tempting thing about this also is, wow, it's going to be statistically significant, right? With a p-value of whatever. And we're going to have real confidence that they really want product B, or feature B.

That's really what resonated with the market on this survey. And then what happened was they went to investors and raised money on this story and narrative that they already knew what the market wanted. Went out, raised the money, hundreds of thousands of dollars, built the thing, ran ads all over the place.

No one signed up. Nobody. And it's like, man, okay, well, gee, but the marketing firm told me I'd have like 40,000 customers by the end of the year. What gives? And then it's like, well, maybe we just need to fix the marketing side of it. The problem isn't the product; the survey said so, right?

Literally, it's like, hey, let's reframe the marketing, let's add more pizzazz, we'll do a video. Maybe we just need to communicate the offer better. And going through and doing all of that, not to mention licensing and building the product. The point is, the phone survey led to this huge, long investment circle, product investments, marketing changes, all these things, only to come back and it still didn't work.

And it was all based on this false premise that, wow, the market really wanted product B and not A, which was the question this product person was struggling with. So we think it's really, really dangerous to assume that because you're sampling the market, statistical significance means you can predict what the market's actually going to do, predict behavior, and build your whole company strategy on it.

Rob Kaminski: Let's talk about that. We're basically coming from the premise that surveys are ineffective. They can be effective, and we'll share some areas where they might work. But when you tell that story and I think of effectiveness, it reiterates to me that surveys do an okay job of capturing desire, but not a good job at all of capturing behavior

and what actions will actually happen beyond that. That's what came out in your story. And in that piece, to me, I'm going to jump ahead, but it's the say-do gap, right? I don't take that many surveys, but when I do, a lot of times I'm almost portraying this ideal self, like the best version of me would pick this, or I know I should do that.

Or, oh, they're asking me this question because of this, so I'm going to answer it in the way they expect me to answer. And sure, you can learn some things about me from reading my response. But once you've put me in that situation, the "should do" or "would like" goes out the door versus what I actually do.

And one of the key reasons I tend to steer clear of surveys is that I care more about what someone's actually going to do than what they say they're going to do. That's why there's such a science to customer research, when you're sitting in front of someone, to really filter through what they actually mean versus the words coming out of their mouth.

Ryan Hatch: Yeah. There are so many reasons why putting an abstraction layer, this data layer, between you and the actual customer doesn't work. You're trying to put this translation layer in between. Well, you've got to actually design the questions in a certain way.

Well, they don't understand it, or they understand it differently than what you're intending. So there's this impedance mismatch. There's also the say-do gap you're talking about, the "would you do this?" questions. Behavior is real, right?

We talk about this quite often, Rob, internally and with clients: New Year's is coming up. This is the year I get fit, this is the year I get healthy. I got a treadmill, I'm going to get up every day at 5:30. This is the year I do it.

I get my 20-year-old self back. And then March comes and the treadmill is sitting there with a laundry basket on it. It becomes just a piece of furniture in the basement or whatever. And that's so true. Just because we have these hopes and expectations and would-be wishes for our ideal self, it's very different from what people actually do in their real, everyday life.

So there's a big say-do gap. And another one is, humans aren't computers. We try to say, oh, if I could just convert people into data, then I could just look at the data in an Excel sheet. Wouldn't that be great? Yeah, that would be great. It's just not real.

It's this really false belief system that you can actually turn people into data. That's not true. If I ask you, Robert, we did this the other day too, what did you have for lunch yesterday? Okay, you could tell me. What did you have for lunch two days ago?

You'd struggle to remember. What did you have for lunch five days ago? I have no clue. What did you have for lunch 20 days ago? Are you out of your mind? No one knows this stuff. So if I ask you, hey, what do you usually have for lunch, how would you answer if you can't remember what you had for lunch three days ago?

We think we can just have people run an Excel formula in their head, some sum, some average, what do you do on average? People are answering in the moment.

Rob Kaminski: The subject is guests. Is the moment you get into bad data territory. Right? There's a lot of these surveys too.

there isn't an "I don't know" or "I don't have a good answer, so please ignore this." No, they're going to plug in an answer. They're going to say pizza because that's what they had Wednesday. And then you're going to go through all the data and say, oh look, there's Ryan, he likes pizza for lunch,

and he does these other things. When really, nope, you've already created this false context to make decisions from, and you can see how that could multiply and get worse and worse as you go. I use this example a lot when I'm working with clients: there's a profile you could build around me with a survey that says I am ready

to buy an electric car, specifically a Tesla. And would I? Sure. But if you put the offer in front of me, I don't have one sitting in my garage. If you asked me questions about even just my profile a little bit, am I mindful about the environment, am I interested in technology,

do I like electric cars, all those things, I probably fit that profile. But my behavior doesn't align with what part of my demographic, or my psychographic even, wants or can do. There are things you can't discover on a survey that would help you piece that together. Ryan, I know you've done some work on surveys

in ways that can be effective. This isn't meant to be a deep-dive course, but take us through some of the things you brought up in our prep session for this, like if we're actually going to make a survey work well, these are the things we need to consider. There's a lot. Maybe you could intro these.

To me, what this highlights is the complexity of actually running a really solid survey study. But walk us through this. What do you think this means for those listening?

Ryan Hatch: So what I want to convey here, and we'll walk through this, is just to blow your mind a little bit on the absolute complexity and deep dive it takes to do a survey super, super well.

It's really difficult, it's super time intensive, there are tons of ways to screw it up, and it's just easier to talk to customers. We're going to keep coming back to that. You think a survey is easy, but no way. We talk about question biases, and we talk about turning people into data. How you ask those questions matters 100%.

Writing non-leading questions is very difficult, especially when you're sequencing questions one after the other. If you ask this one first, it actually makes people more reflective, but they're not normally reflective in their daily life. So you're changing their mental state as you have them walk through these questions.

There's also trying to match people up into categories. The question bias is like, well, which of these four or five categories do you fit in? Well, none, to tell you the truth, but I guess you're going to make me pick one of them. So while we're trying to learn from the market,

what we're actually doing is pushing our mental model onto the market and boxing people in, and that really doesn't work. If you want to learn, boxing people in, drawing the lines artificially, is not a good way to learn. And so people have to ask themselves, well, what are you really asking me?

What are they trying to ask? There are multiple interpretations of these things. And what you think a 10 versus a 3 means is different from what someone else means by a 3 or a 10, and yet you're going to compare those as if they're equal.

There's so much complexity in just the question framing, the sequencing, all of that alone. And it takes a lot of effort to frame that stuff, because you only get one shot. I'm not just going to go do another survey, so I have to ask all the questions I ever wanted to know.

Well, I'd better, right? So there's a lot of upfront work to actually design this thing. Then you have usability testing. Not many people do this, but it's the right way to do it, because we know there's this impedance mismatch between what we think the market would answer and how they actually interpret it.

We're turning people into data. So what you actually have to do is usability testing, just like we would do in software, Rob, where we'd sit someone down or do an observational study and have them talk aloud as they're taking the survey: why'd you answer this and not that? And then they'd say, I don't actually fit into those categories.

You should add a fifth one. Oh, tell me more about that. And you realize your questions are framed wrong, they're phrased wrong, and the options are incorrect. If you don't do usability testing, the data goes out the window. Then there's sampling versus population. How you sample matters, because you can't ask everybody in the market

to answer this thing. So you have to sample. Well, now you need a representative sample. Am I doing random sampling? Am I intentionally skewing the sample one way or another, which we have done on purpose, because we wanted to understand, well, how do high-NPS people

think differently than low-NPS people? And to do that, you have to have this contextual understanding; you have to match your user database up to the survey. It gets really complex. Now, assertion testing. Assertion testing is, when the data comes back, you have to know which answers to throw out.

A lot of times you're paying people to take surveys. Some people just take it for the money, for the wrong reasons. We call them speeders. They'll just answer A, A, A, or whatever it is, just to get through it. Oh, I got 30 bucks at Starbucks, right?

So you have to filter them out. And then we take different questions that we know are asking the same thing in different ways, where the answers should be consistent. If they answer this here and that there, that makes sense, but they shouldn't answer this and that together. So we actually build this assertion testing, or unit testing, algorithm

so we can throw out people who are just checking

Rob Kaminski: Out halfway through, which I know I've been guilty of. I do the first 10 questions and realize there are 40, and then it's like, okay, all that focus on actually trying to give an opinion back goes out the window.

Ryan Hatch: Yeah, you get survey fatigue too, right?

You might answer the first 10 questions at a hundred percent, then by question 35 or 40 you're like, I'm ready to be done with this thing. So there are many reasons you have to weight it, which means you have to fit the sample to the population. Pre-analysis, we do a whole bunch of weighting and scoring and segmentation work

so we can segment, so we can set it up for analysis. And then you spend all this time in Tableau. There's just so much. It's

Rob Kaminski: Hard. Is it worth it? I see this list and the nuance it takes to actually execute it, and to me, we talked earlier about how surveys are for speed, but

I see time. I see the opposite of speed in working through these things to make it effective.

Ryan Hatch: It's totally true. I mean, there are instances when this is the right thing to do, when you're making really, really big decisions.

But one thing we'll talk about in a little bit: you can't do this survey first. There is no way you can design this without first being intimate with your customer. No way; this thing will totally fail and be worthless. But you're right, you think a survey is easy, I'll just reach for that.

But then you look at this and you're like, that sounds like a ton of work. And it is a ton of work, and you know what, it's not just you, the product manager, the product person, doing this. You've got to get your data person in. You've got to export all the current customers and match them up with this.

You've got to do assertion testing, you've got to do weighting, and there's a whole bunch of data flows you have to build to actually get quality data out. And when I've done this with companies, it's like, do you really need my data architect, my engineer, to pull all this and match it? Yeah, I do, if you want the real answers.

So it's not just you, it's multiple resources. It's a huge lift to do this well.
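To make concrete how much machinery Ryan is describing, here is a minimal sketch of the kind of cleaning pass a survey dataset needs before anyone reads the results. This is not Headway's actual pipeline; the file name, column names, thresholds, and segment shares are all hypothetical placeholders.

```python
import pandas as pd

# Hypothetical survey export; a real study's columns will differ.
responses = pd.read_csv("survey_responses.csv")

# 1. Drop "speeders": respondents who finished implausibly fast.
MIN_DURATION_SECONDS = 180
responses = responses[responses["duration_seconds"] >= MIN_DURATION_SECONDS]

# 2. Drop straight-liners: people who gave the same rating to every
#    Likert question (e.g. all 5s just to collect the gift card).
likert_cols = [c for c in responses.columns if c.startswith("q_likert_")]
responses = responses[responses[likert_cols].nunique(axis=1) > 1]

# 3. Assertion / consistency test: two questions that ask the same thing
#    in different ways should roughly agree. Here q5 and q12 are assumed
#    to be reverse-coded versions of each other on a 1-5 scale.
inconsistent = (responses["q5"] + responses["q12"] - 6).abs() > 2
responses = responses[~inconsistent]

# 4. Re-weight the cleaned sample so each segment matches its (assumed)
#    share of the real population.
population_share = {"smb": 0.60, "mid_market": 0.30, "enterprise": 0.10}
sample_share = responses["segment"].value_counts(normalize=True)
responses["weight"] = responses["segment"].map(
    lambda s: population_share[s] / sample_share[s]
)

# One example output: a weighted mean of a satisfaction question.
weighted_satisfaction = (
    (responses["q_likert_satisfaction"] * responses["weight"]).sum()
    / responses["weight"].sum()
)
print(f"Weighted satisfaction: {weighted_satisfaction:.2f}")
```

Even in this toy form, every step hides a judgment call: the speed cutoff, which question pairs should agree, what the true population mix is. Get any of them wrong and the "statistically significant" number at the end is built on sand.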

Rob Kaminski: So Ryan, we did have a question come in from LinkedIn. I think it's an interesting one for what we're talking about. There was actually a comment and a question with it. The comment: per our discussion, it seems you've been hitting on Christensen's innovator's dilemma, the delta between a company's vision and what customers are asking for.

And then they go on to ask: how does usage of features within a product play a role when considering survey results? They assume that usage reigns supreme, but they're curious about our opinions on this. I definitely have a thought here. You want to take the first stab at this one, Ryan?

Ryan Hatch: I mean, sure.

Yeah. We're talking about a lot of Clay Christensen's stuff we were already touching on earlier when we talked about the research question, because Christensen talks so much about how a question is the place in the mind where an answer can live. And so that's the thing we want to communicate here:

are we really asking the right question, and then, what are the different ways you can go answer it? Surveys are one of those ways, and we're trying to say it's often the wrong way to get there. Now, as far as what you're asking here, how does the usage of features within a product play into considering survey results?

We'd always say that behavior, actions, speak louder than words. So if you have an existing product and you have people flowing through it and you're trying to understand product usage, I'd start with the data itself. Start with Mixpanel, start with the actual behavioral events flowing through the product.

What do people actually do? How far are they getting? Are they getting stuck at a certain point? Okay, people are getting stuck here, that's interesting. That becomes a research question: why are people getting stuck here? And then you already know who those people are.

You can do an export out of Mixpanel or whatever and figure out, I should go talk to those users. I probably only need to talk to five of them to figure out what the pattern is. So you can learn that stuff really, really quickly, and that's what I'd suggest. What are your thoughts, Robert?

Rob Kaminski: My interpretation is very similar.

I think it's a great question, and where they lean, saying usage reigns supreme, I think is pretty spot on. The way I think about it goes back to behavior: usage is closer to behavior, but there's still that digital filter in front of it. And so why are we even looking at usage? To me,

it's to inform decisions that we can make. So when I think about pairing survey and usage data, to me it's usage first, what's actually happening, but there are going to be gaps. You're not tracking everything in your product, your experiences, your activation and acquisition channels.

Where there are gaps, to me there are two routes. Go talk to the people who you think sit in those gaps, but you still might not be able to get to them. They may be part of a flow where you don't really know what's happening or what's breaking. That's where you could insert a survey with a specific purpose, to fill that data gap where you have some behavioral usage information but you don't really know what's happening in there.

You can ask questions in that path; I'm thinking a little bit more in terms of app usage or experience flow. And when you fill that in, you're almost lead-generating the conversations you should be having. Ultimately you need to go understand the why, but you can probably uncover some of the nuances associated with that behavior.

So if I were to summarize, it's really using usage data as a place to find opportunities to narrow a survey, to make it apply in a way that's super targeted, in order to inform your ability to get in front of those customers and really learn and understand beyond that. That's how I think about applying the two together.

It's a great question. Thank you for that one, Chris. 
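As a rough illustration of the "start with the behavioral data" approach from Ryan's answer, here is a minimal sketch of spotting where users stall in a funnel from an exported event log, and pulling a handful of stalled users to interview. The CSV layout, event names, and funnel steps are hypothetical placeholders, not a specific Mixpanel export format.

```python
import pandas as pd

# Hypothetical export of product events: one row per (user_id, event, timestamp).
events = pd.read_csv("product_events.csv", parse_dates=["timestamp"])

# The funnel steps we care about, in order (placeholder names).
funnel = ["signed_up", "created_project", "invited_teammate", "upgraded_plan"]

# Which users reached each step at least once.
reached = {
    step: set(events.loc[events["event"] == step, "user_id"])
    for step in funnel
}

# Step-to-step conversion, to see where people get stuck.
prev_users = reached[funnel[0]]
print(f"{funnel[0]}: {len(prev_users)} users")
for step in funnel[1:]:
    current = reached[step] & prev_users
    rate = len(current) / len(prev_users) if prev_users else 0.0
    print(f"{step}: {len(current)} users ({rate:.0%} of previous step)")
    prev_users = current

# The biggest drop-off becomes the research question. Pull a handful of
# users who hit the earlier step but never the next one, and go talk to them.
stuck = list(reached["created_project"] - reached["invited_teammate"])[:5]
print("Users to interview about the drop-off:", stuck)
```

The useful part is the last step: the quantitative data only tells you where to look, and the why still comes from talking to those five people.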

Ryan Hatch: Good stuff. And we'll touch more on some of the contextual survey stuff Robert is talking about a little later. Yeah.

Rob Kaminski: So this next piece here, we started talking through some of the other pitfalls of surveys, and we've talked a lot about behavior.

The other issue is they're a snapshot. Companies do this with NPS, both internally and with their customers, and it's only a one-time thing, or maybe you do it every month or quarter. It's still not often enough to really grab a picture of what's happening, which is really concerning to us.

We as humans, people who use products, we're irrational and dynamic. Our opinions change, our experiences change almost by the minute. So something that captures data in such a slow way, without the clear picture, is super concerning to me. I think there's this tendency,

and I want to go back to why people reach for surveys: speed. Oh, I can get access to a ton of customers really quickly. I attribute it back to this sort of core fear people have of interacting with their customers or prospects. Even if you're a seasoned entrepreneur, there is this really difficult challenge of just getting in front of someone you don't know to talk about topics that are sometimes difficult to discuss, or where there are unknowns.

You don't even know exactly what you should be talking about. And I see surveys as this replacement: oh, we can get to everyone quicker, it'll be better. But it's a fallacy; it's not better at all when I think about the difference between interacting with someone face-to-face, even if it's over Zoom, versus just pulling those individual data points. What comes to mind here on these pieces, Ryan?

Ryan Hatch: Right.

Yeah. I mean, we're talking about these cadence surveys or quick surveys here. Do I think you should do NPS? Sure, why not? Go ahead and do it. But so many people rely on it so much. I would not rely on NPS for a lot; to me it's just one more data input, and I would not put too much leverage on it.

We've seen this fail. Companies will do an NPS thing, and maybe they send it out every six months, or every year, but you should be learning at a much higher cadence than that. If you're relying on this, it's not enough, unless everything is just five nines.

And even then, we've seen NPS come up very, very short, because what's happening is, let's say you're pinging someone every six months, or maybe once a year. You can't ping them every week or every day, so you ping them every six months, and what you end up with

is a very low-resolution picture. It's very pixelated. Your resolution on the pulse of the customer is so, so low if you're only getting a pulse every six months. And the thing is, that NPS score is a data blip. It's not the day-to-day actual experience they're having with your product and with your business.

So we've actually seen NPS skewed, because it's like, well, I've had all these problems with the product, but I really like my account manager, I really like my salesperson, I really like the relationship I have with this company, or with this person. I don't want to make them look bad,

so I'm going to answer a four or five, or a seven or eight. You're not measuring the experience, you're measuring a relationship, but you don't know that as a product person. You don't know the lens they're looking through. So these are lagging indicators that aren't continuous, just really low resolution every once in a while.

Sure. Should you do it? Yeah, sure. But it's not going to tell you very much about what to do. 
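For anyone who hasn't computed it by hand, NPS buckets each 0-10 answer into promoters (9-10), passives (7-8), and detractors (0-6), then subtracts the detractor share from the promoter share. A quick sketch with made-up scores shows how two very different customer bases can land on the same number, which is part of why a twice-a-year NPS blip says so little on its own.

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

# Two hypothetical batches of responses that produce the same NPS.
steady_fans = [9, 9, 10, 8, 7, 8, 8, 10, 7, 8]      # mostly content, no detractors
polarized   = [10, 10, 10, 10, 10, 2, 3, 8, 7, 10]  # love-it-or-hate-it

print(nps(steady_fans))  # 40.0
print(nps(polarized))    # 40.0 -- same score, very different customer bases
```

Same headline number, but one of those groups has a fifth of its respondents close to churning; the score alone can't tell you which situation you're in.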

Rob Kaminski: Right. There's so much bias in that. And with the story you told, the experience that comes to mind for me: at a startup I was at in Denver, we used to do internal NPS scores. It's so funny looking back on it, because when they ask you to fill them out, they don't just ask, they promote a positive score.

Almost like 

Ryan Hatch: Fill out our internal NPS. It's one of our key goals.

Rob Kaminski: Like 

Ryan Hatch: we want to make 

Rob Kaminski: sure we're doing great for everyone internally. And they're pitching me to be a nine or ten, right? When it's like, wait, are you using this to learn where we actually sit, or is it kind of a vanity metric that you're going to use externally?

And certainly it was the latter. It was something they wanted to tell: man, look at our internal progress, everyone's so excited. And as you said, the same thing can happen with customers: oh, we've been working so closely, we're so excited, please fill out this NPS, we want to get better.

You're telling them to score high. You're already in front of them; you don't need a score. Ask them, did we do a good job, and talk about it. You're going to learn a lot more than from just having them fill out a number for you.

Ryan Hatch: Yeah. And maybe you just had a good conversation with them, maybe you pitched them on the future of the product,

you pitched the vision. Maybe their NPS response is based on the hope of the future, not the actual current experience. There are so many different lenses people can be answering through, and it's just not good data; that's the point.

Rob Kaminski: So Ryan, let's try to flip this to a little bit of action or takeaways from what we've talked about.

We're crushing surveys a little bit; it's definitely not a tool we reach for that often. But what would you recommend, and we've alluded to some of this, instead of a survey when it relates to product questions: deciding features, deciding markets, deciding what to prioritize, that sort of thing?

What do you reach for?

Ryan Hatch: Yeah, I think we've talked a lot about understanding your customer at a deep, deep level, having that empathy across multiple segments, because you probably have multiple segments you're looking at. Just understanding your customer. We think putting a data layer between you and your customer is just the wrong abstraction.

It's just the wrong thing to do; abstracting people into data doesn't work very well. So take that abstraction out is really what we suggest. Get in closer. Think of that example, Robert, we just had about the master's students at college.

The researcher went and immersed themselves with the students for a month. And so we talk a lot about continuous discovery; we don't want you to just do a research project for a week and then be done. It's better than nothing, but don't stop there. Continuous customer interviews are really, really good and important.

That's what I would do as the first thing instead of a survey. There are other things, Robert, that we talked about a little bit: understanding the contexts when a survey might actually be the right thing to pull for.

Rob Kaminski: Yeah, let's talk about that, because I think it'll open up some of the areas where we actually would apply them, and some mental models for how we approach these as well.

I'll open it up with when to use them, when I think it's okay. There are a couple here, and there's actually one that I don't even have on this list that I'll bring up.

Ryan Hatch: I've seen 

Rob Kaminski: really good surveys where companies, and this happens a lot in content creation as well, have an existing audience and are trying to get a better understanding of a context they already have.

And so the thinking here, with having an existing audience, is that you've already done a lot of the hard work of learning, and what you're doing is tuning into the details. So think about it more as adding extra information onto a really solid foundation of understanding.

The one example here that I saw, and it was a really good one, is a podcast I listen to occasionally, My First Million, with Sam Parr and Shaan Puri. Shaan did a survey, a very, very simple survey, and he was just trying to understand the demographics of the audience he already has, and he knows a lot.

He interacts with them through Twitter very often, and he was just trying to quantify some very specific questions. That's another thing that triggers for me where surveys can be applied: if you've already done the complex work. Complex problems in products, like when you're making these strategic decisions, are the big ones.

If you've already done the big legwork and you're trying to make smaller product decisions, this color or that, this button or that, where does it get placed, then sure, surveys and reactions like that can work, because you're already down to something so focused. That's what I see in the analogy of this having-an-audience version.

And then the other one for me is a reflective user experience, kind of the example I brought up, where you have usage data or an experience, but there are gaps in understanding why that person did that, what they were trying to do when they were using a certain feature. If it's small and focused, I think those can be really effective to at least illuminate part of the space so you can go in and explore a bit further.

They might be able to highlight pain areas or parts of your product where you can go and explore, but I pull back from making decisions purely off of those, even in an application of a survey in that context. Ryan, you want to talk us through when not to use them? We sort of hit this one a little bit,

but to make it really clear for our listeners when we definitely stay away.

Ryan Hatch: Sure. Yeah. Before that, I might just add to the when-to-use piece. A couple of times surveys have actually been helpful, like Robert mentioned, in quantifying a job to be done, quantifying how many.

You already know the jobs, but this requires you to already have a really deep understanding of your customers, and you're just trying to understand how many of them there are. If you don't understand your customers first, it's guaranteed to fail. I think the reflective...

Rob Kaminski: They're not going to tell you their job to be done.

Right. Like the one survey I brought up with Shaan: he actually lists out the assumed jobs to be done, which has a risk, because again, if there's not one on that list that they relate to, there could be some bad data in there. But it was a clear representation to me that he already knew the things his audience wanted, and he was just looking for a sense of priority.

He's actually trying to make a decision: if I go put a few weeks' worth of work into one of these things as an add-on, where might I even start the focus? Which I thought was pretty okay. But he had to have that "I think this is what you're actually trying to do" understanding already.

And so he's looking for a bit of validation in that, in the way that he ran the survey, which I thought was pretty clever. 

Ryan Hatch: Yeah, exactly. And the other when-to-use, the one Robert has as this reflective experience, what we mean by that is open-ended questions. There have been times where

we have a product and there are users flowing through it, and we've maybe done interviews with a pool of people, like on Respondent, what we call proxy customers, potential customers, but they're not actually using the product, and getting empathy from that side has been really great.

But when you actually have real customers flowing through, you want to talk to your actual real customers, and sometimes they're not comfortable sitting down for a conversation at first. Maybe they're just so cold, you really have no relationship with them.

I had a situation where, for people coming into the product, I had a screen pop up with a video of myself playing that said, hey, we'd love to get your insight, we'd love to understand why you're here, tell me your story, book a time on my calendar and let's chat.

So I had a whole bunch of people booking with me, but the problem was they never showed. They just never showed up. I did this for two weeks where it looked like I had tons of people to talk to. And there's a challenge there; there's a whole different conversation on how to recruit and actually get those meetings, and there are things we can do

and things we've done successfully. But in this case, I just had a hard time getting people to talk. So the other thing I did was pop up in the product again with my video, but instead of booking a time with me: hey, here are just five questions, can you answer these for me? And they were all open-ended questions, open text boxes.

Just tell me about your journey, tell me your story, what's resonating with you, why are you using this product now, what's going on in your life? And that was really, really good. So again, you're not trying to put an abstraction of data in between; notice the difference.

I'm actually trying to get closer to the customer with an open text box and trying to understand their world. So in that sense, it works really, really well. And another example of that, Robert, is Ariel on our team. She had done this previously, where she came on as a product manager into a new product and she didn't know up from down.

New domain. So how do you quickly absorb a new domain? One way she did it was, they'd done a survey, but the key thing was they'd included a couple of open-ended questions. So she ignored all of the other one-to-five and categorization questions, threw those out, and that was the right thing to do,

and instead just looked at the open-ended data. And there are really cool things you can do there. You can take the open-ended answers and put them in a word cloud, and you're like, oh, these are some interesting patterns.

Word clouds are a quick way to do that, but also actually taking those open-ended answers and synthesizing them, pattern matching and figuring out, oh, there are some really key themes here that won't come out in the word cloud. I think that's a way to get empathy and get close without trying to put people into buckets, which is where stuff really breaks down,

when you try to turn people into data. So, when not to use them? Yeah: when you're trying to actually learn new things, that's what we call generative research. When you're trying to learn new things, don't turn people into data.

Don't try to put them into buckets when you're trying to learn.
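Circling back to the word-cloud idea Ryan mentioned, here is a minimal sketch of that quick pattern pass over open-ended answers. It assumes the responses live in a plain text file, one per line, and approximates the word cloud with a simple term-frequency count instead of a charting library.

```python
from collections import Counter
import re

# Hypothetical file of open-ended survey answers, one response per line.
with open("open_ended_answers.txt", encoding="utf-8") as f:
    answers = [line.strip() for line in f if line.strip()]

# Tokenize, lowercase, and drop common filler words so themes stand out.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "i", "it", "is",
             "for", "my", "in", "on", "that", "this", "with", "was", "we"}
words = [
    w
    for answer in answers
    for w in re.findall(r"[a-z']+", answer.lower())
    if w not in STOPWORDS
]

# The same counts a word cloud would visualize, printed as a plain list.
for word, count in Counter(words).most_common(25):
    print(f"{count:4d}  {word}")
```

A list like this is only a starting point; as Ryan says, the themes that actually matter come from reading the answers and pattern-matching them yourself, not from the counts alone.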

Rob Kaminski: And to be specific about generative research, this is like, hey, I want to go help podcast hosts; we're doing a podcast. If you're just starting and you've never done a podcast or talked to anyone who's run one, don't send out a survey to all the podcast hosts saying, hey, who are you,

what's your top priority? Because you're just going to get this big bucket of information without any context. That's an actual example that aligns with this generative, exploratory work. It's a dangerous place to start, because if you go down the initial path with bad data, you're going to have to work your way back to the core.

Ryan Hatch: Yep. The second one is testing business ideas. We've already talked about the example where that customer went in a big whole circle based on the bad data they got from a phone survey. So don't test your business ideas, or do solution testing, or feature prioritization. Don't do feature and solution prioritization in a survey.

That's a recipe for disaster. You want to understand your customer deeply, you want to create solutions, and then move into behavioral stuff. We talk about actual behavior: will they actually sign up, will they actually purchase? That's different from just sending out,

hey, which feature do you think we should do, this kind of artificial commitment testing. Robert, you want to do the last one?

Rob Kaminski: Yeah. Anything where you don't understand the why. I think this is a good place to close, with how we approach it: for us, the why behind human behavior, desires, and motivation

comes from spending time with them. There's no replacement for that, there's no shortcut, and surveys aren't going to get you to that level of understanding. Think about any relationship you've ever had in your life: you learn a lot more from what people don't say, and what their

emotions tell you, not words, and certainly not scores they put into a spreadsheet in a test or quiz format. So you've got to understand the why. Without the why, you're not going to be armed with the ammunition to wrap the right constraints around a survey to make it meaningful to what you're trying to

Ryan Hatch: get after.

Wonderful, Robert. Awesome. Well, we did it: 12 episodes in 12 months.

Rob Kaminski: 12 months. We're excited for next year and about the content. We love the questions that are coming in, and we're really looking to continue to grow in terms of the value we serve around what product teams are struggling with out there.

And the questions help us drive our content. So yeah, this was a fun one. It's funny, the short version of today would have been to just get on here, say don't use surveys in product, and close it out, the 10-second version, drop the mic.

Right. But good chat. Appreciate the questions. Thank you, Chris. And we'll catch you next time on Exploring Product.

Ryan Hatch: Merry Christmas, everyone. Happy New Year. See you in January.


show notes
Content
  • Using Product Surveys For Customer Research
  • Product Survey Horror Stories
  • Why Product Survey Questions Can Be Misleading
  • Why It's Hard To Do Product Surveys Well
  • Question - Feature Usage And Survey Results
  • NPS Scores and Surveys Are Lazy Customer Research
  • What To Do Instead of Product Surveys
  • When Product Surveys Are Useful