
{PODCAST} In-Ear Insights: Market Research and Surveying Best Practices

In this week’s In-Ear Insights, join Katie Robbert and Christopher Penn as they discuss best practices for surveying and market research. Listen to the things marketers most often get wrong, from leading questions to response types, and hear what marketers should do instead to design and run credible, effective surveys. Hear the answers to questions about when to survey versus run a focus group, planning ahead, and what non-response bias is.


Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn 0:00
In this week’s In-Ear Insights, we’re talking about surveying best practices. One of the things we’ve run into recently in some of our client work is people wanting to do surveys, which we fully endorse, because of course surveys are one of the best ways to collect first-party, first-person data from people to understand why things are happening. It’s really easy to get the what out of our data, because we have analytics and data science for that. But the why is something we always need to ask people for. There’s a whole bunch of ways surveys can go sideways, though, from biased questions to leading questions. So Katie, you spent the better part of a decade doing life-or-death surveys with people on opioid management. What are the things that people do right and wrong when it comes to surveys?

Katie Robbert 0:51
One of the things that people do wrong is not having a purpose for the survey. They tend to want to just ask a collection of questions. Now, in the clinical trial world, you have to clearly state your hypothesis up front, which is where I learned the real rigors of survey design and development. But in marketing, in the non-clinical-trial world, people tend to just say, well, let’s ask them this, and let’s ask them this, and let’s ask them this. Well, first of all, who is “them”? Who are you asking? You need to figure out, basically, what’s your plan? What is the question that you are trying to answer by collecting this information? And I often see things go wrong where, sorry, I’m babysitting my dogs today. I often see things go wrong where people try to have multiple goals in a survey as well, which you can do, but you need to make sure that they are clearly outlined. Having a lack of clear purpose, or too many purposes, is where I start to see surveys go wrong, and then not having a clear idea of who it is you want responding to the survey.

Christopher Penn 2:04
Is that because people are not doing the prep work up front? Because a lot of the time, when we’re talking about qualitative and quantitative data analysis, we usually have the quantitative stuff after it. It goes in a cycle: you have your qualitative, which is a focus group of some kind, to try and figure out what questions we should be asking, what the landscape looks like, or mining through social media data or other available data sources. Then you need a first round of quantitative: can we find any numbers for this? And then that second round of qualitative is, okay, we have the numbers, we have a sense of what happened, now we need to figure out why. When you see people go straight to the survey, is it because they didn’t do that prep work? Or is it because they’re just not sure what it is that they’re asking for?

Katie Robbert 2:48
I would say, honestly, it’s both. When we think about the data analytics hierarchy, you start with that foundation of quantitative data: what happened? And you’re absolutely right, a lot of times people can’t figure out what happened, so they skip that step and go straight to “well, why?” And it’s out of order. You’re not going to get good quality information unless you have specific questions that you’re answering. So let’s say, as a very simple example, all of a sudden we see a spike in traffic coming to our website over a quarter, and it goes up exponentially. We really want to know what happened. So we put a survey on our website: what brought you here, or how did you hear about us, or something along those lines. Then we can figure out where this traffic is coming from and why suddenly we’re the most popular kid in school. That’s a very clear-cut example of how to use a survey with the right audience. You see what happened: the traffic increased. The people that you want to ask are the people coming to the site, to figure out why. And the question you ask is “what brought you here?” so you can get that why. That’s a simplistic but succinct example. What I often see happen is that people jump to trying to answer the why question without knowing the what.

Christopher Penn 4:14
How do you talk to people about non-response rates, about dealing with the portion of the audience that does not respond to a survey? I would imagine, especially in clinical trials, that’s a major issue, because you probably have very low response rates; a lot of people aren’t necessarily going to want to talk about what’s going on with them.

Katie Robbert 4:37
You would actually be wrong on that, because in clinical trials, subjects typically get paid for their time. Also, one of the things that you need to determine ahead of even starting the survey is how many respondents you need, and so you keep running the clinical trial until you get the total N that makes it statistically significant, in order to say this is a representative sample of this particular population with this affliction, or whatever it is. In non-clinical-trial surveys, you need to approach it the same way. Just saying “well, we’ll run this survey for 10 days” isn’t a good way to think about it; you need to think about the total number of respondents you need to make your data statistically significant. Now, you’re probably saying, “you can’t even say those words, Katie, you’re stumbling over them, so how am I supposed to figure out what is statistically significant?” And I’ll say to you: you can just Google it, and there are a lot of handy calculators that will tell you, if your total population looks like this, here’s the total N you need to survey in order to have a statistically significant response rate.
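
For reference, here’s a minimal sketch in Python of the math those online calculators typically implement: Cochran’s formula for a proportion, plus a finite population correction. The 95% confidence level, 5% margin of error, and 50% proportion below are illustrative assumptions, not numbers from the episode.

```python
import math

def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Estimate how many respondents a representative sample needs.

    Cochran's formula for a proportion, with a finite population
    correction. Defaults assume a 95% confidence level (z = 1.96),
    a +/-5% margin of error, and maximum variance (p = 0.5).
    """
    # Infinite-population estimate
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Finite population correction shrinks the requirement for smaller populations
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# e.g., surveying a customer base of 20,000 people
print(required_sample_size(20_000))  # -> 377
```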

Christopher Penn 5:57
With the caveat that you also have non-response bias. In your clinical trials example, you have a pretty good handle on the audience. One of the things that we’ve seen done wrong many times, by some of the biggest names and biggest companies in marketing, is serious non-response bias. What that means is this: let’s say I’m Massive Marketing Software Company, right? That’s the name of the company. We send out a survey to all of our customers. Okay, we already start with a biased population; let’s put that aside. And we get, say, 30% to respond. We say, great, we got a 30% response rate, we do our calculations, and that is enough of a sample to say this represents the population. Is that right? No, because unless we know the composition of that audience, we might have gotten 100% of responses from people who are junior marketers, with no CMOs. So for us to be able to say, yes, this represents all of marketing, that’s not true; we don’t know that there are no CMOs. If our goal is to portray marketing as a whole, we need a good sampling within each of the layers of seniority in order to be able to say we have a valid survey. The same is true by industry as well: B2B, B2C, medical and healthcare, and all these things. These are things we see people do really wrong with surveying: maybe they do get as far as doing statistical significance, but they don’t talk about the composition of the strata in the audience. And that is fine, as long as you then don’t claim that this represents the whole audience when it really doesn’t.
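
A simple way to catch this in practice is to compare who actually responded against a known benchmark for the population. Here’s a minimal sketch of that check; the seniority categories, benchmark shares, respondent counts, and the under-representation threshold are all made-up illustrations, not figures from the episode.

```python
# Hypothetical benchmark shares for the population (e.g., from labor statistics)
benchmark = {"junior": 0.50, "manager": 0.30, "director": 0.15, "cmo": 0.05}

# Who actually answered the survey (hypothetical counts)
respondents = {"junior": 180, "manager": 15, "director": 4, "cmo": 1}

total = sum(respondents.values())
for stratum, pop_share in benchmark.items():
    sample_share = respondents[stratum] / total
    # Arbitrary rule of thumb: flag strata at less than half their population share
    flag = "UNDER-REPRESENTED" if sample_share < 0.5 * pop_share else "ok"
    print(f"{stratum:<9} population {pop_share:>4.0%}  sample {sample_share:>5.1%}  {flag}")
```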

Katie Robbert 7:39
Well, and that goes back to the planning piece of it, your requirements: what’s the question you’re trying to answer? Who’s your audience? So in your example, you’re talking about marketers, but you’re also digging a level deeper in terms of the role of the marketer, whether it’s junior or senior, or whatever that is. Those are things that you should determine ahead of time: we are likely to get X percent of junior marketers, we are likely to get X percent of CMOs; ideally, our goal would be to get 50% of this and 20% of this and 10% of this, and then we can say it’s representative. In my experience outside of clinical trials, I’ve never seen a marketing survey put together in that way. Now, I don’t have access to how they put the CMO Survey together, or something along those lines; I’ve never been on the planning side of it. But in my experience, just doing them with clients, it’s “well, I want to get some information about my product, so let’s put out a survey.” Well, great, why? And that’s where people get stuck: why am I even doing this? Not even the strata, the segmentation, and the significance, just: why am I doing this?

Christopher Penn 9:03
I will say, with the CMO Survey, you can download the backend data, so you can see all their calculations, which is really nice. But again, we’ll touch on that in a second. With marketing surveys, you’re right: very, very often there is no data accompanying them, and that goes back to what you’ve been saying all along, which is planning. If you don’t know the demographic breakouts of your audience, you are going to create a biased survey. So for example, if you aren’t sure how many senior executives there are in marketing, and you don’t take the time to go to the Bureau of Labor Statistics and look at the number of people employed in marketing at various levels, which by the way is available for free, you can’t calibrate and say, yes, our audience is representative of the population as a whole. You need to have that benchmark, and that comes from the planning side. And again, that’s something that I don’t see people doing.
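
If the sample does come back skewed against that benchmark, one standard correction (not discussed in the episode, but worth knowing) is post-stratification weighting: weight each respondent by the ratio of the population share to the sample share for their stratum. A minimal sketch, again with invented numbers:

```python
# Hypothetical population shares (the benchmark) and observed sample shares
population_share = {"junior": 0.50, "senior": 0.35, "cmo": 0.15}
sample_share = {"junior": 0.80, "senior": 0.18, "cmo": 0.02}

# Post-stratification weight = population share / sample share
weights = {s: population_share[s] / sample_share[s] for s in population_share}
print(weights)  # {'junior': 0.625, 'senior': ~1.94, 'cmo': 7.5}

# A weight of 7.5 means each CMO response stands in for seven and a half
# people; weights that large are a warning that the stratum is too thin
# to weight reliably and really needs more responses instead.
```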

Katie Robbert 9:55
I don’t either. And touching on bias in general in surveys, a lot of times what you’ll see happen is, instead of “this is the question that we want to answer” or “this is the thing that we want to prove or disprove,” you’ll get “this is the headline that I want to have”: 80% of millennial consumers don’t like to get out of bed before 11am. Okay, that might be a result of the survey, but you can’t go into the survey assuming that that is what is going to happen. So you need to pull it back and start to say, what is it that you’re trying to prove? In that case, you would need to have some sort of hypothesis, to say, oh geez, I don’t know, millennials, people of a certain age bracket, tend to sleep in past 10 o’clock in the morning, or something along those lines. I’m just making this up, right? Instead of going into it saying, I’m going to ask every question until I get to the response of “80% of millennials.”

Christopher Penn 11:05
Yeah, the intentional bias is definitely a problem, especially with marketers, and especially public relations people, where they work at a company and they want to create something that supports their brand. That is something that our friend and colleague Tom Webster over at Edison Research calls incuriosity: you’re not curious, you have instead a corrupted intent when it comes to your survey. And that’s really dangerous for a number of reasons. One, it is mathematically and scientifically invalid. Two, if you attempt to use that data to model and make decisions on, you’re going to screw up your company. And three, at a more ethical and societal level, it is unethical; it’s part of what we would call fake news, where you are creating something that is intentionally wrong and distributing it as though it were truth. There’s a long rabbit hole we could go down there, but we won’t right now. But that’s incuriosity, and it is a substantial danger in marketing.

Katie Robbert 12:15
I agree with that. And you’re absolutely right, that’s a rabbit hole, so I had to actually just pull myself back out of the virtual rabbit hole. Let’s talk for a minute about the structure of the survey itself. You have a variety of different question types: you have your standard yes/no questions, you have your multiple choice, you have your singular choice, you have your Likert scales, and then you have the dreaded open-ended question. I say the dreaded open-ended question because all too often, and this goes back to not having a plan, when people are putting a survey together, what I’ve seen more often than not is they don’t know what kind of responses are appropriate for the question they’re asking, so they leave it open. Trying to analyze an open-ended question is a bit of a nightmare, because you honestly never know what you’re going to get. You’re going to get things spelled incorrectly, you’re going to get trolls, you’re going to get people who don’t understand the question. So if you find yourself relying on open-ended questions, you need to step back and really rethink the questions themselves, because the majority of your questions you should be able to structure as a yes/no or a singular choice. Having an open-ended “other, please specify” is fine; people tend to ignore that because they don’t want to take the time to write out whatever it was they meant, so they’ll pick one of the responses that most closely matches their opinion. And keep in mind, surveys are opinions. That’s all they are. You are surveying for an opinion, so it’s completely subjective. You can build it in a scientific way, but it’s not a binary 0-1-0-1 or a qualitative data point, it’s quantitative. I’m getting those backwards, and my…

Christopher Penn 14:12
Quantitative, qualitative. Quantitative is numbers; qualitative is not numbers.

Katie Robbert 14:16
Thank you. I was like, I know I’m doing this backwards. I also can’t say “statistically significant.” So here we are on a Monday morning. So my rant is basically around the structure of a survey question. Open-ended questions: use them sparingly. Multiple choice, where you can choose more than one thing: also try to use those sparingly, because what ends up happening is, if you have seven response options and only 200 respondents, then you’ll get very, very small percentages of responses that you can actually use. So it will be: A plus B is 0.2%, A plus B plus C is 0.3%, A plus B plus C plus D, and so on. Those are the two pitfalls that I often see with survey construction: open-ended and multiple choice. I love to keep it very simplistic. Is it a yes or no question? Is it a singular choice? Is it a Likert scale? Likert scales are those scales where, on a scale of zero to five or zero to ten, how much do you agree with the statement that I’m about to make?
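
To put rough numbers behind that: seven multi-select options allow 2^7 − 1 = 127 distinct answer combinations, so 200 respondents spread across them average well under two per combination. A quick sketch, using the illustrative figures from the conversation:

```python
from itertools import combinations

options = ["A", "B", "C", "D", "E", "F", "G"]  # seven response options
respondents = 200

# Count every non-empty combination a respondent could check off
combos = sum(1 for r in range(1, len(options) + 1)
             for _ in combinations(options, r))

print(combos)                 # 127 possible answer combinations (2**7 - 1)
print(respondents / combos)   # ~1.57 respondents per combination on average
```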

Christopher Penn 15:25
Yep. I would argue that open-ended questions shouldn’t exist on surveys at all, because a survey is not the place for them. Going back to planning, that’s something that belongs in the focus group, where you can have that conversation and mine for the language people are using to describe the thing, and then use the survey to quantify that, as opposed to trying to extract qualitative data from the survey. Now, yes, in some cases you can’t, because of limitations in how you’re operating. But if you’re going to do it right: focus group, survey, focus group, survey, alternating that pure qualitative with the quantitative. Find out how many people in our target audience agree with what the focus group had to say, essentially.

Katie Robbert 16:05
I would agree with that. I do think that open-ended questions don’t belong in your straightforward survey. The other thing that isn’t utilized enough, but is also tricky to really map out, is branching. Now, in a clinical trial, you tend to have different types of software, and you’re already paying the money for these more sophisticated data collection tools. In marketing, it’s “what can I get for cheap or free, and what can I do the fastest?” And sometimes that’s okay, but you get what you pay for. A lot of times there is a need for a branching type of survey. A branching survey is basically: if you answer a question a certain way, you either move on to another question or you don’t. What ends up happening, because people are just throwing these surveys together, is they have this long list of questions that very quickly become irrelevant to people. So I could start asking you, what’s your favorite baseball team, and you’ll respond, I don’t watch baseball. And then I’ll continue to ask you questions about baseball that are completely irrelevant to you, basically forcing you either to give me junk responses or to abandon the survey completely. Those are just other things to be mindful of. If you are asking questions that will quickly become irrelevant to your audience, find a way to give them an out, because then they will have completed the part of the survey that is relevant to them, and they won’t give you junk information or abandon the rest of it.
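
As an illustration of that skip logic, here’s a minimal sketch of a branching rule in Python; the questions, field names, and survey structure are invented for the example, not taken from any particular survey tool.

```python
# Hypothetical branching survey: a question can name follow-up questions
# that are only shown for certain answers.
survey = [
    {
        "id": "watches_baseball",
        "text": "Do you watch baseball?",
        "options": ["Yes", "No"],
        # Skip the whole baseball branch unless the answer is Yes
        "branch_if": {"Yes": ["favorite_team", "games_per_season"]},
    },
    {"id": "favorite_team", "text": "What's your favorite team?"},
    {"id": "games_per_season", "text": "How many games do you watch a season?"},
    {"id": "age_bracket", "text": "What's your age bracket?"},
]

def questions_to_show(answers):
    """Return the question ids a respondent should see, given answers so far."""
    branched = set()
    for q in survey:
        for ids in q.get("branch_if", {}).values():
            branched.update(ids)  # branch questions are hidden by default
    shown = []
    for q in survey:
        if q["id"] in branched:
            continue
        shown.append(q["id"])
        follow_ups = q.get("branch_if", {}).get(answers.get(q["id"]), [])
        shown.extend(follow_ups)
    return shown

print(questions_to_show({"watches_baseball": "No"}))
# -> ['watches_baseball', 'age_bracket']  (baseball questions skipped)
print(questions_to_show({"watches_baseball": "Yes"}))
# -> ['watches_baseball', 'favorite_team', 'games_per_season', 'age_bracket']
```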

Christopher Penn 17:45
To wrap up: surveying requires a tremendous amount of planning. It requires a tremendous amount of thought and care in advance. And I’ll say this: market research is a profession, just like surgery is a profession. While marketers can and should learn how to do market research well, understand that at a certain point, if you want the best quality, you’re probably going to need to bring in outside experts, people for whom this is their primary profession, their day job; it is what they do. And if you have a choice between not doing a survey or doing a survey poorly, please choose not to do it. Please find some other data source that is credible, that is usable, and that will not contribute to the morass of bad information out there. Any last words, Katie?

Katie Robbert 18:39
Yes. If you do find yourself in a situation where you would like to put a survey together, please reach out to us; we are happy to help. You can listen to me rant for hours and hours about survey construction. No, I won’t do that. But you can join our free analytic… no, oh my goodness, I can’t talk today… our free Analytics for Marketers Slack group at TrustInsights.ai/analytics-for-marketers, and you can ask all of your survey questions there. It’s a great group of people; I’m sure a lot of people will chime in with their experiences. But don’t go it alone. Ask for help.

Christopher Penn 19:14
Exactly. Speaking of not going it alone, be sure to subscribe to our YouTube channel and our newsletter. You can find both of them over at TrustInsights.ai. We’ll talk to you soon. Take care.


Need help with your marketing AI and analytics?


Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
