
{PODCAST} In-Ear Insights: Marketing Analytics Skills and Techniques

In this week’s In-Ear Insights, Katie and Chris discuss marketing analytics skills and techniques. Why don’t more people use their data? Why are analytics and data science skills so siloed and so under-utilized in organizations? From examples using Chinese food menus and restaurants to dealing with confused C-Suite people, you’ll learn how to start breaking down those barriers and using the analytics techniques you already know.


Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn: In this week’s In-Ear Insights, when we’re talking about marketing analytics, an awful lot of the time we’re talking about a combination of tools and tactics and outcomes. But one of the challenges I see a lot of marketers having is that they don’t actually have the right tools. And by that, I mean individual analytical techniques. They don’t necessarily have the right tactics, which is how you do the thing, and there isn’t necessarily an outcome in mind as to what the thing is supposed to do.

Let’s take an example: the very simple technique of a moving average, which is when you take any given period of time, like seven days, and average the data across those days. Then when the next day happens, you drop the first day off and you move that window of time. So it’s still seven days, but the average changes as time goes on. This is a very useful technique for smoothing out wild variations, particularly if you’re a B2B business where you have weekends, which never look good in your analytics. So why don’t more people use this technique? Why don’t more people adopt it? Why don’t more people use it to build insights? When you look at an organization and you look at the people doing the work, these techniques are not new; moving averages are centuries old. Why do you not see people adopting these techniques to make their lives easier and to make their insights more fruitful?
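To make the moving average concrete, here is a minimal sketch in Python with pandas. The CSV file name and the “date” and “sessions” columns are hypothetical placeholders for a daily traffic export from your analytics tool; the idea is simply that each day’s smoothed value is the average of that day and the six days before it.

```python
# Minimal sketch: smoothing daily traffic with a 7-day moving average.
# Assumes a hypothetical CSV export with "date" and "sessions" columns.
import pandas as pd

df = pd.read_csv("daily_sessions.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

# Each day's value becomes the average of that day and the six days before it,
# so weekend dips get smoothed instead of dropped.
df["sessions_7d_avg"] = df["sessions"].rolling(window=7).mean()

print(df.tail(10))
```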

Katie Robbert: I think it really comes down to a lack of understanding. If you think about marketing as a discipline and as a practice, and what they teach you in school about marketing, they’re teaching you the front end of marketing: how to know your audience, Porter’s Five Forces, SWOT analysis. The term “analysis” is in quotations there because it’s more of a “these are my competitors, so I need to make sure I do better than them.” Data analysis is a different skill set. From what I’ve seen, the broad-stroke assumption is that marketers don’t know they need to take some sort of statistics course in order to analyze their data beyond putting numbers in a spreadsheet. So a moving average is a step above doing the actual average. To your point about weekends, what I’ve seen happen is, “Well, the numbers suck on the weekends. Let me just pull that data out altogether so I’m looking at the Monday through Friday data only.” Therefore, Saturday and Sunday are never accounted for in the data, because the weekend does drop it down a bunch. So I’m seeing more of that behavior than I am seeing moving-average behavior, because I would guarantee that if you asked a marketer who primarily focuses on executing campaigns, “Do you use a moving average?” they would give you a blank stare. It’s not because marketers aren’t smart and capable, it’s because they’re not aware, or they haven’t been given the education in statistics to do the type of analysis that we tend to talk about.

CP: So in a lot of our client work, there’s that one analytics person who literally lives in like a 400,000-page spreadsheet, 48 tabs, and a gazillion other things. And even in that case, I do see the use of more techniques. But it still isn’t impactful, it still isn’t creating any usable insights. And one of the things I think might be a problem is that any given technique is going to give you some kind of outcome, but people forget that a number in isolation doesn’t mean anything. You have to be comparing it to something, you have to be comparing it to the previous week’s number, to your competitor’s number, a year over year number. There has to be some kind of change because a number only has meaning when you can see the change from some other kind of number. And that’s not something that’s a mathematical technique. That’s not something that requires any statistics.

It would seem to me to be common sense that a number without context is unhelpful. But why do we then consistently see reports and spreadsheets and dashboards that, as Avinash Kaushik says, are just data puke? Here are all these numbers, but there’s no meaning to them. What’s our mental blockage in marketing analytics that’s preventing us from saying, “This is the context around this number”?

KR: I think it’s knowing what to do when you see the number. We oftentimes prepare ourselves for the number going down or the campaign not doing anything at all so the number is zero. I feel like, as marketers—and I’ve been there myself—we really only think about the scenario of what to do if nothing happens, or if the number is zero. So let’s say our goal is five and we see number three, then the number four, and then the number three again. We think, “Okay, we’re getting closer to our goal. So we’re just going to wait it out.” There’s no real plan of action for what to do if the number is three versus the number is five. And I think that’s where it always comes back to the planning phases of things. What do you do in different scenarios where the number is X? Or if we’ve met our goal, how do we make sure we continue to meet our goal and then exceed our goal? I think there’s a very loose conversation around that. But if you’re looking at the numbers day to day or, Chris, to your point, you’re talking about a totally separate person from the marketing team, a data analyst. Does that person have the general instruction of letting us know when the number hits this because we want to do X? I think it’s just a matter of having that conversation and making those plans for all of those different scenarios.

Back to your original question about the different analysis techniques. If you have a separate data analyst, that person probably knows the techniques, but they don’t have the context to apply the right technique to get to the right thing. And then if you have a marketing analyst, my assumption, and what I’ve seen, is that the marketing analyst has a lighter-weight analytical skill set. So they can pull the straight average, but not a moving average, or account for weekends and anomalies and those types of things. They don’t necessarily have the time or the skill set to do something more advanced.

CP: So in this scenario, the marketer is the customer, and the analyst is the chef. Is it a case where the marketer is saying, “I’m hungry,” and the chef is going, “So what do you want?” And the marketer just says, “I’m hungry.” And the chef’s like, “Fine, I’ll cook something, but I have no idea what you actually want.”

KR: Essentially.

CP: So how do I get the customer to say, “I want sushi” or “I want a pizza”? How do we get to that point? Is it incumbent upon the chef to press for more details? Is it incumbent upon the marketer to have a sense of what they want? How do you get them to come to that vision? Because in a regular restaurant, it’s pretty easy. You go to a pizza joint if you want pizza, right? You don’t go to a pizza joint, say you want sushi, and expect a rational response.

KR: Yes, the answer is yes to all of that. It’s the responsibility of both parties. If you have someone doing the analysis of your data, they should have a general set of questions and requirements to try to figure out what it is that you’re after: what is the question you’re trying to answer? What do you hope to see in the outcome? That’s not the customer saying, “I want the outcome to be X, so make the data say that.” It’s more, “What are you hoping to understand, so that I can apply the proper type of analysis technique?” Then if the customer says, “I don’t know,” well, then you have to continue the conversation.

Chris, you have kids. How many times have you said to them, “What do you want for dinner?” And they say, “I don’t know, I’m hungry.” So you say, “Great, you’re going to have peanut butter and jelly for dinner,” and they say, “I don’t want that.” You say, “That’s what we have.” It’s very much that type of conversation where you say, “This is what I have to offer you. Do you want anything else? If you want something else, you need to tell me, but unless you tell me, this is what you’re going to get.”

CP: I love that analogy because, in a lot of cases, the marketer is then beholden to a CMO or a VP or somebody else in the company, right? And they’re like a kid who doesn’t know. It’s the case where the boss wants better results, the marketer asks what they want, and the boss says, “I don’t know.” Then you have the data analyst, who is the person actually doing the cooking, and now the marketer is caught in the middle between the noisy kid and the restaurant cook. That’s an even worse situation because then, in a lot of ways, it’s not incumbent upon the marketer to have the answer; it’s incumbent upon the marketer to be the intermediary trapped in the crossfire. And to your earlier point, if there’s a Michelin-starred chef in the kitchen, they can cook pretty much damn near anything. If you say you want a pizza with sushi on it, they’ll figure out a way to make it work. But if the marketer doesn’t know that that’s possible, is the marketer the bottleneck?

KR: In some ways they are. That is a really interesting analogy as well because, if you keep the analogy of food, I don’t even really know what’s possible. I only know what I’ve been exposed to living in the suburbs of Massachusetts, and that’s still pretty limited. We have our fair share of chain restaurants, our Italian food isn’t really Italian food, and a lot of the Irish bars don’t actually serve Irish food. You are beholden to what you have had exposure to. So yes, in some ways, the marketer is the bottleneck. But then if you have your data analyst slash Michelin-starred chef, it is then incumbent upon them to start to educate: “Did you know that I could make you foie gras?” “Oh, what’s foie gras?” And then you start to have the conversation of, “Well, it’s this. Why don’t you explain to me some of the flavors that you like? Why don’t you explain to me some of the textures that you like?” So there’s a way to have the conversation so that whoever’s on the receiving end of the data can start to pull out the information.

Now we’re sort of stepping out of the conversation about analytical techniques and more into how to manage up. Chris, you had said the CEO or the CMO says, “I want better results,” but what does that mean? “I don’t know.” Okay, does the marketer then just go away and say, “Well, he said he didn’t know, so I’m just going to go about my merry way and try to figure it out”? Or is there a way for that marketer to continue to ask intelligent questions to try to get more information, or even just say, “I don’t have enough information to execute what you’re asking me to do”? And then if that’s the case and the CMO says, “Go away, I’m busy,” well, then the marketer can start to document that the CMO is wholly unhelpful: “I can’t do what they’re asking me to do, so when my ass is on the line, I have at least documented that I tried.” But it’s the actual trying to get more information that I think doesn’t happen. It depends on the kind of company culture that you’re in (and I know we’ve talked about this), but asking questions tends to be frowned upon. Or there’s a perception that if I ask too many questions, it makes me look like I’m unintelligent, or I don’t know what I’m doing, or whatever the thing is. Whereas asking questions when you don’t know is the best thing you can do, and it’s okay to say you don’t know the answer.

We’ve sort of gone off the path from where we started, but I think it all kind of connects together.

CP: Going back to the analogy, though, I think that for the data analyst, there is the equivalent of something that’s done a lot in Asian restaurants, especially Chinese restaurants, where the menu has tons of photos that show what each dish looks like. There are pages and pages of these things. You go to TGI Friday’s and there are photos of all the foods, and you think, “Oh, that looks good.” And it gives you a sense of what the restaurant serves and what the chef can cook. To your point about education, for the folks who are analytically inclined, is there a version of that Chinese food menu? Is there a gallery of dashboards and reports they could create that illustrates, “Hey, does this help? Do you like this kind of thing? Did you know we can do this?” For example, if you were to think about stats, you have things like interquartile ranges and box-and-whisker plots. Now, it’s obviously hard to visualize that in a podcast, but do you have an example of what goal completions in Google Analytics look like with box-and-whisker plots, where here’s the high, here’s the low, here’s the average, and you can see that over time? You’d think of it as a neat way of looking at the data.

KR: I would say maybe. I think in this instance, the analogy of pictures of food does not necessarily translate to pictures of charts and graphs. The only reason I say that—and this is something that we talk about a lot in our company—is because you can’t then understand the underlying technique of how you got to the answer. I could look at a bar chart, and I could have two bar charts side by side and one is done with moving averages and one is done with a regression analysis. Am I going to know the difference between the two? Am I going to understand that this one gives me better results? All I know is that the numbers sort of look the same. So I don’t know the answer to that question, other than it is definitely incumbent upon the person doing the analysis to educate the person asking the question. You know, ask more questions, try to help them understand they could do it this way, they could do it that way. If you do it this way, you get X result. Lunch and learns are a great thing. (Sighs). It’s a big question.
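For what Chris’s box-and-whisker idea might look like in practice, here is a minimal sketch in Python with pandas and matplotlib. It assumes a hypothetical CSV of daily goal completions with “date” and “goal_completions” columns and draws one box per month, so the median, the interquartile range, and the spread of typical days are visible at a glance.

```python
# Minimal sketch: box-and-whisker plot of daily Google Analytics goal
# completions, one box per month. File and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("goal_completions.csv", parse_dates=["date"])
df["month"] = df["date"].dt.to_period("M").astype(str)

# Each box spans the interquartile range, the line inside is the median,
# and the whiskers show the typical daily spread for that month.
df.boxplot(column="goal_completions", by="month", rot=45)
plt.suptitle("")  # drop pandas' automatic grouped-boxplot super-title
plt.title("Daily goal completions by month")
plt.ylabel("Goal completions per day")
plt.tight_layout()
plt.show()
```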

CP: Right? No, that makes sense, because over the weekend I was playing around with some financial analysis techniques, and this one programming package has something like 70-some-odd techniques that you can use to analyze financial data, mostly stock data, but about 70% of them have direct application to marketing analytics. The challenge is A) you’ve got to know what the technique is, B) you’ve got to know what the output is, and C) you’ve got to be able to explain how the output is useful to someone who is not looking at the analysis. So from that perspective, yes, the chef is constantly skilling up, which is what you want to have happen, but the restaurant audience may not be ready to eat some of it yet. You serve exotic food to people and they’ll go, “What the heck is that?”
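The episode doesn’t name the package, so as one hedged example of the kind of crossover Chris describes, here is a sketch of a Bollinger-band-style calculation (a rolling mean plus or minus two standard deviations, borrowed from stock charting) applied to daily sessions to flag unusually high or low days. The file and column names are hypothetical, and this is not the specific package mentioned in the episode.

```python
# Minimal sketch: a Bollinger-band-style check (rolling mean +/- 2 standard
# deviations), a stock-charting technique repurposed to flag unusual days
# of website traffic. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("daily_sessions.csv", parse_dates=["date"]).set_index("date")

window = 20
df["mid"] = df["sessions"].rolling(window).mean()
df["upper"] = df["mid"] + 2 * df["sessions"].rolling(window).std()
df["lower"] = df["mid"] - 2 * df["sessions"].rolling(window).std()

# Days outside the bands are unusually high or low relative to recent history.
outliers = df[(df["sessions"] > df["upper"]) | (df["sessions"] < df["lower"])]
print(outliers[["sessions", "lower", "upper"]])
```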

KR: And I think it always goes back to: what is the question you’re trying to answer with this data? What are you trying to understand with this one particular data point or this set of data? Then the person doing the analysis can say, “Okay, it sounds like you’re trying to understand which day of the week we get the most foot traffic. Well, we’re closed on Saturday and Sunday.” So it’s having that context, that Saturday and Sunday are closed, that lets you ask yourself, “Do I do a moving average? Or do I do something else to make sure that I’m accounting for the zeros that are going to happen at certain times?” So it’s also the marketer’s responsibility to educate the person doing the analysis and to show them the context.
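Katie’s point about the zeros can be shown with a small comparison, again as a hedged sketch with hypothetical “date” and “visits” columns: one rolling average keeps the closed-day zeros, and one masks closed days out so only open days are averaged.

```python
# Minimal sketch: two ways to handle days the business is closed before
# averaging. Hypothetical "date" and "visits" columns.
import pandas as pd

df = pd.read_csv("foot_traffic.csv", parse_dates=["date"]).set_index("date")

# Option A: keep the zeros; closed weekends pull the average down.
df["avg_with_zeros"] = df["visits"].rolling(7).mean()

# Option B: treat Saturday and Sunday as missing rather than zero, then
# average only over the open days inside each 7-day window.
open_days = df["visits"].mask(df.index.dayofweek >= 5)
df["avg_open_days_only"] = open_days.rolling(7, min_periods=1).mean()

print(df.tail(14))
```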

CP: Right. I think that’s a good place to wrap up. For the marketer who’s not doing the analysis, be crystal clear about what you want, what you need, and what you’re going to use it for. In a lot of cases, if you’re talking to a really experienced data analyst, they have a huge palette of tools available to them, but they need to know specifically what you’re trying to get at because what you say initially may not be what you actually need.

For the data analyst side, know the full scope and scale of the tools you have available to you, but don’t just throw them all out there. Listen carefully and use the feedback you’re getting from the customer or the business professional to decide which techniques you will eventually select. Because if you don’t have that, you will just be puking data. And if you are in a situation where you have none of this, feel free to give us a ring. We’re happy to have these conversations with your organization and figure out what should be on the menu and who’s eating at the restaurant.

As always, please leave comments on this podcast episode over at TrustInsights.ai and subscribe to the newsletter and our YouTube channel. Talk to you soon.


Need help with your marketing AI and analytics?


Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
