
{PODCAST} In-Ear Insights: Avoiding Data-Driven Marketing Traps

In this episode of In-Ear Insights, Katie and Chris dig into this key question: is there such a thing as being too data-driven?

“I’m not sure if you and I have discussed this in the past but it’s been on my mind a lot lately, and I’m not sure if there’s formal thought around this:

Being TOO data-driven.

I’ve seen areas where data point to weird or undesirable possibilities. Examples:

  • A female YouTuber is clear that videos do better when the thumbnail has her in a bikini. Sometimes it’s appropriate because it’s a beach vlog; other times, no. But the data say: wear a bikini
  • By far, my most popular blog post is one I regret having written because it was tongue-in-cheek and deliberately used a clickbait headline. It resulted in 6 years of mean comments from people who didn’t read the article. But the analytics suggest I do more of that.

The difficulty is having clear evidence that something works, but asking: is this content (or a product/service) that we want to keep creating?

It’s popular to say, “it’s not about you, it’s about what the audience wants.” And that’s been the approach of coaches I’ve paid and people I’ve asked for advice. They go straight to the analytics and suggest doing more of the top-performing stuff.

For the female YouTuber, it might be an easy decision to always use a bikini-clad photo of herself on thumbnails and make that part of her brand. But this person was torn. Once in a while was fine, when it was relevant to the video. But the data was pushing her toward, “No. Do it all the time.”

Said another way: when do you tell the data “shut up! I’m not listening to you”? And how do you manage the mental strain of resisting something that clearly works?”

Dig in as they tackle how to be data-driven in your marketing while avoiding these kinds of traps.


Watch the video here:


Can’t see anything? Watch it on YouTube here.

Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn 0:02
This is In-Ear Insights, the Trust Insights podcast. Do you want to use AI in your marketing but you’re not sure where to start? Take a class with Trust Insights and the Marketing AI Institute. The AI Academy offers more than 25 classes and certification courses to start you on your AI journey, including our Intelligent Attribution Modeling for Marketers certification. One membership gets you access to all 25 classes. Visit TrustInsights.ai slash AI Academy to learn more and enroll today. That’s TrustInsights.ai slash AI Academy to enroll today. Are you struggling to reach the right audiences? Trust Insights offers sponsorships in our newsletters, podcasts, and media properties to help your brand be seen and heard by the right people. Our media properties reach almost 100,000 people every week, from the In-Ear Insights podcast to the most timely In the Headlights newsletter. Reach out to us today at TrustInsights.ai slash contact to learn more. Again, that’s TrustInsights.ai slash contact. In this week’s In-Ear Insights, friend of the show Oz du Soleil asks: is there such a thing as being too data-driven? The examples he gave in the comments he made on LinkedIn were about how, when you look at data, it can lead you down pathways that you might not want to go. One example he gave: a female YouTuber is clear that videos do better when the thumbnail has this person in a bikini. Sometimes it’s appropriate because it’s a beach vlog, other times no. But the data says always wear a bikini. And there are a variety of other examples. But in these examples, is there such a thing where you can be so focused on trying to maximize the results that you end up doing things that might not be great? I feel like there are two branches we could go with this, and I think we should talk about them both. One is on testing, but I think the bigger one comes down to values and ethics. So Katie, when you listen to these different examples of, you know, people writing clickbait headlines or using thumbnail images and stuff, and yes, the data says do more of this, what’s your take?

Katie Robbert 2:12
My take is that yes, you can be too data-driven. And as you start to walk backwards, first and foremost: is your data right? Are you getting the right information? If you’re so heavily focused on what the data is telling you to do, have you been focused enough on whether the data is set up correctly? So there’s that part of it, the infrastructure and the data integrity piece, because, you know, false information, misinformation, all of those things include data in them. But that doesn’t mean the data is correct. And that in and of itself is a fairly large topic to tackle. But if you’re talking about, you know, the ethics of it, I think that listening to the data is usually a good plan. However, what we know about being data-driven and about artificial intelligence in general is that there is still a very large role for humans to play; you need that human judgment. And so, you know, we’ve talked on past episodes about how training data sets have been put together incorrectly, and so they give you the wrong information, one of the more famous examples being Amazon hiring only men, because that was what, historically, their training data set said. And you still need that human to say, you know what, that doesn’t sound right. And so that’s to me really where we land with, you know, can you be too data-driven? Absolutely. Because if you take the data at face value and just assume that it’s correct, then that’s also problematic. You still need that gut instinct, or that “that doesn’t seem right,” or “that doesn’t align with what it is that we’re trying to do.”

Christopher Penn 4:01
Yeah, to me, data-driven is a tactic, and a lot of these questions are actually strategic ones, in the sense that, you know, data-driven literally is when you open up Google Maps and you have it help you plot the route to your destination. In this case, it seems to me like the YouTuber example isn’t thinking about the destination of where they want to go. They’re just having their GPS say, go this way, it’s faster, this is faster, this is faster, but there’s no real destination, or it’s a destination they didn’t mean to go to. Like, do you want to be, you know, the bikini-clad YouTuber? Is that going to be your brand? Is that your strategy? If so, then yes, following the current path the data has you on is the way to go. But if that’s not where you want to go, then you’re right, your data is actually misleading you, in the same way that if you wanted to go to Boston but you put, you know, Burlington, Vermont in your GPS, it doesn’t matter how optimized the data is, you’re still going to the wrong place. I feel like in these examples we’re talking about, the person asking the question has not built that strategy, has not built a plan, and has not set a goal that’s clear, which would help them with that data.

Katie Robbert 5:19
You know, it’s interesting that you say data-driven is a tactic, and I agree with you. I think that when you say your mission or your values are to be data-driven, well, yeah, you and everyone else. Whether or not you realize it, you are looking at data, or listening to data, or, you know, including data in your decisions. Every single decision you’re making throughout the day includes some data point, and it’s not always a numeric data point; you’re not always saying, I’m going to look at a spreadsheet and make a decision. So I would argue that everyone and everything is already data-driven, so saying it is redundant, and it’s not, you know, a core value of your company. Like, if you say, hey, I looked at our revenue numbers for last year and I want more, great, you’re data-driven: you looked at the data, you’re driving your decision based on the data. So I would argue that using the term data-driven is irrelevant in the conversation. Just take it out of there. Let’s just say everyone’s data-driven, everything is data-driven, everything is a data point. So just get it out of the conversation. So then really what we’re coming down to is: are you making the right decisions based on the information presented in front of you?

Christopher Penn 6:38
Are you saying data-driven is the new synergy?

Katie Robbert 6:46
Hey, let’s circle back on that later. Put a pin in it.

Christopher Penn 6:51
All right. Secondarily, I think there’s the optimization trap at play here, too. And this is something that a lot of folks who do A/B testing should be aware of. What it means is that if you are constantly optimizing, which we recommend, you should always be optimizing and trying to improve your results, what can happen is you end up optimizing for an ever smaller part of your audience. So if you have two versions of a creative, A and B, and A wins with 60%, great, now you optimize for A. The next time you do a test, you have C and D, and C wins with, you know, 55%. Now you’ve not taken 55% of your audience, you’ve essentially taken 55% of that 60% from A. And so you keep getting pinned down into this niche of “this is what the data says,” but again, you’ve lost sight of the goal, and you actually create an increasingly larger pool of people you’ve pissed off along the way. Because even though that 40% was the loser in the test, it’s still 40% of your audience. So in these examples of, you know, should you wear a bikini all the time because it’s what the data says, you’re not introducing anything new into the testing so that you can see, maybe this isn’t the best thing, or maybe there’s some additional context. One of the things in machine learning that is really important for coders to understand is that you need to do what’s called perturbation testing, where you add in five or ten percent net new things that either were not in, or that you explicitly excluded from, the optimized pool, so that you can test to see, hey, maybe this new thing we’re trying on the side here actually performs better than, you know, the 80% of pre-optimized data that we started with. That way, you can constantly find out, oh, there’s a new thing here. Real simple example: if you’re on Amazon and you’re shopping, and you know this happens after a while, you’re shopping for, I don’t know, new slippers, and suddenly your whole feed of “you might also enjoy” becomes variations on slippers, right? But you’ll notice as you do that, Amazon will toss in, like, a lawn mower, or other things that are related to other interests you have. They’re testing to see, like, are you still on the slippers kick, or have you changed your mind? Like now you’re looking at books, or now you’re looking at, you know, various versions of hot sauce. That testing allows you to figure out, oh, this isn’t just a one-trick pony. Whereas it sounds like in a lot of these examples, there’s not enough testing going on, and so these marketers are ending up as one-trick ponies as opposed to constantly testing their audiences. And that, to me, also feels like not only a lack of strategy, but also a lack of awareness about how testing and optimization are supposed to work.
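To make the perturbation idea Chris describes a little more concrete, here is a minimal Python sketch of an exploration step that serves the current winning creative most of the time but reserves a small slice of impressions for net-new variants. The creative names and the ten percent exploration rate are illustrative assumptions, not anything from the episode or from a specific testing tool.

```python
import random

# Minimal sketch of the "perturbation" idea: mostly serve the current winner,
# but reserve a slice of traffic for net-new variants so the test pool never
# collapses onto a single niche of the audience.
# Note: without exploration, successive winners compound; e.g. optimizing for
# 55% of the 60% who preferred A leaves roughly 0.60 * 0.55 = 33% of the
# original audience in focus.

def choose_creative(current_winner, new_variants, explore_rate=0.10):
    """Return the creative to show for one impression."""
    if new_variants and random.random() < explore_rate:
        return random.choice(new_variants)  # try something outside the optimized pool
    return current_winner                   # otherwise exploit the known winner

# Example: simulate 10,000 impressions and see how the traffic splits.
counts = {}
for _ in range(10_000):
    shown = choose_creative("thumbnail_A", ["thumbnail_C", "thumbnail_D"])
    counts[shown] = counts.get(shown, 0) + 1

print(counts)  # roughly 90% thumbnail_A, about 5% each for C and D
```

The reserved slice does exactly what Chris describes: it keeps fresh options flowing into the test so the data can tell you when the “winner” is only winning inside an ever-shrinking corner of the audience.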

Katie Robbert 9:45
I think it’s testing, and I think you’re right. I think the other piece of it, because, as you know, with digital marketing you need all of the things working together, is: are you even getting your stuff to the right audience? I’m looking at these examples and it says, you know, a female YouTuber does better with her thumbnail when she has a bikini on. Well, I don’t even know what the thing is that she does. She could just be, you know, reaching people who like to see a woman in a bikini, and so it doesn’t matter what it is that she does, she’s already reaching the wrong audience. So I think that there’s a lot of, you know, audience analysis that also has to go in with it. You know, this person who asked the question, Oz, said that his most popular blog post was tongue-in-cheek and deliberately used a clickbait headline. Well, I don’t know what the thing was about, but obviously it attracted a lot of people who may not even care about the thing that he was trying to write about. And so testing is a big part of it. Obviously, don’t just do one test; you continue to iterate your testing, Chris, as you’re describing. But also, you need to understand: what is the audience that you’re reaching? And so, you know, let’s say, as a terrible example, every single one of our blog posts has a picture of one of our dogs. There are millions of people in the world who love pictures of dogs, and so every one of the mobile or desktop previews includes a picture of a dog. Therefore, people who like dogs start to click into our content and very quickly realize, this isn’t the content for me, but it had a picture of a dog, so I was attracted to it. And, you know, we can continue to test, you know, my dog, your dog, John’s dogs, all the different dogs that we have, and it’s still the wrong thing to be testing, and we’re still attracting the wrong audience. We are attracting people who love dogs, when we really want to be appealing to people who want help with their data, or their Google Analytics, or their marketing strategies. And so I think that there’s, you know, another layer to this in terms of being too data-driven. Like, yeah, if we keep putting up dogs, the data is going to tell us that the pictures of dogs are really popular. But that’s the wrong thing to be putting up a picture of, because then we are attracting the wrong audience.

Christopher Penn 12:13
And I think that would come out in the data if you were doing a better job with the data if you had better attribution modeling and things because you would find out you attract a lot of traffic, but none of it converts.

Unknown Speaker 12:23
Right.

Christopher Penn 12:23
So if you’ve optimized for eyeballs, then yes, dogs in bikinis would be the thing to get eyeballs.

Unknown Speaker 12:32
Can we just let the bikini thing go?

Christopher Penn 12:35
with cigars?

Katie Robbert 12:37
Just let it go.

Christopher Penn 12:40
We would get no conversions on that, because you get there, as in one of these examples, and you’re like, oh, this is not at all what I came for. And so that’s a failure of having the wrong goal, right? If you are selling something, like Oz, who is a Microsoft MVP, sells, you know, amazing Excel trainings, and you’re putting up a piece of clickbait that gets an audience that doesn’t care about Excel, then you’re not going to convert. You know, we put up a piece of content and we’ve had this discussion internally, like, should we be putting up as much content as we do about things like Instagram analytics when we’re not a social media agency? We can help you with your social media data, but we’re not a social media agency, and we’re certainly not an ad agency. And are we getting those conversions from the right people? So again, no surprise, it comes back to: what’s the goal that you’re working towards? If the goal is eyeballs to sell ad inventory, great. But if the goal is to get people to buy something, a core product, and you’re out of alignment with your core product, it’s not going to be great. But I suspect for a lot of these examples, that attribution analysis probably does not exist.

Katie Robbert 13:50
It doesn’t. I would say that it definitely doesn’t. And we know from the work that we’ve done that there’s not a great out-of-the-box system that does the kind of attribution analysis, Chris, that you’re talking about. So, you know, out of the box, Google Analytics does offer attribution analysis, but it has limitations to what it’s able to do and what it’s able to show you in terms of what’s happening within your data. It does a great job of first-click and last-click attribution, and depending on the type of company that you are, that might be fine. But if you are a company that has multiple channels and multiple efforts, and social and email and organic and paid and, you know, this and that and other things, then there are definitely limitations to what you can find out from the out-of-the-box Google Analytics attribution modeling. And so you’re definitely at a disadvantage in terms of finding out what it is that’s working for you and bringing in your audience. But if you’re just looking for something basic, like, does this page help people convert, then yes, you can find that out from Google Analytics. So, you know, in this example of “I wrote a blog post with, you know, a clickbait headline, and it resulted in six years of comments,” well, I can probably make the strong assumption that it drove traffic, but nobody converted because of it, because it was clickbait and you were luring people in to read the thing, and they were like, hey, that’s not what I wanted. And so those are easy ways to determine: is the content that you’re putting out there doing what you want it to do? So, Chris, in your example, we could drive more traffic to our website by posting more about Instagram TV and other social metrics, and people would convert, but then we would have to sit there and say, but we don’t do social media. So you converted, and we’re so sorry, we can’t help you. Yep.
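To illustrate the limitation Katie is describing, here is a minimal Python sketch showing how first-click and last-click models each hand all of the credit to a single touchpoint on the same multi-channel journey. The journey itself is an illustrative assumption, not real analytics data from any tool.

```python
# Minimal sketch: first-click vs. last-click credit on one multi-touch journey.
# The journey below is an illustrative assumption, not real analytics data.

journey = ["organic search", "email", "social", "direct"]  # ordered touches before a conversion

def first_click(touches):
    """All credit goes to the first touch."""
    return {ch: (1.0 if i == 0 else 0.0) for i, ch in enumerate(touches)}

def last_click(touches):
    """All credit goes to the last touch."""
    return {ch: (1.0 if i == len(touches) - 1 else 0.0) for i, ch in enumerate(touches)}

print(first_click(journey))  # organic search gets everything
print(last_click(journey))   # direct gets everything; email and social look worthless
```

Either way, the channels in the middle of the journey look like they did nothing, which is exactly why companies with many channels outgrow the simplest out-of-the-box models.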

Christopher Penn 15:52
I think when we look at how we look at this data, even a basic attribution model, you know, time decay, for example, would be good enough to know, oh, this ain’t working, right? You might need the more sophisticated attribution modeling to understand the nuances of how different channels interplay, but if you put out enough stuff and it’s just not converting, it’s like, oh, this flat out ain’t working. You don’t need a sophisticated model to know things just aren’t working; you do need a sophisticated model to understand how things work. But this is pretty clearly a case of: I put up a picture of a dog in a hat, but I sell analytics. That’s not going to convert. And so again, unsurprisingly, it comes back to: did you have a goal and a plan? And if the data is telling you something that is the opposite of the goal, you have to decide: is the goal one you still want to pursue? If it isn’t, great, change it. And if it is, then yeah, you have to ignore the data. You have to be willing to say, yep, we’re going to willfully ignore this data, even though it looks good. There’s a post I wrote in 2007 on the difference between two types of decongestants, pseudoephedrine and phenylephrine, and that’s the highest-traffic blog post I have in any given wintertime month, because people still don’t understand the difference. But I ignore it completely. I don’t ever optimize it; I’ve never gone back and updated or refreshed the thing, because it’s not something I want to be known for. It’s there because I wrote it for fun, but it’s not relevant, so I put no resources and effort towards it, because that’s not what I want people coming to my blog for. Same thing for my pizza sauce recipe: it’s good, but it’s not what I want to be known for. So to kind of circle back and wrap up on all this: if you don’t have a goal and a plan, yes, your data can mislead you and take you places you may or may not want to go, and if you don’t have a direction, you’ll always end up at, you know, sort of the lowest common denominator of whatever your audience is. You need that goal and that plan. Katie, anything else we need to tie up on this?
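As a rough illustration of the time-decay model Chris mentions, here is a minimal Python sketch that splits conversion credit across touchpoints with an exponential decay, so touches closer to the conversion earn more credit but nothing earns zero. The seven-day half-life and the example journey are illustrative assumptions, not settings pulled from Google Analytics or the episode.

```python
# Minimal sketch of a time-decay attribution model with a 7-day half-life.
# The half-life and the example journey are illustrative assumptions.

HALF_LIFE_DAYS = 7.0

def time_decay_credit(touches):
    """touches: list of (channel, days_before_conversion). Returns fractional credit per channel."""
    weights = {}
    for channel, days_before in touches:
        w = 0.5 ** (days_before / HALF_LIFE_DAYS)  # exponential decay by recency
        weights[channel] = weights.get(channel, 0.0) + w
    total = sum(weights.values())
    return {ch: round(w / total, 3) for ch, w in weights.items()}

# Example journey: organic search 14 days out, email 3 days out, direct on the day of conversion.
journey = [("organic search", 14), ("email", 3), ("direct", 0)]
print(time_decay_credit(journey))
# direct gets the most credit, organic search the least, but every touch gets something
```

Even this simple weighting is enough to surface the pattern Chris describes: a page that draws lots of traffic but never shows up anywhere in converting journeys is flat out not working.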

Katie Robbert 18:04
You need the goal and the plan, and you need that human interaction with the data. The data, standing alone, doesn’t make decisions. You know your audience, you know your company, you know your goals; therefore, you need to have a heavy hand in determining what you’re going to do with the data. If you just, like, hand someone a spreadsheet and say, do whatever the data says, that is the absolute wrong way to be data-driven. Data-driven, really, you know, the spirit of it, what it is meant to encompass for companies, is that we use data in a smart way to make good decisions for our company. What that really implies is that we are collecting data the right way, we are looking at the data on a regular basis, and then we are making decisions based on what we inherently know about our audience and what they want, because the data can’t always tell you all of that nuance. And so, I think, you know, you need the people, you need to collect the data the right way, and you need to trust your gut and your instincts.

Christopher Penn 19:14
Data-driven means you do the driving; the map just helps you get there. And if you don’t know where you’re going, you will find out very quickly just how entertainingly Google Maps can get you lost. If you have follow-up questions about this or anything else, please hop on over to our free Slack group. Go to TrustInsights.ai slash analytics for marketers, where you can join over 1,400 marketers talking about all the analytics issues of the day. If you haven’t subscribed to the show, wherever it is you’re consuming it right now, go to TrustInsights.ai slash ti podcast, where you can subscribe to the show on the platform of your choice and never miss an episode. Thanks for listening, and we’ll talk to you soon. Take care. Want help solving your company’s data analytics and digital marketing problems? Visit TrustInsights.ai today and let us know how we can help you.



Need help with your marketing AI and analytics?

You might also enjoy:

Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
