In this week’s In-Ear Insights, Katie and Chris answer a very popular question: what do you do when the numbers don’t add up? When your Facebook results conflict with the data in Google Analytics, when your CRM doesn’t match your marketing automation, when accounting asks you why your forecasts were wrong – what do you do? Listen to some solutions and how to approach the problem – and what you can’t fix.
Subscribe To This Show!
If you're not already subscribed to In-Ear Insights, get set up now!
- In-Ear Insights on Apple Podcasts
- In-Ear Insights on Google Podcasts
- In-Ear Insights on all other podcasting software
Advertisement: Data Science 101 for Marketers
Do you want to understand data science better as a marketer? Would you like to learn whether it’s the right choice for your career? Do you need to know how to manage data science employees and vendors? Take the Data Science 101 workshop from Trust Insights.
In this 90-minute on-demand workshop, learn what data science is, why it matters to marketers, and how to embark on your marketing data science journey. You’ll learn:
- How to build a KPI map
- How to analyze and explore Google Analytics data
- How to construct a valid hypothesis
- Basics of centrality, distribution, regression, and clustering
- Essential soft skills
- How to hire data science professionals or agencies
The course comes with the video, audio recording, PDF of the slides, automated transcript, example KPI map, and sample workbook with data.
Sponsor This Show!
Are you struggling to reach the right audiences? Trust Insights offers sponsorships in our newsletters, podcasts, and media properties to help your brand be seen and heard by the right people. Our media properties reach almost 100,000 people every week, from the In-Ear Insights podcast to the Almost Timely and In the Headlights newsletters. Reach out to us today to learn more.
Watch the video here:
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
- Need help with your company’s data and analytics? Let us know!
- Join our free Slack group for marketers interested in analytics!
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Christopher Penn 0:02
This is In-Ear Insights, the Trust Insights podcast.
In this week’s In-Ear Insights, what happens when the numbers don’t match? We’ve gotten a number of questions at recent events, MAICON, MarketingProfs, and things like that, about situations where you have two different data sources, like Facebook Ads and Google Analytics, or even Google Ads and Google Analytics, or Facebook and Instagram, etc.
Where the performance numbers, the impressions, the clicks, the conversions, just don’t match, and not just by a little; in some cases they don’t match by a whole lot.
And it puts people in very difficult positions when they’re making reports to senior management to say, hey, here’s what Facebook says we did.
Here’s what Google says we did.
And senior management’s like, so which is it? Because they both look wildly different.
So Katie, when you look at reporting, when you look at or if someone gives reporting to you, and things are clearly not right, what do you do?
Katie Robbert 1:03
Um, you know, I used to run into this a lot when I was doing more regulated work with pharmaceuticals and clinical trials, because you had to have a one-to-one match in the data.
So with a clinical trial, a lot of times you would collect the data twice to validate that the data you were collecting was correct.
The difference there is that you are collecting it the exact same way, whether it be by computer or by hand, sort of controlling the experiment, but it was the exact same methodology.
What you’re talking about, Chris, is, you know, basically like unicorns and pumpkins: you have Facebook, which collects data their way, and Google, who collects data their way.
And so it’s never going to be a complete one-to-one match.
If it was, they would be the same company, or they would have shared secrets with each other, which we know is never going to happen.
And so I think that’s where we need to start is, what is the source of the data that you’re trying to compare? What is the methodology in which the software systems are collecting the data? Because we know from our work that, you know, the way that some of these data points are defined, even if they’re using the same terminology is different.
There is no good industry standard, because every software system wants to do it their own way.
So that’s where I would start: what are your sources of the data? What are you trying to compare? Is it apples and apples, or is it apples and oranges? And so, you know, Facebook and Google do collect data differently, and they also don’t have the same kind of transparency.
And so, you know, I’ve seen a lot, this is a question that actually comes up quite a bit: you know, well, Facebook says I have this many impressions.
But Google says this. Well, how the heck is Google going to know how many impressions you have on your Facebook social post when it never entered the Google ecosystem at all? And so it’s also just that education on the limitations of the software systems themselves.
So that’s where I start.
Christopher Penn 3:07
Makes sense. One of the things that I always ask folks is, what’s the business outcome, right? Is it leads? Is it ecommerce sales? Is it foot traffic to your store? And then I suggest to them: have you done any kind of basic correlation analysis of all these numbers you get from Facebook or YouTube or Instagram or Google Analytics, whatever, to see which one mathematically looks like the outcome you care about? If Facebook says you made a million dollars in sales, and Google Analytics says you made $2 in sales, okay, well, what does your accounting system say? At the end of the day, the CFO is gonna be the one saying, well, we made $10,000.
And you go, okay, so which one is closer? Well, in this case, it’d be, you know, Google Analytics. They’re both wrong.
But from accounting’s perspective, Google Analytics, in this case, would be less wrong than Facebook, because you didn’t make a million dollars.
You didn’t make two either.
But the two is closer to the actual number.
And so from a math perspective, that seems to me a place that I would start: to say, okay, well, which one looks more like reality? The challenge being, if you don’t have access to that reality, you may want to work on your internal governance.
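The "which number is less wrong" check Chris describes can be sketched in a few lines. The figures below are hypothetical, purely for illustration; in practice you would pull several months of actual data from each system and from accounting.

```python
# Hypothetical monthly revenue (USD) from each system; accounting is the ground truth.
accounting = [10_000, 12_000, 9_500, 11_000]
facebook   = [1_000_000, 950_000, 1_200_000, 1_100_000]
analytics  = [8_000, 10_500, 7_900, 9_200]

def mean_abs_pct_error(reported, truth):
    """Average absolute percentage error versus the accounting numbers."""
    return sum(abs(r - t) / t for r, t in zip(reported, truth)) / len(truth)

for name, series in (("Facebook", facebook), ("Google Analytics", analytics)):
    print(f"{name}: {mean_abs_pct_error(series, accounting):.0%} off accounting")
```

With numbers like these, Google Analytics comes out far less wrong than Facebook, which is the decision the CFO conversation actually needs: not which system is right, but which one is closest to reality.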
Katie Robbert 4:22
But I, so I feel like that sort of becomes the point of the question, because, Chris, the situation you’re describing is, well, both systems are wrong.
Neither of them reflects what accounting has.
And why is that? And so, I think one of the things that we as marketers try to do is force the systems, square peg, round hole, into reflecting a reality that they can just never reflect, because, you know, they only have a limited scope of what they’re able to see.
And so our job as marketers is to create that holistic picture of all of the data together.
And so I guess when someone says to me, well, my data doesn’t match: well, it’s not supposed to. It’s one piece of the whole picture.
And so it’s this part over here and this part over here.
And so, to your point, Chris, about governance: you need to understand, you know, what is your reality supposed to reflect, and what are the data points that you need in order to understand that reality? It might be a little bit from over here, a little bit from over there, a little bit from over there. I don’t know that there is one single system that can accurately pull all of that data from start to finish.
Christopher Penn 5:40
Well, you’re right, there isn’t, especially because, to your earlier point, it’s not gonna match.
So if you look, for example, at our content caching system, Cloudflare, it gives us web analytics traffic. It says, here’s how many people have passed through our system on the way to your website.
And it’s different than Google Analytics.
And in some cases, it’s pretty substantially different.
And then if you look above that, at, say, Google Ads, how many people clicked on your ads? Again, that number is different.
I see this every week in our email marketing stats. I see, you know, this link got 40,000 clicks.
And then when you look in Google Analytics, it says you got 4,000 clicks. Like, what happened there? Why is it such a huge difference? But you’re right, it’s kind of like the old proverb of the five blind people feeling different parts of the elephant and not sharing. None of them knows it’s an elephant: one thinks it’s a tree, one thinks it’s a rope, one thinks it’s a leaf, a big tree leaf, and none goes, oh, this is an elephant.
Katie Robbert 6:37
So I think, you know, this sort of gets into that discussion, where a lot of companies are trying to find that, you know, unique snowflake of the single view of the customer.
And, you know, we’ve worked with a lot of clients just on having that conversation, and really helping them understand what that really means.
And so, the goal of the single view of the customer is to see every single touch point that person went through, offline and online, that led them to a sale, so that they can start to understand what that looks like.
The problem with that is that there’s a lack of understanding of how many different systems that it’s going to take in order to get that single view of the customer.
And that goes back to the original question, Chris, of what do we do when the data doesn’t match? It’s not supposed to. It’s different systems.
And so one might have impressions, one might have website visits, one might have goal completions, one might have revenue, one might have personality traits, one might have customer feedback surveys.
And all of those systems together make one full picture but they’re not meant to be compared to one another.
Christopher Penn 7:46
Now, the one thing I will say is that if you think about the marketing operations funnel, from the very bottom, which is your revenue, all the way to the very top, which is, you know, your awareness: the closer you are to the bottom of that marketing operations funnel, the more accurate the numbers should be, right? Because the further down the funnel you get, the more control you have. If you are in your sales CRM and you don’t know how much money you made, you have a problem, like a really, really serious problem.
Because your CFO is probably strangling your chief sales officer.
Because, like, none of what you’re telling me reflects reality. Whereas your CFO is probably going to be okay with, you know, your PR impressions being wildly, ridiculously gigantic, like 8 billion people saw it, when we know that’s, you know, a fictional impossibility.
But because it’s so far away from the bottom of the funnel, it’s less impactful, right? Your business isn’t riding on it.
If your closed-won deals do not show up in accounting, something’s wrong.
Katie Robbert 8:58
I think it’s, you know, it’s interesting that you’re saying that, like, the farther away from the impact, the more okay it is for the numbers to be wrong.
And of course, my insides are cringing, like, no, it’s never okay for those numbers to be wrong.
And so I think that then sort of like goes down that, you know, dark, dirty path of, you know, the inflated numbers and the inaccuracy.
But it’s okay, it didn’t affect the bottom line.
So let’s just go ahead and keep inflating numbers.
And so, Chris, can you talk a little bit more about why, you know, some metrics are okay to be incorrect, and some metrics aren’t?
Christopher Penn 9:33
It’s not that it’s okay.
It’s that you have no control over it, right?
Like, when Facebook says you got a million and a half impressions on your ad.
We don’t really know how Facebook measures it.
We have the documentation, which says an impression is any time it appeared in the feed, you know, as the user was thumbing by as quickly as possible to get to their friends’ stuff.
You know, when you think about Instagram, for example, and you’re on Instagram, and you’re swiping through your friend’s stories, and there’s an ad you swipe past it.
Well, guess what, that advertiser has been told this ad was served, this was an impression.
So even though you saw it for half a second, and you probably couldn’t even tell me what it was, you just knew it was an ad.
Because it wasn’t a thing you recognized from your friends.
We don’t have control up there at the top of the funnel. We have no control over the measurement systems.
We have no control over what the numbers even mean, right? You said this earlier.
A YouTube view is 30 seconds of viewing time; a Facebook view is three seconds.
They’re apples to kumquats, right? They’re not even comparable; one is 10x different than the other.
So you can’t even make an apples to apples comparison at that top layer.
And so all those numbers are wrong.
When you get to the middle of the funnel, and you start talking about things like web analytics, like Google Analytics and stuff, you are still dealing with things like ad blockers that are knocking out your tracking scripts, unless you’re using server-side tracking, and even then that’s iffy.
You’re still dealing with proxies. For example, Apple, in the new iOS 15, now has a VPN built right into Safari.
So if you’re using the browser on your Apple phone and you’ve turned this thing on, you are now screwing up the marketer’s analytics. They think, you know, all this traffic is coming from Kansas, because that’s where it’s geolocated, from wherever Apple’s nearest server is.
And so the marketer is like, oh, we’ve got to start doing some campaigns in Kansas, and everyone’s like, that makes no sense. We’re in Germany; why would we do that?
And it is only when you’re at the bottom of that funnel where you can say, yes, we’re at a point now where the data we’re working with, we have control over. We have control over how it’s measured.
When you’re talking to a salesperson, you’re saying, hey, salesperson, I need you to enter the data into salesforce.com, like this, right? And you need to update your leads like this.
And so you have that ability to exert that control and therefore the accuracy gets better.
Because you control the measurement method.
You control what the numbers mean.
And you can control the data quality much better.
It’s not great that your PR agency tells you you have 25 billion impressions on an article.
I’m not saying that’s a good thing; you know that it is completely wrong.
But you don’t have control over it.
So it’s a question of where do you fight your battles?
Katie Robbert 12:18
So it’s interesting, because we started the conversation saying, what do we do when the numbers don’t match up?
And now we’ve sort of gone into this part of the conversation where we’re saying, you know, you really have to only focus on what you can control. But that then puts a whole bunch of people out of work, because there are agencies that only go after, you know, impressions and views and those kinds of things.
And those are metrics that matter to a lot of companies.
And so I guess, my question is, you know, how do we help educate on those metrics? How do we make them better? Is there a way to get to a more accurate state with those numbers? Or are we still forever going to be at the mercy of these third party systems that make up their own rules? And maybe change them every day?
Christopher Penn 13:11
It depends, which is the answer I know we all love.
It depends on consistency.
If a number is consistently wrong, like if you know that this number is always going to be 10x off of reality, you can model that, you can account for that, right? And then that number becomes usable.
Right? So if Google Ads tells me my ad got 10,000 clicks, and Google Analytics shows me I got 1,000 visitors.
And the next day, it’s 5,000 clicks and 500 visitors, and so on and so forth.
If I always know that the Google Ads number is going to be off by exactly a factor of 10x, then I can predict it, I can forecast it, I can make good assumptions from it.
If, on the other hand, one day it’s 10x off, the next day it’s 2x off, the next day it’s 5x off,
I can’t use it. I can’t predict how wrong it is.
So a key factor in dealing with those numbers you don’t have control over is modeling their consistency: to say, okay, if we accept it’s going to be wrong, how consistently wrong is it? Can we account for it? You and I do this when we’re setting up someone’s Google Analytics goals.
If we know that a lead is worth, say, $100 to a company, because you know the sales closing rate and deal rate and all that stuff.
If you know a lead is worth $100, and you check the next month and a lead is worth $101, and you check the next month and a lead is worth $99, then you can say pretty comfortably,
okay, $100 is probably about right for that goal value in Google Analytics, and that’s going to be good enough to help you make decisions.
Right? If, on the other hand, you do the exercise and one month a lead is worth $1 and the next month a lead is worth $10,000, like, we gotta come up with something else.
Gotta find something else that’s consistent so that you can make decisions on it.
Because that’s really, at the end of the day, what matters: can you make a decision on the data?
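The consistency test Chris describes might be sketched like this. The daily figures are made up, and the 10% threshold is a judgment call, not an industry standard: the point is that a stable bias can be modeled, while an unstable one can't.

```python
import statistics

# Hypothetical daily figures: clicks reported by Google Ads vs. visitors in Google Analytics.
ads_clicks  = [10_000, 5_000, 8_000, 12_000]
ga_visitors = [ 1_000,   500,   820,  1_150]

ratios = [clicks / visitors for clicks, visitors in zip(ads_clicks, ga_visitors)]
mean_ratio = statistics.mean(ratios)
cv = statistics.stdev(ratios) / mean_ratio  # coefficient of variation: spread relative to mean

if cv < 0.10:  # illustrative threshold: under ~10% wobble, treat the bias as stable
    print(f"Consistently ~{mean_ratio:.1f}x inflated: model the bias and keep using the data")
else:
    print(f"Bias is unstable (CV {cv:.0%}): this source can't support forecasting")
```

In this made-up series the ratio hovers around 10x with little variation, so you could divide the Ads number by ten and still forecast from it; if the coefficient of variation were large, the source would be unusable for decisions.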
Katie Robbert 15:08
So let’s take a step back then.
So, you know, you’re talking about knowing what those error rates are, knowing that it’s 10x off. How do we as marketers get to the point of knowing how off it is? Because, you know, again, we’re talking about, you know, millions of impressions, for example.
So unless we’re actually sitting with every single person in the world and watching them view our ad or scroll past our ad, we actually have no idea how off the numbers really are.
So how do we start to get to that point where we know where we can then sort of account for that error rate?
Christopher Penn 15:47
Well, you do this already.
You have a spreadsheet that you use for Trust Insights with all of our marketing metrics.
So talk a little bit about what’s in that.
Katie Robbert 15:55
Yeah, so I capture a lot of information.
And I’ve tried to organize it in sort of a sales funnel slash customer journey way.
And so I start with website visitors, how many visitors came to our website, and then I work my way down from there: how many people filled out a contact form, how many people filled out the subscribe-to-the-newsletter form, how many people turned into leads from prospects, and sort of, I work my way down that way.
But I also capture metrics around how our different social channels are growing, how many followers there are from last month to this month, how many people are subscribing to the newsletter, what the open rates are, how many people are subscribing to the podcast. And so I look at those metrics as a whole to understand, you know, is something wildly off? And so, you know, in that example, I don’t capture impressions in my spreadsheet, but if I did, I could probably see something like, well, we only have, you know, 300 followers on our Facebook page, but yet somehow our social post is reaching 25 million people.
How is that possible, if only 300 people, you know, should have been seeing this post? Even over three days, the math doesn’t add up.
So I could at least start to know where the issues are.
But I wouldn’t necessarily say I could accurately figure out, you know, what it should be versus what it’s telling me it is.
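Katie's follower-versus-reach gut check can be written down as a simple plausibility bound. The multiplier below is a deliberately generous assumption for illustration, not a platform fact; the idea is only to flag numbers that no amount of sharing could explain.

```python
# Hypothetical figures from the example: a 300-follower page reporting 25M reach.
followers = 300
reported_reach = 25_000_000

# Generous assumption: even with sharing, reach rarely exceeds ~100x the follower count.
MAX_VIRAL_MULTIPLIER = 100
plausible_ceiling = followers * MAX_VIRAL_MULTIPLIER

suspicious = reported_reach > plausible_ceiling
if suspicious:
    print(f"Reported reach {reported_reach:,} is far past the plausible "
          f"ceiling of {plausible_ceiling:,}; the math doesn't add up.")
```

As Katie says, this doesn't tell you what the true number is, only that the reported one can't be trusted as stated.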
Christopher Penn 17:24
So in that example, if you think about those columns, look at the percentage change from month to month in each number, stage to stage, and between columns.
So from website visitors to contact form fills, what is that percentage? And then what is the variance of that number month over month? Because if you do that, then, to answer your question, that’s how you start to know what that consistency is, what things are right or wrong. If one month Google Analytics says you had 10,000 visitors and there were, you know, 1,000 contact forms,
so it’s a 10% conversion, right? If the next month it says you had 10,000 visitors and we got, like, five contact forms, that proves the point: like, okay, something has gone off the rails there.
But if those numbers, the percentage changes, are consistent, then you can say, okay, whatever is coming out of Google Analytics, whether or not it’s correct, it is consistent, and we can make decisions off of it. Then we could say, like, Google Analytics traffic is down, so we’re gonna have a down month on leads, right? Or Google Analytics traffic is up,
so we’re gonna have an up month on leads.
So that’s the key. To answer the question of how do you get started: it’s literally putting stuff in a spreadsheet, like you’re already doing, and then measuring the changes across rows and across columns and looking for consistency.
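The spreadsheet exercise Chris describes, the percentage change between funnel columns checked for consistency month over month, might look like this. The monthly data and the 50% deviation flag are invented for illustration.

```python
import statistics

# Hypothetical monthly funnel columns: (website visitors, contact form fills).
months = {
    "Jan": (10_000, 1_000),
    "Feb": ( 9_500,   940),
    "Mar": (11_000, 1_080),
    "Apr": (10_000,     5),  # the month something went off the rails
}

# Conversion rate between the two funnel stages, per month.
rates = {month: forms / visitors for month, (visitors, forms) in months.items()}
baseline = statistics.median(rates.values())

for month, rate in rates.items():
    deviation = abs(rate - baseline) / baseline
    flag = "  <-- inconsistent, investigate" if deviation > 0.5 else ""
    print(f"{month}: {rate:.2%} visitor-to-form rate{flag}")
```

The first three months hover around a 10% conversion rate, so April's collapse stands out immediately, exactly the "five contact forms on 10,000 visitors" anomaly from the conversation.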
Katie Robbert 18:40
We actually ran into this, maybe about a month or so ago, where, because I’ve been collecting the data for the past three years, every 15 days, like the OCD maniac that I am, we were able to see that there was an issue in our CRM system.
And the numbers were being inflated based on how they were coming into the system.
So we were able to correct that.
But it was definitely a big jump from, you know, what we were expecting to see.
And in that jump, like, it would be great if that reflected reality.
But none of the other numbers further on down the line reflected that jump.
So sales didn’t go up, you know, this didn’t go up, whatever.
And so we were able to pinpoint: okay, this particular metric is not accurate, because it’s not, you know, working its way through the rest of the metrics as it should if it were accurate.
Christopher Penn 19:40
So that really is the answer when you have numbers that don’t add up, or that don’t make sense, that are wildly different.
It comes down to doing that basic math: putting everything in a spreadsheet and looking at consistency rates between numbers, over periods of time, and saying, well, what here makes logical sense? And where are there variances that are so wild that either there was an anomaly in our marketing performance, we did something else that clearly worked substantially better or worse,
or there’s a system that we can no longer trust.
One of the things that we talked about on last Thursday’s livestream that I think is an important point worth reiterating is that an attribution report is a diagnosis of your supply chain.
Because you as a marketer, your suppliers are the places that are giving you audience and attention.
And if you see wild differences in the numbers coming from your different suppliers: like, Facebook is a supplier; you give them money, and they give you a product, the eyeballs of audiences, not literal eyeballs.
If the quality of that product goes down, it’s no different than a grocery store saying, hey, you know, lettuce farm, all the lettuce you sent us this week was rotten, and the lettuce farm is like, that was perfectly fine.
Like, no, this is brown; it’s supposed to be green. And if the farm keeps insisting this is fine, then you know that supplier is problematic.
So as you’re thinking about answering this question of why don’t our numbers add up, a good portion of it to think about is: is that supplier trustworthy, right? Is that supplier’s product quality good?
And if it’s not, then you, as the buyer, have the right to say, you know, I want to find a different supplier, because I don’t want this pile of brown lettuce.
Katie Robbert 21:29
I think that that’s a really good pro tip: you know, know what your systems are, know what they stand for, so that, you know, you’re making those business decisions.
But also, sort of going back to the start of the conversation: you shouldn’t be comparing system to system, because no two systems are the exact same.
And so if you’re trying to compare Facebook data to Twitter data, well, those are two different systems.
Yes, they’re both social media.
But they define things differently.
Chris, to your point, you said that Facebook and YouTube define views differently based on the amount of time, and so understanding the definitions of those metrics will help you be able to explain the differences, especially if you’re just handing a report to the C-suite.
And they say, why is it this over here and this over here? Having those definitions, to say, this is how this system captures that data; are we okay with that? Can we live with that? That will help you start to focus in on what metrics really matter.
Christopher Penn 22:34
And the last part I’ll leave off on here is: if you’re looking at your marketing reporting and things are not going the way you want, remember, there’s two sides to this party.
Yes, you could have done something bad in your marketing, right? Oh, you published a dumb tweet, or your ads are just not great.
You know, your ad agency didn’t do a great job.
But don’t discount the possibility that the supplier’s supply went bad, right?
Ad companies in particular have done a very good job of convincing us as marketers that all the performance problems are your problem: your bad creative, right, or your bad targeting. Well, it could be that your supplier has a bad audience, right?
And just remember that as you’re comparing data from systems, as you’re trying to figure out how to improve your marketing: you do have some of the responsibility, yes, but so does the supplier.
And you have to remember that you are the customer, too.
Katie Robbert 23:31
Well, it sort of strikes me that, you know, if you have one supplier making the same promise over and over again to millions of people, at some point, they can’t deliver on that promise.
And so if I’m an ad system telling you, Chris, yes, I can get you this incredibly, you know, targeted audience of these 500 people.
And then I tell that exact same story to 100 other people. Well, guess what, it’s no longer the super special, like, targeted audience, because everybody’s trying to reach them.
And then the 500 people who comprise the audience are like, I don’t want any of this, everybody just go away.
And so I think that it’s, you know, it’s a really interesting way to be thinking about, you know, these larger platforms that we as marketers have become reliant on to disseminate our content.
And back to, you know, the point that you always make, Chris: focus on what you have control over.
And that’s, you know, your website, your email newsletter, you know, your owned content. Those are the things that you have control over, versus really relying on these third-party systems that, quite honestly, don’t really care if you succeed or not.
Christopher Penn 24:42
And if you’d like to be part of a special audience that doesn’t get 500 offers with 500 other people,
you can join our free Slack group. Go to trustinsights.ai/analyticsformarketers where, again, you have over 2,000 other people who receive only targeted stuff from us. Just kidding. It’s really good. It’s a discussion community for you and your colleagues to ask questions, get help, and get answers to the questions you have.
There’s no charge to join, and no one else is advertising.
And if you want to watch or listen to this show in the place that works best for you, go to trustinsights.ai/tipodcast.
You can find us on most other channels there.
We’ll talk to you soon.
Need help making your marketing platforms, processes, and people work smarter? Visit TrustInsights.ai today and learn how we can help you deliver more impact.
Need help with your marketing data and analytics?
You might also enjoy:
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, Data in the Headlights. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new 10-minute or less episodes every week.