In this episode of In-Ear Insights, Katie and Chris review a case study of a client who was missing attribution data on more than 75% of their website visitors. When your data is that dirty, making good, data-driven decisions is all but impossible. Watch or listen as they talk through what happened and some ways to look at your own data and assess how clean it is.
Subscribe To This Show!
If you're not already subscribed to In-Ear Insights, get set up now!
- In-Ear Insights on Apple Podcasts
- In-Ear Insights on Google Podcasts
- In-Ear Insights on all other podcasting software
Advertisement: Data Science 101 for Marketers
Do you want to understand data science better as a marketer? Would you like to learn whether it’s the right choice for your career? Do you need to know how to manage data science employees and vendors? Take the Data Science 101 workshop from Trust Insights.
In this 90-minute on-demand workshop, learn what data science is, why it matters to marketers, and how to embark on your marketing data science journey. You’ll learn:
- How to build a KPI map
- How to analyze and explore Google Analytics data
- How to construct a valid hypothesis
- Basics of centrality, distribution, regression, and clustering
- Essential soft skills
- How to hire data science professionals or agencies
The course comes with the video, audio recording, PDF of the slides, automated transcript, example KPI map, and sample workbook with data.
Sponsor This Show!
Are you struggling to reach the right audiences? Trust Insights offers sponsorships in our newsletters, podcasts, and media properties to help your brand be seen and heard by the right people. Our media properties reach almost 100,000 people every week, from the In-Ear Insights podcast to the Almost Timely and In the Headlights newsletters. Reach out to us today to learn more.
Watch the video here:
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
- Need help with your company’s data and analytics? Let us know!
- Join our free Slack group for marketers interested in analytics!
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Christopher Penn 0:02
This is In-Ear Insights, the Trust Insights podcast.
In this week’s In-Ear Insights, it is kind of a tradition to write case studies, particularly at the end of the year. If you’ve ever worked at an agency, when things get slow, one of the things that folks do is spend some time updating case studies and all kinds of documents, doing that end-of-year cleanup, sort of like a Spotify Wrapped, but a much less exciting version of it.
So as we close out this year and look at our own stuff, we thought we’d walk through one of the case studies that we’re tidying up and preparing, and even give some thought to how we might have done it differently.
So Katie, do you want to talk through what exactly this thing is?
Katie Robbert 0:45
Yeah, so um, you know, in this context, a case study for us is an opportunity to demonstrate some of the work that we’ve done with a client.
And this particular case study talks through how our Google Analytics audit service helped one of our clients, in this case AAA Club Alliance, to clean up their instance and have better overall data integrity and the ability to do more accurate attribution.
AAA Club Alliance is a very large, enterprise-size organization.
And so they have a lot of data flowing in and out of their systems, and they have a lot of different teams running digital marketing campaigns.
And when they started with us, they really didn’t have a good sense of what was working.
And that was due to how their website tracking system, in this case Google Analytics, was configured.
And so that’s where we started.
And so all of our case studies are broken down into the Five P’s, because what we’ve realized is that every project touches upon all of these things.
And so we started with what the problem was, and then it’s people, process, platform, and performance.
And so in this case, AAA Club Alliance asked us, Trust Insights, to help them clean up their infrastructure so that they could do better attribution.
So that’s what this case study is all about.
Christopher Penn 2:22
Okay, one of the things I think is important, and this is something that anybody can and should do: in your Google Analytics instance, look to see what percentage of direct traffic you get.
One of the things that Google doesn’t make clear about Google Analytics is that when it says something is direct, what it really means is, we don’t know. It’s like Google Analytics is saying, I throw up my hands, I have no idea what this is, I’m just gonna say it’s direct traffic, because I don’t know how to interpret it.
That can come from a variety of things: misconfigured ads, people typing your URL into their browser bar, a bookmark, or an email, if you send somebody an email with no tracking codes on it.
Basically, it means it’s an absence of tracking codes.
And so in this case, when we first signed up with them, something like 70-odd percent of their traffic was direct.
For your average website, it’s not a hard and fast rule, but generally, if more than 50% of your traffic is direct, there’s a big problem, and when you get above 60%, it’s a really big problem.
When you have 70%, you probably should stop what you’re doing and devote yourself to fixing things.
Because at that point, you have no idea what’s going on with your website; there’s so much data coming in that has no meaning.
And the ideal, of course, is for you to get direct down to, you know, five or 10%.
Because those probably are people just typing a URL into their browser.
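The rules of thumb Chris describes can be sketched as a quick script. This is a minimal illustration, not anything built into Google Analytics; the channel labels and sample numbers below are hypothetical stand-ins for a channel report you might export from your own account.

```python
def direct_traffic_share(rows):
    """Compute the fraction of sessions labeled 'direct'.

    rows: iterable of (channel_name, session_count) pairs,
    e.g. exported from a Google Analytics channel report.
    """
    total = 0
    direct = 0
    for channel, sessions in rows:
        total += sessions
        if channel.strip().lower() == "direct":
            direct += sessions
    return direct / total if total else 0.0

def assess(share):
    # Thresholds are the episode's rules of thumb, not GA settings.
    if share > 0.70:
        return "stop and fix your tracking"
    if share > 0.60:
        return "really big problem"
    if share > 0.50:
        return "big problem"
    if share <= 0.10:
        return "healthy"
    return "worth investigating"

if __name__ == "__main__":
    sample = [("Direct", 767), ("Organic Search", 150), ("Email", 83)]
    share = direct_traffic_share(sample)
    print(f"{share:.1%} direct -> {assess(share)}")
```

With the sample numbers above, the share works out to 76.7%, which is roughly where this client started.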
Katie Robbert 4:01
So specifically, 76.7% of their traffic was direct.
And that, you know, to your point, Chris, that’s a huge issue.
Because as an enterprise-size organization that is spending literally hundreds of thousands of dollars on their digital marketing, you kind of want to know what’s working.
And so that’s where we started.
And so, you know, one of the first things, Chris, that we do with the Google Analytics audit, is you have a list of best practices that you always make sure are configured in a Google Analytics system.
Do you want to kind of just at a high level walk through what some of those settings should be?
Christopher Penn 4:49
Well, we know, for example, that your channel groupings, particularly in Google Analytics 3, are almost always wrong out of the box, right? They lump things like Facebook traffic into referral traffic, which it isn’t.
It should be social media traffic.
There’s never anything set up for stuff like paid search or paid social.
And those are things you definitely want to split out and have denoted within the application itself.
And then there’s a whole bunch of things, like making sure that your site isn’t counting itself as a referrer when it refers traffic to itself; that’s a pretty easy one. Turning on Google Signals, unless you have some regulatory reason not to, to try to unify mobile and desktop traffic, is important.
If you are on Google Analytics 4, some of the best practices there are setting up conversions and conversion events properly, deciding which events you want to designate as conversions; making sure that you’re using the right settings for your funnel view; and making sure, if you are a company that is using the data in multiple places, that you have BigQuery linking turned on.
So there’s this whole list, I would say close to 100 different things, some minor and some pretty major, of best practices just to get a Google Analytics account up and running.
But actually, that’s not the hard part.
The hard part, then is the governance which is getting people to do the thing.
When you work with a new agency, when you work with a new team, when you launch a new campaign, making sure your tracking tags are in place; when your email marketing goes out, making sure it’s got its appropriate tracking tags.
Because as we are seeing with the reduction in the effectiveness of cookies, having those UTM tracking codes in every URL that you put anywhere that’s not on your website is so important to getting clean data into Google Analytics.
And without that, it gets really messy, and you just lose even more data.
So there are other things within your Google Analytics instance that can make the data much harder to read.
For example, if you have extraneous query parameters. If you’ve used HubSpot, for example, HubSpot attaches these little codes to the end of URLs, just like Google Analytics does.
But while Google Analytics knows to take its own codes out, it doesn’t know what to do with, say, HubSpot codes, or Marketo codes, or Salesforce codes.
So if you look in your Google Analytics and you sort by your most popular pages, you may have a page that has tens of thousands of views.
But because each page view has its own unique HubSpot code attached, you never see that, right?
So telling Google Analytics, hey, you’ve got to knock these things out, is really important to having data that you can use in the application.
So again, those are more things that you need to do to make Google Analytics work well.
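As a sketch of what “knocking these things out” looks like outside the GA interface, here’s how you might normalize exported page URLs by stripping third-party tracking parameters before analysis, so that views of the same page roll up together. The parameter list below is illustrative; check your own page reports to see which codes actually appear.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Example parameters from common tools; build your own list
# from what you actually see in your page reports.
EXTRANEOUS_PARAMS = {
    "_hsenc", "_hsmi",   # HubSpot email tracking
    "mkt_tok",           # Marketo
    "fbclid", "gclid",   # ad-click identifiers
}

def strip_params(url):
    """Remove known third-party tracking parameters from a URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in EXTRANEOUS_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_params("https://example.com/pricing?_hsenc=abc123&plan=pro"))
# -> https://example.com/pricing?plan=pro
```

This is the same idea as Google Analytics’ query-parameter exclusion settings, just applied to data you’ve already exported.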
Katie Robbert 7:42
So you’re hitting on a really important point that I wanted to bring up. A lot of times, when we approach these projects, like a Google Analytics audit, the person who’s coming to us, the client who’s coming to us, has this idea that the problem is with the platform itself.
And that’s only partially true.
So when we were working with this client, we took a look at their Google Analytics instance.
And, you know, we fixed up the things that we needed to fix up as Chris was just describing.
But the bigger challenge that we have been working on with them throughout our engagement, which has lasted a few years now, is really the people and the process.
And that’s where the data integrity starts.
And so we spent a fair amount of time working with that team, to help them put better governance in place.
And that combined with a properly set up system, is going to give you that combination of really good data integrity.
So we trained them on UTM tracking, and we gave them rules and spreadsheets to help make sure that there were correct UTM codes going onto things like email and their social campaigns.
And we made sure that we knew who had access to change settings within the infrastructure itself.
Those are sort of the other pieces of the puzzle, that a lot of times I don’t think get discussed enough.
Because, yes, you can set up a system perfectly.
But if you still have garbage data coming in, you’re still going to have garbage data going out.
And so that’s a big part of the work that we did for this particular project: in order to get a 40% improvement in data cleanliness, it wasn’t enough just to set up the system correctly, because the system is only as good as you tell it to be.
The people have to be putting in good quality data.
So that’s an ongoing piece of the project that we continually have to revisit every few months: here’s what we’ve noticed, here are the things that we had to fix.
Here’s the re-education on the governance of the data going in.
And so that’s more of that perpetual thing versus a one time setting up the system correctly.
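The kind of UTM rules Katie describes can be encoded as a simple validation check that runs over a list of campaign URLs. This is a hypothetical sketch: the required parameters are the conventional core UTM trio, but the allowed mediums are examples of house rules, not a standard.

```python
from urllib.parse import urlsplit, parse_qs

# Core UTM parameters every outbound campaign URL should carry.
REQUIRED = {"utm_source", "utm_medium", "utm_campaign"}
# Hypothetical house rules for consistent medium naming.
ALLOWED_MEDIUMS = {"email", "social", "cpc", "referral"}

def utm_problems(url):
    """Return a list of governance problems found in a campaign URL."""
    params = parse_qs(urlsplit(url).query)
    problems = [f"missing {p}" for p in sorted(REQUIRED - params.keys())]
    medium = params.get("utm_medium", [""])[0]
    if medium and medium not in ALLOWED_MEDIUMS:
        problems.append(f"unknown utm_medium: {medium}")
    return problems

print(utm_problems("https://example.com/?utm_source=newsletter"))
# -> ['missing utm_campaign', 'missing utm_medium']
```

Running a check like this over every link before a campaign goes out is a cheap way to catch the tagging gaps that otherwise show up later as direct traffic.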
Christopher Penn 10:08
And now, as we look back at this and look forward, we’ve changed things up a little bit; now those checks are more automated.
So this is an example from my personal website of UTM codes, say, in the last 90 days: where has my traffic come from? And I can look at this, and those bars that are red in here, you don’t need to see the details, you just know that some are blue and some are red, and the red ones are ones where there’s a tracking problem.
And so being able, like you said, to continually audit and look for things that maybe don’t make sense is something that you have to do. It’s like bathing, right? You don’t just bathe once; you have to do it regularly and frequently to be clean. The same is true of your data: you have to check on it, to see if it’s clean, on a regular, frequent basis. This is another example, where we’re talking about those query parameters.
And this is the last 180 days from our website. The second thing down there is the fbclid; I need to go into my own Google Analytics and knock that out, to say stop tracking this stuff. Same thing with the unsubscribe query parameter that’s in my newsletter; I need to take that out of Google Analytics as well.
So that it’s not again, messing up the data.
These are all regular maintenance things, and as we continue to evolve, we keep getting better, keep getting more automated, and find problems faster and get them fixed faster. Because if you don’t, then it’s kind of like, you know, you steer the car once and then the road changes, and you don’t steer again.
Next thing you know, you’re face-first into a tree,
Katie Robbert 11:55
which sounds pretty terrible.
Knock on wood, I’ve not experienced that.
But I do want to raise another point, Chris, that you’ve been touching upon: you know, when you bring on a consulting agency, such as Trust Insights or anyone else, if that agency is not giving you a maintenance plan for the work that they’ve done, that’s a red flag, because I have yet to see any kind of project that can live in isolation from start to finish.
And then that’s it, it’s complete.
Especially when it involves data and planning and your marketing campaigns, those things are not just a nice, neat little package; there’s ongoing maintenance that has to happen.
So if your agency isn’t giving you a maintenance plan, whether they’re going to execute it or they give you the tools for you to execute it yourself, that’s a big red flag.
Because there’s always, especially with this kind of data, there’s always maintenance that has to happen.
Somebody has to be continually checking: is the data clean? Somebody has to be continually checking: is the team putting the right data into the system, and if not, what resources do they need to understand that the data they put in affects our ability to understand what’s working?
Christopher Penn 13:20
And then sometimes the system changes.
I was in Google Analytics 4 last week, and they moved the cheese again. It’s like, okay, where did the setting go that I was working on? Classic. Oh, it’s not here anymore; it’s now moved to this other spot.
Or there have been API changes and such, and those levels of rapid change in any kind of modern software are inevitable.
But they’re also painful, because we don’t really get a choice.
Right? When we use Google Analytics, especially the free version, we’re not paying Google.
So they literally owe us nothing.
In terms of what they’re going to do when they’re going to do it.
They will tell us things out of courtesy.
But that’s a courtesy, and you have to know where to find it.
So a lot can happen.
A lot does happen in this data.
I was using the Ahrefs SEO software the other day for one of our clients.
And all of a sudden, the code that I had written to process this data just stopped working. Like, what happened? In the span of two days, literally everything stopped working.
I went in to look: oh, they changed some column names in their API.
And there was a blog post about it.
They just didn’t bother to let us know that blog post was there.
And so to your point, Katie, not only should an agency have a maintenance plan for the client work it does,
but they probably need to have a maintenance plan they can talk about for their own stuff internally, to make sure that they’re keeping on top of things and knowing when things are broken,
especially if you’ve got, you know, traditional B2B proprietary software and all that wonderful stuff.
But yeah, how often do you fix it? One of the things that I saw with mild amusement was some of the software that we’d written in our previous life, for the agency we used to work for. As more and more time went by after our departure, less and less of it worked.
Until finally the whole thing got retired; it’s like, okay, well, now it’s TrustInsights.ai stuff.
But it’s an ongoing battle against entropy, right? Again, it’s kind of like bathing and proper nutrition; you’ve got to continually exercise and stay fit, and all these things, to fend off the forces of entropy.
And this kind of case study really is about maintenance.
And when you’re talking to executives, when you’re talking to stakeholders, and they say, well, what’s the ROI on this? Well, what’s the ROI on eating, right? What’s the ROI on bathing? There are some things that are just part of doing business, if you want to do business well and make good decisions from it.
The activity itself may not have direct ROI.
But if you don’t eat, you will have consequences.
There will be consequences in your life.
Katie Robbert 16:22
You know, I think that’s a big part of planning that’s often overlooked.
And so, you know, when I’ve sat in planning sessions for clients or for other jobs, there’s this, you know, laser focus on the net new things we’re going to do.
And so here’s the new campaign, here’s the new idea, here’s the new product.
But that whole chunk of maintenance is missing from the planning.
And that’s where Chris to your point, things start to fall apart.
And so the maintenance has to be part of your planning, the governance has to be part of your planning. I’ll be the first to say, those are, at least for other people, not for me, the least fun parts of planning.
But what I like about planning the maintenance and the governance is the predictability, the confidence that you can create by knowing here’s what’s going to happen, the scenarios that you can be prepared for: if this thing happens over here, what are we going to do about it? And when the thing happens, we’re ready for it.
And so for me, that’s why I like it; maybe it’s my OCD control freak type-A-ness.
But I like that predictability of knowing exactly what’s going to happen.
And when Google decides that they’re going to change their website tracking system, we have a plan to address that so that we don’t lose big chunks of data.
And that our data integrity that we’ve worked so hard to create and maintain, doesn’t suddenly take a nosedive.
So the purpose of the case study that we were sharing, was to demonstrate sort of that surface level of work that you can do just to start to get your data in better shape.
So in this example, we worked with our client, we cleaned up their system, we gave them some tools to make sure they’re tracking their data correctly, and we were able to bring their unattributed data down from 76.7% to 34.5%.
Now, for a company their size, that’s a really big deal; having less than 40% of their data unattributed is actually a really good thing.
Because there are so many different people and campaigns and outside vendors and agencies that they don’t necessarily have insight into and control over, that’s a really good number for them.
So our goal was to keep it under 40%.
And we’ve been able to keep it under 40% for the past three years.
Christopher Penn 18:53
Yep, I mean, to extend the fitness analogy, this is essentially getting them a pair of decent running shoes, right? They still have to run, they still have to do the hard work to make good decisions from their data, but they at least have the proper foundation to be able to do that.
Right? If you have somebody who’s wearing, like, Ronald McDonald clown shoes, it’s gonna be really hard for them to get a good, you know, three-kilometer run in, in clown shoes.
And while it’s probably not the best analogy, it does illustrate that there are some foundational things you have to do.
And you may not be able to measure the ROI directly of it.
But you for sure will not get any ROI if you don’t do it.
Katie Robbert 19:38
Well, you know, it’s funny, because to your point, Chris, it’s not that instantly tangible number, but you can see the impact of it when it’s not being maintained and set up correctly.
And so, you know, what’s the ROI of my email marketing campaign? Well, I don’t know; I didn’t have the foundation
to track it correctly in the first place, or there’s a negative ROI, or whatever the thing is, because it’s reliant on that base layer of a good foundation.
Christopher Penn 20:12
Oh, the other option is you could just lie, I suppose.
But if you care about data, which presumably, if you listen to this podcast, you do, and you care about things like truth and mathematical accuracy, then lying is not a good choice.
Katie Robbert 20:27
Well, and if you’re like me, lying is the kind of thing that will literally keep you up at night, even if it’s a very small, tiny little lie.
So it’s just not possible; you’re not physically able to do it.
Christopher Penn 20:40
You would make a terrible politician.
Katie Robbert 20:42
I’m okay with that.
Christopher Penn 20:45
So, any other thoughts on this case study before we close it out?
Katie Robbert 20:49
Um, so the overall goal of sharing the case study was to demonstrate the kind of work that Trust Insights is able to do for its clients, but also to demonstrate, for you as the potential client, the things that you can do with your own system.
So if you’re shaky about the confidence in your data integrity, maybe start to look at your system, maybe start to look at your governance, maybe start to look at how many people are involved in bringing data into those systems.
And so it gives you a starting place for the things to look at, and what the problem is, if you’re seeing a similar situation for yourself.
And if it’s the kind of thing that you want help with, well, Trust Insights can help you with that.
And that’s sort of the other side of the coin: the case studies are meant to demonstrate, here are the things that we are capable of doing for our clients.
Christopher Penn 21:42
Exactly. Yeah, if you’re making a recipe and it’s not turning out the way it looks on Pinterest, one or more things may have gone wrong, or the person on Pinterest was lying; one of the two.
Probably there’s an issue with the ingredients, the process, possibly the chef, the equipment, or whatever it is.
If you’d like some help with that, we can tell you, hey, you bought sand instead of flour to make the recipe.
So if you’ve got comments or questions, or anything you want to talk about regarding analytics and the cleanliness of your own data, pop on over to our free Slack.
Go to TrustInsights.ai/analyticsformarketers, where we and over 2,200 other marketers ask and answer each other’s questions all day.
And wherever it is you watch or listen to the show, if there’s a channel you’d prefer to catch it on, most of them are going to be over at TrustInsights.ai/tipodcast. Thanks for tuning in.
We’ll talk to you next time.
Need help making your marketing platforms, processes, and people work smarter? Visit TrustInsights.ai today and learn how we can help you deliver more impact.
Need help with your marketing data and analytics?
You might also enjoy:
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, Data in the Headlights. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new 10-minute or less episodes every week.