
So What? How To Migrate Your Universal Analytics

So What? Marketing Analytics and Insights Live

airs every Thursday at 1 pm EST.

You can watch on YouTube Live. Be sure to subscribe and follow so you never miss an episode!


In this week’s episode of So What? we walk through user stories for why you should migrate your Universal Analytics data and different options for the data export. Catch the replay here:

So What? How to Migrate Your Universal Analytics


In this episode you’ll learn: 

  • Why you want to migrate your Universal Analytics data
  • What processes to use when migrating your Universal Analytics data
  • What you can do with your migrated data

Upcoming Episodes:

  • TBD


Have a question or topic you’d like to see us cover? Reach out here:

Note – The following transcript is AI-Generated and may not be entirely accurate:

Katie Robbert 0:39
Well, hey, how are you, everyone? Welcome to So What?, the marketing analytics and insights live show. I’m Katie, joined by Chris and John. Hey, fellows. Hello, John with the double hands today.

Katie Robbert 0:55
On today’s episode, we’re talking about how to migrate your Universal Analytics, which some people call Google Analytics 3. We’re going to cover why you would want to migrate your Universal Analytics data, the processes used to migrate it, and what you can do with your migrated data. This is a timely topic, because we are literally running down the clock on when you will lose access to your Universal Analytics. If you still have a Universal Analytics profile, there is a literal clock counting down second by second. John, I believe you call it the doomsday clock. You can tell it was done by developers, because that thing is hideous. No offense to developers, but you are not designers, and that’s a whole different show. We wanted to talk about this because there’s been a lot of conversation, especially in our Slack group and on LinkedIn, of should I or shouldn’t I? And if I should, how do I do the thing? So I wanted to start with how we figure out whether you should, and I know this will come as a shock to no one: you’re going to need the five Ps. You need to figure out: What’s the question I’m trying to answer? Who are the people involved? What is the process I need to go through? That’s what we’re going to focus a lot on today. What’s the platform? We know it’s Universal Analytics, but where does the data go, and how do you retrieve it? And then performance: did we answer the question, did I get my data out, and can I use it? So Chris, John, the big question on the table is, should people even go through the process of exporting their Universal Analytics data? Let’s just start there.

Christopher Penn 2:48
Depends on why you need it. For me, as a data junkie and a hoarder, I want it for souvenir value, right? On my website I have 17 years of data... well, 15, because it stopped recording a little while ago. Having it just to have it is kind of like a digital security blanket. Am I going to use it? Infrequently at best. Maybe once every five years, like, oh, I’m bored tonight, I’m gonna look at my old data. This is what I do, because I’m a lonely person.

Katie Robbert 3:27
That’s not true. He has a family. Alright, go ahead.

Christopher Penn 3:33
From a business perspective, probably not. At this point, Google Analytics 4 has been out since October 2020. We all forgot that because of that whole pandemic thing, but it has been in production for four years, and you’ve had it running for at least a year, because you had to as of last July. So you have at least almost a year’s worth of data if you migrated late. If you were early on, like, if you have a Trust Insights account, we turned it on in October 2020, so we have almost four years’ worth of data in our GA4 account. So that old Universal Analytics data, there’s not a whole lot of value to it. The exception, I would say, is if there are specific types of data you might want, maybe from a demographics perspective, pre-pandemic, if the pandemic changed your business.

Katie Robbert 4:26
What about you, John? Your Marketing Over Coffee site has Google Analytics running in the background, correct?

John Wall 4:30
Yes, but it is hilarious. I’m the exact opposite of Chris. We’ve already taken credit for the wins, so the sooner that data vanishes, the sooner no one can go back and blame us for losses. We can just confidently say, yes, that data is no longer available, we’ve got to move on. So yeah, I can totally see that you could use it for future reporting and value, but for me, it’s like, well, it’s not on fire today, so maybe if I close my eyes it will go away.

Katie Robbert 4:59
You know what? I respect that. Because I feel like there are a lot of things calling for our attention at the moment, and in some ways this feels like a big distraction from Google, like, hey, you have to do this thing, and a lot of companies aren’t stopping long enough to question why. For me, because we’ve had Google Analytics 4 running for a few years, we do have year-over-year data. The best case scenario for Trust Insights to keep that Universal Analytics data is to do predictive forecasts and trend analysis with it, so you have pre- and post-pandemic data, because we have it up through the cutoff. Now, here’s the thing: to date, we haven’t done that. So it’s really just a distraction, like, hey, we could do the thing, and I don’t think I have a strong enough business case to justify that we should do the thing. But to your point, Chris, a lot of companies do. The pandemic did change their companies. It changed customer behaviors, how people interacted with the website, how they purchased. So there is value. I think what we’re saying is, before you go through the process of exporting and saving your Universal Analytics data somewhere, go through the five P framework and make sure you have your user stories, which is a simple three-part sentence: as a persona, I want to, so that. That way you’re clear about the expectations of why you’re doing this in the first place. So that I can, once a month, run a predictive forecast, or so that I can understand my customers pre- and post-pandemic, whatever the thing is. Make sure you have a good reason to do it, not just because Google’s telling you to do it. Google, please don’t come for me. So let’s say we went ahead and justified why we need to do it. Where do we start with this, Chris? Can we do it ourselves? Are there services? What does this ecosystem look like?

Christopher Penn 7:08
There are, I would say, four different options for doing this. The first and simplest, if you only need very rough, aggregated data: I’m going to go into Google Analytics here, this is the old Universal Analytics, and if you only want really early stuff, I’m going to set this to February 2015. You can just say, I want this chart exported, I want this table exported. So if you just want to know, hey, when were our first users to the company website, I can export this, and you’ll have it as a CSV file. This is the simplest and easiest option; it is top-level data. For some of that look-back, as you’re saying, Katie, for predictive forecasting, this would be the bare minimum. So option one is to go through and find the reports you need. Maybe, like me, you’ve gotten used to GA4’s confusing interface and now have to remember where everything used to be in Universal Analytics, but you can go in and export that information manually. For a good number of use cases, this is okay; it’s good enough for the bare basics. The second option is to use a service of some kind. There are a couple of different services we’ve talked about in the past, like the Matomo analytics system, which we did a whole show on. There’s a plugin for Matomo that will connect to Universal Analytics and essentially vacuum the data out of it. So if you’ve got Matomo installed, and it’s a fresh instance, meaning there’s no current data in there, you can vacuum up that Universal Analytics data, put it in Matomo, and it’s available with the interface we talked about in the past. It takes a long time, sometimes weeks, because of Google’s API limits on how fast you can get data out of the system, depending on how large your old GA account is. So that’s option two.
Any questions so far?
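The manual CSV exports Chris describes usually come with a few comment lines above the actual header row, so they need a little cleanup before you can work with them programmatically. Here is a minimal Python sketch of that cleanup; the sample export and its column names are invented for illustration, not copied from a real UA report.

```python
import csv
import io

def load_ua_export(csv_text):
    """Parse a CSV exported from the Universal Analytics UI.

    UI exports typically prepend comment lines starting with '#'
    (report name, date range) before the actual header row.
    Returns a list of row dicts keyed by the header.
    """
    data_lines = [
        line for line in csv_text.splitlines()
        if line.strip() and not line.startswith("#")
    ]
    return list(csv.DictReader(io.StringIO("\n".join(data_lines))))

# Example with a made-up two-column export:
sample = """# ----------------------------------------
# Audience Overview
# 20150201-20150228
# ----------------------------------------
Day Index,Users
2/1/15,120
2/2/15,95
"""
rows = load_ua_export(sample)
print(rows[0]["Users"])  # prints 120 (values come back as strings)
```

One quirk worth remembering: everything comes back as a string, so sessions, users, and dates all need converting before any trend math.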

Katie Robbert 9:21
Um, no. Just as a reminder, for those who didn’t catch our “How to set up a Matomo instance in 2024” episode, you can find it on our YouTube channel, TrustInsights.ai/youtube, in the So What? playlist. That episode is only a couple of weeks old. It is a really good option. I’ve been playing around with Matomo since we did that episode, and I’m really liking what I can get out of it. But I also have very simplistic analytics needs, so for me this is probably the option I would go for personally. But I’m curious to see what else is out there.

Christopher Penn 9:59
Option three: there are paid services that will do this data export. Over in our Analytics for Marketers community, our friend and longtime contributor Todd Bolaven recommends the Analytics Canvas service, which will take your Universal Analytics account, vacuum all the data out of it, and then spit it out as both CSV files and into your own BigQuery database. Just to give you an idea of what this looks like: you go into the service, you spend the 99 bucks, and you run the backup. Let’s walk through the steps for doing this if you want it connected to a BigQuery database that you can then manipulate. First you choose which view you want; you pay per view, so if you have multiple views in Universal Analytics, you’re going to spend 99 bucks a shot. You choose the view, then you choose which of the 45 tables they can export you want. The cost is the same whether it’s one table or all of them, so I just choose them all. Then you choose your export option; your options are Google Sheets, Excel files, CSV files, and BigQuery. If you go with BigQuery, you then have to set up the connections and access to your company’s BigQuery instance so the software can talk to it, which is a multi-step process that requires administrative access to your Google Cloud account.

Katie Robbert 11:34
So this $99 is really, you’re talking about at least a few thousand dollars, if not more, for the time commitment, depending on who you need to help you. Plus, you also need to understand how BigQuery works, because if you have 45 different tables, I’m assuming you have to find some way to join those tables, or query multiple tables, to make one coherent data set. And then you still need to understand, once the data gets into BigQuery, what the heck it means. So you probably need some kind of data dictionary translation, because I’m assuming it’s not going to be as simple as, this metric means unique users, and this means the number of unique users on this day. That would be way too easy.

Christopher Penn 12:28
It’s worse than that. So let’s go over to BigQuery and see what’s in the box. The old Google Analytics API gives you limits: 10 metrics and nine dimensions per API query. You can’t query more than that, so you can’t just say, give me everything. What this software has done is essentially run multiple combinations per report, per view, of what’s in Google Analytics, and put them into BigQuery tables. For example, let’s look at the acquisition overview, daily detailed. That gives you the field names from the API, so yes, you need a data dictionary if you don’t know what these mean. In the preview table, you see your view ID, some of the fields, the channel grouping, the source, the medium, the landing page, and then some of the metrics, including overall goal completions. If you scroll down, you can see it spits out conversions into their own separate tables. And folks will remember, old Universal Analytics had 20 goal slots, so it spits out, by goal number, what was in those goals on a per-day basis. So you get all this. Here’s the challenge.

Katie Robbert 13:56
Wait, that wasn’t the challenge? No? Okay.

Christopher Penn 14:01
You get the same thing in the CSV files, too. When you do the export, you get a nice folder filled with files. The challenge is this: there is no primary key tying the tables together. They’re all standalone; they do not unite.
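Since the exported tables do share columns like the view ID and the date, even without a declared primary key you can often fake a join on that combination. Here is a small sketch using SQLite; the table and column names are hypothetical stand-ins, not the export’s real schema.

```python
import sqlite3

# Hypothetical table and column names. The actual export's names
# will differ; the point is the composite-key workaround.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE acquisition_daily (view_id TEXT, date TEXT, sessions INTEGER);
CREATE TABLE goals_daily       (view_id TEXT, date TEXT, goal1_completions INTEGER);
INSERT INTO acquisition_daily VALUES ('12345678', '2019-06-01', 300);
INSERT INTO goals_daily       VALUES ('12345678', '2019-06-01', 12);
""")

# With no primary key in the exports, join on the columns that
# together identify a row: the view ID plus the date.
row = con.execute("""
    SELECT a.date, a.sessions, g.goal1_completions
    FROM acquisition_daily a
    JOIN goals_daily g
      ON a.view_id = g.view_id AND a.date = g.date
""").fetchone()
print(row)  # prints ('2019-06-01', 300, 12)
```

This only works for day-level tables that share the same grain; tables sliced by different dimensions will not line up this way.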

Katie Robbert 14:19
Oh, my God. Well, and I would imagine, too, this is something we talk about even with general setup: having consistent naming conventions and writing things down. If you didn’t really name your goals anything other than Goal 1, Goal 2, Goal 3, Goal 4, you have no idea what you’re looking at.

Christopher Penn 14:44
Well, in Universal Analytics, you can’t do that. You are given those hard-coded names. I mean, you could name the goals in the interface, but that does not carry over.

Katie Robbert 14:52
That’s what I mean. Yeah, I know it assigns the numbers, but you can either leave it as Goal 1 or change it to, say, “newsletter subscription,” so at least you have some understanding of what the goal is.

Christopher Penn 15:13
Right, but those names have not come over in the export. Yeah.

Katie Robbert 15:17
I mean, this is a mess. Okay, let’s keep going.

Christopher Penn 15:22
The data is all in there, right? And it’s all available.

Katie Robbert 15:34
So I feel like this goes back to where we started: you really need to have a strong business case for why you need this data. And to be fair, some companies, some teams, 100% will need this data for whatever reason. But you’ve got to be clear about why you need it, because this is not easy. Google did not say, here’s a magic wizard, just pick all the things you want and we’ll export them into something that actually makes sense. They said, we’re going to give you what we think you deserve, and then you’re going to have to spend another three weeks, or four weeks, or two years figuring out where we put it, what we did with it, and what the heck it means. So good luck. Looking at you, John Wall.

John Wall 16:22
Yeah, that’s about right. It just gets uglier.

Christopher Penn 16:26
So that’s option three: use a paid service to do it for you. Which, again, if you’re the data-hoarding packrat, is actually nice. It’s thorough; you get all the data. You can lock it away, put it on a backup hard drive, and just have the comfort of knowing you have the data, even if you will never, ever look at it again.

Katie Robbert 16:49
So option one is to just do some simple exports directly from the Universal Analytics interface. Oh, yeah, I can feel my blood pressure going up. Remind me, I’ve already forgotten, what was option two again?

Christopher Penn 17:08
Option two is to install a Matomo instance and use its plugin to vacuum up the data.

Katie Robbert 17:14
Okay, and that’s the option I like so far, but I’m sure it has its own set of challenges. And then option three is to use a paid service, like this $99 one, that will give you a whole bunch of BigQuery tables, or a whole bunch of CSV files, that you have to join together but that have no join key. Great. And I’m almost afraid to ask: what is option four?

Christopher Penn 17:40
I’m nervous. Option four is to code it yourself. One of the things you can do: of all those fields in the Universal Analytics API, you may not need them all. If you’ve done a good job with your five Ps and user stories, you may be saying, of the 10 metrics and nine dimensions we’re allowed in an API call, these are the ones we actually care about, but we want to see them more granularly than what’s in the backup service. The backup service, for example, does not bring in anything other than day-level data. If you want hour-level or minute-level data, which is in the UA interface, you won’t get it, and there are some use cases where you would want that. There are also some secondary dimensions that are not in the export, like previous page and next page. So you would go into a tool like ChatGPT, or Gemini, or one of the big models and say, hey, I want to code a Python script or an R script to talk to the Google Analytics API. You would have a long conversation with it, and at the end, if you are technically inclined, you will have a piece of code you can run that will vacuum up the data you care about, the fields you care about, and store it somehow. For example, long before we heard about the $99 option, we wrote this code for ourselves. We picked the 10 metrics and nine dimensions we cared about, and we vacuumed up the data from our Google Analytics instance into, in this case, a SQL database. We said, this is what we know makes the most sense for looking backwards, and we have it stored in case we need it. We did this in June of 2023, because we got wind that, yeah, this is going to go away if we don’t get all this data.
So that’s option four. The reason option four is good is that if you don’t need all the data, and you’ve got your user stories really well designed, you can get just the data you need and store it somewhere. The advantages are that (a) it’s not everything, and (b) you can store it in the format you want. Maybe you want it in a CSV file, maybe you want it in a SQL database, maybe you want it in who knows what arcane system; you get that choice because the code supports it. So those are the four options: a simple export of just the stuff you need from the interface; installing Matomo on your own server and running the import plugin, where the cost is setting up a server; using an export service and getting the exports that way; and writing your own code, with the help of generative AI, to extract the information you want and put it in some storage format.
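To make option four concrete, here is a rough Python sketch of the two pure pieces of such a script: building a request body for the Reporting API v4, which capped each call at 10 metrics and nine dimensions, and flattening a response into rows you could write to SQLite. The actual network call is only shown in comments, since it requires google-api-python-client and OAuth credentials; treat the whole thing as an illustration of the shape of the work, not production code.

```python
def build_request(view_id, start, end, metrics, dimensions):
    """Build a batchGet request body. The API caps each request at
    10 metrics and 9 dimensions, so enforce that up front."""
    assert len(metrics) <= 10 and len(dimensions) <= 9
    return {"reportRequests": [{
        "viewId": view_id,
        "dateRanges": [{"startDate": start, "endDate": end}],
        "metrics": [{"expression": m} for m in metrics],
        "dimensions": [{"name": d} for d in dimensions],
    }]}

def flatten(report):
    """Turn one report from the batchGet response into row dicts
    keyed by dimension and metric names."""
    header = report["columnHeader"]
    cols = header["dimensions"] + [
        e["name"] for e in header["metricHeader"]["metricHeaderEntries"]]
    rows = []
    for r in report.get("data", {}).get("rows", []):
        rows.append(dict(zip(cols, r["dimensions"] + r["metrics"][0]["values"])))
    return rows

# The real call would look roughly like:
#   analytics = googleapiclient.discovery.build(
#       "analyticsreporting", "v4", credentials=creds)
#   resp = analytics.reports().batchGet(
#       body=build_request("12345678", "2015-02-01", "2024-06-30",
#                          ["ga:sessions"], ["ga:date"])).execute()
# ...then flatten(resp["reports"][0]) and write the rows to SQLite.
```

A real version also needs pagination and rate-limit handling, which is exactly why Chris says to expect a long conversation with the model before the code is usable.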

Katie Robbert 20:36
So all I’m hearing with all of this is, there is no getting around doing the requirements up front, because it’s going to be a mess on the other side if you don’t plan for what you actually want to get out of this. So if you say, I want to understand my attribution trends throughout the lifecycle of my company, pre-COVID, post-COVID, or just through all of it, because we’ve restructured the company, we’ve changed how we sell, we’ve changed the products, whatever the thing is, make that clear and get that data. All I’m seeing is a gigantic nightmare if you don’t do that planning. And I’ve seen it happen, and I’m guessing it’s going to happen.

Christopher Penn 21:28
You’re 100% correct. So one of two things will go wrong. Either you will vacuum up so much data that it’s unusable, or, if you don’t do your requirements planning, you will get just the things you think you need, and then on July 2 it’s like, oh, we need to get... oh, crap, it’s not available anymore, because you didn’t do the requirements planning first. The other thing I think is really important to point out, because Katie, you and I talked about this: it’s not apples to apples. A user in Universal Analytics is not the same as a user in GA4. The goals are different; how conversions are measured is different. Event tracking, for example, is the current model in GA4 (it’s not new anymore), and it did not exist in the same format in Universal Analytics. So even if you’re doing historical look-backs, I would be very hesitant to run predictive forecasting across these two different data sets, because they’re not the same. You have to do a lot of manipulation of the data to get them on par with each other. You typically have to use some kind of third system, like CRM data, to level it out, to basically say, okay, what multiplier do I need to apply to the Universal Analytics data to make it even with the GA4 data? Because you have that baseline of, say, how many leads you had in your HubSpot instance. So there’s that aspect, too. From the data-packrat perspective, yeah, this is great, you get all this stuff and you can play around with it, but I don’t know that it adds a whole lot of value anymore. Even for complex attribution modeling: you could do Markov chain modeling with just the data inside Universal Analytics, but you can’t do that with just what’s in the GA4 interface anymore; you have to use the BigQuery exports. And again, it’s not apples to apples, right?
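The CRM-leveling idea Chris mentions boils down to a back-of-the-envelope calculation: use the CRM’s lead counts as the trusted baseline and rescale the UA numbers by the average ratio. The figures below are invented, and real leveling is far messier than one flat multiplier, but it shows the arithmetic.

```python
# A crude illustration of "use a third system to level the data":
# rescale UA goal completions by how much UA under-counted relative
# to the CRM, so the series is roughly comparable to GA4 conversions.
ua_goal_completions = [40, 55, 60]   # monthly, from Universal Analytics
crm_leads_ua_era    = [50, 70, 75]   # same months, from the CRM

# Average under-count ratio of UA versus the CRM baseline:
multiplier = sum(crm_leads_ua_era) / sum(ua_goal_completions)

leveled = [round(x * multiplier, 1) for x in ua_goal_completions]
print(round(multiplier, 3), leveled)  # prints 1.258 [50.3, 69.2, 75.5]
```

Even this toy version assumes the CRM tracked the same conversions consistently across both eras, which is itself a requirements question.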

Katie Robbert 23:18
Yeah, it’s tough, because we are going to run into a lot of people who have that packrat mentality of, but what if I need it? The challenge there is really digging deep into the user story exercise: alright, let’s walk through those what-if scenarios and see how realistic they are. So what I would recommend, Chris, if you were coming to me saying, I just have to have the data for a rainy day, I don’t know why yet, is: alright, let’s write down all of the potential user stories, real or not, and then prioritize them. Which of these is most likely to happen? Which is least likely? I’m simplifying the exercise, but it’s an important one, because data storage costs money, and data storage takes resources. And then at some point, if you decide that one of these scenarios, one of these user stories, is likely to happen, how do you get that data out? You still need to go through the process of doing the requirements to know: where does the data live, what data do I have, what data do I need? So for those of you who just want to pay the $99, stick it in BigQuery, and forget about it: not a good idea. I’m telling you right now, don’t do that. At the very least, answer these five questions: What is the purpose? Who are the people? What is the process? What platform? And how am I going to measure it, what’s my performance? Just very basic questions before you do it, at least to start the conversation. That is my begging PSA for today.

Christopher Penn 25:03
And it’s important, because when you look at what’s in the data itself, and let me go back to sharing my screen here real quick: when you’re looking at things like behaviors on your website and what content people are looking at, a 61-megabyte spreadsheet is no fun to work with, and it may not be granular enough, it may not dig into the level of detail you need. For example, with Trust Insights, we look at things like your specific goals. When I look at what people are doing with the data, behavior pageviews are in this report, but goal completions are not. So you don’t know if a piece of content contributed toward one of the goals you care about. That’s not available in the stock report, and it may or may not be available in Matomo, depending on how well your instance maps to it. So if that is of critical importance to you, you’re probably going to go with option four, which is to pull the content you want: here’s the page, here’s the goal completions, and so on, so you have a custom binding of those metrics together. But you need to know that from requirements gathering before you embark on this journey, because this part also takes time. It took about three and a half days to pull all 17 years’ worth of data; figure three or four days to get that information into a database.

Katie Robbert 26:33
Well, that’s not including the time it took to put the code together, and you’re pretty savvy with writing code at this point. If you said to me, I need you to put together some code to run against the Universal Analytics API to pull this data we’ve already done the requirements for, I would be like, alright, well, I guess I’ll see you in six months.

Christopher Penn 26:53
No, it wouldn’t take you that long, thanks to generative AI.

Katie Robbert 26:57
Well, but see, you’re making assumptions about people’s skill sets and their ability to put these pieces together. So yes, maybe six months was an exaggeration. The point being, it’s still going to take time to write that code, and also to execute it. You have to have the environment set up to run the code, you have to know how to check the code, you have to know how to check the data coming in. So there are a lot of different steps. I don’t know. John, are you convinced? Or have we talked you out of even touching this data altogether?

John Wall 27:30
Yeah, it’s just one of those things. It’s like opening Pandora’s box. If you say, hey, we’re gonna go grab this data, now you have to explain what to do with it, and if there are any problems in there, you have to make those go away. So yeah, I still just sit here and hope that the problem will go away on its own.

Katie Robbert 27:51
So Chris, knowing that you’re a data hoarder, and I say that because I know you have all kinds of historical data, how often are you referencing the historical data you’ve collected and stored away? Never. Which I think is a really good point to bring up, because you, Chris, are, of the three of us, the most data-driven: let me see what the data says. And if it’s not something you’re using, that makes me question, do we even need to go through this process? Again, there are going to be companies and teams that absolutely need to do this for a variety of reasons. For Trust Insights, I don’t know that it makes sense. I know you have some of it already, but I’m having a hard time coming up with a valid user story that would justify taking time away from other things to do this.

Christopher Penn 28:49
The only user story I can possibly come up with for this data, and it is not revenue-driving whatsoever, is that come, you know, 2028 or whenever, we say, hey, 10 years ago Trust Insights opened its website, and these are the first pages people visited. Kind of a nice walk down memory lane. That’s it. There really is no other value. For myself, I have the data because, yeah, I want to be able to look back at what blog posts were popular in 2009 or 2012, for fun. Have I done that? No, because there have been more fun things to do. But the data is there if I need it. From a business perspective, here’s the thing: if you were early in adopting GA4, say you installed it in 2020 or 2021, I would not bother going backwards, because the world since 2020 is so radically different in so many ways, even if your business fundamentally has not changed much. Trust Insights is still pretty much the same company as when we were founded in 2018: we help people do more with their data, their analytics, and their AI systems. ChatGPT certainly made everyone more aware of what AI is, but we’re still pretty much doing the same work of helping people do more with the stuff they have. The audience has changed, though. So many more people are working from home, and will continue to be, and so many more businesses are now hybrid, that looking back at 2018 and 2019, other than for historical curiosity, doesn’t offer us any valuable business insights. Even things in your CRM software, for example. When John and I were talking about this on Marketing Over Coffee this week: it used to be, for account-based marketing, you could rely on, okay, we’re getting traffic from this set of IP addresses, it’s coming from AT&T, we know that’s their building, so we know we can target those people.
Now half those employees work remotely, and you’re like, we’re getting a lot of visits from people’s Comcast accounts, because they’re working from home. So even that data is not helpful anymore, because the world has changed. So looking at your Universal Analytics data, unless you literally just switched over in July of last year and you need one full calendar year for year-over-year reporting, I don’t see any value. If you have data in GA4 from 2020 or 2021, I don’t see value in keeping that Universal Analytics data, other than to cover your butt, to say that you have the download in case somebody needs it.

Katie Robbert 31:28
Question. Theoretically, if you’ve had the data, you’ve also theoretically been doing reporting on said data. So, to your question, Chris, about which blog posts were the most important in 2019, I would wager that through our monthly reporting, we probably have that in a PowerPoint somewhere. So I think that’s also part of the conversation for companies: do you already have the reporting done on the data you’d probably care about, so that you don’t need to go through the process of pulling this data again just to look for answers you’ve already got? It’s a really good opportunity to audit what you’ve been doing, again going through the five Ps: what do we already have? That’s a big part of the conversation. Are we creating something net new? Are we doing this just to do it? Or do we already have answers to these questions? So if your question, Chris, is, I want to understand which blog posts were the most important in 2019, which ones were converting, I can almost guarantee you that we answered that question in 2019, because that’s the question we’ve been asking since we started the company.

Christopher Penn 32:55
Yep, exactly. However, for small organizations, or in cases where, let’s say, a new stakeholder has taken the reins of the company and there’s been a loss of institutional knowledge, having the raw data might be situationally useful. We’ve had this happen at several clients relatively recently, where new stakeholders have come on, old stakeholders have departed, no one remembers where anything is, and no one can find anything. Still, I would not put a whole lot of stock in data prior to the pandemic. If you’ve got 15 years of Universal Analytics data, I think it’s a historical curiosity; I don’t think it’s something you should be making decisions with.

John Wall 33:40
Well, there’s one angle where it’s kind of a good insurance policy: just grab the raw data, throw it in a folder somewhere, and leave it at that. Insurance is worth something. There’s a whole industry of people just selling peace of mind.

Christopher Penn 33:56
Yeah, exactly. So this lovely folder here occupies about 600 megabytes of disk space. Compressed, it goes down to about 60 megabytes, so it goes into an archive folder and it just stays there forever. I like this approach for when you don’t have a solid use case, because it gets put on a hard drive somewhere. It doesn’t cost you money to store it, because it’s already on someone’s hard drive anyway, or it’s on a backup server somewhere, and that’s good enough. The Matomo approach, I think, is easier if you want to use the data, because that has the interface, but it does cost money to keep that operating. So I would say this approach of just backing the data up somewhere and stashing it away is the least costly approach.
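The "compress it and park it in an archive folder" approach Chris describes can be a single tar command. This is a minimal sketch, assuming a hypothetical export folder named `ua-export/`; substitute the actual path of your own Universal Analytics export:

```shell
# Hypothetical sketch: create a stand-in export folder so the example is
# self-contained (your real export would already exist on disk)
mkdir -p ua-export
printf 'date,sessions\n2019-01-01,1234\n' > ua-export/sessions.csv

# Compress the whole export into one gzipped tarball for long-term storage;
# CSV exports typically compress well, on the order of the 10x Chris mentions
tar -czf ua-archive.tar.gz ua-export

# List the archive contents to confirm the data made it in
tar -tzf ua-archive.tar.gz
```

The resulting `ua-archive.tar.gz` can then sit in cold storage or on a backup server, which matches the low-cost "insurance policy" use case discussed above.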

Katie Robbert 34:43
I think that’s fair. And just to give a nod to the highly regulated industries that will be required to do this. We were saying we know that some companies, some teams, are going to have to do this. I’m thinking about pharmaceuticals, insurance, financial services, where there are regular audits of everything. Really explore these options and what makes the most sense. So again, going through the five P’s to figure out, what are we audited on? What do the auditors need to know? If you’re in pharmaceuticals, it’s probably, what does the FDA care about? What is the trail? So it’s a really good time to do a refresher on your data governance, on what you have, what your tech stack consists of, and what you have to have to answer those questions.

Christopher Penn 35:33
But to your point, Katie, if you’re in a highly regulated industry, you’ve already been doing this with this data. This should not be a surprise.

Katie Robbert 35:40
No, it shouldn’t be. But some companies sort of have the John Wall mentality of, let me ignore it until it’s on fire.

Christopher Penn 35:50
Right. Or if you’re, for example, a defense contractor and you’re working with several three-letter agencies, you’ve got to have all this data stored somewhere, and you’ve had to all along. So you will want to have this available to be produced when their auditors come knocking and say, hey, we need the last 10 years’ worth of data.

Katie Robbert 36:08
I will say to that: you’re saying if you work in a highly regulated industry, you’ve already been doing it. Having worked in a highly regulated industry, that is not the case. It is the “we’re going to ignore this until it’s on fire, and then everybody scramble and make it look like we’ve been doing it all along.”

John Wall 36:26
So you have to backtrack and clean up your tracks. Yeah.

Katie Robbert 36:31
In a perfect world, people plan ahead. In reality, it’s, oh crap, I needed this done yesterday, what are we missing, and let’s hope nobody notices.

Christopher Penn 36:41
If you’re in a highly regulated industry, and you need to cover your tracks and get all of your data from the last 10 years processed, please contact us. We are happy to extract the data for you and process it.

Katie Robbert 36:50
Yeah, if you want to know what it’s like to be audited by the FDA, reach out. I’ll tell you. But with that, we’re sort of talking through this from a Trust Insights perspective, because we’ve gone through the steps and we have seen what needs to happen. So if you do, in all seriousness, need help with exporting your data and putting it somewhere where you can actually access it, please feel free to reach out to us at TrustInsights.ai/contact, and we will not tell you not to do it. We will help you understand what’s going on and actually help you do the thing.

Christopher Penn 37:34
That said, I would also encourage you to do data governance on your current analytics, because if you haven’t been doing it on the old stuff, you probably haven’t been doing it on the new stuff either.

Katie Robbert 37:45
Yeah, we can help with that. I mean, this is a whole different topic. I don’t know why data governance is so hard for people to do. Just do it. It makes your life easier.

Christopher Penn 38:01
It’s the same reason people don’t comment their code.

John Wall 38:06
Yeah, eat right, go to the gym. It’s all right there.

Christopher Penn 38:12
Buy low, sell high. It sounds easy.

Katie Robbert 38:17
Well, on that note…

Christopher Penn 38:20
On that note, thanks, folks, for tuning in. Please do look at the options, but do your user stories and do your five P’s first before you go grabbing a whole bunch of data that you can’t use. And really critically, if there is data you do need, make sure you actually get it. So that’s going to do it for this week’s show. We will talk to you all next time. Thanks for watching today. Be sure to subscribe to our show wherever you’re watching it. For more resources and to learn more, check out the Trust Insights podcast at TrustInsights.ai/podcast, and our weekly email newsletter at TrustInsights.ai/newsletter. Got questions about what you saw on today’s episode? Join our free Analytics for Marketers Slack group at TrustInsights.ai/analyticsformarketers. See you next time.
