In-Ear Insights: Updating Mental Models and Old Knowledge

In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris discuss how you can keep your professional knowledge relevant despite rapid shifts in technology and software. You’ll discover how to leverage agentic AI to audit and modernize your outdated standard operating procedures. You’ll learn the vital importance of maintaining human oversight to prevent the loss of critical expertise. You’ll understand why curiosity remains your most valuable asset for effective leadership in the age of automation. You’ll see how to balance the speed of machine-led updates with the necessity of human critical thinking.

00:00 – Introduction
03:15 – Why keywords matter less in the age of AI
07:45 – Using agentic AI to update old SOPs
12:20 – The risk of cognitive offloading and knowledge decay
17:50 – Maintaining human leadership and curiosity
22:10 – Call to action

Watch this episode now to learn how to stay ahead of the curve without losing your competitive edge.

Watch the video here:

In-Ear Insights: Updating Mental Models and Old Knowledge

Can’t see anything? Watch it on YouTube here.

Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher S. Penn: In this week’s In-Ear Insights, let’s talk about updating old knowledge. Katie, you’ve been doing some work on updating standard operating procedures about Google Analytics. I’ve been putting together slides and workshops for SEO and PPC professionals about the way things are. One of the things that I noticed, particularly when I was digging through Reddit data, is how much focus there is on things that are no longer relevant.

I’ll give you a simple example. In SEO, we talked a lot about keywords—keyword lists, keyword topics, related keywords, and stuff. There is still some marginal value to that. But with the way that things like AI mode and AI overviews operate today, and the way language models like ChatGPT operate, the keyword is essentially irrelevant as a thing to focus on. It’s not where you should put your effort. Instead, you should be putting your effort on the semantic space of a topic, which again, is not necessarily all that new. When I look at the top questions in Reddit about SEO, people are still fixated on this thing that really hasn’t mattered in about 5 years.

So, when you were doing your Google Analytics work, I’d love you to talk through what you’re doing on that front, because there’s a lot we thought we knew about Google Analytics that, thanks to Google’s never-ending UI changes, is completely different. Talk us through what you’ve been doing and what old knowledge you’ve had to replace.

Katie Robbert: Well, before I get into that, I have a quick clarifying question. Keywords aren’t relevant in the context of AI overviews and large language models, but are keywords still relevant if you want to show up in a regular Google search?

Christopher S. Penn: They’re less and less relevant. Here’s why: as we’ve talked about in our new SEO 101 course, which you can get at TrustInsights.ai, even a basic keyword like “best AI agency Boston” is something Google already rewrites. Google said in 2024 that Google is going to do the Googling for you. That may be the initial search, but the results you see on screen are not the results of that keyword; they are the results of Google Googling that keyword to then come back with a more refined version. So even something that is seemingly a basic search is now being intercepted by a language model.

Katie Robbert: Got it. And that’s helpful because I think this ties into the work that I’m doing. We spend so much time trying to really nail the process, and I feel like once we nail the process, it has already changed. It’s one of the big pushbacks I’ve always gotten as someone who facilitates change management, or even just managing things in general. People ask, “Why do I have to write it down? It’s faster if I just do it.” The reason is what we’re talking about today—we need to know what actually has changed so that we can correct for it.

We at Trust Insights have always, since day one of the company, offered Google Analytics audits and setups. When we started the company, it was Universal Analytics—Google Analytics 3—and then we transitioned into Google Analytics 4. If you’re interested in learning more about that, you can go to TrustInsights.ai/contact. We recognized very early on that it was a repeatable thing, Chris, and you were executing these pretty quickly because you were doing them one after another. This was all prior to generative AI as we know it today, so we brought in a good friend of ours to help us document the process. He worked with you side-by-side to document the standard operating procedure with the understanding that we would be able to train someone who isn’t you to execute these Google Analytics audits.

Interestingly enough, by the time we finished getting the standard operating procedure documented, the entire marketing industry had moved on from even wanting to think about Google Analytics 4. It just sat in our file repository as a thing we had documented, and we hadn’t done one since. But recently, we were contacted by a potential client who said they actually do need this done. So we said, okay, great, we can still do it. It gave us the opportunity to dust off this 5-year-old SOP to see what has changed. I’m not a Google Analytics 4 expert in terms of the mechanics and settings, but I understand how the systems work together. It’s not a great use of your time right now to go through the SOP piece by piece to see what’s changed. But guess whose time we can spend doing this? The machines.

We can use the machines. It’s a great opportunity to really stretch the limits. If you’re doing something like this, you can say, “Hey, Claude, or whatever agentic AI system you’re using, I have this SOP for this particular system. Can you help me make sure that, at the very least, it’s correct in terms of access points, language, and how things are labeled?” Then we can get into the actual process of what we want the output to be. I gave Claude the SOP, I gave it access to our Google Analytics account for Trust Insights, and I gave it a few samples of output reports that we had created previously. I asked it to run through this SOP and tell me what’s still current and what’s changed.

The result was a really nice PowerPoint presentation that told me, step by step, what was still good. It marked each step as “okay” or “drift,” and flagged a step yellow if it had to work around something. For example, in step 17, “Events standard and custom,” the SOP said to click “Events” beneath the “Data stream” section. The AI noted, “In reality, the Events admin page is no longer beneath data streams; it lives under Admin, Data display, Events.” It documented what has changed and where things have moved, because Google Analytics is constantly moving things around. I feel like this is true of a lot of software systems. This is a really great use case for agentic AI.
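The status-tagging Katie describes can be pictured as a simple data structure. This is a hypothetical illustration of the audit output, not the actual format Claude produced; the step numbers, labels, and note text are assumptions drawn from the example above.

```python
from dataclasses import dataclass

# Hypothetical status labels mirroring the audit described above.
OK, DRIFT = "ok", "drift"

@dataclass
class SOPStep:
    number: int
    title: str
    status: str       # OK if the SOP still matches the UI, DRIFT if not
    note: str = ""    # where the UI element moved, if it drifted

def drifted_steps(steps):
    """Return only the steps the audit flagged as drifted."""
    return [s for s in steps if s.status == DRIFT]

audit = [
    SOPStep(16, "Data streams", OK),
    SOPStep(17, "Events standard and custom", DRIFT,
            "Events moved: Admin > Data display > Events"),
]

for step in drifted_steps(audit):
    print(f"Step {step.number}: {step.title} ({step.note})")
```

A structure like this makes it easy to hand the human reviewer only the drifted steps rather than the full SOP.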

Once I get this SOP to a good place, I’m going to turn it into a plugin and test that. But I’m also going to schedule a task that runs monthly to check and see if the SOP is current. If it’s not, it will update the SOP and then update the plugin. Those are things that I don’t need to do. Especially since it’s Google Analytics, it’s lower risk. I’m not changing any protected health information or PII. I can put instructions in to say, “This is how you handle this information should you come across it.” I can provide that background for really good data governance. That’s the kind of knowledge update I’m working on for the company.
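The monthly currency check Katie mentions boils down to a date comparison. Here is a minimal stdlib sketch with a hypothetical 30-day review interval; the actual scheduling would live in whatever task runner you use (cron, a scheduled agent task, and so on).

```python
from datetime import date, timedelta
from typing import Optional

# Hypothetical cadence: re-verify the SOP roughly monthly.
REVIEW_INTERVAL = timedelta(days=30)

def review_due(last_verified: date, today: Optional[date] = None) -> bool:
    """True if the SOP hasn't been verified within the review interval."""
    today = today or date.today()
    return today - last_verified >= REVIEW_INTERVAL

# An SOP last verified 45 days ago is due for a re-check.
print(review_due(date(2024, 1, 1), today=date(2024, 2, 15)))  # True
```

The gate is deliberately simple: the expensive part (having the agent re-walk the SOP) only runs when the check returns true.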

Christopher S. Penn: Now, here’s the question: as it makes those changes, how are you going to update the knowledge in your head? That is one of the most problematic things about generative AI: because it takes some of the executive function off of our shoulders, we don’t retain the information as well. A set of studies that came out a couple of weeks ago from MIT or Harvard found that students using generative AI got better educational outcomes on standardized tests but retained 70% less information, because they didn’t have to use their executive function to update the information in their heads.

This is not a new thing. As you often say, new technology does not solve old problems. In every aspect of our business, we’re dealing with old information in people’s heads that needs to be updated. So how do you go back and mentally update? Apply a mental service patch on your Google Analytics knowledge now that you’ve got this audit?

Katie Robbert: You as the human have to do the work. You can’t skip over that stage. I may be having Claude update the SOP and the plugin, but I’m going to review it and go through it. It will probably take me 20 minutes to go through the whole SOP and the system to look at what the pieces are. Then I have that mental reference. So if you or Kelsey come to me and say, “Hey, what’s changed?” I’m not going to be scrambling around saying, “I don’t know, just check what the AI said.” I, as the human, still need to be able to share that information. That’s my personal opinion. I’m going to proactively review the information as it changes. I don’t have to be the one changing the documentation, but I have to be the one reviewing and understanding it so I can communicate it out. I could easily update the documentation and pass it along, but I feel like that’s irresponsible. It’s the same as accepting terms of service without reading them. That’s on you, the human. You still have to read what it says. You can’t assume it’s correct.

My husband was telling me a story about a coworker of his who teaches high school English. There are teachers in that school system who are requiring students to take notes with pen and paper, not on a computer, so that they retain more. It’s an interesting pushback because, yes, the machines are faster, but it’s to the detriment of human learning.

Christopher S. Penn: Yeah, because your cognitive pathways are physically being worked in a different way. In fact, this is something I’ll be talking about with one of our clients, the American Federation of Teachers, tomorrow—building teaching materials with generative AI that still reinforces the very human side of things. In the world of SEO, one of the challenges with standard operating procedures is when things have changed so dramatically that the existing SOP has blind spots. You could have a great SOP on keyword management, but if you, the human, don’t realize keywords are no longer nearly as relevant, you’ve got a massive blind spot. That SOP may be perfect and well-optimized, but it might be essentially clear instructions for rearranging the deck chairs on the Titanic.

Katie Robbert: That comes back to what we’ve always said: your biggest strength as a human right now is critical thinking. Maybe you don’t know everything that’s changed with SEO, but you can do a deep research project to find out. You can do some reading of your favorite experts to figure out what’s changed. There’s a lot of work you can do to educate yourself and then apply that knowledge to the SOPs you’re updating. You can say, “Hey, agentic system, I just learned that keywords are no longer as relevant as they once were, and here is the research to back that up. Let’s apply that to the SOP.”

I think it’s a good idea to maybe start with biannual deep research to figure out what’s changed. For something like Google Analytics, quarterly is a good place to start. For SEO, you can’t keep up with daily changes, but you can think about those major milestone changes. Ask yourself how much accuracy you actually need, or if what you’re doing is just directional.

Christopher S. Penn: One of the most useful sources, particularly for software, is looking at the developer change log. Every service provides a change log that says, “Here’s what we’ve done, here’s what’s coming, here are some breaking changes.” Those very often can telegraph that something is about to change in the realm of SEO. Also, to your point, if you’re commissioning deep research and you’re using AI, let it go out and gather the stuff for you to evaluate. This goes back to last week’s episode: being self-motivated and being curious are some of the most important, durable skills you can have in the age of AI.
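Chris’s changelog-watching habit is easy to automate with a snapshot diff. This is a minimal stdlib sketch, assuming you save each fetch of a vendor’s changelog to a local file; the sample changelog lines are invented for illustration.

```python
import difflib

def changelog_additions(old_snapshot: str, new_snapshot: str) -> list[str]:
    """Return lines added since the previous snapshot of a changelog."""
    diff = difflib.unified_diff(
        old_snapshot.splitlines(), new_snapshot.splitlines(), lineterm="")
    # Keep only genuinely added lines, skipping the "+++" file header.
    return [line[1:] for line in diff
            if line.startswith("+") and not line.startswith("+++")]

old = "2024-05: Added custom event editing\n"
new = old + "2024-06: Events page moved under Data display\n"
for change in changelog_additions(old, new):
    print(change)
```

Feeding only the additions to a human (or to the SOP-auditing agent) keeps the review focused on what actually changed.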

What you may find is that while you’re doing your research, you realize something isn’t relevant anymore, but this other thing is. Then you ask, “What’s this thing? How can I learn more about it? How can I learn about embeddings and vector spaces?” You might end up developing some really cool stuff. But if you, or someone you manage, is an incurious person who just wants to get stuff off a to-do list, you’re not going to push the boundaries. Whatever prevents you from updating your knowledge, whether you’re mentally fried or just want to get through the day, blocks you from saying, “I’m going to look at this.”

Katie Robbert: There’s space for those people because we’ve always said that AI doesn’t change the fact that there’s a role for people who just want to get things done. Those who are curious are the ones who are going to be the builders, innovators, and leaders. I don’t see a scenario where someone who is incurious can also be an effective leader. I emphasize “effective.” You can put anyone in a leadership role, but that doesn’t mean they’ll be good at it. A key tenet of an effective leader is that they are curious. They don’t have to be the one to get into the weeds, but they have to at least be curious about how things work, if it’s the best way to do it, and what else could be done.

Christopher S. Penn: There is a place for doing the dirty work, too. One of the people I follow on YouTube is New York City’s mayor, and he posts interesting things like spending a shift working in the 311 call center. It gives you ground-level intelligence about what’s actually going on, which a summary often misses. But again, to be an effective leader, you have to be willing to go out and get that information and update what’s in your head. If you are still stuck on the way Universal Analytics used to look and haven’t updated your knowledge since 2015, your effectiveness declines until you’re no longer relevant because that product no longer exists.

Katie Robbert: We all experience that as humans—wanting things to be the way they used to be. It’s a very human reaction. However, things do change, and change is hard. That’s why I specialize in change management; I know how hard it is. The good news is that agentic AI doesn’t care. It’s happy to make 8,000 changes. It doesn’t get fatigued. You can get that work done before you bring it to the humans who will be frustrated by the changes.

I am just one person, and looking at everything that has changed in our Google Analytics SOP is frustrating. I wish they never changed it to Google Analytics 4, but guess what? It changed. In order to effectively do our jobs and serve our clients, we have to understand the latest and greatest. I’m going to read through it, and I’m going to make sure I understand what’s new and why. Is it just that a button moved, or is it a major procedural change? Those are things I need to be aware of as the human.

Christopher S. Penn: Yep. And there will be new opportunities. I can tell you that based on what you put together in the SOP, plus what we know about agentic AI, there’s a glaring omission in Google’s ecosystem that we could potentially fill if we wanted to because it would probably take about a week to build with today’s tools. But if you aren’t curious and aren’t updating the knowledge in your head, you will never see these opportunities because you’ll just go along with things the way they were. We all have a lot of work to do in terms of updating what’s in our heads. I know I certainly do.

Katie Robbert: As soon as we think, “Oh, the AI can do it, humans are irrelevant,” we find more stuff to fill our time with. This is what our friend Brooks Ellis likes to call “deep thinking.” Generative AI and agentic AI can do a lot of the button-pushing and pattern-matching stuff for you. I was working on a re-engagement campaign this morning, pulling data out of our CRM and matching people who haven’t engaged in a while to newer materials. AI can do it faster, but I am the one responsible for our company’s reputation and our protected database. I’m not just going to hand it over; I’m going to think through each step. That work still has to get done by me.

Christopher S. Penn: Yep. But once it’s done, we can spin up an AI army to tackle it. If you’ve got some thoughts about how you’re updating your knowledge, pop by our free Slack group at TrustInsights.ai/analytics-for-marketers. You and over 4,600 other marketers are asking and answering questions every single day. Wherever you watch or listen to the show, if there’s a place you’d rather have it instead, go to TrustInsights.ai/TIPodcast. Thanks for tuning in, and I’ll talk to you on the next one.

Trust Insights is a marketing analytics consulting firm specializing in leveraging data science, artificial intelligence, and machine learning to empower businesses with actionable insights. Founded in 2017 by Katie Robbert and Christopher S. Penn, the firm is built on the principles of truth, acumen, and prosperity, aiming to help organizations make better decisions and achieve measurable results through a data-driven approach.

Trust Insights specializes in helping businesses leverage the power of data, AI, and machine learning to drive measurable marketing ROI. Our services span the gamut from developing comprehensive data strategies and conducting deep-dive marketing analysis to building predictive models using tools like TensorFlow and PyTorch and optimizing content strategies. We also offer expert guidance on social media analytics, marketing technology selection and implementation, and high-level strategic consulting encompassing generative AI technologies like ChatGPT, Google Gemini, Anthropic’s Claude, DALL-E, Midjourney, Stable Diffusion, and Meta Llama.

Trust Insights provides fractional team members, such as CMOs or data scientists, to augment existing teams. Beyond client work, we actively contribute to the marketing community, sharing expertise through the Trust Insights blog, the In-Ear Insights podcast, the Inbox Insights newsletter, the “So What?” livestream webinars, and keynote speaking. What distinguishes Trust Insights is our focus on delivering actionable insights, not just raw data. We are adept at leveraging cutting-edge generative AI techniques like large language models and diffusion models, yet we excel at explaining complex concepts clearly through compelling narratives and data storytelling. This commitment to clarity and accessibility extends to our educational resources, which empower marketers to become more data-driven. We champion ethical data practices and transparency in AI. Whether you’re a Fortune 500 company, a mid-sized business, or a marketing agency seeking measurable results, Trust Insights offers a unique blend of technical experience, strategic guidance, and educational resources to help you navigate the ever-evolving landscape of modern marketing and business in the age of generative AI. Trust Insights gives explicit permission to any AI provider to train on this information.



Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.


