So What? Marketing Analytics and Insights Live
airs every Thursday at 1 pm EST.
You can watch on YouTube Live. Be sure to subscribe and follow so you never miss an episode!
You’ll discover how to determine whether your brand faces a GEO problem by examining search data. This investigation reveals connections between content creation and the way AI models recommend services. By mastering these diagnostic techniques, you’ll gain the clarity required to fix a GEO problem before it impacts the bottom line. The shift from guessing to measuring allows you to stay visible as generative engines evolve.
Watch the video here:
Can’t see anything? Watch it on YouTube here.
In this episode you’ll learn:
- Why AI search is eating your organic traffic
- How to find out if ChatGPT, Claude, Perplexity, and Gemini are recommending your brand
- What to do if you have a GEO problem
Transcript:
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Katie Robbert – 00:37
Well, hey everyone. Happy Thursday. Welcome to So What, the Marketing Analytics and Insights live show. I am Katie, joined by Chris and John. Howdy, fellows.
Katie Robbert – 00:53
This week, we’re going to talk about how to analyze if you have a GEO problem. If you have been living under a rock or willfully ignoring what is happening in the marketing space—which, listen, I get it—then you have not heard about GEO. Otherwise, you are probably inundated with GEO requests.
GEO stands, I believe, for Generative AI Engine Optimization or Generative Engine Optimization. Or it’s an old car that nobody drives anymore, an old Geo. There’s a throwback. What we want to talk about today is the problem a lot of executives and marketers in general are facing: “Do we have a GEO problem? How do we optimize for GEO?” Well, first and foremost, you need to understand if there is even a problem to solve, which is what we’re going to do today. So Chris, where would you like to start?
Christopher Penn – 01:53
I kind of want to start by saying we have a solution for the problem, and that is our new course on GEO 101. But that’s putting the cart before the horse. However, there’s the URL if you want it.
Katie Robbert – 02:05
We have to set up the problem first and help people understand what to even look for.
Christopher Penn – 02:11
Fundamentally, there are three phases of GEO. First, what does the AI model of choice know natively about you, your brand, your company, and your products and services? Second, when an AI tool goes and executes a traditional search, does it find you? Third, once it finds you, does what your website serves up actually work for the AI model?
This is occurring in many different tools—Gemini, Claude, Perplexity, ChatGPT—but the 800-pound gorilla is Google AI overviews in regular Google search. According to SparkToro and Datos, as of January 26th that is 93% of all search, period, and about 95% of AI search. So when we’re talking GEO, we’re really talking about Google still.
Katie Robbert – 03:15
That’s an underwhelming fact, but at the same time, it’s good news for marketers who understand how Google works and traditional technical SEO. If you understand on-site, off-site, and technical SEO, you’re already ahead of the game.
Because if 95% to 97% of what we’re talking about is still Google, then you can just continue to optimize with a few extra steps. But first, we should probably start with how to do the analysis so that you know what you’re measuring against to see if you actually improved.
Christopher Penn – 04:03
There are two places where you want to spend your time. The first place is going to be old-fashioned Google Search Console. When it comes to GEO and AI overviews, Google has said that AI overviews count towards your impressions, which is the purple line in your Google Search Console.
If you are not showing up in AI overviews, your impressions go down. What we see here for Trust Insights is that the purple line is at 37,000 impressions for that week and just going up and to the right, which is great.
Christopher Penn – 04:55
That means more people are searching for things for which Google—in regular search and in AI overviews—thinks we are, in part or in full, the answer. Now, Google does not necessarily break this out to say what percentage of it is AI overviews. We don’t know the percentage of people who are seeing us in the quick results versus the AI overviews.
If I were to Google and literally type the word “Trust Insights” as a terrible example in incognito mode, you’d see the quick answers and then all the other stuff that Google loves to throw in here.
Christopher Penn – 05:51
We’re the first name, which is good. But this is the first place I would look: which direction is that purple line going? If it’s going down, Google is saying that whatever people are searching for related to you, you’re not the answer.
Katie Robbert – 06:13
We often use examples like, “Give me the top consulting firms in Boston,” or “Give me consulting firms that can educate me on AI.” Those are things we’d probably want to show up for. We’ve actually seen a few people come through our contact form saying they found us in some kind of generative AI, whether it be OpenAI or some other large language model.
So, we’re doing something right. John, when you see those, what are you thinking in terms of needing more people coming from that, or do you just thank goodness people are reaching out at all?
John Wall – 06:55
Well, yeah, especially given all the trending for organic cratering over the past couple of years. We are definitely riding a wave of success because we see prospects coming in and saying they asked Claude who should teach them about AI, and Claude said Trust Insights.
Both Claude and Gemini have been sources of inbound traffic for us, though I haven’t seen any OpenAI in the past month. That’s classic inbound—quality leads. To have the tool itself tell you—it’s like the old days of SEO, when every vendor that wasn’t first for their own name was third-tier. To be at the top of the list is great.
John Wall – 07:40
We just need to continue on our path. It gives us credence every time we talk about this.
Christopher Penn – 07:50
The second place you should look is Google Analytics. We’ve covered in past episodes how to set up this kind of dashboard in the Explore Hub to look at the clickstream traffic to your website from the different major AI tools: Gemini, ChatGPT, Perplexity, and Claude. This does not include Google AI overviews because they show up as regular Google organic traffic. Here, I put up Gemini and ChatGPT just for comparison for the last 12 months.
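A minimal sketch of how that kind of AI-referral segmentation can work, assuming these referrer hostnames (the list is illustrative, not exhaustive—verify against your own referral reports):

```python
# Minimal sketch: classify session referrers into AI-tool sources,
# mirroring the kind of segmentation a GA4 Explore report would use.
# The hostname list below is an assumption for illustration.
from urllib.parse import urlparse

AI_SOURCES = {
    "gemini.google.com": "Gemini",
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "claude.ai": "Claude",
}

def classify_referrer(referrer_url: str) -> str:
    """Return the AI tool name for a referrer URL, or 'Other'."""
    host = urlparse(referrer_url).netloc.lower()
    # Strip a leading "www." so www.perplexity.ai still matches.
    if host.startswith("www."):
        host = host[4:]
    return AI_SOURCES.get(host, "Other")

print(classify_referrer("https://gemini.google.com/app"))    # Gemini
print(classify_referrer("https://www.perplexity.ai/search")) # Perplexity
print(classify_referrer("https://news.example.com/post"))    # Other
```

Note that, as discussed below, Google AI overviews never show up this way—they arrive as regular Google organic traffic, so no referrer rule can isolate them.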
Christopher Penn – 08:38
Gemini is headed up, and ChatGPT has kind of arched. Before anyone hits the panic button, the market share of ChatGPT itself has declined by about 20 points in the last year. ChatGPT used to be the only game in town, and now Gemini is the second-largest tool used in the marketplace, followed by Copilot. If you see this in isolation, it looks like something’s wrong, but in the bigger picture of the AI industry, it’s normal because fewer people are using ChatGPT than they were a year ago.
Katie Robbert – 09:23
I want to expand on the AI overviews that show up on a Google search page. We’ve all heard someone say their boss told them they have to be showing up in those. Can you do anything about that, and if you can, how do you measure it? It’s worth acknowledging the difference between those AI overviews and actually getting seen in a large language model for generative AI search.
Christopher Penn – 10:01
There are three different ways that Google can recommend you using AI. The first is Gemini itself: when you ask it for a consulting firm in Boston that specializes in AI, it gives an answer. The second is in regular Google: you ask the same thing and you get ads, an AI overview, and then more stuff. The third way is AI Mode, where a lighter version of Gemini and Deep Research kicks off.
Christopher Penn – 11:03
Of these three, AI Mode and AI overviews show up as regular Google search traffic. They are not specially marked. Only Gemini will be marked as coming from gemini.google.com.
Katie Robbert – 11:15
That’s an important distinction for those struggling with pushing back on whoever is asking to show up in AI overviews. That’s great, but you can’t measure whether or not that actually happens. Googling it for yourself is always a bad idea; stop Googling your own company and asking why you aren’t in the first spot.
Christopher Penn – 11:39
Another key piece from the SparkToro paper is how often different AI tools recommend the same vendors in the same order. For Claude, it would take 1,492 searches before it returned the same list twice. For Gemini, it was 124 searches. Anyone peddling that they can tell you what your rank is in Google AI overviews is lying. You can’t predict that because the tool is probabilistic; you’re going to get different results even for the same query.
Christopher Penn – 12:28
If you literally copy and paste the same query over and over, you’re going to get different results.
Katie Robbert – 12:35
Looking at traffic data from these models is largely unreliable in terms of where it fits in the customer journey. It sounds like a bit of a crapshoot as to whether or not you should spend more resources optimizing for it.
Christopher Penn – 12:57
It’s a crapshoot if you’re trying to be precise. The world where you rank “number two” for a keyword phrase is gone. However, if a tool powered by a large language model is recommending you, that is knowable from Search Console impressions and clickstream tracking. The gold standard is talking to customers; they will tell you if an AI overview recommended you.
Katie Robbert – 13:45
A lot of companies look at KPIs like ranking number one or two against competitors. That is not knowable from these models because it’s a crapshoot as to whether they’ll recommend you. You definitely can’t tell if you showed up alongside your competitor or if they showed up first.
Christopher Penn – 14:14
Because of the nature of large language models, even small deviations in a prompt can have different results. A year ago, Tim Soulo from Ahrefs and Olga Andrienko—then of Semrush—each posted on LinkedIn that their tool was number one in ChatGPT. Tim’s query was, “What are the top 10 SEO tools?” while Olga’s was, “What are the top 10 best SEO tools?” That one word difference was enough to change the results.
Christopher Penn – 15:03
Anyone telling you they know what people are typing into ChatGPT is lying. These companies are never going to give away their most proprietary data, which is the customer interactions.
Katie Robbert – 15:38
The easiest way to find out how people are finding you is to talk to them. The second easiest is to add it to your “How did you hear about us?” options. Give people the option of AI mode or AI overviews, and when they check the box, add it to your numbers.
Regarding that head-to-head experiment, changing the query means it’s no longer apples to apples. I would fully expect different results versus if they had used the exact same phrasing. It’s about understanding that the algorithm is unpredictable.
Christopher Penn – 17:08
The bigger question is: is what you’re doing working? We can see the lines going up or down in Search Console, but you need to be able to say how that relates to what you’re doing so you know what to do more of. That isn’t visible there; you can only see the outcome.
Christopher Penn – 17:53
You’ve got to look at your data and determine if what you are doing has a statistical relationship with the outcome. You need Search Console data. Ideally, you’ve been archiving it because you can only download 16 months at a time. We’ve been downloading our company data for eight years.
You also need activity data. Part of GEO is knowing how AI models work—be everywhere on podcasts and social media channels. If you don’t do those things, models have no ability to gather training data about you. We use Brand24 to collect this data. If you do a statistical analysis to see what matters, you can move the needle.
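As a sketch of that “blend and clean” step, here is a hypothetical join of daily Search Console impressions with daily coverage counts; the column names and numbers are invented for illustration:

```python
# Hypothetical sketch of blending two daily series on date.
# The inline data and column names are made up for illustration;
# real inputs would come from Search Console and coverage exports.
import pandas as pd

impressions = pd.DataFrame({
    "date": pd.to_datetime(["2025-01-01", "2025-01-02", "2025-01-03"]),
    "impressions": [1200, 1350, 1500],
})
coverage = pd.DataFrame({
    "date": pd.to_datetime(["2025-01-01", "2025-01-03"]),
    "mentions": [4, 7],
})

# Left-join so every Search Console day survives; days with no
# coverage become 0 mentions rather than NaN.
blended = impressions.merge(coverage, on="date", how="left")
blended["mentions"] = blended["mentions"].fillna(0).astype(int)
print(blended)
```

The left join matters: dropping days with zero coverage would bias any later time-lag analysis toward days when something happened.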
Katie Robbert – 20:13
Can you use generative AI to do that statistical analysis, or should you hand that off to a human?
Christopher Penn – 20:24
Yes to both.
Katie Robbert – 20:30
It depends on your tools, your proficiency, and your resources. A human who understands it can probably do it faster than someone fumbling around.
Christopher Penn – 21:01
If you have a paid plan for Claude, ChatGPT, or Gemini and access to a coding environment, you don’t need to know how to code. You explain that you have a big pile of data and want to do this kind of analysis. Tell it to ask you questions and then build the tool.
Christopher Penn – 21:51
You’d say you’re going to do statistical analysis yielding Python code. You’d use Search Console data, Brand24 coverage, and domain rating data from Ahrefs. You tell it to build a tool to blend and clean all this data.
Christopher Penn – 22:23
I suggest using cross-correlation functions to understand the time lag between an action and a result. We want to calibrate on Search Console impressions because that’s when Google says you’re the answer. When I put this into Claude Code, it asked me about Granger causality.
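One way to sketch that kind of lag scan, using synthetic data with a known 27-day lag baked in (the real analysis used far more data and richer cross-correlation functions; this just shows the shape of the idea):

```python
# Lagged-correlation scan: for each candidate lag, correlate earlier
# "coverage" against later "impressions", and keep the lag with the
# strongest correlation. The data is synthetic, with a 27-day lag
# injected so the scan has something to find.
import numpy as np

rng = np.random.default_rng(42)
days = 365
coverage = rng.poisson(lam=3, size=days).astype(float)

# Impressions respond to coverage 27 days later, plus noise.
true_lag = 27
impressions = np.roll(coverage, true_lag) * 50 + rng.normal(0, 10, days)

def best_lag(x, y, max_lag=60):
    """Return the lag (in days) at which x best predicts y."""
    scores = {}
    for lag in range(1, max_lag + 1):
        # Compare x[:-lag] (earlier activity) with y[lag:] (later outcome).
        scores[lag] = np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return max(scores, key=scores.get)

print(best_lag(coverage, impressions))  # finds the injected 27-day lag
```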
John Wall – 23:40
No, that’s beyond my Stats 101.
Christopher Penn – 23:44
Granger causality looks at whether a time-lagged leading indicator predicts an outcome. Claude explained it by saying if your roommate always buys beer on Thursday and orders pizza on Friday, you can predict the pizza based on the beer purchase. In the same way, does coverage predict increased impressions?
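The beer-and-pizza analogy can be sketched as a bare-bones lagged-regression comparison. This is only an illustration of the idea—not a substitute for a proper Granger causality test such as statsmodels’ grangercausalitytests:

```python
# Granger-style idea in miniature: does knowing yesterday's "beer"
# series improve a forecast of today's "pizza" series beyond pizza's
# own history? Synthetic data, invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
beer = rng.normal(size=n)
# Pizza today is driven mostly by beer yesterday, plus noise.
pizza = 0.9 * np.roll(beer, 1) + rng.normal(scale=0.3, size=n)
beer, pizza = beer[1:], pizza[1:]  # drop the wrapped first sample

def lagged_r2(y, use_x, x=None):
    """R^2 of predicting y[t] from y[t-1] (and optionally x[t-1])."""
    cols = [np.ones(len(y) - 1), y[:-1]]
    if use_x:
        cols.append(x[:-1])
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    resid = y[1:] - X @ coef
    return 1 - resid.var() / y[1:].var()

r2_restricted = lagged_r2(pizza, use_x=False)
r2_full = lagged_r2(pizza, use_x=True, x=beer)
# A large jump in R^2 when the lagged predictor is added is the
# Granger-style signal that beer "predicts" pizza.
print(round(r2_restricted, 2), round(r2_full, 2))
```

In the real workflow, “beer” is earned coverage and “pizza” is Search Console impressions, with the lag swept across candidate windows as in the cross-correlation scan.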
Katie Robbert – 24:52
I thought the tool of choice for data and statistics would be Google Colab. Is that no longer the case?
Christopher Penn – 25:14
In this example, no, because of the sheer amount of data. The Ahrefs data alone is 13 files with about 100,000 rows, which would blow up Colab’s memory limits. Python code on your computer is better. I wrote zero lines of code but gave detailed instructions and used the brainstorming skill to find my blind spots.
Christopher Penn – 26:50
Does the coverage we earn translate into AI overview impressions? The answer is yes. Since AI overviews launched, we’ve seen an 84% increase in daily impressions and a 72% increase in daily clicks. Looking at the coverage, it generally takes 27 days to see impressions go up. There is a strong correlation.
Christopher Penn – 28:01
YouTube works really well for us and it works fast. When we post on YouTube, we see a significant lift on our own property, while other coverage takes a couple of weeks. Social media data—specifically from X—doesn’t really correlate anymore and screws up our predictions. Essentially, YouTube is where it’s at, followed by your own properties and promotion on LinkedIn.
Katie Robbert – 29:18
I like that this isn’t written in an overly scientific way. The big picture is that AI overviews doubled our visibility. The takeaways are quick and to the point.
Christopher Penn – 30:06
I had Claude do the data analysis first to produce graphs and tables, then I told it to write the report using a “Co-CEO” skill. It told us what to do more of—YouTube. I noticed my personal YouTube channel, which mentions Trust Insights, has an outsized impact on our visibility. The more I mention Trust Insights, the more it contributes to our overall visibility.
Katie Robbert – 31:26
We saw this with our email channels—the halo effect. With YouTube being a Google property, it makes sense that you’d get that effect.
Christopher Penn – 32:02
To analyze if you have a GEO problem, sit down with your data and a tool like Claude Code or Gemini Code Assistant. You don’t need to do the math; just give it the recipe and be clear about your purpose. This is the Trust Insights 5P framework: Purpose, People, Process, Platform, and Performance. The outcome should be a report that tells you what happened, the so what, and the now what.
Katie Robbert – 33:29
If you don’t have the vocabulary to say exactly what analysis you want, go into the conversation with the LLM and tell it what outcome you’re looking for. It’s a supportive tool to help you put that master prompt together.
Christopher Penn – 34:21
Use the best model on a paid plan; the free plans will run out of compute time. If I had to pick one, I’d pick Claude because the models are good coders and good at asking questions.
Katie Robbert – 35:22
If someone is a Microsoft shop, could they do it in Copilot?
Christopher Penn – 35:52
If you have Microsoft Copilot, see if you have access to GitHub Copilot, which is their coding environment.
Katie Robbert – 36:10
So, what do we do now?
Christopher Penn – 36:24
If you aren’t showing up, the next step is to see in the data what does work. You can also take our GEO 101 course to learn basic measurement tools. We know YouTube matters for visibility. Show up on every podcast that will have you, because that helps train the machines.
Katie Robbert – 37:49
Our GEO 101 course walks through how to measure results.
Christopher Penn – 38:37
Don’t be afraid to ask questions of these tools. One thing I noticed was that third-party coverage we’ve gotten doesn’t really move the needle compared to what we make ourselves. We actually have more control over how we show up now because the more you feed Google directly through channels like YouTube, the better you’ll do.
Katie Robbert – 40:10
If you need a Search Console refresher, we have a course for that at TrustInsights.ai/SearchConsole. John, what’s your takeaway?
John Wall – 40:44
A lot of people don’t know what’s going on. There’s some snake oil promising specific results, which is ridiculous. But we know how these models work, and if you don’t do what’s in the course, we can guarantee you won’t be in there.
Katie Robbert – 41:24
You have to do something, or you’re going to stop showing up at all. Doing nothing means getting no results.
John Wall – 42:05
You could be sabotaging yourself with an old website that’s missing structured data.
Christopher Penn – 42:31
We have a free tool for that called AI View. You can put in a URL and see the page as machines see it.
Katie Robbert – 43:08
That free tool analyzes one page at a time. If you want your entire website analyzed, reach out to John Wall.
Christopher Penn – 43:35
The tool will point out things like non-descriptive link text or whether your structured data is in good condition. You want to make sure your site is relevant when a machine retrieves it.
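To illustrate the kind of check described here, a hypothetical stdlib-only scan for non-descriptive link text—this is not the AI View tool itself, just a sketch of one of the ideas it embodies:

```python
# Hypothetical mini-audit: flag links whose visible text is vague
# ("click here", "read more", etc.), one of the checks described above.
# The VAGUE list is an assumption for illustration.
from html.parser import HTMLParser

VAGUE = {"click here", "read more", "here", "learn more", "this"}

class LinkTextAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_link = False
        self._text = []
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self._text = []

    def handle_data(self, data):
        if self._in_link:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._in_link:
            self._in_link = False
            text = " ".join(self._text).strip().lower()
            if text in VAGUE or not text:
                self.flagged.append(text or "(empty)")

page = ('<p><a href="/geo">Read more</a> about GEO, or see our '
        '<a href="/course">GEO 101 course</a>.</p>')
auditor = LinkTextAuditor()
auditor.feed(page)
print(auditor.flagged)  # ['read more']
```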
Katie Robbert – 44:24
Start taking a look at what you can measure. Start with the free tool, or take our GEO 101 course.
Christopher Penn – 45:21
That’s going to do it for this week’s episode. Check out the Trust Insights podcast and our weekly email newsletter. If you have questions, join our free Slack group. See you next time.
Need help with your marketing AI and analytics?
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday. |
Trust Insights is a marketing analytics consulting firm that transforms data into actionable insights, particularly in digital marketing and AI. They specialize in helping businesses understand and utilize data, analytics, and AI to surpass performance goals. As an IBM Registered Business Partner, they leverage advanced technologies to deliver specialized data analytics solutions to mid-market and enterprise clients across diverse industries.

Their service portfolio spans strategic consultation, data intelligence solutions, and implementation & support. Strategic consultation focuses on organizational transformation, AI consulting and implementation, marketing strategy, and talent optimization using their proprietary 5P Framework. Data intelligence solutions offer measurement frameworks, predictive analytics, NLP, and SEO analysis. Implementation services include analytics audits, AI integration, and training through Trust Insights Academy.

Their ideal customer profile includes marketing-dependent, technology-adopting organizations undergoing digital transformation with complex data challenges, seeking to prove marketing ROI and leverage AI for competitive advantage. Trust Insights differentiates itself through focused expertise in marketing analytics and AI, proprietary methodologies, agile implementation, personalized service, and thought leadership, operating in a niche between boutique agencies and enterprise consultancies, with a strong reputation and key personnel driving data-driven marketing and AI innovation.