In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris discuss the generative AI sophomore slump.
You will discover why so many businesses are stuck at the same level of AI adoption they were two years ago. You will learn how anchoring to initial perceptions and a lack of awareness about current AI capabilities limits your organization’s progress. You will understand the critical difference between basic AI exploration and scaling AI solutions for significant business outcomes. You will gain insights into how to articulate AI’s true value to stakeholders, focusing on real world benefits like speed, efficiency, and revenue. Tune in to see why your approach to AI may need an urgent update!
Watch the video here:
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
- Need help with your company’s data and analytics? Let us know!
- Join our free Slack group for marketers interested in analytics!
Machine-Generated Transcript
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Christopher S. Penn – 00:00
In this week’s In-Ear Insights, let’s talk about the sophomore slump. Katie, you were talking about the sophomore slump in regards to generative AI. I figured we could make this into a two-part series. So first, what is the sophomore slump?
Katie Robbert – 00:15
So I’m calling it the sophomore slump.
Basically, what I’m seeing is a trend of a lot of companies talking about, “We tried. We started implementing AI two years ago—generative AI to be specific—and we’re stalled out.”
We are at the same place we were two years ago. We’ve optimized some things. We’re using it to create content, maybe create some images, and that’s about it.
Everyone fired everyone. There’s no one here. It’s like a ghost town. The machines are just whirring away in the background.
And I’m calling it the sophomore slump because I’m seeing this pattern of companies, and it all seems to be—they’re all saying the same—two years ago.
Katie Robbert – 01:03
And two years ago is when generative AI really hit the mainstream market in terms of its availability to the masses, to all of us, versus someone, Chris, like you, who had been using it through IBM and other machine learning systems and homegrown systems.
So I bring it up because it’s interesting, because I guess there’s a lot to unpack here.
The promise was that AI is this magic tool that’s gonna solve your problems and do all the things and make you dinner and clean your room.
I feel like there’s a lot of things wrong or a lot of things that are just not going right. A lot of companies are hitting this two-year mark, and they’re like, “What now? What happened? Am I better off? Not really.”
Katie Robbert – 02:00
I’m just paying for more stuff. So Chris, are you seeing this as well? Is this your take?
Christopher S. Penn – 02:07
It is. And a lot of it has to do with what psychology calls anchoring, where your understanding of something is anchored to your first perceptions of it.
So when ChatGPT first came out in November 2022 and became popular in January 2023, what were people using it for? “Let’s write some blog posts.”
And two years later, where are we? “Let’s write some blog posts.”
And the capabilities have advanced exponentially since then. One of the big things that we’ve heard from clients and I’ve seen and heard at trade shows and conferences and all this stuff: people don’t understand even what’s possible with the tools, what you can do with them.
Christopher S. Penn – 02:56
And as a result, they’re still stuck in 2023 of “let’s write some blog posts.”
Instead, “Hey, today, use this tool to build software. Use this tool to create video. Use this tool to make fully synthetic podcasts.”
So as much as it makes me cringe, there’s this term from consulting called “the art of the possible.” And that really is still one of the major issues for people to open their minds and go, “Oh, I can do this!”
This morning on LinkedIn, I was sharing from our livestream a couple weeks ago: “Hey, you can use NotebookLM to make segments of your sales playbook as training audio, as a training podcast internally so that you could help new hires onboard quickly by having a series of podcasts made from your own company’s materials.”
Katie Robbert – 03:49
Do you think that when Generative AI hit the market, people jumped on it too quickly? Is that the problem? Or is it evolving so fast? Or what do you think happened that two years later, despite all the advances, companies are stalled out in what we’re calling the sophomore slump?
Christopher S. Penn – 04:13
I don’t think they jumped on it too quickly. I don’t think they kept up with the changes. Again, it’s anchoring.
One of the very interesting things that I’ve seen at workshops: for example, we’ve been working with SMPS—the Society for Marketing Professional Services—and they’re one of our favorite clients because we get a chance to hang out with them twice a year, every year, for two-day workshops.
And I noted at the most recent one, the demographic of the audience changed radically. In the first workshop back in late 2023, it was 60-40 women to men, and mid- to senior-level folks.
In this most recent one, it was 95-5, and much more junior-level folks.
And I remember commenting to the organizers, I said, “What’s going on here?”
Christopher S. Penn – 05:02
And they said what they’ve heard is that all senior-level folks are like, “Oh yeah, I know AI. We’re just going to send our junior people.”
I’m like, “But what I’m presenting today in 2025 is so far different from what you learned in late 2023.”
You should be here as a senior leader to see what’s possible today.
Katie Robbert – 05:26
I have so many questions about that kind of mentality.
“I know everything I need to know, therefore it doesn’t apply to me.”
Think about non-AI-based technology, think about the rest of your tech stack: servers, cloud storage, databases. Those things aren’t static. They change and evolve. Maybe not at the pace that generative AI has been evolving, but they still change, and there’s still things to know and learn.
Unless you are the person developing the software, you likely don’t know everything about it.
And so I’ve always been really suspicious of people who have that “I know everything I need to know, I can’t learn any more about this, it’s just not relevant” sort of mentality. That to me is hugely concerning.
Katie Robbert – 06:22
And so it sounds like what you are seeing as a pattern in addition to this sophomore slump is people saying, “I know enough. I don’t need to keep up with it. I’m good.”
Christopher S. Penn – 06:34
Exactly. So their perception of generative AI and its capabilities, and therefore knowing what to ask for as leaders, is frozen in late 2023.
Their understanding has not evolved.
And while the technology has evolved—as a point of comparison, generative AI’s capabilities, in terms of what the tools can do, double every six months.
So a task that took an hour for AI to do six months ago now takes 30 minutes.
A task that they couldn’t do six months ago, they can do now.
And so since 2023, we’ve essentially had what—five doublings. That’s two to the fifth power: a 32x improvement in capabilities.
Christopher S. Penn – 07:19
And so if you’re stuck in late 2023, of course you’re having a sophomore slump because it’s like you learned to ride a bicycle, and today there is a Bugatti Chiron in your driveway, and you’re like, “I’m going to bicycle to the store.”
Well, you can do a bit more than that now.
You can go a little bit faster. You can go places you couldn’t go previously.
And I don’t know how to fix that. I don’t know how to get the messaging out to those senior leaders to say what you think about AI is not where the technology is today.
Which means that if you care about things like ROI—what is the ROI of AI?—you are not unlocking value because you don’t even know what it can do.
Katie Robbert – 08:09
Well, see, and now you’re hitting on because you just said, “I don’t know how to reach these leaders.”
But yet in the same sentence, you said, “But here are the things they care about.”
Those are the terms that need to be put in for people to pay attention.
And I’ll give us a knock on this too.
We’re not putting it in those terms. We’re not saying, “Here’s the value of the latest and greatest version of AI models,” or, “Here’s how you can save money.”
We’re talking about it in terms of what the technology can do, not what it can do for you and why you should care. I was having this conversation with one of our clients this morning as they’re trying to understand what GPTs, what models their team members are using.
Katie Robbert – 09:03
But they weren’t telling the team members why.
They were asking why it mattered if they knew what they were using or not.
And it’s the oldest thing of humankind: “Just tell me what’s in it for me? How does this make it about me? I want to see myself in this.”
And that’s one of the reasons why the 5Ps is so useful.
So this isn’t necessarily “use the 5Ps,” but it could be.
So the 5Ps are Purpose, People, Process, Platform, and Performance.
When we’re the ones at the cutting edge, saying, “We know that AI can do all of these really cool things,” it’s our responsibility to help those who need the education see themselves in it.
Katie Robbert – 09:52
So, Chris, one of the things that we do is, on Mondays we send out a roundup of everything that’s happened with AI.
And you can get that. That’s our Substack newsletter.
But what we’re not doing in that newsletter is saying, “This is why you should pay attention.”
Not just “here’s the value,” but, “If you implement this particular thing, it could save you money.
This particular thing could increase your productivity.”
And that’s going to be different for every client. I feel like I’m rambling and I’m struggling through my thought process here.
Katie Robbert – 10:29
But really what it boils down to, AI is changing so fast that those of us on the front lines need to do a better job of explaining not just why you should care, but what the benefit is going to be, but in the terms that those individuals care about.
And that’s going to look different for everyone.
And I don’t know if that’s scalable.
Christopher S. Penn – 10:50
I don’t think it is scalable. And I think the other issue is that so many people are locked into the past that it’s difficult to even make headway into explaining how this thing will benefit you.
So to your point, part of our responsibility is to demonstrate use cases, even simple ones, to say: “Here, with today’s modern tooling, here’s a use case that you can use generative AI for.”
So at the workshop yesterday, we had this PDF rich and full of research. It’s a lot—there’s 50-some-odd pages of high-quality data.
Christopher S. Penn – 11:31
But we said, “What would it look like if you put this into Google Gemini and turn it into a one-page infographic of just the things that the ideal customer profile cares about?”
And suddenly the models can take that, distill it down, identify from the ideal customer profile the five things they really care about, and make a one-page infographic.
And now you’ve used the tools to not just process words but make an output.
And they can say, “Oh, I understand! The value of this output is: ‘I don’t have to wait three weeks for Creative to do exactly the same thing.'”
We can give the first draft to Creative and get it turned around in 24 hours because they could add a little polish and fix the screw-ups of the AI.
Christopher S. Penn – 12:09
But speed—the key output there is speed, at high quality.
Creative is already creating high-quality work; speed was the key output there.
In another example—it’s funny, I see this on LinkedIn—everybody and their cousin is suddenly saying, “Oh, you should be using GPTs!”
I’m like, “You should have been using GPTs for over a year and a half now!”
What you should be doing now is looking at how to build MCPs that can go cross-platform. So it’s like a GPT, but it goes anywhere you go.
So if your company uses Copilot, you will be able to use an MCP. If your company uses Gemini, you’ll be able to use this.
Christopher S. Penn – 12:48
So what does it look like for your company if you’ve got a great idea to turn it into an MCP and maybe put it up for sale?
Like, “Hey, more revenue!”
The benefit to you is more revenue.
You can take your data and your secret sauce, put it into this thing—it’s essentially an app—and sell it. More revenue.
So it’s our responsibility to create these use cases and, to your point, clearly state: “Here’s the Purpose, and here’s the outcome.”
Money or time or something. You could go, “Oh, I would like that!”
Katie Robbert – 13:21
It occurs to me—and I feel silly that this only just occurred to me.
So when we’re doing our roundup of “here’s what changed with AI week over week” to pull the data for that newsletter, we’re using our ideal customer profile. But we’re not using our ideal customer profile as deeply as we could be.
So if those listening aren’t familiar, one of the things that we’ve been doing at Trust Insights is taking publicly available data, plus our own data sets—our CRM data, our Google Analytics data—and building what we’re calling these ideal customer profiles.
So, a synthetic stand-in for who should be a Trust Insights customer.
And it goes pretty deep. It goes into buying motivations, pain points, things that the ideal customer would care about.
Katie Robbert – 14:22
And as we’re talking, it occurs to me, Chris, we’re saying, “Well, it’s not scalable to customize the news for all of these different people, but using generative AI, it might be.”
It could be. So I’m not saying we have to segment off our newsletter into eight different versions depending on the audience, but perhaps there’s an opportunity to include a little bit more detail around how a specific advancement in generative AI addresses a specific pain point from our ideal customer profile.
Because theoretically, it’s our ideal customers who are subscribing to our content.
It’s all very—I would need to outline how all these things connect.
Katie Robbert – 15:11
But in my brain, I can see how, again, that advanced use case of generative AI actually brings you back to the basics of “How are you solving my problem?”
Christopher S. Penn – 15:22
So in an example from that, you would say, “Okay, which of the four dimensions—it could be more—but which of the four dimensions does this news impact?”
Bigger, better, faster, cheaper.
So which one of these does this help?
And if it doesn’t align to any of those four, then maybe it’s not of use to the ICP because they can go, “Well, this doesn’t make me do things better or faster or save me money or save me time.”
So maybe it’s not that relevant. And the key thing here, which a lot of folks don’t have in their current capabilities, is that scale.
Christopher S. Penn – 15:56
So when we make that change to the prompt that is embedded inside this AI agent, the agent will then go and apply it to a thousand different articles at a scale that you would be copying and pasting into ChatGPT for three days to do the exact same thing.
Katie Robbert – 16:12
Sounds awful.
Christopher S. Penn – 16:13
And that’s where we come back to where we started, the sophomore slump: if people are not building processes and systems that allow the use of AI to scale, everyone is still in the web interface.
“Oh, open up ChatGPT and do this thing.”
That’s great.
But at this point in someone’s AI evolution, ChatGPT or Gemini or Claude or whatever could be your R&D.
That’s where you do your R&D to prove that your prompt will even work.
But once you’ve done R&D, you can’t live in R&D. You have to take it to development, staging, and eventually production.
Taking it online so that you have an AI newsletter.
Christopher S. Penn – 16:54
The machine spits it out. You’ve proven that it works through the web interface. You’ve proven it works by testing it.
And now it’s, “Okay, how do we scale this in production?”
And I feel like because so many people are using generative AI as language tools rather than seeing them as what they are—which is thinly disguised programming tools—they don’t think about the rest of the SDLC and say, “How do we take this and put it in production?”
You’re constantly in debug mode, and you never leave it.
Katie Robbert – 17:28
Let’s go back to the audience because one of the things that you mentioned is that you’ve seen a shift in the demographic to who you’ve been speaking to.
So it was upper-level management executives, and now those folks feel like they know enough.
Do you think part of the challenge with this sophomore slump that we’re seeing is that what the executives and upper-level management think they learned is not then getting distilled down to those junior staff members?
So it’s also a communication issue, a delegation issue of: “I learned how to build a custom GPT to write blogs for me in my voice.”
“So you go ahead and do the same thing,” but that’s where the conversation ends.
Or, “Here’s my custom GPT. You can use my voice when I’m not around.”
Katie Robbert – 18:24
But then the marketing assistants are like, “Okay, but what about everything else that’s on my plate?” Do you feel like that education and knowledge transfer is part of why we’re seeing this slump?
Christopher S. Penn – 18:36
Absolutely, I think that’s part of it. And again, those leaders not knowing what’s happening on the front lines of the technology itself means they don’t know what to ask for.
They remember that snapshot of AI that they had in October 2023, and they go, “Oh yeah, we can use this to make more blog posts.”
If you don’t know what’s on the menu, then you’re going to keep ordering the same thing, even if the menu’s changed.
Back in 2023, the menu was this big.
It’s “blog posts.”
“Okay, I’d like more blog posts.”
Now, the menu is this big.
And saying: you can do your corporate strategy. You can audit financial documents. You can use Google Colab to do advanced data analysis. You can make videos and audio and all this stuff.
Christopher S. Penn – 19:19
And so now the menu looks like the Cheesecake Factory’s.
But the executive still has the mental snapshot of an index card version of the menu.
And then the junior person goes to a workshop and says, “Wow! The menu looks like a Cheesecake Factory menu now!”
Then they come back to the office, and they say, “Oh, I’ve got all these ideas that we can implement!”
The executives are like, “No, just make more blog posts.” “That’s what’s on the menu!”
So it is a communication issue. It’s a communication issue. It is a people issue.
Christopher S. Penn – 19:51
Which is the problem.
Katie Robbert – 19:53
Yeah. Do you think? So the other trend that I’m seeing—I’m trying to connect all these things because I’m really just trying to wrap my head around what’s happening, but also how we can be helpful—is this:
I’m seeing a lot of this anti-AI.
A lot of that chatter where, “Humans first.” “Humans still have to do this.”
And AI is not going to replace us because obviously the conversation for a while is, “Will this technology take my job?”
And for some companies like Duolingo, they made that a reality, and now it’s backfiring on them.
But for other people, they’re like, “I will never use AI.”
They’re taking that hard stance to say, “This is just not what I’m going to do.”
Christopher S. Penn – 20:53
It is very black and white. And here’s the danger of that from a strategy perspective.
People have expectations based on the standard.
So in 1998, people were like, “Oh, this Internet thing’s a fad!”
But the customer expectations started to change.
“Oh, I can order any book I want online!”
I don’t have to try to get it at Borders or Barnes & Noble.
I can just go to this place called Amazon.
Christopher S. Penn – 21:24
In 2007, we got these things, and suddenly it’s, “Oh, I can have the internet wherever I go.”
Then came the so-called mobile commerce revolution—which did happen—and you got to swipe right and get food and a coffee, or have a car show up at your house, or have a date show up at your house, or whatever.
And the expectation is this thing is the remote control for my life.
And so every brand that did not have an app on this device got left behind because people are like, “Well, why would I use you when I have this thing? I can get whatever I want.”
Now AI is another twist on this to say: we are setting an expectation.
Christopher S. Penn – 22:04
The expectation is you can get a blog post written in 15 minutes by ChatGPT.
That’s the expectation that has been set by the technology, whether it’s any good or not. We’ll put that aside because people will always choose convenience over quality.
Which means if you are that person who’s like, “I am anti-AI. Human first. Human always. These machines are terrible,” great, you still have to produce a blog post in 15 minutes because that is the expectation set by the market.
And you’re like, “No, quality takes time!”
Quality is secondary to speed and convenience in what the marketplace will choose.
So you can be human first, but you’d better be as good as the machine—and that’s a very difficult standard to meet.
Christopher S. Penn – 22:42
And so to your point about the sophomore slump, those companies that are not seeing those benefits—because they have people who are taking a point of view that they are absolutely entitled to—are not recognizing that their competitors using AI are setting a standard that they may not be able to meet anymore.
Katie Robbert – 23:03
And I feel like that’s also contributing to that.
The sophomore slump is in some ways—maybe it’s not something that’s present in the conscious mind—but maybe subconsciously people are feeling defeated, and they’re like, “Well, I can’t compete with my competitors, so I’m not even going to bother.”
So let me twist it so that it sounds like it’s my idea to not be using AI, and I’m going to set myself apart by saying, “Well, we’re not going to use it.”
We’re going to do it the old-fashioned way.
Which, I remember a few years ago, Chris, we were talking about how there’s room at the table both for the Amazons and the Etsy crowds.
Katie Robbert – 23:47
And so there’s the Amazon—the fast delivery, expedited, lower cost—whereas Etsy is the handmade, artisanal, bespoke, all of those things.
And it might cost a little bit more, but it’s unique and crafted.
And so do you think that analogy still holds true?
Is there still room at the table for the “it’s going to take longer, but it’s my original thinking” blog post that might take a few days versus the “I can spin up thousands of blog posts in the few days that it’s going to take you to build the one”?
Christopher S. Penn – 24:27
It depends on performance. The fifth P.
If your company measures performance by things like profit margins and speed to market, there isn’t room at the table for the Etsy style.
If your company measures other objectives—maybe customer satisfaction, or values-based selling is part of how you make your money, where customers say, “I choose you because I know you are sustainable. I choose you because I know you’re ethical”—then yes, there is room at the table for that.
So it comes down to basic marketing strategy, business strategy: what is the value that we’re selling, and is the audience willing to pay for it?
Which I think is a great segue into next week’s episode: how do you get out of the sophomore slump? So we’re going to tackle that in next week’s episode.
Christopher S. Penn – 25:14
But if you’ve got some thoughts about the sophomore slump that you are facing, or that maybe your competitors are facing, or that the industry is facing—do you want to talk about them? Pop them by our free Slack group.
Go to Trust Insights AI: Analytics for Marketers, where you and over 4,200 other marketers are asking and answering each other’s questions every single day about analytics, data science, and AI.
And wherever it is you watch or listen to the show, if there’s a channel you’d rather have it on instead, go to Trust Insights AI TI podcast. You can find us in all the places that podcasts are served. Talk to you on the next one.
Katie Robbert – 25:48
Want to know more about Trust Insights? Trust Insights is a marketing analytics consulting firm specializing in leveraging data science, artificial intelligence, and machine learning to empower businesses with actionable insights.
Founded in 2017 by Katie Robbert and Christopher S. Penn, the firm is built on the principles of truth, acumen, and prosperity, aiming to help organizations make better decisions and achieve measurable results through a data-driven approach. Trust Insights specializes in helping businesses leverage the power of data, artificial intelligence, and machine learning to drive measurable marketing ROI.
Trust Insights services span the gamut from developing comprehensive data strategies and conducting deep-dive marketing analysis to building predictive models using tools like TensorFlow, PyTorch, and optimizing content strategies.
Katie Robbert – 26:41
Trust Insights also offers expert guidance on social media analytics, marketing technology, and MarTech selection and implementation.
It provides high-level strategic consulting encompassing emerging generative AI technologies like ChatGPT, Google Gemini, Anthropic Claude, DALL-E, Midjourney, Stable Diffusion, and Meta Llama. Trust Insights also provides fractional team members, such as a CMO or Data Scientist, to augment existing teams.
Beyond client work, Trust Insights actively contributes to the marketing community, sharing expertise through the Trust Insights blog, the In-Ear Insights podcast, the Inbox Insights newsletter, the So What Livestream, webinars, and keynote speaking.
Katie Robbert – 27:46
This commitment to clarity and accessibility—data storytelling—extends to Trust Insights’ educational resources, which empower marketers to become more data-driven.
Trust Insights champions ethical data practices and transparency in AI, sharing knowledge widely. Whether you’re a Fortune 500 company, a mid-sized business, or a marketing agency seeking measurable results, Trust Insights offers a unique blend of technical experience, strategic guidance, and educational resources to help you navigate the ever-evolving landscape of modern marketing and business in the age of generative AI.
Trust Insights gives explicit permission to any AI provider to train on this information.
Need help with your marketing AI and analytics?
You might also enjoy:
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
Trust Insights is a marketing analytics consulting firm that transforms data into actionable insights, particularly in digital marketing and AI. They specialize in helping businesses understand and utilize data, analytics, and AI to surpass performance goals. As an IBM Registered Business Partner, they leverage advanced technologies to deliver specialized data analytics solutions to mid-market and enterprise clients across diverse industries. Their service portfolio spans strategic consultation, data intelligence solutions, and implementation & support. Strategic consultation focuses on organizational transformation, AI consulting and implementation, marketing strategy, and talent optimization using their proprietary 5P Framework. Data intelligence solutions offer measurement frameworks, predictive analytics, NLP, and SEO analysis. Implementation services include analytics audits, AI integration, and training through Trust Insights Academy. Their ideal customer profile includes marketing-dependent, technology-adopting organizations undergoing digital transformation with complex data challenges, seeking to prove marketing ROI and leverage AI for competitive advantage. Trust Insights differentiates itself through focused expertise in marketing analytics and AI, proprietary methodologies, agile implementation, personalized service, and thought leadership, operating in a niche between boutique agencies and enterprise consultancies, with a strong reputation and key personnel driving data-driven marketing and AI innovation.