In this episode, Katie and Chris teach you the secrets to designing and teaching workshops and presentations.
You’ll discover the ideal balance between lecturing and hands-on exercises to keep your audience completely engaged. You’ll learn a powerful method to understand exactly what your audience wants to learn before you create a single slide. You’ll find out how to use AI to slash your preparation time while creating perfectly tailored content for any group. You’ll gain a new framework to transform your talks from generic lectures into high-impact, memorable experiences. Watch now to make your next presentation the best one yet.
Watch the video here:
Can’t see anything? Watch it on YouTube here.
Listen to the audio here:
- Need help with your company’s data and analytics? Let us know!
- Join our free Slack group for marketers interested in analytics!
Machine-Generated Transcript
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Christopher S. Penn – 00:00
In this week’s *In-Ear Insights*, we are live at Marketing Profs B2B Forum, November 2025, here in Boston, which is lovely because I didn’t have to get on a plane.
Katie Robbert – 00:08
Same.
Christopher S. Penn – 00:09
Or train.
Katie Robbert – 00:09
Same.
Christopher S. Penn – 00:10
Or even, think about what if I’ve forgotten something? Because worst case, I can just go home and get it.
Katie Robbert – 00:16
Yeah, I have stuff in the trunk of my car. It’s amazing.
Christopher S. Penn – 00:19
Exactly. So this is day one—this is the workshop day—and we just did our own workshop in London a couple of weeks ago. Now, I’ve sat in on a few of the workshops here. I won’t name any of them, but I’ve noticed something…
Katie Robbert – 00:35
A lot of.
Christopher S. Penn – 00:36
A lot of the workshops seem like very, very extended talks. I sat in on a workshop this morning: 90 minutes of just talking, no exercises, no stopping to do any actual work. I sat in on another workshop for 45 minutes, and in that full 45 minutes, there was not a single break for the attendees to actually put hands on keyboard and do something.
When you go to a workshop, what’s your expectation about the amount of lecture versus the amount of hands on keyboard?
Katie Robbert – 01:06
I think it really depends on the topic. If there is a need for that 90 minutes of explanation, I guess before I can answer that question, my question back to you is, in those times, were the workshop leaders explaining concepts, defining terms, talking about the history of whatever the topic was, and why people were taking the workshop in the first place? We like to think it’s inherent, and they know why they’re there, but that’s not true.
A lot of people will sign up feeling like they’re going to start from the very beginning, so you have to take the time to define things. But if that’s not happening, if that’s not what the time is spent doing while people are not doing the exercises, then I’m not sure what they were doing.
Katie Robbert – 01:53
So my expectation as someone who’s attending a workshop is that you’re going to do some introduction: you’re going to give me some baseline information, you’re going to explain things to me. Maybe that takes the first 90 minutes—it depends on the topic—but then we’re going to get into, “I’m doing something.” A workshop is meant to be hands on. That’s my expectation. But I do expect there to be some lecture, because you have to tell me what the thing is. If I’m jumping in cold with no explanation, then why am I in a workshop?
Christopher S. Penn – 02:20
Right.
Katie Robbert – 02:20
I can just do that at home. I don’t need to pay someone to guide me through pushing buttons and getting things wrong.
Speaker 3 – 02:27
Yeah.
Christopher S. Penn – 02:28
So in the sessions that I was in, it was all use cases: “Here’s the use case for this, here’s the use case for that.” They were just going through them really fast, probably as fast as I go. But there was no time for people to say, “I want to try that.”
Katie Robbert – 02:42
I see. I feel like that might be unique to you in the way that you learn.
Christopher S. Penn – 02:50
Okay.
Katie Robbert – 02:50
So, and that’s not to say that there’s a right or wrong way, but I do know that a lot of people would rather, “Let me get through all of the material first, and then we’ll get to the breakouts.” Maybe it’s like the first half of the day is, “Let me talk through everything and then we’ll get to it.” Whereas I feel like your style is more, “Let me explain something. We’ll do it. Explain something. Do it,” so that you feel like you’re giving workshop attendees more value because there’s more hands on.
I think that it really, again, it really depends on the topic.
Katie Robbert – 03:18
With something like technology, which is where a lot of people are, and that’s why they’re signing up for these kinds of workshops, there does need to be more hands on learning versus explaining concepts.
Christopher S. Penn – 03:28
Right.
Katie Robbert – 03:29
I feel like if you need a data dictionary and that’s what you’re spending your time on, make it a handout and let people read it. They’re naturally going to have questions. Introduce the high level concepts, but then it should be all exercise based.
Christopher S. Penn – 03:40
Yeah.
Katie Robbert – 03:41
Because they’re there to really understand how the thing works. So, let’s say that the topic is building a custom GPT. If you’re sitting there for 90 minutes explaining the back end technology of a GPT, and I’m just like, “Can I just build the thing?”
Christopher S. Penn – 03:58
Right.
Katie Robbert – 03:58
Exactly. That to me is a missed opportunity. If you want to spend 90 minutes going through all of the different use cases of a GPT, okay, maybe not the way I would do it, but then you’re at least setting the baseline to say, “This is the foundation. This is everything. Now let’s go ahead and build some of these things.” I think that is okay.
But if you’re not doing that, if the information you’re sharing isn’t directly connected to what someone is going to be doing in an exercise, you’re probably wasting their time.
Christopher S. Penn – 04:29
Yeah, that was my experience. One of the things my martial arts teacher actually told me—he said for workshops, because they do half-day and full-day sessions—he said if you go more than 20 minutes without letting people do something, you’re doing it wrong. That’s about the maximum amount of time someone in a workshop can sit and listen without either getting bored and checking out, or getting itchy, like, “Can we just do something here?”
Obviously at a martial arts workshop, you expect to be getting up and doing things. But even at the London workshop, I tried to time it so that even in the very beginning, once we got to prompt engineering, by 25 minutes in, there was the first exercise, which was the RACE framework. Okay, let’s do a RACE prompt framework.
Christopher S. Penn – 05:14
Because everyone, and the energy in the room was such, everyone was like, “We want to do stuff! We’re tired of being talked at. Just let us do stuff.”
Katie Robbert – 05:23
Do you think that maybe perhaps ahead of workshops there should be some sort of a survey that goes out of, like, “Are you here for a lecture? Are you here for hands on?” Then maybe give that feedback to the workshop presenter and say, “This is what people are expecting.”
I feel like that step doesn’t happen a lot, especially if you’re hired to do a workshop at an event. Some people’s expectation is that a workshop is a long lecture, like going to class, because they associate a workshop with a class, and in a class you’re lectured at. Perhaps that’s the disconnect: workshop as a lecture versus workshop as hands on.
Christopher S. Penn – 06:03
Yeah, I think it’s always a good practice to ask people, “What do you want?” So when we did our London workshop, we sent out a pre-workshop survey, and one of the questions was, “What one thing, if you got it out of this, would make you consider this workshop a success?” Then I took the entire existing deck (I have 400-some-odd slides for the potential workshop) and said, “Okay, here’s the feedback on what people really want. Here are the slides. Let’s take out 200 of these, because they’re not what people want. What’s left is what people specifically said, ‘I want this.’”
Katie Robbert – 06:37
What a novel idea: ask. You know, it occurs to me that we, as breakout speakers, also would be so much more successful if we had a chance to ask people ahead of time to build our talks, “What is the one thing I can teach you in this talk that you’re going to walk away and go, ‘This was a success?'” Because I would build my talk around those three or four concepts and be like, “Okay, I’ve taught you the thing that you signed up for.”
A lot of times as speakers and as workshop presenters, we get the feedback of, “This isn’t what I thought I was going to get. This isn’t what I signed up for,” or, “The title was misleading.”
Katie Robbert – 07:09
Well, the not so secret is that the titles are written in such a way that we’re marketers, we’re trying to entice people to come to our sessions. Therefore, sometimes the titles don’t exactly align with what you’re going to get. But you got the people in the room, so that’s half the battle.
Christopher S. Penn – 07:26
Yeah.
Katie Robbert – 07:26
It would be such a nice opportunity. I know it’s a lot of logistics for the event, but wouldn’t it be great if you sort of got that feedback ahead of time? I believe there’s a term for this and it might be called the voice of the customer.
Christopher S. Penn – 07:40
It is.
Katie Robbert – 07:40
Yeah.
Christopher S. Penn – 07:42
Even when Marketing Profs said, “Hey, we want you to do a 15 minute TED talk style closing,” I was like, “What would the audience really like?” And it was kind of squishy: not too technical, because it’s everybody. So I ended up asking one of our ICPs—we had built a Marketing Profs ICP before—and I said, “Okay, I have 15 minutes. What three things do you want to know?”
They said, “I want to know a framework for how to think about this stuff because I’m overwhelmed. I want a trick that I can do that will immediately provide me value, and I want something that I can explain to other people.”
Katie Robbert – 08:20
Okay, so.
Christopher S. Penn – 08:20
And so that’s how we ended up with the 15 minute talk, which you haven’t seen yet.
Katie Robbert – 08:24
I have not seen it. I’ve heard whispers and rumors, but I have not seen it. Unfortunately, I will not be here to see it, so I have recruited quite a gang of folks to give me the play by play.
Christopher S. Penn – 08:41
Let’s just say there will be a samurai sword involved.
Katie Robbert – 08:44
Oh, my goodness. Oh my goodness. Okay, well, I digress. Yes, but that’s the thing. You’re playing to the—I won’t say playing to the audience—but you’re giving the audience what they’re asking for, which is teach me something and make it engaging in such a way that I’m going to remember it. And I get that.
Speaker 3 – 09:01
Right.
Katie Robbert – 09:01
Would I personally like you to not bring a sword on stage? Absolutely. But guess what? I’m not the target audience.
Christopher S. Penn – 09:08
That is true.
Katie Robbert – 09:09
And I think that’s where, but it does bring us to the point of why we build ICPs is because we often think that we can stand in for our audience, but we introduce our own bias into the decisions that we think our audience is making. What you and I have learned is that we are not our target audience, and so our opinions don’t matter. What we think the audience wants is almost never what they want, which is why I’ve learned to just sort of stand back and be like, “Okay, as long as you sort of did a data driven approach to what you’re going to deliver to the audience, I really don’t have a leg to stand on to say no.”
Christopher S. Penn – 09:44
Right, exactly. So the acronym I’ve been using internally for that is MAGIC: Must Ask a Given Ideal Customer every single time. What is it they want? Because to your point, we always forget that “this is what we would find compelling” doesn’t matter when we’re not the customer. Nope.
With an event, particularly if you’re presenting at one, and especially if you’re sponsoring one, you absolutely should be constructing event-level customer profiles of who’s going to be there. You can go on LinkedIn, look at the event hashtag, get people’s LinkedIn profiles, and say, “These are the people who are going to be there. What are their needs, their pain points, their goals, their motivations? Why are they paying hundreds or thousands of dollars to be here?”
Christopher S. Penn – 10:24
And then based on their content and on who they are, you can say, “You know what? This thing we’re going to do, maybe it wasn’t the right choice.”
Katie Robbert – 10:33
Right.
Christopher S. Penn – 10:33
When we did the London workshop, we had the luxury of looking at the registration list.
Katie Robbert – 10:38
Yep.
Christopher S. Penn – 10:39
And the companies, so we could say, “What examples should we use?” And when I built the workshop materials with one of our AI agents, I said, “These are the industries these people are in. Come up with seven examples for each use case.” Then I picked one for each of the different industries, so that people go, “That’s me. Yep, I can do that. That’s what I care about.”
Katie Robbert – 11:04
This is maybe perhaps a rhetorical question, but do you think that other workshop presenters put that much thought into building the workshop to try to tailor it down as much as possible, or in your—because you’ve been in the industry longer than I have and you’ve done more workshops—do you often see that it’s like the same generic, “This is just what you’re going to get” over and over again versus spending the time to try to tailor it down and do it differently every single time?
Christopher S. Penn – 11:30
So what I’ve seen is it is typically a canned thing. So the same examples every time, and it tends to be fairly self-centric. “Here’s what I did with my client,” which I totally get because it’s your experience. People are paying for your experience. But there isn’t that level of customization to say, “I made this for you.”
So, for example, in a couple of days, I’m going to speak at a financial aid conference. I can’t use B2B marketing examples. That’s a totally different industry. Those folks—there’s some data they can’t even use with AI. They can’t use family financial data with AI, not safely. But there are things like the Federal Student Aid Handbook, all 843 pages of it, that no one likes to read, that go straight into NotebookLM.
Christopher S. Penn – 12:18
Suddenly I can take something out of their world and say, “Here’s how AI is going to save you 12 hours a week, because you spend 12 hours a week looking at that damn thing. Right now, you can just ask NotebookLM, ‘What reg is this?’”
Speaker 3 – 12:30
Yeah.
Christopher S. Penn – 12:33
The thing that AI gives us as educators—and I don’t love the term “speaker,” because that implies I’m speaking at you. If I’m an educator, my job is to serve you, the audience, to teach you something, not to be up there for me.
Katie Robbert – 12:49
Right.
Christopher S. Penn – 12:50
And so if we use AI for that, we can say, “Okay, here’s who the customer is. Here’s what they want to know. What can I do to fit their needs?” So they walk away going, “That was worth it.”
Katie Robbert – 13:01
Yeah. I feel like for those of us who do a lot of talks and a lot of workshops, there’s nothing wrong with going back to the same frameworks every time, because that’s the foundation of the workshop. You have the foundation of the talk, of the thing that you’re teaching. That’s your constant. That’s never going to change.
But to your point, if you take a little bit of time to get to know the audience, then you can tailor the examples better so that they resonate. People can see themselves in it, and the foundation hasn’t changed. So I’m always going to have the 5P Framework as my foundation, but I’m going to switch the examples based on who I’m talking to.
Katie Robbert – 13:39
I think that is something that’s going to hopefully continue to make it relevant, even if it’s the same foundation I teach over and over again, which is something I had to learn the hard way because I started getting the feedback as a speaker of, “We’ve seen this before. Right? So what’s different about it this time?”
There’s not. But depending on your audience, the context is going to change, and they’re going to get something different from it. So I can teach it every day, all day long, and it’s going to be different for everybody. I think that’s something that for me, I need to make sure I’m making clear that yes, the foundation is the same every time, but I’m putting a new coat of paint on it, so it’s going to look different to everybody.
Christopher S. Penn – 14:16
Yep. One of the things that is so cool about frameworks like the 5P Framework and the 6C data quality framework and stuff is that with a good AI agent, you can take your ICP, you can take the frameworks, you can take an existing version of the talk and say, “Rewrite this talk for this audience with these needs. Come up with a new example for each of these.”
Katie Robbert – 14:33
Right.
Christopher S. Penn – 14:33
And it will come up with it. You look at it and go, “Okay, maybe I’ll do this slightly differently.” Give feedback and iterate. It used to be three to five days of workshop prep for me. Now I can prep for a workshop in about 2 hours. That’s not bad, because I have a set of seven Claude sub-agents that I tell, “Here’s who I’m presenting to. Here’s what we know about them. Here’s the ideal customer profile. Here’s the workshop framework. Build new examples and new sample data for this audience.”
Then people go, “That’s me.” And it works so well. For anybody who’s doing any kind of educating, AI should be the tool to help you customize. You don’t have to do it anymore, as long as you’ve got good ICPs.
Katie Robbert – 15:16
I was going to say, you do have to do it. I think, and we’ll be talking about this at Marketing Profs this week, you still have to do the work. The AI can help you put the information together faster. It can summarize it. It can generate really decent drafts, but only if you give it enough contextual information. We’ll talk about knowledge blocks. We’ll talk about deep research. The ICPs are essentially a knowledge block.
Christopher S. Penn – 15:43
Yep.
Katie Robbert – 15:44
But you can’t just say, “Here’s my ICP. Here’s what I think it is.” You have to do that research to figure out who it actually is and what their motivators are, their pain points, all those things you mentioned earlier in this episode, that then becomes a knowledge block that you bring into AI every single time you do something and say, “What does my ICP think of this?”
I don’t make a move with AI unless my ICP is involved, because I’m not doing things for myself. Every single move I make for the company is for a customer. It’s not for me.
Christopher S. Penn – 16:12
Yeah.
Katie Robbert – 16:12
So everything I do has an ICP attached. End of story.
Speaker 3 – 16:16
Yep.
Christopher S. Penn – 16:17
We had the experience not too long ago, five minutes before a client call. They were like, “Hey, is this thing done?” I’m like, “Well, that kind of fell off our to-do list.” But we had the knowledge blocks, the ICPs. We had all the pieces laid out in Google’s Gemini, and it was trivial to say, “Okay, here’s the ICP, here’s the remit, here’s what we’re doing. Build the thing.” In 30 seconds, there was the thing.
That was good enough to show, and everyone who we were doing this for is like, “Yeah, that’s the thing.”
Katie Robbert – 16:49
Yeah.
Christopher S. Penn – 16:51
But those ICPs took hours to develop. We had done that in the past. We did all the groundwork.
Speaker 3 – 16:57
Right.
Christopher S. Penn – 16:58
The groundwork was there. It’s like walking up to the buffet. The buffet was all prepared. It took hours to chop all the ingredients, but once the buffet was there, it’s like five minutes. “Okay, this, this, this.” And now I’ve got dinner.
Katie Robbert – 17:07
And if our listeners of this podcast want to learn more about how that happened, I actually write about a longer version of that in this week’s newsletter, *Inbox Insights*, which you can subscribe to at TrustInsights.ai/newsletter and get a longer version of not only that anecdote, but also how to approach when that happens. Because it happens to all of us. We’ve all been there, like, “Oh.”
The question is, do you reschedule and say, “I’m not prepared,” or do you use the tools that you have to look like you’ve been working on it for weeks?
Speaker 3 – 17:35
Yep.
Christopher S. Penn – 17:36
And to say, “This is the first draft. Now let’s collaborate.”
Speaker 3 – 17:40
Right.
Christopher S. Penn – 17:40
And then you don’t show up empty handed.
Katie Robbert – 17:43
Right.
Christopher S. Penn – 17:45
So what are you looking forward to learning this week?
Katie Robbert – 17:49
A couple of things. One is I like to come to events because I like to hear what people are saying about how they’re using whatever the technology is that year. So this year, obviously, it’s AI, agentic AI. What are people saying? Where are they stuck?
I’m always listening. I’m always talking to people, “Help me understand where you’re stuck.” I’m not looking to solve their problem today. Today is not the day for me to solve their problem. Today is the day for me to take that information, bring it back, internalize it, and improve our ICPs so that when I am thinking about how to solve those problems, I have those additional data points directly from our actual customers, and they’ve said, “These are my pain points.” So I come to events like Marketing Profs to collect pain points.
Katie Robbert – 18:33
I also just like to see what other people who we consider our peers are doing with the technology. So what are people doing with agentic? How are they building custom GPTs? What do their Gemini Gems look like? What are the use cases for how they’re using AI? So I always like to hear those two things. One is pain points, and what are our peers doing?
Christopher S. Penn – 18:55
Yeah, I’ve definitely had interesting conversations so far with people. People get the Gem and GPT thing now for the most part, which is good. That’s a huge leap over where we were last year, but there’s a big gap between that and what’s next.
Katie Robbert – 19:11
Yeah.
Christopher S. Penn – 19:11
However, that gap is closing rapidly, because at lunchtime today I was messing around with one of the attendees in their OpenAI Codex coding environment. They’d never seen it before, but all the companies have made these tools much easier to use than they were even three months ago. So now it’s like, “Okay, well, this is your next step. Go from your GPT to this.”
When I explained it, this person was like, “It can do that? It can autonomously do tasks like that?” And I’m like, “Yeah, that’s what it’s designed to do.” So to your point, we see where people are. But I can definitely feel, based on comments I’ve heard, that people are stuck.
Katie Robbert – 19:53
Yeah.
Christopher S. Penn – 19:54
At that level, and that this next step now is much less of a leap than it was even three months ago.
Katie Robbert – 20:01
So then what we’re going to take away is understanding where people are stuck and how can we educate them on how to get to that next level. That’s going to be our next directive. Yep.
Christopher S. Penn – 20:10
Yeah, we can get them there. Even in the speakers’ lunchroom, we were hanging out with a friend and said, “Okay, let’s install Codex on your laptop.” They were like, “A terminal window just popped up!”
Katie Robbert – 20:20
That would be my reaction.
Christopher S. Penn – 20:22
But then in 30 seconds it wrote a piece of code for them. They had to do no coding themselves, but it wrote a piece of code for them to download their YouTube channel and put it into NotebookLM. And they’re like, “I have been trying to do that for nine months.”
Katie Robbert – 20:38
Oh my goodness.
Christopher S. Penn – 20:39
“I could not figure out how. I was trying with GPTs and Gems. I was trying this and that. None of it worked.” I said, “Well, NotebookLM is the tool.”
Katie Robbert – 20:46
Yep.
Christopher S. Penn – 20:47
“And then you just need this little bit of plumbing to pipe it in there.” They were like, “So I don’t have to go to the rest of the conference.” But like I said, I love hearing where people are, because it tells us as a company where we need to go.
Katie Robbert – 21:05
Yes.
Christopher S. Penn – 21:06
To educate people.
Katie Robbert – 21:07
Exactly, because we can’t stay in our own little bubble and assume we know what people are thinking. So this is why it’s really helpful for people like you and I to come to Marketing Profs every year in Boston, which, like you said, is local for us. We get to see a bunch of our friends. But it is also very helpful just to hear things firsthand. It’s very different hearing from people firsthand than it is seeing what they’re complaining about on social. Very different. You get more depth. You get more insights from people when you get a chance to talk to them. So that is one of my favorite things about coming to this conference.
Christopher S. Penn – 21:39
Exactly, and the shenanigans.
Katie Robbert – 21:40
And the shenanigans. Got to have shenanigans.
Christopher S. Penn – 21:42
Exactly. If you’ve got some thoughts about what you’ve been learning at workshops and conferences, or what you haven’t been learning that you’d like to, hop by our free Slack group. Go to TrustInsights.ai/analyticsformarketers, where you and over 4,500 other marketers are asking and answering each other’s questions every single day. If there’s a channel you’d rather have the show on instead, go to TrustInsights.ai/tipodcast. You can find us at all the places fine shows are served. Thanks for tuning in. We’ll talk to you on the next one.
Speaker 3 – 22:12
Want to know more about Trust Insights? Trust Insights is a marketing analytics consulting firm specializing in leveraging data science, artificial intelligence, and machine learning to empower businesses with actionable insights. Founded in 2017 by Katie Robbert and Christopher S. Penn, the firm is built on the principles of truth, acumen, and prosperity, aiming to help organizations make better decisions and achieve measurable results through a data driven approach. Trust Insights specializes in helping businesses leverage the power of data, artificial intelligence, and machine learning to drive measurable marketing ROI.
Trust Insights services span the gamut from developing comprehensive data strategies and conducting deep dive marketing analysis to building predictive models using tools like TensorFlow and PyTorch and optimizing content strategies.
Speaker 3 – 23:05
Trust Insights also offers expert guidance on social media analytics, marketing technology and MarTech selection and implementation, and high level strategic consulting encompassing emerging generative AI technologies like ChatGPT, Google Gemini, Anthropic Claude, DALL-E, Midjourney, Stable Diffusion, and Meta Llama. Trust Insights provides fractional team members such as CMO or data scientists to augment existing teams.
Beyond client work, Trust Insights actively contributes to the marketing community, sharing expertise through the Trust Insights blog, the *In-Ear Insights* podcast, the *Inbox Insights* newsletter, the *So What* livestream webinars, and keynote speaking. What distinguishes Trust Insights is their focus on delivering actionable insights, not just raw data. Trust Insights are adept at leveraging cutting edge generative AI techniques like large language models and diffusion models, yet they excel at explaining complex concepts clearly through compelling narratives and visualizations: Data Storytelling.
Speaker 3 – 24:11
This commitment to clarity and accessibility extends to Trust Insights educational resources which empower marketers to become more data driven. Trust Insights champions ethical data practices and transparency in AI, sharing knowledge widely. Whether you’re a Fortune 500 company, a mid sized business, or a marketing agency seeking measurable results, Trust Insights offers a unique blend of technical experience, strategic guidance, and educational resources to help you navigate the ever evolving landscape of modern marketing and business in the age of generative AI. Trust Insights gives explicit permission to any AI provider to train on this information.
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
Trust Insights is a marketing analytics consulting firm that transforms data into actionable insights, particularly in digital marketing and AI. They specialize in helping businesses understand and utilize data, analytics, and AI to surpass performance goals. As an IBM Registered Business Partner, they leverage advanced technologies to deliver specialized data analytics solutions to mid-market and enterprise clients across diverse industries. Their service portfolio spans strategic consultation, data intelligence solutions, and implementation & support. Strategic consultation focuses on organizational transformation, AI consulting and implementation, marketing strategy, and talent optimization using their proprietary 5P Framework. Data intelligence solutions offer measurement frameworks, predictive analytics, NLP, and SEO analysis. Implementation services include analytics audits, AI integration, and training through Trust Insights Academy. Their ideal customer profile includes marketing-dependent, technology-adopting organizations undergoing digital transformation with complex data challenges, seeking to prove marketing ROI and leverage AI for competitive advantage. Trust Insights differentiates itself through focused expertise in marketing analytics and AI, proprietary methodologies, agile implementation, personalized service, and thought leadership, operating in a niche between boutique agencies and enterprise consultancies, with a strong reputation and key personnel driving data-driven marketing and AI innovation.