
So What? The AI Maturity Model

So What? Marketing Analytics and Insights Live

airs every Thursday at 1 pm EST.

You can watch on YouTube Live. Be sure to subscribe and follow so you never miss an episode!

 

In this week’s episode of So What? we focus on the AI Maturity Model. We walk through the different steps, how to know if your team is ready, and whether AI will take your job. Catch the replay here:

So What? The AI Maturity Model

 

In this episode you’ll learn: 

  • how to assess your team for AI readiness
  • what steps you need to take to introduce AI
  • when AI is not the right solution

Upcoming Episodes:

  • TBD

 

Have a question or topic you’d like to see us cover? Reach out here: https://www.trustinsights.ai/resources/so-what-the-marketing-analytics-and-insights-show/

AI-Generated Transcript:

Katie Robbert 0:34
Well, hey, howdy, happy Thursday, everyone. I’m Katie, and this is So What?, the marketing analytics and insights live show. I’m joined by Chris and John, both of whom can hold me up today. I never get that right, but I’m gonna try every time anyway. So on today’s episode, we are talking about the AI maturity model. For the past couple of weeks, we’ve been coming at AI from a couple of different angles. Last week, we talked about “will AI take my job?” If you missed that episode, we talked about how highly repetitive, low-creativity tasks are the ones that AI will take, and the inverse is true: the low-repetition, high-creativity ones are the safest. You can definitely check that out on our YouTube channel. This week, we’re talking about how to assess your team for AI readiness, what steps you need to take to introduce AI, and when AI is not the right solution. Last week, as we were talking about “will AI take my job,” it came up in the conversation that it may all be a moot point if your organization is not even ready to introduce artificial intelligence into the workflow. So Chris, where do you want to start today?

Christopher Penn 1:43
Let’s start with the AI Maturity Model. This is a model that, gosh, we put together about four years ago now, and it’s something that we use as a way to help people navigate: are you ready to be tackling this stuff? Because a lot of the executive suite has what we would sometimes gently call airline magazine syndrome, where a high-ranking executive reads something on a flight about AI, and then they storm into the next meeting and say, “We need to put AI into the products we make.”

Katie Robbert 2:20
But I need it, I need it now to stay competitive.

Christopher Penn 2:24
Exactly. So we can use this framework as a way of evaluating your organization from top to bottom, or your department, or even you as a professional: how ready are you? It’s a seven-stage framework. The bottom stage, which is the foundation on which you build everything, is the data foundation: finding, cleaning, preparing, and unifying your data sources. Do you know where your data is? What condition is it in? Data is the raw ingredient for AI, so if you don’t know what you have, and you don’t know what quality it is, you can’t really go any further.
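
As a concrete illustration of that bottom layer, here is a minimal sketch in Python of finding, cleaning, and unifying two data sources; the file names and columns are hypothetical placeholders, not a specific client setup.

```python
# Minimal sketch of the data foundation step: load, clean, and unify two
# hypothetical exports (file and column names are placeholders).
import pandas as pd

crm = pd.read_csv("crm_contacts.csv")         # e.g. CRM export
email = pd.read_csv("email_subscribers.csv")  # e.g. email platform export

# Clean: normalize the join key and drop duplicates.
for df in (crm, email):
    df["email"] = df["email"].str.strip().str.lower()
    df.drop_duplicates(subset="email", inplace=True)

# Unify: one merged table you can actually audit.
unified = crm.merge(email, on="email", how="outer", suffixes=("_crm", "_email"))

# Quick data-health check: what share of each column is missing?
print(unified.isna().mean().sort_values(ascending=False))
```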

Katie Robbert 3:08
Well, and before we get into all of the other steps in the model: so this is the bottom, this is the foundation. I would say, at least in my experience, the larger the organization, the more they’re stuck at this first step, because data is everywhere, people are siloed and spread out, and institutional knowledge comes and goes so quickly; there’s a lot of turnover. So the data foundation is the step that a lot of people try to skip, but you really can’t. John, what do you see when you’re talking with communities and prospects about where their heads are at with the data foundation? Do you think that they feel like they’re ready?

John Wall 3:54
No, that’s like the ongoing joke, because, you know, I talk about us doing predictive and attribution, those are kind of the two cornerstones, but then the reality is, more than 95% of the people that we deal with don’t have their data in order. It’s a complete mess, and they need us to come in and fix, you know, the plumbing. And you hit a huge point, because it becomes a bigger problem the larger the organization gets. You even get to this point with the bureaucracy where there are people that don’t want to share the data; they just don’t even want the results to go to other departments or bosses. It’s just easier for them to say, yeah, we can’t do that, or that’s impossible. So yeah, it’s a real cultural hurdle just to get that foundation laid and try to get to somewhere where you can start doing more with it.

Katie Robbert 4:36
And, you know, Chris, once we go through all these steps, I think it would be worth going back through and talking through the people piece of it, unless you want to talk about that as we’re going through each step.

Christopher Penn 4:47
Well, I think it makes sense. We always look at everything as the people involved, the processes you need to operate it, the technology platforms that you use to administer it, and the raw ingredients, right? It’s like we were saying about the kitchen: there’s a chef, there are appliances, there are ingredients, and there are recipes, and you need all four. And you have the overarching reason, what you talked about in the beginning, Katie: why are you doing this? Each stage has to have a purpose as well. So if you think about the data foundation, the data is the stuff, and the reason for it is everything else. The people are the ones who can work with data, and there are three different kinds of roles you might want to have. You might want a data analyst, who can analyze the data and give you insights into its health, etc. You might want a data scientist, who can run experiments on it and give you some deep looks; we talked about exploratory data analysis not too long ago. And there’s data engineering, the people who can help you store the data, make it accessible, and put it in different types of formats. All three of those are people roles with data that, again, not a lot of companies think about.

Katie Robbert 6:09
I would add a fourth role to that, which is the project manager. As someone who’s been on both sides of that skill set: if you are focused on doing the thing, then it’s hard for you to see what everyone else is doing. Having one person whose sole responsibility it is to keep things moving forward is so important. So I would add project manager to pretty much any of these steps, probably all of them.

Christopher Penn 6:41
Right. And you might even have a PMO that sort of oversees all of them. So the second layer is the measurement and analytics layer. You have data, but just because you have data doesn’t mean you’re doing anything with it. So the second layer in the maturity model is: how data-driven are you? How important is data in your decision-making processes? Have you identified and measured KPIs? Do you understand what happened in your data? Just because you have Google Analytics doesn’t necessarily mean you know what the data is telling you to do; it could just be a big pile of numbers. So that’s the stuff. The processes around becoming data-driven are a big culture change, to get people to stop guessing.

Katie Robbert 7:28
Yeah, well, you know, it’s funny, because as you’re saying that, it occurs to me that I’ve seen instances where people call themselves data-driven, but they’re picking and choosing the data that best suits them, versus all of the data: good, bad, ugly, indifferent. And that, to me, is not data-driven; that’s using and manipulating data to kind of make yourself feel better, to make your company feel better, to make your team feel better. It’s not truly being data-driven.

Christopher Penn 8:01
Exactly. The example we always give is, every time you use Google Maps to go somewhere, you’re literally being data-driven: you decide the destination, but then you have data helping you drive the actual route you take. And sometimes there is a place for your experience: if you know the backroads of your hometown better than Google does, you might know a route that’s technically a little bit longer but a heck of a lot more pleasant to drive. Other times, like when you’re in unknown territory, I’m gonna listen to the machine; it’s gotten better, hasn’t it? I don’t want to end up on some guy’s pig farm in the middle of eastern Pennsylvania; nothing good happens there. So that’s the second layer. The third layer is insights and research. This is your qualitative capabilities, the ability to do market research, because one of the challenges with data, and the analysis of data, is that you very rarely get to know why. Why do people abandon their shopping carts on your website? Why do customers give you two stars on Amazon? Why did these things happen? You can’t ever get that from quantitative data. You have to, unfortunately, talk to human beings.

Katie Robbert 9:26
It’s funny you say “unfortunately,” whereas I, on the flip side, really enjoy that behavioral data, that psychology: why did people do it this way? Because I think, I mean, you’re absolutely right, Chris, there’s too much of a reliance on the quantitative data, but it doesn’t tell you the whole story. A great example of that is website traffic. Well, traffic is up this month, traffic is down this month, up, down, down, down, and then up. You can see the patterns in the data, but you still have no idea why. You can guess, you can look at external factors, but it doesn’t really tell you why unless there’s a clear-cut answer, like, oh, because we accidentally disabled our website. That’s a clear-cut answer, and that’s a really good insight. But 99.9% of the time, that’s not the reason why. So you do need to talk to people to find out: why have they stopped coming to your site?

Christopher Penn 10:23
Exactly. And from the people side, market research is a separate profession. It is entirely a separate profession; it’s not something that your data analyst is necessarily going to have training in. And so, from a maturity perspective, if you don’t have market research capabilities, either in-house or with an agency partner of some kind, it’s hard to get a comprehensive, rigorous understanding of what’s happening in your data and the reasons behind it. You really do need that specialization if you want the data to be credible. One of the challenges we see with market research is there are a whole lot of HiPPO issues: HiPPO being the abbreviation for the highest-paid person’s opinion, which is usually the executive on staff saying, “Oh, well, our customers want this.” Like, you’re not our customer, John.

John Wall 11:18
I’m always making stuff up. Yeah, it’s gonna be huge. That’s the next big thing.

Katie Robbert 11:24
As someone who is trained in market research and clinical research: it really is its own specialty. One of the pitfalls that we see marketing teams fall into is that someone, that highest-paid opinion, says, “Well, just create a survey,” which in execution is fairly straightforward: putting the questions together. But the question development itself is, Chris, you had mentioned credibility. What often ends up happening is that these surveys are built in such a way as to get the responses that they want, to make themselves look positive. So, back to picking and choosing the data points that you think best suit your needs. The same is true of the pitfalls with survey development and market research: asking questions in such a way that they’re leading, that they’re pushing the users in a direction to answer a certain way. One of the arts of market research and survey development is making sure you’re asking the question in such a way that people can answer it however they feel, and that you’re not biasing them toward what you want them to answer. It’s a huge...

Christopher Penn 12:35
It’s a huge problem. And there’s a lot of junk out there. My wife was an admin years and years and years ago for a market research company in Boston that served political organizations, and they were the antithesis of what market research should be. They’d say to the client, usually some politician: you come in and tell us what you want the data to say, and we will get the responses to be able to say that, with something that looks like research backing up whatever crazy position you have. The benchmark that we use for determining the credibility of a market research firm is a question like: how okay are you with answers that you don’t want to hear? How okay are you with bad news? We judge that with clients, too. When we onboard a client, we ask, how okay are you with bad news, with hearing that your numbers are going down and to the right instead of up and to the right? And if they’re like, oh no, no, no, it’s gotta say this, then okay, this is not going to be a productive engagement, because we traffic in accuracy of data, whether or not that’s what you want to hear. And with market research, that’s especially challenging because it’s so qualitative to begin with.

Katie Robbert 13:57
It reminds me of a Parks and Rec episode (Parks and Rec being a TV show with Amy Poehler) where they were polling the community, and they phrased the question something along the lines of, “you don’t not want this thing that you don’t not want.” And it was like 52% agree, 38% disagree, 16% are just flat-out confused by the question. So clarity, getting to the point, but also not pushing people in a certain direction, is just so important.

Christopher Penn 14:32
Exactly. So the fourth layer on this model is process automation. One of the things about adopting AI is that if you do it right, it takes time, it takes resources, it takes people, it takes budget to do it well. And you need to find those resources within your organization, which means you’ve got to save them somewhere else or, you know, invest more heavily. So one of the things that we look for is whether a company has done a good job with process automation. This is not AI; AI and automation are not synonymous. This is: have you written code, for example, somewhere in your organization to handle simple tasks? Have you automated away the most repetitive tasks that require no machine learning whatsoever? For example, at the PR firm we used to work at, one of the junior-most people’s jobs, which is appalling, was to Google for stuff, then copy and paste Google results into a spreadsheet. That was their job, eight hours a day. Now, Katie, you just answered this in the Twitter chat: will AI take your job? Well, in this case, yes, that person’s job should be taken by a machine, because it’s a low-value job, and they’d probably be super grateful. So if a company has not done any kind of process automation, they’re probably not in a state to be able to do AI well, especially because process automation involves working with a lot of the same underpinnings as AI. An example: we curate our content for things like the Trust Insights newsletter with code. There’s no AI involved in it; it just downloads data, does some keyword matching, and then puts it in a database. Then we have another piece of code that extracts it, scores it, and spits out essentially a spreadsheet. There’s no machine learning involved in the current version; that will be the next version. That process automation took what was a three-and-a-half to four-hour-a-week job and boiled it down to a three-and-a-half to four-minute-a-week job, and most of that time is just waiting for a server to catch up. That’s an indicator that we’re ready to go beyond static automation, because we’ve figured out we can connect to data sources, we can get the right people at the table who have technical talent, and we have somebody who can think about data and how to store it, how to retrieve it, and how to process it. The prerequisites for AI are also the prerequisites for process automation, so it’s a really good barometer of saying, yes, this organization is ready.
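
The curation automation Chris describes (download items, match keywords, score them, spit out a spreadsheet) can be a short script with no machine learning in it. A rough sketch; the feed URLs and keywords below are made-up placeholders, not the actual Trust Insights pipeline.

```python
# Rough sketch of keyword-based content curation: no AI, just automation.
# Feed URLs and keywords are made-up placeholders.
import csv
import feedparser  # pip install feedparser

FEEDS = ["https://example.com/feed.xml", "https://example.org/rss"]
KEYWORDS = ["analytics", "machine learning", "marketing"]

rows = []
for url in FEEDS:
    for entry in feedparser.parse(url).entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        score = sum(text.count(kw) for kw in KEYWORDS)  # naive keyword score
        if score:
            rows.append({"title": entry.get("title", ""),
                         "link": entry.get("link", ""),
                         "score": score})

# "Spits out essentially a spreadsheet": highest-scoring items first.
rows.sort(key=lambda r: r["score"], reverse=True)
with open("curated.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "link", "score"])
    writer.writeheader()
    writer.writerows(rows)
```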

Katie Robbert 17:25
It’s funny, Chris, that you started off by saying, have you written code, have you done this. I would say: has anyone even written down how you do the thing? I guess I shouldn’t be surprised, but I’m always surprised every time I go into a new team or start with a new client. I ask, well, how do you put this report together? And then you ask someone else, how do you put this report together? And then you ask someone else, and you get three different answers of how it’s done, versus, here’s a standard process. I remember, with our old team, I think one of the first conversations I had was, well, where’s your list of standard operating procedures? And they all kind of looked at me like, what in the what now? I said, you know, your documentation. It was a professionally immature team; the team wasn’t that old in terms of its formation, so a lot of that documentation didn’t even exist. We couldn’t even get to process automation until we could come to a shared agreement amongst eight people on how you actually do the thing.

Christopher Penn 18:35
Exactly. And again, process automation, particularly once you’re starting to talk about automating with technology, is yet again another profession. It’s the ability to write code, write macros, and things like that. Now, you don’t have to be a hardcore programmer; you can do a lot with things like Excel macros, and that is a very valuable skill. But once you get beyond a certain point, you start getting into languages like R or Python, or C#, or even Java, to be able to process those underlying data sources. So the fifth layer is a data science capability, and it’s at this point in the organization where we say the road splits. If you’re going to be using artificial intelligence as a core competency, as part of your secret sauce, the next three steps in the model are things that you have to have in your organization. On the other hand, if you are applying AI to something that’s not a core competency (maybe you just want to make content curation faster, or make your content marketing team better at ads, and you’re not a content marketing company, you’re a yogurt company), then the next three steps are things that you would use to audit partners and agencies, somebody that you would want to bring in to help you out. Step five is data science. The underpinning of all artificial intelligence is statistics and probability: mathematics. You’ve got to have those statistical and mathematical capabilities, the coding, the engineering, to be able to work with data and understand what’s possible. Going back to the previous episode on exploratory data analysis, we said that was a profession unto its own, and it really is; it’s part of this data science layer. So you need data science people, processes around it, and of course the requisite technologies to make data science possible before you can even start talking about AI.
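
To make the statistics-and-probability point concrete, here is a tiny exploratory data analysis sketch of the kind this data science layer starts with; the metrics file and its columns are hypothetical placeholders.

```python
# Tiny exploratory data analysis sketch: descriptive statistics and correlations.
# "marketing_metrics.csv" is a hypothetical file with numeric columns such as
# sessions, conversions, and ad spend.
import pandas as pd

df = pd.read_csv("marketing_metrics.csv")

print(df.describe())               # central tendency and spread per column
print(df.corr(numeric_only=True))  # pairwise correlations between metrics
print(df.isna().mean())            # share of missing values per column
```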

Katie Robbert 20:37
You know, there’s a joke, and depending on which profession you’re talking about, it goes, “Well, I got into this because I don’t like math.” We often work in the marketing industry, with other marketers, and the joke is, “I got into marketing because I don’t like math.” Well, unfortunately, the joke’s on you, because every single one of these things that we’ve been talking about involves math. So if you don’t like math, AI is probably not the right tool for you. Exactly. John, do you like math?

John Wall 21:06
I started math at a young age and became an economics major. So yes, I’m guided but I am data driven.

Katie Robbert 21:16
John is personally data driven. I love it. That’s your tagline from now on.

John Wall 21:23
I don’t even know where to go with that. No, math is painful. People do have to remember that math is painful. I think the biggest thing that people miss is that they think there are some people out there for whom math is always easy, all the time. No; even the people at the cutting edge of math are banging their heads against the wall trying to solve problems and get to the answers. And that’s the mind shift that you need to make: don’t ever think it’s easy, but don’t think that you don’t have it right just because you’re having a hard time with it.

Christopher Penn 21:51
Exactly. So you, or the agency partner you’re working with, need to have data science capabilities too, because before you can start the process of doing machine learning, you’ve got to know whether you have the data that is suited for it and the tasks that are suited for it. One of the things that we’ve been trying, and Katie did a great job with this in the Twitter chat just now, is disabusing people of the notion that AI is somehow magic. It’s just a big pile of math. And if you have a problem that is inherently non-mathematical in nature, AI will not help you; it will just extremely quickly and very fancily waste a lot of your time and money.

Katie Robbert 22:33
As I sort of mentioned, when I was a project coordinator a couple of jobs ago, I had to manually test the logarithmic functions by writing out the formula by hand and testing it against what we had programmed into the computer. At that time, when I was doing that job, many, many, many moons ago, AI in this context just wasn’t an option. Automation wasn’t really even an option; we were still doing things with floppy disks and external hard drives. Being in that sector, the clinical trials sector, it just wasn’t a thing. And oh my God, would I have loved to have AI doing that for me, because it was just a math equation. That was all it was. I would love to have had that as an option.

John Wall 23:35
Exactly. Were you, like, rocking a TI-35 calculator?

Katie Robbert 23:39
I was actually

John Wall 23:41
This is totally off topic, but it’s important. You know on the iPhone, when you’ve got the cheeseball calculator, if you turn it sideways, you get, like, the full TI deck? So I know you were just sitting around like, I need to calculate the log for that.

Katie Robbert 23:55
Yeah, no, but yes, I was, and I went through a lot of them because we did a lot of testing. But that’s the kind of stuff where AI is like, yep, I got that, I can do that, I can do that in nanoseconds. Let me just, give me...

Christopher Penn 24:15
Exactly. So the next layer in the maturity model is the deployment of machine learning, at least in some kind of testing capacity. This is where you are now starting to actually do AI: you’re taking data and telling a machine, learn from it and write your own code. Because a machine learning model, which is a term you hear a lot, is just a fancy way of saying software written by a machine. Humans wrote, for example, Microsoft Word; that’s a piece of human-written software. Humans wrote Candy Crush. Machines have written things like GPT-3, the generative pre-trained transformer for natural language generation. A human never wrote that; a machine wrote that, but it’s still just a piece of software. So when you’re deploying machine learning within your company, you’re looking for and building pieces of software that take in data, learn from it, and then do something with it. I’ll give you a real simple example of one. This is one that we wrote for Katie’s Twitter chat the previous hour: a topic modeler. It takes in all the tweets from the freelance chat that Katie was just speaking on, takes them apart, looks at the words, and then reassembles them into a series of topics. That’s all it is. But there is learning, because the machine has to essentially learn what was said and then categorize and group it together. That’s essentially what machine learning is. So here, that’s the diagnostic, and that chart tells you how many topics are actually in the conversation, but it’s not particularly exciting or fun to look at. There we go, that’s it. So here are the topics: natural language processing was in there, Katie and the various Twitter handles, asking about AI tools and math equations, generated content. That’s a pretty good representation of a lot of the conversation, particularly the things where people had a lot of interaction. That’s machine learning. We now have a model to judge what happened in this Twitter chat, instead of having to read through, how many tweets was that anyway? 660 tweets? Right, I don’t have time.
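
A stripped-down sketch of that kind of tweet topic modeler, using scikit-learn’s LDA on a few placeholder tweets; the real version would load the full chat archive, which is not shown here.

```python
# Simplified sketch of a topic modeler: take short texts apart into words,
# then group them into a small number of topics. Tweets are placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "Will AI take my job as a freelance writer?",
    "Natural language processing makes generated content possible",
    "AI tools are just math equations at scale",
    # ... in practice, hundreds more tweets from the chat archive
]

# Take the tweets apart into word counts.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(tweets)

# Reassemble them into topics.
lda = LatentDirichletAllocation(n_components=3, random_state=42)
lda.fit(counts)

# Print the top words per topic, roughly what the chart in the episode shows.
words = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top = [words[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Topic {i + 1}: {', '.join(top)}")
```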

Katie Robbert 26:39
it felt like a lot. My head still hurts.

Christopher Penn 26:42
Exactly. I don’t have time to read 660 tweets, but I have time to look at a single chart, come away with some understanding about the community that was involved, and take some decisions on it. So that’s machine learning: taking data, learning from it, and building software.

Katie Robbert 27:02
Is it a true statement that math is the same in any language? So math equations, like the Pythagorean theorem, are the same in any country; it’s just the Pythagorean theorem. That’s one of the things that I find most interesting about the potential of AI: because it’s just powered by math equations, it is a universal language.

Christopher Penn 27:31
It depends. Which you knew I was gonna say.

Katie Robbert 27:36
Actually, I was hoping you’d be like, yes, you’re right, you’re a genius. But clearly that’s not today. Maybe tomorrow.

Christopher Penn 27:44
The underlying mathematics, yes, is portable. But there may be data that the machines have learned from, baked into the model, that is dependent on regional factors; natural language models would be an example.

Katie Robbert 27:59
Yeah, no. And the data that the models ingest, I feel like that’s totally different from the core set of math equations. Like, we talk about how we use the ARIMA model for predictive forecasting; the model in North America is the same ARIMA model that you’re going to use in the UK, because it’s just a math equation. Autoregressive, yeah. The data that you ingest, that’s where you get into all the differences, but the equations themselves are universal.
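
For reference, calling that same ARIMA equation from code looks roughly like this; the weekly traffic numbers below are made up purely for illustration.

```python
# Bare-bones sketch of ARIMA forecasting: the equation is the same everywhere,
# only the data you feed it differs. The series is hypothetical weekly sessions.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

traffic = pd.Series(
    [1200, 1350, 1280, 1490, 1530, 1610, 1580, 1720, 1690, 1810, 1900, 1870],
    index=pd.date_range("2022-01-02", periods=12, freq="W"),
)

# AutoRegressive Integrated Moving Average: order = (p, d, q).
model = ARIMA(traffic, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=4)  # predict the next four weeks
print(forecast)
```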

Christopher Penn 28:34
That’s correct. That is 100% correct. And autoregressive integrated moving average is the same in every country. So the last stage in the maturity model is the deployment of artificial intelligence, of machine learning, in the enterprise with a strategic focus, to say that as a company you are using artificial intelligence to solve problems, and that you are approaching problems from the perspective of building artificial intelligence models to solve them. Probably the most famous example of this is Google, in how employees are asked to think about problems: I want you to solve this problem, not once, but by thinking about a system, possibly powered by machine learning, that will solve it forever and make it go away. As opposed to what’s on our to-do list every week: every week there are things like, oh yeah, we’ve got to do this this week, and it’s pretty much the same task as last week. Versus saying, let’s build a piece of software that solves this problem and makes it go away permanently, because the machine is going to handle it now. There’s really only a handful of companies who are at that level, mostly large technology companies, and even then it’s not necessarily the entire company. For example, with IBM, there are certainly plenty of places and departments and divisions inside of IBM where AI isn’t in use; they are not an AI-first company universally. Google is not an AI-first company universally. But these companies have made it a strategic priority. So that is the ultimate pinnacle of AI adoption.

Katie Robbert 30:22
Do you feel like it’s a problem to say that you’re AI-first versus people-first?

Christopher Penn 30:32
I think it’s a false dichotomy. I think it can be both. You are solving problems strategically by trying to make them go away in an automated fashion. Ideally, that means you are actually freeing up your people to tackle problems that are not repetitive. If you are AI-first, you want all of the repetitive BS to go away.

Katie Robbert 30:57
Makes sense. All right, John, pop quiz: where on this model do you think Trust Insights falls?

John Wall 31:05
Let’s see. Well, we are AI-powered; I can say that. The digital customer journey and some of these reports that we run are run on machine learning, and that’s the only way we get to that stuff. So at least portions of the business are fully AI-driven. On the other hand, say, our Facebook ad campaigns are more down at the data foundation problem level: we’ve got some reports, and we’re sort of aware of what’s going on. But yeah, I think we have a corporate competitive advantage because of our use of AI. So at least part of the company is fully mature.

Katie Robbert 31:44
What do you think, Chris?

Christopher Penn 31:47
I have answers, but I want to hear yours first.

Katie Robbert 31:52
I feel like, depending on the day, we could fall into any one of these buckets. I think John’s right that for certain parts of our company, you could say we go all the way up to the top of the maturity model. But for other parts of our company, the Facebook ads are a great example: we can’t even access our Facebook ads at this point, because Facebook locked us out. So we are solidly stuck at the first rung of the ladder in that particular context. I think that we want to be all the way up at the top rung of the ladder, but because we’re such a small company, it comes down to resources and skill set. With Chris being the primary resource for a lot of our data science, machine learning, and those pieces, it becomes a challenge just because of his personal bandwidth.

Christopher Penn 32:52
I would say our baseline as a company is around the green step, process automation, where we have automated, I think, as much as we reasonably can, again with those resource constraints in mind, but far more than other peer companies of our revenue, and with far fewer people. And we do that through process automation, because a lot of what we do is not AI. When we’re processing customer data, like we are every week, we take millions of records from one of our clients, process them, and stick them up into a BigQuery database, and then that’s available for reporting. That’s pure process automation; there’s not a lick of AI in there anywhere. And it doesn’t have to be, because AI would be the wrong tool for the job. As John said, in some areas we are using machine learning; monitoring your Twitter chat is an example. But one of the things that I think we have to be cautious of, and advise people about, is just what you were saying, Katie: not everything has to use AI. It’s the right tool when you’ve got a lot of data. It’s the right tool when you’ve got a lot of repetition. It’s the wrong tool for everything else. So if you’re sitting there saying, how do we solve our creativity crisis, we know people are not being innovative enough, it’s not going to fix that problem for you. Adding tools on top of problems is like having a kitchen where you just keep buying appliances and still haven’t hired a single person to cook. Hey, 14 blenders! And nobody’s there; the dog is in there staring at them. That’s all that’s in the kitchen.

Katie Robbert 34:50
I have this really funny visual of my dog just looking at the blender like, is it going to do something? Maybe? Okay.

Christopher Penn 35:00
It’s a good, funny visual, but it’s also true. That’s what a lot of AI tools are like. So the AI maturity model is something that you can look at broadly as a company, but it should ideally be matched up with a problem statement, right? A user story of some kind. Is this a problem that AI should be used to solve? It’s like a blender: should you solve a soup problem with a blender? Maybe. Should you solve a steak problem with a blender? No.

Katie Robbert 35:32
Only if your jaw is wired shut.

Christopher Penn 35:35
Exactly.

Katie Robbert 35:36
You would just determine that in your user story: I, the person who is hungry, want to blend up a steak because my jaw is wired shut and I need the protein.

Christopher Penn 35:49
Exactly. And so if you are looking at this model, and you’re looking at your company or your organization within a company, and you’re on the green rung or the orange rung, it’s not an indictment that you’re bad somehow. It’s not saying, oh, you’ve fallen behind. You might just not have problems that are solvable by AI. For example, if you’re doing food innovation, maybe you’re making sweeteners and thickeners and stuff like that, yes, you can do some mixing and matching of large-scale nutrition datasets and process, you know, the Open Food Facts database to see what ingredients are showing up. But at the end of the day, you’ve got to feed that to somebody in a test kitchen and say, hey, how’s that taste? And that’s not a problem AI can solve.

Katie Robbert 36:37
Well, as you were talking through this, Chris, I think there’s that shiny object syndrome about AI, and there’s this misunderstanding that if you introduce AI, then you suddenly have a competitive advantage over your peers. That’s not necessarily true, because, to your point, the problems may not have anything to do with AI. You may have the wrong team, you may be serving the wrong market; I mean, we talk about it at a very basic level, it’s the audience, the offer, and the creative. If one of those things is off, AI can’t necessarily fix that. It may help you understand what the problem is, but it’s not just going to say, here’s your solution, and now if you just do this one thing, you’ll make a million dollars.

Christopher Penn 37:31
And, again, this is something that came up during your Twitter chat. AI is trained on data, and the largest amount of data comes from the biggest sources. The biggest sources typically are, you know, large competitors of yours, which means that if you’re using AI as a competitive tool, maybe you’re generating content, and you’ve trained it on your competitors’ mediocre content, you will inherently be spitting out more mediocre content. By default, that’s how AI works: it trains on what already exists, and it’s always biased toward the largest data source. So if you want to sound like everybody else, by all means, have AI write your content from start to finish. If you want to sound different, and ideally better, at the very least you’re going to have to fine-tune a dataset to remove a lot of the mediocre crap. And you may find that it’s not the right solution, because the quantity of excellent content is so small compared to the ocean of mediocrity that there’s not enough excellent content to train an AI on. And so you may just say, you know what, we’re just gonna do it the old-fashioned way: we’re gonna get out our pens and pencils, we’re gonna go to the local bar, we’re gonna get drunk, and we’re gonna write until something good comes out.

Katie Robbert 38:54
John, I think ocean of mediocrity should be the name of your Morrissey cover band.

John Wall 39:00
If I were going to start a Morrissey cover band, yeah, that definitely has a Morrissey slant. Guyliner.

Katie Robbert 39:13
I think this all goes back to: AI might even be a moot point, or rather, “will AI take my job” might be a moot point, if your organization is not even close to being ready to introduce AI. If you don’t know where your data lives, if you don’t have anyone to pull the data together into some sort of understandable form, if you don’t have anyone to analyze the data and pull out insights, if you don’t have standard processes for how the data is collected and how you do things: those are the four basic things that you need to master before introducing artificial intelligence into your team, into your role, into your company. So if you’re still stuck at the first rung, then you definitely have nothing to worry about in terms of AI taking your job, because AI cannot fix that problem. AI cannot fix your bad data processes and bad data integrity; it can highlight the issue, but it can’t fix it.

Christopher Penn 40:14
And when it comes to thinking about whether AI can take your job or not, and what you should be doing with your career: we talked, as we walked through this model, about how many individual professions are bundled in here. Data scientist, analyst, engineer, coder, mathematician, statistician, process manager. There are so many individual professions that if you want to uplevel your career, figuring out what you do right now and then adding in some of the skills from those other professions will make you, A, more valuable to your organization, and B, much harder to replace. Because if you have data analysis skills, but you also have data engineering skills and you know, for example, how to move a database in or out of third normal form, you are a better analyst because of it, and from the engineering side, you understand how the architecture of your data will impact the analysis. If you have one foot in both worlds, you’re much more valuable than someone who specialized in one or the other. And even if a machine takes away little tasks from both of those, there’s still so much synergy between these two different roles that a machine is going to have a real hard time replacing the majority of your tasks, because they’re so convoluted and intertwined together. So the more that you can take multiple synergistic skills and combine them, the harder it is to replace you. Agreed. All right, so that’s the AI maturity model. Think about where you are on it. Think about where you might want to be personally, professionally, and as an organization or a company. But don’t beat yourself up if you are not as far along as you think you should be, or if you perceive that a competitor is further ahead than you are, because, again, just because someone else has a nicer blender doesn’t mean they’re a better cook. Any final parting words?

Katie Robbert 42:25
I’m gonna have to write that quote down. I think that covers it.

Christopher Penn 42:30
Folks, thanks for tuning in, and we’ll see you all next week. Thanks for watching today. Be sure to subscribe to our show wherever you’re watching it. For more resources and to learn more, check out the Trust Insights podcast at trustinsights.ai/tipodcast and our weekly email newsletter at trustinsights.ai/newsletter. Got questions about what you saw in today’s episode? Join our free Analytics for Marketers Slack group at trustinsights.ai/analyticsformarketers. See you next time.

Transcribed by https://otter.ai

 


Need help with your marketing AI and analytics?

You might also enjoy:

Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
