{PODCAST} In-Ear Insights: Exploring the AI/ML Lifecycle

In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris walk through the AI/ML (artificial intelligence and machine learning) lifecycle. When you’re thinking about deploying artificial intelligence and machine learning in your organization, especially for marketing purposes, what are the steps in the process you need to consider? Listen in and discover what basic reporting dashboards and AI have in common, and what steps you need to take to prepare for the use of AI in your organization.


Watch the video here:


Can’t see anything? Watch it on YouTube here.

Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn 0:02

This is In-Ear Insights, the Trust Insights podcast.

In this week’s In-Ear Insights, we’re taking another look at our frameworks, strategies, and ideas, revising the ones that have gotten a little long in the tooth here and there.

And this week, we figured we’d take a look at the AI and machine learning lifecycle.

So this is a framework; you can find it over on the website if you go to TrustInsights.ai.

And this is essentially the framework that we’ve been working with.

Gosh, for years now; it actually predates the company’s existence.

And so, Katie, I wanted to get your thoughts about how we’re thinking about AI and its uses in terms of building projects with this particular lifecycle. Are there things that, in your experience, have dramatically changed over the last few years?

Katie Robbert 1:00

Well, before we even get into that, I guess a couple of questions I have are: what is the likelihood that people who are introducing AI into their organization are using this kind of a lifecycle? And is that an appropriate use of the lifecycle? Or does this lifecycle assume that AI already exists somewhere in the organization, and this is how you create a project with it?

So I guess first, I want to better understand the use cases for this lifecycle.

Christopher Penn 1:32

This is really designed for if you are building and deploying a machine learning solution in your organization.

So I think that’s a really important question, because you would maybe have a different look at this if you were just bringing in a vendor, right?

So if you wanted to bring in a vendor, a new tool of some kind, you would have some things here; for example, the sections in green, where you’re doing model selection, model evaluation, and model deployment.

The vendor is responsible for that.

So really, the green and the blue sections are the vendor’s responsibility.

But I think the red and the yellow sections on here, those are still either your responsibility as the organization or shared responsibility with the vendor.

Because ultimately, they need something to work with, you know, the data that you provide to them has to be usable for their product or their service to take some action.

So I’d say this really is designed for building your own thing; it’s very similar to the software development lifecycle.

And there’s maybe a separate version for how to manage an AI project where the vendor’s providing all the nuts and bolts.

Katie Robbert 2:47

So this is not introducing AI into your organization; this assumes that AI already exists in the organization.

Christopher Penn 2:56

or the capabilities exist.

So I would say, if we go back to our cooking analogy, this framework presumes that you’ve got the talent, you have some tooling already, that you’re trying to find the data, and that you’re trying to build a recipe, but you do have an outcome in mind.

So it’s sort of like, you’ve got ingredients, you’re hungry, you know, you want to bake something, or fry something or whatever.

And you’ve got a reasonably well equipped kitchen.

But you don’t have a recipe yet.

And you just know that you’re hungry, and you want to cook something.

Katie Robbert 3:32

Okay, that’s helpful, because, you know, I’ve been thinking about it in terms of how do I introduce something like AI and machine learning into my organization when it’s a brand new skill set, or a brand new vendor? So that’s helpful context.

So if we go with, you know, your example, where the capabilities, skill sets, those things already exist, then yes, this is very similar to the software development lifecycle, a project lifecycle, whatever you want to call it. It’s basically borrowing all of those elements, because that cycle works.

Now, if I was introducing this to a client, which we’ve done before, the first thing they say is, well, that’s a lot of steps.

And I don’t have time for that.

And so the challenge we see with these lifecycles is people want to skip right to the very end.

They want to know what the heck is happening.

And so the reason we discourage that is because that’s when you end up wasting honestly the most money and resources and everybody’s time.

And so, as boring and mundane as it can be for some people, do the business requirements and the planning upfront, capturing what the heck it is you want to do, why you want to do it, who’s going to do it, and how you’re going to do it. You may be saying, “I already know all this, I don’t need to write it down.” Yes, you need to write it down.

Because sometimes you get to step six and you’re like, wait, I forgot what the heck we’re doing.

And then you just kind of keep going forward or someone changes their mind midstream.

And with all this work that you’ve done, you have nothing to point back to and say, “but this is what you wanted in the first place.”

So that’s my tiny little soapbox, like don’t skip business requirements.

That said, what I don’t see happening a lot of the time is what happens in the yellow section.

So I don’t necessarily see data requirements, data collection, or exploratory data analysis done in the way that we’re hoping it’s done, in a very thoughtful and thorough way, which goes back to the six C’s of data quality.

So my question, Chris, is, you know, is there a version of this that’s shorter?

Christopher Penn 6:04

Obviously, with the vendor version, yes, because the green and blue steps don’t exist; at that point, you hand off to the vendor.

And it’s their problem, which is obviously the way a lot of folks want to go. Those red and yellow steps, though, I don’t think they can be made shorter.

Not if you want to do it well, and not waste a lot of money.

And I would actually change the order of one of the things on here, because this is portrayed as a very linear path, you know, A to Z, follow the steps.

And the reality is, like that exploratory data analysis step is almost like a Scrum or an agile component where it happens a lot in loops.

All throughout, because, you know, we have business requirements first and the analytics approach second. Well, with the analytics approach, when we’re talking about AI, you’re talking about two basic approaches, right? Regression or classification.

Even that, if you haven’t looked at the data, you can’t make the decision.
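(A concrete aside for readers: here is a minimal, hypothetical sketch of those two framings in scikit-learn. The features and labels are invented toy data, not anything from the episode.)

```python
# A hypothetical sketch: the same marketing data framed two ways.
# Regression predicts a number; classification predicts a category.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

rng = np.random.default_rng(42)
X = rng.random((500, 3))  # invented features: sessions, email opens, ad spend
revenue = X @ np.array([120.0, 40.0, 200.0]) + rng.normal(0, 10, 500)
converted = (revenue > revenue.mean()).astype(int)

# Regression: "how much revenue will this lead generate?"
reg = GradientBoostingRegressor().fit(X, revenue)

# Classification: "will this lead convert at all?"
clf = GradientBoostingClassifier().fit(X, converted)

print(reg.predict(X[:1]), clf.predict_proba(X[:1]))
```

Which framing is right depends on the business question, which is exactly why you can’t pick it before you’ve looked at the data.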

So I’d actually probably move that towards the end of the yellow part, to say, okay, now you’ve done your exploratory data analysis, you’ve looked at the data, you’ve tried a few scratch things, and found out, “Oh, wow.”

So let’s use a practical example.

Let’s say we’re doing attribution modeling, right? We want to build a machine learning-powered attribution model. We have our business requirements, which is: we need to know what’s going on.

We have data, like Google Analytics and your Twitter data and all this stuff.

You got to get all that data.

And you’ve got to look at it and say, okay, what do I have? And to your point, Katie, this is where it’s really important to have those business requirements and those data requirements specified.

Because if you’re halfway through this process, somebody says, “Oh, yeah, we want to do billboards.”

Well, now you tell me? We needed that earlier.

And that might change the analytics approach.

Because if you suddenly have, say, non-clickstream stuff, then you can’t use, for example, Markov chain modeling, because that data simply isn’t available.

So you have to use something like gradient boosting instead.
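(To make that dependency concrete for readers: a toy, hypothetical sketch of Markov chain attribution via removal effects. The paths and channel names below are invented, and the whole approach only works because ordered clickstream paths exist.)

```python
# A hypothetical sketch of first-order Markov chain attribution using
# removal effects, on invented toy clickstream paths.
from collections import defaultdict

# Each path is (ordered touchpoints, did it convert?).
paths = [
    (["organic", "email"], True),
    (["social", "organic"], True),
    (["email"], False),
    (["social", "email", "organic"], True),
    (["organic"], False),
]

def transition_counts(paths, removed=None):
    counts = defaultdict(lambda: defaultdict(int))
    for touches, converted in paths:
        seq = ["START"] + list(touches) + (["CONV"] if converted else ["NULL"])
        if removed in seq:
            # Removing a channel truncates the journey at that touchpoint.
            seq = seq[: seq.index(removed)] + ["NULL"]
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return counts

def conversion_probability(counts, steps=50):
    # Push probability mass from START through the chain; CONV and NULL absorb.
    probs, converted = {"START": 1.0}, 0.0
    for _ in range(steps):
        nxt = defaultdict(float)
        for state, mass in probs.items():
            out = counts.get(state)
            if not out:
                continue
            total = sum(out.values())
            for dest, n in out.items():
                nxt[dest] += mass * n / total
        converted += nxt.pop("CONV", 0.0)
        nxt.pop("NULL", None)
        probs = nxt
    return converted

channels = {t for touches, _ in paths for t in touches}
base = conversion_probability(transition_counts(paths))  # assumes base > 0
effects = {c: 1 - conversion_probability(transition_counts(paths, c)) / base
           for c in channels}
total = sum(effects.values())
for c in sorted(effects, key=effects.get, reverse=True):
    print(f"{c}: {effects[c] / total:.1%} of conversion credit")
```

With a non-clickstream channel like billboards, there are no ordered paths to build transitions from, which is exactly why the analytics approach has to change.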

And so that yellow section really is kind of an iterative loop, going around until you feel like, okay, we’ve got a handle on the data requirements, we’ve got the ability to get the data, and we’ve done some EDA, multiple times.

And then we can finally say, yes, okay, we’ve settled on this analytics approach.

And I think there’s a huge people and process component of that, which is where you say, okay, this is the approach, I need you to sign off on it.

So that you don’t come to me in two weeks and say, Oh, wow, this isn’t what I wanted, like, No, this is what you signed off on.

Katie Robbert 8:42

I’m glad you gave the right answer.

There is no shortcut for the first two sections.

It was a trick question.

I was testing you.

But I’m wondering, so we’re actually going through this process with one of our clients right now.

And we’re building a set of dashboards.

And you know, it’s, “oh, just dashboards,” like that should be pretty straightforward.

Well, unfortunately, it’s not, which is why we’re going through a very stringent process with them.

To slow them down a little bit.

Because the challenge we’ve run into on these kinds of projects is that indecision of, “well, I want this; no, I want this.”

And so we’re really trying to focus in on what it is that you actually need, because that changes the analytics approach.

Now, what’s not on here, and maybe it’s just a matter of what we’re calling things, is the technical requirements.

And so maybe that’s, you know, synonymous with the analytics approach or the data requirements or maybe it’s a combination of both.

But what we are working through and documenting is, okay, from the business requirements.

This is what we know the stakeholder wants.

And if the stakeholder wants to know this, then these are the sources of data that we need.

With these sources of data, what is the process for extracting the data on a regular basis, cleaning it, and then getting it into the dashboard visualization?
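(For readers who want to picture that step: a minimal, hypothetical extract-clean-load sketch. The file names and columns are invented, not the client’s actual pipeline.)

```python
# A hypothetical sketch of a recurring extract-clean-load step that feeds
# a dashboard. Sources, columns, and destinations are invented.
import pandas as pd

def refresh_dashboard_data(source_csv: str, output_csv: str) -> pd.DataFrame:
    # Extract: pull the latest export (in practice, often an API call on a schedule).
    df = pd.read_csv(source_csv, parse_dates=["date"])

    # Clean: enforce the agreed-upon data requirements.
    df = df.drop_duplicates(subset=["date", "channel"])
    df["sessions"] = df["sessions"].fillna(0).astype(int)

    # Load: write wherever the dashboard reads from.
    df.to_csv(output_csv, index=False)
    return df
```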

And so where do the technical requirements fit into this lifecycle?

Christopher Penn 10:19

They, like you said, are a blend of the data requirements and the analytics approach.

And I think you’ve stumbled upon something there that’s really very brilliant, which is: dashboards are essentially human learning, right?

And what we’ve got here is machine learning. The inputs for either a dashboard or a machine learning model are the same thing.

It’s just a matter of who’s doing the processing: is the machine learning from it, or is a human learning from it?

So when we put a dashboard before an executive, they are going to process the data, make inferences, choose a model in their heads of what they think reality is, and make decisions on it.

So it really is no different, except that the machines are more reliable and have fewer overhead costs; they don’t have HR costs or health insurance.

Otherwise, it’s the same thing.

So maybe we explicitly call out that, yeah, there have to be technical requirements as well, because the same inputs that go into a dashboard are the same inputs that go into a machine learning model.

So it is all the same thing.

Katie Robbert 11:19

I like that, because I think for someone who might be newer to a lifecycle, whether it be software development, or machine learning, or project, there is always a difference between the business requirements and the technical requirements.

And this is something that a couple of jobs ago, the teams used to struggle with, because they would try to combine them, which is fine.

The challenge with that is the audience from which you’re trying to extract information.

And so one of the sets of requirements, the business requirements, is really, you know, the why: why are we doing the thing, who cares about it, what decisions are we going to make with it?

And then the technical requirements are really the how: how are we going to do the thing?

And quite often the people who are giving the business requirements are not the same people giving the technical requirements.

And so what I found in that situation is that, in an effort to make it efficient, we would try to combine both, but then the stakeholders, who weren’t the technical folks, would get bogged down with trying to understand and give input into the approach, which is fine, but that really wasn’t their expertise.

And it would actually slow the process down even more, trying to educate them and have them give input.

Now, the caveat here is you want your stakeholders to understand technically what’s possible and what’s not possible.

But having them weigh in on the technical requirements is not necessarily the best use of their time.

Ultimately, they shouldn’t really care how the thing gets done, as long as it gets done the way in which they wanted it done.

And then there’s a different time and place for educating them on the technical capabilities of the systems that you have.

So I just want to throw that caveat out there that I’m not saying that you should keep your stakeholders in the dark, but pick and choose the time in which you, you know, educate people.

So you do your business requirements, and then the technical requirements, which are really the approach: how are you going to get this thing done? So I really like, you know, if we’re going to change this lifecycle, or adapt it, or update it, making sure that there’s the human element of the inputs and the outputs, like, who is this for? Maybe it’s audience? I don’t know.

Christopher Penn 13:44

So I think audience or stakeholders or something like that definitely belongs there.

Because if you think about this in terms of dashboards, the green and blue sections are where you design your dashboard, right? You QA it, you test it out, and you hand it off to the stakeholder.

And then that tuning section is essentially the stakeholder giving you feedback, like, “oh, no, I want this to be a pie chart,” and you slap them.

The red and yellow parts, though, really are totally the same.

No matter what approach we’re taking.

So in terms of rolling out machine learning in some way to an organization, one of the things I think is probably not reflected here, and maybe doesn’t belong on here, is: is the organization itself equipped from a cultural perspective to deal with machine learning? Because it may not be. Technical stuff aside, you know, you will have issues where people question, like, should we even be using artificial intelligence or not? Now, one thing that I think is worth possibly putting in here as well, in the intersection between business requirements and data requirements, is what I would call problem identification.

Do you have a problem that can be solved with machine learning or not? Because if you have a problem that cannot be solved with machine learning, or it’s, you know, higher cost than return, then at that point you need to bail out.

So real simple example.

Machines are really good at things humans aren’t and vice versa.

Like, we are really good at vision and hearing and language.

For machines, it’s computationally very expensive to do that.

Machines are good at mathematics and statistics and probability.

And we are not, right? Our brains are simply not equipped to do, like, fourth-order derivative equations in our heads.

Machines can do that.

Katie Robbert 15:32

Speak for yourself, Mr. Penn.

Christopher Penn 15:35

Alright, I will put that to the test.

Okay, you solve this derivative in your head.
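(A quick aside for readers: this is exactly the kind of thing a symbolic math library handles trivially. The function below is an arbitrary example, not one from the episode.)

```python
# A fourth-order derivative that would be painful by hand is one line
# for a symbolic math library like SymPy.
import sympy as sp

x = sp.symbols("x")
f = sp.exp(-x**2) * sp.sin(3 * x)
print(sp.simplify(sp.diff(f, x, 4)))  # d^4 f / dx^4
```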

And so that, I guess, problem selection, problem identification, belongs there.

It’s separate from business requirements, it really is a question of what is the best approach to solve this problem.

Now, the business requirements part is part of that.

Because if ultimately you’re just trying to get an insight to a stakeholder, so they can make a decision faster and better, then that is a core business requirement; it belongs there.

But the method by which you get to the answer could vary quite a bit.

You know, like you’re saying, we’ve got one client we’re working with right now where there’s a manual data processing step, and there’s no way to work around that; the system that the data is coming from is so antiquated and so primitive that you literally have to just copy and paste stuff. There’s no way to automate that fully.

So that’s part of, you know, the challenges that go into it; that would be more in the data collection section here.

But there are always going to be these weird exceptions in putting this thing together, trying to figure out what the end objective is.

The end objective in this particular example is that the client’s client has to get the outputs and the insights.

So, you know, there are additional layers of fun on top of that.

Katie Robbert 17:01

And that goes back to my original question, which is: what is the use case for this particular lifecycle? Is it introducing AI, or is it assuming AI already exists? And so I think that this lifecycle needs to be complementary to some of our other frameworks, like, you know, the readiness of your organization for AI.

And so this is really phase two; phase one is, you know, should you be using AI at all? Does AI solve your problem? We have a framework for that.

And then it’s almost like that decision tree: if the answer is yes, move on to the AI/machine learning lifecycle; if the answer is no, move on to your general software development lifecycle. One could argue they’re one and the same, but really, it’s just a matter of: are you using machine learning or not to solve your problem? And, you know, is your organization ready for that kind of technology? Not everyone is, and not everyone needs it.

Christopher Penn 18:06

Exactly.

I think we have an analytics maturity model.

But I think an AI maturity model would actually be helpful as sort of that decision tree up front, to say, here are the questions we need to ask: A, do you have problems that AI solves? B, do you have talent, right? So that’s process, that’s people. And then, do you have the technology?

That last part is pretty easy in some respects, but in other respects it can be challenging if you don’t have the right people to run the technology, because anybody can fire up a Google Cloud account, you know, and start using TensorFlow and BigQuery ML and stuff.

This is not a question of accessibility to the technology; it is a question of having the people and the knowledge to be able to make use of the technology offerings that are available.
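(To give readers a sense of just how accessible the tooling is: a hypothetical sketch of training a model with BigQuery ML from Python. The dataset, table, and column names are all invented, and this assumes a configured Google Cloud project.)

```python
# A hypothetical sketch: training a logistic regression in BigQuery ML.
# The dataset, table, and column names below are invented.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # assumes credentials and a project are configured
client.query("""
    CREATE OR REPLACE MODEL `my_dataset.conversion_model`
    OPTIONS (model_type = 'logistic_reg', input_label_cols = ['converted'])
    AS
    SELECT channel, sessions, pageviews, converted
    FROM `my_dataset.web_analytics`
""").result()  # waits for the training job to finish
```

The hard part, as Chris says, isn’t running the statement; it’s having people who know which model type to pick and what the output means.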

It’s like, you know, putting my dog in the kitchen.

Yes, there are all these wonderful, expensive appliances, but it’s a dog.

He can’t use the appliances.

Katie Robbert 19:04

You know, it’s funny, because you’re saying it’s a fairly easy question to answer.

Does AI solve this problem? I would argue that it’s not easy, because there’s probably a lot of other questions that you need to ask and a lot of other information you need to gather before you can answer that question.

Because I would venture a guess that it’s not clear what problems AI can solve; like, we talk about it in general terms of, you know, repeatability and calculations.

But are there other problems that you may not be aware of? I’m speaking to sort of the general you, not you, Chris Penn, because you know everything about AI.

So, the general you: do you know what kind of AI is available to solve the problems, and are you clear on what the problem is that you’re trying to solve? We know that moving into next year, there will probably be more usage of AI that writes content.

And so that’s not a repeatability thing.

That’s not a calculation thing.

That’s actually, one would venture, a creative thing.

And so you need to be aware of the kinds of problems that you have that AI can solve, and what problems AI is actually effective for in your organization.

So that in and of itself is a whole deep-dive survey, to really have that awareness of, like, what is it that you need AI to help you do?

Christopher Penn 20:35

I think that’s a really good example of a problem, because as a business stakeholder with a business problem, your business problem is you can’t generate enough content fast enough, right? That’s fundamentally why you’d use this stuff.

And one of the big questions that you have to ask is: is the content that you generate right now good enough that it’s not worth having AI generate it? Because AI generates really awesome mediocre content, right? It generates bland, boring stuff at a blistering pace; you can crank out 100,000-page books in seconds that are just filled with tripe.

You cannot create great stuff with AI.

The models just are not there.

So to your point, Katie, the question is not whether AI is the right approach.

But what is the problem we’re trying to solve? If you are trying to solve for “we want really great content,” AI is actually not the answer, because machines simply can’t build enough unique, good content; the training dataset for it is so small.

So even in things like that, there’s also a precursor to the AI maturity model, which is: is the problem you’re trying to solve one that is solvable, period, by your business? Because it may not be, if you don’t have the money or the talent or the capabilities to use AI.

And if you also don’t have the right people to generate the high-quality stuff that you’re looking for, you’re kind of in a bind.

Katie Robbert 22:07

I agree with that.

And it is funny, we always kind of come back to the same conclusion, the same theme: is it a people problem? Is it a process problem? Or is it a platform problem? And those are the questions that you and I, Chris, are very well equipped to help people answer.

And that’s going to be one of our focuses in 2022: doubling down on educating people on how we do that.

And, you know, the AI and machine learning lifecycle assumes that you have all of those questions answered: you know what people you have, you know you have the skill sets, you know what processes you can follow, and you know the platforms that are available.

So, it assumes a very high level of analytics and AI maturity.

And so I think that as you and I are rethinking how we’re positioning these, we need to couple them with other frameworks, to say, you know, it’s appropriate for you to use the AI and machine learning lifecycle if you score, say, four out of five on the AI maturity model.

And so helping people see where they are in that moment can save them a lot of headache of trying to introduce AI when AI is not the right thing.

And so that’s our goal is to help people have that awareness of where they are in their analytics and AI journey.

Christopher Penn 23:36

The other aspect we have to figure out is dealing with what happens when AI gets forced upon you and you don’t have a choice.

For example, as you adopt Google Analytics 4, guess what: it is entirely powered by machine learning. It is fantastic as a tool; it is complex as a tool.

And it is, I would argue, substantially more difficult for the average marketer to use.

Because of the way the tool was engineered.

It really is a wonderful example of a tool that was made by engineers, in some ways for engineers, for very technical people. It is not a user-friendly tool.

And a lot of the AI insights that it derives are very powerful, but good luck finding them if you are a novice user.

And so I think there’s almost a people evaluation process that has to happen first, which is: are you even equipped to deal with rapid change? Like if Google says, you know, beginning January 1, 2023, Google Analytics 3 is going away.

What do you need to do, as a plan, to adapt to that? You know, because you’re going to have this brand new pile of AI dropped on you.

And because you’re getting what you pay for, you don’t get a choice in the matter.

When Google Ads revamps, when Facebook revamps and stuff, in a lot of these cases, you don’t get a choice.

With these tools, you have to deal with what you’re given. And so what does the framework look like for helping people adapt to substantial technological change that they may not be ready for?

Katie Robbert 25:12

You hire Chris Penn.

Christopher Penn 25:16

No, I make things worse.

Katie Robbert 25:19

You hire a combination of Katie and Chris. There you go. Well, no.

And, you know, I feel like that’s a really good example.

Because again, all of these frameworks and lifecycles have a lot of assumptions built in, a lot of knowns.

And so there definitely need to be those alternate versions, where, you know, at step three, everything changes: the company is suddenly acquired, or your main stakeholder decides to finally take that retirement and go off and live in Hawaii.

So now you have a whole new stakeholder.

And so there’s always those, you know, caveats with these frameworks.

And so I think that’s something, Chris, you and I can start to think through and get a little bit better about building in: in a perfect world, this is your framework, but the world is not perfect.

So now here’s the messiness of what it really looks like.

But let us guide you and make sure you can get through it, start to finish, because it is possible.

It just may not look like those perfect little boxes the way we have it outlined.

Christopher Penn 26:24

Ya know, it doesn’t go that way; it’s like, you know, a child dumped a bucket of Legos all over the floor.

Some fun, right?

If you have comments or questions about this framework, and the other ones that we’ve talked about recently, pop on over to our free Slack group: go to trustinsights.ai/analyticsformarketers, where you and over 2,100 other marketers are asking and answering each other’s questions every day.

And wherever it is that you watch or listen to the show, if there’s a channel you’d rather have it on instead, you can find us at trustinsights.ai/tipodcast.

Check it out on YouTube and all the other places.

Thanks for tuning in.

We’ll talk to you soon.

Take care.

Need help making your marketing platforms, processes, and people work smarter?

Visit TrustInsights.ai today and learn how we can help you deliver more impact.


Need help with your marketing AI and analytics?

You might also enjoy:

Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
