
{PODCAST} In-Ear Insights: Dealing With Failed Analytics Projects

In this week’s In-Ear Insights, Katie and Chris discuss what to do when an analytics or data science project goes off the rails. How soon should you call a marketing analytics project a failure? What do you do once it’s clear a data research project isn’t going to meet its goals or produce anything useful? What’s the best strategy for averting such failures in the future? Listen in to hear what just happened to a bunch of social media data for a research project, and what the next steps would be in this case study.


Watch the video here:


Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn 0:02
In this week’s In-Ear Insights: sometimes things go sideways, sometimes things don’t work out the way we expected them to.

I was working on a large data analysis of Reddit data last week.

And there’s a certain point where it’s like, you know what, not only is it not yielding good results, it’s not yielding much of anything except junk.

And so the question I asked myself, and the question I want to ask you, Katie, is: at what point do we throw it all away? At what point should we be willing to say, you know what, we’re throwing good money after bad, we’re throwing good time after bad?

Let’s can it, versus let’s try and salvage it.

How do you balance it? How do you make that judgment call when you’re running a project, particularly one that’s, you know, a big investment, where a lot of eyes are on what you’re doing?

Katie Robbert 2:05
So, you know, easy questions on a Monday morning, God.

You know, it’s a tough call.

In my experience, it’s never been a decision that’s made lightly.

And it’s never been a decision that’s made solely by one person.

And so the project manager may be the person responsible for moving things forward.

But the project manager is rarely solely responsible for making big business decisions, especially when they involve money.

And so, you know, it really depends.

It depends on how much we’re talking in terms of a sunk cost.

It depends on whether or not you actually set that money aside for research and development.

And then it’s not really a sunk cost.

It depends on you know, how well planned out the project was and if there’s an opportunity to salvage it.

So if we’re going down the road of, you know, it’s a small project, and it’s only maybe a few thousand dollars, which is still a lot of money, especially to a small business.

You know, you may just have to say this thing didn’t work out, we experimented.

The thing that you salvage from there is your lessons learned: what didn’t work? What didn’t we do this time around that we should do moving forward? Maybe our hypothesis for the research wasn’t really that strong, or we just kind of started digging around to see what we would find. That’s almost always going to end up as a sunk cost.

Because if you’re not thinking about what you’re going after, if you’re just sort of like, I don’t know, maybe this might work, maybe this over here might work, you are likely always going to have some kind of a sunk cost, and that’s what you want to avoid.

And so, unsurprisingly, it always comes back to how much of it you planned out ahead of time. Now, the planning process itself can be a sunk cost, because you could go down the road of planning requirements and starting to do the data gathering, and then find out that there’s nothing there.

But generally speaking, that planning costs a lot less than actually having a data scientist or an engineer or someone start to do the digging through the data itself.

Christopher Penn 4:23
I feel like in this particular example, we did go about this backwards, in the sense that we came up with the hypothesis and the plan first, before doing the exploratory data analysis.

And then as we got into the exploration, we realized there isn’t anything here; there’s nothing in this pile of stuff.

It’s all chaff, no wheat, which can be challenging, because you also don’t want to have your data scientist, at, you know, $500 an hour or whatever, just digging around in data without a clear goal in mind, either.

So how do we balance the two? Like, okay, we should do the exploration to see if there’s anything there before we come up with the hypothesis. Or, we have a plan, and then we start to execute the plan and find out it goes off the rails fairly soon on.

How do you balance those two scenarios?

Katie Robbert 5:18
So that’s where a lot of proof of concepts and beta tests and those sorts of experiments come about.

Very rarely is a proof of concept or a beta test just sort of, I don’t know, let’s just see what we get.

There’s still a level of planning involved.

And so, you know, we spent 30 minutes, maybe an hour, over the past couple of weeks sketching out the outline and putting together a plan for this research project that we wanted to do, and then you spent a little bit of time digging around the data to find out: is there something there? And so we haven’t gone so far down the road that we’ve fully committed to this is the data that we have to use, this is the structure of the paper, and been promoting it. Our sunk costs are very minimal, because we were essentially working as if your initial data exploration was a proof of concept.

So I wouldn’t say we did it backwards.

We actually did it the way that it’s supposed to happen, because you need to find out: am I going to get the kind of output that I’m after? And so when we talked earlier this morning about here’s what I’m finding, we were able to start to pivot the plan and say, what if we did this instead? So the data is salvageable and still usable; we’re just going to use it in a slightly different context. And we’re in a fortunate position where we own the company, so we can do that: we can decide to change the research, we can change the hypothesis, we can change the topic of the paper. Not all marketers are in that situation, and so they are scrambling to figure out, well, I’ve already pitched this concept to all of these other customers and articles and whatever, and now I have to stick with it.

And that’s where you start to get into trouble: when you start to put the cart in front of the horse.

Yes, I don’t know how horses and carts work.

And I know that one of them does not belong in front of the other.

Christopher Penn 7:19
Well, I mean, that brings up a really good point, because we’ve also had an experience with a client project where there was a very good plan, a survey plan that was rolled out, and the survey got done differently, and then you ended up with something where, you know, that train had left the station, but it was filled with manure instead of gold.

And now, in this particular instance, the train is still full of manure, and is probably not going to be successful.

So should every company, every organization, have an R&D department that’s working on this sort of thing, so that you’re doing those rapid proofs of concept all the time, and you’re never left holding the bag of, well, what do we do with this thing? As opposed to being in a situation where we’ve invested $100,000 or $200,000 or whatever in this thing, and only when you’re 60% of the way through the project do you realize, wow, this thing is not going to go anywhere; we basically burned a quarter million dollars for no reason.

Katie Robbert 8:18
They absolutely should.

And that’s where some of the Agile methodology comes in: those two-week iterative sprints.

You’re constantly building and iterating.

And at the end of two weeks, you should have some sort of a proof of concept that you can demonstrate to the rest of the team and stakeholders.

It should be something tangible.

It shouldn’t just be a theoretical, you know, for the past few weeks, we’ve been building a bunch of code and you can see it in six months.

You need to have something that, you know, I as a non-developer should be able to at least look at and go, okay, that kind of looks like the thing that we were talking about, keep moving down this road.

Or: that’s not what we asked for, we need to change it.

And that way, you’re not investing millions and millions of dollars; you’ve only sunk about two weeks’ worth of work to then pivot, and you’re still at that early point where you can continue to pivot.

And then two weeks later, you’re just building on top of the thing in a small, iterative way.

And that’s exactly what agile methodology is meant to do.

And so a lot of development teams, you know, ones that I’ve managed in the past, have always fought for and pushed for that R&D time so that they can do that experimentation.

It’s difficult to build into your budget, because it is likely going to be sunk costs; not all of the research that we do with data or with code is going to pan out to be anything, to your point initially, Chris. But there needs to be some sort of a structure around it: what am I researching? Am I researching whether the new, shiny plugin on top of R Studio does a better job of generating a chart than Tableau?

That’s a useful thing.

And we might find out that no, it doesn’t, or yes, it does. Then at least you can say, at the end of your two-week sprint: I found out that I got pretty close, and I just need a little more time. Or: this isn’t ever going to do what I want it to do.

So I need to abandon it and move on.

And I won’t keep wondering, well, what if I just spent more time? What if I just did this? You’ve already done the proof of concept.

And so I do think that it makes sense for companies to build it in, but there needs to be structure around it, and there needs to be guidelines and guardrails.

Even though R&D feels like it should sort of be a playground.

It should, but it should have a fence around it.

Christopher Penn 10:49
Yes, don’t let your scientific toddlers run around unguarded.

In that context, though, there are two things that are really worth pointing out. One:

When I look at the amount of code and research and stuff that we do as an organization, I would say 95% of the research we do never sees the light of day; it just kind of sits in a folder.

And pieces of it definitely do get reused for other projects. Like, I was working on this one thing, building a recommendation engine, and the prototype was a miserable failure.

Version two was a moderate failure.

Version three is only slightly a failure.

But all those will never see the light of day.

So I think it probably is worth pointing out to folks that, yeah, there is going to be a sizable amount of cost in research.

And the second thing is: a lot of organizations, and we obviously have many clients and know many folks like this, like in our Analytics for Marketers community (which, if you haven’t joined, go to TrustInsights.ai slash analytics for marketers, a free Slack group of over 1,200 marketers talking about stuff like this), don’t have an R&D department, or if they do, it’s in product and it’s not in marketing.

For the marketers out there, what do they do if they don’t have, you know, someone like me who’s just messing around with stuff on a Saturday for fun?

Katie Robbert 12:14
Well, before I get into that, let me step back because you had mentioned your recommendation engine version one, version two, version three.

My question to you is did you learn something between each version?

Christopher Penn 12:29
Oh, definitely.

Well, how not to make a recommendation engine and ship it.

Katie Robbert 12:35
To me, that means that it’s not completely a sunk cost, because you were able to learn new skills and figure out new techniques. So that in and of itself makes it valuable, because then you can take that and apply it forward.

And so yes: version one shit the bed, version two kind of crapped out, version three puttered along. It got incrementally better each time.

And so I think that’s one of the things for companies, if they’re like, oh, well, if I’m not going to get anything useful out of this, then I’m not going to do it.

The usefulness needs to be in proper context.

So you might not have like a widget or a thing at the end of it.

But your team is likely going to feel a little bit more fulfilled, because they had a little bit more freedom to mess around.

They’re going to learn some new things that they might not have had a chance to learn if they keep doing things the exact same way, every single time because this is how we’ve always done it.

So I wanted to point that piece of it out too, because I think it’s important as companies are thinking about, you know, what am I getting out of letting my team do this kind of research and development and experimentation? Well, you’re getting a team that is being more creative, that is using their critical thinking skills to do that problem solving, and really thinking outside of those boundaries of this is how we’ve always done it.

Now, to your question about how usually R&D lives within product, not within marketing:

Don’t try to do it all at once. We say that with a lot of AI projects, for example; AI is one of those things that marketers are trying to experiment with. Find a very small use case and do a small proof of concept, something where you could probably figure out, back of the envelope, how much it’s likely going to cost in resource time, in your time, and in budget, and just work in a small way and start to scale it up once you’ve been able to prove out whatever the thing is.

You know, if you don’t have proper research and development, then you need to keep it small and contained.

And you need to be transparent about the thing that you’re doing.

Hey, I thought that we might experiment with this other way of doing something.

So I’ll do it this way too.

But I might also try this way, just to do sort of an A/B test. And we talked about that a little bit last week when we were evaluating martech stacks.

Unknown Speaker 15:08
Okay, that makes total sense.

Christopher Penn 15:11
I would also say that it’s probably not a bad idea to be a part of an analytics community or marketing community.

So even if you don’t have R&D of your own, you can at least be listening to what other people are doing, asking them what they’re doing, and seeing if there are any lessons they’ve learned that you can import to your organization.

Even if you don’t have those capabilities yourself.

You know, if you hear somebody saying, I ran an A/B test with Google Optimize and I got no usable results, good, that’s something that broadens your knowledge.

Now you know: oh yeah, if I don’t have a good testing plan up front, then I’m going to test a bunch of things that don’t matter and end up without any conclusive results.

Katie Robbert 15:49
I think that’s a really, really good pro tip for people: you know, join a community and ask, what were your lessons learned from doing this thing? And that way you can go into it saying, okay, this person didn’t put a plan together; let me at least put a plan together and then see how far I can get.

I think that’s such a smart way to do it, sort of what they call the hive mind, or crowdsourcing: asking people, what did you do in this situation?

Christopher Penn 16:17
Yep.

So,

Katie Robbert 16:19
to wrap up,

Christopher Penn 16:21
when it comes to figuring out whether something is a sunk cost, and when to bail out, ideally it’s sooner rather than later.

And if you structure your plan to accommodate things like exploratory data analysis upfront, you will hopefully reduce the amount of sunk cost.

And to Katie’s point, even if something is a sunk cost, there may be salvageable lessons or pieces of code or analysis that you can reuse for more successful projects later.

If you’ve got questions about this or any other analytics topic, join our free Slack community: go to TrustInsights.ai slash analytics for marketers and join over 1,200 other marketers talking about things just like this.

And while you’re at it, go to the Trust Insights website, again TrustInsights.ai, and sign up for our newsletter, where you can learn about this and get fresh data every week.

Thanks for listening.

We’ll talk to you soon. Take care.

Want help solving your company’s data analytics and digital marketing problems? Visit TrustInsights.ai today and let us know how we can help you.


Need help with your marketing AI and analytics?


Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
