
Incrementality and predictive analytics

This data was originally featured in the October 20, 2021 newsletter, found here: https://www.trustinsights.ai/blog/2021/10/inbox-insights-october-20-2021-effecting-change-technical-content-marketing-incrementality/

In this week’s Data Diaries, let’s talk for a brief moment about incrementality and the use of predictive analytics. Incrementality is a marketing analytics concept that helps answer how we should spend our marketing resources: time, people, budget, and so on. Within incrementality, there are two core concepts: the baseline and the incremental lift.

Our baseline is the effect of our always-on marketing efforts: the results we get from doing nothing extra. We blog, we post on social, we run always-on ads, and we get a certain level of performance.

Then, if we do something different (start a new campaign, change our tactics, and so on), what we get on top of that baseline is our incrementality: the extra results we earned. What marketing is always in search of is the biggest bang for the buck.

The challenge with incrementality is that A/B testing is very, very difficult, especially on channels where you realistically can’t run a controlled test at all. How do you demonstrate incrementality when you no longer have baseline data?

There are two different approaches we can take to address this issue. The first, which we’ve discussed in previous newsletters, is propensity matching: pairing days that are similar on all of the other factors that could influence an outcome, then looking at the difference between matched days when a campaign was running and when it wasn’t. This is an excellent method for determining incrementality, but because it collapses time, it’s very difficult to illustrate the difference.
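To make that idea concrete, here is a minimal sketch of the matching step in Python. It uses a simple nearest-neighbor match as a stand-in for full propensity score matching, and every column name here (sessions, campaign, plus covariates like day_of_week, email_sends, ad_spend) is a hypothetical placeholder for whatever factors actually influence your outcomes:

```python
import pandas as pd
from sklearn.neighbors import NearestNeighbors

def matched_day_lift(df: pd.DataFrame, covariates: list) -> float:
    """Compare each campaign day with its most similar non-campaign day."""
    treated = df[df["campaign"] == 1]   # days the campaign ran
    control = df[df["campaign"] == 0]   # days it didn't

    # For each campaign day, find the non-campaign day that looks most
    # alike on the covariates (day of week, email sends, ad spend, etc.).
    nn = NearestNeighbors(n_neighbors=1).fit(control[covariates])
    _, idx = nn.kneighbors(treated[covariates])
    matches = control.iloc[idx.ravel()]

    # Average difference in the outcome across the matched pairs.
    return float((treated["sessions"].values - matches["sessions"].values).mean())
```

The output is an average per-day lift across matched pairs, which is useful as a number but, as noted above, hard to show on a timeline.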

The second approach is to use retroactive predictive analytics, which is conceptually a bit odd. We forecast what should have happened using predictive analytics, then compare the forecast with what actually happened. As long as the forecast is statistically sound, we can illustrate the difference between the baseline and what actually happened.
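As an illustration of that idea (not necessarily the exact model used in the original analysis), here is a minimal sketch in Python. It assumes a daily pandas Series of LinkedIn-referred visits indexed by date and uses Holt-Winters exponential smoothing from statsmodels as one example of a forecasting method:

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

def incremental_lift(visits: pd.Series, campaign_start: str) -> float:
    """Forecast what should have happened, then compare it with what did."""
    start = pd.Timestamp(campaign_start)
    baseline = visits[visits.index < start]   # history before the change
    actuals = visits[visits.index >= start]   # what actually happened

    # Fit only on pre-campaign history; weekly seasonality is an assumption.
    model = ExponentialSmoothing(
        baseline, trend="add", seasonal="add", seasonal_periods=7
    ).fit()
    forecast = model.forecast(len(actuals))   # the projected baseline

    # Incrementality: actual performance minus the forecast baseline.
    return float(actuals.sum() - forecast.sum())
```

Any statistically sound forecasting method works here; what matters is that the model only ever sees pre-campaign data, so its forecast can stand in for the missing baseline.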

For example, in August I decided to try using LinkedIn differently, which we discussed on an episode of So What? previously. Now, how would we measure the incrementality of that effort? After all, I can’t A/B test my own account, behaving differently only some of the time. If we use predictive analytics to forecast what August and September SHOULD have looked like, it would look like this:

LinkedIn Forecast

From August through today, I should have earned 351 visits from LinkedIn, based on the forecast.

What actually happened?

LinkedIn Actuals

The campaign earned 1,887 visits. The difference of 1,536 visits is the incrementality: what I earned above and beyond the baseline, the additive result of the effort. When you embark on measuring incrementality in your own marketing, try this approach.
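If you want to sanity-check the arithmetic, the lift calculation itself is just subtraction, using the numbers from the example above:

```python
# Incremental lift is the actual result minus the forecast baseline
# for the same window (numbers from the LinkedIn example above).
forecast_visits = 351       # what the forecast said should have happened
actual_visits = 1887        # what actually happened
incremental_visits = actual_visits - forecast_visits
print(incremental_visits)   # 1536 visits above the baseline
```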


