INBOX INSIGHTS, August 23, 2023: AI Culture Shift, Non-Language Data and AI


Inbox Insights from Trust Insights

👉 📕 Get our new book, The Woefully Incomplete Book of Generative AI, absolutely free of financial cost 📕

👉 Watch the newest version of our keynote talk, the Marketing Singularity, recorded live at MAICON 2023! 📺

AI Is a Culture Shift

Are you tired of hearing about AI? Unfortunately, it’s not going anywhere. If anything, it’s going to become so mainstream you won’t even notice it anymore. Until you do. Until your leadership decides that your company is going to be “AI first”. Until your boss tells you about the all-new AI-powered tools you need to use. Until you start to lose sleep, worried that AI is going to take your job.

I’m not here to create additional anxiety. What I want to do is help you recognize the pitfalls of introducing AI and think about how to prevent them.

Here’s the short version: AI is a culture shift. If your culture is shaky or non-existent, AI won’t fix it. If your culture is people-first but your leadership wants to start leading with AI instead, this will fail. AI won’t fix people problems.

A million bazillion years ago when I worked as a product manager, the company I was at was undergoing a lot of changes. The business model was shifting from clinical research to commercial products. The organization was changing from a top-down approach to a more flexible approach called a matrix structure. (I should make a separate post about that.) The tech stack was moving from CD-ROM to dot com. New disciplines were being tested, new managers were being promoted, new technology was being embraced. It was a lot all at once.

Because there was a lot of change, the organization brought in an outside consultant. The consultant talked with “key” employees and leadership about creating a strategic plan. This plan, broken down into several BHAGs (Big Hairy Audacious Goals), was then rolled out to the entire company. Each goal had an owner from the leadership team and milestones to hit.

So far, so good, right? Nope. It all looked good on paper, but it didn’t work. It failed spectacularly. Not because the employees weren’t on board (we were) but because there was no follow-through from leadership. They lacked transparency. They stopped updating the employees on the progress. They stopped engaging in the initiatives. They stopped prioritizing the new goals as valuable time spent. The leadership team closed the doors on their “high-level” conversations and locked the rest of the company out. They patted themselves on the back for creating a strategic plan, telling us about it, and giving us instructions to execute it. There were graphics created for this thing. There was a company-wide meeting. There were snacks. And then…crickets.

What went wrong? Well, aside from everything, it wasn’t a true culture shift. Culture is set from the top down. It’s not enough for leadership to go through the motions of what they want the rest of us to do. Leadership needs to consistently participate.

Enter into the chat: AI. I’ve lost count of the leadership teams that have asked us to help them implement AI into their company. I’ve also lost count of the times that I’ve advised them to get aligned with each other on why they want to introduce AI. That’s usually where the conversation stops. If your leadership team is not crystal clear on why they want to change the culture with the introduction of new tech, it won’t work.

Think back a few years to the buzzword of the hour, “digital transformation”. Simply put, digital transformation meant wanting everyone to be digital first and use a bunch of digital platforms that don’t communicate. Where digital transformation got it wrong, and still gets it wrong, is the people. This was the inspiration for the 5P Framework.

Ah, we almost got through a whole post without a mention of the 5P Framework. I’m not going to go through all the pieces (Purpose, People, Process, Platform, Performance). I want to bring it up because by far, the most important “P” is people. And that is the P that is often given the least amount of time and thought.

Back to the point. In this context, AI is just a new piece of technology. It’s another platform. It’s an updated process. However, if you don’t have alignment and participation from the top down, the shiny new object will be just that. It will be a distraction. We will forget to use it. The tool will become just another invoice to pay.

If you’re wondering if the company I worked at a bazillion years ago is still around, it’s not. The leadership team became divided and eventually dissolved. The company was sold, changed hands again, and then one more time. It’s a shame, because the BHAGs that we had set could have grown into big things. Ah well, lesson learned.

Are you undergoing a culture shift? Reply to this email to tell me about it, or come join the conversation in our Free Slack Group, Analytics for Marketers.

– Katie Robbert, CEO

Share With A Colleague

Do you have a colleague or friend who needs this newsletter? Send them this link to help them get their own copy:

https://www.trustinsights.ai/newsletter

Binge Watch and Listen

In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris discuss how to leverage large language models like ChatGPT with your business analytics data. We talk through real examples of using AI to analyze CRM data for buyer personas, extract tables from PDFs, summarize PowerPoints into executive briefs, transform your data into actionable insights, classify social media comments, and answer questions about your data. We explain that while generative AI is exciting, your analytics data should still be the backbone of business decisions. Tune in to learn creative ways analytics and AI can work together to enhance your marketing strategy.

Watch/listen to this episode of In-Ear Insights here »

Last time on So What? The Marketing Analytics and Insights Livestream, we walked through audio production and doing an audio production bakeoff. Catch the episode replay here!

This Thursday at 1 PM Eastern on our weekly livestream, So What?, we’ll be doing our Q3 large language model bakeoff to see what’s changed with Bing, Bard, ChatGPT, and others. Are you following our YouTube channel? If not, click/tap here to follow us!

In Case You Missed It

Here’s some of our content from recent days that you might have missed. If you read something and enjoy it, please share it with a friend or colleague!

Paid Training Classes

Take your skills to the next level with our premium courses.

Free Training Classes

Get skilled up with an assortment of our free, on-demand classes.

Data Diaries: Interesting Data We Found

This week, let’s talk about data that isn’t necessarily language. In the rush to embrace generative AI for literally everything, sometimes we forget that an awful lot of data [a] isn’t in generative models at all because of training data cutoffs (ChatGPT famously knows nothing after September 2021 for most tasks) and [b] isn’t the kind of language that large language models process so well.

For example, let’s take Spotify playlists. Spotify playlists are rich sources of data; when users publish them, we learn quite a bit about what music is resonating with pop culture. Yet if you were to ask most large language models what the most popular songs are on Spotify, most of them are going to come up with either no answer or hallucinatory answers, because that data isn’t readily available.

So why would a large language model like GPT-4 (the one that powers ChatGPT’s paid version) or others not be the right tool? Isn’t AI smart enough to know what to do? The short answer is no, and here’s why. In this specific use case – and there are many others, like page titles on your blog, company names in your CRM data, etc. – these song titles appear like language, but they’re not language.

Wait, what? How is a song title like “Welcome to the Black Parade” not language? To answer that question, we have to answer what language is. The OED defines language as this:

“the principal method of human communication, consisting of words used in a structured and conventional way and conveyed by speech, writing, or gesture.”

When we use words in a structured and conventional way, we are using not just the words, but words in a specific and rational order to convey meaning. That’s why large language models are good at language, because within their statistical libraries, they not only understand words, they understand the relationships of words to other words. They know that “I ate at Burger King” is NOT the same as “Burger I at King Ate”, even though those two pieces of text use exactly the same words.

The atomic unit of data in a Spotify playlist is the song, and most playlists do not use song titles in the structure of language. There are patterns, to be sure – playlists exist that are composed entirely of sad songs or songs for someone’s birthday – but that structure is not language, because it isn’t structured and conventional the way a sentence is. You can’t pick up a pile of playlists and infer the meaning of a playlist solely from the songs and their order on the list.

When we use analytics data, we’re not necessarily interested in the language itself, but in the dimensionality of the data. For example, suppose we were to put together a list of the top X songs on Spotify this summer. Each song title’s language is not something we’re especially interested in; what we’re interested in is how frequently each song appears on playlists.

That’s not language.

That’s math.

And language models, despite the bombastic claims of many, are not particularly good at math. In fact, natively, they’re quite poor at math because they can’t actually do math. They can only predict what the next word is likely to be in a sequence based on other words they’ve already seen. They know 2 + 2 = 4 only because they’ve seen that particular string of text – language – many times, and thus they have a statistical understanding that when 2 + 2 appears, the next words in the sequence are likely to be = 4.

When we look at song titles and popularity lists, we’re really treating them as just high-dimensionality data. We don’t actually care what the song title is (language), we just care about counting them (math).

Let’s take a look at an example. We downloaded more than 6,000 publicly published Spotify playlists from the last 3 months, which all have a format like this:

  • Wolf Alice – Don’t Delete The Kisses.mp3
  • Bleach Lab – I Could Be Your Safe Place.mp3
  • Zeph – world.mp3
  • Christian Kuria – Sunbleach.mp3
  • Del Water Gap – Ode to a Conversation Stuck in Your Throat.mp3
  • Carly Rae Jepsen – Bends.mp3

Is this language? No. It’s three dimensions of data – a musical artist, a song title, and a file format. Processing this data means slicing it into its components so we can count it correctly.
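Here’s a minimal sketch of that slicing and counting in Python, assuming a hypothetical playlists/ folder of text files with one track per line in the “Artist – Title.mp3” format above; the folder name and the top-40 cutoff are illustrative, not the exact code we used.

    # Slice each playlist line into its components, then count songs across playlists.
    # Assumes a hypothetical playlists/ folder of text files, one track per line,
    # in the "Artist – Title.mp3" format shown above.
    from collections import Counter
    from pathlib import Path

    song_counts = Counter()

    for playlist_file in Path("playlists").glob("*.txt"):
        for line in playlist_file.read_text(encoding="utf-8").splitlines():
            line = line.strip().removesuffix(".mp3")
            if " – " not in line:  # skip lines that don't match the expected pattern
                continue
            artist, title = line.split(" – ", 1)  # two of the three dimensions
            song_counts[(artist.strip(), title.strip())] += 1

    # The "analysis" is just counting: the 40 most frequent songs across playlists.
    for (artist, title), count in song_counts.most_common(40):
        print(f"{count:>4}  {artist} – {title}")

Notice there’s no language understanding anywhere in that listing; it never needs to know what “Don’t Delete The Kisses” means, only how often it shows up.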

The top 40 songs of the summer on Spotify look like this:

Spotify Top 40 Summer Playlists

Again, this is math, and because it’s a math task, it’s not well suited to large language models. To the extent that they can do it at all, the results will be of lower quality than traditional machine learning or even basic data science techniques. It’s counting.

So what? What’s the point of this, besides having some new songs to add to your own playlists? Understanding what language is and is not helps us understand what tasks a large language model and its associated software will and won’t be good at. ChatGPT cannot capably give us the top songs of the summer, even if it had current data. The same is true for the contents of your CRM, content on your blog, etc. – if you’re trying to do math on something, a language model (at least in its current incarnation) isn’t natively best suited to handle the task.

Suppose you still wanted to use generative AI. How would you do that? It turns out that language models have something of a workaround: you can ask them to code. You can ask them to write Python code or R code or the coding language of your choice to accomplish the math tasks you want to do, which is what I did with the Spotify playlists. I didn’t ask ChatGPT to tell me what the top songs of 2023 are. I asked ChatGPT to help me write the code necessary to download the data, and then to process it into tabular format, something you can open in spreadsheet software. Why does this work? Because coding IS language, and thus language models are good at it.
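As an illustration, here’s the shape of the script you might ask a model to draft. This is a sketch under assumptions, not the actual code behind the chart above: it assumes the spotipy client library, placeholder API credentials, and a made-up search query, and it simply flattens playlist tracks into a CSV you can open in a spreadsheet.

    # Sketch: pull public playlists from the Spotify Web API and flatten the
    # tracks into a CSV. Assumes the spotipy library; the credentials, search
    # query, and output file name are placeholders.
    import csv

    import spotipy
    from spotipy.oauth2 import SpotifyClientCredentials

    sp = spotipy.Spotify(
        auth_manager=SpotifyClientCredentials(
            client_id="YOUR_CLIENT_ID",          # placeholder
            client_secret="YOUR_CLIENT_SECRET",  # placeholder
        )
    )

    rows = []
    search = sp.search(q="summer 2023", type="playlist", limit=50)
    for playlist in search["playlists"]["items"]:
        items = sp.playlist_items(playlist["id"], additional_types=["track"])
        for item in items["items"]:
            track = item.get("track")
            if not track:  # local files and removed tracks come back as None
                continue
            rows.append(
                {
                    "playlist": playlist["name"],
                    "artist": track["artists"][0]["name"] if track.get("artists") else "",
                    "song": track["name"],
                }
            )

    # Tabular output: open it in a spreadsheet, or count it with the snippet above.
    with open("spotify_playlist_tracks.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["playlist", "artist", "song"])
        writer.writeheader()
        writer.writerows(rows)

The division of labor is the point: the model writes the code (language), and the code does the downloading and counting (math).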

Take this lesson and apply it to all your data. If you’re using language, a language model will help. If you’re not – if you’re doing math or other non-language tasks – then a language model will not be the right tool for the job.

Trust Insights In Action
Job Openings

Here’s a roundup of who’s hiring, based on positions shared in the Analytics for Marketers Slack group and other communities.

Join the Slack Group

Are you a member of our free Slack group, Analytics for Marketers? Join 3,000+ like-minded marketers who care about data and measuring their success. Membership is free. Members also receive sneak peeks of upcoming data, credible third-party studies we find and like, and much more. Join today!

Blatant Advertisement

Now that you’ve had time to start using Google Analytics 4, chances are you’ve discovered it’s not quite as easy or convenient as the old version. Want to get skilled up on GA4? Need some help with your shiny new system? We can help in two ways:

👉 We can do it for you. Reach out to us if you want support fixing up your Google Analytics 4 instance.

👉 You can do it yourself. Take our course, Google Analytics 4 for Marketers, to learn the ins and outs of the new system.

Interested in sponsoring INBOX INSIGHTS? Contact us for sponsorship options to reach over 26,000 analytically-minded marketers and business professionals every week.

Upcoming Events

Where can you find Trust Insights face-to-face?

  • ISBM, Chicago, September 2023
  • Content Marketing World, DC, September 2023
  • Marketing Analytics Data Science, DC, September 2023
  • Content Jam, October 2023
  • MarketingProfs B2B Forum, Boston, October 2023
  • Social Media Marketing World, San Diego, February 2024

Going to a conference we should know about? Reach out!

Want some private training at your company? Ask us!

Stay In Touch, Okay?

First and most obvious – if you want to talk to us about something specific, especially something we can help with, hit up our contact form.

Where do you spend your time online? Chances are, we’re there too, and would enjoy sharing with you. Here’s where we are – see you there?

Featured Partners and Affiliates

Our Featured Partners are companies we work with and promote because we love their stuff. If you’ve ever wondered how we do what we do behind the scenes, chances are we use the tools and skills of one of our partners to do it.

Read our disclosures statement for more details, but we’re also compensated by our partners if you buy something through us.

Legal Disclosures And Such

Some events and partners have purchased sponsorships in this newsletter and as a result, Trust Insights receives financial compensation for promoting them. Read our full disclosures statement on our website.

Conclusion: Thanks for Reading

Thanks for subscribing and supporting us. Let us know if you want to see something different or have any feedback for us!


