
In-Ear Insights: AI, Empathy, and the Future of Work

In this episode of In-Ear Insights, the Trust Insights podcast, Katie and Chris discuss the potential impact of AI on empathy and society. You’ll learn about the risks when AI creators lack empathy and the importance of compassion in a tech-driven world. Discover how democratizing technology and prioritizing human concerns can combat these risks. Find out how you can be part of the solution and shape the ethical use of AI.

Watch the video here:

In-Ear Insights: AI, Empathy, and the Future of Work

Can’t see anything? Watch it on YouTube here.

Listen to the audio here:

Download the MP3 audio here.


Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn 0:00

In this week’s In-Ear Insights, let’s talk about AI, specifically generative AI, and empathy.

And here’s, here’s the perspective that, Katie, I want your take on.

There was a post on LinkedIn with a video of Sam Altman from OpenAI, talking about how the race is on in Silicon Valley to see who can be the first to build a billion-dollar company with one employee and no one else, with everything else handled by AI, by machines, by AI agents, etc.

And that got me thinking. On the one hand, as a nerd, that’s cool, right? Just from the pure nerd perspective, that’s cool.

But on the other hand, it struck me that that was an absurdly non-empathetic point of view about how to use this technology to better humanity, which theoretically is OpenAI’s mission. Building a billion-dollar company that employs no one but you strikes me as, I want to say selfish, and it is kind of selfish, in the sense of, yeah, I want all the money, and I don’t want to give it to anyone else.

And that, to me, highlights something that we’ve seen for the last 30 years, but really badly in the last five. You look at papers and research on the topic, and there’s just an absolute decline in our overall ability to practice empathy, sort of an empathy deficit.

So what that tells me is, you have a bunch of people who are building these systems who have no sense of empathy and no sense of connection to regular people.

And that does not bode well for how this technology is going to be used at a societal level.

What’s your take on this?

Katie Robbert 1:54

I mean, that’s a lot to unpack.

So, you know, I have a lot of questions, as you would imagine. Number one, what kind of business are we talking about? Number two, is this assuming that the one person who builds the business is the one person who’s going to train and build and maintain everything? If so, we already know there are flaws there because of the inherent bias that will be built in. Number three, based on the kind of business, who wants this thing? As consumers, we’re like, okay, this is just the brainchild of Sam Altman.

I mean, we’re seeing this with people like Elon Musk. He’s trying to do this solely from his perspective, his opinion, and it’s going poorly every step of the way.

And he’s seeing it happen publicly.

And then the other thing is, so, you know, you’re doing this research on the empathy deficit.

I mean, my assumption, and I haven’t read the research that you’ve read, is that a lot of this is politically charged by the people who have been, quote unquote, in power.

And that’s, you know, sort of driving society to act in certain ways, which are less empathetic than they have been historically.

I then look at the groups that are pushing back on those people in power. So, to not be so vague, when I look at past administrations, specifically people who had no business being president in the first place, their opinions and their actions took a toll on society, on the country as a whole, in terms of how they felt entitled to behave, how they approached situations, how they failed to problem-solve and just sort of bullied and pushed people around.

That, to me, is sort of the root of why you’re feeling that society is declining in terms of empathy.

I then look at organizations, like human rights organizations, that are fighting the fight every single day, pushing back, gathering numbers. I look at prominent public figures, not to jump on the bandwagon, but someone like Taylor Swift, who has millions, if not billions, of people hanging on her every word.

And she’s saying the way that people in power are behaving is not okay, and we need to do something about it.

So my sense is the research that you’re reading is very narrow. Let me take a step back; I’ve been in academic research.

The problem with academic research that I’ve always found is that you’re supposed to get enough of a quote unquote, representative sample of people in order to say that this represents everybody.

But that’s impossible.

So you get, like, an n of 256 people, and that’s meant to represent, like, 10,000 people. People are so complex and unique that there is no possible way that that small n of a quote unquote representative sample truly represents the way that we all think and feel, especially if you haven’t asked us.

But you can’t ask us all how we feel.

Because you have to get the research out the door, you have to get your name on a paper, you have to be the first to market with the thing.

So all of this is a way of saying, I feel like there are a couple of things happening for you, Chris. One, I know you personally seek out the doom and gloom, not in a negative way; you seek it out in order to understand it.

And so I feel like, you know, similar to doomscrolling, what we do when we’re scrolling through social media and seeing just the bad news, perhaps you’ve gone down this rabbit hole of looking for the bad so much that it’s hard to see the good.

And then you’re also looking at research that is so narrowly focused on the bad that I do feel like you’re missing the good that happens.

Christopher Penn 6:14

But let’s take a step back, like you said, and work with the definition of empathy that I’ve been working with. Empathy is commonly defined as being able to understand another person’s experiences and feelings from their point of view, as opposed to sympathy, which would be understanding someone’s experiences and feelings from your point of view.

So this is something that guys in particular are really bad at, right? Guys are like, oh, that’s terrible.

Let me tell you all about the time that I had this problem, right? We’ve all had that experience where someone, typically a guy, has heard you but then injects himself into the conversation entirely, right? That would be sympathy; that is a lack of empathy, because that person cannot see from your point of view. They literally are so self-centered that they cannot see things from your point of view. They just don’t understand, and they don’t make an attempt to understand.

So that’s kind of the definition I’m working with: empathy would be someone who understands your experiences and your feelings from your point of view and not theirs.

Well,

Katie Robbert 7:20

And even in that example you’re describing, where you say they heard me, they didn’t hear me at all.

That’s not sympathy, either.

That is just waiting for your turn to talk.

No, and I’m being serious, like, Oh, I know.

It’s not sympathy.

It’s not.

Yeah, you know, you had such a bad day? Let me tell you about my bad day.

That’s them going through the motions.

And women do it too.

It’s them going through the motions of, let me demonstrate to you, let me check the box that I listened to the words you said, and let me relate to you by sharing something about me instead.

And it’s actually pretty common; people struggle to make those connections with other people.

So they feel like if they share a relevant story about themselves, they’re then demonstrating I know what you’ve been through.

And sometimes that’s okay, sometimes that’s appropriate.

But when someone’s dealing with some sort of, you know, trauma or deep emotions, honestly, it’s best not to talk about yourself and just shut up.

And just like, keep your ears open and keep your mouth closed.

And honestly, one of the best ways to show both sympathy and empathy is to not talk about yourself at all, because it’s not about you.

And I think that’s sort of what you’re trying to get at, especially with this Sam Altman quote: it’s all about him.

There is zero compassion, there’s zero, you know, thought about what this is going to do to other people.

Christopher Penn 9:01

Exactly.

Because in this framework, empathy enables compassion.

If you don’t have empathy, you cannot express compassion, because you don’t know what action to take that would address the other person’s experiences and feelings from their point of view.

So if you’re saying, yeah, let’s get rid of employees, right? Employees are unnecessary to the functioning of a company.

That’s all going to be AI.

Like, I love the technology, right? It’s like feeding my inner 12-year-old; I love playing with the toys.

But these toys have the potential.

They’re just tools.

They’re not sentient; they’re software, they have no capabilities of their own.

They are just tools, but tools in the wrong hands can be very dangerous.

Right? A chainsaw in a trained lumberjack’s hands is super useful.

Don’t build your house.

Right.

And

Unknown Speaker 9:55

Jason, and Jason

Christopher Penn 9:57

from Friday the 13th? Not at all. You really don’t want him to have one.

Katie Robbert 10:00

Although one could argue that he does know how to use it,

Christopher Penn 10:05

I would say, actually, Evil Dead is probably a better example,

Katie Robbert 10:07

Let’s say a chainsaw in my hands.

Not a great idea, because I don’t have the skills, I haven’t been trained. I know how to make the teeth spin around.

Probably not even using the correct terminology.

If I can’t use the correct terminology, I probably shouldn’t go anywhere near it.

Christopher Penn 10:26

And I think that’s a really good, fun way of expressing it. Compassion and empathy are techniques, they’re tools, they can be learned skills. And the folks who are making the systems that control a substantial amount of our reality, or will, don’t have those skills, in the same way that the folks who control the vast amount of the media we consume today, like Facebook or Instagram, don’t have those skills innately or don’t demonstrate that they’re capable of using them competently.

And so what you end up with is the hyper-politicized environment we live in, the very absolutist perspective, like you’re either with us or you’re against us.

And it’s a really dangerous thing, because that leads to an inability to relate to others.

Armed insurrections, you know, all sorts of negative consequences from just not being able to go, Oh, hey, getting rid of employees might be good for my wallet in the short term.

But it means I have to live in a walled armed compound, because everyone else around me is not doing well.

This is basically the way Venezuela was in the 1990s, where the wealthy were basically in gilded prisons because they couldn’t leave their homes without their own personal army.

Katie Robbert 11:47

What we’re coming down to, and this, you know, may go sideways, but here we are, is that there’s this old-school, very old, I mean, we’re talking back to the dawn of time, thinking that business and emotion are two different things, or power and emotion are two different things.

And you can’t be too emotional.

You can’t, you have to make smart decisions, you can’t make emotional decisions.

I mean, that’s been the whole pushback on having, you know, women leaders in general: that we’re too emotional, we lead with our emotions, we can’t make, you know, those unemotional decisions that need to be made in order to drive business forward, in order to, you know, retain power.

And what you’re talking about is a small, yet powerful group of people who are making emotionless decisions without compassion, because they want to be the best of the best.

I mean, you’re talking about, you know, some of the most insecure people, like, I don’t know who I am unless I have, you know, the number one ribbon or trophy staring at me as the first thing I can see when I wake up in my billion-dollar house. To them, that means success, that means they’ve made it, because they’re lacking in all of those other human factors.

I mean, we’ve seen it. It’s not my opinion; we’ve seen, based on their behaviors, that they are not well-rounded humans.

And so their definition of happiness and success is absolute power, regardless of what it means for anybody else, because all they’re thinking about is themselves.

Now, we’ve also seen that this goes really poorly.

We’ve also seen that they are an n of one, and they still have to deal with the billions of people on the planet Earth, who think that their way of doing things is incorrect.

Now, you’re always going to get people who are like, no, they’re not that bad.

It doesn’t impact me.

And that’s fine.

Let those people, you know, Darwin themselves right off the side of the flat Earth, that’s fine.

But then you have more level headed folks like ourselves who are willing to question things.

And asking those questions: Is this right? Is this going to be good in the long term? You don’t have to know the answer.

The fact that you’re questioning it at all means that there is still hope for humanity.

And that’s where this whole conversation started last week, when you and I were having it: you felt that humanity was doomed.

Based on the comments of one single person.

From my perspective, if we are still questioning, is this the right tool? Is this the right approach? Is this the right decision? We’re not that far gone.

Christopher Penn 14:45

So from a practical perspective, what are our next steps? I know from a technological perspective, there are things like open models, where you can download and run these things yourself.

And they’re not in the hands of a small tech oligopoly. That’s a technological good thing: the more democratized technology is, the less any one group of people can control it. But what are the things that we should be thinking about as business leaders, as technology leaders, to re-inject some of this empathy and compassion into the work that we do?

Katie Robbert 15:20

It’s not enough to sit back and complain about it. If we feel passionate that things are going in the wrong direction, we need to take action, we need to get involved.

And that includes getting involved in communities, it includes sitting on boards, it includes, you know, really trying to understand how this thing works and what the long-term impacts are. So think about it in terms of, you know, medical research. A really good example of this is the opioid crisis; I was working right in the middle of all of that, that’s the academic research that I was doing.

Pharmaceutical companies were coming out with these magic cure-all drugs, you know, specifically OxyContin, hydrocodone, Percocet, you can list them all off, that were meant to relieve pain and not be addiction-forming and not be abusable.

Like, they felt like they had come up with the thing.

And the short term research said, Yes, that’s true.

But there was no long term research being done.

That’s where, you know, my team and my company came in: what happens after? And we found, as is very well documented, that these drugs were abusable. It was a whole crisis, one that is still happening in a lot of the world, because these drugs are so habit-forming.

And a lot of times they were given to people who had, you know, a toothache or a torn ACL or something that had nothing to do with becoming addicted to a drug.

But because of the short-term thinking of these pharma companies and of the medical profession, they created a whole society of people who were addicted to pills.

And I feel like we’re going to see the same kind of situation happen with generative AI because of that short-term thinking. If we start to lay everybody off and let generative AI just take over, then that’s what’s going to happen.

Because we don’t know enough about the long term effects.

And so this is where we, as humans, as business leaders, as people who are concerned, have to look at what’s happened historically and say, how do we make sure that doesn’t happen again? Where did it go wrong? We made decisions too quickly, without enough time, without enough information, because we were trying to make as much money as possible.

Because we were trying to be first out of the gate.

And that’s where people like you and I, the concerned citizens, the, you know, concerned data scientists, the concerned analysts, the concerned business leaders, have to push back and get involved and say, we’ve done this. I’ve seen this movie before.

This is how it’s going to end. Are we okay with that ending? And I can say I’m not. I don’t want to go through that again; I don’t want to watch people sort of, you know, fall apart.

And it wasn’t their fault to begin with; a few people made some really short-term decisions because they wanted to make money.

So I would say, you know, asking about next steps is great. We have to keep having the conversation, we have to keep having difficult conversations.

And we have to put people first, technology and money last.

And those are not things that everybody is comfortable doing.

But somebody has to and I’m willing to step up to the plate to do that.

Christopher Penn 18:58

And this is why you run the company.

If you have some thoughts on this topic that you would like to share, pop on over to our free Slack.

Go to trustinsights.ai/analyticsformarketers, where you have over 3,000 other marketers asking and answering those questions every single day, including topics like this. We have a bunch of what we call work-life balance channels that are not just about analytics and AI, but about all the things that we care about.

And wherever it is you watch or listen to the show, if there’s a channel you’d rather have it on instead, go to trustinsights.ai/tipodcast, where you can find us on every major podcasting channel.

And while you’re on your channel of choice, please leave us a rating and review.

It does help to share the show.

Thanks for tuning in.

I will talk to you next time.


Need help with your marketing AI and analytics?

You might also enjoy:

Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
