
In this week’s episode of In-Ear Insights, Katie and Chris tackle the thorny issues of user data privacy and what expectations a company should plan to meet when it comes to protecting users – even when the users make bad choices. Learn the different types of private data – PII, SPI, and PHI – as well as hear about a massive dataset in the wild that probably shouldn’t be.

Sponsor: Knowledge Matters

In-Ear Insights is brought to you by Knowledge Matters.

Do you teach marketing at the collegiate level? Imagine using immersive, visual, interactive marketing simulations designed for today’s digitally-native students.

Besides bringing marketing concepts to life, it's cloud-based and includes automatic grading and easy integration with all the popular collegiate learning management systems such as Canvas, Blackboard, or Moodle.

If you would like to check out case simulations that cover all the key concepts for an Intro to Marketing course like Pricing, Promotion, Market Research, Market Segmentation, and much more, sign up today for a demo at: www.knowledgematters.com/podcast

Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn
In this week’s episode of In-Ear Insights, we are talking about data protection: user responsibilities, vendor responsibilities, and all things related to preventing marketing disasters caused by the data that you are not protecting. To give a bit of context, a little bit of table setting: we were looking at a massive data set released by a consultant who mined completely public data. No API keys needed, no security violations, no hacking. This person pulled 10 gigabytes of Venmo transactions from Venmo’s public logs of what people were paying money for, everything from paying for pizza to paying for intimate relationships with other people. So we want to talk today about the things that are important to keep in mind when you’re working with user data, everything from the marketing aspects to the systems aspects to “gosh, you just shouldn’t have done that.” So Katie, why don’t you kick us off: when you hear about this kind of a data set, what comes to mind?

Katie Robbert
My first thought was: what’s the responsibility of the company to protect their user data? The fact that somebody can, without any sort of security keys, download that type of personally identifying information at that level of detail is super concerning to me. And something you had shared with me was that with Venmo (and it’s not a system that I use), it’s the user who actually has to check certain boxes; the ability to keep their data private is not the default setting. That was just hugely concerning to me. And I understand that as long as companies bury in the fine print of their terms of use and privacy policies, which people are never going to read, that this is the way they operate, then technically it’s all legal. But does that make it right, knowing that the consumer isn’t going to read all of that paperwork when they just want to send money to someone for buying them a pizza that night?

Christopher Penn
So originally with Venmo, I think the idea (and we can debate the wisdom of this idea) was that it was supposed to be a social network for payment, where you could see what your friends were doing, and be able to point out, like, “Hey, you hung out with so-and-so last night and bought, like, three bottles of wine with them. Why wasn’t I invited?” That whole encouraging-FOMO kind of thing. Since PayPal bought them, I don’t know if that design intent is still a priority, but it is definitely still part of the DNA of the app. But you’re right, one of the things that is somewhat concerning is that within this data set, you not only get the transaction data, you also get personally identifiable information: you get first names and last names, you get who was sending the money and who was receiving the money, and then you get the context of the note that the person attaches to the money. Perusing the data set, we saw a couple of examples of, “Oh, this is clearly a payment for intimate services of some kind.” And that’s not the kind of thing you would want to be sharing, certainly not publicly, and I’m not even sure you’d want to be sharing it with your friends, necessarily. I mean, I don’t know, I don’t have those kinds of friends, but maybe there’s a market for it. But it seems like there is much greater risk to the individual and to the organization if that kind of information leaks out.

Katie Robbert
Well, and it spawns a larger conversation, which is: what is the responsibility of the company, and what is the responsibility of the user? To go back to that example of what Venmo was originally designed for: even if the intention was to share socially, you know, with my friends, how many bottles of wine or how many pizzas I had the night before, my understanding as a consumer is that that information is only available to the people I choose to share it with, unless I decide to make it public. And so I think this is where that broader conversation of corporate responsibility versus user responsibility comes in. I am a firm believer that it’s a 50/50, equal responsibility. The company needs to be up front about how the data is collected and how it’s shared, not just bury it in fine print that, especially on an app, you can’t even get to, let alone read and understand. But then the user also has to take active steps to make sure their settings are set up correctly and that they understand what is going to happen with the data. You could have the same conversation about something like Facebook; it doesn’t matter the platform, the conversation of responsibility is still the same. What do you think about that?

Christopher Penn
That was sort of the genesis of GDPR, the General Data Protection Regulation in the EU. One of the things the European Union legislated into GDPR was: look, you can’t have these 48-page end user license agreements with everything buried in fine print. Everything has to be easy enough for the layman to understand, easy enough that a user knows what they’re signing up for. And privacy is the default; privacy is not an add-on; privacy is built in at the core; privacy is the assumption you build your systems with, not the opposite. So in this example, it is kind of curious, because in many ways Venmo, being owned by PayPal, which is a multinational corporation, is designed opposite to what one of the territories it operates in expects and requires of its companies.

Katie Robbert
So, I imagine I already know the answer to this question for you specifically, but as a consumer, how often do you read all of the privacy information before you download and install an app on your phone?

Christopher Penn
It depends on the level of risk. For something where I’m installing, like, a video game, I will just check to see what permissions it’s going to ask for, because one of the nice things in both iOS and Android is that when you install an app, the app has to ask for separate permissions: can I access the microphone? Can I access the camera? Can I access location services? Can I turn on notifications? And so on and so forth. And I’ll check the policy to see why exactly it needs that information. But for the most part, for a low-risk thing like a video game, yeah, I’ll accept the policy, and then guess what, I’m not going to give you permission to do any of those things; I just expect you to entertain me. For something that is higher risk, like a payment app like Venmo, I am going to read it in full, because I want to know what happens to that data. Are you sending this data to the federal government? Actually, any payment organization has an obligation to do so for any transaction over $10,000 in cash, thanks to anti-money laundering laws. But beyond that, I will read those carefully, because anything that involves health data falls into what Google, in SEO, calls “your money or your life”: your money, your health, things like that. Those are apps you definitely want to read the privacy policies for. How about you?

Katie Robbert
I do. And I think that comes from my background in being in a regulatory environment and actually writing those privacy policies and terms of use. So I know what to look for; I can kind of skip around a bit. But again, that’s unique to me. What I find interesting, especially with some of these apps: you mentioned that Android and iOS will say, “I want to access your microphone, I want to do this.” When I get those notifications and I say no, the app just stops working. So they’ve designed it in such a way that they’re asking, but not really; they’re basically telling you, “I’m going to go ahead and do this.” Now, the other thing I find interesting, and I think this is really where companies let consumers down, is that a lot of times you can’t access that privacy document or those terms of use until you’ve signed up for the app and it’s installed on your phone, and then within the settings you can find it as one of the bottom menu items. A lot of times it’s not a clear user experience, and I think that’s part of the problem too. And, you know, for companies, that’s not what they want people to focus on; they just want them to sign up for the thing so they can start using it and collecting data or getting money or whatever the purpose of the app is. But they make it so hard for the user to do their due diligence to make sure that they’re protected. And that, to me, is what bugs me the most about it. That’s what really grinds my gears.

Christopher Penn
If you want to take a baseball bat to their knees, and you have a VPN of some kind: the way the GDPR legislation is written, anybody who is within the European Economic Area, even virtually, is subject to GDPR. Not just EU citizens, anybody. So when I went to Poland to speak, the moment I set foot on European soil and was using a European ISP, GDPR applied to me. So any app that was doing that was in violation of GDPR, and the EU could, if it chose, go after 4% of that company’s revenue. So if you find an app that is especially objectionable, turn on a VPN and set your exit point somewhere within the EU. Now GDPR applies to you and your phone, and you could make a legal case. Let’s finish up today with… oh, sorry, go ahead.

Katie Robbert
Well, it’s interesting, but the only thing I’m thinking as you’re talking about setting up this VPN is: what is the likelihood that my parents are going to do something like that? And I think that, unfortunately, we see people who are less educated about privacy become more susceptible to these different privacy hacks, these data hacks. So, okay, a VPN is one solution. Are there other things people could be doing that don’t require having to understand how a VPN works?

Christopher Penn
Not really. And therein lies the heart of why, for example, California has introduced its own Consumer Privacy Act, and why you’re seeing these different legislative packages coming up across the planet. People, even legislators who are not the most technologically savvy, are realizing that companies are taking advantage of their users in very unscrupulous ways, and so they’re legislating this into law. And the best practice, of course, for any company in this space is to adhere to the strictest standard, and then every other subsequent standard is easy to comply with. So if you are a company that is dealing with user data in this fashion, be GDPR compliant; the rest of the regulations, and everything else, are easy.

Katie Robbert
What about companies like us who aren’t collecting sensitive information?

Unknown Speaker
Or are we?

Katie Robbert
We’re collecting first name and last name, but we’re not collecting bank account numbers. We’re not collecting health information. We’re not collecting Social Security information. And I think that’s a separate conversation, about what PII, PHI, and all of those different terms mean; let’s put a pin in that and come back to it, and if people have questions, feel free to contact us directly. But for marketers, for example, who are just collecting basic contact information that, yes, is still sensitive: where’s that line?

Christopher Penn
So under EU law, there is no difference between SPI and PII, in the sense that there’s no difference between protected health information and basic information. You are expected to treat it all as protected information. You’re expected to encrypt it all, you’re expected to secure it all, and you’re expected to have a data protection officer on staff if you do business, in any capacity, within the European Economic Area.
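To make that concrete, here is a minimal sketch of what “treat it all as protected, encrypt it all” can look like in practice, written in Python with the open-source cryptography package. The record, field names, and in-code key are hypothetical illustrations, not Venmo’s schema or anyone’s production setup; a real system would pull its key from a key management service rather than generating it inline.

```python
# Minimal sketch: field-level encryption of personal data at rest, using
# the open-source "cryptography" package (pip install cryptography).
# The record below is a hypothetical example, not Venmo's actual schema.
from cryptography.fernet import Fernet

# In production, fetch the key from a key management service; generating
# it inline is only for demonstration.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"first_name": "Jane", "last_name": "Doe", "note": "pizza night"}

# Under a GDPR-style standard, treat every personal field as protected,
# not just the obviously sensitive ones.
encrypted = {field: cipher.encrypt(value.encode("utf-8"))
             for field, value in record.items()}

# Decrypt only at the moment of legitimate, logged business need.
print(cipher.decrypt(encrypted["note"]).decode("utf-8"))  # "pizza night"
```

The design point mirrors GDPR’s privacy-by-default principle: every personal field is ciphertext at rest, and plaintext exists only at the moment of legitimate use.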

Unknown Speaker
So far as I know, we’re not in the EU.

Christopher Penn
We are not in the EU, but we do business in the EU by default, because we have a website that is global.

Katie Robbert
But what about companies that don’t do business in the EU and don’t recognize that, by having a website, you technically are global? I think that’s the piece of the education that’s missing.

Christopher Penn
This was a really popular topic last year, and the general consensus is that, say, Ned’s pizza shop in Omaha, Nebraska, which yes, does have a website, sort of, but it’s literally just the pizza menu, and they have an email list, is obviously not going to ship a pizza to Europe. Either way, they are technically in violation of GDPR, but the European Commission is highly unlikely to go after them. It is a question of risk and what your appetite for risk is. For companies that do business in the EU and can basically be sanctioned in an EU court, the risk is high. So Google, Facebook, Venmo, these are companies that are at risk and must be compliant. For a company like us, we are technically at risk, because something like “download this ebook” or “download this white paper” meets the definition in GDPR of doing business in the EU: you’re providing services to people within the European Economic Area. But because it would be fairly difficult to take us to court in the Netherlands, and we have no bank accounts subject to seizure by the EU, our relative risk is low.

Katie Robbert
You know, I remember when GDPR was first introduced, and much like Y2K, people were freaking out. They were hiring specialists, people who only knew about this one thing, as consultants; I’m sure those people cleaned up and made lots of money. But it is something that people still need to educate themselves on, still need to be aware of. And on the flip side, consumers need to be aware of what is happening with their data. So as we wrap up today: do your due diligence. What that means is, make sure you know what’s happening with your information. Don’t just accept the out-of-the-box settings and start using the app or the software or whatever it is; double-check and make sure you understand what all of the settings mean and that they’re something you’re comfortable with. Now, here’s the thing: if you’re totally comfortable with all of your information being out there, you have nothing to lose, you live the YOLO lifestyle, and you don’t care who knows how many pizzas you’ve eaten, because it’s your life and you’re going to live it the way you want, then that’s fine. We’re not saying don’t share your information; we’re just saying be smart about it.

Christopher Penn
Yeah, one thing I want to tackle before we close out, and this applies to us as marketers who are using data, is something you raised in our initial research on the Venmo data: what are the obligations, ethical and otherwise, of data scientists and data-savvy marketers when it comes to information like this? We have, for example, a tremendous amount of information in this 10-gigabyte file from Venmo. What are the things we should be thinking about from an ethics perspective with this data?

Katie Robbert
Ethically, you know, it’s interesting, because it’s sort of: it’s not our business to share that information. However, from a very practical and pragmatic standpoint, the thing that we, and other data scientists, can do is, again, that due diligence. Put yourself in the shoes of the consumer, and go through the information, go through what Venmo, or whatever company, has set up. Make sure the privacy policy and the terms of use clearly state how the data will be used, and make sure you understand the settings from the user perspective, so that it’s very clear how the data was shared. Once that’s all set and everything is aboveboard, the company has not been in any sort of violation of privacy or data sharing, and they very clearly stated, “this is what we are going to do with your data,” then technically, yes, it’s fair game. Now, ethically, should you be sharing that information? It depends: it depends on why you’re sharing it and what the research is used for. If you’re using it as a training data set to understand how to work with that kind of data, but it’s all internal and it’s never really going to see the light of day, then that’s fine. But if you’re sharing it publicly, then you are putting out people’s identities and personal habits that they may not have been aware were about to be shared. So that’s where the ethics come in. I’d say it depends: you can share it provided everything is aboveboard with the company, but think about your reasons why.

Christopher Penn
If this were clinical research data, and in some ways it kind of is, since it’s someone’s personal financial information: what did you do in the medical field? If I recall correctly, medicine requires you to have consent from the patient themselves to publish the data, right?

Katie Robbert
It does; you need consent from the patient. But you also never publish the personally identifying information, definitely not names. Everyone is given a unique ID, and that is a good way to handle it. So really, you’re just publishing on general demographic behavior. And there are rules about that as well, because if certain parts of your demographic, like a zip code, fall below a certain population, then you can’t use it, because it is still possible to identify a person. If they are the only 40-year-old white female living in that area, people will know exactly who it is. So there are still general rules about it. So even as a data scientist using marketing data, like what you’re talking about, you still would have to de-identify the data itself and only publish on the behavior and the demographics, within certain rules, to make sure you’re not violating HIPAA.
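As a rough illustration of the small-cell suppression rule Katie describes, here is a minimal sketch in Python using pandas. The column names, sample values, and the threshold are hypothetical; real minimum cell sizes depend on the governing rules, such as HIPAA’s de-identification guidance.

```python
# Minimal sketch: suppress demographic cells that are too small to publish,
# so no aggregate can single out "the only 40-year-old in that zip code."
# Column names, sample data, and the threshold K are hypothetical; real
# minimum cell sizes come from the governing rules (e.g., HIPAA guidance).
import pandas as pd

df = pd.DataFrame({
    "zip":   ["02134", "02134", "02134", "99999"],
    "age":   [40, 41, 39, 40],
    "spend": [20.0, 35.0, 12.0, 50.0],
})

K = 3  # toy threshold; real-world minimums are typically much larger

# Count how many people fall in each zip code cell.
cell_sizes = df.groupby("zip")["zip"].transform("size")

# Mask any zip code with fewer than K people before publishing.
df.loc[cell_sizes < K, "zip"] = "SUPPRESSED"

# Publish only aggregates over the surviving cells.
print(df.groupby("zip")["spend"].agg(["count", "mean"]))
```

The idea is simply that no published cell should be small enough to point at one person, the lone 40-year-old in a zip code, for example.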

Christopher Penn
Gotcha. So for those folks who are data-savvy marketers, clearly the message is: know what you’re doing when it comes to people’s personal information. If you don’t have consent written down from the end user, anonymize it and de-identify it, so that you are not putting yourself or other people at risk.
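One common way to do the anonymize-and-de-identify step Chris mentions is to replace names with stable pseudonyms via a keyed hash, so analysts can still group records by person without ever seeing a real name. Here is a minimal sketch, assuming a secret key kept out of the published data and out of source control; the key below is a placeholder.

```python
# Minimal sketch: replace personal identifiers with stable, non-reversible
# pseudonyms using a keyed hash (HMAC-SHA256). The key below is a
# hypothetical placeholder; store the real one in a secrets vault, never
# alongside the published data.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-key-from-a-vault"

def pseudonymize(identifier: str) -> str:
    """Return a stable ID for a name: same input, same output, no reversal."""
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# The same person always maps to the same ID, so joins and counts still work.
print(pseudonymize("Jane Doe"))
print(pseudonymize("Jane Doe") == pseudonymize("Jane Doe"))  # True
```

Because the hash is keyed, outsiders can’t confirm a guess like “is this row Jane Doe?” without the key; discard the key and the mapping becomes effectively irreversible.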

Katie Robbert
Well, just to clarify: you need consent, but you need to anonymize it regardless.

Christopher Penn
You know, in a case like the Venmo data set: Venmo had consent, but the person who mined this data, which is public information, did not, and we still don’t have the consent of the Venmo users to use their data.

Katie Robbert
Correct, which is, again, where that ethics conversation comes in. So regardless, best practice is to anonymize and de-identify the data. But also, if you’re going to be publishing Venmo’s data, probably check with them first, because they have policies about what you can do with their data. So probably check with the company you’re getting the data from.

Christopher Penn
Yep. All right, we will wrap up on that note. As always, please subscribe to the Trust Insights podcast if you’re just listening from the blog post, and to our newsletter, which you can find in the show notes below, and we’ll talk to you next time. Take care.


Need help with your marketing data and analytics?


Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, Data in the Headlights. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new 10-minute or less episodes every week.


