
{PODCAST} In-Ear Insights: Customer Data Protection and Privacy

In this week’s episode of In-Ear Insights, Katie and Chris tackle the thorny issues of user data privacy and what expectations a company should plan to meet when it comes to protecting users – even when the users make bad choices. Learn the different types of private data – PII, SPI, and PHI – as well as hear about a massive dataset in the wild that probably shouldn’t be.


Listen to the audio here:

Download the MP3 audio here.

Machine-Generated Transcript

What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.

Christopher Penn: In this week’s episode of In-Ear Insights, we are talking about data protection: user responsibilities, vendor responsibilities, and all things related to preventing marketing disasters because of the data you are or are not protecting.

To give you a little bit of context, we were looking at a massive data set released by a consultant who pulled out 10 billion Venmo transactions from Venmo’s public logs of what people were paying money for. Everything from paying for pizza to paying for intimate relationships with other people. So we want to talk today about the things that are important to keep in mind when you’re working with user data, everything from the marketing aspect to the system aspects, to ‘Gosh, you just shouldn’t have done that.’

So Katie, why don’t you kick us off. When you hear about this kind of data set, what comes to mind?

Katie Robbert: My first thought is the responsibility of the company to protect its user data. The fact that somebody can just download that type of personally identifying information, at that level of detail, without any sort of security keys is super concerning to me. And something you shared with me was that within Venmo, it’s the user who actually has to check certain boxes; keeping their data private is not the default setting. That was just hugely concerning to me. And I understand that as long as companies bury their terms of use and privacy policies in fine print that people are never going to read, then technically it’s all legal. But does that make it right, knowing that the consumer isn’t going to read all of that paperwork when they just want to send money to someone for buying them a pizza that night?

CP: So originally, I think the idea of Venmo was supposed to be a social network for payments, where you could see what your friends were doing and be able to point out that you hung out with so-and-so last night and bought like three bottles of wine with them: “Why wasn’t I invited?” Since PayPal bought them, I don’t know if that design intent is still a priority, but it is definitely still part of the DNA of the app. But you’re right, one of the things that is somewhat concerning is that within this data set, you not only get the transaction data, you also get personally identifiable information. You get first names and last names, you get who was sending the money and who was receiving it. And then you get the context of the note that the person attaches to the payment. Perusing the dataset, we saw a couple of examples where you can clearly see a payment is for intimate services of some kind. And that’s not the kind of thing that you would want to be sharing publicly. I’m not even sure you’d want to be sharing that with your friends. I mean, I don’t know, I don’t have those kinds of friends. Maybe there’s a market for it. But it seems like there is a much greater risk to the individual and to the organization if that kind of information leaks out.

KR: It’s spurred a larger conversation, which is: what is the responsibility of the company, and what is the responsibility of the user? To go back to the example of what Venmo was originally designed for, even if the intention was to share socially with my friends how many bottles of wine or how many pizzas I had the night before, my understanding as a consumer is that that information is only available to the people I choose to share it with, unless I decide to make it public. So I think this is where that broader conversation of corporate responsibility versus user responsibility comes in. And I am a firm believer that it’s a 50/50, equal responsibility. The company needs to be upfront about how the data is collected and how it’s shared, not just bury it in fine print; especially on an app, you often can’t even get to those privacy policies to read and understand them. But then the user also has to take active steps to make sure that their settings are set up correctly and that they understand what is going to happen with the data. You could have the same conversation about something like Facebook; it doesn’t matter the platform, the conversation of responsibility is still the same. What do you think about that?

CP: That was sort of the genesis of GDPR (General Data Protection Regulation) in the EU. One of the things that the European Union legislated into GDPR was that you can’t have these 48-page end user license agreements with everything buried in the fine print. Everything has to be easy enough for the layman to understand, easy enough that a user knows what they’re signing up for. Privacy has to be the default, not an add-on, and privacy has to be the assumption you build your systems with, not the opposite. So this example is kind of curious, because in many ways Venmo, being owned by PayPal, a multinational corporation, is designing opposite to what one of the territories it operates in expects and requires of its companies.

KR: So, as a consumer (I imagine I already know the answer to this question for you specifically), how often are you reading all of the privacy information before you download and install an app on your phone?

CP: It depends on the level of risk. For something I’m installing like a video game, I will just check to see what permissions it is going to ask for, because one of the nice things in both iOS and Android is that when you install an app, the app has to ask for separate permissions: ‘Can I access the microphone?’ ‘Can I access the camera?’ ‘Can I access location services?’ and so on and so forth. And I’ll check the policy to see what exactly they’re going to do with that information. But for the most part, for a low-risk thing, yeah, I’ll accept the policy. And then guess what, I’m not going to give you permission to do any of those things. I just expect you to entertain me.

For something that is higher risk, like a payment app such as Venmo, I am going to read it in full, because I want to know what happens to that data. Are you sending this data to the federal government? Any payment organization has an obligation to do so for any cash transaction over $10,000 (thank you, anti-money-laundering laws), but beyond that, I will read those policies carefully. Anything that involves health data, your money, or your life, you definitely want to read the privacy policies for.

How about you?

KR: I do. And I think that comes from my background in a regulatory environment, actually writing those privacy policies and terms of use. So I know what to look for; I can kind of skip around a bit. But again, that’s unique to me. What I find interesting, especially with some of these apps: you mentioned that Android and iOS will say things like ‘I want to access your microphone.’ When I get those notifications and I say no, the app just stops working. So they’ve designed it in such a way that they’re asking, but not really. They’re basically telling you, ‘I’m going to go ahead and do this.’

The other thing that I find interesting, and I think this is really where companies let consumers down, is that a lot of times you can’t access that privacy document or those terms of use until you’ve signed up for the app and it’s installed on your phone. And then within the settings, you can find it as one of the bottom menu items, but a lot of times it’s not a clear user experience. And I think that’s part of the problem, too. For companies, that’s not what they want people to focus on; they just want them to sign up for the thing so that they can start using it and collecting data, or getting money, or whatever the purpose of the app is. But they make it so hard for the user to do their due diligence and make sure that they’re protected, and that is what bugs me the most about it. That’s what really grinds my gears.

CP: If you want to take a baseball bat to their knees, and you have a VPN of some kind: the way GDPR legislation is written, anybody who is within the European Economic Area, even virtually, is subject to GDPR; not just EU citizens, anybody. So when I went to Poland to speak, the moment I set foot on European soil and was using a European ISP, GDPR applied to me. So any app doing what you describe was in violation of GDPR, and you could, if you chose to, go after up to 4% of that company’s revenue. So if you find an app that is especially objectionable, turn on a VPN and set your exit point somewhere within the EU. Now GDPR applies to you and your phone, and you could make a legal case.

KR: You know, the only thing I’m thinking as you’re talking about setting up this VPN is what is the likelihood that my parents are going to do something like that? I think that unfortunately, we see people who are less educated about privacy become more susceptible to these different privacy hacks, these data hacks. So VPN is one solution. Are there other things that people could be doing, that don’t require having to understand how a VPN works?

CP: Not really. And that lies at the heart of why, for example, California has introduced its own Consumer Privacy Act, and why you’re seeing these different legislative packages coming up across the planet. People are realizing, even legislators who are not the most technologically savvy, that companies are taking advantage of their users in very unscrupulous ways, so they’re legislating protections into law. And the best practice, of course, for any company in the space is to adhere to the strictest standard; then every other standard is easy to comply with. So if you are a company dealing with user data in this fashion, be GDPR-compliant and everything else is easy.

KR: What about companies like us who aren’t collecting sensitive information?

CP: But we are.

KR: We’re collecting first and last names, but we’re not collecting bank account numbers. We’re not collecting health information. We’re not collecting social security numbers. I think that’s a separate conversation about what PII, PHI, and all of those different terms mean, so let’s put a pin in that and come back to it. If people have questions, feel free to contact us directly. But for marketers, for example, who are just collecting basic contact information that yes, is still sensitive, where’s the line?

CP: Under EU law, there is no difference between SPI and PII; there is no difference between sensitive personal information and basic contact information. You are expected to treat it all as protected information: you’re expected to encrypt it all, you’re expected to secure it all, and you’re expected to have a data protection officer on staff if you do business, in any capacity, within the European Economic Area.
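
To make “treat it all as protected” concrete, here is a minimal sketch, in Python, of encrypting PII fields at rest so that a leaked table exposes ciphertext rather than names and emails. Nothing in this sketch comes from the episode; the field names, the record shape, and the use of the cryptography package’s Fernet recipe are illustrative assumptions.

```python
# Minimal sketch: encrypt PII fields before storage so a leaked table
# exposes ciphertext, not names and emails. Field names are hypothetical.
from cryptography.fernet import Fernet

# In production, the key would live in a secrets manager, never in code.
key = Fernet.generate_key()
cipher = Fernet(key)

PII_FIELDS = {"first_name", "last_name", "email"}

def protect_record(record: dict) -> dict:
    """Return a copy of the record with PII fields encrypted."""
    return {
        field: cipher.encrypt(str(value).encode()).decode()
        if field in PII_FIELDS else value
        for field, value in record.items()
    }

record = {"first_name": "Ned", "last_name": "Example",
          "email": "ned@example.com", "amount": 12.50}
print(protect_record(record))
```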

KR: We’re not in the EU.

CP: We are not in the EU. But we do business in the EU by default, because we have a website that is global.

KR: But what about companies that don’t do business in the EU and don’t recognize that, by having a website, they technically are global? I think that’s the piece of the education that’s missing.

CP: This was a really popular topic last year, and the general consensus is that a place like Ned’s pizza shop in Omaha, Nebraska, which does have a website, but it’s literally just the pizza menu and they’re obviously not going to ship a pizza to Europe, is technically in violation of GDPR. But the European Commission is highly unlikely to go after them. It is a question of risk and what your appetite for risk is.

For companies that do business in the EU and can actually be sanctioned in an EU court, the risk is high. So Google, Facebook, and Venmo are companies that are at risk and must be compliant. For a company like us, we are technically at risk, because something like ‘Download this ebook’ meets the definition in GDPR of doing business in the EU; you’re providing services to EU citizens and people within the European Economic Area. But because it would be fairly difficult to take us to court in the Netherlands, and we have no bank accounts that are subject to seizure by the EU, our relative risk is low.

KR: I remember when GDPR was first introduced and, much like Y2K, people were freaking out. They were hiring specialists, and consultants who knew about only this one thing, I’m sure, cleaned up and made lots of money. But it is something that people still need to educate themselves on; they still need to be aware of it.

On the flip side, consumers need to be aware of what is happening with their data. So as we wrap up today, do your due diligence. What that means is, make sure you know what’s happening with your information. Make sure you don’t just start using the app or the software or whatever it is. Double-check and make sure you understand what all of the settings mean so that it becomes something you’re comfortable with. Now, here’s the thing, if you’re totally comfortable with all of your information being out there, you have nothing to lose, you live the YOLO lifestyle, and you don’t care how many pizzas you’ve eaten, because you know it’s your life and you’re going to live it the way you want, then that’s fine. We’re not saying don’t share your information, we’re just saying be smart about it.

CP: Yeah. One thing I want to tackle before we close out, and this is something that applies to us as marketers who are using data and something you raised in our research about the Venmo data: what are the obligations of data scientists and data-savvy marketers when it comes to information like this? We have, for example, a tremendous amount of information in this 10 GB file from Venmo. What are the things that we should be thinking about from an ethics perspective with this data?

KR: From an ethics standpoint, it’s not our business to share that information. However, from a very practical and pragmatic standpoint, the thing that we and other data scientists can do is, again, that due diligence. Put yourself in the shoes of the consumer, and go through the information. Go through what Venmo, or whatever company, has set up. Make sure that the privacy policy and the terms of use clearly state how the data will be used. Make sure you understand the settings from the user’s perspective so that it’s very clear how the data is shared. Once that’s all set and everything is aboveboard, meaning the company has not been in any sort of violation of privacy or data sharing and has very clearly stated, “This is what we are going to do with your data,” then technically, yes, it’s fair game.

Now ethically, should you be sharing that information? It depends on why you’re sharing it and what the research is used for. If you’re using it as a training data set to understand how to work with that kind of data, but it’s all internal and it’s never really going to see the light of day, then that’s fine. But if you’re sharing it publicly, then you are putting people’s identities and personal habits out there that they may not have been aware were about to be shared. That’s where the ethics come in. You can share it, provided everything is aboveboard with the company, but think about your reasons why.

CP: If this were clinical research data, and in some ways it kind of is, since it’s someone’s personal financial information, what did you do in the medical field? If I recall correctly, medicine requires you to have consent from the patient themselves to publish the data, right?

KR: It does. You need consent from the patient. But you also never publish the personally identifying information, definitely not names. Everyone is given a unique ID, and that is a good way to handle it. So really, you’re just publishing general demographic behavior. And there are rules about that as well, because if certain parts of your demographics, like a zip code, fall below a certain population size, you can’t use them; it is still possible to identify the person. If they are the only 40-year-old white female living in that area, people will know exactly who it is. So there are still general rules about it. Even as a data scientist using marketing data, like what you’re talking about, you would still have to de-identify the data itself and only publish the behavioral and demographic attributes, within certain rules, to make sure you’re not violating HIPAA.
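
As a rough illustration of the two rules Katie describes, replacing names with a unique ID and suppressing demographic cells whose population is too small to hide in, here is a minimal Python sketch. The population figures, the threshold, and the salt handling are all hypothetical assumptions; a real pipeline would use a keyed HMAC with a securely stored secret.

```python
# Minimal sketch of de-identification: pseudonymous IDs plus
# small-population suppression. All figures here are hypothetical.
import hashlib

POPULATION = {"02134": 18_000, "59011": 900}  # hypothetical zip populations
MIN_CELL_SIZE = 1_000  # suppress any zip code with fewer residents

def pseudonym(first: str, last: str, salt: str = "rotate-me") -> str:
    """Derive a stable, non-reversible ID from a name. In practice, use a
    keyed HMAC and keep the secret out of source code."""
    return hashlib.sha256(f"{salt}:{first}:{last}".encode()).hexdigest()[:12]

def deidentify(record: dict) -> dict:
    """Strip direct identifiers and suppress small demographic cells."""
    zip_ok = POPULATION.get(record["zip"], 0) >= MIN_CELL_SIZE
    return {
        "subject_id": pseudonym(record["first_name"], record["last_name"]),
        "zip": record["zip"] if zip_ok else "SUPPRESSED",
        "amount": record["amount"],
    }

print(deidentify({"first_name": "Ada", "last_name": "Lovelace",
                  "zip": "59011", "amount": 25.0}))
```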

CP: Gotcha. So for those folks who are data-savvy marketers, clearly the message is: know what you’re doing when it comes to people’s personal information. If you don’t have written consent from the end user, anonymize and de-identify the data so that you are not putting yourself or others at risk.

KR: Just to clarify, you need consent, but you need to anonymize it regardless.

CP: In a case like the Venmo data set, you do not have consent. The person who mined this data pulled public information, but that still doesn’t mean we have the consent of Venmo’s users to use their data.

KR: Correct. This is, again, where that ethics conversation comes in. So regardless, the best practice is to anonymize and de-identify the data. But also, if you’re going to be publishing Venmo’s data, then check with them to see if they have policies about what to do with it. So probably check with the company you’re getting the data from.

CP: Alright, we will wrap up on that note. As always, please subscribe to the Trust Insights podcast; well, you’re listening to it, so subscribe to it. And to our newsletter, which you can find in the notes below. We’ll talk to you next time. Take care.



Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, INBOX INSIGHTS. Subscribe now for free; new issues every Wednesday!

Click here to subscribe now »

Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new episodes every Wednesday.
