So What? Marketing Analytics and Insights Live
airs every Thursday at 1 pm EST.
In this week’s episode of So What? we focus on vetting source credibility. We walk through what credibility means, what questions to ask, and what to do if you don’t have a credible source. Catch the replay here:
In this episode you’ll learn:
- how to determine conflicts of interest
- how to use your own data to determine trends
- how to use credible sources
Have a question or topic you’d like to see us cover? Reach out here: https://www.trustinsights.ai/resources/so-what-the-marketing-analytics-and-insights-show/
Katie Robbert 0:17
Well, hi there, welcome to So What?, the marketing analytics and insights live show. Happy Thursday! I'm joined by Chris, and for once in Massachusetts, it is not raining. So that is a big deal, although I believe it's going to rain all weekend. So here we are. In today's show, we're talking about vetting source credibility. We're going to talk through tips to determine if there's a conflict of interest, meaning that someone is posting something that's promotional but they're not necessarily disclosing it, and making sure you're staying away from using that kind of information; using your own data to determine some trends; and how to use credible sources, the so what of it all. So Chris, where do you want to start?
Christopher Penn 1:04
I think, actually, in some ways a good place to start is with the so what of it, which is that if you're making decisions with data, and you're getting data from third parties, meaning it's not your own, you have to know what ingredients you're working with. Where do they come from? Are they credible? And by credible, I think we mean a few things: is it from a source that you can trust? Is the data you're working with correct? Is the data fresh? Like you said, Katie, is it conflicted? And do you know where that data came from, sort of the provenance, the lineage? Katie, you spent a number of years in healthcare. When you were working with third-party data, how did you decide, yes, this data is credible, or no, this data is not?
Katie Robbert 1:58
So I worked in pharmaceutical and clinical trial research, and a lot of the data that we used to supplement what we were doing was government data, things like SAMHSA and SEDS. And those are self-report data sources, if I remember correctly; I'm now going back over a decade, but they're self-report. The idea is that they weren't out surveying the different clinics or the substance abuse counselors; those clinics were self-reporting. Basically, they had to fill out information and submit it once a year or once a quarter. So that methodology, in some ways, takes out the "I have an agenda," because the governing organization that's publishing the data theoretically shouldn't be changing it, shouldn't be influencing the information. Where we did start to run into issues was if we were running a commercial survey, a survey that was sponsored by a pharmaceutical company. We had to be very clear about disclosing who was sponsoring the information, because obviously the pharmaceutical company had an agenda for the information they wanted to collect. They wanted to collect information specifically about their thing, and they wanted it to be a certain way. I mean, we run into these issues at pretty much any company that's running some sort of consumer survey; they're doing it because either they want to know information, or they want to collect information that they can then turn around and share publicly that paints them in a good light. And that's not always what you get back. So that's one of the challenges of sponsored content or sponsored surveys: the person sponsoring it a lot of times has, whether it's conscious or unconscious, an agenda for how they want the information to turn out. And that was one of the challenges with pharmaceutical-sponsored surveys: people didn't trust them.
They're like, well, Big Pharma is asking you to collect this information from me, so why do I want to tell you anything?
Christopher Penn 4:29
Hmm, gotcha. Yeah, I mean, certainly in the last 16 months we've all had a chance, you know, collectively, to evaluate new information and try to decide, can we believe it? Right? There's no shortage of people who are sharing really bad information, particularly around the pandemic. There's no shortage of people who are sharing really good information, too. And I was giving some thought to this as we were getting ready for this show, and I came up with a shortlist of about five things that I know I do personally; I'd love to have you weigh in on this. Five areas, I think, make up credibility. One is the credentials, both of the individual and of the sponsoring company. Like, when Trust Insights publishes content in our newsletter, we are the institutional sponsor of the thing. And you ask questions like, you know, is it a known entity? Or, I guess more crassly, can you sue this company? If we give out bad advice, can you sue us? The answer is yes; we exist, you can look us up in a business directory, find our IRS tax number, all that stuff. And then, at the person level, does the person have authentic, verifiable credentials of some kind? Like, when we're looking at people sharing information about a brand-new virus: is the person sharing this a virologist? Is this, you know, their life's work? Or is it some random guy on Facebook, or some YouTube celebrity who is popular but doesn't have any actual expertise in the topic? So I think that's the first area anytime you evaluate any source: look at the credentials of the individual and the institution behind the individual.
Katie Robbert 6:08
So in the example you give of someone sharing information about a virus, the person should have some sort of, you know, medical background; that would probably be helpful. That's pretty straightforward. It's not as straightforward in, say, the marketing industry. A lot of marketers don't have a marketing degree, because you don't necessarily need a marketing degree in order to do marketing. You know, my undergrad is a communications degree, specifically TV and film, which has literally nothing to do with marketing. So one could look at my degree and go, well, why is she telling me anything? She doesn't have her master's in business, she doesn't have a marketing degree, so I can't take anything she says seriously. How do you reconcile that situation?
Christopher Penn 6:59
That's a really good point. And that's where being well regarded by other experts in the field is helpful. When you survey and ask people, you know, what are this person's affiliations? Are they affiliated with an institution, a company of some kind, that at least, you know, exists and appears credible? It's one of the reasons why you will see, whenever a marketer is talking, lists like, the clients we've worked with: we've worked with T-Mobile, we've worked with AT&T, and so on and so forth. Because that badge value essentially says all these other organizations trusted this person, this company, to do work for them. Whether or not they did good work, at least they had that experience. But there is that whole aspect of being well regarded by other experts. There are certainly a lot of things you can do, like, you know, write a book, things that create those additional heuristics. But one of the clear signs to me is almost like Six Degrees of Kevin Bacon, right? This person is connected to this person is connected to this person; well, if the person at the start of the chain is a known trustworthy person, how many degrees of connection, and how strong are those degrees, to say, yeah, this person is probably trustworthy? So, you know, a silly example: there are a number of people who will begrudgingly say, okay, I at least know what I'm doing within a certain area, right? And you're affiliated with me, and TrustInsights.ai is affiliated with me, and so there's one degree of separation between the two of us. So by that same extension, you have that credibility if someone's tracing that lineage through me. Or they trace that lineage through John Wall, right? You know, they're a fan of Marketing Over Coffee, and they see that John works with Trust Insights, and you work at TrustInsights.ai. Again, they're following that lineage.
So there are cases where there isn't a clear academic credential or some kind of certification, and that lineage does help. The other thing to point out is that there certainly are other certifications, like, you know, Google Analytics certified professional, Google Ads certified professional, HubSpot certified professional, in the absence of other markers. Those show that at least you took the time and effort to get them. Even if your opinion of HubSpot's or Google's or whoever's certification is, you know, maybe just okay, it exists. I don't know how much faith to put in it, but at least the person made the effort, as opposed to somebody who hasn't.
Katie Robbert 9:27
Fun fact, I am one degree from Kevin Bacon, but that’s probably for another time.
Christopher Penn 9:34
It would be fun if we did, like at the end of the year, a ridiculous random live show that has nothing to do with marketing.
Katie Robbert 9:44
That would be an excellent idea. But to your point, Chris, I think that makes sense, because it's interesting: what you're describing works two ways. One is credibility by association; the other is guilt by association. On the credibility piece of it: following that example, Chris, you are well known, have certifications, and are considered, in the marketing space, an authority on machine learning, artificial intelligence, and those kinds of topics. So by association, when I speak about it or talk about it, I'm borrowing your credibility to build my own credibility, until I'm standing on my own two feet in that space, at least from other people's perception. The flip side of that is, let's say people find out (this is theoretical, this is not true, let's be clear, full disclaimer, none of this is true), let's say people find out that Chris Penn is a total fraud and knows nothing about artificial intelligence, and he has been making it up based on that Haley Joel Osment movie that he saw. Let's say that's the case. If I then start speaking about artificial intelligence, I'm guilty by association, because I haven't done my due diligence to find out if Chris is actually, truly credible.
Christopher Penn 11:07
Exactly. And this is where that "well regarded by experts" piece is important. When you have a variety of people who all say, yes, this person at least knows what they're doing, you know, that's important, because, as with anything involving humans, there are certainly cliques of people. Like, you ask this group, and that person is the bomb, and you ask another group, and that person's a total, whatever.
Katie Robbert 11:34
And you have the people who can do no wrong.
Christopher Penn 11:36
Exactly. But in general, if you were to panel, you know, 10 or 15 of these folks, even the folks who are not in those circles would say, yeah, they know what they're doing. You know, I might not grab drinks with them or whatever, but they know what they're doing, they're okay.
Katie Robbert 11:52
One way to test that is, you know, amongst your Slack communities or Facebook communities, or wherever else you have those kinds of communities, you can just ask the question: who do you know who's an expert on X? And see if you start to get similar names. If you say, in a Slack community, who do you know who's an expert on Google Analytics, or who do you know who's an expert on machine learning and marketing, you would naturally assume that Chris Penn's name would come up.
Christopher Penn 12:28
Hopefully. So that's one way. Number two is: is the source that you're working with presenting factually correct data and information? There are two important aspects to that. One, if it's a more formal publication (again, this goes back to academia), there's probably been a peer review process, where other people who are also independent have looked at it and said, yep, this passes review, you didn't obviously screw up. But to your point, Katie, a lot of marketing is not academic. So the second thing to look at, if this is a source that you're considering following, a blog you want to follow, is stuff that you already know is right. So if someone's writing a blog about, we'll use machine learning, but they've also got blog posts about Google Analytics, and you know Google Analytics, you can read their stuff and go, okay, they at least know their stuff within the domain of Google Analytics, so I can feel somewhat comfortable that they've probably done similar levels of homework on machine learning. On the other hand, if you read their blog and it's 2021 and it says, for Google Analytics, just install the ga.js tracking code on your website? Yeah, that stopped being used seven years ago, so this person is not presenting factually correct information anymore. That would be number two in terms of evaluating a source.
Katie Robbert 13:51
You know, I think that's a really good point if you're not looking at academic papers. I can't even recall how many times someone has shared an article with me and said, you know, here's the latest and greatest, and the first thing we notice is, well, this was published in 2016. How is this the latest and greatest? Or even, you know, if you think about how fast moving things are: one of the things I was working on with clients, maybe about two months ago, was some of the changes to Facebook Business Manager. And we had a really hard time finding up-to-date information that answered the questions. Part of that was because nobody knew the answer, because Facebook wasn't really sharing the information, and they made it really difficult. But the information we were finding, once we dug into it, once I was really reading through it, was speculation. A lot of it was published in December of 2020, and once you really dug in, it was, this is what I think is going to happen, and this is what might be, but it's not out there yet, so we don't really know. So make sure you're not just, you know, reading the title of an article and then sharing it, but really reading it thoroughly. Because you could be sharing something that then makes you look like you're not credible, because, again, you need to do your homework and make sure that you're not sharing information that is theoretical rather than something that's actually happened, factual and up to date.
Christopher Penn 15:20
Exactly, which is point number three: are things up to date or not? Again, anytime you're working with data, data is like fruit: when it's fresh, it's better. A lot of the time, older data can be really hazardous, especially right now. We're 16, 17 months into a pandemic that has changed everything about life. If you're still working substantially with data from prior to March 2020, and making important decisions based on it without looking at the data since then, you're at risk. Being current is important. The fourth is: are you working with a source that's conflicted in some way? This is where we get into some interesting stuff: the things that we're looking for are material conflicts of interest. Real simple example: when I publish my personal newsletter, I have to disclose the things I'm earning money on. Like, hey, when I share a link to, say, Agorapulse, I'm an ambassador of theirs. They haven't sent me money, but they do send me swag. So there is a conflict of interest, because they're giving me something in expectation of something else. But the important part, I think, is that "material" clause: is it a conflict of interest that would negate the data source? Like what you were saying earlier, when a pharma company is sponsoring a study, it's like a cigarette company sponsoring a study that says smoking isn't harmful: there's a material conflict of interest.
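Chris's freshness point can be turned into a quick automated check. The sketch below is a minimal illustration, not anything from the show itself; the dates and rates are made up, and the only assumption is that your data comes as dated observations:

```python
from datetime import date

# Hypothetical dataset: monthly open rates paired with their observation dates.
observations = [
    (date(2019, 6, 1), 0.21),
    (date(2019, 12, 1), 0.22),
    (date(2020, 6, 1), 0.18),
    (date(2021, 3, 1), 0.19),
]

# Flag anything observed before the March 2020 break point Chris mentions:
# decisions made on that data ignore everything the pandemic changed.
cutoff = date(2020, 3, 1)
stale = [obs for obs in observations if obs[0] < cutoff]
stale_share = len(stale) / len(observations)
print(f"{stale_share:.0%} of observations predate March 2020")  # 50%
```

The same pattern works for any dataset with a timestamp: pick the cutoff that matches the decision you're making, and report how much of the data is older than it before you trust the averages.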
Katie Robbert 16:53
So, a really good example: I actually ran into this the other day. One of the things that we're doing right now at Trust Insights is reviewing different project management tools. There's a whole host of them out there, and I've been asking our community what they like best. And so, of course, naturally, I was googling, like, best tools for productivity, best project management tools for collaboration, those kinds of things. And what I got back were articles authored by one of the many software tools out there, giving that list of "here are the top 10 best project management tools." And I'd be like, oh, this should be interesting. Wouldn't they say that they are the best? And lo and behold, the articles, while not obvious about it, were written in such a way that painted their tool in the best light. They would review other tools; they wouldn't, you know, tear them down and slam them. But none of the tools lived up to all of the really great things that the author's tool did. And so that is a not-as-obvious conflict of interest, other than the fact that the company wrote the review of all of the other tools, including their own, so you can sort of make that natural leap to, well, they're probably going to paint theirs as the best. Very few companies are going to go out there and be like, yeah, ours isn't the greatest; maybe it's not for you.
Christopher Penn 18:20
Exactly. And the marker that I look for, and again, this comes from having now read, you know, months and months of pandemic-related literature and studies, is a very specific thing with disclosures: what disclosures does any piece of content have? And, you know, I'll give you a real simple example. If you look in the Trust Insights newsletter every week, there's a piece of data of some kind, and at the end are the sentences: we are the sole sponsor of the study, we neither gave nor received compensation for the data used beyond applicable service fees, and we declare no competing interests. Which, again, comes from more academic literature. But we're basically saying, we're presenting something here, and we don't have a material conflict of interest for this data. If we were presenting something like "what's the best analytics consulting firm," that's a competing interest; we would have to declare there's an obvious competing interest here. When you're looking at all these different articles and things, look for those disclosures. The ideal is to see a disclosure section no matter what, even if it just says "disclosures: no competing interests," because, again, you see that a lot in academia: we've declared there aren't any issues. But if you're reading something and there isn't a disclosure, you have to wonder, as opposed to it being explicit. So, you know, a takeaway for the marketers: if you want your stuff to be maximally trustworthy, make sure there's always a disclosure section, so that you can say, yeah, I don't have a conflict of interest here.
Katie Robbert 19:50
The other thing to think about, and one of the things that we pride ourselves on, Chris (we actually get asked this question a lot), is sort of: who are you partnered with? Which vendors, you know, do you have exclusive relationships with? At Trust Insights, we are vendor, software, and platform agnostic, meaning that we're not partial to any one piece of software. We have the pieces of software that we use, but we don't use them exclusively; we will use other tools, or we will promote other tools, if they make sense for the situation. And we always really try to do that research to make sure that not only are we using the tool that makes sense, whether it's the most popular tool or just the tool that does the job the best, but also, when we're recommending those tools to our clients and to our communities, we try to take into consideration the business requirements from the person: is it cost, is it this, is it that, sort of whatever the other things are, and find them the right thing that fits that situation. And so when you take that to the conversation we're having about credible sources: think about the organization that's publishing the information. You know, are they known to be agnostic to all of the other things? Or do you know that (again, theoretical, not true at all) Trust Insights is pocketing money from one specific tool, and therefore they're always promoting that one thing? To Chris's point, that needs to be mentioned in the disclosures and methodology, you know: Trust Insights partnered with X company to publish this research. If that's not disclosed, then we are acting unethically on our side.
Christopher Penn 21:45
Exactly. And the last category is citation, which is a fancy way of saying the ingredients: where did this stuff come from? And to that point, Katie, you're exactly right, the methodology statement: where did you get the data from? There are any number of studies out there that will say, hey, here's what we found. Great, where did you get that data from? I'm doing stuff right now with a survey for my mailing list, right? The data came from me, and then I have to acknowledge the biases of the people who are on my list. Because, for example, if I'm making broad statements about how marketers feel about analytics, well, my audience is inherently biased towards people who are more analytical; therefore, I can't make a broad assumption that what those people say is true of everyone. So, shall we do this? Because I think this would be kind of fun: let's take a look at a study and ask. And if you're watching and you want to leave some comments about how credible you think this is, go ahead and feel free to, on the platform you're watching on. So this is from MailChimp; this is email marketing benchmark statistics by industry. We can scroll through here, and they list a whole bunch of these different things by industry: the open rates, click-through rates, and things like that. As we keep scrolling, they say the average open rate for all industries was 21%. How do you improve your open rates? Which is their version of the so what. The click-through rate is 2.62%; again, the so what. And that's the end. So, on this very quick tour, Katie, how would you rate the credibility of this based on our five factors?
Katie Robbert 23:31
So my gut reaction is: oh, MailChimp is publishing data on email marketing; that is what they do. Therefore, my assumption is that they have some sort of agenda to say, if you use MailChimp as a product, this is what you're going to get. Again, that's an assumption; I don't know that that's true, because what's missing is any kind of disclosure or methodology. My assumption is that they're only using their own data, and MailChimp does not own the market in terms of email marketing; they don't cover everybody. Therefore, the data is just a segment of the market, just their segment. So really, what they should say is: email marketing benchmarks and statistics by industry, for industries that use MailChimp's services.
Christopher Penn 24:24
Exactly right. Because MailChimp has a reputation, deservedly, for having a lot of small business customers; you can sign up for free and do all the stuff, which is going to skew the population. If you're looking for, say, enterprise email marketing data, this probably isn't it. But to your point, there is definitely a potential conflict of interest, and there's absolutely no citation whatsoever, other than that brief line that says, you know, it's from MailChimp users. Well, great; there's a whole host of issues with that. Is it current? Yes, at least the page is relatively recently published. Is the data correct? This is an important one: we don't know. We don't know whether it's correct or not. We do know what our own open and click-through rates are from our own stuff, so if we could find our industry, we could see whether it somewhat matches up to reality. And there is an institution, there is a company behind it of some kind, so at least it's not some random blog that we couldn't trace. But in general, we're at, you know, two and a half strikes out of five here on this. So if you were to say, okay, I now need to go benchmark my email marketing program, and we're getting less than a 2.62% click-through rate, we're screwing up, right? That's the message a lot of people would take away from this: if your click-through rate isn't that high, you failed, right? As if that's the case.
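The five-factor evaluation being applied here can be captured as a simple scorecard. This is our own illustrative sketch, not an official rubric from the show; the names and scores below just mirror the read being given at this point in the discussion:

```python
# Five credibility factors from the episode, scored per the discussion:
# 1.0 = passes, 0.5 = can't tell, 0.0 = a strike against the source.
factors = {
    "credentials": 1.0,    # a real, findable company stands behind it
    "correct": 0.5,        # unverifiable, but comparable against our own numbers
    "current": 1.0,        # the page itself is recently published
    "conflict_free": 0.0,  # a vendor reporting on its own product category
    "cited": 0.0,          # no methodology or citation at all
}

score = sum(factors.values())
print(f"{score} out of {len(factors)}")  # 2.5 out of 5
```

A scorecard like this makes revisions cheap: when a factor turns out to be worse than it first appeared, it's a one-line change and the total updates itself.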
Katie Robbert 25:51
It's not the case, and that's problematic. It works if you and all of your customers are also MailChimp users, because then you fall closer into the right set of information. And Chris, to go back: you said that this is current. We know that the article publication date is current. But because there's no methodology statement, we don't know if the data itself is current; we don't know how much data they used, how much data they sampled. And here it is: "The data provided on this page was last updated in October 2019 and may vary from benchmarking data provided within the MailChimp application." So, October 2019, in the digital marketing world, is actually pretty old.
Christopher Penn 26:39
It's pre-pandemic. So actually, yeah, we'll add a third strike for not being current.
Katie Robbert 26:43
It's pre-pandemic, but pandemic or not, in the digital marketing world it's old, because the expectation is that you should be able to collect near real-time information and analyze it just as quickly. Data that's two years old, as a CEO, doesn't help me; I can't make a decision with it. And that's the problem when you work from sort of an academic standpoint, because the data, the way it's collected, does come in a lot slower. They may have made advances since then, but in general, government databases work a lot slower. Digital marketing databases have a different set of expectations, and they should be near real time.
Christopher Penn 27:24
And so we're now at essentially three strikes out of five. Now, here's the challenge: this is the number one search result for email marketing benchmarks. So there are a lot of people who view this as credible, a lot of people who will click on this and just kind of run with it, because they haven't questioned the data. So Katie, the question I would ask you is: okay, in a situation where you need data, and you don't have a credible source, what do we do?
Katie Robbert 27:53
It kind of puts you in a bind. But let's say that article is the latest and greatest published information. You can use it, with really big caveats. So you can say, you know, we found some places to start, and then create a plan to maybe collect your own data, or work with a third-party institution that doesn't do their own email marketing but can collect information and data on email marketing, if you really need those benchmarks. So you can absolutely use that information that MailChimp published, as long as you disclose what it is, where you got it, and that it's a small segment of the email marketing population; it doesn't represent all email marketers. As long as you are doing your due diligence with disclosures, then you should be fine.
Christopher Penn 28:48
I would go back to something you said earlier, which is: just use your own data. If the reason you're asking for these benchmarks is to figure out how you're doing, in some ways, I would argue it doesn't really matter what the industry benchmark is. What matters is: is your data improving over time? If your click-through rate is 7%, so you're roughly three times above the industry benchmark, but it's been on the decline for seven months, forget the benchmark: you've got a problem you need to fix, because you've been declining for seven months. On the other hand, if your click-through rate is 0.1%, but you've doubled it every month for the last three months, keep doing what you're doing, because if you can maintain that level of growth, you're going to do great. And so, to your point, Katie: focus on the data you can trust, which is your own data in those cases, and work on improving it. It doesn't really matter what the competition's doing; what matters is that you're getting results out of the marketing you're trying to do.
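Chris's trend-over-benchmark argument is easy to check against your own numbers. A minimal sketch, with made-up monthly send and click counts standing in for your own email platform's export:

```python
# Hypothetical monthly email stats for your own list: (sends, clicks).
monthly = [(10_000, 10), (10_000, 20), (10_000, 40), (10_000, 80)]

# Your own click-through rate each month (starts at a "low" 0.1%).
ctr = [clicks / sends for sends, clicks in monthly]

# Month-over-month growth factor: what matters is whether YOUR trend
# is improving, not how you compare against an external benchmark.
growth = [later / earlier for earlier, later in zip(ctr, ctr[1:])]
print(growth)  # [2.0, 2.0, 2.0]
```

Here the rate is doubling every month even though every single value sits far below the 2.62% benchmark in the article; run the same arithmetic on a declining 7% and the conclusion flips, which is exactly the point.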
Katie Robbert 29:46
The analogy that probably suits this best, because I know, Chris, you really like analogies to sort of drive the point home: if you are working out and training for something. As we may have talked about in other episodes, I do training on the Peloton; I've definitely drunk the Peloton Kool-Aid, but they are not paying me to say any of this, so that's my disclosure. I mean, that would be lovely, but again, different episode. The point being, they have this program called Power Zones, and I've been working out alongside one of my friends, and her power zones are very different from mine. Every time you take a class, one of the first things the instructor says is: don't pay attention to the zones that your mother, brother, best friend, or neighbor have; they have nothing to do with you. Your zones are going to be unique to you, because there are so many variables: who you are, how you approach it, what you do, your strength, your this, your that, such that you can't make that one-to-one comparison. So to drive home your point, Chris, the same thing is true of industry benchmarks. It's not as straightforward as "this happened, therefore it's the benchmark"; there are so many other factors that don't get considered or brought in. And, you know, one of the things you said about MailChimp is they tend to attract a lot of small businesses. Well, they're representing this as if it speaks for enterprise-sized companies, but maybe that's not true; we don't know, because we don't have that methodology statement. And so you're right, Chris: focus on your information, your benchmarks. You may get asked by the C-suite, well, what's the industry benchmark, and how are we performing against it? It's an opportunity for you to politely and respectfully push back and say, who cares?
Christopher Penn 31:42
Exactly. So, putting this into action: if you're trying to publish your own data as a content marketing tool, please make sure that you're following the five rules. Put your credentials and your organization's credentials up front, so people can quickly go, oh, I know why you're publishing this, and I know that you have some authority to be talking about it. Make sure your data is correct. Please, please, please. Even if you're not a formal academic institution, there is something to be said for an informal peer review; share your data with somebody, under NDA if you have to, and say, hey, could you look at this and sniff test it? One of the things that we do a ton at the office is, you know, Katie, you are constantly looking over my data, my code, etc., and going, hey, this doesn't pass the sniff test; something's wrong here.
Katie Robbert 32:32
Yeah, that's peer review. It doesn't have to be a formal process; you don't have to get a whole bunch of people involved. The other thing is, even if it's just a bullet list, write out your methodology: from this date to this date, this is the information I'm looking at, this is the source it comes from, this was my intention in looking at the data. It's basically having that plan. Then you can look at it and go, "Hmm, I'm trying to understand information from Q4, but for some reason I'm now looking at data from Q2." It's a good gut check to make sure you're even looking at the right information.
Christopher Penn 33:14
Exactly. So point three is make sure the data is current. After a certain amount of time, if you have a survey, for example, that's been running for six months, that time factor may substantially impact your data, so make sure you understand how current your data source is. Number four, you have to have disclosures: what are your conflicts of interest? Even if you don't have any conflicts, you should disclose that anyway. And number five is your methodology statement, in detail: what is the data, where did it come from, how reliable is it? One other thing we've started putting even in our client reports these days is limitations, the known limitations of the study. For example, in one of our recent content marketing reports, we noted that our scraping tools can't get every single web page on a client's website, just around 90% of them. If there's a section that requires a login or sits behind a paywall, that's a known limitation. It's a known limitation that a particular machine learning technique can't do certain things. It's like saying, "Hey, this blender can't make french fries." Yes, we know; it's a limitation of the tool. The more detailed you can be in the methodology section about the known limitations of the data you're publishing or working with, the more trustworthy you will come across.
Katie Robbert 34:38
If you come across a piece of data, going back to that MailChimp example, and we have nothing against MailChimp, we think they're a fantastic company and their services are fine, we're just using their article as an example, and it doesn't contain the disclosures, citations, or methodology you're looking for, reach out to them and ask. Most companies are happy to provide that information. They may not have thought to include it, or it's just not part of how they publish their content. But don't be shy about asking. Broad strokes: companies that have nothing to hide will very readily share that information with you; they should be sharing that information with you. If, and again, I've never experienced this with MailChimp, I'm not speaking about them as an example here, you run across a company that says, "I can't tell you," or "We don't have that information," or "We don't share that publicly," that might be a red flag, an indicator that you should find a different data source. If you come to us and ask, "How did you collect that information? You didn't publish your methodology," we're happy to share that with you. Absolutely.
Christopher Penn 35:58
Exactly. So that is vetting credible sources: determining conflicts of interest, and the five conditions, credentials, correctness, currency, conflicts, and citations, as the five ways to evaluate a source and figure out, "Yes, this seems reliable," or "There could be issues here." It's always up to you, depending on the risk tolerance you're willing to accept. If it's a mission-critical decision, you should probably make sure all five boxes are checked. If it's a throwaway stat in a blog post, something that isn't material to the business, two or three is probably good enough. But the more important a piece of data is to the business, the more, like any ingredient, you want to make sure it's good quality.
Katie Robbert 36:47
Well, one caveat, Chris: be careful with that whole "throwaway stat in a blog post" idea. You never know what's going to come back to bite you, so try not to operate that way, like, "Oh, nobody will read it," or "It doesn't matter, it's just this little thing." Believe it or not, that'll probably be the thing that gets you called on the carpet three years from now: "I remember when you said this thing. It was really wrong, and people are still quoting it."
Christopher Penn 37:13
I've got some blog posts to update, then. Good luck. All right, we will talk to you folks next week for another episode. Thanks for tuning in, and thanks for watching today. Be sure to subscribe to our show wherever you're watching it. For more resources and to learn more, check out the Trust Insights podcast at trustinsights.ai/tipodcast, and our weekly email newsletter at trustinsights.ai/newsletter. Got questions about what you saw in today's episode? Join our free Analytics for Marketers Slack group at trustinsights.ai/analyticsformarketers. See you next time.
Transcribed by https://otter.ai
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, Data in the Headlights. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new 10-minute or less episodes every week.