In this week’s episode, Katie and Chris discuss data analytics requirements gathering. When you’re setting out on any kind of marketing analytics project, how do you go about determining what data you’ll need? This process is called requirements gathering, and it’s not as simple as it sounds. Listen in for tips on how to go about requirements gathering as well as how different development methodologies approach requirements. Plus, learn the situations in data science and machine learning when requirements gathering is premature, and how to handle ever-shifting data.
Subscribe To This Show!
If you're not already subscribed to In-Ear Insights, get set up now!
- In-Ear Insights on Apple Podcasts
- In-Ear Insights on Google Podcasts
- In-Ear Insights on all other podcasting software
Advertisement: Data Science 101 for Marketers
Do you want to understand data science better as a marketer? Would you like to learn whether it’s the right choice for your career? Do you need to know how to manage data science employees and vendors? Take the Data Science 101 workshop from Trust Insights.
In this 90-minute on-demand workshop, learn what data science is, why it matters to marketers, and how to embark on your marketing data science journey. You’ll learn:
- How to build a KPI map
- How to analyze and explore Google Analytics data
- How to construct a valid hypothesis
- Basics of centrality, distribution, regression, and clustering
- Essential soft skills
- How to hire data science professionals or agencies
The course comes with the video, audio recording, PDF of the slides, automated transcript, example KPI map, and sample workbook with data.
Sponsor This Show!
Are you struggling to reach the right audiences? Trust Insights offers sponsorships in our newsletters, podcasts, and media properties to help your brand be seen and heard by the right people. Our media properties reach almost 100,000 people every week, from the In-Ear Insights podcast to the Almost Timely and In the Headlights newsletters. Reach out to us today to learn more.
- Need help with your company’s data and analytics? Let us know!
- Join our free Slack group for marketers interested in analytics!
What follows is an AI-generated transcript. The transcript may contain errors and is not a substitute for listening to the episode.
Christopher Penn: In today’s In-Ear Insights, we have a chicken and egg problem. The chicken and egg problem is this: Data requirements gathering is an essential part of marketing analytics. You have to know what it is that you need, you have to know who has it, where it lives, etc. On the other hand, when you’re doing advanced machine learning and statistics and whatnot, even on your marketing data, there’ll be times when the data that you end up exploring and finding is not in the requirements.
Let me give you an example. I was pulling some social media data over the weekend, trying to figure out if there was a relationship between some social data and conversions in Google Analytics. A pretty straightforward scenario that comes up a lot. One of the things I wanted to extract and engineer out of this data set was, instead of just the date itself, what day of the week was this? Was this a Sunday? A Tuesday? That is an example of feature engineering. The day of the week was not in the original data set, and it wasn’t even something I knew I was looking for until I got into the project. But I thought, “I wonder if the day of the week matters.” So I had to engineer it out of the data with a regular little bit of code that says, hey, check the date, check what day of the week it is, and put the day of the week in this column on this table.
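The day-of-week extraction Chris describes can be sketched in a few lines of pandas; the column names and sample dates here are hypothetical stand-ins, not the actual dataset from the episode.

```python
import pandas as pd

# Hypothetical social media export -- illustrative columns only.
df = pd.DataFrame({
    "date": ["2021-03-01", "2021-03-02", "2021-03-07"],
    "engagements": [120, 95, 210],
})

# Feature engineering: derive the day of the week from the date column.
df["date"] = pd.to_datetime(df["date"])
df["day_of_week"] = df["date"].dt.day_name()

print(df)
```

The new `day_of_week` column now sits alongside the original data, ready to test as a feature even though it was never in the source export.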
So Katie, as somebody who has a lot of experience with requirements gathering and with planning ahead for what you’re going to need, how do you reconcile this chicken and egg scenario where, on the one hand, you want to know what you need up front, but on the other hand, you don’t know what you need until you’re in the data?
Katie Robbert: You know, the conversation about requirements and project management, in general, is an interesting one. In some methodologies it is true that once you do the documentation, that’s it, you’re done. It is static, you can’t change it in some instances like a waterfall or very rigid industries like manufacturing, where A plus B always has to equal C, and there is no sort of changing it. But when you think about more agile methodology, requirements gathering is something that is meant to be a living document. It’s fluid. You always want to start with requirements gathering, but you may also continue to gather requirements along the way. What shouldn’t change is, ‘What’s the question I’m trying to answer?’ and ‘What is the problem I’m trying to solve?’ It may evolve a little bit, but ultimately, you should always know what that particular piece is and then put your stake in the ground to say, “I think what I need is the following 5 to 10 things to answer this question.”
In your example, Chris, you may find that you need to do some refining and tweaking or to look in other directions for the answer to the question, and that’s okay. As long as you continue to keep your documentation up to date alongside it. So let’s call requirements gathering the egg, and let’s call the data the chicken in this example. You always want to start with the egg. You always want to start with the requirements gathering. I don’t think it’s a chicken and egg situation. I think it’s a matter of whether this is a living document or not.
CP: Did you just coin the term agile analytics?
KR: I might have. I’ve only had two sips of coffee.
CP: Excellent. So in your typical agile process, you have your outcome and then you have your backlog, you have your Scrum and you have your after-action review. Is there such a thing, in that methodology, as agile requirements gathering, where you iterate the requirements gathering first and then you iterate the production of the thing afterward? Granted, there are predefined sets where something can’t be changed: if you’re trying to meet, for example, FDA requirements, you can’t mess around with those; you have to do exactly as you’re told. But in marketing analytics, a lot of marketers are in a situation where their shareholders and stakeholders are saying, “We don’t know what we want, we want you to tell us and we’ll tell you if it’s right or wrong.” There’s a natural sort of iterative quality to that as well. So, in terms of requirements gathering, is there such a thing as agile requirements gathering for marketing analytics?
KR: Absolutely. If you think about the software development life cycle, where agile methodology truly lives, you have the two-week sprints. And basically, it’s this constant, iterative process where on day one, you’re starting sprint number one. But on day five of sprint number one, you’re starting the requirements gathering for sprint number two, taking what you’ve learned from sprint number one. Think about it like a constant loop, moving along until you reach the constant, which is the end goal: What is it that you’re trying to solve for?
So if your stakeholder comes to you and says, “I’m curious about this thing. I don’t know if I’m right or wrong,” there should still be at the core of it that question of ‘What are we trying to prove?’ ‘What are we trying to answer?’ How you get there is the iterative process. That is what you always need to have in mind.
If you’re truly in research and development, doing something experimental, then it’s fine to just go ahead and open the box and see what’s inside. But in the sense of time being spent, money being spent, and resources being used, you probably want to have some semblance of a plan. So yes, you can do your requirements gathering in an agile way; you can break it down into smaller milestones.
The bigger question is, “I want to know how much more money we’re going to make with social media this year,” or something along those lines. Then you start to break it down into smaller milestones and say, “All right, the very first little nugget of this thing that I’m going to explore is the value of a like on my Facebook page,” and then you just start there. You don’t necessarily worry about the rest of the requirements, because you take what you learn from that little piece of the puzzle and apply it forward. That’s how you build those additional requirements. You may not know what the heck is going to happen on step 37. But that’s okay. At least you have an idea of what step 37 might be.
CP: When it comes to these methodologies, I think an important clarification is that they’re really meant for building a thing, right? Software development, systems development, even just a simple Google Data Studio dashboard: you’re still building a thing. And there is a point at which you ship this thing to your stakeholders.
When we’re talking about things like exploratory data analysis, there really isn’t a thing you’re building. It’s more of a process that you’re following. Are there different methodologies for either a process you’re following to explore something, or an operations process where there isn’t a deliverable at the end but rather an ongoing process, like publishing your monthly reports? Yes, there is a thing, but it’s more of an operational tempo. So what are the ways to manage those? Is there an equivalent to the idea of agile and iterative in those things, but not in the sense of building a product that you’re shipping?
KR: Well, in the example of exploratory data analysis, we often refer back to the scientific method, which is tried and true. It is a little bit more waterfall than agile. I keep using those terms because I feel like regardless of whether or not you’re building a thing, or you’re exploring a thing, I think it’s important to bring it down to those concepts. Because waterfall means you’re going from one step to the next and you can’t move on to the next step until you finish the first. Whereas agile means you can get partway through step one, and then start step two, and then start step three, and then partway through each of them, which is more flexible. It’s literally called agile. And I think that it does apply. So even if you aren’t building a thing, if you’re just doing something like exploratory data analysis or research, there is still a process. And it starts with some semblance of requirements gathering.
For example, if you want to understand how this correlates with this and this and this, and there’s no output but rather curiosity, you still start with, “What’s my hypothesis?” “What is it that I’m trying to prove?” If you’re just mashing data together, that’s fine, but you’re not going to get very far. And after a while, you’re not even going to know what you’re looking at. So you’re starting with some kind of a hypothesis. Call it a process, call it a methodology. You’re following a plan. And the plan is, “I want to know X. I need to have the following pieces of information in order to get to the question.” It’s the same across the board.
CP: So when we think about data analytics and the use of that data, obviously we have the outputs and the outcomes that we’re looking for. What happens when we get feedback from these stakeholders who say, “No, this isn’t what I was looking for,” but they’re unable to explain what it is they’re looking for? It’s very much that “I don’t know what it is, but I’ll know it when I see it” kind of thing. How do you deal with that in a situation where you’re trying to be organized, you’re trying to make a plan, you’ve got your marketing analytics mandate, you’re trying to show how you’re going to fix marketing and make it more profitable, and you have a CMO or VP saying, “I don’t know what I want, surprise me”?
What do you do in a situation where you have somebody who’s completely unhelpful towards guidance and you know that, generally, you perform better but you have no KPIs. You may not even have business goals. You may be in a marketing unit inside of a much larger organization, where the goals are completely unclear. What do you do then?
KR: Well, that’s where you start. If the CMO comes to you and says, “I don’t know, look at some data and tell me what it says,” it is your job to push back and politely ask, “What is the goal?” Because it’s your time taken away from doing something else. And I think it’s totally appropriate to position it in that way and say, “I’m happy to take a look at this for you, but I need your help understanding the overall goal. Do you want to understand revenue? Awareness? Productivity?” Throw out a bunch of different categories and see what sticks. And if they still say, “I don’t know, just go do it,” then it’s up to you to start making that decision and say, “Okay, let me see what it looks like when I put it in the lane of revenue,” or “Let me see what it looks like when I put it in the lane of awareness.”
You, as a marketer, have enough information to say, “I know my company cares about making more money, so let me take a look at it from that angle.” And I think it’s appropriate for you to tell whoever’s asking, “I’m happy to take a look at this, but because you’ve given me very little direction, or because you’re unsure of what it is you’re after, this may take me a little bit of time. Are you okay with me being pulled off the 20 clients who are actually paying us in order to do this weird little pet project for you?” (Laughter). Probably say it in a nicer, more respectful form than that, but that’s the general gist. It is totally appropriate for you, as the person being asked, to push back and say, “I’m happy to do this. You’ve given me no direction. It’s going to take me a while.” Then you, as the marketer, know the basics about what this company cares about, so you can look at it through that lens.
CP: Right. I think the challenge there, at least for folks who are more junior in their careers, is that they may not be clear on that, other than broadly knowing what the company does. Not being able to connect the dots back to things they have influence over is something I see the most in social media analytics and social media marketing, where you have social media managers who are publishing content, cranking out videos, putting posts on Instagram, and they don’t know why they’re doing it. And because they have no visibility and no data to work with, they just sort of measure themselves on activity.
You and I saw this all the time in the world of public relations in our previous work, where people would report on what they did, but literally had no idea what the impact was. They had no system of measurement of any kind. And in a case like that, how does that tie back to requirements gathering? Like, you got to crank out 15 press releases and make 40 calls this week and send out 52 emails and spam a bunch of people. What do you do in a situation where there may not even be a measurable impact?
KR: Well, I think that’s where you gather your team together, because, again, 99% of the time companies are going to care about making more money. So you can always try your best to tie it back to that. Get your group of marketers together, or if you are in a Facebook group of other marketers, or a Slack group (Analytics for Marketers is a great place to start, at TrustInsights.ai/analyticsformarketers; it’s our free Slack group), just ask the question: “Hey, I’m churning out 40 press releases a month. Can anyone help me figure out how to tie this back to revenue?” or “What have you done to measure this?” or “Is there a way to tie this into making more money?” Asking the question is a great first step.
So I think that if your boss isn’t giving you the information, or if you’re not sure about the company, use the lens of money. Because 99% of the time money is going to be the right answer, unless you’re a nonprofit. So start there, and then start asking around: your team members, other people in the organization, your community groups, old professors, whoever might have any sort of intelligent opinion on the subject, and just start to piece it together. It might not be right, and that’s okay, but you’re going to learn a lot in the process.
CP: So to wrap up the story, this is the analysis I was doing over the weekend for an upcoming talk on competitive analysis (stay tuned to the Trust Insights blog over at TrustInsights.ai/blog, where we’ll be publishing that session in a couple of months). What I found was that by engineering some of those extra features, even though they weren’t in the requirements, it did shed light on some things. In this case, the target was branded organic search, because you can’t see a competitor’s revenues by day, nor their web analytics, nor their conversions, but you can see the number of searches for their name on any given day. So what we ended up doing was taking all this competitive social media data, engineering out things like day of the week and hour of the day, and then asking: Does any of this have any impact whatsoever on the number of times audiences search for this brand by name? It turns out there were some pretty clear indicators that some of the things this competitor was doing in social media actually created potential lift. Obviously, the next step in that would be the hypothesis: If we do the same thing, will we see a commensurate increase? If we get 15% more Twitter likes, do we see 15% more searches for our brand name? And then, because for many companies branded search is a big deal, do we see 15% more revenue down the line?
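The kind of analysis Chris describes can be roughly sketched as follows; the column names and numbers are made up for illustration, and a real analysis would use actual social media exports and daily branded-search volume.

```python
import pandas as pd

# Hypothetical daily data: a competitor's social activity alongside
# branded-search volume. All values here are invented for illustration.
data = pd.DataFrame({
    "date": pd.date_range("2021-03-01", periods=14, freq="D"),
    "tweets": [3, 5, 2, 8, 1, 0, 0, 4, 6, 2, 7, 1, 0, 0],
    "branded_searches": [110, 140, 95, 180, 80, 60, 55,
                         120, 150, 90, 170, 85, 58, 50],
})

# Engineer the day-of-week feature discussed earlier in the episode.
data["day_of_week"] = data["date"].dt.day_name()

# Simple first check: does social activity move with branded search?
correlation = data["tweets"].corr(data["branded_searches"])
print(f"Correlation between tweets and branded searches: {correlation:.2f}")
```

A correlation like this is only an indicator of potential lift, not proof of causation, which is exactly why the next step is a hypothesis you can test on your own data.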
KR: It’s interesting because you started with a very specific question, even if you weren’t sure where you were going to find the answer. You were basically following the process that we’ve outlined. It’s okay to not know exactly what it is that you need. But because you had a very specific question in mind, you knew that data from the National Weather Service wasn’t necessarily going to be helpful, so you weren’t going to go down that road. But maybe Google Trends data, social media data, social listening data, CPC data, those might’ve been within the right kind of realm, so you had very loose requirements to start with.
Now that you’ve gotten a more specific answer, you’re refining it. So you’re demonstrating what it is that we’ve been talking about, that it’s possible to start with the question, and only the question, and then go from there.
CP: Well… good for me.
KR: Happy Monday.
CP: Happy Monday. Alright, folks, if you have questions of your own about marketing analytics, please let us know. Go to TrustInsights.ai, where you can find this podcast episode on our website. Just leave a comment on the post and let us know, or you can always drop us an email or, as Katie mentioned, join our Slack group over at TrustInsights.ai/analyticsformarketers. We’d love to see you there.
Ask all the questions you want—the silly questions you want—and we’ll be happy to answer them. Till then, we’ll talk to you soon. Take care.
Get unique data, analysis, and perspectives on analytics, insights, machine learning, marketing, and AI in the weekly Trust Insights newsletter, Data in the Headlights. Subscribe now for free; new issues every Wednesday!
Want to learn more about data, analytics, and insights? Subscribe to In-Ear Insights, the Trust Insights podcast, with new 10-minute or less episodes every week.