How to Turn Complex Data Into B2B Marketing Strategies
Many B2B marketers drown in the ubiquity of data when they could be using it to uncover untapped opportunities. However, data alone doesn’t solve marketing challenges. What’s needed is a clearly defined strategy, objectives, and smart decision-making. What can marketers do to leverage data strategically and avoid “death by mediocrity”?
That’s why we’re talking to Rebecca Shaddix (Founder & Managing Partner, Strategica Partners), who shares incredible insights about how to turn complex data into actionable B2B marketing strategies. During our conversation, Rebecca emphasized the importance of setting a clear strategy and highlighted the need for defining acceptable mistakes, aligning cross-functional teams early, and avoiding common pitfalls that impede performance. She also offered a practical roadmap for marketers aiming to drive better results through collaborative, insight-led, and data-driven strategies.
Subscribe: Spotify | Amazon Music | RSS | More
[1:40] The challenges B2B marketers face with marketing data and why mediocrity is the biggest threat for any company
[7:54] The role of data in smarter decision-making for marketing
[11:21] Untapped opportunities that B2B marketers can find in data
[13:04] “Not articulating and aligning the acceptable mistake” is the single most important pitfall that B2B marketers should avoid in data-driven marketing
[16:51] Best practices to generate useful and actionable insights from data through a crowdsource group activity
[23:22] Actionable tips for B2B marketers on leveraging data
[28:13] Case study: How to address bottlenecks in the funnel with better data
Transcript
Christian Klepp 00:00
Welcome to this episode of the B2B Marketers on the Mission podcast. I’m your host, Christian Klepp. Today I’ll be talking to Rebecca Shaddix. She is a managing partner at Strategica Partners, a writer for Forbes, one of LinkedIn’s Top Voices in product marketing, and an expert in residence at the Product Marketing Alliance. She leverages her background and research to merge data-driven strategies with innovative marketing approaches. Tune in to find out more about what this B2B marketer’s mission is.
Christian Klepp 00:31
Okay, here we go. Rebecca Shaddix, welcome to the show.
Rebecca Shaddix 00:35
Thanks for having me.
Christian Klepp 00:36
Great to have you on the show. We had such a fantastic pre-interview conversation, and I should have hit record back then when we were chatting, but here we are, and I’m really looking forward to this discussion, Rebecca, because I think you are gonna drop some incredible insights that I think are gonna be relevant and extremely valuable to our audience.
Rebecca Shaddix 00:58
I hope you’re right. Let’s see.
Christian Klepp 01:00
No pressure, right? Set the bar really high. But yeah, let’s dive right in, shall we? Rebecca, you’re on a mission to create an incredible experience for customers by delivering clarity amidst the chaos through B2B marketing strategies developed through data-driven insights. So for this conversation, let’s zero in on a topic that I think has become part of your professional mission: how to turn complex data into actionable B2B marketing strategies. So let’s kick off the conversation with this question: why do you think a lot of B2B marketers drown in, I think what you call, the ubiquity of data?
Rebecca Shaddix 01:40
Yeah, I think most often the root cause is going backwards with the order of evaluating data and setting a strategy. So the strategy, at the strategic level of the company and then departmental, needs to come with clear insight into your place in the market, your differentiation, and a vision for that, and then how you’re doing at getting there is supported by the metrics and the dashboards that would indicate your progress. But if you don’t have clarity on which metrics matter, what you’re optimizing for, and what’s noise, it’s too easy to just be inundated with tons and tons of data that doesn’t necessarily mean what you think it does, may not be set up to be collected to tell the stories you think it is, and then you’re asking it to deliver insight that it wasn’t set up for.
Rebecca Shaddix 02:28
So I think most often the biggest challenge is going backwards: thinking more data means more answers means more clarity, and that’s not the case. All of the data in the world could be interpreted in any number of ways, and most of it is noise if you don’t know what really matters. And so if you haven’t set that strategy up top and been really intentional about the trade-offs it’ll take to get there, what you’re not optimizing for, what you don’t need to invest in, then you don’t know which metrics you need to influence or measure and improve, and then it just becomes a spiraling death by mediocrity.
Christian Klepp 03:04
I’m gonna steal that one, death by mediocrity. Wow.
Rebecca Shaddix 03:09
Yeah. And I think that’s a big problem. I think mediocrity is probably the biggest threat that any company faces, because if something is clearly, objectively terrible, you’ll take action. If you don’t have any data to tell you what’s working, you’re going to validate your assumptions. But if things are just sort of fine, and they’re just sort of languishing along, you don’t necessarily have the urgency to change things. You could be worried about the opportunity cost of messing with something that has some progress. And then, over time, the delta between where you are and where you could be is so much greater than it would be if something was terrible.
Rebecca Shaddix 03:46
And so to the earlier question of drowning in data, I often think that when people have data, they think that they have completely unbiased answers, but all of the data in the world is only telling you what this set of inputs is able to measure in the output, so it’s not actionable unless it’s set up in a certain way. So I think of that mediocrity of enough data to sort of kid yourself that you’re not making assumptions, which you are, and enough growth, even if it’s stagnant, to kid yourself that this is progress; those are the two biggest threats that people tend to just let persist without taking action decisively. And a decision inherently involves not doing something you’d like to do, or a trade-off, or a risk of something you would like not to have, because if it didn’t, there wouldn’t be a decision, right? It would just be the obvious thing to do.
Rebecca Shaddix 04:38
And so this risk aversion, this really myopic definition of accountability, I think, is a big threat to marketers who are sort of incentivized to play not to lose, not to play to win, and to play the short game, because they’re not going to be around long enough for the long game if they take big swings that don’t pay off. And so I think now we see a lot of this languishing mediocrity, because the incentives are such that people just don’t want to do something bold that could get them fired. And so organizations that say they want to take big risks, they want to do big innovations, have to really back that up and say they’re okay with certain things not working out. And if they’re not, they just have to be honest that they want consistent, predictable growth, but you can’t have both.
Christian Klepp 05:23
Absolutely, absolutely. Um, “don’t do something bold” seems to be a symptom that is prevalent in B2B, like, across the board, wouldn’t you say?
Rebecca Shaddix 05:33
Sure, I mean, yeah, if something can’t be proven wrong, it can’t really be proven right, so you haven’t set up things that are bold enough. Just the short-term thinking of short-term investor cycles incentivizes short-term decision making, and there’s a path for that, and there’s a place for that. There’s a place for predictable, consistent growth, and that’s fine. What I think is the problem is when people say, we want to have these big, bold, industry-changing innovations that come with doubling or tripling of revenue in some short period of time. You have to accept that in order to do that, there’s the risk that certain things won’t pay off. And the way I think that you bridge that is by making your problem statement the first thing you focus on, being very clear: this is the problem that’s a priority to solve. Then having two to four hypotheses about ways to solve it, and then having five or six experiments for each of those hypotheses. What I think can get skipped is going in the opposite direction. So you do a bunch of experiments with no clear hypothesis about what they’ll tell you. You don’t let them run long enough to know if they’re working. You don’t necessarily have a big enough sample size to know if they’re working at all.
Rebecca Shaddix 06:42
So if you’re just running indiscriminate experiments all the time, that’s not productive. But I also think it’s common to just jump into two to four hypotheses, like, we think we’re seeing x because of y. But if you haven’t started by anchoring that in a problem statement, then you don’t even know if these hypotheses are the most important ones to work on, and that’s where I see a lot of this downstream disconnect. If the CEO and their leadership team can be aligned on “this is the problem statement we’re solving,” then you can articulate these hypotheses, then you can have the experiments to run. But if you’re just sort of running experiments, or you’re jumping right into hypotheses, you can get the “well, what is marketing even doing? Why are you running all these weird A/B tests? That’s not what matters.” If you have that conversation up front, great, you’re aligned and you’re strategic, and you know what good looks like. If you’re just sort of doing a bunch of activity and you’ve skipped the problem-statement alignment, that’s where a lot of the downstream problems come from.
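Rebecca’s point about sample size, knowing whether an experiment is working at all, can be made concrete with a standard two-proportion z-test: with too few visitors, even a real lift won’t separate from noise. Here is a minimal stdlib-only sketch; the conversion counts and variant sizes below are hypothetical, purely for illustration, not from the episode.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    conv_a / conv_b: conversions in each variant
    n_a / n_b: visitors in each variant
    Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is two-sided.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B of a landing page vs. control A.
z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=160, n_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With a tenth of the traffic, the same 1-point lift would not reach significance, which is exactly the “not a big enough sample size to know if it’s working at all” trap.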
Christian Klepp 07:37
This may sound like a bit of a guided question here, but I’m going to ask it anyway. When done properly, of course, how do you feel data can bridge that gap or solve this issue of the downstream disconnect, as you just said?
Rebecca Shaddix 07:54
Yeah, data is so powerful to answer big questions that we wouldn’t otherwise be able to intuit. So I think that the promise of data to answer questions, to tell us whether internal differences of opinion or assumptions are true, is tremendous. But I see a lot of parallels between how this data-driven decision-making revolution, if you will, was really picking up steam 10, 15 years ago, and a lot of what I hear around the AI revolution. There’s tremendous power in both. There’s tremendous power in data-driven decisions. There’s tremendous power in AI. But all of the data in the world can’t replace critical thinking or strategic decision making. There’s no dashboard that will have an answer; what you’re measuring, how, and why is the role of the strategic critical thinker. Same thing with AI. There’s a pretty low floor to start using AI tools, or even building them now into your own products, and a pretty high ceiling for what it’s capable of. But if you think that you can suddenly delegate strategic decision making, or big, important, creative, complex tasks, to AI without the right prompt engineering and guidance, you’re going to make similar mistakes to if you think that you can just look at a dashboard and develop a strategy out of it.
Christian Klepp 09:10
It’s interesting that you say that. And before I go to the next question, I gotta bring up this anecdote. I saw this on LinkedIn, and you might have seen it as well. I’m not gonna say who it was, but it was basically something to the effect of: if you enter these specific prompts into ChatGPT, it can help you conduct an SEO (Search Engine Optimization) audit of your own website that will save you at least $5,000 if you outsourced it to an agency. And there was a big group that said, oh, please send me the prompt, right? And there was the other group that said, that is absolute nonsense, right? If that were true, a lot of these agencies would already be out of a job, right? So it’s to your point about, like, um…
Rebecca Shaddix 09:58
Yeah, I didn’t see that. I mean, I guess the first thing that popped into my mind is, I don’t know what these prompts are, but we have Ahrefs and Semrush. I don’t know what these prompts are yielding that existing tools that don’t cost $5,000 aren’t. But I think a lot of it is the questions that you’re asking: are there certain keywords with more traffic that you can optimize for? There are tools that you can already use to answer those questions. I mean, obviously there are the on-page and off-page considerations. Of course, you want things to be crawlable. I don’t know what these prompts are, but I think a lot of it is, what is your end goal here? What is the question? How important is SEO even to your company? And so if it’s worth a significant amount of your revenue, if it’s a really important growth strategy, then asking the right questions to the right experts, probably an agency, makes sense for $5,000. If it’s not, it’s obviously a long game too, starting with what you can do incrementally. I guess I just come back to: what’s the objective, and what are you optimizing for?
Christian Klepp 11:01
Spoken like a strategist, I would say. So moving on to the next question. What do you think are, and you mentioned some of them already, but what are the untapped opportunities that you see? Specifically, the ones that data presents regarding B2B marketing strategies.
Rebecca Shaddix 11:21
There are untapped opportunities in data that I universally see. I think that we’re coming back to a time where what’s not changing is an even more important question to be asking as a marketer than what will change in the next 5 or 10 or 15 years, because who knows what will change? Obviously, we want to make sure we’re not building things that will be obsolete. But I think the more important strategic question is, what won’t change about people, about buyers, about how people will engage. They will probably consistently want reliable uptime, faster shipping if you do that, reliable customer support, lower prices that are aligned with how they perceive the value of the product. All of those are things that won’t change. So figuring out what your competitive edge is really comes back to understanding what your customers want and are willing to pay for. It’s more predictable to build products and processes and marketing channels around what will always be true (positioning will be important, market segmentation will be important, understanding your buyer will be important) than to guess what will come next. And I think that there’s potentially an inverse amount of attention paid to those. Who knows what we can predict a year from now, or 10 years from now, with tech? I think it’s easier to say what won’t change and where we can have a unique edge.
Christian Klepp 12:46
Absolutely. On the topic of turning all this data into actionable B2B marketing strategies, what are some of the key pitfalls you would advise B2B marketers to avoid? That’s the first part of the question, and the second one is, what should they be doing instead?
Rebecca Shaddix 13:05
Yeah, I think the single biggest pitfall behind a lot of the downstream problems that B2B marketers typically face is failing to define an acceptable mistake at the onset of an initiative or a go-to-market strategy. So we have North Star metrics and KPIs (Key Performance Indicators), we’re all very explicit about how we’ll measure success, and typically pretty good about reporting on how we’re trending toward leading and then lagging indicators of our goals. But very few companies, at the same time as they set the North Star metric and the KPIs on a go-to-market plan, for example, will articulate the acceptable mistake. There’s always a trade-off to anything that you’re doing. An acceptable mistake could be something like: it’s okay if we come to market later, because we need to do comprehensive testing of this product. And that would tell your teams that the timeline can be less compact, etc. Or an acceptable mistake could be the exact opposite: it’s okay if we don’t do comprehensive testing and have to iterate after launch, because we have to start generating revenue ahead of this critical moment in the buying cycle. But if you don’t articulate that, that’s where I hear a lot of cross-functional strife. Marketing says, “we can’t launch this product by that time period,” and what they usually mean is, “we cannot put all of the resources into a tier-one launch that we typically do with a tier-one plan in that time period.” So they start to worry: I’m going to look bad if I don’t have all my ducks in a row and the sales enablement buttoned up and everything. But then product hears, “what do you mean you can’t announce this product in a month? We built it in a month.” And they’re thinking, it doesn’t have to be perfect. We don’t even need a press release. I don’t care if the sales team is tripping over themselves. We just need this launched and live.
It can go to a beta cohort for all we care. And because they haven’t articulated that the acceptable mistake is less comprehensiveness, because speed is the priority, they’re both just talking past each other. And then that’s where marketing looks slow, or product looks reckless and like they don’t understand, and then sales feels out of the loop. And so if all of the department heads haven’t aligned on this acceptable mistake, you have all kinds of downstream problems, and it’s hard to rebuild trust from there. And so I think that’s the single most important pitfall. And I guess the second part of your question, too, is what they should do instead, which is: do that.
Christian Klepp 15:29
That’s fair enough. That’s fair enough, absolutely. But, you know, it sounds so simple, and it’s amazing, to your point, how many organizations, or business units within an organization, get that wrong and end up talking past each other, right?
Rebecca Shaddix 15:46
Yeah, yeah. My friend Wes Kao says, and I might butcher this, something to the effect of: there’s a difference, the knowing-doing gap. So you could know it makes sense; I know it; I do it; I do it well.
Christian Klepp 15:59
Yeah.
Rebecca Shaddix 16:00
Follow her. She’s gonna say this better than I am. But basically it’s like, sure, I know that makes sense. Far fewer people do it, and then even fewer will do it well. And doing it well requires that cross-functional alignment and buy-in; leadership has signed off well in advance, because if they’re surprised in a retrospective about what the acceptable mistake is, you’ve got problems. So sure, I know it’s simple: define your acceptable mistake. But doing it is far less common, and doing it well is far less common still.
Christian Klepp 16:30
Absolutely, absolutely. So you’ve been talking a lot about the importance of having a strategy, having those goals, defining what the problem is. So talk to us about the importance of formulating the right goal and the right strategy when you’re analyzing the data, because otherwise things can go awry. How can this approach help you to generate insights that are useful and actionable?
Rebecca Shaddix 16:51
Yeah, so I think you formulate tactics and hypotheses when evaluating data. You formulate strategies beforehand, and then the data tells you how you’re performing on them. I think lots of really interesting questions about how you can define what the data is revealing pop up the more you make evaluating data a crowdsourced group activity, but make sure that decision making is still not a team sport. What I mean by that is there are a lot of ways to evaluate any dashboard. This became really clear to me in a business school accounting class where our class was broken up into five groups. We all got the exact same data, and across the five groups there were four different interpretations of the data. And I’ve seen that play out in the business world too. We look at some dashboard that says something like, hopefully through a multi-touch attribution model, not last-touch, you’ll have some big number attributed to some channel. Let’s call it events.
Rebecca Shaddix 17:53
So let’s say you’re seeing a ton of revenue coming through events. You can look at that and say, we should be doing more events, we should dump money into more events, look at the ROI (Return on Investment) from events. And that could be true, but it’s also entirely possible that you’ll start saturating the attendees if there’s overlap, that you’ll hit diminishing returns, or that it’ll change the next year. And so what I really like to do is make evaluating dashboards a group sport. There’s this concept of the HiPPO, the highest-paid person’s opinion. I think that the highest-paid person in a room should take care to speak last when they’re evaluating the data, and hear what other perspectives say. Be intentional and mindful about who is in these rooms when you’re looking at data, because if you don’t want somebody’s perspective, you’re probably wasting their time in there. But make evaluating the dashboards and discussing what they could be saying a team activity. What are possible explanations? And really challenge multiple of those. I like to make sure that the data has been circulated in advance. It can be in a Google Sheet or in Google Slides, but basically make sure people have seen it in advance and they come with hypotheses, because when people start talking about it, you want to make sure there are other perspectives. And the very first thing that gets said about how to interpret the data is likely to stick, but it is probably not the only way to interpret it.
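For readers who want the mechanics behind the multi-touch versus last-touch distinction Rebecca mentions: one of the simplest multi-touch schemes is linear attribution, which splits each deal’s revenue equally across every touchpoint in the journey instead of crediting only the final one. The journeys, channel names, and revenue figures below are hypothetical, just to illustrate the calculation:

```python
from collections import defaultdict

def linear_attribution(journeys):
    """Linear multi-touch attribution: split each deal's revenue equally
    across all touchpoints in its journey, instead of crediting only the
    last touch. `journeys` is a list of (touchpoints, revenue) pairs."""
    credit = defaultdict(float)
    for touches, revenue in journeys:
        share = revenue / len(touches)
        for channel in touches:
            credit[channel] += share
    return dict(credit)

# Hypothetical closed-won journeys: (ordered touchpoints, deal revenue).
journeys = [
    (["events", "email", "webinar"], 30_000),
    (["paid_search", "events"], 20_000),
    (["events"], 10_000),
]
print(linear_attribution(journeys))
```

A last-touch model would hand each deal’s full amount to the final touchpoint only; linear attribution spreads it, so a channel like events gets credit for every journey it influenced, which changes the “should we dump more money into events?” conversation.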
Rebecca Shaddix 19:13
So share the data in advance. Come to these meetings prepared to discuss the questions and the interpretations. I don’t like to just sit in a meeting and read off a dashboard: as you can see, x is happening. I just don’t think it’s a good use of people’s time. So for evaluating it, I usually put this in a Google Slides format, and we address the comments. We address that people have different interpretations of what they’re seeing. You can leave it as a comment, you can discuss it, but the meeting is to discuss the comments, not to interpret the data. Now you’re having a discussion about possible interpretations, and you still have a single decision maker responsible for going forward or making a recommendation, but now you have different perspectives about what it could mean. And again, I want that decision maker usually to be the last person to share their opinion, and I want the first time people see the data not to be in a group meeting, because they’re not going to think as creatively. But this means that your meetings are more effective and productive, and that’s how you can safeguard people’s time to come prepared. A lot of the meetings that I run have these conditions of acceptance: if you’re going to come into these meetings, I expect you to already have these evaluations or analyses. They can be written down, or they can be brought with you. Same thing with a brainstorm. I don’t think that expecting to context-shift everybody out of a whole bunch of meetings and dump them into a room for an hour is going to yield as creative insights. So the admission for that meeting is coming with ideas for the questions that were circulated in advance.
Rebecca Shaddix 20:36
And that means that every meeting is more effective, so people have more time to prepare more mindfully for all of these meetings. So just be intentional about how you’re using the time. Live time is really great for collaborating and sharing ideas. It shouldn’t just be a regurgitation of something people can read. So you can protect the time for reading and flow and strategic work by letting people read when it works best for them, and then coming with a very high bar of expectation for preparation. Because if you don’t, then the meetings are just not as effective, and people get trained that being in person is just kind of a show-up vanity thing, as opposed to: we expect a high level of preparation before this. And I do think it’s fine if you want people to read a little bit async, if you set aside five minutes in advance, but expecting that they’re going to have their most creative ideas at the same time in the middle of a busy day typically isn’t how people do their best work. And so I like to make sure people have access to the things that we want them to evaluate or think about when they can be most productive, which could be 9pm for some people or 6am for others.
Christian Klepp 21:40
Yeah, those are some really interesting insights, and thanks for sharing those. You know, this conversation really brings me back to some of those meetings I’ve had where, to your point, we had to discuss data that had been disseminated. And you’re absolutely right. At least from my experience, it’s really challenging, and it can be a little bit intimidating, if you force everybody into a room and say, come up with your best ideas now, based on all this data, right?
Rebecca Shaddix 22:11
Yeah, they don’t have time to sit with it, and it may not be when they’re most creative. There could be something else on their mind. If you give people enough notice, at least 72 hours if you can, 48 is probably okay, then at least you know that they’ve thought about it and sat with it for enough time to have creative ideas flowing.
Rebecca Shaddix 22:30
Love it.
Christian Klepp 22:30
Right?
Rebecca Shaddix 22:30
That’s great.
Christian Klepp 22:30
Absolutely, absolutely. You also brought up something a couple of minutes ago that reminded me of one of my ex-bosses. It was a lesson in leadership, if you will. He mentioned that he would ask for everybody’s opinion first before giving his own. When he had meetings with senior management, director-level types, he would always go around the room: okay, so this is the objective of the meeting, these are the challenges or the problems that we’re facing. So, Rebecca, what are your thoughts on this? And he would ask for everybody’s perspective first before making a decision, right?
Absolutely, absolutely. Okay, Rebecca, you’ve given us plenty already, but we’re reaching the point in the conversation where we talk about actionable tips. So based on everything that you’ve said, if there’s somebody out there listening to this interview between you and me, what would you tell them? What are, like, three to five things that they can do right now to turn this complex data into actionable B2B marketing strategies?
Rebecca Shaddix 23:42
Yeah, I think making sure that you’re aligned with your direct manager about the acceptable mistakes for the projects that you’re kicking off or working on is going to have a big impact long term. Really, when you think about decision making, make sure that you’re explicit about what those trade-offs and those downsides are, because if you’re not, you’re probably going to get into a dicey situation of what success does or doesn’t look like. So be explicit with the acceptable mistake. Be explicit with the trade-offs and the decisions, and make sure you’re communicating that cross-functionally with the other people who are involved. I think buy-in early is important, and collaboration is important. It’s not as effective to ask people to buy into a late-stage strategy as it is to co-create it with their expertise, so early and often. And I think that comes down to not being afraid to take up a little space: communicate directly and ask for what you need if you need more intentional meetings to drive that alignment. I know I certainly felt very anxious about asking for people’s time early in my career. I felt like I should just do everything excellently by myself. And it turns out that it’s not as good when you’re by yourself and just trying to do it all. Like when I was starting in product marketing, thinking that I would just deliver a fully fleshed-out go-to-market strategy, and everybody would rejoice that all they had to do was execute their part. Yeah, right. News flash: that’s not what happened. Demand gen wanted a say earlier; they had great ideas that made it better.
Rebecca Shaddix 25:15
So really think about your partners not just as a way for you to drive visibility for your initiatives, but as a way to make them better. That buy-in is not just to get them to buy into your ideas. It’s to make ideas better by incorporating their ideas earlier. So I would make sure kickoffs and retrospective meetings are a core part of the initiatives that you run, for any product launch, any campaign. You can scale them differently, but do that: define your acceptable mistake, make sure you have buy-in early, and be intentional about setting problem statements that you agree on first, before developing hypotheses, before developing experiments which data will measure, in that order. Thinking that you’re going to develop that in the opposite direction is just a recipe for the wrong direction. And I’ll just say that your data can certainly reveal problem statements. You could say, our data is revealing a big drop-off from our home page; that is a problem statement, and you would tie that to: we need this conversion rate to be high, because it’s the start of our funnel. So your data can certainly reveal problem statements, but before you start actioning them, you have to agree that the problem statement is the priority to solve.
Christian Klepp 26:31
Absolutely. Rebecca, you really took me back with those tips, because that’s exactly what happened to me when I was a product marketer, right? Like, I felt like I’d go at it on my own, and then I’d do the big reveal in the managers’ meeting. And guess what? The plan got completely killed, right?
Rebecca Shaddix 26:50
Right, right. And I think that’s the missing piece. When people talk about driving buy-in and visibility, it doesn’t mean a big reveal of your solo-created strategy. It really means co-creating.
Christian Klepp 27:02
Absolutely, absolutely. So yeah, to your point, I learned it the hard way.
Rebecca Shaddix 27:06
Yeah.
Christian Klepp 27:07
And I did the take two, where I went to each business unit and business unit head, and I used this more collaborative approach, and they were more open to that, because now they were contributing their ideas, and they felt invested in the initiative versus, oh, it’s just something that marketing is going to do, right?
Rebecca Shaddix 27:25
Right.
Christian Klepp 27:26
When we had that next group meeting, they were advocating for the plan, because some of those ideas were, in fact, theirs, right?
Rebecca Shaddix 27:35
Great.
Christian Klepp 27:37
Right, exactly. But you know, sometimes you just need a black eye, right? Fantastic. Okay, I’m gonna throw this bit of a wild card question out there to you, Rebecca: it would be fantastic if you could give us an example from your past or present experience where you’ve managed to analyze data and come up with a marketing strategy, a marketing approach, or some kind of initiative that helped move the needle for your company.
Rebecca Shaddix 28:13
I imagine lots of folks have had this experience. You see some bottleneck in the funnel of conversions between certain stages, and you suddenly don’t understand why people aren’t converting between whatever stages you have, say MQL (Marketing Qualified Lead) to SQL (Sales Qualified Lead). For us, it was the way we defined, basically, stage two of the funnel. There was suddenly this really precipitous drop-off in what had been a relatively consistently converting funnel, and we all of a sudden noticed a true blocker tanking the conversion between the second and third stages of the funnel.
Rebecca Shaddix 28:54
And the first approach was just to look at the data quantitatively, and then start layering in a little bit of Amplitude and Mixpanel data to see, oh, can we explain it? Ultimately, that led to some incremental improvements and ideas. Oh, it looks like people are getting stuck here, let’s make the form shorter. That did have incremental improvement. Let’s make sure the wording is clear, great. So we changed the copy so it was consistent between the landing page and the onboarding, and we made the form shorter, and that did have an improvement. But what we actually found was that the buying persona had shifted with new budget cycles, so that in many of our target segments, the people making the purchase actually had a different title, and that was leading to an entirely different acceptance process within their own organization, and a different rollout and onboarding. And so that meant we actually had to look at the segments differently. What we used to call upper mid-market and mid-market had to shift so we could tackle that focus a little bit differently and make sure the right company sizes were in each segment.
Rebecca Shaddix 30:09
And so ultimately, what that meant was a redefinition of the firmographic persona and the criteria for what we categorized into what we called velocity and enterprise. Basically, what went into those different segments had to shift, because the market was shifting under us. We had a pretty cyclical buying cycle, and so macroeconomic trends were having a big impact on how budgets were being spent at certain moments, which only became obvious when we asked, okay, what’s happening here with this big bottleneck? Digging deeper into this fundamental shift in how different company sizes were suddenly engaging with their budgets and rolling out the product came from asking, how do we fix this? The quantitative data was the identifying factor; it definitely helped us re-segment. But qualitative interviews about what had changed were what made it contextually possible to actually understand: interviewing the ideal-fit customers and finding out how they had engaged. By the time they came up for renewal, that was often a different persona than the initial buyer. That’s where some of those hypotheses came in that we could then test. But I think the most important thing to take away is that the data revealed where the problem was starting, but it couldn’t reveal how to fix it without interviewing those ideal fits and trying to replicate that.
Christian Klepp 31:30
That’s actually a great example, because you were able to identify that there was an issue, and you used both qualitative and quantitative data, right? Because had you not done that, you wouldn’t have realized that the personas were shifting, and there wouldn’t have been an adjustment in the initiatives.
Rebecca Shaddix 31:51
Right. Yeah, we would have had incremental improvements, and we would have said, okay, this is our new normal. Look, we got better in this terrible state. It worked, right? But that’s the mediocrity point. So yes, we did make incremental improvements just from the quantitative data, but the real “how do we fix this to keep growing and compounding” growth came from the qualitative.
Christian Klepp 32:12
Exactly. Okay, here comes the soapbox question, and I’m sure you’re gonna love this one. What is a status quo in your area of expertise, specifically on this topic, that you passionately disagree with, and why?
Rebecca Shaddix 32:31
I think making a lot of assumptions from data without realizing you’re doing it is pretty common. It’s really easy to look at a dashboard and say, see, this is what the data indicate, and that’s often an incomplete answer. So I think there’s a big risk that when you’re drowning in dashboards, you think you’re not making any assumptions anymore. But all of the data in the world is just measuring some inputs based on how you decided to configure it. Asking questions of dashboards that weren’t configured to answer them is pretty common, and it’s a pretty common way to sort of de-risk and, like, C.Y.A. (cover your ass) your decision making. See, we can all agree this is the right decision, because look at this dashboard. That’s usually an incomplete and mediocre answer. All of the data in the world can only tell you what it’s measuring. It can’t necessarily tell you how that relates to other metrics you care about. So I guess the first one is expecting dashboards to make decisions.
Rebecca Shaddix 33:32
But I also think not being explicit about pairing the metrics you’re measuring. Let’s say you care about the conversion rate of a certain segment: if you don’t pair that with retention of that same segment, then you could be missing out on impact. And that’s just one example. Basically, paired metrics are things I think about so you can’t just over-optimize for one without evaluating it alongside something else that matters. A North Star metric is important to align around. But when you’re evaluating how you’re getting there and how you’re trending on overall business health, make sure you’re not too myopically focused on any single metric at the expense of one that is also business critical but potentially impacted. I think for B2B SaaS, that’s a good example.
Rebecca Shaddix 34:23
We can acquire a ton of customers by selling to people we should have disqualified and promising things the product doesn’t do. And if we don’t pair that with CSAT (customer satisfaction score) or NPS (Net Promoter Score) or retention, then we’ve lost the plot. I think the same thing applies to the incentive to discount. We could say we really need to drive up Q4 revenue, and you could slam your audience with a bunch of discounts that encourage people who may have held off to buy more quickly. But you could then have tanked their LTV (Lifetime Value) if you’ve accustomed them to a discount they now expect to get. They may not have been as invested in your product if the discount is what drove the purchase; they may be more price sensitive; they may not be the power users you want to optimize for; and because of their sheer volume, you could start over-indexing on their opinions in your product, versus weighting appropriately which segments are actually going to drive revenue long term. So not having tension or paired metrics, and just slamming toward these near-term goals, is a pretty common mistake that takes a lot of discipline to avoid being blindsided by. And it comes back to really articulating with your leadership team what you’re willing to trade off and what you’re not.
Christian Klepp 35:41
Absolutely, absolutely. Okay, here comes the bonus question. I have it on good authority that you’re into improv comedy as well. I’ll leave it up to you whether you want to tell me a joke or not, but the other question is this: I’m not sure how long you’ve been doing improv comedy, but I’d like to know how it has helped you improve, personally and professionally.
Rebecca Shaddix 36:08
Yeah, it’s a great question. I think the short answer is, there are ways I’m aware it’s helped me improve, and I’m happy to talk about those, but I also think there are ways I’ve grown because of it that I haven’t consciously documented, which I think is true of anything we practice. I’ve been doing it for just under two years. My husband and I went to a show on a date almost two years ago, in April of 2023, and I started taking classes afterwards. Initially, I just got into it because I noticed I was overthinking things. While we were at the show, I was sitting there and I thought, you know who doesn’t overthink things? Improv actors, because they just don’t have time. It’s happening.
Rebecca Shaddix 36:46
And so, yeah, I just got into it as a way of mitigating my overthinking. I didn’t enjoy it at first. I mean, I didn’t have any grand expectations that this would become a hobby. I thought it would be kind of a chore I forced myself to do because I thought it was good for me. And for the first year, it certainly was that: something I dreaded but did because it was good for me. Over time, I’ve come to really appreciate the camaraderie. The real way to do improv well is to be there for your scene partner, to set them up, to be more focused on what the scene needs than on what you need or on being funny. Things go well when you’re focused on giving your partner gifts or endowments that help the scene move forward, and things go poorly when somebody’s worrying about being funny.
Rebecca Shaddix 37:30
And I love that, because I think so much of life is worrying about how we’re performing or perceived. When you just ask, what does this scene need, what does this situation need, you can go a lot farther. So I think that’s a big one. Not overthinking things is another big one. Before you go out for shows, there’s this ritual of patting people on the back: I got your back, I got your back. And I just like that camaraderie, really making sure that pro-social, collaborative approaches are the core focus, giving people what they need, bailing a scene partner out if things need to end or be edited. But I think there’s probably more. The big ones are not overthinking things and not being so self-critical. Really, the way I got myself to go to classes, and especially to my first show, when I really did not want to do it at all, was just by saying: you regularly speak in front of a lot more people than this, and a lot of those people can fire you. So what’s the worst that can happen? You look stupid in front of 40 people. You’re fine. You’re gonna survive. That mindset of “what’s the worst that could happen” helped me lower the stakes of a lot of my thinking, because my default is often over-preparation.
Rebecca Shaddix 38:47
So whenever I feel insecure, preparing, preparing, preparing is my default, and just showing up half an hour before, having no idea who’s performing, what you’re doing, or what the theme is, frustrated me a lot. I was like, come on, if we just put in 20 minutes of preparation, this would go so much better. But accepting that that was part of the process was exactly what I needed. So, lower-stakes thinking, but also accepting that things are not so consequential, right? The best show of my life, the worst show of my life: the next day, it did not matter. The world goes on. And in life, I think that’s helpful, because the best project of your life, the worst project of your life, life goes on, and it’s not the end of the world. Helping me lower that catastrophizing has been great.
Christian Klepp 39:34
Those were some really, really great insights and life lessons too, I’m gonna say, especially the bit about overthinking things. I truly believe that’s a lesson B2B marketers can learn from, because there is a lot of overthinking, a lot of analysis paralysis and opinion-itis. That’s not to say that you shouldn’t be planning at all. That’s not what I’m saying, right?
Rebecca Shaddix 39:58
Yeah, no, I think “know your miss” is the thing that comes to mind. I got that from disc golf. If you tend to be an over-preparer like me, shipping things before you feel they’re ready is probably your miss. If you tend to be a little more spontaneous, maybe a little more preparation is helpful for you. Knowing your default really helps you decide: feeling like you’re doing too much prep may be right if you’re on one side, and feeling like you’re shipping prematurely may be right if you’re on the other. Just knowing your default is helpful.
Christian Klepp 40:28
Absolutely, absolutely. Rebecca, this conversation was dynamite. Thank you so much for coming on the show and for sharing your experience and expertise with the listeners. Please give a quick intro to yourself and let folks out there know how they can get in touch with you.
Rebecca Shaddix 40:43
I love LinkedIn. Let’s connect on LinkedIn — Rebecca Shaddix. Put in a note that you listened to this here, and I’d love to keep the conversation going.
Christian Klepp 40:50
Fantastic, fantastic. So Rebecca, once again, thanks for your time. Take care, stay safe and talk to you soon.
Rebecca Shaddix 40:56
Thank you. It’s been fun. You too.
Christian Klepp 40:58
Bye for now.
© Copyright 2025 EINBLICK Consulting Inc.
All rights reserved.