Marketing teams are trusted to tell brand stories effectively. And yet, many can’t tell their own story internally – one that proves marketing is an investment contributing to the top line, not an expense detracting from the bottom line.
It’s an important discussion that shouldn’t be overlooked. If it’s buried, the marketing team can expect its budget, and even its CMO or marketing leader, to fall right along with it.
Why CMOs are defenseless against budget cuts
A typical scenario sees a company’s CFO tell the CMO that their budget will be reduced by 10% for the coming year. The CMO disagrees but doesn’t have the data to prove that this will result in sales losses.
The decision is made.
But the conversation can go differently if the CMO shows that every $1 spent on marketing results in a $3 increase in sales and that the 10% budget reduction creates a quantifiable decrease in revenue. This new conversation can only happen when the proper research, modeling, and testing tools and methodologies are in place.
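To make that second conversation concrete, here is a minimal sketch of the math in code. The 3:1 return and the 10% cut come from the example above; the $10 million budget is an assumed figure purely for illustration.

```python
# A minimal sketch of the CFO conversation in numbers.
# The 3:1 return and 10% cut are the article's example; the $10M budget is an assumed figure.

current_budget = 10_000_000      # assumed annual marketing budget
revenue_per_dollar = 3.0         # every $1 of marketing returns $3 in sales
proposed_cut = 0.10              # the CFO's proposed 10% reduction

lost_spend = current_budget * proposed_cut
lost_revenue = lost_spend * revenue_per_dollar

print(f"Cutting ${lost_spend:,.0f} of marketing spend")
print(f"implies roughly ${lost_revenue:,.0f} in lost sales.")
```

Swap in your own budget and your own measured return, and the projected revenue loss becomes the centerpiece of the budget discussion rather than a matter of opinion.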
Erase gaps in your research, modeling, and testing
The Harvard Business Review finds that “more than 80% of surveyed marketers were dissatisfied with their ability to measure marketing ROI.” Analytics can largely solve this challenge, but only if you know what, how, and when to analyze your performance. Through the careful and persistent use of these tools and techniques, marketers can improve their campaign effectiveness and quantify incremental improvements.
Let’s start with research. There are many fine market research vendors out there, and you may have a small team internally that can properly design and deploy surveys. However, that’s just the beginning. The more difficult and rewarding step is to extract insights, connections, and patterns from the research and then use those learnings to shape and refine a marketing plan. This takes access to a wide swath of data (in source, time, and specificity) and the right analytical resources. Here there’s no faking it until you make it: a team of data scientists thinks and acts differently than people who simply know how to use Excel.
Are your research efforts merely collecting data? Or, are they disseminating insights that alter decisions across your organization? Are those insights serving as inputs to subsequent modeling or test plans? Finally, are those results feeding back to your research method, thereby completing the virtuous circle?
Next up, modeling. You need to know what’s working and what’s not so you can allocate your marketing investment accordingly. There are a variety of techniques for doing this and a plethora of vendors offering solutions. Most importantly, be aware that a single monolithic solution is unlikely to answer all of your questions; consider using a collection of hierarchical or nested models instead. Which of these to use depends on the questions you are trying to answer, but keep in mind that every model has gaps in what it can do and answer.
Two popular approaches to such measurement are Media Mix Models (MMM) and Digital Attribution Models. Each has its uses, but they are far from interchangeable. If your primary questions are strategic or budget-setting in nature, then an MMM may be a perfect place to start. If, however, you want to focus on the relative performance of media tactics and your KPI or conversion signal is digital, then a Digital Attribution Model is a better choice.
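To show what sits under the hood of an MMM at its simplest, here is a deliberately stripped-down sketch: a regression of weekly sales on spend by channel, fit on synthetic data. Production MMMs add adstock (carryover), saturation curves, seasonality, and non-media controls, so treat this as an illustration of the idea rather than a working model.

```python
# A deliberately simplified sketch of the regression at the heart of a media
# mix model (MMM): weekly sales explained by spend in each channel.
# The data is synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
weeks = 104

# Synthetic weekly spend by channel (in $ thousands)
tv = rng.uniform(50, 150, weeks)
search = rng.uniform(20, 80, weeks)
direct_mail = rng.uniform(10, 60, weeks)

# "True" contributions used only to generate the synthetic sales series
sales = 500 + 2.0 * tv + 3.5 * search + 1.2 * direct_mail + rng.normal(0, 40, weeks)

# Fit sales ~ baseline + channel spend via ordinary least squares
X = np.column_stack([np.ones(weeks), tv, search, direct_mail])
coefs, *_ = np.linalg.lstsq(X, sales, rcond=None)

for name, c in zip(["baseline", "tv", "search", "direct_mail"], coefs):
    print(f"{name:12s} estimated effect per $1k spent: {c:6.2f}")
```

The estimated coefficients are the per-$1,000 contribution of each channel, which is exactly the kind of output that feeds a budget-setting conversation.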
Quad has developed a model, the Tactical Media Effectiveness Model (TMEM), which is algorithmically similar to an MMM while yielding media tactic performance in a rather timely fashion. If your conversion is offline, or a significant amount of your investment is in offline channels, this may represent the best option for your organization.
Finally, once you’re using the proper modeling approach, test boldly. The methods noted above are unsuitable in cases where you wish to try something new because they are based on estimating the performance of what you’ve done in the past.
By “new” I don’t mean just new media channels or new messages, but also significantly increasing your media volume in the same channels with the same messages. In this situation the models yield estimates that are extrapolations well beyond the data they were built on, and well outside any reasonable range of confidence. When you want to try something new and decide that a test is in order, remember there is nothing more useless than an inconclusive media test. Tests are time-consuming and can involve significant hard costs, so they must be designed to ensure a conclusive result (even if that result is negative). You do this through bold tests.
How do you know if you’re testing boldly?
- Are you applying 10% of your media spend to the new testing strategy or 50%?
- Are you making slight color and copy changes to existing creative or testing entirely new packages or creative?
- Are you making a slight increase in your messaging frequency or doubling or tripling your output?
The bolder the adjustment, the more conclusive the test.
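A standard two-sample power calculation makes the point behind these questions explicit: the larger the lift you expect from a change, the fewer observations you need before the test reads out conclusively. The effect sizes below are illustrative assumptions expressed as Cohen’s d (the expected lift measured in standard deviations of your KPI).

```python
# Why bolder tests read out faster: a standard two-sample power calculation.
# Effect sizes are illustrative assumptions (Cohen's d).
from statsmodels.stats.power import TTestIndPower

power_calc = TTestIndPower()
for label, effect_size in [("timid test (small lift)", 0.05),
                           ("moderate test", 0.20),
                           ("bold test (large lift)", 0.50)]:
    n = power_calc.solve_power(effect_size=effect_size, alpha=0.05, power=0.8)
    print(f"{label:28s} needs ~{n:,.0f} observations per group for a conclusive read")
```

Timid changes need enormous samples to separate signal from noise; bold ones don’t.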
You may be thinking that such a testing approach is too costly, or that it could put a sales region, product, or brand at a significant disadvantage. Bold does not have to mean costly, and you can mitigate the risks through careful design.
Lastly, remember that each time you conduct a conclusive test you’ve learned something. By incorporating that learning into your normal business operations moving forward, your company does better. Therefore, establishing a culture of continuous improvement through testing, modeling, and research will yield the superior results you are seeking.
This journey can be arduous and it’s often hard to detect the marginal gains, but they add up to meaningful improvements if you put in the hard work and your organization is dedicated to the process. The easy and obvious improvements have already been made. You need to begin this journey right now if you hope to beat your competitors. And, when you start seeing results, that conversation with the CFO will be fundamentally different!