
Marketing measurement is hard. Good marketing measurement is really hard. A recent Deloitte Digital survey of U.S. marketing measurement leaders cited “a lack of top-down vision from leadership regarding measurement, as well as poor buy-in from stakeholders.” In fact, only 36% of those leaders reported a strong understanding of media’s impact on business outcomes. This is understandable, considering the numerous and complex data sources used to evaluate channel and campaign performance, as seen in the following image.

So, what are marketers to do in the face of all this data? How do marketers navigate the waters of estimating media value with so much data and so many possible outputs? This becomes even more challenging when those outputs don’t all “agree” with each other. The truth is that measurement outputs rarely, if ever, perfectly align. The path forward, then, is to evaluate and incorporate them all to some degree.

Understanding all of the methodologies at play and attempting to bring these disparate sources together can feel daunting. Many refer to this as “analysis paralysis”; however, the cost of doing nothing is often much higher. While there is no one right way to “triangulate” these data sources, there are FOUR key best practices that marketers can rely on, particularly when attempting to draw reasonable and decisive conclusions in order to quickly take action and drive business outcomes.

Best Practice #1: Comparing against financial TARGETS

The amount of effort put into “triangulating” measurement outputs depends in part on how they compare to financial targets. If measurement outputs roughly “tell the same story” in terms of being above or below target (Scenario 1 below), then interpretation and action may be relatively clear. However, if the results land on either side of the financial target line (Scenario 2 below), the interpretation isn’t as clear and applying the remaining best practices below will likely be needed.
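As an illustrative sketch (not part of the original exercise), the “tell the same story” check can be expressed as a simple comparison of each measurement output against the target. The function name and CPA figures below are hypothetical:

```python
# Hypothetical sketch: deciding whether measurement outputs "tell the
# same story" relative to a financial target. CPA estimates and the
# $40 target are illustrative, not from any real campaign.

def triangulation_needed(cpa_estimates, target_cpa):
    """Return False when all outputs land on the same side of the
    target (Scenario 1); True when they straddle it (Scenario 2)."""
    above = [cpa > target_cpa for cpa in cpa_estimates.values()]
    return any(above) and not all(above)

scenario_1 = {"MMM": 35, "Incrementality": 32, "MTA": 38}
scenario_2 = {"MMM": 35, "Incrementality": 48, "MTA": 38}

print(triangulation_needed(scenario_1, 40))  # False: all below target
print(triangulation_needed(scenario_2, 40))  # True: mixed signals
```

When the function returns True, interpretation isn’t clear-cut and the remaining best practices come into play.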

Best Practice #2: Considering the CONTEXT of measurement outputs

Measurement systems and methodologies rely on different data sets as inputs and similarly will output data at different levels of granularity, across different groupings of channels, at different frequencies, etc. Understanding the context of each measurement output is critical for being able to compare and triangulate it against other signals.

For example, MMMs typically output results on a quarterly basis and incorporate a large amount of historical channel performance data, as well as other factors such as seasonality, weather, competitor activity, promotions, etc., whereas Incrementality Test outputs are more timely but represent a “snapshot in time.” So, for example, if the Q4 holiday season is on the horizon, one might rely more heavily on MMM than on an Incrementality Test conducted 6 months ago, because seasonality is one of the most critical factors to take into account. Conversely, if there was a dramatic shift in the market (due to technology changes, global events, etc.), a recent Incrementality Test would reflect this much better than an MMM relying almost entirely on data from before the shift. When comparing measurement outputs, context matters.

Best Practice #3: Understanding Which Measurement Signals Carry Weight and WHY

Organizations are complex, and oftentimes siloed groups within them rely on different (and often competing) measurement systems and outputs. A CMO may look to an MMM model to support long-term planning, while the head of eComm may use MTA to make cross-channel decisions, whereas a Paid Search manager may use Site Analytics data to justify branded search spend. Understanding which stakeholders are using which measurement outputs and why is a first critical step in being able to bridge the gap between them.

Separately, organizations are often slow to adopt new technologies and methodologies. Onboarding a new MMM or Incrementality solution may be exciting for marketing teams, but often finance, and sometimes analytics, teams resist the perspective and insight that these new systems bring so as not to “ruffle any feathers” or because the conclusions don’t align with their goals.

Best Practice #4: Connecting Methodologies Back to a Business OUTCOME

Good measurement always begins with a business question and has a business outcome in mind. Measurement methodologies are then mapped against the question to form a hypothesis and to execute analysis and/or testing that drives toward an answer and the resulting decisions and actions.

For example, if the business question is, “How do I best plan my media mix for 2024?,” the most appropriate methodology to address this will likely be Marketing Mix Modeling given the omnichannel nature of most MMM models, with other methodologies potentially playing a supporting role.

If the business question being asked is, “Can I scale up OTT by a factor of 3X and still achieve a reasonable ROI?,” MMM is not best suited to answer this question because of the statistical limitations in extrapolating outside of historical ranges. Instead, a carefully designed multi-cell geo-matched market test where individual cells simultaneously test significantly different spend levels (at a fraction of the cost) might be most appropriate.

Example “Triangulation” Exercise

Following is a (somewhat simplified) scenario showing how to address two specific business questions around investment decisioning between channels. The below table shows Publisher-Reported spend and conversions, along with various estimated CPAs across a number of measurement systems, for two important channels+tactics: 1) YouTube Prospecting, and 2) Google Non-Brand Paid Search. Let’s assume for this exercise that our target Cost per Action (conversions, in this case) is $40.

The two key business questions / challenges to be addressed are:

  1. “I need to decide which channel+tactic is performing better to re-allocate dollars to optimize my mix as needed.”
  2. “I’ve just been approved for $250K in incremental budget in an effort to bolster revenue. Can I scale up in either/both of these two channels+tactics?”

Question / Challenge #1: Optimizing channel mix

In this example, the goal is to compare performance between these two channels and then benchmark against the CPA Target to decide whether one or both warrant investment at existing or adjusted levels. Below is a visual representation of the measurement outputs across the two channels.

What first jumps out is that there’s no apparent “clear winner.” However, digging into the details and context of these outputs might shed some light. For YouTube, Site Analytics shows poor performance, but this is expected given that its click-based nature is at odds with the view-based nature of YouTube, so we’ll significantly discount this result. MMM shows solid performance. A previous in-market incrementality test (albeit conducted 6 months prior, as noted above) reinforces this with a really strong result. Overall, signals point to YouTube likely performing somewhere around target.

For NB Search, MMM similarly shows strong performance. The Publisher-Reported results show promise, but interpretation depends entirely on which internal walled-garden attribution model Google is using to report the data, so more research is needed on how to read this result. MTA and Site Analytics, both traditionally click-based attribution systems, show unfavorable performance for a channel that is nearly entirely click-based.
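The discounting logic described above (e.g., heavily down-weighting Site Analytics for a view-based channel like YouTube) can be sketched as a weighted blend of CPA estimates. All numbers and weights below are illustrative assumptions, not figures from the exercise’s table:

```python
# Hypothetical sketch of blending CPA estimates with context-based
# weights. The CPA values and weights are illustrative only.

def blended_cpa(estimates, weights):
    """Weighted-average CPA across measurement sources."""
    total_w = sum(weights[src] for src in estimates)
    return sum(estimates[src] * weights[src] for src in estimates) / total_w

youtube = {"Site Analytics": 120.0, "MMM": 38.0, "Incrementality": 30.0}
# Discount Site Analytics to near zero given YouTube's view-based nature.
yt_weights = {"Site Analytics": 0.05, "MMM": 0.5, "Incrementality": 0.45}

print(round(blended_cpa(youtube, yt_weights), 2))  # ~38.5, near the $40 target
```

The weights themselves are a judgment call grounded in Best Practice #2: the context of each output determines how much it should count.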

Overall, signals seem muddled at best, and likely point to NB Search underperforming relative to YouTube. Next steps could include:

  1. Rebalance dollars away from NB Search towards YouTube, while digging into ways to improve NB Search campaigns.
  2. Revisit YT with another incrementality test to support initial findings.
  3. Work with a partner who can design and help deploy a NB Search incrementality test (and perhaps include YouTube, as noted in Scenario #2 below)

Question / Challenge #2: Scaling investments

In this example, the goal is to understand if one or both channels can be confidently and quickly scaled up (and to what degree) based on existing measurement outputs. The below image is a refresher from the previous section.

Significantly and confidently scaling a channel requires measurement outputs within or near the range of historical and/or “tested” spend. MMM typically works well up to ~20% beyond historical spend levels; beyond that, performance is difficult to predict. Incrementality testing is best suited to address this question, since a test can be designed to simultaneously evaluate significantly different spend levels.

In this case, for YouTube, we’ll again discount Site Analytics results (see Scenario 1). Both MMM and a previous incrementality test show efficient performance at/below the Target CPA. However, incrementality tests can become “stale,” so carefully consider seasonality and other recent factors. Regardless, it appears there’s some room here to explore.

The next step could be to immediately push spend up ~20% to “see what happens,” while simultaneously designing a multi-cell incrementality test at increasing levels of spend (1.5X, 2X, 3X, etc.) in an attempt to “carve out” a diminishing returns curve and identify the spend level where performance meets the $40 CPA.
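To make the diminishing-returns idea concrete, here is a minimal sketch of what that curve-fitting step could look like: fit a power curve (conversions = a · spend^b, with 0 < b < 1) to multi-cell test results, then solve for the spend level where CPA hits the $40 target. The test-cell numbers are invented for illustration; a real test would fit the curve to observed incremental conversions per cell:

```python
import math

# Hypothetical multi-cell geo test results: (spend, incremental conversions).
cells = [(100_000, 3_300), (150_000, 4_300), (200_000, 5_100)]

# Log-log least squares to estimate b (spend elasticity) and a (scale),
# since conversions = a * spend^b is linear in log space.
xs = [math.log(s) for s, _ in cells]
ys = [math.log(c) for _, c in cells]
n = len(cells)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2)
a = math.exp((sum(ys) - b * sum(xs)) / n)

# CPA(spend) = spend / (a * spend^b) = spend^(1-b) / a,
# so CPA = 40 when spend = (40 * a) ** (1 / (1 - b)).
breakeven_spend = (40 * a) ** (1 / (1 - b))
print(f"elasticity b ≈ {b:.2f}, breakeven spend ≈ ${breakeven_spend:,.0f}")
```

With these illustrative cells, CPA rises from roughly $30 at $100K to roughly $39 at $200K, so the fitted breakeven lands a bit above $200K; the point of the exercise is the shape of the curve, not the specific numbers.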

For NB Search, things get more complicated. Site Analytics (which, again, favors click-based media) and MTA (which usually rebalances credit toward, and improves the measured performance of, upper-funnel media like NB Search) surprisingly show very challenged results. Simultaneously, MMM shows considerable promise. So, next steps here depend on risk tolerance and business priority. Options could include:

  1. Do nothing and focus efforts on the more obvious opportunity with YouTube.
  2. Rely on MMM to push spend up 10-20% and hope for the best.
  3. Given the relatively significant investment in this channel ($500K vs. $200K on YT), work with a partner who can design and help deploy a geo-matched market incrementality test to evaluate scale opportunity (Bonus points for designing a geo test that includes an “overlap” cell to investigate the channel synergies between YouTube and NB Search).

We Can Help

Good measurement isn’t easy. Transparent Partners helps brands align on best practices: understanding the most relevant methodologies for key business questions and navigating disparate measurement sources to confidently identify where to best invest the next marketing dollar — connect with us here.
