I'm a partner in the advanced analytics group at Bain & Company, the global management consulting firm. My primary focus is on marketing analytics. I've been writing here (views my own) about marketing, technology, e-business, and analytics since 2003.


August 13, 2009

Marketing Analytics Management

As marketing goes digital, it becomes more accountable:  targetable, testable, trackable.  This is good -- in theory.  In practice, digital marketing has turned many marketers into sorcerer's apprentices who just can't get ahead of the flood.  It's not for lack of analytic smarts, or of technology to support the work; rather, the challenge is collective coordination.  

Some marketing teams compensate by hunkering down in channel-defined silos: "You do email, I do affiliates, he does SEM, she does catalogs, those guys run the stores."  But by now most people are hip to the idea that the best, most effective customer experiences optimize efforts to attract, engage, and convert customers across channels.  Unfortunately, most marketing teams, and the ecosystem of agencies, technology vendors, research firms, and consultants that support them, are not organized to integrate analytics and tie them comprehensibly and usefully to the mix-, campaign-, and customer-level decisions such experiences require.  Marketing analytics today are 110V appliances trying to plug into 220V strategic sockets, and what the world needs is a good adapter.  

"Marketing analytics management", to coin a term, needs to move from an ad hoc set of informal activities run by a "priesthood" to a formally defined and managed process that is more broadly accessible.  As with any process, the basic questions are:

  • what are the outputs?
  • what are the inputs? 
  • what happens along the way?  
  • who's involved, and how? 
  • what resources are needed to run the process?
  • how do we evaluate and manage the process?  

As a senior retailing executive said to me last fall, "I need my marketing team to think less like accountants, and more like decision makers!"  The point: stop thinking that the process ends with a regular report whose chief virtue is that it's accurate -- or, even, that the process ends with one or more particular trends or insights about customer behavior that we should pay attention to.  

Rather, each and every analytic effort needs to have a keep, change, drop, or add implication for the current marketing plan -- itself defined as the set of decisions (expressed as actions) about whom to target, how (what products, price, promotion, etc.).  In a formal "marketing analytics management" context, this means an explicit, documented two-way association.  Top-down, following explicit logic (expressed through models like the ones suggested by The Pyramid Principle), the marketing plan needs to selectively "commission" analytics efforts to support and monitor documented hypotheses and decisions about what to spend on and how to execute.  Conversely, any ad hoc, bottom-up analytic effort should seek a relevant decision as a "patron" for the effort.  A more complete "outputs data structure" might look like this:

  • potential decision (action)
  • analytic initiative
  • stakes (in dollars, likely multiple fields) 
  • deadline for decision (for example, marketing plan due dates, important events, etc.) 
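
The "outputs data structure" above can be sketched as a simple record.  A minimal sketch, where the field names and the sample entry are illustrative assumptions rather than a prescribed schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AnalyticsRequest:
    """One row tying an analytic initiative to the decision it informs."""
    decision: str        # potential decision (action)
    initiative: str      # analytic initiative commissioned to inform it
    stakes_low: float    # stakes in dollars -- low estimate
    stakes_high: float   # stakes in dollars -- high estimate
    deadline: date       # date by which the decision must be made

# Hypothetical example entry
req = AnalyticsRequest(
    decision="Shift 10% of catalog budget to SEM",
    initiative="Match-back analysis of catalog vs. SEM conversions",
    stakes_low=250_000.0,
    stakes_high=400_000.0,
    deadline=date(2009, 10, 1),
)
```

The point of even this much structure is that an analysis with no `decision`, no `stakes`, or no `deadline` is immediately visible as an orphan.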


For each decision to be made, and for any associated analysis to be done, we have to have:

  • data that's easily findable, sufficiently understood, sufficiently clean, and sufficient;
  • tools up to the complexity of the analytic task;
  • a sense for how accurate and rigorous the answer has to be given the stakes involved in the decision to be made;
  • a collective memory of relevant past decisions and analysis supporting them.

First, I think it's much more important to know where data lives, and to understand and trust it, than to have lots of it.  If you have the first two, you can always get more data when you need it, whereas the reverse just makes a small problem bigger.  This is especially true now that the problem du jour to be solved is integration.  To this end, as I've suggested before, it's important to have a data management framework that includes the following elements:
  • a map that shows where all the data lives;
  • a dictionary that explains what each piece of data means;
  • a directory that explains how to get to the data;
  • a log (just like the inspection logs in elevators) that documents periodic audits making sure that everything is flowing as it should.

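
One lightweight way to hold the map, dictionary, directory, and log together is a single registry with one entry per data element.  A sketch, where the field names and the sample entry are illustrative assumptions:

```python
from datetime import date

# One registry entry per data element; hypothetical example content.
data_registry = {
    "email_clickthroughs": {
        # map: where the data lives
        "location": "esp_warehouse.clicks",
        # dictionary: what the data means
        "definition": "One row per tracked link click per recipient",
        # directory: how to get to it
        "access": "Read-only SQL via the ESP reporting schema",
        # log: periodic audits that everything is flowing as it should
        "audit_log": [
            {"checked": date(2009, 7, 1), "flowing_ok": True},
            {"checked": date(2009, 8, 1), "flowing_ok": True},
        ],
    },
}

def last_audit_ok(name: str) -> bool:
    """Return whether the most recent audit of a data element passed."""
    return data_registry[name]["audit_log"][-1]["flowing_ok"]
```

Even a flat file in this shape beats tribal knowledge: the audit log in particular gives you something to check before trusting an analysis built on the element.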
Second, as for tools, I've been on both sides of software vendor pitches, and my preference is always to think "decision-in" rather than "analysis-out".  Today's tools, even free ones like Google Analytics, are quite powerful -- veritable Swiss Army knives for analytic expeditions.  You can spend a lot of time watching a sales engineer demonstrate everything you can do with them.  It's much better to go into these meetings and say, "Here are the decisions we're trying to make, the analysis we think we need to support them, and the data we can reasonably get and rely on.  How would your tool collect, scrub, store, aggregate, and synthesize to support our decisions?  How long would it take to implement, configure, and modify to do that?  How much of this can we do ourselves?"  If you're going to take this approach, it helps to have:
  • a guide that describes how routine analytics get done and associated decisions get made.

Third, think creatively about statistical validity.  In a recent pitch to a media firm, we pointed out that by concentrating sampling on customer segments whose disposable incomes made fine-grained targeting more likely to be worthwhile, and by relaxing confidence levels from 99% to 95% and widening confidence intervals from 5% to 10%, we could reduce the total necessary sample size by 85%, and associated research costs by nearly 40% (a big piece of those costs is fixed, of course).  In short, we focused the bulk of the research on where it would pay off -- thus managing research dollars the same way we would have optimized, say, a PPC campaign with them.
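
That 85% figure falls straight out of the standard sample-size formula for estimating a proportion, n = z²·p(1−p)/e².  A quick sketch, assuming the worst-case p = 0.5:

```python
def sample_size(z: float, margin: float, p: float = 0.5) -> float:
    """Required sample size for a proportion: n = z^2 * p(1-p) / e^2."""
    return z * z * p * (1 - p) / (margin * margin)

# Two-sided z-scores for the two confidence levels
Z_99, Z_95 = 2.576, 1.96

n_strict = sample_size(Z_99, 0.05)   # 99% confidence, +/-5% interval
n_relaxed = sample_size(Z_95, 0.10)  # 95% confidence, +/-10% interval

reduction = 1 - n_relaxed / n_strict
print(f"{n_strict:.0f} -> {n_relaxed:.0f} respondents ({reduction:.1%} fewer)")
# prints: 664 -> 96 respondents (85.5% fewer)
```

Most of the leverage comes from widening the interval, since the margin of error enters the formula squared; relaxing the confidence level alone would only cut the sample by about 42%.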

Finally, it's always tempting to short-change documenting and archiving past decisions and analysis when budgets and timelines are tight.  But that's only true if you get artificially fancy.  At the cheap extreme, get a Gmail account, set up filters and labels, then have folks send relevant stuff to it.  Voila:
  • a library (searchable at minimum, if not indexed) that stores all of the individual bits of research (again: queries, regressions, tests, benchmarks) and the conclusions you've reached in the past.

Just as data need to be scoped to be appropriate to the stakes for a given decision, so should a process -- the steps and people necessary to go through to get a decision.  Figure out the deadline for the decision, subtract the days required for the review process, along with the days required to get higher-priority work done, and you're left with the limited time you have to do the necessary design, data collection, analysis, and packaging.  Presto, discipline.
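
The back-planning above is just subtraction, but writing it down makes the discipline explicit.  A sketch, where all the day counts are made-up illustrations:

```python
# Hypothetical back-planning for one decision
days_to_deadline = 30   # days until the decision is due
review_days = 5         # days the review process takes
higher_priority_days = 10  # days already committed to higher-priority work

# What's left for design, data collection, analysis, and packaging
analysis_window = days_to_deadline - review_days - higher_priority_days
print(f"Days available for the analytic work itself: {analysis_window}")
# prints: Days available for the analytic work itself: 15
```

When `analysis_window` comes out small or negative, that's the signal to descope the analysis, not to slip the decision.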


At dinner recently, IBM Research's Kate Ehrlich observed she's seen three main roles in heavy-duty analytics efforts:
  • decision-maker
  • subject-matter expert 
  • data analyst 
These are often, and appropriately, distinct people.  But it's really important that the folks in each of these roles be able and willing to communicate with each other -- that they know enough, and are curious enough, to ask good questions, so that their collective sniff tests on the size, process, and conclusions of any analytic effort are accurate ones.  (For me, Avinash Kaushik's Web Analytics: An Hour A Day represents the gold standard for thinking about what the "101" level of understanding might be in that analytic category specifically, but perhaps, by analogy, more generally as well.)


A recurring theme behind the points above is that you should know where you're trying to go with your analytic efforts, and scope the work, time, tools, and people involved accordingly.  It's been my experience that there are exponentially diminishing returns and similarly exponentially increasing costs to analyses seeking to explain variances beyond a handful of variables (corroborating case in point: predicting tweet virality), or to cut uncertainty by a few percentage points (as above), or to involve more than a handful (3-5) people in primary roles in a review process. Again, all this isn't meant to dissuade us from such complexity -- it's just to make sure that the stakes make it worth it.  I've also experienced the costs of NOT maintaining a tight ship when it comes to the data framework described above -- not just in terms of assembling data and re-learning lessons, but in terms of missed revenue and margin opportunities when bad data hide underperforming operations.

So how do you know if you've got a healthy "Marketing Analytics Management" process?  In most places, the associated work and resources are managed as cost centers.  But just as we work to make marketing campaign costs (creative, media) more accountable, it makes sense to work some allocation of the associated research and analytics costs into ROI calculations as well.  Doing so has two important benefits.  First, it associates analysis with decisions.  Winning decisions end up transferring credit to the insights and people that helped generate them; decisions that were over-analyzed to the point where results dragged can be tracked as well.  Second, it isolates and illuminates analytics and related resources that aren't associated with any decisions, or with decision-makers who will commission them.  A corollary requirement to make this work is that allocations for analytics shouldn't necessarily be mandatory.  If a decision-maker can get what he or she needs from other sources, or can successfully fly by the seat of the pants, more power to them.  But with freedom comes responsibility...

Once again, the point of this post is to suggest to senior marketers some pragmatic "scaffolding" they might adopt to address the sorcerer's apprentice problem.  That way, they don't have to rely on the "Master's" intervention: Mickey may have gotten off with a cute, sheepish look, but in today's business climate things are likely less forgiving for the average executive.

