Pragmalytics, Part II
When we launched our new firm earlier this year, the short description of the opportunity we saw was "marketing analytics services". Everyone we talked to about it was enthusiastic about the gap we proposed to fill, for the reasons we've previously described. And since then, we've made good progress, with the good fortune to work for some great clients and to produce clear and significant results for them. At the same time, the market has taught us some interesting things about the need that we're focused on serving.
The first and most important lesson has been that there is no market (so far) for "marketing analytics services" per se. All of our assignments so far have been motivated not by our clients' needs to "bolster their analytics capabilities", but rather by something more closely related to the "story" of the business: improving performance, figuring out how to incorporate and expand the use of digital (and especially social) channels in the marketing mix, or a major event or change of some kind (like launching a new product, channel or business).
Consequently, our assignments have been more "executive" than "functional". For example, we've been retained to design and run overall marketing programs, or to advise at this level. In these contexts, "analytics" have come to the fore as a means to an end, and "building the analytics capability" has been an ancillary deliverable. So our starting point has been to look for flags -- things that seem odd or missing, or quantitative performance that seems out of place based on internal and external benchmarks, the "business story", and our own experience -- that tell us where to focus "marketing analytics services".
For example, in a recent assignment with an overall charter to "drive revenue and build the brand", we noticed that one channel seemed to be performing below where we'd expect, and its under-performance had become a bit of a self-fulfilling prophecy. Looking more closely, we noticed two things. First, the number of people we were targeting seemed low. Second, looking back over a year, there were significant differences in the performance of offers with certain themes as compared to others, and the lower-performing themes had been over-emphasized. A match-back analysis revealed that information we already had internally would allow us to target several times as many customers, and that this disconnect was due to some broken technical plumbing that could be fixed easily. On the theme front, by mining past order data to organize customers into a few straightforward purchase-behavior categories and targeting accordingly, we identified opportunities to raise the performance of themes that hadn't been doing as well as we thought they should. Together, once implemented, these improvements will be worth a lot of money, both directly, in terms of the increased value from this channel alone, and indirectly, in terms of the substitution potential for segments deemed only marginally profitable when reached through other channels as part of the overall marketing mix.
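To make the segmentation step concrete, here's a minimal sketch of the kind of purchase-behavior bucketing described above -- not the actual analysis from the assignment. The file name, column names, and thresholds are illustrative assumptions.

```python
import pandas as pd

# Hypothetical extract: one row per order with customer_id, order_date, theme, revenue.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Look at the trailing year of purchase behavior.
cutoff = orders["order_date"].max() - pd.Timedelta(days=365)
recent = orders[orders["order_date"] >= cutoff]

# Per-customer summary: how often they bought, how much they spent,
# and which offer theme they actually bought most.
profile = recent.groupby("customer_id").agg(
    order_count=("order_date", "count"),
    total_revenue=("revenue", "sum"),
    favorite_theme=("theme", lambda s: s.mode().iat[0]),
)

# A few straightforward buckets to target against; thresholds are made up.
def bucket(row):
    if row.order_count >= 4:
        return "frequent"
    if row.total_revenue >= 500:
        return "big-ticket"
    return "occasional"

profile["segment"] = profile.apply(bucket, axis=1)

# Match segments to the themes they respond to, rather than over-mailing
# whichever themes happen to be in favor internally.
print(profile.groupby(["segment", "favorite_theme"]).size())
```

The point is that a handful of transparent buckets a marketer can explain is usually enough to re-balance which themes go to which customers.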
Second, with the exception of the biggest firms (and even then rarely), we haven't seen "analytics" get executed as monolithic efforts where complex models are built on huge data sets in separate labs. Rather, "analytics" tend to happen as a series of ad hoc queries, regressions, and tests across disparate data sets and interfaces, prioritized by the opportunities of the business rather than by "build it and they will come". Aimed at "on-the-fly" tuning of the business tied to the "story" described above, these efforts have been justified on a "pay-as-you-go" basis, with tight cycles of finding, validating, and realizing opportunities that bootstrap investment in further ones.
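As an example of what one of these small, ad hoc "tests" might look like in practice, here's a hedged sketch comparing response rates between two offer themes; the counts are invented for illustration, not drawn from a real campaign.

```python
# Quick ad hoc test: did offers with theme A out-respond theme B by more than chance?
from scipy.stats import chi2_contingency

# Rows: theme A, theme B. Columns: responded, did not respond (hypothetical counts).
table = [[420, 9580],
         [310, 9690]]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p_value:.4f}")
# A small p-value suggests the gap in response rates is unlikely to be noise --
# often enough evidence to re-weight the next campaign and keep moving.
```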
For example, we were recently asked to offer a proposal to help a firm better use online channels to drive offline sales. The range of issues we were asked to address was quite broad, but in the end the firm chose a partner based on more focused expertise in its sector, and especially in tuning sector-specific channels, with a refined knowledge-transfer objective: "We need to get smart first, and walk before we can run."
Third, for any given analytic insight, be it customer segmentation and associated offer/channel tuning or something else, there seem to be two versions: the "great" one that fully exploits the opportunity but is hard to grasp and often harder to implement, and the "good" one that is straightforward to grok and to go after. Don't try to start the car in fifth gear -- do the simple version that gets 30-40% of the benefit first, and use that momentum to build the cash, credibility, and informed perspective you need to shift up. More specifically, I suggested recently that analytics shouldn't get more than three months ahead of the ability to act on the answers.
Smart analytics platform vendors get this. Recently I've spent time with the top teams at two such firms, VisualIQ and Proclivity Systems. Each of them has a sensible migration path for ramping up practically to the full value of what it can do, by sequencing the data sets it combines and focusing the breadth and depth of the customers it targets.
Fourth, while the "data deluge" of the digital world demands that you improve your infrastructure for taking advantage of it, I've come to believe that we need to think about building a "data warehouse" as a continuous process rather than a single or episodic deliverable. To this end, what's important is having:
- a map that shows where all the data lives,
- a dictionary that explains what each piece of data means,
- a directory that explains how to get to the data,
- a guide that describes how routine analytics get done and associated decisions get made, and
- a library (searchable at minimum, if not indexed) that stores all of the individual bits of research (again: queries, regressions, tests, benchmarks) and conclusions you've reached in the past.
These artifacts need to be living documents, like Wikipedia, rather than static editions. But like Wikipedia, they need to be "curated". Or, if you prefer a different metaphor, think of them as parts of an "insight engine" for driving your business, one that collectively needs to be lubricated and maintained to run at peak performance. Based on recent assignments, I'm convinced there's a 20-30% time-to-market advantage associated with having these things up to snuff.
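As a hedged illustration of how lightweight these artifacts can be, here's a sketch that keeps the map, dictionary, directory, and library as simple structured records with a crude search over them. The field names and example entries are assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    name: str         # the "map": which data set this is
    description: str  # the "dictionary": what the fields mean
    location: str     # the "directory": how to get to it
    owner: str

@dataclass
class Finding:
    question: str     # the "library": past queries, regressions, tests
    method: str
    conclusion: str
    tags: list = field(default_factory=list)

catalog = [
    DataSource("orders", "one row per order: customer, date, theme, revenue",
               "warehouse.orders (nightly load)", "ecommerce team"),
]
library = [
    Finding("Do 'value' themes out-respond 'style' themes?",
            "chi-square test on the last 12 months of email sends",
            "yes, but only for the 'occasional' segment", ["email", "themes"]),
]

def search(entries, term):
    """Crude keyword search so past work gets found before it gets redone."""
    term = term.lower()
    return [e for e in entries if term in str(e).lower()]

print(search(library, "themes"))
```

The point is less the format than that the records are cheap to update and cheap to search, so prior research gets reused rather than repeated.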
(As for technical issues, and in particular what's required to support integration: "hit-and-run" manual integration ("dump flat files out of systems X and Y, then import, scrub, integrate, and analyze them in Excel or something more exotic") is expedient for things you do only once or twice a year, like planning a seasonal catalog mailing, but unsustainable for anything more frequent. At the same time, trying to get everything into a single system is likely to pay off only if you plan to crunch that data very frequently -- say, every day or at least a few times a week -- as when you're targeting offers in real time in an online store. For channels and campaigns with "intermediate" timing, like email, or changing online ad copy or offers through affiliate programs, a "Service Oriented Architecture" ("SOA") approach (to support "mashups" of a number of "data warehouselets" for analytics projects) may make more sense.
You also need to consider the skills and experience of your marketing team. If every analytics project requires help from IT to assemble the necessary data, you may need some sort of intermediate interface and/or training to ease that bottleneck. Conversely, trying to build the ultimate interface is a quixotic quest guaranteed to be crushed under its own weight at some point. An emerging litmus test for me is the degree to which marketers in any given firm have moved from using spreadsheets (Excel) to databases (Access, etc.) for ad hoc analytics. Either extreme is problematic, though I haven't yet seen the marketing team that is 100% SQL-literate. Great tutorial for non-programmers here, BTW.)
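For what it's worth, here's a hedged sketch of the "hit-and-run" flat-file pattern, the baseline against which the heavier options get compared. The file names, columns, and join key are assumptions for illustration.

```python
import pandas as pd

# Dump an extract from each system, join them, and analyze.
crm = pd.read_csv("crm_export.csv")             # e.g., customer_id, email, segment
web = pd.read_csv("web_analytics_export.csv")   # e.g., email, visits, last_visit

# Scrub: normalize the join key, since each system formats it differently.
for df in (crm, web):
    df["email"] = df["email"].str.strip().str.lower()

combined = crm.merge(web, on="email", how="left")
print(combined.groupby("segment")["visits"].mean())
```

Done twice a year, this is fine; done weekly, the scrubbing and re-joining become the argument for a warehouse or a service layer.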
All this has led to a deeper reflection on the meaning of the current "analytics" craze. How much is hype, and how much is real? The first thing to consider is that we're all using "analytics" already, to differing degrees, to answer "how can I usefully define target customers, and what combination of offers, communications, and channels gets them to respond?" So prioritizing "analytics" is really a question of degree and direction rather than a discrete decision. In fact, ultimately, it's about business governance. How much decision-making -- and for that matter, learning -- will I centralize, through models and tests developed and interpreted by analytic "experts", versus delegate, through selection, training and development, to managers and employees?
To make "analytics" a higher priority for your business going forward, you have to believe:
- more sophisticated tests and models would provide insights and potential returns that would justify prioritizing them
- these enhancements lie within what's practically and affordably possible
These tests move the prioritization decision from philosophical evangelism to practical assessment, based on how competitively sophisticated you are in the parts of your marketing program where it matters to be so. They also move the decision away from generally surfing a trend, and toward selectively focusing on building your capability.
So, we're tuning the business to address these lessons...
You are totally correct in that no marketing team is 100% SQL-literate. In fact, I haven't seen anything more than 25%, and most of those people use SQL on Access. Another tutorial I've found that's good for non-techies is http://www.1keydata.com/sql/sql.html
Posted by: Tanner Lee | January 13, 2012 at 10:32 PM