One of the big innovations in the F-16 fighter jet was its "fly-by-wire" flight control system. Instead of directly connecting the pilot's movements of the control stick and rudder pedals to the aircraft's control surfaces through cables (as in WWI-era biplanes) or hydraulics, the pilot's commands were communicated electronically to an intermediate computer, which interpreted those inputs and made the appropriate adjustments.
This saved a lot of weight, and channeling some of those weight savings into redundant control circuits made planes safer. Taken to its extreme in planes like the B-2 bomber, "fly-by-wire" made it possible for pilots to "fly" inherently unstable airplanes by leaving microsecond-by-microsecond adjustments to the intermediate computer, while the pilot (or autopilot) provided broader guidance about climbs, turns, and descents.
Now we have "fly-by-wire marketing".
A couple of days ago I read Daniel Roth's October 19 article on Wired.com titled "The Answer Factory: Fast, Disposable, and Profitable as Hell", describing Demand Media's algorithmic approach to deciding what content to commission and publish. The article is a real eye-opener. While we watch traditional publishers talk about turning "print dollars into digital dimes", Demand has built a $200 million annual revenue business with a $1 billion valuation. How? As Roth puts it, "Instead of trying to raise the market value of online content to match the cost of producing it — perhaps an impossible proposition — the secret is to cut costs until they match the market value." More specifically:
"Before Reese came up with his formula, Demand Media operated in the traditional way. Contributors suggested articles or videos they wanted to create. Editors, trained in the ways of search engine optimization, would approve or deny each while also coming up with their own ideas. The process worked fine. But once it was automated, every algorithm-generated piece of content produced 4.9 times the revenue of the human-created ideas. So Rosenblatt got rid of the editors. Suddenly, profit on each piece was 20 to 25 times what it had been. It turned out that gut instinct and experience were less effective at predicting what readers and viewers wanted — and worse for the company — than a formula."
I'm currently in situations where either the day-to-day optimization of the marketing process is too complex to manage fully through direct human intervention, or some of the optimizations to be performed are still sufficiently vague that we can only anticipate them at a broad, categorical level, leaving a subsequent process -- perhaps an automated one -- to realize them fully. I also recently attended and blogged about a very provocative MITX panel on personalization, where a key insight (thanks to Scott Brinker, Co-founder and CTO of ion Interactive) was how the process supporting personalization needs to change as you move to finer and finer-grained targeting. So it was with these contexts in mind that I read Roth's article, and the question it prompted for me was, "In a future dominated by digital channels, is there a generic roadmap for appropriate algorithmic abstractions of marketing optimization efforts that I can then adapt for (client-)specific situations?"
That may sound a little out there, but Demand Media is further proof that "The future's already here, it's just not evenly distributed yet." And, I'm not original in pointing out that we've had automated trading on Wall Street for a while; with the market for our attention becoming as digital as the markets for financial securities, this analogy is increasingly apt.
So here are some bare bones of what such a roadmap might look like.
Starting with the end in mind, an ultimate destination might be that we could vary as many elements of the marketing mix as needed, as quickly as needed, for each customer (you laugh, but the holodeck isn't that far away...), where the end result of that effort would generate some positive marginal profit contribution.
At the other end of the road, where we stand today, in most companies these optimization efforts are done mostly by hand. We design and set campaigns into motion by hand, we use our eyes to read the results, and we make manual adjustments.
One step forward, we have mechanistic approaches. We set up rules that say, "Read the incoming data; if you see this pattern, then make this adjustment." More concretely, "When a site visitor with these cookies set in her browser arrives, serve her this content." This works fine as long as the patterns to be recognized, and the adjustments to be made, are few and relatively simple. It's a lot of work to define the patterns to look for. And it can be a lot of work to design, implement, and maintain a campaign, especially if it has lots of variants for different target segments and offers (even if you take a "modular" approach to building campaign elements). Further, at this level, while what the customer experiences is automated, the adjustments to the approach are manual, based on human observation and interpretation of the results.
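To make the mechanistic level concrete, here's a minimal sketch of a hand-written rule set mapping visitor attributes to content decisions. All the names here (the `Visitor` class, the cookie keys, the content IDs) are hypothetical illustrations, not any particular vendor's API:

```python
from dataclasses import dataclass, field

@dataclass
class Visitor:
    cookies: dict = field(default_factory=dict)

# Each rule is a (predicate, content_id) pair. First match wins, so
# order matters -- and every new pattern means another hand-written,
# hand-maintained rule.
RULES = [
    (lambda v: v.cookies.get("segment") == "returning", "loyalty_offer"),
    (lambda v: v.cookies.get("campaign") == "fall_sale", "sale_landing"),
]
DEFAULT_CONTENT = "generic_home"

def choose_content(visitor: Visitor) -> str:
    """Serve the first content whose pattern matches this visitor."""
    for predicate, content_id in RULES:
        if predicate(visitor):
            return content_id
    return DEFAULT_CONTENT
```

Note that nothing in this loop learns: if a rule underperforms, a human has to notice it in the reports and edit `RULES` by hand, which is exactly the manual adjustment described above.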
Two steps down the road, we have self-optimizing approaches where the results are fed back into the rule set automatically. The Big Machine says, "When we saw these patterns and executed these marketing activities, we saw these results; crunching a big statistical model / linear program suggests we should modify our marketing responses for these patterns in the following ways..." At this level, the human intervention is about how to optimize -- not what factors to consider, but which tools to use to consider them.
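One simple stand-in for that "big statistical model" is a beta-Bernoulli Thompson sampler: it picks among content variants and updates itself from observed conversions, so the results feed back into the decision rule with no editor in the loop. This is an illustrative sketch, not a claim about what Demand Media or anyone else actually runs; the variant names and the conversion-as-reward framing are assumptions:

```python
import random

class ThompsonSampler:
    """Self-optimizing chooser: outcomes automatically reshape future choices."""

    def __init__(self, variants):
        # One (successes, failures) count per variant, starting from a
        # uniform Beta(1, 1) prior.
        self.stats = {v: [1, 1] for v in variants}

    def choose(self) -> str:
        # Sample a plausible conversion rate for each variant from its
        # posterior, then serve the variant with the highest draw.
        draws = {v: random.betavariate(s, f) for v, (s, f) in self.stats.items()}
        return max(draws, key=draws.get)

    def record(self, variant: str, converted: bool) -> None:
        # The feedback loop: each observed result updates the posterior,
        # so better-performing variants get served more often over time.
        self.stats[variant][0 if converted else 1] += 1
```

The human intervention here is exactly the kind described above: deciding to use a Thompson sampler at all (versus, say, a linear program or a regression model), not deciding which variant wins.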
I'm not clear yet about what's beyond that. Maybe Skynet. Or, maybe I get a Kurzweil-brand math co-processor implant, so I can keep up with the machines.
The next question you ask yourself is, "How far down this road does it make sense for me to go, by when?" Until recently, I thought about this with the fairly simplistic idea that single curves describe exponentially decreasing returns and exponentially increasing complexity. The reality is that there are different relationships between complexity and returns at different points -- what my old boss George Bennett used to call "step-function" change.
For me, the practical question-within-a-question this raises is: for each of these "step-functions", is there a version of the algorithm that's only 20% as complex, that gets me 80% of the benefit? My experience has been that the answer is usually "yes". But even if that weren't the case, my approach to jumping into the uncharted territory of a "step-function" change in process, with new supporting technology and people roles, would be to start simple and see where that goes.
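One way to make the 80/20 question concrete: an epsilon-greedy chooser is a far simpler self-optimizer than a full statistical model, yet it often captures most of the benefit. Everything here is an illustrative sketch, and the 80/20 claim is the article's hypothesis, not a guarantee:

```python
import random

def epsilon_greedy(rates: dict, epsilon: float = 0.1) -> str:
    """The '20% as complex' version: rates maps each content variant to
    its observed conversion rate so far."""
    if random.random() < epsilon:
        return random.choice(list(rates))   # explore occasionally
    return max(rates, key=rates.get)        # otherwise exploit the current leader
```

A few lines like this, fed by a nightly report, might be the "simplest version worth exploring" of a step-function; the full posterior-sampling machinery can come later if the simple version proves out.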
At minimum, given the "step-function" economics demonstrated by the Demand Medias of the world, I think senior marketing executives should be asking themselves, "What does the next 'step-function' look like?", and "What's the simplest version of it we should be exploring?" (Naturally, marketing efforts in different channels might proceed down this road at different paces, depending on a variety of factors, including the volume of business through that channel, the maturity of the technology involved, and the quality of the available data. I've pushed the roadmap idea further to help organizations make decisions based on this richer set of considerations.)
So, what are your plans for Fly-By-Wire Marketing?
Postscript: Check out "The Value Of The New Machine", by Steve Smith in Mediapost's "Behavioral Insider" e-newsletter today. Clearly things are well down the road -- or should be -- at most firms doing online display and search buys and campaigns. Email's probably a good candidate for some algorithmic abstraction.