I'm a partner in the advanced analytics group at Bain & Company, the global management consulting firm. My primary focus is on marketing analytics (bio). I've been writing here (views my own) about marketing, technology, e-business, and analytics since 2003 (blog name explained).

66 posts categorized "e-business"

June 01, 2009

The Age of Analytics, Part II: It's not the Technology, it's what you do with it

When we started our "marketing analytics agency" business nearly a year and a half ago, we were reacting to a set of emerging needs and opportunities wrapped up in a zeitgeist we called "The Age Of Analytics". In the midst of a tough economy, it's nice to see (in addition to the clients we have the good fortune to work for) validation of the concept, both in yesterday's NYT (discussing the emergence of data practices at ad agency holding companies) and in today's MediaPost. The latter article, summarizing research by Forrester Research's John Lovett, offers some especially interesting forecasts:

  • spending on web analytics (software and support services) will double to nearly a billion dollars in the next five years;
  • the growth is being driven principally by marketers trying to figure out what's happening across media channels, not just within a web site;
  • most of the growth will come from organizations with between half a million and two million uniques;
  • 58% of sites surveyed through the WASP service have analytics installed;
  • the average cost of hosted analytics services is $15,000/year, though 73% use free (read: GA) services;
  • of the folks using paid services, over a third also use a free (read: GA) service as well.
The article presents a wonderful picture from John's report that perfectly describes the market as we see it today (see this earlier post on "Pragmalytics", which describes our approach). Most organizations have some analytics system installed, but they're still struggling to work through data issues. The challenge ultimately is to cross the "action chasm", or as one senior executive told us last year, "Our challenge is to get our marketing team thinking less like accountants and more like decision makers!"


Postscript from John's "Analytics Evolution" blog:

Web analytics is no longer a point solution – it's part of something bigger. For vendors, this means that you should plan on diversifying or instilling your data collection solutions into as many marketing applications as possible. Agencies and consultants should maintain an agnostic approach to Web analytics tools and focus less on which solution and more on applying the right metrics, reporting quality (actionable) information and uniting data from disparate marketing functions (like advertising and site-side information). Organizations should be asking themselves how their Web analytics solution is supporting their entire marketing efforts. Not just in the data that the tools are producing, but in their ability to generate insight and automate marketing processes. Practitioners, it’s your time to shine. As I mentioned, the job market is ripe and your skills and talents are more in need now than ever.

April 14, 2009

Smart Destinations: Creative Intermediation for Growth-Challenged Times

I had breakfast last week with Rob Schmults, an ecommerce veteran who after gigs at Fort Point Partners and GSI Commerce is now CMO at Boston-based Smart Destinations.  Rob's firm sells you and me a fixed-price, all-you-can-eat, multi-day "Go-Card" that provides discounted admissions to a variety of attractions in 15 cities across the US.  In an otherwise tough economy, Smart Destinations is growing at a healthy rate.  As such, it provides an interesting model for others trying to find ways to do the same.

For us customers, Smart Destinations offers convenience, savings, and less hesitation to do "one more thing" if we're taking the family to, say, Miami. For the attractions in different cities that enroll in the program, discounting off the retail rate, and being listed in Smart Destinations' guide book, drive additional visits that they are betting they otherwise wouldn't have seen from you and me. One cool feature of the service -- when an attraction signs up, it gets a Smart Destinations card reader, similar to a credit card terminal, that it sets up at its member/information desk by the entry. That way, visitors don't wait in the conventional admissions lines, the attraction gets another POS terminal, and Smart Destinations gets paper-free instant accounting for admissions fees (bought at wholesale) it owes the attraction.

Together, increased savings and convenience link customer groups and attractions that might not otherwise have hooked up, creating an economic pool that Smart Destinations can then try to maximize and harvest through creative packaging, pricing, and marketing.  And there's still plenty of room for innovation!  For example:
  • Today, Smart Destinations sells principally to folks in city X visiting City Y.  But imagine if it were to reconceive itself as a "platform" for "third-party affiliates", and allow local teachers and grad students to put together "profjonestourofcoolthingsforkidstodoinharvardsquare dot smartdestinations.com", tapping new market segments of suburban families desperate for high-quality activities in their own cities on grim winter Saturdays.
  • Today, attractions are enrolled ahead of time and are relatively "slow-in", "slow-out" of the program.  A logical extension might be to offer a "Site59.com"-style "dynamic-packaging" approach in addition to the base service.  Site59.com was an ingenious service created in 1999 that purchased blocks of unsold airline seats, unbooked hotel rooms, and unreserved restaurant tables and concert tickets on the cheap, and combined them into conveniently pre-planned packages for busy professionals desperate for a weekend away with significant others, but unable to plan more than a day or two ahead.  The resulting attractive arbitrage -- buy what someone's desperate to sell, and sell what someone's desperate to buy -- was a big winner in the dark days of 2001-2002.  (My erstwhile colleagues at ArsDigita were proud to have worked with the Site59 team to build the service.)
  • Today, Smart Destinations markets through a variety of conventional channels -- search, affiliate programs.  But travel journalism suggests a number of potentially synergistic relationships.  Think, for example, of a Smart Destinations partnership / sponsorship of iconic regular editorial features like the NYT's "36 Hours".  A few purists might sniff at the erosion of "Church-State" boundaries, but pragmatists no doubt would cheer!
So, the question is, what other sectors have the potential to spawn business models like this?

If you're a kid looking for work this summer, maybe start a Zipcar-ish lawn equipment exchange in your town?  Sign up folks with stuff as suppliers when they're not otherwise using their things.  Sell a "lawngear" card to others without stuff.   Use some of the proceeds to return sharpened, well-oiled gear to its owners, and pay for damages.  Pocket the difference. 

Or, if Smart Destinations doesn't provide enough inspiration, check out The Coupon Diva. (Listen especially to the point she makes at 4:15 into the video.)

Optimizing SEM vs. Affiliate Channel Investment: Amazon Giveth, Amazon Taketh Away

Last week Amazon announced that it would stop paying affiliates that drive traffic via paid search.

This is worth paying attention to and understanding better.  Amazon is the biggest online retailer.  Affiliates (through its Associates Program, not to be confused with its third-party sellers) are a big part of its marketing program.  And, driving scale / lower costs by encouraging and supporting (paradoxically) the growth of seeming competitors is a fundamental part of Amazon's "Wheel of Growth" strategy (see Scott Wingo's excellent analysis on Seeking Alpha, including the Jeff Bezos napkin diagram that beautifully captures the idea.)

I didn't understand Amazon's decision, so I asked Rob Schmults, CMO at Smart Destinations, for his take.  His answer was an enlightening window into the big leagues of multi-channel optimization in ecommerce, and with his permission I've shared our exchange (emphasis mine).


Why would Amazon do this if they pay affiliates on a CPA basis?  If an affiliate uses a paid search unit to drive traffic to Amazon vs. putting the marginal dollar into a better site, it shouldn't matter to Amazon, because the affiliate bears all the risk on that traffic-driving investment.  Could it be that there's an overall systemic side effect, which is that if everyone spends on SEM, the overall affiliate ecosystem's investment in value-added content helping to sell Amazon stuff is depressed?


I think Amazon is doing it because they want to clear out the pile-on around terms they are buying directly themselves. Most companies forbid affiliates from bidding on their "branded terms" as you know (i.e., the name of the company and its branded terms -- to be clear: NOT the names of other brands it may happen to sell). Those terms tend to have an effective acquisition cost well below affiliate fees, so any traffic from those terms going through an affiliate costs more than if it had come through the company's own paid link.

But the affiliates also drive up the cost of that paid link by bidding on the term. Assuming Amazon has purely economic motives for its decision, then it must have decided it has enough keywords that it can buy profitably where the acquisition cost is already below the affiliate cost OR will be once the affiliates are no longer bidding on the terms.

One other economic consideration is double paying. Most companies pay affiliates on ANY sales to someone who came through that affiliate for a period of time (30, 60, even 90 days). And most companies don't carve out sales to that customer where they may have come through another program in the meantime (PPC, e-mail, online ad). In those cases where the affiliate visit came first, but another program was the last click before purchase, it might be argued that you are double paying for that customer.

Here's an example:

I visit your site and come through an affiliate on March 15. I don't buy but I get cookied and you have a 60-day program. Sometime before mid-May I click one of your PPC terms and then buy from you. Not only did you have to pay the PPC fee as always, but now you owe the affiliate 8% or whatever.

Because of the economics of PPC, nobody complains when PPC gets clicked first for no sale, and then the affiliate is the second trip that converts. But then the premise of PPC is pay-per-click, not pay-for-performance!

Anyway, cutting out the PPC-riding affiliates would probably eliminate the biggest source of double payments. The bigger affiliates tend to loan you their customers rather than pick them off opportunistically.


Hey, this is really great -- thanks for taking the time to explain it! I guess at this scale of keyword purchases, etc., the volumes really move the market, so optimization matters lots. One interesting question is whether the time window issue might presage the emergence of a two-tier/staged payment model where affiliates get paid for the click if there's no buy, and later for the buy if it happens in the time window?


Interesting thought. I think there could be value in it. One challenge is that it’s a pain to track double payments in an actionable way. You can do the analysis to identify, but not sure how many companies can do it in a systematic way that would be necessary to either stop double paying or go to a variable rate. Maybe not as hard as it seems – just needs someone to put a concerted effort to it. There are real dollars there.

Rob's emphasis on actionability here is of course the crucial part, especially in these resource-constrained times. One alternative to the two-tier payment structure I suggested would be to shorten the duration of the time window during which you would credit an affiliate for a sale. Pushing this one step further, you might vary that by product category, to match purchase decision cycles for different products. On more complex sales with longer cycles (cars), it might be appropriate to retain a longer window. For less complex decisions (books), it might make sense to go much shorter.
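To make the mechanics concrete, here's a minimal Python sketch of the window logic discussed above. The function, the category-specific windows, and the example figures are all hypothetical illustrations, not anyone's actual program; real affiliate systems track full clickstreams, not single touches.

```python
from datetime import datetime, timedelta

# Hypothetical category-specific credit windows, matching the idea of
# tuning the window to each product's purchase decision cycle.
CREDIT_WINDOWS = {"books": timedelta(days=14), "cars": timedelta(days=90)}

def affiliate_gets_credit(affiliate_click, sale_time, category, last_click_channel):
    """Return (credited, double_paid): does the affiliate earn a fee on this
    sale, and are we also paying another channel for the same customer?"""
    window = CREDIT_WINDOWS.get(category, timedelta(days=60))  # default 60-day program
    credited = affiliate_click <= sale_time <= affiliate_click + window
    # Double payment: affiliate touch came first, but a paid channel
    # (e.g. PPC) was the last click before purchase.
    double_paid = credited and last_click_channel != "affiliate"
    return credited, double_paid

# Rob's example: affiliate visit March 15, PPC click converts in mid-May,
# 60-day default window -> the affiliate fee and the PPC fee are both owed.
print(affiliate_gets_credit(
    datetime(2009, 3, 15), datetime(2009, 5, 10), "electronics", "ppc"))
# With a short "books" window, the same sale would fall outside it and
# only the PPC fee would be paid.
```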

Regardless, thank you Rob for your clear and thorough explanation!

March 25, 2009

MITX Measurement 2.0 Panel Recap

Yesterday morning I went to a MITX panel discussion titled "Measurement 2.0: How to Tell the Full Digital Story". With 110 folks, it was SRO at Google's pad in Kendall Square. Charlie Ballard from One to One Interactive (sponsor of other cool MITX panels) moderated, and the other panelists included Paul Botto, head of GA Enterprise Sales at Google, Morris Martin from Microsoft's Atlas Institute (that's him in the banner picture), Visible Measures' VP of Marketing and Analytics Matt Cutler, Mike Schneider from Allen & Gerritsen, and my friend and colleague Ms. Perry Hewitt, CMO at the Cambridge-based social media measurement firm Crimson Hexagon.

Notwithstanding that it's so very 2004 to call anything "2.0" these days, Mike was correct to point out that before we can expect dollars to move toward "Web 3.0", we've got to get Measurement 2.0 right first. Charlie usefully framed the distinction: if "1.0" is about optimizing within channel silos, "2.0" is about optimizing across them. Whether you like the moniker or not, I agree (not uniquely) with his premise.

Paul pushed the point further, arguing that to really understand a customer's experience, we need to move beyond a page-based measurement model to an event-based one. This is especially necessary in a rich media world (think YouTube) where an experience spanning interaction across multiple rich media objects can happen within the context of a single page. (Whether or not you agree, it's thought-provoking that while some pressures push us to think more macro (multi-channel), other technological developments push us to go more micro (intra-page). Wonder if the same design concepts (pathways, handoffs) apply "fractally"?)

However, Mike took the view that we should be careful about introducing new more exotic frameworks into a world where standards are such that we still can't agree on what defines a visit.  Matt pointed out that event tracking generates 10-100x the data, further complicating matters.  I'm in between: if you got a whole lotta Flash, you have no choice but to implement event-based measurement. Nonetheless, if we can't agree on standards, you give up benchmarking, because your own site (and perhaps others your agency has implemented) will be your only apples-to-apples point of reference.  (Paul indicated that event-based measurement is an invitation-only feature of GA.  I asked for one, and will report what I learn when I get to try it.)

Charlie kicked off the questions for the panel by referring to the ur-text of multi-channel measurement, Suresh Vittal's Forrester Research report "Defining The Online Marketing Suite". Specifically, he asked if the centralized, "command and control" notion of tracking folks through a purchase pipeline across multiple channels still makes sense.

Matt's take was that the explosive rise of social media has pushed the centralized model toward obsolescence (so soon!). He argued that with the "conversation" happening in places that don't (yet) let you slip measurement tags into their "vessels", marketing needs to be more about tracking what's happening out there using tools (like Perry's firm's) that Suresh didn't then cover but since has.  "Today, the center of gravity has moved, and marketing is much more like portfolio management", said Matt.   He then pointed to a silver lining opportunity: getting value from what he called "big data".  He described how in some presentations, he's successfully used tagcrowd.com to crunch a big bolus of comments on a video to infer / visually convey their collective meaning.

One question is, if we take his comments literally, are we back to local optimization of social stovepipes?  And, "big data" is valuable if you've got big comments.  What if no one comes to your party? In Long Tail space, no one can hear you scream. (Aside: this puts a premium on understanding viral propagation of your social media efforts as part of your portfolio management.)

Morris argued that the central model's value is just beginning to be realized, as it enables us to better understand the value of "upstream" investments and slowly ease away from over-emphasizing the value of being (if you're a publisher) / spending on (if you're an advertiser) the "last click".  Setting aside that Atlas is a display ad network with a natural interest in making this point, others have confirmed that display campaigns lift searches 15-20%.  Knowing this value, I think the opportunity here is to do the math to determine the "effective CPA" of an extra dollar to search vs. an extra dollar to display.
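Here's what that math might look like. All of the spend and conversion figures below are illustrative assumptions for the sketch; only the ~15% search-lift figure comes from the discussion above.

```python
# Back-of-envelope "effective CPA" for display, crediting display with the
# incremental search conversions it stimulates. Every number here is an
# assumption for illustration.

search_conversions = 2_000.0   # conversions from paid search
display_spend = 100_000.0      # incremental dollars into display
display_conversions = 500.0    # conversions attributed directly to display
search_lift = 0.15             # display lifts searches ~15% (low end of the cited range)

# Naive last-click view: display looks expensive.
naive_display_cpa = display_spend / display_conversions

# Credit display with the extra search conversions it drives upstream.
lifted = search_conversions * search_lift
effective_display_cpa = display_spend / (display_conversions + lifted)

print(round(naive_display_cpa), round(effective_display_cpa))  # 200 125
```

Under these assumptions, display's effective CPA drops from $200 to $125 once its upstream effect on search is counted -- which is exactly why the last-click view can misallocate the marginal dollar.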

Charlie next asked, "How do we move from measurement to optimization?"  

Morris asserted that you've got to be able to track everything first, and that you shouldn't try to retrain media planners to work with a different process -- it's just too hard. He pointed us to Atlas' Engagement Mapping tool (launched a year ago; here's a review) as one option for optimizing within existing constructs.

Perry noted that one client has told her that her thinking about optimization has shifted, from "measure twice, cut once" to "measure twice, cut fast" -- the point being that media usage patterns are shifting quickly enough that a rough optimum appropriate to today is better than a perfect optimum appropriate to patterns we saw six months ago. Perry continued, "agility is the core competence in optimization efforts today."

Picking up Perry's thread, Matt urged the audience to think carefully about what data to collect. He distinguished between "just-in-time" versus "just-in-case" data collection efforts. "A bigger regression won't help," he noted, observing that "Even if it's more accurate, if people can't understand it they're unlikely to be able to act on it." He suggested focusing on a narrow set of metrics and trying to move the needle 10% first, then adding more complexity to your models. And, as a way to avoid analysis paralysis, Mike advised starting with a likely story in mind to prove or disprove, rather than boiling the ocean (testing/regressing everything against everything) to find "emergent stories". Truly men after my own heart.

A logical extension of the points above, particularly Perry's, is to shift the relative importance of A/B testing and passive measurement, versus back-testing, for media mix modeling efforts.  Charlie moved to this question next, asking, "How far can it go?"

Paul pointed to great results they've had (using Google Website Optimizer, natch) optimizing the Picasa download page. Testing 200 different versions, they settled on one that "none of us would have ever thought of" that drove downloads 30% higher. Surprisingly, the words "free download" don't help. And, for those who fear that testing curbs creative freedom, reducing us to no better than Shakespeare's monkeys, Paul pointed out that ironically, the opposite has been true -- creative teams feel they don't have to "play it safe" and can explore more possibilities, knowing that testing will ultimately discipline the process. (Of course, this is true when experiments are as small as having or not having "free" on your page, but gets harder as the creative execution gets more expensive.)
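Google Website Optimizer handles the statistics internally; for intuition, here's a standard two-proportion z-test sketch (with made-up traffic numbers, not Picasa's) showing the kind of check that decides whether a lift like that 30% is real or noise:

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """One-sided two-proportion z-test: is variant B's conversion rate
    higher than variant A's? Returns (z statistic, one-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # pooled standard error
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))     # normal tail probability
    return z, p_value

# Hypothetical split: control converts 10% (500/5000),
# challenger 13% (650/5000) -- a 30% relative lift.
z, p = z_test(500, 5000, 650, 5000)
print(z > 1.645 and p < 0.05)  # True: the lift clears a 95% confidence bar
```

With only a handful of conversions per cell, the same 30% lift would not clear the bar, which is why a 200-version test needs serious traffic behind it.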

Charlie's next question: "What about brand-focused advertising measurement?" Matt talked about how the emergence of online video and social media has brought the left and right brains together: in these media, it's now simultaneously possible to craft a story that traditional brand marketers love and to measure its impact at least better than before, if not yet well enough. In particular, he told the story of a credit card company that syndicated a video widget and saw a big jump in applications from folks who viewed it. Perry told the story of how semantic analysis of an online crafting community's conversations (about vinyl home decor -- go figure) is being recycled to shape creative execution of television spots for one of her firm's clients (ahem, Perry, interesting crowd you're hanging with). Matt further pointed to opportunities for "viral packaging", like paying Blendtec $10k to ask "Will It Blend?" of your product after their clever YouTube experiment with the iPhone drove millions of views and hundreds of thousands of subscribers to Blendtec's channel. Paul suggested folks try Google Insights for Search as a way of getting a better view of what's happening upstream.

The audience Q&A surfaced some additional points and resources:

  • I asked whether the assembled players had explored allowing members of social media services like FB and LinkedIn to extend their member profiles to include "analytics tracking tags" fields, so members could track visits and interaction by others in content the members publish or syndicate there. It seemed to me a win-win all around for advertisers, members, and social media platforms. Answer: good idea in principle, but social media platforms still guard that data jealously, and there are privacy concerns that folks like Google and Microsoft in particular are sensitive to. Paul did note, though, that YouTube provides some of this data to branded channel customers today. My view is that if I can track you, dear reader, in GA using the tag embedded on this page through the Typepad template that wraps this content, it won't be long before Facebook makes the same thing happen, since advertisers want/will pay for that (indirectly via CPMs), and who knows, they might be able to get a buck or a few each month from publishers to whom that information is really valuable.
  • Another person asked about the validity of the "view-through" as a metric -- that is, what credit do you give to display ads that aren't clicked on? Here's an article that describes the issue further (I love the author's concluding sentence: "Something between 0 and 100 percent credit is appropriate, depending on the advertiser's unique environmental, programmatic, and analytic profile. Each advertiser has to find its own answer.") Morris referred folks to the Engagement Mapping research cited above, noting that "You can't grow search from the bottom of the funnel."
  • A third question was about the degree to which marketers should try to identify "emergent" funnels from the data versus operate/ test "pre-defined" purchase funnels.  The panelists were pretty much aligned in their responses about the practicality of focusing on the latter.  Matt said, "we're reinforcing for advertisers the importance of stories -- as humans we're tuned to listen to stories deep in our DNA, and it's much harder to infer them from oceans of data and analyses."  (From my end, I see an opportunity here -- services that collect stories as hypotheses, so that you can test the fit between stories and stats, mad-libs style.)  Charlie told a story about how they had tracked anonymous user 110135 through this cookie ID, and used this journey in a presentation to a cable company CEO, to huge effect.  Mike put it beautifully: "No story, no value."

December 10, 2008

MITX "Digital In The Downturn" Panel: 2009 is the Year of the Counting Dog

Yesterday morning I went to an excellent MITX panel discussion, "Utilizing Digital In A Down Economy". Here are some notes:

  • The panelists were grim about the outlook.  Jere Doyle predicted January will bring 1M (!) new job losses economy-wide. Guesses at changes in digital spending ranged from slightly negative to up 10% -- way down from the 20-30% digital growth rates we've become accustomed to. 
  • (One dynamic to keep an eye on: digital has shorter lead times (say, vs. catalog production or TV ads) and is therefore the first to get cut; the flip side is that it's easier to ramp up again.)
  • While the 2001-2002 bubble burst was worse, the landing for the industry was softer because the Fed pumped up the economy; this time our industry is healthier and the capabilities we offer customers are more mature, but there will be no soft landing.
  • Old Media's really in trouble now.
  • Measurable Media (e.g., paid search, email) closely related to conversion are faring the best.
  • Digital advertising for branding -- e.g., display campaigns with 0.1% CTRs -- is really sucking wind.
  • Despite all the evangelism of the past 24 months, social media per se won't attract spending (except perhaps as a content-generation vehicle), though specific, close-to-conversion solutions that leverage it (social graph-based extensions of things like Bazaarvoice) are still getting serious looks (I can verify this independently).
  • Emily Green asked, "Are people more willing to trade information for value?" General answers: yes.  More specifically, Ralph Folz talked about the "Value Exchange" concept, using Adidas' MiCoach as an example of how it doesn't necessarily have to be about price (see also this related post).
  • Emily also asked, "Are we likely to see a new generation depression thinking -- a generation of recessionistas and frugalistas?" (Great terms, first I've heard of them.) Ralph: "Huge push in our agency to measure consumer engagement with brands, be more consumer-centric in our analytics, to figure out exactly how people are reacting" (So can we also coin analistas?) (See also this related post.)
  • Panelists' advice for coping: "Cut deep and fast now -- don't be incremental"; "Sometimes the fish just aren't biting, so keep your powder dry"; and, the audience favorite: "Pursue the Cockroach Strategy -- there's always food along the sides!"   
  • I asked, "Top 5 adjustments you'd recommend clients make to their digital strategies in the downturn?"  Answers: Ralph offered,  "My 1-3 are measurement, measurement, and measurement."  Others: "Think explicitly about how to create or highlight value for consumer in all digital interactions," and "Don't ignore global opportunities"
  • Kiki Mills asked, "What will 2009 bring?" Jere Doyle: "2009 will be the year of Survival"; Ralph: "Year of Data"; Emily: "Year of Cash"; Jim Savage: "Year of Accountability"; and poor Tom Anderson, who went last and rightly tried to offer something different: "Year of Mobile" (to which the snarky refrain was "Isn't next year always the year of Mobile?")
  • From the Great Closing Remarks department: Emily suggested reading  Martin Seligman's Learned Optimism, noting optimists' belief in 3P's: "It's not permanent, it's not pervasive, and it's not personal."

Rothenberg's Rant: Right Diagnosis, Wrong Prescription

Yesterday's Ad Age carried a 3-minute clip from IAB CEO Randall Rothenberg's speech on audience measurement.  

Rothenberg argued for looking past the glum economy to what he called a "crisis of complexity", with research at the root. Citing an October McKinsey survey of 350 marketers, which found 80% allocating media by guesswork and off last year's numbers, he suggested that research professionals are making audience measurement too complicated for marketers. Then he went on to suggest that the solution lies in "business process standards and measurement".

I wasn't there to hear the rest of Rothenberg's comments, so this observation may miss further context. But I see where he's coming from. He runs a trade association whose job it is to promote interactive/digital media as an advertising option. One way this happens is by standardizing measurement so the medium is easier to buy. Today, advertisers can measure traffic more or less well enough, and care more about measuring engagement (as a means of getting folks to click on the holy "buy" button, of course). Notwithstanding efforts to solve this (like Eric Peterson's -- see this white paper and this more recent post), this is hard because engagement can mean many things and user interactions (or lack thereof) can correlate imperfectly with each of these meanings. Hence complexity, frustration, and his call for standards.

While I agree with Rothenberg's diagnosis, I disagree with the priority of his prescription. Yes, standards are important, but they take a long time to agree on, and the processes that generate them can bog down if the issues at hand get too far ahead of their economic relevance. For my part, I think if we're to realize the full potential of interactive media and advertising for more effective and efficient marketing, there are two more immediate imperatives.

First, no reference to measurements and data about engagement should be made without first starting with the specific user behavior to be promoted and the possible options for doing so.  The data and analyses are only useful in that context, and it's only in that context that we can judge if we're being too simple, too complex, or "just right" in how we're trying to answer questions.  So let's break engagement down -- do we mean more reading by users, users giving us information in exchange for suggestions, registration for more personalization, content contributions, user recommendations (forward to a friend, for example, or otherwise)?  If we do this, marketers can "shop" more easily for the kind of media they believe drives the kind of engagement that they need to convert at a higher rate.

Second, marketers and researchers need to meet in the middle in terms of education.  Researchers need to better understand the specific engagement objectives and solutions their work addresses -- magnitudes, implications, investment requirements, feasibility, necessary analytic precision; marketers need to get smarter about the guts of how interactive media and advertising actually work, and the implications of those mechanics for the data they use and the actions they might take.  To this end, I'm in the middle of reading Avinash Kaushik's excellent book, Web Analytics: An Hour A Day.  I'll follow with a more detailed summary / review, but I strongly recommend the book for marketers despite its somewhat narrow title -- in addition to a great exposition of the nitty gritty of how things work, the book offers lots of practical advice about how to keep analytics manageable, in perspective, and focused on actions.

IAB has a role in driving both of these imperatives forward, and it already does. But we've reached the point in digital marketing where we need to move beyond whether (as argued here and here) to how. For IAB and others, rather than orienting publications and sessions around specific media or measurement per se, these might be organized around business issues -- "driving awareness, engagement, conversion" -- presenting data and analysis only in the context of how they supported business decisions related to these issues. Limitations and complexities in data and analysis should be balanced with references to whether they matter and to practical workarounds or cross-checks when they do. The ideal session or publication would present integrated stories of problems, options, analysis, and results, and through this help us keep complexity in appropriate perspective. And standards-setting efforts will proceed faster and generate better outcomes with a better-informed "electorate" of marketers.

September 30, 2008

After The Apocalypse: Recalibrating Analytics

Yesterday's debacles in Washington and on Wall Street will further shock consumer spending in both practical and psychological ways. Practically, credit will be tighter, bonuses lower, jobs fewer. Psychologically, the decline in the stock market will pile onto declines in home equity to further chill the wealth effect that's been driving us for some time.

What does this mean for marketers relying on analytics to make decisions? Analytics rest on two foundations: regressions on past activity and tests across present efforts. After recent events, many regression-based models that look backward to predict ahead will likely be obsolete, especially for highly seasonal businesses. So the emphasis should shift toward testing in general, and consequently toward test-friendly media (e-mail, search, display, lightweight direct mail) and easier-to-execute components (copy, price), to help firms feel their way forward to a better understanding of how consumers will react under the new realities.
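To make the distinction concrete, here's a minimal sketch of the kind of test-based analysis I mean: a two-proportion z-test on an email subject-line split. The counts are entirely hypothetical, and this is one simple way to read a test result, not the only one.

```python
from math import sqrt

# Hypothetical results from an email subject-line test:
# two randomly split groups, conversions counted per group.
a_sent, a_conv = 10000, 230   # control
b_sent, b_conv = 10000, 287   # variant

p_a = a_conv / a_sent
p_b = b_conv / b_sent
p_pool = (a_conv + b_conv) / (a_sent + b_sent)  # pooled rate under the null

# Two-proportion z-test: is the observed lift real or just noise?
se = sqrt(p_pool * (1 - p_pool) * (1 / a_sent + 1 / b_sent))
z = (p_b - p_a) / se
print(f"lift: {(p_b - p_a) / p_a:.1%}, z = {z:.2f}")
# → lift: 24.8%, z = 2.54  (|z| > 1.96 is roughly significant at the 95% level)
```

The appeal is that nothing here depends on a model fit to pre-shock history; the control group carries the "new realities" along with the variant.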

Unfortunately, in many cases, firms will be organizationally constrained from making this important adjustment. Skills and resources for regression-based analytics are different from those for test-based analytics. Big-bang model-building efforts of the past year or two may create a drag on the ability to free up and redirect the budgets needed to test more extensively and aggressively. But marketers that took a pragmatic, efficient, and balanced approach to analytics will find the transition an easier one to make.

September 14, 2008

Pragmalytic Perspective Roundup

Been browsing a few of my MediaPost newsletter subscriptions. Here are a few good articles that reinforce our POV about the shift from the digital marketing "Age of Evangelism" to the "Age of Analytics":

Skepticism about the advertising value of social media by Pat LaPointe in the Online Metrics Insider newsletter.

Good practical advice about testing from Aaron Smith in the Email Insider Newsletter on email testing, consistent with much of the advice we also offered here.

From Steve Smith's article in the Mobile Insider, some useful metrics on mobile browsing and mobile search in particular.

August 19, 2008

The Age of Analytics

Here's an interesting chart from Google Trends.  It prompts two questions:

  • What happened in November 2005?
  • What's sustained the growth in interest since then?

My guess is that the huge spike in interest in Q4 2005 was related to Microsoft's PR push for its release of Yukon (SQL Server 2005 with Analysis Services).

Tom Davenport's HBR article "Competing on Analytics" appeared in January 2006, followed by the book of the same name in March 2007.  Ian Ayres' eminently readable Super Crunchers appeared in August 2007.  The spike in press interest in the topic this summer appears to have coincided with yet another SQL Server release, highlighting the influence of the Microsoft marketing machine once again.

More broadly, analytics is being brought to the fore by the confluence of a bunch of different things:

  • a critical mass of complementary data sets, in electronic formats, as more behavior occurs through electronic channels (prediction: following the lead of others, Google will soon add "data sets" as a specialized search category, as it has with so many others already)
  • bandwidth, storage, and processing power, packaged as cloud computing utilities
  • the software to go with them, not just from MSFT, but also from folks like Sun
  • the maturation of standards for integration of different data sets, making the whole mashup trend possible

Today, however, our reach still largely exceeds our grasp. The bottleneck to future growth looks to be fluency, both in the computer languages and tools required to assemble and manipulate data, and in the statistics needed to interpret them. It's particularly interesting in that light to note the geographic concentration of searches on the term in India, as a proxy for where future leadership on the topic might come from.

Pragmalytics, Part II

When we launched our new firm earlier this year, the short description of the opportunity we saw was "marketing analytics services".  Everyone we talked to about it was enthusiastic about the gap we proposed to fill, for the reasons we've previously described.  And since then, we've made good progress, with the good fortune to work for some great clients and to produce clear and significant results for them.  At the same time, the market has taught us some interesting things about the need that we're focused on serving.

Continue reading "Pragmalytics, Part II" »