I'm a partner in the advanced analytics group at Bain & Company, the global management consulting firm. My primary focus is on marketing analytics (bio). I've been writing here (views my own) about marketing, technology, e-business, and analytics since 2003 (blog name explained).

16 posts categorized "Current Affairs"

January 17, 2010

What's NYT.com Worth To You?

Via Chris Schroeder's  (@cmsschroed) RT of Henry Blodget (@hblodget), the news of the NYT's decision to start charging (again) for content.

Blodget's prior analysis suggested this might be worth ~$100 million per year  (my deduction based on his math) to NYT Co.  If a tenth of its 130M monthly unique visitors end up being "heavy users" that pay, 4 bucks a month gets them ~$600 million annually (13m * $4 * 12 months = $624 million).  (Seems high; better data anyone?)
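
A quick back-of-the-envelope sketch of that math (the figures are the rough assumptions from the paragraph above, not NYT Co.'s actual numbers):

```python
# Back-of-the-envelope check on the paywall math above. Assumptions
# come from the paragraph: 130M monthly uniques, 10% of whom become
# paying "heavy users" at $4/month.
monthly_uniques = 130_000_000
paying_share = 0.10
price_per_month = 4

payers = monthly_uniques * paying_share           # 13M subscribers
annual_revenue = payers * price_per_month * 12    # dollars per year

print(f"{payers:,.0f} payers -> ${annual_revenue / 1e6:,.0f}M/year")
# -> 13,000,000 payers -> $624M/year
```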

What's it worth to you?  See what some folks had to say in the chart below.  Please take this survey to add your perspective, and let your friends know about it:

Note: I removed one response of "1 million articles for free, willing to pay $0 thereafter" because it messed up the display, but am mentioning it here for full disclosure. And to the respondent, thank you for participating!

Postscript: conclusions and analysis

September 17, 2009

#Adobe + #Omniture: Further Thoughts ( #Analytics )

I've been following the Web Analytics Forum on Yahoo! -- David Simmons' ideas here are especially thoughtful -- and I listened to the Q3 Adobe con call.  Plus last night at Web Analytics Wednesday in Cambridge I had a chance to talk about it with VisualIQ CTO Anto Chittilappilly and Visible Measures' Tara Chang.  Here are some further ideas based on what I've read and heard so far:

  • Rich media -- video, interactive ads, etc. -- are a growing piece of the Internet experience. (Google's rumored acquisition of Brightcove reflects this.)  (The flip side: Semphonic CEO Gary Angel writes that "It's taking a long time, but HTML is dying.")
  • Since they're growing, tracking user interactions with them effectively is increasingly important, not just on the Internet but across platforms, and as the CIMM challenge to Nielsen suggests, not yet well addressed.
  • User tracking in rich media platforms like Flash is more granular and more persistent than cookie-based tracking (David Simmons explains how and recommends Eric Peterson's Web Site Measurement Hacks for more).
  • But, support for event-based tracking of users' interactions in rich internet media in existing web analytics platforms is in its infancy (though vendors say otherwise).
  • So, publishers want tighter, simpler integration of event-based tracking.
  • In the con call, Adobe CEO Shantanu Narayen mentions the framework "Create - Deploy - Optimize" as a way to understand their overall product roadmap vision.
  • Adobe has the "Create" (e.g., Photoshop, this part of their business is ~60% of their revenues) and "Deploy" (e.g., Flash Server / Flash Player) pieces covered.  But "Optimize" was still uncovered, up until this announcement.
That's the product strategy logic.  The financial logic:
  • Adobe is too dependent on the "Create" leg of their stool, and hasn't been able to monetize the "Deploy" piece as much as they might have hoped -- removing  license fees from the Open Screen Project is one recent example of this limitation.  So they're betting that "Optimize" has legs, and that buying Omniture in this economic climate at ~5x revenues is good timing.
  • Adobe's traditional software-licensing business model has gotten crushed year to year.  Omniture's revenues are >90% recurring subscriptions based on a SaaS model.  Adobe revenues (~$3B) are 10x Omniture's, but the enhanced value proposition of the A-O integration and cross-selling through Adobe's sales force will accelerate O's growth.  Over the next 2-3 years, this will help to reduce the volatility of A's revenues / revenue growth.

What's ahead?  One direction, as I've previously discussed, is that the workflows associated with "Create-Deploy-Optimize" are increasingly complex, and that platforms that support these workflows in a simple, more integrated way will become important to have.  Managing these processes through hairballs built with Excel spreadsheets scattered across file servers just won't cut it.

Postscript: Eric Peterson's take.  And more from him.  The gist -- not so high on Omniture's relative value to customers and the experience of working with them, and not sure why Adobe did this.  Anil Batra has some interesting ideas for product directions that emerge from the combination.

September 15, 2009

Adobe + Omniture: Pragmalytically Perfect Sense

Big news (via, in my case, Eric Peterson's "Web Analytics Forum" on Yahoo!): Adobe's buying Omniture.

Simple logic:

  • Adobe makes great tools for developing custom dashboards and other data visualization apps.  I know because in one engagement this summer, we've worked with a terrific client and another (also terrific) engineering firm to build a Flex-based prototype of an advanced predictive analytics application.  But prototyping is easy, tying a front end to a working, real-world analytics data model is much harder.
  • Omniture leads the pack of web analytics platform vendors, who all have more features and capabilities in their left pinkies than many of us could dream of in six lifetimes.  But exposing mere mortals to the interfaces these leading firms provide is like showing kryptonite to  Joe / Jane Superexecutive.  As analytics get more complex, it's even more important to focus on key questions and expose only the data / views on that data that illuminate those key questions.
  • So if you believe that this web / digital / multichannel analytics thing has legs, then putting these two firms together and working both ends to the middle faster than might otherwise have happened is a smart thing to do.
  • The other reason to do this is to anticipate the trend in "custom reporting" and "advanced segmentation" capabilities in the "lower-end" analytics offerings (e.g., GA) from folks like Google.  I've been using these capabilities recently, and they get you a meaningful part of the way, eroding the value of higher-end offerings on both the front (Adobe) and middle-back (Omniture) ends.

July 21, 2009

Facebook at 250 (Million): What's Next? And What Will Your Share Be?

Facebook announced last week that it had passed 250 million members.  Since no social network grows to the sky (as MySpace experienced before it), it's useful to reflect on the enablers and constraints to that growth, and on the challenges and opportunities those constraints present to other major media franchises (old and new) that are groping for a way ahead.

"Structured Collaboration" principles say social media empires survive and thrive based on how well they support value, affinity, and simplicity.  That is,

  • how useful (rationally and emotionally) are the exchanges of information they support?
  • how well do they support group structures that maximize trust and lower information vetting costs for members? 
  • how easy do they make it for users to contribute and consume information? 
(There are of course additional, necessary "means to these ends" factors, like "liquidity" -- the seed content and membership necessary to prime the pump -- and "extensibility" -- the degree to which members can adapt the service to their needs -- but that's for another post.)

My own experience with Facebook as a user, as well as my professional experience with it in client marketing efforts, has been:
  • Facebook focuses on broad, mostly generic emotional exchanges -- pictures, birthday reminders, pokes.  I get the first two, and I admire the economy of meaning in the third.  The service leaves it to you to figure out what else to share or swap.  As a result, it is (for me anyway) <linkbait> only sometimes  relevant as an element in a B2C campaign, and rarely relevant in a B2B campaign </linkbait>
  • Facebook roared past MySpace because it got affinity right -- initially.  That is, Facebook's structure was originally constrained -- you had to have an email address from the school whose Facebook group you sought to join.  Essentially, there had to be some pre-existing basis for affinity, and Facebook just helped (re-)build this connective tissue.  Then, Facebook allowed anyone to join, and made identifying the nature of relationships established or reinforced there optional.  Since most of us including me are some combination of busy and lazy, we haven't used this feature consistently to describe the origins and nature  of these relationships.  And, it's cumbersome and awkward to have to go back and re-categorize "friends". (An expedient hack on this might be to allow you to organize your friends into groups, and then ask you which groups you want to publish items to, as you go.)
  • Facebook is a mixed bag as a UI.  On one hand, by allowing folks to syndicate blogs and tweets into Facebook, they've made our life easier.  On the other, the popular unstructured communications vehicles -- like the "Wall" -- have created real problems for some marketers.  Structured forms of interaction that would have created less risky choices for marketers, like polls, have come later than they should have and are still problematic (for example, you can't add polls to groups yet, which would be killer).  And, interacting with Facebook through my email client -- on my PC and on my smartphone -- is still painful.  To their credit, Facebook opened up a great API to enable others to build specialized forms of structured interaction on its social graph. But in doing so it's ceded an opportunity to own the data associated with potentially promising ones.  (Like prediction markets; Inkling Markets, for example, lets you syndicate notices of your trades to Facebook, but the cupboard's pretty bare still for pm apps running against Facebook directly.)
The big picture: Facebook's optimizing size of the pie versus share of the pie.  It can't be all things to all people, so it's let others extend it and share in the revenue and create streams of their own.  Estimates of the revenues to be earned this year by the ecosystem of third party app developers running on Facebook and MySpace run to $300-500 million, growing at 35% annually.  
Them's not "digital dimes", especially in the context of steep declines in September ad page trends in, say, revenues of leading magazine franchises, as well as stalled television network upfronts. But, folks might argue, "Do I want to live in thrall to the fickle Facebook API, and rent their social graph at a premium?"  The answer isn't binary -- how much of an app's functionality lives in Facebook, versus living on a publisher's own server, is a choice.  Plus, there are ways to keep Facebook honest, like getting behind projects like OpenSocial, as other social networks have done. (OpenSocial is trying to become to Facebook's social graph as Linux is to Windows.  Engineer friends, I know -- only sort of.)  And, for Old Media types who don't feel they are up to the engineering tasks necessary, there are modern-day Levi Strausses out there selling jeans to the miners -- like Ning, which just today raised more money at a high valuation.  Still too risky? Old Media could farm out app development to their own third party developer networks, improving viral prospects by branding and promoting (to their subscriber lists) the ones they like in exchange for a cut of any revenues.  In this scenario, content gets added as an ingredient, not the whole main course.

What is true in the new environment is that reach-based ad network plays surfing on aggregated content won't pay any more.  Rather, we have to think about services that would generate more revenue from narrower audiences.  The third-party games created by Facebook app developers referenced above demonstrate how those revenues might stem from value through entertainment.  As we speak, Apple and its developers are earning non-trivial sums from apps.  Phonetag has its hands in folks' pockets (mine included) for $10/month for its superuseful -- albeit non-social -- transcription service.  Filtering for relevant content is a big challenge and opportunity.  Might someone aggregate audiences with similar interests and offer a retail version sourced at wholesale from filtering service firms like Crimson Hexagon?  Looks like Porter Novelli may already be thinking down these lines...

Let's push the math: a winner service by anyone's measure would earn, say, $50M a year. Four bucks a month from each person is roughly $50/year.  You'd then need a million folks, 1/250th of Facebook's user base, to sign up.  Reasonability check -- consider the US circulation of some major magazine titles.

If your application service is especially useful, maybe you can get $2/month directly from each person.  Maybe you can make the rest up in ecommerce affiliate commissions (a 10% commission on $125 in annual purchases by each person gets you ~$1/month) and ad revenue (the $12 million/year nut would require one dollar per member per month; at a $10 CPM, that means getting each of your million users to account for roughly 100 impressions on your service a month -- a few a day, more or less -- to cover that nut.)
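
Here's that mixed revenue model as a sketch, using the assumed figures above (1M members, $2/month direct, a 10% affiliate commission on $125/year of purchases, and $1/member/month in ads at a $10 CPM):

```python
# A sketch of the mixed revenue model above. All figures are the
# assumptions stated in the text, not data from any real service.
members = 1_000_000

direct = members * 2 * 12                 # $24.0M/year in subscriptions
affiliate = members * 0.10 * 125          # $12.5M/year in commissions
ads = members * 1 * 12                    # $12.0M/year in advertising

total = direct + affiliate + ads
print(f"${total / 1e6:.1f}M/year")        # -> $48.5M/year, close to the $50M target

# At a $10 CPM ($0.01 per impression), $1/member/month implies
# 100 impressions per member per month -- a few a day.
impressions_per_member_month = 1 / (10 / 1000)
```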

We also have to be prepared to live in a world where the premiums such services earn are as evanescent as mayflies, especially if we build them on open social graphs.  But that's ok -- just as Old Media winners built empires on excellent, timely editorial taste in content, New Media winners will build their franchises on "editorial noses" for function-du-jour, and function-based insights relevant to their advertisers.  And last time I checked, function and elegance were not mutually exclusive.

So, even as we salute the Facebook juggernaut as it steams past Media Beach, it's time to light some design workshop campfires, and think application services that have "Value, Affinity, Simplicity."

September 14, 2008

Electoralmap.net: Pragmalytics and the Presidential Election

Lately we've been asked a lot about what metrics to pay attention to in digital marketing channels. A central piece of this is finding, at any given point in time, those few places in your business where stakes, uncertainty, and degrees of freedom for action are highest, and then focusing your reporting and analytics improvement efforts on those places, while tuning out all the other places that call for attention but don't have the same leverage.

A good public-sector example of excellent reporting on the right issue, relevant to all of us right now, is electoralmap.net. This service uses state-by-state contract prices from Intrade, the world's largest public prediction market, to predict the outcome of the Electoral College vote.

Reading Dailykos and Michelle Malkin has me convinced that regardless of how any of the candidates perform over the last month and a half of the campaign, 99.9% of the voters have already made up their minds. And right now, according to electoralmap.net, the election appears to be a dead heat, with only Colorado's 9 electoral votes hanging in the balance. Looks like the DNC was fairly prescient in choosing Denver for its convention!

Should we believe it? This analysis suggests we can. Further, in prediction markets, market volume is a proxy for sample size. A closer look at the trading in Colorado (in the left-hand nav, go to "politics", then "US Election by State", then expand "Alabama-Florida" and look at the Colorado contracts, where it currently looks like Obama trades at $5.30 for a $10 payoff, and McCain trades at $4.70) indicates a total of about 2600 contracts in the market, for a total contract value traded of $26,000. That's not too much, but with a 3-point bid-asked price spread on the Obama and McCain Colorado contracts, it's enough I'd think to begin to attract trading by folks with inside local knowledge away from the main contracts ("2008 US Election" in the left hand nav), which collectively have $12 million in contract value traded, but where the bid-asked spreads are only 10-20% what they are in the Colorado contracts.
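
For the curious, the contract arithmetic above works out like this (prices are the snapshot quoted in the paragraph; Intrade contracts pay $10 if the event occurs, so price divided by payoff is the market's implied probability):

```python
# The Colorado contract arithmetic from the paragraph above. Prices
# are the snapshot quoted in the text, not live Intrade data.
payoff = 10.0
obama_price, mccain_price = 5.30, 4.70
contracts_traded = 2600

obama_prob = obama_price / payoff             # 0.53 implied probability
mccain_prob = mccain_price / payoff           # 0.47
value_traded = contracts_traded * payoff      # $26,000 total contract value

print(f"Obama {obama_prob:.0%}, McCain {mccain_prob:.0%}, ${value_traded:,.0f} traded")
```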

So, this is a fancy way of saying that, if you follow the money, "It's Colorado, stupid!" It will be interesting to see if the national election coverage in major online outlets begins to highlight places where things are really tight and selectively aggregate news and opinion from them.

And congratulations to the electoralmap.net guys for creating one of the most imaginative, useful, and usable mashups for following and filtering the election. But may I suggest a widget, or a FB app, to boost traffic? At only 6k uniques a day, you're really missing a big opportunity! Then maybe call Disney about a sponsorship to support its release of Swing Vote on DVD.

June 23, 2008

Qik+Twitter+Summize+(Spinvision): We Have Met Big Brother, And He Is Us

Imagine if you could sit above the world, at whatever altitude you wish,  and see anything through anyone and everyone's eyes, in real time, filtering these streams to let through only those things you're actually interested in. 

Today, we have real-time video streaming (now -- though not always practically -- in 3G) via folks with Nokia N95's and Qik.  Qik lets people know you are streaming via Twitter, and you can filter these "tweets" with Summize (which I wrote about yesterday).  You can also get your Qik streams onto YouTube automatically.  Spinvision, a brother to Twittervision and Flickrvision, lets you see videos as they are uploaded to YouTube -- superimposed on a map of the Earth.

Now let's roll ahead 12-18 months.  N95's won't be the only devices with high quality camera/video capture and GPS capabilities -- so, many more people will have this capability.  3G will be more widely available and adopted.  Twitter and Summize will be features of much larger players' services, so they too will move from the fringe to the mainstream as more people inevitably discover the utility of microblogging for different purposes, and the utility of filtering all that microblogging (and microvlogging).  Presumably, you'll be able to stream simultaneously on Qik and YouTube.  Google's just announced the availability of Google Earth running in a browser (though strangely, they didn't keep in sync with the release of Firefox 3.0), so we'll be able to make our mashups even more dynamic and accessible.  Throw in a little facial recognition to boot, while you're at it.

What does all this add up to? A crowd-sourced, global/hyper-local, digital video, roll-your-own-channel, keep-your-friends-close-and-your-enemies-closer news network. 

What does that make you?


Imagine if rather than turning over a videotape to the authorities, she had streamed this.  Or if Zimbabwe, Darfur, Afghanistan, Iraq, or New Orleans for that matter, were live and unedited, 24/7, from a thousand sources each.   How will that change us?