About

I lead Force Five Partners, a marketing analytics consulting firm (bio). I've been writing here about marketing, technology, e-business, and analytics since 2003 (blog name explained).

21 posts categorized "Viral Marketing"

May 10, 2013

Book Review: Converge by @rwlord and @rvelez #convergebook

I just finished reading Converge, the new book on integrating technology, creativity, and media by Razorfish CEO Bob Lord and his colleague Ray Velez, the firm’s CTO.  (Full disclosure: I’ve known Bob as a colleague, former boss, and friend for more than twenty years and I’m a proud Razorfish alum from a decade ago.)

Reflecting on the book I’m reminded of the novelist William Gibson’s famous comment in a 2003 Economist interview that “The future’s already here, it’s just not evenly distributed.”  In this case, the near-perfect perch that two already-smart guys have on the Digital Revolution and its impact on global brands has provided them a view of a new reality most of the rest of us perceive only dimly.

So what is this emerging reality?  Somewhere along the line in my business education I heard the phrase, “A brand is a promise.”  Bob and Ray now say, “The brand is a service.”  In virtually all businesses that touch end consumers, and extending well into relevant supply chains, information technology has now made it possible to turn what used to be communication media into elements of the actual fulfillment of whatever product or service the firm provides.  

One example they point to is Tesco’s virtual store format, in which images of stocked store shelves are projected on the wall of, say, a train station, and commuters can snap the QR codes on the yogurt or quarts of milk displayed and have their order delivered to their homes by the time they arrive there: Tesco’s turned the billboard into your cupboard.  Another example they cite is Audi City, the Kinect-powered configurator experience through which you can explore and order the Audi of your dreams.  As the authors say, “marketing is commerce, and commerce is marketing.”

But Bob and Ray don’t just describe, they also prescribe.  I’ll leave you to read the specific suggestions, which aren’t necessarily new.  What is fresh here is the compelling case they make for them; for example, their point-by-point case for leveraging the public cloud is very persuasive, even for the most security-conscious CIO.  Also useful is their summary of the Agile method, and of how they’ve applied it for their clients.

Looking more deeply, the book isn’t just another surf on the zeitgeist, but is theoretically well-grounded.  At one point early on, they say, “The villain in this book is the silo.”  On reading this (nicely turned phrase), I was reminded of the “experience curve” business strategy concept I learned at Bain & Company many years ago.  The experience curve, based on the idea that the more you make and sell of something, the better you (should) get at it, describes a fairly predictable mathematical relationship between experience and cost, and therefore between relative market share and profit margins.  One of the ways you can maximize experience is through functional specialization, which of course has the side effect of encouraging the development of organizational silos.  A hidden assumption in this strategy is that customer needs and associated attention spans stay pinned down and stable long enough to achieve experience-driven profitable ways to serve them.  But in today’s super-fragmented, hyper-connected, kaleidoscopic marketplace, this assumption breaks down, and the way to compete shifts from capturing experience through specialization, to generating experience “at-bats” through speedy iteration, innovation, and execution.  And this latter competitive mode relies more on the kind of cross-disciplinary integration that Bob and Ray describe so richly.
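
For readers who want the math behind that "fairly predictable mathematical relationship": the experience curve is usually written as a power law, with unit cost falling by a constant fraction every time cumulative volume doubles.  A minimal sketch, with an assumed, purely illustrative 80% learning rate:

```python
import math

# Illustrative experience-curve sketch: unit cost falls by a constant
# fraction with every doubling of cumulative volume (a power law).
# The 80% "learning rate" is an assumed, illustrative figure.
def unit_cost(cumulative_units, first_unit_cost=100.0, learning_rate=0.80):
    b = math.log(learning_rate) / math.log(2)   # elasticity exponent (negative)
    return first_unit_cost * cumulative_units ** b

for n in (1, 2, 4, 8, 16):
    print(n, round(unit_cost(n), 1))   # 100.0, 80.0, 64.0, 51.2, 41.0
```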

The book is a quick, engaging read, full of good stories drawn from their extensive experiences with blue-chip brands and interesting upstarts, and with some useful bits of historical analysis that frame their arguments well (in particular, I liked their exposition of the television upfront).  But maybe the best thing I can say about it is that it encouraged me to push harder and faster to stay in front of the future that’s already here.  Or, as a friend says, “We gotta get with the ‘90’s, they’re almost over!”

(See this review and buy the book on Amazon.com)


February 02, 2012

Please Help Me Get Listed On The #Google #Currents Catalog. And Please ReTweet!

Hi folks, I need a favor.  I need 200 subscribers to this blog via Google Currents to get Octavianworld listed in the Currents catalog.  If you're reading this on an iPhone, iPad, or Android device, follow this link:

http://www.google.com/producer/editions/CAow75wQ/octavianworld

If you are looking at this on a PC, just snap this QR code with your iPhone or Android phone after getting the Currents app.

[QR code image]

Here's what I look like on Currents:

[Screenshot of Octavianworld in Google Currents]

What is Currents?  If you've used Flipboard or Zite, this is Google's entry. If you've used an RSS reader, but haven't used any of these yet, you're probably a nerdy holdout (it takes one to know one).  If you've used none of these, and have no idea what I'm talking about, apps like these help folks like me (and big media firms too) publish online magazines that make screen-scrollable content page-flippable and still-clickable.  Yet another distribution channel to help reach new audiences.  

Thank you!

#Facebook at 100 (Almost)

So Facebook's finally filed to do an IPO.  Should you like?  A year ago, I posted about how a $50 billion valuation might make sense.  Today, the target value floated by folks is ~$85 billion.  One way to look at it then, and now, is to ask whether each Facebook user (500 million of them last January, 845 million of them today) has a net present value to Facebook's shareholders of $100. This ignores future users, but then also excludes hoped-for appreciation in the firm's value.  

One way to get your arms around a $100/user NPV is simply to discount a perpetuity: divide an annual $10-per-user cash flow (assumed equal to profit here, for simplicity) by a 10% discount rate.  Granted, this is more of a bond- than a growth-stock approach to valuation, but Facebook's already pretty big, Google's making up ground, and under these economic conditions it's probably OK to be a bit conservative.
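
As a quick check on that perpetuity arithmetic (the inputs are just the assumptions above, not a forecast):

```python
# Back-of-envelope perpetuity value per user, using the assumptions above.
annual_cash_flow_per_user = 10.0    # assumed $10/user/year (profit, for simplicity)
discount_rate = 0.10                # assumed 10%

npv_per_user = annual_cash_flow_per_user / discount_rate
users = 845e6                       # users at the time of the filing

print(npv_per_user)                 # 100.0 -> the $100/user benchmark
print(npv_per_user * users / 1e9)   # 84.5 -> roughly the ~$85B target value
```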

Facebook's filing indicated they earned $1 billion in profit on just under $4 billion in revenue in 2011.  This means they're running at about $1.20 per user in profit.  To bridge this gap between $1.20 and $10, you have to believe there's lots more per-user profit still to come.  

Today, 85% of Facebook's revenues come from advertising.  So Facebook needs to make each of us users more valuable to its advertisers, perhaps 4x so to bridge half the gap.  That would mean getting 4x better at targeting us and/or influencing our behavior on advertisers' behalf.  What would that look like?

The other half of the gap gets bridged by a large increase in the share of Facebook's revenues that comes from its cut of what app builders running on the FB platform, like Zynga, get from you.  At Facebook's current margin of 25%, $5 in incremental profit would require $20 in incremental net revenue.  Assume Facebook's cut from its third party app providers is 50%, and that means an incremental $40/year each user would have to kick in at retail.  Are each of us good for another $40/year to Facebook?  If so, where would it come from?  
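
Tracing that arithmetic explicitly, with the assumptions stated above:

```python
# Tracing the "other half of the gap" arithmetic with the stated assumptions.
incremental_profit_per_user = 5.0     # the remaining ~$5/user/year of profit
facebook_margin = 0.25                # ~$1B profit on ~$4B revenue
facebook_cut_of_app_revenue = 0.50    # assumed 50% cut from app builders

incremental_net_revenue = incremental_profit_per_user / facebook_margin
retail_spend_per_user = incremental_net_revenue / facebook_cut_of_app_revenue

print(incremental_net_revenue)   # 20.0 -> $20/user/year in net revenue to Facebook
print(retail_spend_per_user)     # 40.0 -> ~$40/user/year spent at retail
```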

My guess is that Facebook will further cultivate, through third-party developers most likely, some combination of paid content and productivity app subscription businesses.  It's possible that doing so would not only raise revenues directly but also have a synergistic positive effect on ad rates the firm can command, with more of our time and activity under the firm's gaze.

January 04, 2011

Facebook at Fifty (Billion)

Is Facebook worth $50 billion?  Some caveman thoughts on this valuation:

1. It's worth $50 billion because Goldman Sachs says so, and they make the rules.

2. It's worth $50 billion because for an evanescent moment, some people are willing to trade a few shares at that price. (Always a dangerous way to value a firm.)

3.  Google's valuation provides an interesting benchmark:

a. Google's market cap is close to $200 billion.  Google makes (annualizing Q3 2010) $30 billion a year in revenue and $8 billion a year in profit (wow), for a price-to-earnings ratio of approximately 25x.

b. Facebook claims $2 billion a year in revenue for 2010, a number that's likely higher if we annualize the latest quarters (I'm guessing; I haven't seen the books).  Google's clearing close to 30% of its revenue to the bottom line.  Let's assume Facebook's getting similar results, and say that annualized they're at $3 billion in revenues, yielding a $1 billion annual profit (which they're re-investing in the business, but ignore that for the moment).  That means a "P/E" of about 50x, roughly twice Google's (a worked version of this comparison follows point 4 below).  Facebook has half Google's uniques, but has passed Google in visits.  So maybe this growth, and the potential for more, justifies double the multiple.  Judge for yourself; here's a little data on historical P/E ratios (and interest rates, which are very low today, BTW) to give you some context.  Granted, these are for the market as a whole, and Facebook is a unique high-growth tech firm, but not every tree grows to the sky.

c. One factor to consider in favor of this valuation for Facebook is that its revenues are better diversified than Google's.  Google of course gets 99% of its revenue from search marketing.  Facebook gets a piece of the action on all those Zynga et al. games, in addition to its core display ad business.  You might argue that these game revenues are stable and recurring, and point the way to monetizing the Facebook API at very attractive, utility-like economics (high fixed costs, but super-high marginal profits once revenues pass those costs, with equally high barriers to entry).

d. Further, since viral / referral marketing is every advertiser's holy grail, and Facebook effectively owns the Web's social graph at the moment, it should get some credit for the potential value of owning a better mousetrap.  (Though, despite Facebook's best attempts -- see Beacon -- to Hoover value out of your and my relationship networks, the jury's still out on whether and how they will do that.  For perspective, consider that a $50 billion valuation for Facebook means investors are counting on each of today's 500 million users to be good for $100, ignoring future user growth.)

e. On the other hand, Facebook's dominant source of revenue (about 2/3 of it) is display advertising, and it doesn't dominate this market the way Google dominates the search ad market (market dominance lets you sustain higher profit margins beyond their natural life -- see Microsoft circa 1995).  Also, display ads are more focused on brand-building, and are more vulnerable in economic downturns.

4. In conclusion: if Facebook doubles revenues and profits off the numbers I suggested above, Facebook's valuation will more or less track Google's on a relative basis (~25x P/E).  If you think this scenario is a slam dunk, then the current price being paid for Facebook is "fair", using Google's as a benchmark.  If you think there's further upside beyond this doubling, with virtually no risk associated with this scenario, then Facebook begins to look cheap in comparison to Google.
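
For convenience, here's the point-3b comparison (and the point-4 scenario) laid out as arithmetic, using the rough estimates above:

```python
# Rough P/E comparison using the estimates in point 3.
google_market_cap = 200e9    # ~$200B
google_profit = 8e9          # ~$8B/year, annualizing Q3 2010
facebook_valuation = 50e9    # the $50B question
facebook_profit = 1e9        # ~30% margin on an estimated ~$3B in revenue

print(google_market_cap / google_profit)     # 25.0 -> Google at ~25x
print(facebook_valuation / facebook_profit)  # 50.0 -> Facebook at ~50x

# The point-4 scenario: if Facebook doubles profit, the multiples converge.
print(facebook_valuation / (2 * facebook_profit))  # 25.0
```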

Your move.

Who's got a better take?

Postscript: my brother, the successful professional investor, does; see his comment below (click "Comments").

March 13, 2010

Fly-By-Wire Marketing, Part II: The Limits Of Real Time Personalization

A few months ago I posted on what I called "Fly-By-Wire Marketing", or the emergence of the automation of marketing decisions -- and sometimes the automation of the development of rules for guiding those decisions.

More recently Brian Stein introduced me to Hunch, the new recommendation service founded by Caterina Fake of Flickr fame.  (Here's their description of how it works.  Here's my profile, I'm just getting going.)  When you register, you answer questions to help the system get to know you.  When you ask for a recommendation on a topic, the system not only considers what others have recommended under different conditions, but also what you've told it about you, and how you compare with others who have sought advice on the subject.

It's an ambitious service, both in terms of its potential business value (as an affiliate on steroids) and in terms of its technical approach to "real time personalization".  Via Sim Simeonov's blog, I read this GigaOm post by Tom Pinckney, a Hunch co-founder and their VP of Engineering.  Sim's comment sparked an interesting comment thread on Tom's post.  Both are useful reads for getting a feel for the balance between pre-computation and on-the-fly computation that goes into these services today, as well as the advantages of, and limits to, large pre-existing data sets about user preferences and behavior.

One thing neither post mentions is that there may be diminishing returns to increasingly powerful recommendation logic if the set of things from which a recommendation can ultimately be selected is limited at a generic level.  For example, take a look at Hunch's recommendations for housewarming gifts.  The results more or less break down into wine, plants, media, and housewares.  Beyond this level, I'm not sure the answer is improved by "the wisdom of Hunch's crowd" or "Hunch's wisdom about me", as much as my specific wisdom about the person for whom I'm getting the gift, or maybe by what's available at a good price. (Perhaps this particular Hunch "topic" could be further improved by crossing recommendations against the intended beneficiary's Amazon wish list?)

My point isn't that Hunch isn't an interesting or potentially useful service.  Rather, as I argued several months ago,

The [next] question you ask yourself is, "How far down this road does it make sense for me to go, by when?"  Up until recently, I thought about this with the fairly simplistic idea that there are single curves that describe exponentially decreasing returns and exponentially increasing complexity.  The reality is that there are different relationships between complexity and returns at different points -- what my old boss George Bennett used to call "step-function" change.

For me, the practical question-within-a-question this raises is, for each of these "step-functions", is there a version of the algorithm that's only 20% as complex, that gets me 80% of the benefit?  My experience has been that the answer is usually "yes".  But even if that weren't the case, my approach in jumping into the uncharted territory of a "step-function" change in process, with new supporting technology and people roles, would be to start simple and see where that goes.

At minimum, given the "step-function" economics demonstrated by the Demand Medias of the world, I think senior marketing executives should be asking themselves, "What does the next 'step-function' look like?", and "What's the simplest version of it we should be exploring?" (Naturally, marketing efforts in different channels might proceed down this road at different paces, depending on a variety of factors, including the volume of business through that channel, the maturity of the technology involved, and the quality of the available data...)

Hunch is an interesting specific example of the increasingly broad RTP trend.  The NYT had an interesting article on real time bidding for display ads yesterday, for example.  The deeper issue in the trend I find interesting is the shift in power and profit toward specialized third parties who develop the capability to match the right cookie to the right ad unit (or, for humans, the right user to the right advertiser), and away from publishers with audiences.  In the case of Hunch, they're one and the same, but they're the exception.  How much of the increased value advertisers are willing to pay for better targeting goes to the specialized provider with the algorithm and the computing power, versus the publisher with the audience and the data about its members' behavior?  And for that matter, how can advertisers better optimize their investments across the continuum of targeting granularity?  Given the dollars now flooding into digital marketing, these questions aren't trivial.

January 29, 2010

Ecommerce On The Edge In 2010 #MITX

Yesterday morning I attended MITX's "What's Next For E-Commerce" Panel at Microsoft in Cambridge.  Flybridge Capital's Jeff Bussgang moderated a panel that included Shoebuy.com CEO Scott Savitz, CSN CEO Niraj Shah, Mall Networks CEO Tom Beecher, and Avenue 100 Media Solutions CEO Brian Eberman.

The session was well-attended and the panelists didn't disappoint.  Across the board they provided a consistent cross-section of the sophistication and energy that characterize life two standard deviations to the right on the ecommerce success curve.

My notes and observations follow. But first, courtesy of Jeff, a quiz (answers at the end of the post):

1. Name the person, company, and city that originated the web-based shopping cart and secure payment process?

2. Name the person, company, and city that originated affiliate marketing on the web?

3. Name the largest email marketing firm in the world, and the city where it's headquartered?

Jeff opened by asking each of the panelists to talk about how they drive traffic, and how they try to distinguish themselves in doing so.

Brian described (my version) what his firm does as "performance marketing in the long tail", historically for education-sector customers (for- and non-profit) but now beyond that category.  What that means is that they manage bidding and creative for 2 million less-popular keywords across all the major search engines for their customers.  Their business is entirely automated and uses sophisticated models to predict when a customer should be willing to pay price X and use creative Y for keyword Z to reel in a likely-profitable order.  The idea is that the boom in SEM demand has driven prices way up for popular keywords, but that there are still efficient marketing deals to be mined in the "long tail" of keyword popularity (e.g., "structured collaboration").

Niraj noted that there's an increasing returns dynamic in the SEM channel that raises entry barriers for upstarts and helps firms like CSN preserve and expand their position.  Namely, as firms like his get more sophisticated about conversion through scale and experience, they can afford to pay higher prices for a given keyword than smaller competitors can, and can reinvest in extending their SEM capabilities.  CSN now has a 10-person search marketing team within its total staff of 500. Since SEM is, to some degree, a jump-starter for firms that don't yet have a web presence sufficient to drive traffic organically, this edge is a powerful competitive weapon.  CSN is up to $200 million in annual revenues, and now manages the online furniture stores for folks like Walmart.

Scott sounded a different note, with similar results.  Shoebuy has focused more on cultivating its relationship with its existing customers and on Lifetime Value -- including referrals.  This focus has had a salutary effect on SEO, allowing them to rely less on SEM as it gets pricier.  Last year Shoebuy experienced double-digit top-line growth and hit 8 million uniques for December's shopping season, while realizing its lowest marketing expense as a percentage of sales since 2002.  They've continued to plow the savings into a better overall customer experience.  One way Shoebuy guides this reinvestment is through extensive use of Net Promoter-based surveys.  They keep the surveys brutally simple: 1) "Were you satisfied?" 2) "Would you shop with us again?" 3) "Would you recommend us?"  Then they compare the resulting NP scores across the different things they try in their marketing mix, which gives them more nuanced insight than the binary outcome of an order can provide.
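
Shoebuy's exact scoring wasn't described in the session; for readers unfamiliar with Net Promoter, the standard calculation on the "would you recommend us?" question looks roughly like this (sample data invented for illustration):

```python
# Standard Net Promoter calculation on the "would you recommend us?" question:
# % promoters (9-10) minus % detractors (0-6) on a 0-10 scale.
# The sample responses below are made up for illustration.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

survey_responses = [10, 9, 9, 8, 7, 10, 6, 3, 9, 10]   # hypothetical
print(net_promoter_score(survey_responses))            # 40.0
```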

Tom described how, while Mall Networks' traffic is "free" -- it all comes from their loyalty program partners' sites (e.g., the Delta SkyMiles award redemption page) -- they still have to jockey for Mall Networks' placement on those pages.  (Though Tom was too polite to say so, the process for deciding who goes where on popular pages is often a blood sport, and ripe in most organizations for a more structured, rational approach.)

Ralph Folz, founder and former CEO of Molecular, asked about display -- is that making a comeback?  Brian indicated that the lack of performance and the lack of placement control through ad networks made display a highly negative experience.  He did note that they are now experimenting with participation in real-time bidding through ad exchanges for inventory that ad networks make available, sometimes in time windows only a hundred milliseconds long.  Jeff reinforced the emergence of "RTB" and mentioned MIT Prof. Ed Crawley's Cambridge-based DataXu (which Flybridge has invested in) as a leader in the field.

Affiliate marketing came up next.  Tom explained the basics (in response to a question): each of the 600 stores in Mall Networks' stable pays Mall Networks, say, a 10% commission on orders that come through it.  Mall Networks gives a chunk to the members of the various loyalty programs that shop through it -- say 3-5% of the value of the order; some goes to the loyalty programs themselves, as partial inducement for sending traffic to Mall Networks; and the rest goes to Mall Networks to cover costs and yield profits.
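
To make the split concrete, here's a hypothetical $100 order at the example rates Tom gave; the loyalty-program share is my own assumption, since he didn't break out the remainder:

```python
# Hypothetical split of a $100 order at the example 10% commission.
# The loyalty-program share below is my assumption; the panel didn't break it out.
order_value = 100.00
commission = order_value * 10 / 100          # $10 paid by the store
member_reward = order_value * 4 / 100        # say 4%, within the 3-5% range cited
loyalty_program_cut = order_value * 3 / 100  # assumed share for the program itself
mall_networks_keeps = commission - member_reward - loyalty_program_cut

print(commission, member_reward, loyalty_program_cut, mall_networks_keeps)
# 10.0 4.0 3.0 3.0 -> Mall Networks keeps ~$3 to cover costs and profit
```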

All the other panelists include affiliates in their marketing mix, and all appeared satisfied to have them play a healthy role.  Niraj specifically mentioned the ShareASale and Google Affiliate networks.  Jeff asked about everyone's frenemy Amazon; the answers were uniformly respectful: "they're a tough competitor, but they build general confidence and familiarity with the ecommerce channel, and that's good for everyone."  Niraj noted the 800-lb.-gorilla nature of their category dominance: "They're at $20B and NewEgg is the next biggest pure play at $2B.  They're a fact of life.  We just have to be better at what we focus on."

Someone in the audience raised email.  All of the panelists use it, with lists ranging from millions to hundreds of millions of recipients in size.  They noted that this traditional pillar of online marketing has now gotten very sophisticated.  In their world, they look well beyond top line metrics like open- and clickthrough rates to root-cause analysis of segment-based performance.  Re-targeting came up, and Niraj noted that for them, email and re-targeting weren't substitutes (as some have seen them) but in fact played complementary roles in their mix.  (Jeff explained re-targeting for the audience: using an ad network to cookie visitors to your site, and then serving them "please come back!" ads on other sites in the network they go to after they've abandoned a shopping cart or otherwise left your site.  A twist: serving ads inviting them to *your* site after they've abandoned one of your competitors' sites.  Hey, all's fair in love, war, and ecommerce...).  A common theme:  unlike most of the rest of the world, email teams at these leading firms are tightly integrated with other channels' operators to better integrate the overall experience, even to the point of shared metrics.

What about social?  Scott: "Building community is key for us.  We run contests -- "What are you hoping will be under your tree this Christmas?" -- to stimulate input from our customers.  And, while we have social media coordinators, many people here participate in channels like Twitter in support of our efforts."  Niraj: "Our PR team came up with a 'Living Room Rescue' contest which we did in partnership with [a popular] HGTV host [whose name escaped me -- C.B.].  We got six thousand entries; we used a panel of professional decorators to narrow the list to a hundred, and then used social voting to choose a winner.  We publicized the contest, and it took on a life of its own, as local papers tried to drum up support for their local [slobs -- my word, not Niraj's].  While we couldn't / didn't measure conversion directly from this campaign, our indirect assessment was that it had a great ROI."  Jeff observed that social's potential seems greater when the object of the buzz is newsworthy.

It was a short leap from this to a question about attribution analysis, the simultaneous dream-and-nightmare-du-jour for web analytics geeks out there.  Brian was surprisingly dismissive.  In his experience (if I understood correctly), at most 20%, and usually only 5-10%, of order-placing customers touch two or more of the properties they source clicks from, across the broad landscape they cover and across time frames ranging from a day to a month.  "In the end, only a couple of dollars would shift from one channel to another if we did attribution analysis, so in general it's not worth it."  We chatted briefly after the panel about this; there are large-ticket, high-margin exceptions to this rule (cars).  I need to learn more about this one; it surprised me.

Mobile!  Is it finally here?  Scott reports that 6-9 months ago *customers* finally began asking for it (as opposed to having it pushed by vendors), so now they have a Shoebuy.com iPhone app.  Jeff noted that customers are rolling their own mobile strategies -- some folks are now going into (say) Best Buy, having a look at products in the flesh, then checking Amazon for the items and buying them through their iPhone if the price is right.  So, your store is now Amazon's showroom.  If you can't find something, or didn't even know you wanted it, but happen to stray near a store carrying it, location-based services will push offers at you -- and the offers may come from competitors.  (Gratuitous told-you-so here.)  Niraj:  "Say you're in Home Depot.  You want a mailbox.  Their selection is 'limited' [his description was more colorful]. We have 300 to choose from.  Wouldn't you want to know that?" Jeff:  Soon we'll also see the death of the checkout line: you'll take a picture of the barcode on the object of your desire, your smartphone will tell the store's POS system about it, and the POS system will send back a digital receipt you can show someone (or in the future, something) on your way out of the store. 

With all these channels in use, I asked how often they make decisions to reallocate investments across (as opposed to within) them -- say from search to email, as opposed to from keyword to keyword.  Brian: "Every day, each morning.  Some things -- like affiliate relationships -- may take 3-4 days to unwind.  But the optimization is basically non-stop."  Later we talked about the parallels with Wall Street trading floors.  For him, the analogy is apt.  Effectively he's a market-maker, only the securities are clicks, not stocks.  It's now reflected in their recruiting: many recent hires are former Wall Street quants.

A final note: The cultures in these shops are intensely customer-focused, flat, and data-driven.  Scott reads *every one* of the hundreds of thousands (yes you read right) of customer survey responses Shoebuy gets each year.  He also described the enthusiasm with which their customer service team embraced having all company communications to customers end with an invitation to email senior management with any concerns.  Niraj described CSN's floor plan:  500 people, no offices.  Everyone in the company takes a regular turn in customer service.  Everyone has access to the firm's data warehouse.  Brian told us about a digital display they have up in their offices showing hour-by-hour, source-by-source performance.  They also recently ran a "Query Day" in which everyone in the company -- including sales, finance, HR -- got training in how to use their databases to answer business questions.  Tom described that they “watch the cash register every minute, hour, day during the Christmas shopping season.”

This was a terrific session, and I've only captured half of it here.  Further comments / corrections / observations very welcome.

Quiz Answers:

1. MIT Prof. David K. Gifford, Open Market, Cambridge

2. Tom Gerace, BeFree, Cambridge

3. Constant Contact, Waltham

January 01, 2010

Grokking Google Wave: The Homeland Security Use Case (And Why You Should Care)

A few people asked me recently what I thought of Google Wave.  Like others, I've struggled to answer this.

In the past few days I've been following the news about the failed attempt to blow up Northwest 253 on Christmas Day, and the finger-pointing among various agencies that's followed it.  More particularly, I've been thinking less about whose fault it is and more about how social media / collaboration tools might be applied to reduce the chance of a Missed Connection like this.

A lot of the comments by folks in these agencies went something like, "Well, they didn't tell us that they knew X," or "We didn't think we needed to pass this information on."  What most of these comments have in common is that they're rooted in a model of person-to-person (or point-to-point) communication, which creates the possibility that one might "be left out of the loop" or "not get the memo".

For me, this created a helpful context for understanding how Google Wave is different from email and IM, and why the difference is important.  Google Wave's issue isn't that the fundamental concept's not a good idea.  It is.  Rather, its problem is that it's paradigmatically foreign to how most people (excepting the wikifringe) still think.

Put simply, Google Wave makes conversations ("Waves") primary, and who's participating secondary.  Email, in contrast, makes participants primary, and the subjects of conversations secondary.  In Google Wave, with the right permissions, folks can opt into reading and participating in conversations, and they can invite others.  The onus for awareness shifts from the initiator of a conversation to folks who have the permission and responsibility to be aware of the conversation.  (Here's a good video from the Wave team that explains the difference right up front.)  If the conversation about Mr. Abdulmutallab's activities had been primary, the focus today would be about who read the memo, rather than who got it.  That would be good.  I'd rather we had a filtering problem than an information access / integration problem.
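
One toy way to see the difference in data models (my own sketch, not a description of how Wave is actually built): in email the recipient list is fixed per message, while in a Wave the participant list belongs to the conversation and can grow after the fact.

```python
# Toy contrast between the two models (my sketch, not Wave's actual implementation).
from dataclasses import dataclass, field
from typing import List

@dataclass
class EmailMessage:
    sender: str
    recipients: List[str]       # the audience is fixed per message
    body: str

@dataclass
class Wave:
    topic: str
    participants: List[str] = field(default_factory=list)   # the conversation owns its roster
    blips: List[str] = field(default_factory=list)          # the running content

    def add_participant(self, who: str) -> None:
        self.participants.append(who)    # a latecomer sees the whole history

w = Wave(topic="Suspicious traveler report")     # hypothetical example
w.add_participant("state-dept-analyst")
w.add_participant("watch-list-officer")          # joins later; nothing is "forwarded"
```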

You may well ask, "Isn't the emperor scantily clad -- how is this different from a threaded bboard?"  Great question.  One answer might be that "bboards typically exist either independently or as features of separate, purpose-specific web sites.  Google Wave is to threaded bboard discussions as Google Reader is to RSS feeds -- a site-independent conversation aggregator, just as Google Reader is a site-independent content aggregator."  Nice!  Almost: one problem, of course, is that Google Wave today only supports conversations that start natively in Google Wave.  Another is that you can already (sometimes) subscribe to RSS feeds of bboard posts, as in Google Groups, or follow conversations by subscribing to RSS feeds for Twitter hashtags.  Another question: "How is Google Wave different from chat rooms?"  In general, most chats are more evanescent, while Waves appear (to me) to support synchronous chat and asynchronous exchanges equally well.

Now the Big Question: "Why should I care?  No one is using Google Wave anyway."  True (only 1 million invitation-only beta accounts as of mid-November, active number unknown) -- but at least 146 million people use Gmail.  Others already expect Google Wave eventually will be introduced as a feature for Gmail: instead of / in addition to sending a message, you'll be able to start a "Wave".  It's one of the top requests for the Wave team.  (Gmail already approximates Wave by organizing its list of messages into threads, and by supporting labeling and filtering.)  Facebook, with groups and fan pages, appears to have stolen a march on Google for now, but for the vast bulk of the world that still lives in email, it's clunky to switch back and forth.  The killer social media / collaboration app is one that tightly integrates conversations and collaboration with messaging, and the prospect of Google-Wave-in-Gmail is the closest solution with any realistic adoption prospects that I can imagine right now.

So while it's absurdly early, marketers, you read it here first: Sponsored Google Waves :-)  And for you developers, it's not too early to get started hacking the Google Wave API and planning how to monetize your apps.

Oh, and Happy New Year!

Postscript: It was the software's fault...

Postscript #2: Beware the echo chamber

October 24, 2009

Activating Latent Social Networks

This morning via TechCrunch  I read Sean Parker's Web 2.0 Summit presentation materials, in which he says that the future belongs to "network services" that connect people, like Facebook, and not to "information services" that connect us to data, like Google.  My experiences at Contact Networks taught me to think of email patterns as proxies for social networks.  So, the following idea occurred to me.

Google has Gmail.  Google allows people to publish profiles.  What if Gmail had a button that allowed me to "recognize" a recipient by linking to his / her public profile when I send an email to him / her?

If I have a public profile and the recipient has one too, by pressing this "recognize" button I would make our relationship "provisionally acknowledged" (like a "friend request"); the link would become "acknowledged" if the recipient agreed.  Further, either side (with mutual agreement) could choose to "publish" this relationship in multiple social nets they participate in: Facebook, LinkedIn, Orkut, or they could even make it fully public.

The more two-way email traffic there is between the two users, the stronger the link is assumed by the service to be.  Note that this wouldn't be scored in a linear way.  Probably some sort of recency and frequency considerations would be involved, just as we had at Contact Networks.

Taking a page out of PageRank (pun partially intended), the scoring algorithm could also use the popularity of the URLs I associate with my Google profile to gauge the "centrality of my node" in the uber-network, and therefore the "value" of my "acknowledgements", when given.  Link-love could be configured by each user to be given message-by-message or by default to different email recipients.  Recipients could also "transfer" this link-love, with permission, to their other web presences (e.g., blogs).
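
To make the idea concrete, here's a minimal sketch of what recency- and frequency-weighted tie strength might look like; the decay half-life is an assumption of mine, not anything Contact Networks actually used:

```python
import math
import time

# Illustrative tie-strength score: each two-way exchange contributes weight that
# decays with age, so recent, frequent correspondence scores highest.
# The 90-day half-life is an assumption, not Contact Networks' actual scoring.
def tie_strength(exchange_timestamps, half_life_days=90.0, now=None):
    now = now or time.time()
    decay = math.log(2) / (half_life_days * 86400)    # per-second decay rate
    return sum(math.exp(-decay * (now - t)) for t in exchange_timestamps)

now = time.time()
recent_pen_pal = [now - days * 86400 for days in (1, 3, 10, 20)]
old_acquaintance = [now - days * 86400 for days in (400, 500, 700)]
print(tie_strength(recent_pen_pal, now=now) > tie_strength(old_acquaintance, now=now))  # True
```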

The idea isn't limited to the major mail platforms, either.  Any media firm with an online community has a latent social network that could be defined by the response patterns in forum posts.  Users wouldn't experience the pain and inconvenience of joining YASNS, just a minor modification -- perhaps a welcome one, if accompanied by a little extra valuable information -- to how they interact already in the communities they belong to.  "Activating" such social networks through mechanisms similar to the ones described above would enhance the viral marketing potential of the communities, which would appeal to advertisers.

Since basically everyone uses email, doing this would also "democratize the social graph".  What I mean is that today there are two kinds of networks.  Either they are private -- owned and run by Facebook, LinkedIn, etc. -- or they are "public-but-elite", defined by the link structure of the Web.  In the former case, if amigo ergo sum ("I friend therefore I am"), I exist at Facebook's whim.  In the latter case, only folks who take the time to establish a public web presence and get linked to (say, through a blog, or a social net public profile) exist.  (Reminds me of Steve Martin's excitement at making it into the phone book in The Jerk.)  An open, more inclusive social graph mechanism than either of these currently provides would help bridge the digital divide, among other benefits.

Who's doing this?  The idea isn't entirely original.  Partially relevant: Facebook has just updated its News Feed to consider interactions between users as inputs for how to filter items to each user.  I'm sure this must have occurred to the major portals with email services.  Seems like a natural feature for Google Wave, for example, though I haven't seen it.  Surely (as with Contact Networks) it's also valuable to large organizations to establish "enterprise social networks", inside and beyond. 

Postscript: Gather.com CEO Tom Gerace commented they are working on a patent-pending capability they call PeopleRank that will do what I describe above in the online community section of this post. Google's been thinking about this for at least a year -- how come we haven't heard more yet?

July 29, 2009

(Re-)Thinking Marketing Silos

I've been wondering lately about the social marketing boom, the timing of its inevitable commoditization (via Adverblog, parodied here, warning -- possibly NSFW) and where its useful boundaries are -- at what point does it stop making sense to talk about a social media campaign per se, and to start thinking about social media elements in integrated campaigns.  (For example, how do you classify a social "forward to a friend" capability in an email campaign?)  The issue is broader of course.  Integrating the "customer experience" is widely acknowledged as a good thing these days across many channels, not just with respect to social media.  

Silos in marketing organizations are widely acknowledged as a principal barrier to realizing this objective.  So, three questions are worth asking.  Why do they exist?  What organizational model would be a better choice?  And, practically speaking, how do we get from A to B?

At an operational level, silos today are most commonly defined by channel.  That is, for example, offline channels vs. online channels (in retail, the folks who run the stores vs. the folks who run the website; in media, the folks who run the print magazines vs. the folks who run "interactive").  Or, within "online", email vs. web site vs. social media.  

These definitions get started because even though what folks share in common may be really important -- knowledge about a customer segment's habits for example -- the specialized expertise  required to mechanically execute campaigns in the medium is very different.  Even when these roles are combined, individuals or small groups may still think "channel first" because of the concentration required to think and execute mechanically.

The channel-defined definitions persist in part because the vendors who supply the tools for those channels have an interest in continuing to fight commoditization with new, complex features rather than with simplicity, which though virtuous, sadly accelerates commoditization rather than staving it off.  (The exception -- Apple -- proves this rule.)  Also, vendors have historically made integration hard, though this is changing.  Further, there is a vendor-sponsored event ecosystem that pulls channel-defined specialists together more often (and in nicer places, with better schwag) than, say, audience-segment-centric conferences might.  And finally, just as cucumbers rarely survive their encounters with barrels of brine, folks steeped in particular media begin to think differently in ways that shape their abilities to think across channels.  Someone with a lifetime as an auteur of 30-second TV spots may find it harder to think about "conversations".

What alternatives might be better, and when?  If different target customer segments had radically different media usage patterns, you might cut things that way.  Or, if the media used to  attract, engage, and convert customers were similarly distinct, you might slice efforts by stage of purchase funnel.  But things have begun to blend: old folks are using Facebook and Twitter as much as young folks (who increasingly think these media are uncool as a result), and what do you call a retail checkout experience that stays on an affiliate's site?

So, the answer may be ad hoc task force overlays to today's channel-defined silos.  These can be reinforced by experience-wide metrics, analytics, and perhaps even evaluation and compensation frameworks for the members of these task forces.  But perhaps the most important first steps are awareness of what customers experience, a firm's business strategy  for choosing and serving (what products, priced how?) the ones they want, and mutual communication and education about how different folks approach and do their jobs in their respective channels.

To what degree do silos constrain you today? How are you adapting to overcome their challenges, and improve target customers' experiences?

July 21, 2009

Facebook at 250 (Million): What's Next? And What Will Your Share Be?

Facebook announced last week that it had passed 250 million members.  Since no social network grows to the sky (as MySpace experienced before it), it's useful to reflect on the enablers and constraints to that growth, and on the challenges and opportunities those constraints present to other major media franchises (old and new) that are groping for a way ahead.

"Structured Collaboration" principles say social media empires survive and thrive based on how well they support value, affinity, and simplicity.  That is,

  • how useful (rationally and emotionally) are the exchanges of information they support?
  • how well do they support group structures that maximize trust and lower information vetting costs for members? 
  • how easy do they make it for users to contribute and consume information? 
(There are of course additional, necessary "means to these ends" factors, like "liquidity" -- the seed content and membership necessary to prime the pump -- and "extensibility" -- the degree to which members can adapt the service to their needs -- but that's for another post.)

My own experience with Facebook as a user, as well as my professional experience with it in client marketing efforts, has been:
  • Facebook focuses on broad, mostly generic emotional exchanges -- pictures, birthday reminders, pokes.  I get the first two, and I admire the economy of meaning in the third.  The service leaves it to you to figure out what else to share or swap.  As a result, it is (for me anyway) <linkbait> only sometimes  relevant as an element in a B2C campaign, and rarely relevant in a B2B campaign </linkbait>
  • Facebook roared past MySpace because it got affinity right -- initially.  That is, Facebook's structure was originally constrained -- you had to have an email address from the school whose Facebook group you sought to join.  Essentially, there had to be some pre-existing basis for affinity, and Facebook just helped (re-)build this connective tissue.  Then, Facebook allowed anyone to join, and made identifying the nature of relationships established or reinforced there optional.  Since most of us including me are some combination of busy and lazy, we haven't used this feature consistently to describe the origins and nature  of these relationships.  And, it's cumbersome and awkward to have to go back and re-categorize "friends". (An expedient hack on this might be to allow you to organize your friends into groups, and then ask you which groups you want to publish items to, as you go.)
  • Facebook is a mixed bag as a UI.  On one hand, by allowing folks to syndicate blogs and tweets into Facebook, they've made our life easier.  On the other, the popular unstructured communications vehicles -- like the "Wall" -- have created real problems for some marketers.  Structured forms of interaction that would have created less risky choices for marketers, like polls, have come later than they should have and are still problematic ( for example, you can't add polls to groups yet, which would be killer).  And, interacting with Facebook through my email client -- on my PC and on my smartphone -- is still painful.  To their credit, Facebook opened up a great API to enable others to build specialized forms of structured interaction on its social graph. But in doing so it's ceded an opportunity to own the data associated with potentially promising ones.  (Like prediction markets; Inkling Markets, for example, lets you syndicate notices of your trades to Facebook, but the cupboard's pretty bare still for pm apps running against Facebook directly.)

The big picture: Facebook's optimizing size of the pie versus share of the pie.  It can't be all things to all people, so it's let others extend it, share in the revenue, and create streams of their own.  Estimates of the revenues to be earned this year by the ecosystem of third-party app developers running on Facebook and MySpace run to $300-500 million, growing at 35% annually.

Them's not "digital dimes", especially in the context of steep declines in September ad pages at leading magazine franchises, as well as stalled television network upfronts.  But, folks might argue, "Do I want to live in thrall to the fickle Facebook API, and rent their social graph at a premium?"  The answer isn't binary -- how much of an app's functionality lives in Facebook, versus on a publisher's own servers, is a choice.  Plus, there are ways to keep Facebook honest, like getting behind projects like OpenSocial, as other social networks have done.  (OpenSocial is trying to become to Facebook's social graph what Linux is to Windows.  Engineer friends, I know -- only sort of.)  And, for Old Media types who don't feel they are up to the engineering tasks necessary, there are modern-day Levi Strausses out there selling jeans to the miners -- like Ning, which just today raised more money at a high valuation.  Still too risky?  Old Media could farm out app development to their own third-party developer networks, improving viral prospects by branding and promoting (to their subscriber lists) the ones they like in exchange for a cut of any revenues.  In this scenario, content gets added as an ingredient, not the whole main course.

What is true in the new environment is that reach-based ad network plays surfing on aggregated content won't pay any more.  Rather we have to think about services that would generate more revenue from narrower audiences.  The third-party games created by Facebook app developers referenced above demonstrate how those revenues might stem from value through entertainment.  As we speak, Apple and its developers are earning non-trivial sums from apps.  Phonetag has its hands in folks' pockets (mine included) for $10/month for its superuseful -- albeit non-social -- transcription service.  Filtering for relevant content is a big challenge and opportunity.  Might someone aggregate audiences with similar interests and offer a retail version sourced at wholesale from filtering service firms like Crimson Hexagon?  Looks like Porter Novelli may already be thinking down these lines...

Let's push the math: a winning service by anyone's measure would earn, say, $50M a year.  Four bucks a month from each person is roughly $50/year.  You'd then need a million folks -- 1/250th of Facebook's user base -- to sign up.  Reasonability check: consider the US circulation of some major magazine titles.

If your application service is especially useful, maybe you can get $2/month directly from each person.  Maybe you can make the rest up in ecommerce affiliate commissions (a 10% commission on $125 in annual purchases by each person gets you ~$1/month) and ad revenue (the remaining ~$12 million/year nut works out to about one dollar per member per month; at a $10 CPM, that means each of your million users needs to account for roughly 100 ad impressions a month -- a few page views a day -- to cover that nut).
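
Putting those per-user pieces together (every figure below is one of the assumptions above, rounded the same way):

```python
# Putting the per-user pieces together, using the rounded figures above.
users = 1_000_000
subscription = 2.00 * 12          # $2/month -> $24/user/year
affiliate = 0.10 * 125            # 10% of $125 in annual purchases -> ~$12.50/user/year
ads = 1.00 * 12                   # ~$1/month -> $12/user/year (the ~$12M nut)

total_per_user = subscription + affiliate + ads
print(total_per_user)                   # 48.5 -> roughly the $50/user/year target
print(total_per_user * users / 1e6)     # 48.5 -> roughly $50M/year across a million users

cpm = 10.0                              # $10 per 1,000 ad impressions
impressions_per_user_per_month = (ads / 12) * 1000 / cpm
print(impressions_per_user_per_month)   # 100.0 -> about 100 impressions a month per user
```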

We also have to be prepared to live in a world where the premiums such services earn are as evanescent as mayflies, especially if we build them on open social graphs.  But that's ok -- just as Old Media winners built empires on excellent, timely editorial taste in content, New Media winners will build their franchises on "editorial noses" for function-du-jour, and function-based insights relevant to their advertisers.  And last time I checked, function and elegance were not mutually exclusive.

So, even as we salute the Facebook juggernaut as it steams past Media Beach, it's time to light some design workshop campfires, and think application services that have "Value, Affinity, Simplicity."
