About

Cesar A. Brea bio at Force Five Partners

     

April 16, 2014

Book Review: "Big Data @ Work", by Tom Davenport

I've just finished Big Data @ Work: Dispelling The Myths, Uncovering The Opportunities, by Tom Davenport, the author of Competing On Analytics.  

The book marks a watershed moment in the Big Data zeitgeist. Much of the literature on the topic to this point has been more evangelical, telling us how analytics will make us all taller, smarter, and more handsome.  But my general sense has been of stories that are "way out there" for most organizations.  This latest book is much more about how to realize these visions, with tactical, practical prescriptions across a range of issues.

Perhaps the most important of these dimensions is having a clear idea of the challenges or opportunities for which Big Data might be part of the solution.  In Chapter Two, Davenport presents a very helpful series of Big Data use cases in several industry applications, including business travel, energy management, retail, and home education. He pushes further to examine the relative readiness of a number of different industries and business functions, including marketing and sales (the particular focus of my own upcoming book, Marketing and Sales Analytics). In Chapter Three he builds on these examples and sector assessments to offer a framework for shaping business strategies that leverage Big Data.  He suggests cost reduction, time reduction, new offerings, and decision support as broad objectives for focusing Big Data initiatives, and then draws a useful distinction between discovery-oriented applications of Big Data (say, sorting out emergent patterns of behavior worth addressing) and production-oriented usage (say, personalizing experiences based on those patterns).

This "ends"-focused approach to applying Big Data, in contrast to an "if I build it (my giant Hadoop cluster), they will come" mindset, is an extremely valuable perspective to introduce at this point in the evolution of this trend, and Davenport has wrapped it in a clean, well-organized package of specific advice that executives interested in this space can profit from.

My New Book: #Marketing and #Sales #Analytics

I've written a second book.  It's called Marketing and Sales Analytics: Proven Techniques and Powerful Applications From Industry Leaders (so named for SEO purposes).  Pearson is publishing it (special thanks to Judah Phillips, author of Building A Digital Analytics Organization, for introducing me to Jeanne Glasser at Pearson).  The ebook version will be available on May 23, and the print version will come out June 23.

The book examines how to focus, build, and manage analytics capabilities related to sales and marketing.  It's aimed at C-level executives who are trying to take advantage of these capabilities, as well as other senior executives directly responsible for building and running these groups. It synthesizes interviews with 15 senior executives at a variety of firms across a number of industries, including Abbott, La-Z-Boy, HSN, Condé Nast, Harrah's, Aetna, The Hartford, Bed Bath & Beyond, Paramount Pictures, Wayfair, Harvard University, TIAA-CREF, Talbots, and Lenovo. My friend and former boss Bob Lord, author of Converge was kind enough to write the foreword.

I'm in the final editing stages. More to follow soon, including content, excerpts, nice things people have said about it, slideshows, articles, lunch talk...

January 17, 2014

Culturelytics

I'm working on a book. It will be titled Marketing and Sales Analytics: Powerful Lessons from Leading Practitioners. My first book, Pragmalytics, described some lessons I'd learned; this book extends those lessons with interviews with more than a dozen senior executives grappling with building and applying analytics capabilities in their companies. Pearson's agreed to publish it, and it will be out this spring. Right now I'm in the middle of the agony of writing it. Thank you Stephen Pressfield (and thanks to my wife Nan for introducing us).

A common denominator in the conversations I've been having is the importance of culture. Culture makes building an analytics capability possible. In some cases, pressure for culture change comes outside-in: external conditions become so dire that a firm must embrace data-driven objectivity. In others, the pressure comes top-down: senior leadership embodies it, leads by example, and is willing to re-staff the firm in its image. But what do you do when the wolf's not quite at the door, or when it makes more sense (hopefully, your situation) to try to build the capability largely within the team you have than to make wholesale changes?

There are a lot of models for understanding culture and how to change it. Here's a caveman version (informed by behavioral psychology principles, and small enough to remember). Culture is a collection of values -- beliefs about what works and what doesn't: what behaviors lead to good outcomes for customers, shareholders, and employees, and what behaviors are ignored or punished.


Values, in turn, are developed through chances individuals have to try target behaviors, the consequences of those experiences, and how effectively those chances and their consequences are communicated to other people working in the organization.


Chances are to culture change as reps (repetitions) are to sports. If you want to drive change, to get better, you need more of them. Remember that not all reps come in games: test programs can support culture change the same way practices work for teams. Also, courage is a muscle: to bench press 500 pounds once, start with one pushup, then ten, and so on. If you want your marketing team to get comfortable conceiving and executing bigger, bolder bets, start by frequently carving out many small test cells in your programs. Then add weight: define and bound dimensions and ranges for experimentation within those cells that have not just limits, but also minimums for departure from the norm. If you can't agree on exactly what part of your marketing mix needs the most attention, don't study it forever. A few pushups won't hurt, even if it's your belly that needs the attention. A habit is easier to re-focus than it is to start.

Consequences need to be both visible and meaningful. Visible means good feedback loops to understand the outcome of the chance taken. Meaningful can run to more pay and promotion of course, but also to opportunity and recognition. And don't forget: a sense of impact and accomplishment -- of making a difference -- can be the most powerful reinforcer of all. For this reason, a high density of chances with short, visible feedback loops becomes really important to your change strategy.

Communication magnifies and sustains the impact of chances taken and their consequences. If you speak up at a sales meeting, the client says Good Point, and I later praise you for that, the culture change impact is X. If I then relate that story to everyone at the next sales team meeting, the impact is X * 10 others there. If we write down that behavior in the firm's sales training program as a good model to follow, the impact is X * 100 others who will go through that program.

Summing up, here's a simple set of questions to ask for managing culture change:

  • What specific values does our culture consist of?
  • How strongly held are these values: how well-reinforced have they been by chances, consequences, and communication?
  • What values do I need to keep / change / drop / add?
  • In light of the pre-existing value topology -- fancy way of saying, the values already out there and their relative strength -- what specific chances, consequences, communication program will I need to effect the necessary keeps / changes / drops / adds to the value set?
  • How can my marketing and sales programs incorporate a greater number of formal and informal tests? How quickly and frequently can we execute them?
  • What dimensions (for example, pricing, visual design, messaging style and content, etc.) and "min-max" ranges on those dimensions should I set? 
  • How clearly and quickly can we see the results of these tests?
  • What pay, promotion, opportunity, and recognition implications can I associate with each test?
  • What mechanisms are available / should I use to communicate tests and results?

Ask these questions daily, tote up the score -- chances taken, consequences realized, communications executed -- weekly or monthly. Track the trend, slice the numbers by the behaviors and people you're trying to influence, and the consequences and communications that apply. Don't forget to keep culture change in context: frame it with the business results culture is supposed to serve. Re-focus, then wash, rinse, repeat.  Very soon you'll have a clear view of and strong grip on culture change in your organization.
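For the quantitatively inclined, here's a minimal sketch of what that weekly tally might look like. The event log, behavior tags, and names are all hypothetical illustrations, not a prescribed tool:

```python
from collections import Counter

# Hypothetical event log: each entry records one culture-change "rep" --
# a chance taken, a consequence realized, or a communication executed --
# tagged with the behavior and person it's meant to influence.
events = [
    {"week": "2014-W03", "kind": "chance",        "behavior": "bold_tests", "person": "ana"},
    {"week": "2014-W03", "kind": "consequence",   "behavior": "bold_tests", "person": "ana"},
    {"week": "2014-W03", "kind": "communication", "behavior": "bold_tests", "person": "team"},
    {"week": "2014-W04", "kind": "chance",        "behavior": "bold_tests", "person": "raj"},
]

def weekly_score(events):
    """Tote up chances / consequences / communications by week."""
    tally = Counter((e["week"], e["kind"]) for e in events)
    weeks = sorted({e["week"] for e in events})
    return {w: {k: tally[(w, k)] for k in ("chance", "consequence", "communication")}
            for w in weeks}

print(weekly_score(events)["2014-W03"])
# {'chance': 1, 'consequence': 1, 'communication': 1}
```

The same log slices by `behavior` or `person` just as easily, which is what lets you track the trend against the people and behaviors you're trying to influence.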

November 23, 2013

Book Review: "The Human Brand"

October 13, 2013

Unpacking Healthcare.gov

So healthcare.gov launched, with problems.  I'm trying to understand why, so I can apply some lessons in my professional life.  Here are some ideas.

First, I think it helps to define some levels of the problem.  I can think of four:

1. Strategic / policy level -- what challenges do the goals we set create?  In this case, the objective is basically two-fold: first, reduce the costs of late-stage, high-cost uncompensated care by enrolling the people who ultimately use it (middle-aged poor folks and other unfortunates) in health insurance that will get them care earlier and reduce stress / improve outcomes (for them and for society) later; second, reduce the cost of this insurance through exchanges that drive competition.  So, basically, bring a bunch of folks from, in many cases, the wrong side of the Digital Divide, and expose them to a bunch of eligibility- and choice-driven complexity (proof: the need for "Navigators"). Hmm.  (Cue the folks who say that's why we need a simple single-payer model, but the obvious response is that it simply wasn't politically feasible.  We need to play the cards we're dealt.)

2. Experience level -- In light of that need, let's examine what the government did for each of the "Attract / Engage / Convert / Retain" phases of a Caveman User Experience.  It did promote ACA -- arguably insufficiently, or not creatively enough to distinguish itself from the opposing signal levels it should have anticipated (one take here).  But more problematically, from what I can tell, the program skips "Engage" and emphasizes "Convert": Healthcare.gov immediately asks you to "Apply Now" (see screenshot below, where "Apply Now" is prominently featured over "Learn More", even on the "Learn" tab of the site). This is technically problematic (see #3 below), but also experientially a lot to ask when you don't yet know what's behind the curtain.

[Screenshot: healthcare.gov "Apply Now" page]
3. Technical level -- Excellent piece in Washington Post by Timothy B. Lee. Basically, the system tries to do an eligibility check (for participation and subsidies) before sending you on to enrollment.  Doing this requires checking a bunch of other government systems.  The flowchart explains very clearly why this could be problematic.  There are some front end problems as well, described in rawest form by some of the chatter on Reddit, but from what I've seen these are more superficial, a function of poor process / time management, and fixable.

4. Organizational level -- Great article here in Slate by David Auerbach. Basically, poor coordination structure and execution by HHS of the front and back ends.

Second, here are some things HHS might do differently:

1. Strategic level: Sounds like some segmentation of the potential user base would have suggested a much greater investment in explanation / education, in advance of registration.  Since any responsible design effort starts with users and use cases, I'm sure they did this.  But what came out the other end doesn't seem to reflect that.  What bureaucratic or political considerations got in the way, and what can be revisited, to improve the result? Or, instead of allowing political hacks to infiltrate and dominate the ranks of engineers trying to design a service that works, why not embed competent technologists, perhaps drawn from the ranks of Chief Digital Officers, into the senior political ranks, to advise them on how to get things right online?

2. Experience level: Perhaps the first couple of levels of experience on healthcare.gov should have been explanatory?  "Here's what to expect, here's how this works..." Maybe video (could have used YouTube!)? Maybe also ask a couple of quick anonymous questions to determine whether the eligibility / subsidy check would be relevant, to spare the load on that engine, before showing what plans might be available, at what price?  You could always re-ask / confirm that data later, once the user's past the shopping / evaluation stage, before formally enrolling them in a plan.  In ecommerce, we don't ask untargeted shoppers to enter discount codes until they're about to check out, right?

Or, why not pre-process and cache the answer to the eligibility question the system currently tries to calculate on the fly?  After all, the government already has all our social security numbers and green card numbers, and our tax returns.  So by the time any of us goes to the site, it could have pre-determined the size of any potential subsidy we'd be eligible for, and it could have used this *estimated* subsidy to calculate a *projected* premium we might pay.  We'd need a little registration / security, maybe "enter your last name and social security number, and if they match we'll tell you your estimated subsidy". (I suppose returning a subsidy answer would confirm for a crook who knows my last name that he had my correct SSN, but maybe we could prevent the brute-force querying this requires with CAPTCHA. Security friends, please advise.  Naturally, I'd make sure the pre-cached lookup file stays server-side, and isn't exposed as an array in a client-side JavaScript snippet!)

3. I see from viewing the page source they have Google Tag Manager running, so perhaps they also have Google Analytics running too, alongside whatever other things...  Since they've open-sourced the front end code and their content on Github, maybe they could also share what they're learning via GA, so we could evaluate ideas for improving the site in the context of that data?

4. It appears they are using Optimizely to test / optimize their pages (javascript from page source here).  While the nice pictures of people smiling may test well, there's plenty of research suggesting that by pushing many of the links to site content below the fold, and forcing us to scroll to see them, they might be burying the very resources the "experience perspective" I've described suggests they need to highlight.  So maybe this layout is in fact what maximizes the result they're optimizing for -- pressing the "Apply Now" button -- but maybe that's the wrong question to be asking!
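The pre-compute-and-cache idea in #2 above could be sketched roughly as follows. This is purely illustrative -- the key scheme, salt, and cached values are invented for the example, and this is emphatically not how healthcare.gov actually works:

```python
import hashlib

# Server-side secret salt, so the cache keys can't be recomputed by outsiders.
SALT = b"server-side-secret"

def cache_key(last_name: str, ssn: str) -> str:
    """Derive a lookup key from (last name, SSN) without storing either in the clear."""
    raw = SALT + last_name.strip().lower().encode() + ssn.encode()
    return hashlib.sha256(raw).hexdigest()

# Hypothetical cache, built in an offline batch job from data the government
# already holds; it stays server-side and is never shipped to the client.
subsidy_cache = {
    cache_key("smith", "123-45-6789"): {"est_subsidy": 2400, "proj_premium": 180},
}

def lookup_estimate(last_name: str, ssn: str):
    """Return a pre-computed estimate, or None if name and SSN don't match."""
    return subsidy_cache.get(cache_key(last_name, ssn))

print(lookup_estimate("Smith", "123-45-6789"))  # {'est_subsidy': 2400, 'proj_premium': 180}
print(lookup_estimate("Jones", "123-45-6789"))  # None
```

The point of the sketch is simply that the expensive cross-agency eligibility check moves to a nightly batch job, so the live site does only a cheap dictionary lookup; rate limiting or CAPTCHA would still be needed to blunt brute-force probing.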

Postscript, November 1:

Food for thought (scroll to bottom).  How does this happen?  Software engineer friends, please weigh in!

 

September 11, 2013

Book Review: "Building A Digital Analytics Organization" by @Judah Phillips #analytics

I originally got to know Judah Phillips through Web Analytics Wednesdays events he organized, and in recent years he's kindly participated on panels I've moderated and has been helpful to my own writing and publishing efforts. I've even partnered with some of the excellent professionals who have worked for him. So while I'm biased as the beneficiary of his wisdom and support, I can also vouch first-hand for the depth and credibility of his advice. In short, in an increasingly hype-filled category, Judah is the real deal, and this makes "Building A Digital Analytics Organization" a book to take seriously.

For me the book was useful on three levels. One, it's a foundational text for framing how to come at business analysis and reporting. Specifically, he presents an Analytics Value Chain that reminds us to bookend our analytic efforts per se with a clear set of objectives and actions, an orientation that's sadly missing in many balkanized corporate environments. Two, it's a blueprint for your own organization-building efforts. He really covers the waterfront, from how to approach analysis, to different kinds of analysis you can pursue, to how to organize the function and manage its relationships with other groups that play important supporting roles. For me, Chapter 6, "Defining, Planning, Collecting, and Governing Data in Digital Analytics" is an especially useful section. In it, he presents a very clear, straightforward structure for how you should set up and run these crucial functions. Finally, three, Judah offers a strong point of view on certain decisions. For example, I read him to advocate for a strongly centralized digital analytics function, rooted in the "business" side of the house, to make sure that you have both critical mass for these crucial skills, as well as proximity to the decisions they need to support.

These three uses had me scribbling in the margins and dog-earing extensively. But if you still need one more reason to pull the trigger, it helps that the book is very up-to-date and has a final chapter that looks forward very thoughtfully into how Judah expects what he describes as the "Analytical Economy" to evolve. This section is both a helpful survey of the different capabilities that will shape this future as well as an exploration of the issues these capabilities and associated trends will raise, in particular as they relate to privacy. It's a valuable checklist, to make sure you're not just building for today, but for the next few years to come.

Here's the book and the review on Amazon.

September 01, 2013

#MITX Panel: Analytically Aligned Decision Making in the Multi-Agency Context

I moderated this panel at the Massachusetts Innovation and Technology Exchange's (mitx.org) "The Science of Marketing: Using Data & Analytics for Winning" summit on August 1, 2013.  Thanks to T. Rowe Price's Paul Musante, Visual IQ's Manu Mathew, iKnowtion's Don Ryan, and Google's Sonia Chung for participating!

 

July 16, 2013

Please sponsor my 2013 NLG #autism ride: 2007 Ride Recap

On July 27, I'll be riding once again in the annual Nashoba Learning Group bike-a-thon, and I'd really appreciate your support:

http://www.crowdrise.com/nlgbikecesar2013

(Note: please also Like / Retweet / forward to friends, etc. using links at bottom!)

This is a great cause, and an incredibly effective and well-run school.  Your contribution will make a big difference. (And thank you to everyone who's been so generous so far!)

For kicks, here's my recap of my 2007 ride:

"Friends,

Thank you all for being so generous on such short notice!   

Fresh off a flight from London that arrived in Boston at midnight on Friday, I wheeled myself onto the starting line Saturday morning a few minutes after eight.  Herewith, a few journal entries from the ride:

Mile 2:  The peloton drops me like a stone.  Dopeurs!  Never mind; this breakaway is but le petit setback.  Where are my domestiques to bring me back to the pack?

Mile 3:  Reality intrudes.  No domestiques.  Facing 47 miles' worth of solo quality time, I plot my comeback...

Mile 10: 1st major climb, L'Alpe de Bolton (MA), a steep, nasty little "beyond classification" grade.  I curse at the crowds pressing in.  'Allez!  Allez!' they call, like wolves.  A farmer in a Superman cape runs alongside.

Mile 10.25: Mirages disappear in the 95-degree heat.  (First time I've seen the Superman dude, though.  Moral of this story: lay off the British Airways dessert wines the night before a big ride.) 

Mile 10.5: Descending L'Alpe de Bolton, feeling airborne at 35 MPH

Mile 10.50125: Realizing after hitting bump that I am, in fact, airborne.   AAAAARRH!!!

Mile 14: I smell sweet victory in the morning air!

Mile 15:  Realize the smell is actually the Bolton dump

Mile 27: Col d'Harvard (MA).  Mis-shift on steep climb, drop chain off granny ring.  Barely click out of pedal to avoid keeling over, disappointing two buzzards circling overhead. 

Mile 33:  Whip out Blackberry, Googling 'Michael Rasmussen soigneur' to see if I can score some surplus EPO

Mile 40:  I see dead people

Mile 50:  I am, ahem... outsprinted at the finish.  Ride organizers generously grant me 'same time' when they realize no one noticed exactly when I got back."
 

June 16, 2013

Organizing for #Analytics - Seven Considerations

We're now in the blood-sugar-crash phase of the Analytics / Big Data hype cycle, where the gap between promise and reality is greatest.  Presenting symptoms of the gap include complaints about alignment, access to data, capacity to act on data-driven insights, and talent.  This September 2012 HBR blog post by Paul Barth and Randy Bean of NewVantage Partners underscores this with some interesting data.

Executives' anxiety about this gap is also at its peak.  Many of them turn to organization as their prime lever for solving things. A question I get a lot is "How should we organize our analytic capabilities?"  Related ones include "How centralized should they be?", and "What should be on the business side, and what belongs in IT?"  

This post suggests a few criteria for helping to answer these questions.  But first, I'd like to offer a principle for tackling this generally:

Think organization last, not first.

A corollary to this might be, "Role is as role does."  Too much attention today is paid to developing and organizing for analytic capability.  Not enough attention is paid to defining and managing a portfolio of important business opportunities that leverage this capability.  In our work with clients, we focus on building capability through practice and results.  Our litmus test for whether we're making progress is a rule we call "3-2-1": In each quarter, the portfolio of business opportunities we're supporting with analytic efforts has to yield at least three "news you can use" insights, two experiments based on these insights, and one "scaling" of prior experiments to "production", with commensurate results.  (The specific goals we set for each of these varies of course from situation to situation, but the approach is the same.)

Approaching things this way has several benefits:

  • You frame "Analytics" and "Big Data" requirements in terms of what you need to solve specific challenges relevant to you, not in terms of a vendor's list of features;
  • You stay focused on the result, and not the input, so you don't invest past the point of diminishing returns;
  • By keeping cycles short and accountable to this rule, you hedge execution risk and maximize learning;
  • Your talent recruitment, development, and organization are done in the context of explicit opportunities, and thus stay flexible and integrated around concrete business results and not abstract concepts for what you need;
  • The results-oriented management of the capability helps build confidence that the overall ROI expected will be achieved.  Momentum is strategic.

Now, two critiques that can be made of this approach are, first, that it's too ad hoc and therefore misses opportunities to leverage experience beyond each individual opportunity addressed, and second, that it ignores that most people are "tribal" and that their behaviors are shaped accordingly.  So once you've got a decent portfolio assembled and you're managing it along, here are some organizational considerations you can apply to help decide where folks should "live":

  • For the business opportunities you're faced with, how unique is "local knowledge" -- that is, intimate knowledge of the specific market dynamics or operational mechanics that generate the data and shape the necessary analytics -- to each of them?  The more so, the more it will make sense to place your analysts in the groups responsible for those areas.
  • To what extent does the type of analysis you are pursuing require a certain degree of critical mass? It's hard for a single person or even small groups to manage and mine a Big Data capability, and if you sprinkle Big Data analysts throughout your firm to support different groups, you overwhelm each of them and under-serve the opportunity. Plus, each of them ends up with different Hammers Looking For Nails based on the particular tools and techniques they learn, rather than picking the best ones for different jobs.
  • How important is enterprise leverage to the business case for your capability?  If it's central, centralizing your analysts so that purchasing efficiencies and idea sharing and reuse are maximized will matter more.
  • Are you concerned about objectivity?  When analysts get embedded deeply with business teams, there's a risk they can "go native", either because they fall in love with the solutions they're part of developing, or because of pressure, subtle and otherwise, to prove these solutions work.  This phenomenon is well-documented in scientific fields, even with peer review, so it's certainly more problematic in business.  
  • Are you, for whatever reason, having trouble keeping your analysts and their efforts aligned with your key priorities? For example, if one group needs to quickly get a product into market to grab its share of a high-growth opportunity, and then evolve it from there, and your analysts work in a group whose norms and objectives are more about "perfect" than "good enough", you may need to move folks, or get different folks in place.
  • How's your analyst-marketer relationship? If they're talking and working together productively, and the interpersonal karma is good, you can worry less about whether their boxes on the chart are closer or further apart.
  • Finally, which of these four "C's" describes the behavior you're trying to encourage: communication, coordination, collaboration, or control?  At the communication end of the spectrum, you just want folks to be aware of each other's efforts.  Coordination, for example, can mean "Hey, I'll be running my test Tuesday, so could you wait until Wednesday?"  Collaboration may require formal re-grouping, but it might only be temporary.  Control can be necessary for effective execution of complex projects.  The more analytic success relies on such control, rather than being satisfied by the "lesser" C's, the more you may solve for that with organization.

In our work we'll typically apply these criteria using scoresheets, to evaluate the specific business challenges we're solving for, the organizational models we're considering as possible options, or both.  Sometimes we just use "high-medium-low" assessments, and other times we'll do the math to help us stay objective about different ways to go.  The main things are to keep attention to organization in balance with attention to progress, and to keep discussions about organization focused on the needs of the business, rather than allowing them to devolve into proxy battles for executive power and influence.
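Here's a minimal sketch of "doing the math" on such a scoresheet. The criteria names, weights, and ratings below are hypothetical illustrations of the approach, not values we'd prescribe for any particular client:

```python
# Map the high/medium/low assessments to numbers.
SCALE = {"high": 3, "medium": 2, "low": 1}

# Hypothetical weights reflecting how much each consideration matters
# for the business challenges in the portfolio.
criteria_weights = {
    "local_knowledge": 3,
    "critical_mass": 2,
    "enterprise_leverage": 2,
    "objectivity": 1,
}

# Rate each candidate organizational model against each criterion.
options = {
    "centralized": {"local_knowledge": "low", "critical_mass": "high",
                    "enterprise_leverage": "high", "objectivity": "high"},
    "embedded":    {"local_knowledge": "high", "critical_mass": "low",
                    "enterprise_leverage": "low", "objectivity": "medium"},
}

def score(ratings):
    """Weighted total of a model's ratings across all criteria."""
    return sum(criteria_weights[c] * SCALE[r] for c, r in ratings.items())

for name, ratings in sorted(options.items(), key=lambda kv: -score(kv[1])):
    print(name, score(ratings))  # centralized 18, then embedded 15
```

The arithmetic is trivial; the value is in forcing the weights and ratings out into the open, where they can be debated on business grounds rather than organizational politics.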

June 12, 2013

Privacy vs. Security Survey Interim Results #prism #analytics

This week, one of the big news items is the disclosure of the NSA's Prism program that collects all sorts of our electronic communications, to help identify terrorists and prevent attacks.

I was struck by three things.  One is the recency bias in the outrage expressed by many people.  Not sixty days ago we were all horrified at the news of the Boston Marathon bombings.  Another is the polarization of the debate.  Consider the contrast the Hullabaloo blog draws between "insurrectionists" and "institutionalists".  The third was the superficial treatment of the tradeoffs folks would be willing to make.  Yesterday the New York Times Caucus blog published the results of a survey that suggested most folks are fence-sitters on the tradeoff between privacy and security, but left it more or less at that.  (The Onion wasn't far behind with a perfect send-up of the ambivalence we feel.)

In sum, biased decision-making based on excessively simplified choices using limited data.  Not helpful. Better would be a more nuanced examination of the tradeoff between the privacy you'd be willing to give up and the potential lives saved.  I see this opportunity to improve decision making a lot, and I thought this would be an interesting example to illustrate how framing and informing an issue differently can help.  So I posted this survey: https://t.co/et0Bs0OrKF

Here are some early results from twelve folks who kindly took it (please feel free to add your answers, if I get enough more I'll update the results):

[Chart: privacy vs. security tradeoff responses]

(Each axis is a seven point scale, 1 at lowest and 7 at highest.  Bubble size = # of respondents who provided that tradeoff as their answer.  No bubble / just label = 1 respondent, biggest bubble at lower right = 3 respondents.)

Interesting distribution, tending slightly toward folks valuing (their own) privacy over (other people's) security.

Now my friend and business school classmate Sam Kinney suggested this tradeoff was a false choice.  I disagreed with him. But the exchange did get me to think a bit further.  More data isn't necessarily linear in its benefits.  It could have diminishing returns of course (as I argued in Pragmalytics) but it could also have increasing value as the incremental data might fill in a puzzle or help to make a connection.  While that relationship between data and safety is hard for me to process, the government might help its case by being less deceptive and more transparent about what it's collecting, and its relative benefits.  It might do this, if not for principle, then for the practical value of controlling the terms of the debate when, as David Brooks wrote so brilliantly this week, an increasingly anomic society cultivates Edward Snowdens at an accelerating clip.

I'm skeptical about the value of this data for identifying terrorists and preventing their attacks.  Any competent terrorist network will use burner phones, run its own email servers, and communicate in code.  But maybe the data surveillance program has value because it raises the bar to this level of infrastructure and process, and thus makes it harder for such networks to operate.

I'm not concerned about the use of my data for security purposes, especially not if it can save innocent boys and girls from losing limbs at the hands of sick whackos.  I am really concerned it might get reused for other purposes in ways I don't approve, or by folks whose motives I don't approve, so I'm sure we could improve oversight, not only for what data gets used how, but of the vast, outsourced, increasingly unaccountable government we have in place. But right now, against the broader backdrop of gridlock on essentially any important public issue, I just think the debate needs to get more utilitarian, and less political and ideological.  And, I think analytically-inclined folks can play a productive role in making this happen.

(Thanks to @zimbalist and @perryhewitt for steering me to some great links, and to Sam for pushing my thinking.)

May 20, 2013

"How to Engage Consumers in a Multi-Platform World?" See you May 22 @APPNATION bootcamp panel in NYC

Sponsorpay's Global Sales SVP Andy Bibby kindly asked me to join his NYC Internet Week APPNATION panel on Wednesday, May 22 2:15-3p at 82 Mercer.  Hope to see you there, watch this space for a recap of the conversation.

May 19, 2013

@nathanheller #MOOCs in The New Yorker: You Don't Need A Weatherman

The May 20th 2013 edition of The New Yorker has an article by Vogue writer Nathan Heller on Massive Online Open Courses (MOOCs) titled "Laptop U: Has the future of college moved online?"  The author explores, or at least raises, a number of related questions.  How (well) does the traditional offline learning experience transfer online?  Is the online learning experience more or less effective than the traditional one? (By what standard? For what material?  What is gained and lost?)  What will MOOCs mean for different colleges and universities, and their faculties?  How will the MOOC revolution be funded?  (In particular, what revenue model will emerge?)

Having worked a lot in the sector, for both public and private university clients, on everything from technology, to online-enabled programs themselves, to analytic approaches, and even marketing and promotion, I found the article a good prompt to try to boil out some ways to think about answering these questions.

The article focuses almost exclusively on Harvard and EdX, the 12-school joint venture through which it's pursuing MOOCs.  Obviously this skews the evaluation.  Heller writes:

Education is a curiously alchemical process. Its vicissitudes are hard to isolate.  Why do some students retain what they learned in a course for years, while others lose it through the other ear over their summer breaks?  Is the fact that Bill Gates and Mark Zuckerberg dropped out of Harvard to revolutionize the tech industry a sign that their Harvard educations worked, or that they failed?  The answer matters, because the mechanism by which conveyed knowledge blooms into an education is the standard by which MOOCs will either enrich teaching in this country or deplete it.

For me, the first step to boiling things out is to define what we mean by -- and want from -- an "education".  So, let's try to unpack why people go to college.  In most cases, Reason One is that you need a degree to get any sort of decent job.  Reason Two is to plug into a network of people -- fellow students, alumni, faculty -- that provide you a life-long community.  Of course you need a professional community for that Job thing, but also because in an otherwise anomic society you need an archipelago to seed friendships, companionships, and self-definition (or at least, as scaffolding for your personal brand: as one junior I heard on a recent college visit put it memorably, "Being here is part of the personal narrative I'm building.")  Reason Three -- firmly third -- is to get an "education" in the sense that Heller describes.  (Apropos: check this recording of David Foster Wallace's 2005 commencement address at Kenyon College.) 

This hierarchy of needs then gives us a way to evaluate the prospects for MOOCs.

If organization X can produce graduates demonstrably better qualified (through objective testing, portfolios of work, and experience) to do job Y, at a lower cost, then it will thrive.  If organization X can do this better and cheaper by offering and/or curating/aggregating MOOCs, then MOOCs will thrive.  If a MOOC can demonstrate an adequately superior result / contribution to the end outcome, and do it inexpensively enough to hold its place in the curriculum, and do it often enough that its edge becomes a self-fulfilling prophecy -- a brand, in other words -- then it will crowd out its competitors, as surely as one plant shuts out the sunlight to another.  Anyone care to bet against Georgia Tech's new $7K Master's in Computer Science?

If a MOOC-mediated social experience can connect you to a Club You Want To Be A Member Of, you will pay for that.  And if a Club That Would Have You As A Member can attract you to its clubhouse with MOOCs, then MOOCs will line the shelves of its bar.  The winning MOOC cocktails will be the ones that best produce the desired social outcomes, with the greatest number of satisfying connections.

Finally, learning is as much about the frame of mind of the student as it is about the quality of the teacher.  If through the MOOC the student is able to choose a better time to engage, and can manage better the pace of the delivery of the subject matter, then the MOOC wins.

Beyond general prospects, as you consider these principles, it becomes clear that it's less about whether MOOCs win than about which ones, for what and for whom, and how.

The more objective and standardized -- and thus measurable and comparable -- the learning outcome and the standard of achievement, the greater the potential for a MOOC to dominate. My program either works, or it doesn't.  

If a MOOC facilitates the kinds of content exchanges that seed and stimulate offline social gatherings -- pitches to VCs, or mock interviewing, or poetry, or dance routines, or photography, or music, or historical tours, or bird-watching trips, or snowblower-maintenance workshops -- then it has a better chance of fulfilling the longings of its students for connection and belonging.  

And, the more well-developed the surrounding Internet ecosystem (Wikipedia, discussion groups, Quora forums, and beyond) is around a topic, the less I need a Harvard professor, or even a Harvard grad student, to help me, however nuanced and alchemical the experience I miss might otherwise have been.  The prospect of schlepping to class or office hours on a cold, rainy November night has a way of diluting the urge to be there live in case something serendipitous happens.

Understanding how MOOCs win then also becomes a clue to understanding potential revenue models.  

If you can get accredited to offer a degree based in part or whole on MOOCs, you can charge for that degree, and get students or the government to pay for it (Exhibit A: University of Phoenix).  That's hard, but as a variant of this, you can get hired by an organization, or a syndicate of organizations you organize, to produce tailored degree programs -- think corporate training programs on steroids -- that use MOOCs to filter and train students.  (Think "You, Student, pay for the 101-level stuff; if you pass you get a certificate and an invitation to attend the 201-level stuff that we fund; if you pass that we give you a job.")

Funding can come directly, or be subsidized by sponsors and advertisers, or both.  

You can try to charge for content: if you produce a MOOC that someone else wants to include in a degree-based program, you can try to license it, in part or in whole.  

You can make money via the service angle, the way self-publishing firms support authors, with a variety of best-practice-based production services.  Delivery might be offered via a freemium model -- the content might be free, but access to premium groups, with teaching assistant support, might come at a price.  You can also promote MOOCs -- build awareness, drive distribution, even simply brand -- for a cut of the action, the way publishers and event promoters do.

Perhaps in the not-too-distant future we'll get the Academic Upfront, in which Universities front a semester's worth of classes in a MOOC, then pitch the class to sponsors, the way TV networks do today.  Or, maybe the retail industry offers a window into how MOOCs will be monetized.  Today's retail environment is dominated by global brands (think professors as fashion designers) and big-box (plus Amazon) firms that dominate supply chains and distribution networks.  Together, Brands and Retailers effectively act as filters: we make assumptions that the products on their shelves are safe, effective, reasonably priced, acceptably stylish, well-supported.  In exchange, we'll pay their markup.  This logic sounds a cautionary note for many schools: boutiques can survive as part of or at the edges of the mega-retailers' ecosystems, but small-to-mid-size firms reselling commodities get crushed.

Of course, these are all generic, unoriginal (see Ecclesiastes 1:9) speculations.  Successful revenue models will blend careful attention to segmenting target markets and working back from their needs, resources, and processes (certain models might be friendlier to budgets and purchasing mechanisms than others) with thoughtful in-the-wild testing of the ideas.  Monolithic executions with Neolithic measurement plans ("Gee, the focus group loved it, I can't understand why no one's signing up for the paid version!") are unlikely to get very far.  Instead, be sure to design with testability in mind (make content modular enough to package or offer a la carte, for example).  Maybe even use Kickstarter as a lab for different models!

PS Heller's brilliant sendup of automated essay grading

Postscript:

The MOOC professor perspective, via the Chronicle, March 2013


May 16, 2013

Need #Data

Word cloud based on notes from a workshop not too long ago:


May 10, 2013

Book Review: Converge by @rwlord and @rvelez #convergebook

I just finished reading Converge, the new book on integrating technology, creativity, and media by Razorfish CEO Bob Lord and his colleague Ray Velez, the firm’s CTO.  (Full disclosure: I’ve known Bob as a colleague, former boss, and friend for more than twenty years and I’m a proud Razorfish alum from a decade ago.)

Reflecting on the book I’m reminded of the novelist William Gibson’s famous comment in a 2003 Economist interview that “The future’s already here, it’s just not evenly distributed.”  In this case, the near-perfect perch that two already-smart guys have on the Digital Revolution and its impact on global brands has provided them a view of a new reality most of the rest of us perceive only dimly.

So what is this emerging reality?  Somewhere along the line in my business education I heard the phrase, “A brand is a promise.”  Bob and Ray now say, “The brand is a service.”  In virtually all businesses that touch end consumers, and extending well into relevant supply chains, information technology has now made it possible to turn what used to be communication media into elements of the actual fulfillment of whatever product or service the firm provides.  

One example they point to is Tesco's virtual store format, in which images of stocked store shelves are projected on the wall of, say, a train station, and commuters can snap the QR codes on the yogurt or quarts of milk displayed and have their order delivered to their homes by the time they arrive there: Tesco's turned the billboard into your cupboard.  Another example they cite is Audi City, the Kinect-powered configurator experience through which you can explore and order the Audi of your dreams.  As the authors say, "marketing is commerce, and commerce is marketing."

But Bob and Ray don’t just describe, they also prescribe.  I’ll leave you to read the specific suggestions, which aren’t necessarily new.  What is fresh here is the compelling case they make for them; for example, their point-by-point case for leveraging the public cloud is very persuasive, even for the most security-conscious CIO.  Also useful is their summary of the Agile method, and of how they’ve applied it for their clients.

Looking more deeply, the book isn’t just another surf on the zeitgeist, but is theoretically well-grounded.  At one point early on, they say, “The villain in this book is the silo.”  On reading this (nicely turned phrase), I was reminded of the “experience curve” business strategy concept I learned at Bain & Company many years ago.  The experience curve, based on the idea that the more you make and sell of something, the better you (should) get at it, describes a fairly predictable mathematical relationship between experience and cost, and therefore between relative market share and profit margins.  One of the ways you can maximize experience is through functional specialization, which of course has the side effect of encouraging the development of organizational silos.  A hidden assumption in this strategy is that customer needs and associated attention spans stay pinned down and stable long enough to achieve experience-driven profitable ways to serve them.  But in today’s super-fragmented, hyper-connected, kaleidoscopic marketplace, this assumption breaks down, and the way to compete shifts from capturing experience through specialization, to generating experience “at-bats” through speedy iteration, innovation, and execution.  And this latter competitive mode relies more on the kind of cross-disciplinary integration that Bob and Ray describe so richly.

The book is a quick, engaging read, full of good stories drawn from their extensive experiences with blue-chip brands and interesting upstarts, and with some useful bits of historical analysis that frame their arguments well (in particular, I liked their exposition of the television upfront).  But maybe the best thing I can say about it is that it encouraged me to push harder and faster to stay in front of the future that's already here.  Or, as a friend says, "We gotta get with the '90's, they're almost over!"

(See this review and buy the book on Amazon.com)


April 10, 2013

Fooling Around With Google App Engine @googlecloud

A simple experiment: the "Influence Reach Factor" Calculator. (Um, it just multiplies two numbers together.  But that's beside the point, which was to sort out what it's like to build and deploy an app to Google's App Engine, their cloud computing service.)

Answer: pretty easy.  Download the App Engine SDK.  Write your program (mine's in Python, code here, be kind, props and thanks to Bukhantsov.org for a good model to work from).  Deploy to GAE with a single click.
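For the curious, the core of such an app really is tiny.  Here's a minimal sketch in Python; the function name and parameters are my guesses (the post only says it multiplies two numbers), and the commented-out handler shows roughly how the App Engine SDK of that era (which bundled the webapp2 framework) would expose it on the web:

```python
# Hypothetical core of the "Influence Reach Factor" calculator:
# the whole "calculation" is a single multiplication.
def influence_reach_factor(reach, amplification):
    """Return the product of the two inputs."""
    return reach * amplification

# Wiring it to the web in the 2013-era App Engine Python runtime took
# only a small handler like this (webapp2 came bundled with the SDK):
#
#   import webapp2
#
#   class Calculator(webapp2.RequestHandler):
#       def get(self):
#           reach = float(self.request.get("reach", 0))
#           amp = float(self.request.get("amplification", 0))
#           self.response.write(str(influence_reach_factor(reach, amp)))
#
#   app = webapp2.WSGIApplication([("/", Calculator)])

print(influence_reach_factor(3, 4))  # -> 12
```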

By contrast, let's go back to 1999.  As part of getting up to speed at ArsDigita, I wanted to install the ArsDigita Community System (ACS), an open-source application toolkit and collection of modules for online communities.  So I dredged up an old PC from my basement, installed Linux, then Postgres, then AOLServer, then configured all of them so they'd welcome ACS when I spooled it up (oh so many hours RTFM-ing to get various drivers to work).  Then once I had it at "Hello World!" on localhost, I had to get it networked to the Web so I could show it to friends elsewhere (this being back in the days before the cable company shut down home-served websites).  

At which point, cue the Dawn Of Man.

Later, I rented servers from co-los. But I still had to worry about whether they were up, whether I had configured the stack properly, whether I was virus-free or enrolled as a bot in some army of darkness, or whether demand from the adoring masses was going to blow the capacity I'd signed up for. (Real Soon Now, surely!)

Now, Real Engineers will say that all of this served to educate me about how it all works, and they'd be right.  But unfortunately it also crowded out the time I had to learn about how to program at the top of the stack, to make things that people would actually use.  Now Google's given me that time back.

Why should you care?  Well, isn't it the case that you read everywhere about how you, or at least certainly your kids, need to learn to program to be literate and effective in the Digital Age?  And yet, like Kubrick's monolith, it all seems so opaque and impenetrable.  Where do you start?  One of the great gifts I received in the last 15 years was to work with engineers who taught me to peel it back one layer at a time.  My weak effort to pay it forward is this small, unoriginal advice: start by learning to program using a high-level interpreted language like Python, and by letting Google take care of the underlying "stack" of technology needed to show your work to your friends via the Web.  Then, as your functional or performance needs demand (which for most of us will be rarely), you can push to lower-level "more powerful" (flexible but harder to learn) languages, and deeper into the stack.

April 08, 2013

From Big Data to Bigger Results: Focus on Ecosystemic Conditions for Analytics ROI

My guest post on the MITX.org blog

April 06, 2013

Dazed and Confused #opensource @perryhewitt @oreillymedia @roughtype @thebafflermag @evgenymorozov

Earlier today, my friend Perry Hewitt pointed me to a very thoughtful essay by Evgeny Morozov in the latest issue of The Baffler, titled "The Meme Hustler: Tim O'Reilly's Crazy Talk".  

A while back I worked at a free software firm (ArsDigita, where early versions of the ArsDigita Community System were licensed under GPL) and was deeply involved in developing an "open source" license that balanced our needs, interests, and objectives with our clients' (the ArsDigita Public License, or ADPL, which was closely based on the Mozilla Public License, or MPL).  I've been to O'Reilly's conferences (<shameless> I remember a ~20-person 2001 Birds-of-a-Feather session in San Diego with Mitch Kapor and pre-Google Eric Schmidt on commercializing open source </shameless>).  Also, I'm a user of O'Reilly's books (currently have Charles Severance's Using Google App Engine in my bag).  So I figured I should read this carefully and have a point of view about the essay.  And despite having recently read Nicholas Carr's excellent and disturbing 2011 book The Shallows about how dumb the Internet has made me, I thought nonetheless that I should brave at least a superficial review of Morozov's sixteen-thousand-word piece.

To summarize: Morozov describes O'Reilly as a self-promoting manipulator who wraps and justifies his evangelizing of Internet-centered open innovation in software, and more recently government, in a Randian cloak sequined with Silicon Valley rhinestones.  My main reaction: "So, your point would be...?" More closely:

First, there's what Theodore Roosevelt had to say about critics. (Accordingly, I fully cop to the recursive hypocrisy of this post.) If, as Morozov says of O'Reilly, "For all his economistic outlook, he was not one to talk externalities..." then Morozov (as most of my fellow liberals do) ignores the utility of motivation.  I accept and embrace that with self-interest and the energy to pursue it, more (ahem, taxable) wealth is created.  So when O'Reilly says something, I don't reflexively reject it because it might be self-promoting; rather, I first try to make sure I understand how that benefits him, so I can better filter for what might benefit me. For example, Morozov writes:

In his 2007 bestseller Words That Work, the Republican operative Frank Luntz lists ten rules of effective communication: simplicity, brevity, credibility, consistency, novelty, sound, aspiration, visualization, questioning, and context. O’Reilly, while employing most of them, has a few unique rules of his own. Clever use of visualization, for example, helps him craft his message in a way that is both sharp and open-ended. Thus, O’Reilly’s meme-engineering efforts usually result in “meme maps,” where the meme to be defined—whether it’s “open source” or “Web 2.0”—is put at the center, while other blob-like terms are drawn as connected to it.
Where Morozov offers a warning, I see a manual! I just have to remember my obligation to apply it honestly and ethically.

Second, Morozov chooses not to observe that if O'Reilly and others hadn't broadened the free software movement into an "open source" one that ultimately offered more options for balancing the needs and rights of software developers with those of users (who themselves might also be developers), we might all still be in deeper thrall to proprietary vendors.  I know from first-hand experience that the world simply was not and is still not ready to accept GPL as the only option.

Nonetheless, good on Morozov for offering this critique of O'Reilly.  Essays like this help keep guys like O'Reilly honest, as far as that's necessary.  They also force us to think hard about what O'Reilly's peddling -- a responsibility that should be ours.  I used to get frustrated by folks who slapped the 2.0 label on everything, to the point of meaninglessness, until I appreciated that the meme and its overuse drove me to think and presented me with an opportunity to riff on it.  I think O'Reilly and others like him do us a great service when they try to boil down complexities into memes.  The trick for us is to make sure the memes are the start of our understanding, not the end of it.

March 26, 2013

Financial Services Program Management Consulting Opportunity

We're currently working with a leading investment management firm to help deploy and refine a new retirement guidance process and related tools.  As part of this, we're helping our client find a freelance project/business manager with broad new venture launch experience (not just management of a software development project, but coordination of promotional and operational aspects as well) for the balance of 2013.  We would refer interested candidates to contract directly with our mid-Atlantic region client.  (The work would be largely on-site.)

About the role:

  • Responsibilities
    • Define and maintain granular and integrated plan for this initiative
      • Granular = day by day as needed/weekly calendar
      • Integrated = development, promotion, operational (channel) integration, etc.  NOT just development; will closely coordinate with existing project / release management on the development team
    • Track and report progress against this plan for a variety of audiences and uses
      • Includes learning and training other team members on necessary information interfaces for principal program metrics
    • Identify program dependencies, coordination requirements, delays, and resource needs in partnership with Development/Product/Promotion and Channel Integration Leaders
    • Develop and recommend options for resolving challenges
    • Work with finance staff to track spending vs. budget
    • Coordinate with external experience design vendors as needed to support the program
    • Prepare / conduct / debrief regular team meetings (agendas, follow up notes)
    • Maintain online workspace and necessary documents to support program operations
  • Qualifications and Experience
    • 2-4 years prior program management experience with efforts of this scale
    • Understanding of/experience with software product development and promotion
    • Broad experience as a business manager preferred
    • Organized, disciplined, detail-oriented – demonstrated through prior similar efforts
    • Formal program management training ideal
  • Organizational Role
    • Reports to Program Leader
    • Peer to IT Tech Development and Business-side Product and Promotion Leaders
    • Partner with other groups as needed on integration, analytics and other topics

If you're interested, please fill out the short form below, or please pass this on to someone you know who might be a good fit!  Thanks.

 

January 09, 2013

My New Book: Pragmalytics

I've written a short book.  It's called "Pragmalytics: Practical Approaches to Marketing Analytics in the Digital Age".  It's a collection and synthesis of some of the things I've learned over the last several years about how to take better advantage of data (Big and little) to make better marketing decisions, and to get better returns on your investments in this area.  

The main point of the book is the need for orchestration.  I see too much of the focus today on "If we build It (the Big Data Machine, with some data scientist high priests to look after it), good things will happen."  My experience has been that you need to get "ecosystemic conditions" in balance to get value.  You need to agree on where to focus.  You need to get access to the data.  You need to have the operational flexibility to act on any insights.  And, you need to cultivate an "analytic marketer" mindset in your broader marketing team that blends perspectives, rather than cultivating an elite but blinkered cadre of "marketing analysts".  Over the next few weeks, I'll further outline some of what's in the book in a few posts here on my blog.

I'm really grateful to the folks who were kind enough to help me with the book.  The list includes: Mike Bernstein, Tip Clifton, Susan Ellerin, Ann Hackett, Perry Hewitt, Jeff Hupe, Ben Kline, Janelle Leonard, Sam Mawn-Mahlau, Bob Neuhaus, Judah Phillips, Trish Gorman Clifford, Rob Schmults, Michelle Seaton, Tad Staley, and my business partner, Jamie Schein.  As I said in the book, if you like any of it, they get credit for salvaging it.  The rest -- including several bits that even on the thousandth reading still aren't as clear as they should be, plus a couple of typos I need to fix -- are entirely my responsibility.

I'm also grateful to the wonderful firms and colleagues and clients I've had the good fortune to work for and with.  I've named the ones I can, but in general have erred on the side of respecting their privacy and confidentiality where the work isn't otherwise in the public domain.  To all of them: Thank You!

This field is evolving quickly in some ways, but there are also some timeless principles that apply to it.  So, there are bits of the book that I'm sure won't age well (including some that are already obsolete), but others that I hope might.  While I'm not one of those coveted Data Scientists by training, I'm deep into this stuff on a regular basis at whatever level is necessary to get a positive return from the effort.  So if you're looking for a book on selecting an appropriate regression technique, or tuning Hadoop, you won't find that here, but if you're looking for a book about how to keep all the balls in the air (and in your brain), it might be useful to you.  It's purposefully short -- about half the length of a typical business book.  My mental model was to make it about as thick as "The Elements of Style", since that's something I use a lot (though you probably won't think so!).  Plus, it's organized so you can jump in anywhere and snack as you wish, since this stuff can be toxic in large doses.

In writing it amidst all the Big Data craziness, I was reminded of Gandhi's saying (paraphrased) "First they ignore you... then they fight you, then you win."  Having been in the world of marketing analytics now for a while, it seems appropriate to say that "First they ignore you, then they hype you, then you blend in."  We're now in the "hype" phase.  Not a day goes by without some big piece in the media about Big Data or Data Scientists (who have now hit the highly symbolic "$300k" salary benchmark -- and the last time we saw that number, in the online ad sales world in the middle of the last decade, it was a sell signal, BTW).  "Pragmalytics" is more about the "blend in" phase, when all this "cool" stuff is more a part of the furniture that needs to work in harmony with the rest of the operation to make a difference.

"Pragmalytics" is available via Amazon (among other places).  If you read it please do me a favor and rate and review it, or even better, please get in touch if you have questions or suggestions for improving it.  FWIW, any earnings from it will go to Nashoba Learning Group (a school for kids with autism and related disorders).

Where it makes sense, I'd be very pleased to come talk to you and your colleagues about the ideas in the book and how to apply them, and possibly to explore working together.  Also, in a triumph of Hope over Experience, my next book (starting now) will be a collection and synthesis of interviews with other senior marketing executives trying to put Big Data to work.  So if you would be interested in sharing some experiences, or know folks who would, I'd love to talk.

About the cover:  it's meant to convey the harmonious convergence of "Mars", "Venus", and "Earth" mindsets: that is, a blend of analytic acuity, creativity and communication ability, and practicality and results-orientation that we try to bring to our work. Fellow nerds will appreciate that it's a Cumulative Distribution Function where the exponent is, in a nod to an example in the book, 1.007.

 

 

October 31, 2012

Today's Data Exercise: The @fivethirtyeight / Intrade Presidential Election Arbitrage #Analytics

(Nerd alert!  You have been warned.)

Unoriginally, I'm a big fan of Nate Silver's fivethirtyeight blog.  I've learned a ton from him (currently also reading his book The Signal and the Noise).  For a little while now I've been puzzling over the relationship between his "Nowcast" on the presidential election and the price of Obama 2012 contracts at Intrade.  Take a look at this chart I made based on the data from each of these sources:

Obama - 538 vs Intrade October 2012

If we look past Obama's disastrous first debate and consider the difference between the seven-day moving averages of the 538 Obama win probability and the Intrade Obama 2012 contract price, it fluctuates roughly around 10-15 points, call it 12.  Also, looking at the volumes, it looks like the heaviest trading happens around midweek, before Friday.  So if you trust Nate's projections, and unless you've got inside scoop about any big negative surprises to come, the logical thing to do is to buy Obama 2012s tomorrow, with an expected profit of about $1.20 on each contract (about a 20% gain).
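To make that arithmetic concrete, here's a sketch of the moving-average spread calculation in Python.  The numbers are made-up illustrations, not the actual 538 or Intrade series:

```python
# Sketch of the spread arithmetic, on illustrative made-up numbers
# (the real inputs were the 538 "Nowcast" win probability and the
# Intrade Obama 2012 contract price, by day).
def moving_average(series, window=7):
    """Trailing moving average; shorter windows at the start of the series."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

nowcast = [72, 74, 75, 77, 79, 78, 80]   # hypothetical 538 win %, by day
intrade = [60, 61, 62, 64, 66, 65, 67]   # hypothetical contract price, points

spread = moving_average(nowcast)[-1] - moving_average(intrade)[-1]
gain_pct = 100 * spread / moving_average(intrade)[-1]
# Here the spread comes out to about 12.9 points; buying at the Intrade
# price and settling at the 538 probability would clear roughly 20%.
```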

Now for the nerdy part:

First, the easy job: Intrade lets you download historical prices on its contracts.

Next, the harder job: Nate doesn't provide a .csv of his data.  But if you "view source" on his page, you'll see a file called:

"http://graphics8.nytimes.com/packages/html/1min/elections/2012/fivethirtyeight/fivethirtyeight-ccol-top.js"

right after a preceding description "Data URL".

If you take a look at this file, you'll notice it's JavaScript-chart-friendly, but for the kind of analysis above, not so much.  The first order of business was to cut out the stuff I didn't want, like the Senate race data and the forecast portion of the presidential polls.  Then I further whacked out data before 10/1, because I thought examining trends in a more thinly-traded market would be less relevant.

For a little while I fiddled with the Stanford Visualization Group's Data Wrangler tool to reshape the remaining data into the .csv I needed.  It's a powerful tool, but it turned out to be easier in this case to wrangle the file structure I wanted manually:

"date","obama_votes","romney_votes","obama_win_pct","romney_win_pct","obama_pop_vote","romney_pop_vote"

"2012-10-30",298.8,239.2,79.5,20.5,50.4,48.6

"2012-10-29",294.4,243.6,75.2,24.8,50.2,48.8

etc.

Combining the Intrade and 538 data and then plotting the Intrade close and the "Obama win pct" series results in the chart above.
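For anyone who wants to reproduce the join, here's a stdlib-only Python sketch.  The 538 rows are the ones shown above; the Intrade closes are hypothetical stand-ins for the downloaded history, which I'm not reproducing here:

```python
# Index the Intrade closes by date, then compute the "538 minus
# Intrade" series that gets plotted against the close.
import csv
import io

five38_csv = io.StringIO(
    '"date","obama_votes","romney_votes","obama_win_pct","romney_win_pct",'
    '"obama_pop_vote","romney_pop_vote"\n'
    '"2012-10-30",298.8,239.2,79.5,20.5,50.4,48.6\n'
    '"2012-10-29",294.4,243.6,75.2,24.8,50.2,48.8\n')

# Hypothetical stand-in for the downloaded Intrade close prices.
intrade_close = {"2012-10-29": 63.0, "2012-10-30": 64.1}

spread = {}
for row in csv.DictReader(five38_csv):
    if row["date"] in intrade_close:
        spread[row["date"]] = (float(row["obama_win_pct"])
                               - intrade_close[row["date"]])

print(spread)  # per-date 538-minus-Intrade spreads, about 15.4 and 12.2 points
```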