So healthcare.gov launched, with problems. I'm trying to understand why, so I can apply some lessons in my professional life. Here are some ideas.
First, I think it helps to define some levels of the problem. I can think of three:
1. Strategic / policy level -- what challenges do the goals we set create? In this case, the objective is basically twofold: first, reduce the costs of late-stage, high-cost uncompensated care by enrolling the people who ultimately use it (middle-aged poor folks and other unfortunates) in health insurance that will get them care earlier and reduce stress / improve outcomes (for them and for society) later; second, reduce the cost of that insurance through exchanges that drive competition. So, basically, the program brings a bunch of folks from, in many cases, the wrong side of the Digital Divide, and exposes them to a pile of eligibility- and choice-driven complexity (proof: the need for "Navigators"). Hmm. (Cue the folks who say that's why we need a simple single-payer model, but the obvious response is that it simply wasn't politically feasible. We need to play the cards we're dealt.)
2. Experience level -- In light of that need, let's examine what the government did for each of the "Attract / Engage / Convert / Retain" phases of a Caveman User Experience. It did promote the ACA -- arguably insufficiently, or not creatively enough to distinguish itself from the opposing signal it should have anticipated (one take here). But more problematically, from what I can tell, the program skips "Engage" and jumps straight to "Convert": healthcare.gov immediately asks you to "Apply Now" (see screenshot below, where "Apply Now" is featured more prominently than "Learn More", even on the site's "Learn" tab). This is technically problematic (see #3 below), but it's also experientially a lot to ask when you don't yet know what's behind the curtain.
3. Technical level -- There's an excellent piece in the Washington Post by Timothy B. Lee. Basically, the system tries to do an eligibility check (for participation and subsidies) before sending you on to enrollment, and doing that requires checking a bunch of other government systems. The flowchart explains very clearly why this could be problematic. There are some front-end problems as well, described in rawest form in some of the chatter on Reddit, but from what I've seen these are more superficial, a function of poor process and time management, and fixable.
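Why chaining synchronous checks across several systems is fragile can be made concrete with a back-of-the-envelope sketch. This is my own illustration, not from the article -- the availability figures and the count of upstream systems are invented for the sake of the arithmetic:

```python
# If the eligibility check must synchronously query several upstream systems
# before a user can proceed, the whole chain is only as available as the
# product of its parts.

def chain_availability(availabilities):
    """End-to-end availability of a serial chain of required calls."""
    result = 1.0
    for a in availabilities:
        result *= a
    return result

# Five upstream systems, each individually "pretty good" at 95% uptime,
# still yield an end-to-end success rate well under 80%:
print(round(chain_availability([0.95] * 5), 3))  # → 0.774
```

Which is one reason front-loading that check, before the user has even seen a plan, puts so much stress on the launch.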
Second, here are some things HHS might do differently:
1. Strategic level: Sounds like some segmentation of the potential user base would have suggested a much greater investment in explanation / education, in advance of registration. Since any responsible design effort starts with users and use cases, I'm sure they did this. But what came out the other end doesn't seem to reflect that. What bureaucratic or political considerations got in the way, and what can be revisited, to improve the result? Or, instead of allowing political hacks to infiltrate and dominate the ranks of engineers trying to design a service that works, why not embed competent technologists, perhaps drawn from the ranks of Chief Digital Officers, into the senior political ranks, to advise them on how to get things right online?
2. Experience level: Perhaps the first couple of levels of experience on healthcare.gov should have been explanatory: "Here's what to expect, here's how this works..." Maybe video (they could have used YouTube!). Maybe also ask a couple of quick anonymous questions to determine whether the eligibility / subsidy check is even relevant, to spare the load on that engine, before showing what plans might be available, at what price. You could always re-ask / confirm that data later, once the user's past the shopping / evaluation stage, before formally enrolling them in a plan. In ecommerce, we don't ask untargeted shoppers to enter discount codes until they're about to check out, right?
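The "quick anonymous questions" idea might look something like the sketch below. Everything here is hypothetical -- the question fields, the thresholds, and the cutoff formula are invented stand-ins, not actual ACA eligibility rules:

```python
# Hypothetical client-side pre-screen: decide locally whether it's worth
# calling the expensive, fragile eligibility / subsidy engine at all,
# and let everyone else go straight to browsing plans.

def needs_subsidy_check(household_size, annual_income, has_employer_coverage):
    """Cheap anonymous pre-screen before the real eligibility service."""
    if has_employer_coverage:
        # Likely ineligible for exchange subsidies; just browse plans.
        return False
    # Invented stand-in for "income below ~400% of a poverty-line figure
    # that scales with household size" -- illustrative numbers only.
    rough_poverty_line = 11_500 + 4_000 * household_size
    return annual_income < rough_poverty_line * 4

# A shopper who clearly won't qualify skips the backend check entirely:
print(needs_subsidy_check(2, 250_000, False))  # → False
```

The point isn't the thresholds; it's that a few local questions can shed most of the load from the backend chain while still letting the site confirm real data at enrollment time.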
3. I see from the page source that they have Google Tag Manager running, so perhaps they have Google Analytics running too, alongside whatever else... Since they've open-sourced the front-end code and their content on GitHub, maybe they could also share what they're learning via GA, so we could evaluate ideas for improving the site in the context of that data?
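For the curious, spotting GTM in page source is easy: the standard snippet references googletagmanager.com with a container id of the form GTM-XXXXXX. A minimal sketch (the HTML fragment below is made up, not healthcare.gov's actual source):

```python
import re

def find_gtm_containers(html):
    """Return Google Tag Manager container ids referenced in a page's HTML."""
    return sorted(set(re.findall(r"GTM-[A-Z0-9]+", html)))

# Made-up fragment resembling the standard GTM loader snippet:
sample = '<script src="//www.googletagmanager.com/gtm.js?id=GTM-ABC123"></script>'
print(find_gtm_containers(sample))  # → ['GTM-ABC123']
```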
Postscript, November 1:
Food for thought (scroll to bottom). How does this happen? Software engineer friends, please weigh in!