A year ago, while doing research for my new book Marketing and Sales Analytics, I wrote a post titled "Organizing For #Analytics -- Seven Considerations". I argued that you should "think organization last, not first", and presented, well, seven considerations that would shape the structure you end up with.
But I made an important mistake in writing that post: enquiring minds still want the boxes. So, based on the 15 case studies I presented in the book, as well as other experiences over the past decade with even more firms, this post offers a "reference structure" and suggests how it might be tweaked under different circumstances.
In constructing an analytic organization, you've got two objectives that should guide you. One is answering questions. The other is answering them better the next time.
To answer questions, once again you've got two things to balance. One is "business intimacy" -- how close you need to be to the business you're serving to understand what's going on, so your analysis is grounded in that reality. By what's going on, I mean things like terminology, the particular patterns and quirks in the data the business generates, and the options for acting on insights that the operational infrastructure, people, and process of the business make possible.
Balanced against business intimacy is "analytic capability". This means having a critical mass of skills for supporting decisions appropriately. If a decision requires a sophisticated model using lots of data, for example, having a bunch of business-intimate MBAs who last ran a regression a few years ago in business school using Excel might not cut it. So you might need one or more specialists who can do that sort of thing. Or, if it requires specialized knowledge about, say, how digital analytics work, you might need folks who know about things like tag management systems, or inferred matching across devices for multi-screen tracking.
In implementing these two requirements, you then face -- wait for it -- another two choices. One is how much to distribute into the field, or into business units, or aligned to functions, versus how much to centralize. The second is how much to separate analysis from reporting. With respect to the former, a lot depends on how different or similar the needs in each business / field location are (the more different, the closer you want support to them, and the less leverage there is from centralization anyway). As for the latter, my thinking and research have suggested that separate "report generation" groups are a bad idea: they encourage conveying data rather than insight, and thus the development of a second-class citizenry in analytics groups; they disconnect report generation from report usage, resulting in the proverbial pallets of unused reports that consultants on cost-hunts feast on; and they discourage automating reporting (because most folks don't like to work themselves out of a job).
To answer questions better the next time, you need some folks thinking about everything from whether there are better techniques and associated tools, to whether there's more useful data, and how to better organize and provision the data you've already got. There are going to be -- yes, of course -- two options for how to proceed here. One is a "rapid prototyping" approach, and the other is a more traditional "waterfall" method.
Wrapped around all this is "governance", only with a capital "G". In most cases, governance is applied to data, but not to prioritizing the questions and decisions that analysis will answer and support. Capital-G governance also sets analytic priorities, and recruits, develops, and evaluates senior analytic leadership.
So, here's your org chart:
OK, now that you've got what you came for, here's the price: you have to consider the design above not an immutable multi-year reality, but rather a delicate, dynamic compromise that balances the objectives and tensions mentioned above. Where a group should live, as the footnotes suggest, and how relatively large each group becomes should be functions of a constantly evolving portfolio of opportunities and challenges that your governance process determines. Conceiving of this as a dynamic compromise rather than a static chart prevents a pattern repeated all too often: a guardrail-to-guardrail swing between "centralize versus distribute" and "locate in the business versus locate in IT (or finance)", where each successive move trashes the logic of the last. But that's a topic for another post...
What does your chart look like? Why? How often has it evolved, and how often does it still?