Data scientists to CEOs: You can’t handle the truth
Too many big data initiatives fail because companies, top to bottom, aren’t committed to the truth in analytics. Let me explain.
In January 2015, the Economist Intelligence Unit (EIU) and Teradata (full disclosure: also my employer) released the results of a major study aimed at identifying how businesses that are successful at being data-driven differ from those that are not.
Among its many findings, there were some particularly troubling, “code red” results that revealed CEOs seem to have a rosier view of a company’s analytics efforts than directors, managers, analysts, and data scientists. For example, EIU found that CEOs are more likely to think that employees extract relevant insights from the company’s data – 38 percent of them hold this belief, as compared to 24 percent of all respondents and only 19 percent of senior vice presidents, vice presidents, and directors. Similarly, 43 percent of CEOs think relevant data are captured and made available in real time, compared to 29 percent of all respondents.
So why is there such a disconnect? It turns out the answer is much more human than the size of a company's data coffers, or the technology stockpiled to analyze it. Big data initiatives stumble over biases, bad assumptions, and the failure – or fear – of letting the data speak for itself. As insights make their way up the corporate ladder, from the data scientist to the CEO, the truth in analytics is lost along the way. And this leads to a cumulative effect of unintended consequences.
Communicate the Known-Unknowns to Your CEO
Take the idea of known risks, for example. In analytics, you always have to make some assumptions because the data hardly ever paints a complete picture. So, you have to identify and rank those risks to understand what might happen when assumptions go wrong. In some cases, the risks aren’t tied to big consequences. But, in other cases, it can be devastating.
Look at the stock market crash of 2008. A whole host of people made a simple and logical assumption that home prices would only go up. But most analysts didn't experiment enough with what would happen if prices actually fell. Well, now we know what would happen. It was almost a global calamity. The people investing in the pre-crisis housing bubble were working on an assumption that was flawed on many levels. And very few people considered, or realized, the risk until it was too late.
The same thing happens, at generally smaller scales, in businesses. The CEO doesn’t have a clear view of risk. It is up to the data scientists, business analysts and their managers to make the CEO well aware of the risk in assumptions. The CEO needs to understand that there is a critical, level 1 risk in assumptions – in the housing example, if prices were to go down, this whole thing falls apart. Even if that risk is unlikely, at least it is on the table. Many people are uncomfortable discussing such negatives with senior executives and many senior executives don’t like to hear it. But to succeed, everyone must get past that hurdle.
Get Past the Culture of Fear of the Truth
Then there is the fear of the truth, with a bit of cognitive bias thrown in. For example, salespeople asked for their forecast, even when armed with data on historical performance and current pipeline, are often unsure whether they will hit their number. But, typically, they'll tell the VP of sales they will hit their forecasts – unless, of course, a miss is very apparent. They share the information they're expected to share, and withhold any acknowledgement that the numbers are malleable.
The problem arises in the aggregate: The VP gets a rosy picture from five salespeople on her team, even though they all have serious doubts, so she puts that assumption in and the data rolls up to the CEO, or CFO. In reality, the metric is underpinned by a huge amount of doubt. The truth is buried under the fear of losing one's job and the cultural expectation that the goal will be met. Failure is not an option. However, while it is likely several of the salespeople will manage to hit their number, the chance that they all will is small. This makes the VP's figures even more unrealistic than the initial estimates.
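The arithmetic behind that last point is worth making explicit. As a rough sketch with invented numbers (the 70% figure is purely illustrative, not from the EIU study): even if each individual forecast is fairly likely to be hit, the probability that an entire team hits simultaneously shrinks fast.

```python
# Illustrative only: probabilities are made up to show the compounding effect.
# Assume each of five salespeople independently has a 70% chance of
# hitting their individual forecast.
p_hit = 0.70
team_size = 5

# Probability that every one of them hits their number at once.
p_all_hit = p_hit ** team_size

print(f"P(one rep hits):  {p_hit:.0%}")
print(f"P(all {team_size} hit):    {p_all_hit:.1%}")  # about 17%
```

So a VP who rolls up five "we'll hit it" answers is implicitly betting on a roughly one-in-six outcome, under these assumed odds, while reporting it upward as near-certain.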
So what happens? Everyone is shocked when the company misses its forecast. This is an example of where people sugarcoat a little at the low end, and the cumulative effect leads to the business incorrectly forecasting company-wide results.