Anders Liu-Lindberg in Professions, Workers, Careers, Directors and Executives, Accountants · Senior Finance Business Partner • Maersk Line A/S · 10/11/2016 · 3 min read · 3.3K

Death By Data Or The Death Of Data?

Data is dead! Or at least that’s what a lot of people will tell you in the wake of Donald Trump’s victory, and Brexit before that. We used to put a lot of trust in the data provided to us by large polling agencies and other providers of statistics; now it’s all changed. We’ve turned from data believers, even data junkies, into data skeptics, and it makes us doubt whether we can trust information from any so-called trustworthy source.

So how come data believers are so certain they’re right?

In the now-concluded US election, everyone, and I mean everyone, was certain that Hillary Clinton would win. If you live abroad like me, you went to bed feeling certain about this outcome and woke up to a different reality. It was exactly the same with Brexit. This time, though, you even had highly respected professionals saying they would eat a bug if Donald Trump came anywhere close to challenging Hillary, never mind winning.

Now that’s believing in your numbers, but what about statistical uncertainty? Everyone with a bit of statistical insight knows that the smaller your sample size, the larger the degree of uncertainty, and when you do polling you don’t ask everyone. So it was always within the statistical uncertainty that Trump could win. But you mine the data, compile different data sources, compare with historical events, and there’s your guess. The more you believe in it, and the more you’re willing to stake if you’re wrong, supposedly the more people will believe in you. Yet as long as a different outcome is within the statistical uncertainty, you probably shouldn’t be too categorical about your prediction. Of course, it’s not the first time someone has been wrong about a political outcome, or any outcome for that matter. Once in Denmark, a party in parliament had sunk to 0.0% in the polls, which led one commentator to promise he would eat his old hat if the party made it into parliament again. Well, here he is eating his hat.
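The back-of-the-envelope math on polling uncertainty is simple. As a rough sketch (assuming a simple random sample and the standard normal approximation, which real pollsters refine considerably), the 95% margin of error on a polled proportion is about 1.96 × √(p(1−p)/n):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample
    of size n with observed proportion p (p = 0.5 is the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical poll of 1,000 respondents carries roughly a +/-3 point margin,
# so a 48% vs 46% race is well within the noise.
print(f"{margin_of_error(1000):.1%}")  # -> 3.1%
```

Note that halving the margin of error requires quadrupling the sample, which is why pollsters rarely go far beyond samples of this size.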

It certainly wasn’t for lack of data that the polling agencies couldn’t predict the winner of the election or whether Brexit would be a yay or a nay. Rather, it was the hard-nosed belief that the data was showing the correct outcome. The polls overpromised and underdelivered. TWICE. Does that mean data is dead? Not quite, but someone certainly needs to go back, take a good look at their models to understand what went wrong, and perhaps moderate their message with a softer touch next time they try to predict something.

What can finance professionals learn from this data mess?

Now, if you’re in Finance you’re most likely asked to “own the numbers/data”, which in light of the above can be a scary task. That’s not even considering that the data itself could be wrong, i.e., using the polling analogy, the wrong group of people had been asked or a certain segment was never asked at all. So, assuming your data is correct, how do you then interpret it? Do you sell the most likely outcome as the one and only truth, or do you moderate your message to allow for different outcomes? Here are five points we can learn from the recent data mess caused by inaccurate polling agencies and political commentators.

  • Always be skeptical about what your data tells you
  • Mind the statistical uncertainty and probability weight different outcomes
  • Don’t put all your credibility on the line; use softer words like “this trend could indicate” or “what I read from the numbers is… how do you see it?”
  • Never overpromise and underdeliver. You might be forced to literally eat your own words
  • If you know the data might be wrong or uncertain then inform your stakeholders and make sure to add what you’re doing to fix it (if possible)
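On the second point, probability-weighting different outcomes simply means presenting an expected value across scenarios rather than a single number. A minimal sketch, using entirely hypothetical scenario names and figures:

```python
# Hypothetical forecast scenarios: (name, probability, outcome value)
scenarios = [
    ("base case", 0.55, 100.0),
    ("upside",    0.25, 120.0),
    ("downside",  0.20,  70.0),
]

# Sanity check: the scenario probabilities must sum to 1
assert abs(sum(p for _, p, _ in scenarios) - 1.0) < 1e-9

# Probability-weighted (expected) outcome across all scenarios
expected = sum(p * v for _, p, v in scenarios)
print(expected)  # -> 99.0
```

Presenting the stakeholder with all three scenarios and the weighted figure, rather than only the base case, is exactly the kind of softer message the list above argues for.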

So use this as an opportunity to think twice about what you know about data, to avoid drawing fast conclusions, and to soften your message about the conclusions you do draw. If you’re wrong too often you lose your credibility with your stakeholders, and believe me when I say it takes a looooong time to rebuild.

So what do you think we can learn from the recent significant polling errors? Do they force us to rethink how we should use data, or just tell us to proceed with more caution? In this day and age of Big Data it’s important that we get it right, as otherwise Finance will lose the seat at the table we have fought so hard to win. Trump and Brexit certainly haven’t done us any favors in that regard. As always, let me know what you think about the article by liking, commenting and sharing. While references have been made to politics, there are no political standpoints here, and I don’t mean to incite any political discussions.

For posts in my latest series on how the CFO should transform Finance, continue reading below; further down there are more posts about finance business partnering.

The CFOs Roadmap To Transforming Finance

How To Fix Your Basic Finance Function

Finance Systems For The 21st Century

Big Data Needs Big Data Scientists 

Can Big Data Nerds Speak Business?

How To Become Great At Business Finance

I’m A Finance Business Partner, To Whom?

Beyond The Final Frontier Of Finance

I also encourage you to take a tour of my older posts on finance transformation and finance business partnering, not least “Introducing The Finance Transformation Nine Box”, which is really the starting point for the transformation. Last but not least, you should join my Finance Business Partner Forum, where we will continue to discuss this topic.

Why Accountants Are An Endangered Species

Financial Analyst vs. Finance Business Partner

You’re A Finance Business Partner, Now What?

Case Study: Becoming A Finance Business Partner

How Finance Business Partners Improve Company Performance

There Is A New Kind Of CFO Needed In Town

5 Ways For Finance To Seize The Day In 2016

Are You Ready For Finance Business Partnering 2.0

Why Transforming Finance Matters

Anders Liu-Lindberg is the Senior Finance Business Partner for Maersk Line North Europe and works with the transformation of Finance and the business on a daily basis. I have participated in several transformation processes, among others helping Maersk Drilling go Beyond Budgeting and transforming a finance team from bean-counters to business partners. I would love the chance to collaborate with you on your own transformation processes to help you stay clear of disruption. If you are looking for more advice on how to get the most out of LinkedIn, I also have a few tips to share, as well as if you want help in your job search. Don’t be shy! Let’s get in touch and start helping each other.

Anders Liu-Lindberg 14/11/2016 · #7

#5 Well Robert, it's a sample size issue: if the sample is too small, the margin of error is too big. That renders the poll practically useless.

Anders Liu-Lindberg 14/11/2016 · #6

#4 Robert, I think it's probably stretching the truth to say they were all spot on. No one had predicted such a landslide victory for Trump.

Robert Bacal 13/11/2016 · #5

This user has deleted this comment

Robert Bacal 13/11/2016 · #4

This user has deleted this comment

Yankee Mountain 13/11/2016 · #3

#1 Mr. Friedman, you are correct. I am aware of only one medium-sized poll that accurately projected the outcome (prior to changing their "algorithm"), and that is Reuters. A large poll solicited 50,000 Americans; they were correct also. And, of course, Wikileaks' giant poll of 117,000+ Americans was correct.

But beyond polls, actions speak louder than words. Had the political pundits looked at YouTube, they would have quickly realized that most of America was enjoying and following Trump's messages at roughly a 5-to-1 ratio against Clinton.

Anders Liu-Lindberg 11/11/2016 · #2

#1 Yes @Phil Friedman, I certainly believe that's part of the problem. I saw most of them used samples of around 1,000 people. That's crazy when trying to predict how 100+ million people will vote. Even in Denmark, where they also use samples of 1,000 people, they can be quite far off.

Phil Friedman 11/11/2016 · #1

Several very good points, Anders. Do you think that, perhaps, part of the problem may also be the attempt to extrapolate results from ridiculously small samples?
