The Dangers of the “Versus Expectations” Fallacy

Friday, the 2nd day of May, brought two important pieces of government-reported information: the April unemployment report came in much better than expected, or “blows past forecasts” as USA Today reported, and initial Obamacare enrollment included more of the previously uninsured than expected, something that Mother Jones says is “far far higher than previous estimates.” Both pieces of information are a good way of looking at an analysis pitfall common in the world of Wall Street and now increasingly being ported over to the rest of the news cycle: the fallacy of mistaking how something does versus expectations for whether it’s good or bad.

Imagine for a moment a hypothetical public company in the midst of a long decline to oblivion. Every quarter the company reports lower income and lower revenue, until eventually it goes out of business. But every quarter the company also issues forecasts so dire that the Wall Street analyst community sets very low expectations for the immediate future. It’s possible that on its way to bankruptcy this company manages to report earnings (or losses) that are not as bad as predicted. Given the industry’s obsession with making future forecasts, and the media’s obsession with reporting how events turned out “versus expectations,” it’s possible and even likely that such a company generates positive news headlines right up to the day it files for bankruptcy. Such is the danger of comparing data against meaningless expectations.

It would be one thing if “the experts” were good at making forecasts, as then a number coming in much higher or lower than expectations would reveal something significant. But history has shown that most people are not good at forecasting anything. Forecasting complicated events with significant short-term variation is hard. A company’s earnings, an economy’s job creation, or a radical healthcare overhaul’s signups depend on a myriad of complicated factors, many of which are hard to measure even after the fact, never mind forecast in advance. For evidence, look no further than the Labor Department’s own revisions of numbers it already reported months ago. These revisions can be so drastic that even if some expert accurately predicts a specific month’s jobs figures when they are first released, he will still be proven wrong once those numbers are revised.

Furthermore, if someone were good at forecasting any of these numbers, they probably wouldn’t be wasting their time making public forecasts for our and the media’s consumption. Instead they would either be running their own hedge fund, making a trillion dollars predicting the future, or getting paid handsomely to run the actuarial department of a health insurance company. The absence of trillionaires in our midst is evidence of how hard it is to forecast anything on a regular basis.

So why do so many people report and discuss how something did versus expectations? For some, it’s a way of scoring political brownie points while distracting from a larger success or failure. The roll-out of the Affordable Care Act has been fraught with so many missteps, from crashing websites to people losing preferred coverage, that today’s news can be seized upon as a much-needed success story. It should be noted that the whole point of the ACA, and its primary sales pitch, was to bring coverage to people who didn’t have any. In other words, if we compared the recent reports to the original promises made by the law’s architects, an honest headline today would read “Obamacare is happening.” But by comparing the results to predictions made either a few years ago or a few months ago, both opponents and supporters can claim they were right.

As for the media, comparing results to expectations is an easy way of creating fake context, saving them the trouble of doing the harder work of creating real context. To the average reader, job creation of 288,000 in a month is not obviously good or bad; it could mean anything from an employment boom to growth that isn’t keeping up with population. Going into detail about the trajectory of the numbers, how they compare with historical averages, or a qualitative breakdown of the kinds of jobs created makes for better journalism than “blows past forecasts,” but it doesn’t make for a better headline.

When you hear that a report came out much better or worse versus expectations, all you are learning is that the expectations were inaccurate. To understand whether the report itself was actually good or bad, you’ll have to dig deeper.
