More Information, More Problems

Exposure to so many new ideas is producing mass confusion.

The amount of information is increasing much more rapidly than our understanding of what to do with it, or our ability to differentiate the useful information from the mistruths.

The story the data tells us is often the one we’d like to hear, and we usually make sure that it has a happy ending.

The Productivity Paradox

Vast amounts of theory applied to extremely small amounts of data.

The Promise and Pitfalls of “Big Data”

The fashionable term now is “Big Data”. IBM estimates that we are generating 2.5 quintillion bytes of data each day, more than 90 percent of which was created in the last two years.

The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning. We may construe them in self-serving ways that are detached from their objective reality.

Data-driven predictions can succeed—and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves.

Our naïve trust in models, and our failure to realize how fragile they are to our choice of assumptions, will yield disastrous results.

Big Data will produce progress—eventually. How quickly it does, and whether we regress in the meantime, will depend on us.

Future Shocks

Consequences of what Alvin Toffler, writing in his 1970 book Future Shock, called “information overload”.

He thought our defense mechanism would be to simplify the world in ways that confirmed our biases, even as the world itself was growing more diverse and more complex.

If the quantity of information is increasing by 2.5 quintillion bytes per day, the amount of useful information almost certainly isn’t. Most of it is just noise, and the noise is increasing faster than the signal.

There are so many hypotheses to test, so many data sets to mine—but a relatively constant amount of objective truth.

Complex systems like the World Wide Web have the property of reproducing mistakes many times over once they are made, as in the case of the Wicked Bible. They may not fail as often as simpler systems, but when they fail, they fail badly.

Regulation is one approach to solving these problems. But I am suspicious that it is an excuse to avoid looking within ourselves for answers. We need to stop, and admit it: we have a prediction problem. We love to predict things—and we aren’t very good at it.

The Prediction Solution

Prediction plays a particularly important role in science. Some of you may be uncomfortable with a premise that I have been hinting at and will now state explicitly: we can never make perfectly objective predictions. They will always be tainted by our subjective point of view.

A belief in the objective truth—and a commitment to pursuing it—is the first prerequisite of making better predictions.

Prediction is important because it connects subjective and objective reality. Karl Popper, the philosopher of science, recognized this. For Popper, a hypothesis was not scientific unless it was falsifiable—meaning that it could be tested in the real world by means of a prediction.

What should give us pause is that the few ideas we have tested aren’t doing so well, and many of our ideas have not been or cannot be tested at all.

In economics, it is much easier to test an unemployment rate forecast than a claim about the effectiveness of stimulus spending.

In political science, we can test models that are used to predict the outcome of elections, but a theory about how changes to political institutions might affect policy outcomes could take decades to verify.

The fact that the few theories we can test have produced quite poor results suggests that many of the ideas we haven’t tested may be quite wrong as well.

We are undoubtedly living with many delusions that we do not even realize we hold.

But there is a way forward. It is not a solution that relies on half-baked policy ideas—particularly given that I have come to view our political system as a big part of the problem. Rather, the solution requires an attitudinal change.

This attitude is embodied by something called Bayes’s theorem. Bayes’s theorem is nominally a mathematical formula. But it is really much more than that. It implies that we must think differently about our ideas—and how to test them.

We must become more comfortable with probability and uncertainty. We must think more carefully about the assumptions and beliefs that we bring to a problem.
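The attitude described above can be made concrete with a minimal sketch of Bayes's theorem in Python: a prior belief is explicitly stated, then revised in light of new evidence. The function name and all the numbers below are hypothetical, chosen only to illustrate the update; they are not from the text.

```python
def bayes_update(prior, likelihood, false_positive_rate):
    """Return P(H | E) via Bayes's theorem.

    prior               P(H): belief in the hypothesis before the evidence
    likelihood          P(E | H): chance of seeing the evidence if H is true
    false_positive_rate P(E | not H): chance of the evidence if H is false
    """
    numerator = likelihood * prior
    # Total probability of observing the evidence under either case.
    evidence = numerator + false_positive_rate * (1 - prior)
    return numerator / evidence

# Hypothetical example: evidence that appears 90% of the time when the
# hypothesis is true and 10% of the time when it is false, starting
# from a 50/50 prior.
posterior = bayes_update(prior=0.5, likelihood=0.9, false_positive_rate=0.1)
print(round(posterior, 2))  # 0.9
```

The point of the sketch is the attitude, not the arithmetic: the prior and the likelihoods are assumptions we must state openly, and the posterior is only as good as those stated beliefs.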