Book Review: Superforecasting Sheds Light on the Art of Prediction

The latest book by University of Pennsylvania professor Philip Tetlock shows readers what makes a nimble mind.


The investors and money managers who read Institutional Investor may be familiar with Philip Tetlock, a University of Pennsylvania professor of psychology and management.

His new book, Superforecasting: The Art and Science of Prediction, written with Canadian journalist Dan Gardner, celebrates those thinkers among us whose logic and mental acuity allow for clear vision into the outcomes of the issues of the day, and shows how readers can emulate their traits.

As part of the Good Judgment Project (GJP), which Tetlock launched in 2011 with his research partner and wife, Barbara Mellers, volunteers made predictions in response to questions such as “Will North Korea detonate a nuclear device before the end of this year?” or “How many additional countries will report cases of the Ebola virus in the next eight months?” Tetlock and Mellers carefully measured the accuracy of the forecasts, both for the research pool in aggregate and for individual participants.
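The yardstick behind that bookkeeping is the Brier score, which measures the squared gap between a probability forecast and what actually happened. Here is a minimal sketch in Python of the original two-category form of the score that the book describes, where 0 is perfect and 2 is maximally wrong; the function and the sample values are ours, for illustration:

```python
def brier_score(forecast: float, occurred: bool) -> float:
    """Original Brier (1950) score over both outcome categories:
    0.0 is a perfect forecast, 2.0 is maximally wrong."""
    outcome = 1.0 if occurred else 0.0
    # Squared error on "the event happens" plus squared error on "it doesn't"
    return (forecast - outcome) ** 2 + ((1 - forecast) - (1 - outcome)) ** 2

# A confident, correct forecast scores near 0; a confident miss, near 2.
print(brier_score(0.9, True))   # 0.02
print(brier_score(0.9, False))  # 1.62
```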

The GJP became one of five competing teams in a massive forecasting tournament funded by the Intelligence Advanced Research Projects Activity, an agency under the U.S. government’s Office of the Director of National Intelligence. Within a few years, Tetlock had collected predictions from more than 20,000 people and had developed a simple algorithm to process the results. The GJP won the tournament, outperforming the control group as well as scholars and intelligence experts.
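The book gives only a high-level account of that algorithm: pool the crowd’s probabilities, weight forecasters with better track records more heavily, then “extremize” the blend, pushing it toward 0 or 1 to offset the shared caution of the group. A hedged Python sketch of those two steps, with the weights and the exponent invented purely for illustration:

```python
def aggregate(forecasts, weights, a=2.0):
    """Weighted average of probability forecasts, followed by an
    extremizing transform that pushes the blend away from the timid middle."""
    p = sum(w * f for f, w in zip(forecasts, weights)) / sum(weights)
    # Illustrative extremizer: p**a / (p**a + (1 - p)**a) exaggerates
    # whichever side of 0.5 the weighted average falls on.
    return p ** a / (p ** a + (1 - p) ** a)

# Three forecasters lean yes; the extremized aggregate leans harder.
print(aggregate([0.6, 0.7, 0.65], weights=[1.0, 2.0, 1.5]))  # ~0.79
```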

What’s more, during the experiment, Tetlock’s team witnessed the rise of an elite class of forecasters from within the crowd. Two percent of the participants stood out as spectacularly more prescient; Tetlock calls them the “superforecasters.” The book opens with an introduction to Bill Flack, a retired U.S. Department of Agriculture employee and avid bird-watcher from Nebraska. He’s not a celebrity author or economist, but he is one of the savviest, top-performing superforecasters.

To explain why precious few people share Flack’s talents, the authors spend a large part of the book describing the cognitive traps that most of us fall into time and again. A common forecasting mistake that superforecasters avoid is substituting easy questions for difficult ones. An example: “Will either the French or Swiss inquiries find elevated levels of polonium in the remains of Yasser Arafat’s body?” becomes “Did Israel poison Yasser Arafat?” If you already hold a particular view, the answer to the second question may seem plain as day.

The nobler traits shared by superforecasters come into sharp focus in the second half of the book. They are keenly aware of the tricks the mind plays to make itself “certain” of something, as the human brain is wont to do. They’re also ready to change a forecast when new evidence emerges, without fear of damaging their egos. To be sure, the superforecasters were found to be more knowledgeable than the general population, and they have higher-than-average IQs. Still, they were neither much more knowledgeable nor much more intelligent than forecasters outside the “super” category. Instead, they were simply more apt to keep testing their ideas, remaining in what Tetlock calls “perpetual beta” mode. They are careful but not spineless or indecisive, sins for any self-respecting leader in today’s culture.
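Tetlock frames this habit of frequent, incremental revision in Bayesian terms: start from a base rate and nudge the probability as each piece of evidence arrives. A toy Python sketch of that update rule, with all of the probabilities invented for illustration:

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Posterior probability of the hypothesis after one piece of evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

belief = 0.20                              # prior: a 20% base rate
belief = bayes_update(belief, 0.7, 0.3)    # supportive news nudges it up
belief = bayes_update(belief, 0.4, 0.6)    # mixed news nudges it back down
print(round(belief, 2))                    # 0.28: many small moves, no lurches
```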

“The humility required for good judgment is not self-doubt — the sense that you are untalented, unintelligent or unworthy,” Tetlock explains in the book. “It is intellectual humility. It is a recognition that reality is profoundly complex, that seeing things clearly is a constant struggle, when it can be done at all, and that human judgment must therefore be riddled with mistakes.”

Ten years ago, Tetlock’s book Expert Political Judgment: How Good Is It? How Can We Know? caused a stir with its most famous finding: Two decades of research showed that the forecasts of an average expert were roughly as accurate as those of a dart-throwing chimpanzee. (The sporty apes were a colorful stand-in for random guessing.)

That earlier book was in fact far more nuanced than its famous conclusion would imply. Tetlock didn’t entirely dismiss the work of experts but said that many, not all, were often, not always, wrong.

Superforecasting doesn’t ignore Nassim Nicholas Taleb’s black swan theory, which has become a mainstay among investors, and it naturally calls into question many, but not all, of its assertions. Pointing to what was known in intelligence circles before the terrorist attacks of September 11, 2001, as one example, Tetlock proposes that black swan events are not as unpredictable as Taleb suggests. He further argues that history doesn’t always jump, as Taleb contends; it also chugs along, altered in ways that are forecastable. Predicting a true black swan event may not be possible, but making truly informed decisions about near-term variables could nonetheless change the course of history.

Superforecasting manages to be both enlightening and entertaining, making complicated concepts graspable and retelling the recent history of thinking about prediction through real-world and academic case studies. It’s not only a worthwhile read for finance types who want to sharpen their investing skills; it’s a must for anyone who wants to engage with the world with a more agile, open mind.

