The folly of experts

An “expert” is:

  1. A person who is wrong slightly less often than non-experts, which is still most of the time.
  2. A person whom the speaker wants the listener to trust.

Ultimately, when applied to soft disciplines such as economics (especially macro), public policy or finance, the word is a misnomer – or worse, deceptive advertising. It is impossible to be an expert on economics or public policy in quite the same way one can be an expert on geology or coniferous trees. It’s a linguistic problem that we use the same word for both.

If you ask a panel of geology experts to identify a rock, you’ll get a reliable answer.

If you ask a panel of economic experts to suggest a policy, not only will you get an unreliable answer, you’ll get a different one for every expert you ask.

This is because “soft” disciplines are fundamentally limited in the degree of certainty achievable. This doesn’t mean that politics and economics are arbitrary – that would be a chillingly dangerous notion. But there is a high degree of interpretation and mythopoeia to them, and the lack of substance is compounded by the closeness of economic and political theory (and policy) to tribal tendencies and self-interest. So on top of the intractability, there are skewed incentives. Consciously or not, everybody has an axe to grind, or as we say in Eastern Europe, “everybody is warming their own soup”.

Doesn’t happen with pines and quartz.

I would go so far as to say it is categorically impossible to be an expert in the non-natural sciences. In fact, I would go even further: non-natural disciplines are not sciences at all, by definition. It is perfectly possible to be better versed in them, to have a broader theoretical base, to be able to talk longer about them, sure – but the same can be said for Star Wars. Expertise in a practical or predictive sense is categorically ruled out in such airy endeavors, and real-world track records confirm this conclusively.

Such are the results of a famous study by psychologist Philip E. Tetlock, which confirmed, in rigorous scientific terms, the popular suspicion that “experts” are brimmingly full of shit. The study was summed up in a rare Huffington Post article that doesn’t intellectually offend, having been mercifully written by an outsider: “After testing 284 experts in political science, economics, history, and journalism in a staggering 27,450 predictions about the future, Tetlock concluded that they did little better than ‘a dart-throwing chimpanzee.’”

It also turns out that the “wisdom of the crowds”, i.e. the average of many ordinary people’s guesses, outperforms experts in aggregate in at least some of these fields. This implies that formal education in some disciplines actually detracts from the quality of judgment. Or perhaps there is a process of self-selection at work instead, and only a particular sort of person chooses a career as an “expert” – and those tend not to be the pick of the litter.
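The averaging effect behind the “wisdom of the crowds” is easy to see in a toy simulation (the numbers here are entirely made up, not Tetlock’s data): if many guesses scatter independently around the truth, their errors partly cancel, so the crowd’s mean lands far closer than a typical individual does.

```python
import random

random.seed(42)

TRUE_VALUE = 100.0   # hypothetical quantity being estimated
N_GUESSERS = 1000

# Each person guesses with independent, unbiased noise around the truth.
guesses = [random.gauss(TRUE_VALUE, 20.0) for _ in range(N_GUESSERS)]

# Error of the crowd's averaged guess.
crowd_error = abs(sum(guesses) / N_GUESSERS - TRUE_VALUE)

# Average error of a single guesser.
individual_error = sum(abs(g - TRUE_VALUE) for g in guesses) / N_GUESSERS

print(f"crowd error:      {crowd_error:.2f}")
print(f"individual error: {individual_error:.2f}")
```

The catch, of course, is the independence assumption: if everyone shares the same bias – say, the same fashionable economic theory – averaging does nothing to remove it.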

Tetlock famously divides experts into Hedgehogs and Foxes (this is an excellent article – if you click just one link, click this). I suggest an alternative metric: the bluffing-to-delusion ratio, i.e. the degree to which the bullshit is deliberate as opposed to reflecting an honest, though mistaken, sense of self. I admit the distinction is not always clear. People fall for their own bullshit over time, and the amount and sophistication of cognitive dissonance on display is mind-blowing. The awareness that one is a fraud is unbearable, so self-deception kicks in.

Outside recognition doesn’t help. As shown by Dan Ariely, when we get falsely recognized for qualities we don’t have, we’re quick to convince ourselves that we do – cheaters on tests convince themselves they deserve their scores, doping athletes believe their performance reflects their actual ability, and a total moron accidentally or nepotistically appointed to run a fund quickly convinces himself he’s a stock-picking superstar in the face of overwhelming statistical evidence to the contrary.

The knowledge that experts are mostly bluffing or insane has practical value. It is tempting to entrust and delegate the management of intimidatingly important and highly consequential political and economic matters to “experts”, and the idea of an “expertocracy” pops up periodically in history. Unfortunately, this is a fallacy: the kind of experts we imagine for the job not only don’t exist, but cannot exist. No infallible authority is going to save us from thinking and acting for ourselves. Fortunately.

It’s not a matter of finding and appointing the right guys. The entire concept is in error.

A profession notorious for succumbing to delusions (or pretension) of competence is finance, specifically stock picking and active money management.

This was excellently put by Scott Adams, author of Dilbert and of possibly the most useful book of the last decade, in an interview for the Wall Street Journal:

Under what circumstances would you advise somebody to use active money managers as opposed to index funds?

“I can think of many cases in which I would recommend active money managers over index funds. For example, I might be giving the advice to someone I hate, or—and this happens a lot—someone I expect to hate later. I would also recommend active money managers if I were accepting bribes to do so, if I were an active money manager myself, or if it were April Fools’ Day. And let’s also consider the possibility that I might be drunk, stupid or forced to say things at gunpoint. I’ve also heard good things about a German emotion called schadenfreude, so that could be a factor too.”

This isn’t to say experts don’t occasionally (also a good article) get it right. Considering that there are so many of them predicting every conceivable scenario at all times, it is a statistical certainty that somebody will have predicted whatever actually happens. But that doesn’t make that person a prescient genius any more than getting hit by a stray bullet is a deserved punishment. Being the only guy who predicted a crisis is good for speaking opportunities and book sales, but ultimately it is largely luck. Experts accurately predicted 48 out of the last 4 economic crises.

This is the “scattershot effect” – if something is predicted all the time (such as economic doom), its predictive value is zero. All births occur within 24 hours of a sunrise, but that doesn’t mean sunrises can be used to predict births.
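The scattershot effect is a base-rate joke, and a toy calculation (with invented numbers – a hypothetical 50-year window containing 4 crises) makes it concrete: predict a crisis every single year and you will “call” every crisis that occurs, while almost all of your predictions are false alarms.

```python
# Hypothetical 50-year window with 4 actual crisis years (invented data).
YEARS = 50
crisis_years = {8, 19, 33, 47}

# The doomsayer predicts a crisis in every single year.
predictions = set(range(YEARS))

hits = predictions & crisis_years         # crises he "called"
recall = len(hits) / len(crisis_years)    # fraction of crises predicted
precision = len(hits) / len(predictions)  # fraction of predictions correct

print(f"recall:    {recall:.0%}")    # every crisis "predicted"
print(f"precision: {precision:.0%}") # but 46 of 50 calls were false alarms
```

A 100% hit rate on crises, bought at the price of being wrong 92% of the time – which is exactly what “predicting 48 of the last 4 crises” looks like when you write it down.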

Better keep the folly of experts in mind whenever there is a talking head on TV, whenever you’re told (explicitly or implicitly) that you’re unqualified to make decisions for yourself, whenever you’re confidently informed of the horrors that will visit you if you don’t yield your liberties and money to the latest political doomsday fad, or whenever famous people make predictions about the dangers of technologies that don’t even exist yet.

Experts on rocks and trees and fish are real. Experts on economics, politics and “the future” are usually (oxy)morons.