November 11, 2007

Revolutionaries
You Can't Predict Who Will Change The World
Nassim Nicholas Taleb 05.24.07, 6:00 AM ET


Before the discovery of Australia, Europeans thought that all swans were white, and it would have been considered completely unreasonable to imagine swans of any other color. The first sighting of a black swan in Australia, where black swans are, in fact, rather common, shattered that notion. The moral of this story is that there are exceptions out there, hidden away from our eyes and imagination, waiting to be discovered by complete accident. What I call a "Black Swan" is an exceptional unpredictable event that, unlike the bird, carries a huge impact.

It's impossible for the editors of Forbes.com to predict who will change the world, because major changes are Black Swans, the result of accidents and luck. But we do know who society's winners will be: those who are prepared to face Black Swans, to be exposed to them, to recognize them when they show up and to rigorously exploit them.

Things, it turns out, are all too often discovered by accident--but we don't see that when we look at history in our rear-view mirrors. The technologies that run the world today (like the Internet, the computer and the laser) are not used in the way intended by those who invented them. Even academics are starting to realize that a considerable component of medical discovery comes from the fringes, where people find what they are not exactly looking for. It is not just that hypertension drugs led to Viagra or that angiogenesis drugs led to the treatment of macular degeneration, but that even discoveries we claim come from research are themselves highly accidental. They are the result of undirected tinkering narrated after the fact, when it is dressed up as controlled research. The high rate of failure in scientific research should be sufficient to convince us of the lack of effectiveness in its design.

Even if the success rate of directed research is very low, it is true that the more we search, the more likely we are to find things "by accident," outside the original plan. Only a disproportionately minute number of discoveries traditionally came from directed academic research. What academia seems more masterful at is public relations and fundraising.

This is good news--for some. Ignore what you were told by your college economics professor and consider the following puzzle. Whenever you hear a snotty European presenting his stereotypes about Americans, he will often describe them as "unintellectual," "uneducated," and "poor in math," because, unlike European schooling, American education is not based on equation drills and memorization.

Yet the person making these statements will likely be addicted to his iPod, wearing a T-shirt and blue jeans, and using Microsoft Word to jot down his "cultural" statements on his Intel-based PC, with some Google searches on the Internet here and there interrupting his composition. If old enough, he might also be using Viagra.

America's primary export, it appears, is trial-and-error, and the innovative knowledge attained in such a way. Trial-and-error has error in it; and most top-down traditional rational and academic environments do not like the fallibility of "error" and the embarrassment of not quite knowing where they're going. The U.S. fosters entrepreneurs and creators, not exam-takers, bureaucrats or, worse, deluded economists. So the perceived weakness of the American pupil in conventional studies is where his or her very strength may lie. The American system of trial and error produces doers: Black Swan-hunting, dream-chasing entrepreneurs, with a tolerance for a certain class of risk-taking and for making plenty of small errors on the road to success or knowledge. This environment also attracts aggressive tinkering foreigners like this author.

Globalization allowed the U.S. to specialize in the creative aspect of things, the risk-taking production of concepts and ideas--that is, the scalable part of production, in which more income can be generated from the same fixed assets through innovation. By exporting jobs, the U.S. has outsourced the less scalable and more linear components of production, assigning them to the citizens of more mathematical and culturally rigid states, who are happy to be paid by the hour to work on other people's ideas.

Let us go one step further. It is high time to recognize that we humans are far better at doing than understanding, and better at tinkering than inventing. But we don't know it. We truly live under the illusion of order, believing that planning and forecasting are possible. We are scared of the random, yet we live from its fruits. We are so scared of the random that we create disciplines that try to make sense of the past--but we ultimately fail to understand it, just as we fail to see the future.

The current discourse in economics, for example, is antiquated. American undirected free enterprise works because it aggressively allows us to capture the randomness of the environment--the cheap Black Swans. This works not just because of competition, and even less because of material incentives. Neither the followers of Adam Smith nor those of Karl Marx seem to be conscious of the prevalence and effect of wild randomness. They are too bathed in Enlightenment-style cause-and-effect and cannot accept that skills and payoffs may have nothing to do with one another. Nor can they swallow the argument that it is not necessarily the better technology that wins, but rather, the luckiest one. And, sadly, even those who accept this fundamental uncertainty often fail to see that it is a good thing.

Random tinkering is the path to success. And fortunately, we are increasingly learning to practice it without knowing it--thanks to overconfident entrepreneurs, naive investors, greedy investment bankers, confused scientists and aggressive venture capitalists brought together by the free-market system.

We need more tinkering: Uninhibited, aggressive, proud tinkering. We need to make our own luck. We can be scared and worried about the future, or we can look at it as a collection of happy surprises that lie outside the path of our imagination.

Nassim Nicholas Taleb is an applied statistician and derivatives trader-turned-philosopher, and author of The Black Swan: The Impact of the Highly Improbable.



Nassim Nicholas Taleb Homepage
http://www.fooledbyrandomness.com/

"My major hobby is teasing people who take themselves & the quality of their knowledge too seriously & those who don’t have the guts to sometimes say: I don’t know...." (You may not be able to change the world but can at least get some entertainment & make a living out of the epistemic arrogance of the human race).

The Black Swan on the NYT Bestseller list (16 weeks, & 6 months on the Business Week list, etc.) —whatever that means (not too glorious, given the other books. But at the least this could be a vindication for every mistreated Black Swan hunting artist, entrepreneur or researcher, in addition to Menodotus of Nicomedia, Sextus Empiricus, GLS Shackle, & other thinkers absent from the canon).

GLOSSARY OF TERMS

Academic libertarian:
someone (like myself) who considers that knowledge
is subject to strict rules but not institutional authority, as the interest
of organized knowledge is self-perpetuation, not necessarily
truth (as with governments). Academia can suffer from an acute expert
problem (q.v.), producing cosmetic but fake knowledge, particularly
in narrative disciplines (q.v.), and can be a main source of Black Swans.

Apelles-style strategy:
A strategy of seeking gains by collecting positive accidents
from maximizing exposure to “good Black Swans.”

Barbell strategy:
a method that consists of taking both a defensive attitude
and an excessively aggressive one at the same time, by protecting assets
from all sources of uncertainty while allocating a small portion for
high-risk strategies.
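
To make the definition concrete, here is a minimal sketch in Python; the 90/10 split and the payoff scenarios are invented for illustration, not figures from the text. The bulk of the capital sits in something treated as near-riskless, the remainder is exposed to open-ended upside, so the worst case is bounded while the best case is not.

```python
# Illustrative barbell allocation; all figures below are hypothetical.

def barbell_outcome(capital, safe_fraction, safe_return, risky_multiplier):
    """Terminal wealth when `safe_fraction` of capital earns a small,
    near-certain return and the remainder is staked on a speculative
    position paying `risky_multiplier` times the stake (0 = total loss)."""
    safe = capital * safe_fraction * (1 + safe_return)
    risky = capital * (1 - safe_fraction) * risky_multiplier
    return safe + risky

capital = 100_000
scenarios = [(0.0, "speculative stake wiped out"),
             (1.0, "nothing happens"),
             (20.0, "positive Black Swan")]
for multiplier, label in scenarios:
    wealth = barbell_outcome(capital, 0.90, 0.03, multiplier)
    print(f"{label:>27}: {wealth:>9,.0f}")
```

The loss is capped at roughly the small fraction allocated to the risky side; the gain is not capped at all.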

Bildungsphilister: a philistine with cosmetic, nongenuine culture. Nietzsche
used this term to refer to the dogma-prone newspaper reader and
opera lover with cosmetic exposure to culture and shallow depth. I extend
it to the buzzword-using researcher in nonexperimental fields
who lacks in imagination, curiosity, erudition, and culture and is
closely centered on his ideas, on his “discipline.” This prevents him
from seeing the conflicts between his ideas and the texture of the
world.

Black Swan blindness: the underestimation of the role of the Black Swan,
and occasional overestimation of a specific one.

Black Swan ethical problem:
Owing to the nonrepeatable aspect of the
Black Swan, there is an asymmetry between the rewards of those who
prevent and those who cure.

Confirmation error (or Platonic confirmation):
You look for instances that
confirm your beliefs, your construction (or model)—and find them.

Empty-suit problem (or “expert problem”):
Some professionals have no
differential abilities from the rest of the population, but for some reason,
and against their empirical records, are believed to be experts:
clinical psychologists, academic economists, risk “experts,” statisticians,
political analysts, financial “experts,” military analysts, CEOs,
et cetera. They dress up their expertise in beautiful language, jargon,
mathematics, and often wear expensive suits.

Epilogism:
A theory-free method of looking at history by accumulating
facts with minimal generalization and being conscious of the side effects
of making causal claims.

Epistemic arrogance:
Measure the difference between what someone actually
knows and how much he thinks he knows. An excess will imply
arrogance, a deficit humility. An epistemocrat is someone of epistemic
humility, who holds his own knowledge in greatest suspicion.
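
One way the book makes this measurable is the calibration test: ask someone for, say, 98 percent confidence intervals on quantities they do not know, then count how often the truth escapes the interval. A small scoring sketch follows; the quiz answers are invented purely to show the arithmetic.

```python
def miss_rate(answers):
    """Fraction of questions where the true value falls outside the
    respondent's stated interval."""
    misses = sum(1 for low, high, truth in answers if not (low <= truth <= high))
    return misses / len(answers)

# Invented quiz results: (stated lower bound, stated upper bound, true value).
answers = [
    (10, 50, 72),
    (100, 200, 150),
    (3, 8, 9),
    (1_000, 2_000, 1_500),
    (40, 60, 95),
]

claimed_confidence = 0.98  # the respondent was asked for 98% intervals
print(f"claimed miss rate: {1 - claimed_confidence:.0%}")
print(f"actual miss rate:  {miss_rate(answers):.0%}")
```

A miss rate far above the claimed 2 percent is the excess the entry calls arrogance.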

Epistemic opacity:
Randomness is the result of incomplete information at
some layer. It is functionally indistinguishable from “true” or “physical”
randomness.
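
A small illustration of "incomplete information at some layer" (my example, not from the text): a seeded pseudorandom generator is completely deterministic to anyone holding the seed, yet functionally random to an observer who lacks that one piece of information.

```python
import random

# With the seed, the "random" stream is fully determined and reproducible.
gen = random.Random(2007)
stream = [gen.random() for _ in range(5)]

replay = random.Random(2007)
assert [replay.random() for _ in range(5)] == stream

# Without the seed (or the generator's internal state), an observer who sees
# only `stream` has no practical way to predict the next draw; at that layer
# the output is indistinguishable from "true" randomness.
print(stream)
```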

Extremistan:
the province where the total can be conceivably impacted by
a single observation.
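
A hedged simulation of the contrast with Mediocristan; the distributions and parameters are my assumptions, chosen only to illustrate. In a Gaussian sample no single draw matters to the total, while in a heavy-tailed (Pareto) sample one draw can carry a visible share of it.

```python
import random

random.seed(0)
N = 100_000

# Mediocristan-like quantity: heights from a Gaussian (mean 170 cm, sd 10 cm).
heights = [random.gauss(170, 10) for _ in range(N)]

# Extremistan-like quantity: wealth from a fat-tailed Pareto distribution
# (a tail index close to 1 gives very heavy tails).
wealth = [random.paretovariate(1.1) for _ in range(N)]

for label, sample in [("Gaussian heights", heights), ("Pareto wealth", wealth)]:
    share = max(sample) / sum(sample)
    print(f"{label}: largest single observation is {share:.4%} of the total")
```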

Fallacy of silent evidence:
Looking at history, we do not see the full story,
only the rosier parts of the process.

Fooled by randomness:
the general confusion between luck and determinism,
which leads to a variety of superstitions with practical consequences,
such as the belief that higher earnings in some professions are
generated by skills when there is a significant component of luck in
them.

Future blindness:
our natural inability to take into account the properties
of the future—like autism, which prevents one from taking into account
the existence of the minds of others.

Locke’s madman:
someone who makes impeccable and rigorous reasoning
from faulty premises—such as Paul Samuelson, Robert Merton the
minor, and Gerard Debreu—thus producing phony models of uncertainty
that make us vulnerable to Black Swans.

Lottery-ticket fallacy:
the naïve analogy equating an
investment in collecting positive Black Swans to the accumulation
of lottery tickets. Lottery tickets are not scalable.

Ludic fallacy (or uncertainty of the nerd):
the manifestation of the Platonic
fallacy in the study of uncertainty; basing studies of chance on the narrow
world of games and dice. A-platonic randomness has an additional
layer of uncertainty concerning the rules of the game in real life.

The bell curve (Gaussian), or GIF (Great Intellectual Fraud), is the application
of the ludic fallacy to randomness.

Mandelbrotian Gray Swan:
Black Swans that we can somewhat take into
account—earthquakes, blockbuster books, stock market crashes—but
for which it is not possible to completely figure out their properties and
produce precise calculations.

Mediocristan: the province dominated by the mediocre, with few extreme
successes or failures. No single observation can meaningfully affect the
aggregate. The bell curve is grounded in Mediocristan. There is a qualitative
difference between Gaussians and scalable laws, much like gas
and water.

Narrative discipline: the discipline that consists in fitting a convincing and
well-sounding story to the past. Opposed to experimental discipline.

Narrative fallacy: our need to fit a story or pattern to a series of connected
or disconnected facts. The statistical application is data mining.

Nerd knowledge: the belief that what cannot be Platonized and studied
does not exist at all, or is not worth considering. There even exists a
form of skepticism practiced by the nerd.

Platonic fold: the place where our Platonic representation enters into contact
with reality and you can see the side effects of models.

Platonicity: the focus on those pure, well-defined, and easily discernible
objects like triangles, or more social notions like friendship or love, at the
cost of ignoring those objects of seemingly messier and less tractable
structures.

Probability distribution:
the model used to calculate the odds of different
events, how they are “distributed.” When we say that an event is distributed
according to the bell curve, we mean that the Gaussian bell
curve can help provide probabilities of various occurrences.
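
As a concrete instance of "the bell curve can help provide probabilities" (my example; the two-standard-deviation threshold is arbitrary), the Gaussian tail probability can be computed directly from the error function.

```python
import math

def gaussian_tail(x, mu=0.0, sigma=1.0):
    """P(X > x) when X is Gaussian with mean mu and standard deviation sigma."""
    z = (x - mu) / (sigma * math.sqrt(2))
    return 0.5 * math.erfc(z)

# Chance of landing more than two standard deviations above the mean:
# about 2.3% under the Gaussian.
print(f"{gaussian_tail(2.0):.4f}")
```

The point of the surrounding glossary, of course, is that such tidy numbers are only as good as the Gaussian assumption behind them.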

Problem of induction:
the logical-philosophical extension of the Black
Swan problem.

Randomness as incomplete information:
simply, what I cannot guess is
random because my knowledge about the causes is incomplete, not
necessarily because the process has truly unpredictable properties.

Retrospective distortion: examining past events without adjusting for the
forward passage of time. It leads to the illusion of posterior predictability.

Reverse-engineering problem: It is easier to predict how an ice cube would
melt into a puddle than, looking at a puddle, to guess the shape of the
ice cube that may have caused it. This “inverse problem” makes narrative
disciplines and accounts (such as histories) suspicious.

Round-trip fallacy: the confusion of absence of evidence of Black Swans
(or something else) for evidence of absence of Black Swans (or something
else). It affects statisticians and other people who have lost part
of their reasoning by solving too many equations.

Scandal of prediction: the poor prediction record in some forecasting
entities (particularly narrative disciplines) mixed with verbose commentary
and a lack of awareness of their own dire past record.

Scorn of the abstract: favoring contextualized thinking over more abstract,
though more relevant, matters. “The death of one child is a
tragedy; the death of a million is a statistic.”

Statistical regress argument (or the problem of the circularity of statistics):
We need data to discover a probability distribution. How do we know
if we have enough? From the probability distribution. If it is a Gaussian,
then a few points of data will suffice. How do we know it is a
Gaussian? From the data. So we need the data to tell us what probability
distribution to assume, and we need a probability distribution to
tell us how much data we need. This causes a severe regress argument,
which is somewhat shamelessly circumvented by resorting to the
Gaussian and its kin.
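
A rough simulation of the circularity (my construction; the fat-tailed distribution and sample sizes are assumptions): a handful of points drawn from a fat-tailed process usually contains no extreme value, so a naive check would bless the Gaussian assumption that is then used to argue the handful of points was enough.

```python
import math
import random
import statistics

random.seed(7)

def t3():
    """One draw from Student's t with 3 degrees of freedom (fat-tailed)."""
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(3))
    return z / math.sqrt(chi2 / 3)

def looks_gaussian(sample):
    """Crude check: no observation lies beyond 3 sample standard deviations."""
    m, s = statistics.mean(sample), statistics.stdev(sample)
    return all(abs(x - m) <= 3 * s for x in sample)

trials = 2_000
tame = sum(looks_gaussian([t3() for _ in range(30)]) for _ in range(trials))
print(f"{tame / trials:.0%} of small fat-tailed samples pass the naive Gaussian check")
```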

Uncertainty of the deluded: people who tunnel on sources of uncertainty
by producing precise sources like the great uncertainty principle, or
similar matters of little consequence to real life; worrying about
subatomic particles while forgetting that we can't predict tomorrow's
crises.
