Nassim Taleb on Black Swans

Bob Carter sent me a link to the following interesting article and profile of Nassim Taleb. Taleb is a statistician with practical risk experience. We’ve talked endlessly at Climate Audit about weird and inappropriate statistical methods, with frequent mentions of Mandelbrot, fractals and odd distributions. So does Taleb, albeit in a financial context; Mandelbrot himself sought fractals in both finance and nature (even analysing earlier versions of Mann’s tree ring data).

The introduction to Taleb’s article is as follows:

When Nassim Taleb talks about the limits of statistics, he becomes outraged. “My outrage,” he says, “is aimed at the scientist-charlatan putting society at risk using statistical methods.” As a researcher in probability, he has some credibility. In 2006, using FNMA and bank risk managers as his prime perpetrators, he wrote the following:

“The government-sponsored institution Fannie Mae, when I look at its risks, seems to be sitting on a barrel of dynamite, vulnerable to the slightest hiccup. But not to worry: their large staff of scientists deemed these events ‘unlikely.’”

Taleb recently accepted an academic appointment in an engineering department, describing the appointment as follows:

And Professor Bernanke [the present Federal Reserve chairman] indeed found plenty of economic explanations—what I call the narrative fallacy—with graphs, jargon, curves, the kind of facade-of-knowledge that you find in economics textbooks. (This is the kind of glib, snake-oil facade of knowledge—even more dangerous because of the mathematics—that made me, before accepting the new position in NYU’s engineering department, verify that there was not a single economist in the building. I have nothing against economists: you should let them entertain each other with their theories and elegant mathematics, and help keep college students inside buildings. But beware: they can be plain wrong, yet frame things in a way to make you feel stupid arguing with them. So make sure you do not give any of them risk-management responsibilities.)

Taleb has even had to resist demands to provide his own “reconstruction”.

Now you would think that people would buy my arguments about lack of knowledge and accept unpredictability. But many kept asking me “now that you say that our measures are wrong, do you have anything better?”

Here’s another paragraph about “self-published” negative results:

Go to a bookstore, and look at the business shelves: you will find plenty of books telling you how to make your first million, or your first quarter-billion, etc. You will not be likely to find a book on “how I failed in business and in life”—though the second type of advice is vastly more informational, and typically less charlatanic. Indeed, the only popular such finance book I found that was not quacky in nature—on how someone lost his fortune—was both self-published and out of print. Even in academia, there is little room for promotion by publishing negative results—though these are vastly more informational and less marred with statistical biases of the kind we call data snooping. So all I am saying is, “What is it that we don’t know”, and my advice is what to avoid, no more.

“Less marred by statistical biases of the kind we call data snooping.” My, my.

40 Comments

  1. Steve McIntyre
    Posted Sep 26, 2008 at 11:51 AM | Permalink

    This got buried in our crisis post and so I’ve moved it. At the end of the day, I’m going to turn off the crisis thread.

  2. BarryW
    Posted Sep 26, 2008 at 11:57 AM | Permalink

    Now you would think that people would buy my arguments about lack of knowledge and accept unpredictability. But many kept asking me “now that you say that our measures are wrong, do you have anything better?”

    Wow, talk about an appropriate comment for climate science.

  3. Austin
    Posted Sep 26, 2008 at 12:17 PM | Permalink

    Dick Feynman talked about fooling yourself. (Funny how we don’t call him Dr. Feynman!!)

    “The first principle is that you must not fool yourself – and you are the easiest person to fool.”

    “If we will only allow that, as we progress, we remain unsure, we will leave opportunities for alternatives. We will not become enthusiastic for the fact, the knowledge, the absolute truth of the day, but remain always uncertain… In order to make progress, one must leave the door to the unknown ajar.”

    “…there is one feature I notice that is generally missing in ‘cargo cult science’… It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty — a kind of leaning over backwards… For example, if you’re doing an experiment, you should report everything that you think might make it invalid–not only what you think is right about it… Details that could throw doubt on your interpretation must be given, if you know them.”

  4. Not sure
    Posted Sep 26, 2008 at 12:21 PM | Permalink

    Has anyone here read his book, The Black Swan: The Impact of the Highly Improbable?

  5. Jeremy
    Posted Sep 26, 2008 at 12:28 PM | Permalink

    …look at the business shelves: you will find plenty of books telling you how to make your first million, or your first quarter-billion, etc. You will not be likely to find a book on “how I failed in business and in life”—though the second type of advice is vastly more informational, and typically less charlatanic…

    Amazing how true that is. Perhaps I should mark myself in history and start writing about and e-publishing all my failures.

  6. Posted Sep 26, 2008 at 12:31 PM | Permalink

    I read “Fooled by Randomness”; it had some good info in it. For those of you interested, here’s a good podcast with him.

    http://www.econtalk.org/archives/2007/04/taleb_on_black.html

    I find that series of podcasts very informative and entertaining. Russ is an economist (or price theorist as he likes to say) but he is very interested in the limits of economic theorizing. I highly recommend the whole series, there are very few clunkers. You can also get them through iTunes under “econtalk.”

    • Barclay E. MacDonald
      Posted Sep 27, 2008 at 7:34 PM | Permalink

      Re: Isaac Crawford (#6),

      Thank you for the reference to the NT podcast. I’m a little slow to absorb it, but find it quite fascinating and recommend it as well.

  7. Paul Armstrong
    Posted Sep 26, 2008 at 12:34 PM | Permalink

    I read and really enjoyed “The Black Swan” on my summer holidays this year. (By the way, his name’s Nassim, not Nassif.)
    Lots of examples of “false & overconfident experts” in all sorts of fields (not just finance, although that’s where he started out). My immediate reaction was “I wonder why he hasn’t picked up on Climate Science as another classic example of the genre?” From his on-page persona, it didn’t strike me that he’d be the sort of guy to have any qualms about picking up such a political hot potato. I even thought about trying to contact him to ask him whether he’d ever investigated climate science in this light. But then did nothing about it, I’m afraid!

  8. Curt
    Posted Sep 26, 2008 at 1:55 PM | Permalink

    I too read “The Black Swan” this summer, and “Fooled by Randomness” earlier this year. Even though Taleb is not a great writer, I found the ideas in the books very compelling, so well worth my time. Overwhelmingly he is talking about financial markets, but I think his ideas have application in many other fields, and I was constantly reminded of issues in climate science as I read these books.

  9. Not sure
    Posted Sep 26, 2008 at 3:01 PM | Permalink

    I finally finished that essay, and it’s very readable even by those, like myself, with only a superficial understanding of statistics.

    He does touch on climate in passing, BTW. He lists “Climate” under complex payoffs with linear expectations, and later he says “The point can be used in climatic analysis. Things that have worked for a long time are preferable—they are more likely to have reached their ergodic states.” Maybe he means an economic “climate” and I’m misunderstanding him.

    • Eric (skeptic)
      Posted Sep 26, 2008 at 10:42 PM | Permalink

      Re: Not sure (#9), regarding climate, does Taleb account for climate black swans such as asteroid impacts, supervolcanoes, or nearby supernovas? Or are those not black swans, just inadequately observed/modeled physical processes?

      • John
        Posted Sep 28, 2008 at 10:02 PM | Permalink

        Re: Eric (skeptic) (#18),

        regarding climate, does Taleb account for climate black swans such as asteroid impacts, supervolcanoes, or nearby supernovas? Or are those not black swans, just inadequately observed/modeled physical processes?

        I think those are classified as dead swans.

  10. Sam
    Posted Sep 26, 2008 at 3:10 PM | Permalink

    Just a quick note – it’s Nassim, not Nassif.

  11. rk
    Posted Sep 26, 2008 at 3:17 PM | Permalink

    Some excerpts from Bloomberg “Bringing Down Wall Street as Ratings Let Loose Subprime Scourge”:

    “I view the ratings agencies as one of the key culprits,” says Joseph Stiglitz, 65, the Nobel laureate economist at Columbia University in New York. “They were the party that performed that alchemy that converted the securities from F- rated to A-rated. The banks could not have done what they did without the complicity of the ratings agencies.”

    “The part that became the most aggravating — personally irritating — is that CDO guys everywhere didn’t want to know fundamental credit analysis; they didn’t want to know from being in touch with the underlying asset,” says Adelson, 48, who quit Moody’s in January 2001 after being reassigned out of the residential mortgage-backed securities business. “There is no substitute for fundamental credit analysis.”

    AAA ratings on subprime mortgage investments can be traced to the rise on Wall Street of quantitative analysts, or quants, with advanced degrees in math, physics and statistics. They developed computer-driven models that didn’t rely on historical performance data, according to Raiter and others. If the old rating methods were like Rembrandt’s portraiture, with details painted in, the new ones were Monet impressionism, with only a suggestion of the full picture.

    • BarryW
      Posted Sep 26, 2008 at 6:57 PM | Permalink

      Re: rk (#11),

      There was an implication that the bond ratings were based in part on insurance taken out on the bonds. Of course this works until there are enough failures to swamp the insurance, which in turn implies that the assigned risk was way too low.

      This also has the smell of bad auditing practices from the last mess.

  12. bernie
    Posted Sep 26, 2008 at 4:16 PM | Permalink

    A while ago, I did a study for an insurance client of what makes for successful underwriters – these were P&C underwriters rather than financial vehicle underwriters. One key differentiator was that successful underwriters – based on loss ratios – demanded more physical data on the risk before simply using the rating books. Yes, they were skeptical and liked to do personal physical audits, e.g., doing drive-bys when looking at new types of risks!! This meant, though, that it cost more to underwrite a given risk – that is where things get tricky when the company is pushing short-term productivity metrics. Information costs money, and good information tends to cost more in the short run.

  13. Dishman
    Posted Sep 26, 2008 at 4:52 PM | Permalink

    There’s always a Black Swan in the last room of the Hilbert Hotel.

  14. Posted Sep 26, 2008 at 5:39 PM | Permalink

    Nassim Taleb:

    “My outrage,” he says, “is aimed at the scientist-charlatan putting society at risk using statistical methods…

    Funnily enough, that’s my outrage as well.

    • bender
      Posted Sep 26, 2008 at 9:44 PM | Permalink

      Re: John A (#14),
      John A and bender find common ground.

  15. Harry Eagar
    Posted Sep 26, 2008 at 9:34 PM | Permalink

    I have just put ‘The Black Swan’ on my reading list, but, slightly ironically, although you are not likely to find many books about “how I failed in business and in life,” with a little diligence you can find Roy Little’s “How to Lose $100 Million and Other Valuable Advice.”

    Well, make that a lot of diligence. Amazon never heard of Mr. Little’s book, I see. I guess that even if you write such a book, it is hard to get people to read it.

    If you do find it, I recommend it as one of the best business autobiographies. Little created Textron and he made the $100M back by buying Bell Helicopter just as the Vietnam War expanded.

    • philh
      Posted Sep 27, 2008 at 7:45 PM | Permalink

      Re: Harry Eagar (#16), For older, out-of-print or otherwise hard to find books, try ABEBooks.com. It’s a compendium of thousands of bookstores all over the world where I’ve never failed to find a book I’m looking for.

  16. masmit
    Posted Sep 27, 2008 at 4:46 AM | Permalink

    “Less marred by statistical biases of the kind we call data snooping.”

    I think I prefer John Brignell’s phrase “data-dredging”.

    http://www.numberwatch.co.uk/default.htm

  17. Mr. Kaos
    Posted Sep 27, 2008 at 5:56 AM | Permalink

    A lot of what Taleb and some others have written about is in essence a fundamental problem in modern society: the problem of uncertainty, or more specifically, the inability to understand what uncertainty is. This problem seems to afflict many in the scientific/mathematical fields (who have become drunk on the perceived endless power of models and numbers), and the general masses, who often eagerly accept causation based on some presented ‘narrative’ (e.g. a ‘rational’ explanation or a scientific/mathematical model).

    To use a phrase once made infamous by Rumsfeld: there is uncertainty that can be modeled through mathematics (i.e. ‘known unknowns’), and uncertainty that cannot be understood or modeled mathematically (i.e. ‘unknown unknowns’).

    For example, you can use a mathematical expression that succinctly and accurately represents all the uncertainty present in a die roll. E.g., the probability that the number ‘3’ will appear face up when I roll a fair die is 1 in 6, or about 0.167. This is a complete representation of all the uncertainty present in this system. I can use this figure confidently in my calculations and decision-making.
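    As a quick sanity check of the die example, a few lines of Python (my sketch, not part of the original comment) recover the 1-in-6 figure empirically:

```python
import random

random.seed(0)  # reproducible sketch

n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]

# Empirical frequency of rolling a 3 converges on the exact probability 1/6
freq = rolls.count(3) / n
print(f"empirical {freq:.3f} vs exact {1/6:.3f}")
```

    The point survives the simulation: for a die, the single number 1/6 really does capture all the uncertainty there is.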

    I cannot, however, represent all the uncertainty present in, say, a potential event such as a major disruption to a key supplier of General Electric. I can say something like it’s ‘low’ or ‘high’, or I can use statistical models and look at a bunch of similar suppliers, etc., but whatever I do, my uncertainty value is subjective to some degree. I cannot represent all the uncertainty present in this system succinctly and ‘accurately’ through a cardinal framework.

    Yet often, because the uncertainty value was derived through some ‘sophisticated’ method, we assume it’s an accurate representation.

    The key problem here is that mathematical/statistical models are not self-referential. They cannot tell you how well they are accommodating all the uncertainty in a system; they are just dumb tools that we choose how to apply.

    This is what I see in the climate science community. I believe the models, for all their sophistication, are subjective estimates/projections of what might happen in the future. The climate system has a number of highly complex non-linear phenomena at play, many of which we don’t understand (even AGWers admit this), yet we are supposed to accept their projections not as subjective, but as accurate representations of what will happen to our planet.

    We have to be extremely diligent about claims involving future projections and uncertainty. History has shown us many times that delusions of certainty, which are built on rationalizations, can be very painful & costly.

  18. Stephen
    Posted Sep 27, 2008 at 7:57 AM | Permalink

    Let me throw this one out there as a question. I don’t think this is true, but is what Taleb is talking about the mirror of the 1% solution? That is, if there is a chance of catastrophe you must guard against it. This is the argument used by:

    1) AGW proponents: even if there is only a chance, we have to do something
    2) The proponents of the Iraq war: if there is a chance that Saddam has WMD, we need to act pre-emptively
    3) Pascal arguing about God: you should convert on the chance that it is all true, to prevent yourself from going to Hell.

    Maybe it’s off topic, but this is all related to how you make decisions about unknown unknowns and guard against one-in-a-million chances of critical disaster. Thoughts?

  19. KevinUK
    Posted Sep 27, 2008 at 8:41 AM | Permalink

    Steve, Thanks for this very informative and interesting thread.

    I’ve just read Nassim Taleb’s (NT’s) Edge article. As I’ve mentioned in the past, I’m an ex-nuclear physicist who has been involved in the design of nuclear power plants (even got to commission one) and I can concur with NT’s statement that redundancy is vital to dealing with uncertainty. Although he doesn’t mention it, so is diversification. Without diversification, i.e. the use of physically different engineered means to prevent the nuclear reaction from running away with itself, you are leaving yourself open to a common-cause failure of all your shutdown protection systems.

    The unacceptable consequences of a significant nuclear accident involving the release of airborne radionuclides are now plain for all of us to see following the Chernobyl accident. Perhaps the financial sector has something to learn from the nuclear industry? The operating licences of nuclear facilities (excluding some military sites) are nowadays quite rightly dependent on the approval of their nuclear safety cases by a national regulatory body appointed to scrutinize them. To obtain a nuclear licence to operate the plant, the operator must demonstrate that adequate measures have been taken in the design of the plant, and in how it will be operated, to reduce all risks to a level that is as low as reasonably practicable (ALARP). Particular attention is placed on how very low likelihood events which could lead to high (unacceptable) consequences are to be dealt with. The philosophy in dealing with high risk (likelihood x consequence) events is to demonstrate that, as far as is reasonably practicable, they have been ‘designed out’. In the case of nuclear power plants this is only possible by providing redundant reactor shutdown systems that are reliant on different (hence diverse) physical processes, e.g. de-latching of control rods so that they fall under gravity into the reactor, injection of boron beads (for gas-cooled reactors) or boronated water (for PWRs) into the reactor, etc. No doubt NT will be aware of this in his new job teaching risk engineering to his students at NYU (I certainly hope so). However, mitigating such risks (along with dealing with the nuclear waste arising) costs money, and lots of it, so much so that it makes nuclear power generation uncompetitive compared to other forms of electricity generation like gas, coal or oil-fired generation.

    Apologies for my OT deviation thus far, but to link back to the subject of this thread: the application of the ALARP principle to the design and operation of nuclear plant is specifically intended to prevent operation in the ‘fourth quadrant’, as NT calls it. This is because nuclear safety regulators know that it is a very bad idea to operate in this quadrant (doing so can kill and injure lots of people). Perhaps we should therefore be consulting with them (nuclear safety regulators) for advice (e.g. on applying the ALARP principle) on how to get out of the current mess in our financial markets (and avoid getting into it again), a mess that is a consequence of failing to deal appropriately with uncertainty.

    KevinUK

  20. lucklucky
    Posted Sep 27, 2008 at 2:49 PM | Permalink

    The question is also: should we be protected against events that happen every 70 years? And what kinds of events warrant that protection? We don’t protect houses against magnitude 7 or 8 earthquakes, most of our planes now have only 2 engines (though they can fly with one), and when we drive on a road we are literally 2 or 3 m from death if another car comes at us. Of course the threshold is always wealth: without our wealth we wouldn’t have any earthquake protection at all.
    There is also the unfixable paradox that without crises we lose the learning that lets us fix them. And what is fixing a crisis? Not happening again?
    For example, does losing 1% of growth (let’s suppose that is the right number) every year for 70 years by limiting risky business offset a crash? I also think that crashes are much more dangerous when there is widespread poverty, not when there are lots of solutions; we have many more options now.

    In the end my greatest disappointment comes from schools.

  21. John T
    Posted Sep 27, 2008 at 8:45 PM | Permalink

    I know this is not Oprah’s book talk, but I think this is relevant to the discussion.

    Didier Sornette, Professor of Geophysics at UCLA, has, strangely enough, done work on finance. One book, entitled Why Stock Markets Crash, is interesting in that he uses the theory that cooperative herding and imitation create the environment leading to the bubble: a period of unsustainable rise and instability in pricing, and then the crash.

    Beside the obvious implications as it relates to the current financial crisis, I believe that it also pertains to the “Global Climate Crisis” and how powerful the effect of cooperative herding and imitation is within the Scientific Community.

    Here is an excerpt from the article A complex system view of why stock markets crash.

    “Positive feedback, collective behaviors and herding
    In a culmination of more than ten years of research on the science of complex system, we have thus challenged the standard economic view that stock markets are both efficient and unpredictable. The main concepts that are needed to understand stock markets are imitation, herding, self-organized cooperativity and positive feedbacks, leading to the development of endogenous instabilities. According to this theory, local effects such as interest raises, new tax laws, new regulations and so on, invoked as the cause of the burst of a given bubble leading to a crash, are only one of the triggering factors but not the fundamental cause of the bubble collapse. We propose that the true origin of a bubble and of its collapse lies in the unsustainable pace of stock market price growth. As a speculative bubble develops, it becomes more and more unstable and very susceptible to any disturbance.”

    http://www.ess.ucla.edu/faculty/sornette/prediction/index.asp#prediction

    His work contains a heavy dose of Mathematics and Statistics.

    Publications:
    – Complex Systems
    – Discrete Scale Invariance & Complex Exponents
    – Earthquakes & Ruptures
    – Finance

    Books:
    – Mechanisms of Scale Invariance and Beyond
    – Critical Phenomena in Natural Sciences (Textbook)
    – Why Stock Markets Crash?
    – Extreme Financial Risks: From Dependence to Risk Management (Textbook)

    Predictions:
    – The future of the USA stock market
    – Is There a Real-Estate Bubble in the US? (released 3rd June 2005)
    – The future of the UK and US real estate market (released March 2003)
    – A complex system view of why stock markets crash
    – Scientific Prediction of Catastrophes: A New Approach
    – The end of the growth era

  22. Barclay E. MacDonald
    Posted Sep 28, 2008 at 1:12 PM | Permalink

    For another fascinating angle on this I suggest http://docartemis.com/brainsciencepodcast/ and listen to podcasts #42 and 43 “On Being Certain”.

  23. N
    Posted Sep 28, 2008 at 6:43 PM | Permalink

    Now this is funny, in a black sort of way:
    from Nobody Expects the Spanish Inquisition—or Do They?, a book review of The Black Swan
    http://www.slate.com/id/2167993/
    Oddly, Taleb’s argument is weakest in the area he knows best, namely finance. Only on Wall Street do people seem to give proper credence—not too much, not too little—to very unlikely events.
    (written in June 2007)

  24. mundus
    Posted Sep 29, 2008 at 2:48 PM | Permalink

    Interesting thread!
    In economics, forecasts play quite an important role, because if you are able to predict the future of markets you will have a great competitive advantage. But the informative value of these predictions is excessively overrated, as Taleb rightly points out.
    There are, for instance, many thousands of stock and investment analysts working in the financial capitals around the world who do nothing all day but paste data into their mathematical models (like the GCMs) to predict the future price of shares. If their predictions were regularly reliable, all those analysts would be billionaires, and their clients too. In fact, they are not. In most cases these forecasts aren’t worth the paper on which they are printed.
    And what can we learn from this? If in economics, where you have at least a few well-documented data series and clearly defined parameters, modelling leads nowhere, it will also lead nowhere in climate science, which deals with much more complex processes.

  25. Steve
    Posted Oct 3, 2008 at 6:44 AM | Permalink

    I’m not so sure that Taleb’s portrait of economists is at all accurate. After all, it was economists who pointed out that stock market price fluctuations on varying time scales scale as the square root of time, à la Brownian motion. The random walk theory as applied to stock prices has been around for over half a century, yet we still have technical analysts reading tea leaves of market movements, supporting the illusion that trading profits are evidence of skill and not just dumb luck. Modern finance is just applied economics, so I don’t see where Taleb gets his ideas. Financial product innovations probably caused the financial crisis, but if you remember Buffett’s old saw, “first there are the innovators, then the imitators, then the idiots”, you can see who got us into the mess. But the failure doesn’t mean that the innovations were flawed in principle — spreading risk is always good — it just means that the last on board the innovation train, the late adopters, were not as skilled in appropriately evaluating risk.
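    The square-root-of-time point is easy to demonstrate numerically. The sketch below (mine, not Steve’s) sums independent one-step returns and checks that the standard deviation of the t-step return grows like sqrt(t):

```python
import math
import random

random.seed(1)  # reproducible sketch

def t_step_return(t):
    """Sum of t iid one-step returns, each N(0, 1)."""
    return sum(random.gauss(0, 1) for _ in range(t))

t, n_paths = 100, 4000
samples = [t_step_return(t) for _ in range(n_paths)]

mean = sum(samples) / n_paths
std = math.sqrt(sum((x - mean) ** 2 for x in samples) / n_paths)

# Under a Brownian-motion model, std should be close to sqrt(t) = 10
print(f"sample std {std:.2f}, sqrt(t) = {math.sqrt(t):.2f}")
```

    Of course, this only confirms the behaviour of the Gaussian model; Taleb’s complaint is precisely about what happens when real returns are not Gaussian.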

  26. marcusb
    Posted Oct 13, 2008 at 10:14 AM | Permalink

    Here is what Dr. Taleb has to say about sensible actions to combat global climate disruption:

    “Correspondents keep asking me if the climate worriers are basing their claims on shoddy science, and whether, owing to nonlinearities, their forecasts are marred with such a possible error that we should ignore them. Now, even if I agreed that it were shoddy science; even if I agreed with the statement that the climate folks were most probably wrong, I would still opt for the most ecologically conservative stance — leave planet earth the way we found it. Consider the consequences of the very remote possibility that they may be right, or, worse, the even more remote possibility that they may be extremely right.”

    Nice to see 32 comments passed before someone stopped chin-stroking and actually found out what the man thinks!

  27. Barclay E. MacDonald
    Posted Oct 13, 2008 at 4:18 PM | Permalink

    #33

    So nice to see no reference for your quote!

    In any event the quote makes no sense to me. What is “the most ecologically conservative stance”? What level of resources would one allocate to achieve that end? Moreover, if the black swan is unpredictable in both its nature and its time of occurrence, how do we know whether it is more or less likely to occur based on the action we take? Any action! It may only be the fact that we took action to achieve “the most ecologically conservative stance” that caused the black swan to occur.

    On the other hand, sounds like a good idea.

  28. davidc
    Posted Oct 13, 2008 at 9:54 PM | Permalink

    For example:

    “for socio-economic and other nonlinear, complicated variables, we are riding in a bus driven a blindfolded driver, but we refuse to acknowledge it in spite of the evidence, which to me is a pathological problem with academia”

    (http://rs.resalliance.org/2008/09/17/financial-resilience-taleb-and-mandelbrot-reflect-on-crisis/)

    I don’t know how well informed he is on climate change, but based on what I know this comment supports my extremely skeptical position.

  29. marcusb
    Posted Oct 14, 2008 at 9:54 AM | Permalink

    #33

    Here is the reference for you; it’s the last para.

    http://www.edge.org/q2008/q08_17.html#taleb

    Having read the book (unlike most of these commenters), I would suggest that Taleb is mainly attacking the statistical methods used by economists and social scientists…

    It’s a great read, although I got bogged down a few times, and apparently it was the best-selling non-fiction book on Amazon for 2007.

    I guess he has probably sold a few this year too!

    For your info, this is his homepage…

    http://www.fooledbyrandomness.com/

    and the wiki with some good links

    http://en.wikipedia.org/wiki/The_Black_Swan_(book)

  30. JohnT
    Posted Mar 20, 2009 at 3:15 PM | Permalink

    Interesting take on models.

    Speaking of his Gaussian copula function, Li himself said of his own model: “The most dangerous part is when people believe everything coming out of it.”

    Recipe for Disaster: The Formula That Killed Wall Street

    Nassim Taleb is quoted towards the end of this article. It is a very interesting read. You could almost substitute the words climate model for financial model in a few parts.

  31. Hu McCulloch
    Posted Mar 21, 2009 at 11:11 AM | Permalink

    RE John T, #37,
    Thanks, John, for this very interesting link.

    However, while Li’s formula may be a small part of the current problems, I think an even bigger contribution has been the Black Scholes option pricing formula. This really did win a Nobel Prize, but for Myron Scholes and Robert Merton only, since Fischer Black had died prematurely before the prize was awarded.

    The formula is undoubtedly correct given its assumption that stocks are governed by Gaussian Brownian motion, but the problem is that people have bought into the idea that because the solution was so elegant, the assumptions must be correct. The whole concept of a “hedge fund” that somehow makes clever trades that eliminate risk is based on the Black/Scholes concept of a “hedge ratio.” It is no accident that Myron Scholes and Robert Merton went on to found the notorious Long-Term Capital Management hedge fund.
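    For reference, the Black/Scholes call formula is short enough to state in full. The sketch below is a standard textbook rendering (not anything specific to Hu’s papers or to LTCM); the N(d1) term is the “hedge ratio” (delta) referred to above:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, rate, sigma, t):
    """Black/Scholes price of a European call, assuming Gaussian Brownian motion."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    # norm_cdf(d1) is the hedge ratio: shares held per option to "eliminate" risk
    return spot * norm_cdf(d1) - strike * math.exp(-rate * t) * norm_cdf(d2)

price = bs_call(spot=100, strike=100, rate=0.05, sigma=0.20, t=1.0)
print(f"{price:.2f}")  # textbook value: about 10.45
```

    The elegance is real; the danger, as the comment notes, is in treating the Gaussian assumption behind it as equally real.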

    However, Benoit Mandelbrot has long insisted that the “Paretian” stable distributions are better able to model the fat tails that are observed in practice for stock price movements etc. The Generalized Central Limit Theorem states that if the sum of iid contributions has a limiting distribution, the limit must be a member of the stable class. The Gaussian distribution is only one member of this class, the one with the shortest tails, and the only one with a finite variance.

    A continuous time process with stable increments is full of discontinuities, unlike a Gaussian diffusion process, which is a continuous function of time. Most of these discontinuities are quite small, and just look like a diffusion. But occasionally a big one occurs. These are the events Taleb characterizes as “Black Swans”, but contra Taleb, they are mathematically quantifiable. See my 1978 J. Business paper.
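    The fat-tail contrast can be illustrated with the one non-Gaussian stable law that has a simple closed form, the Cauchy distribution (the α = 1 stable), against the Gaussian (α = 2). This is my illustrative sketch, not taken from the papers cited:

```python
import math
import random

random.seed(2)  # reproducible sketch
n = 100_000

gauss = [random.gauss(0, 1) for _ in range(n)]
# Standard Cauchy draws via the inverse CDF: tan(pi * (U - 1/2))
cauchy = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]

# Count "black swan" moves beyond 5 units in each sample
gauss_tail = sum(abs(x) > 5 for x in gauss)
cauchy_tail = sum(abs(x) > 5 for x in cauchy)
print(gauss_tail, cauchy_tail)  # Gaussian: essentially none; Cauchy: thousands
```

    The extreme moves are rare event by event, but in the stable world they are quantifiable rather than unthinkable, which is the point being made against Taleb here.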

    I’ve done a lot of work on using stable distributions to model financial uncertainty, but there has been essentially zero interest on Wall St. in this approach. See the recent papers on my webpage, and the references to earlier papers there. I’ll add some PDFs of these papers when I get a chance.

    See especially my “Financial Applications of Stable Distributions”, Handbook of Statistics Vol 14, which develops an option pricing model for log-stable distributions, despite the misgivings of Samuelson and Merton on the feasibility of this. Also, my 1985 J Banking and Finance paper, which applies this model to the problem of evaluating deposit insurance for banks and thrifts. The same approach could be applied to counterparty risk on credit swaps, etc.

  32. W F Lenihan
    Posted Mar 31, 2009 at 12:28 PM | Permalink

    There is an excellent article in Wired (March 3, 2009), “Recipe for Disaster: The Formula That Killed Wall Street.”

    http://www.wired.com/techbiz/it/magazine/17-03/wp_quant?currentPage=1

    This article analyzes how David X. Li’s work caused the worst economic meltdown since the Great Depression. Its opening paragraphs set the stage:

    “A year ago, it was hardly unthinkable that a math wizard like David X. Li might someday earn a Nobel Prize. After all, financial economists—even Wall Street quants—have received the Nobel in economics before, and Li’s work on measuring risk has had more impact, more quickly, than previous Nobel Prize-winning contributions to the field. Today, though, as dazed bankers, politicians, regulators, and investors survey the wreckage of the biggest financial meltdown since the Great Depression, Li is probably thankful he still has a job in finance at all. Not that his achievement should be dismissed. He took a notoriously tough nut—determining correlation, or how seemingly disparate events are related—and cracked it wide open with a simple and elegant mathematical formula, one that would become ubiquitous in finance worldwide.”

    “For five years, Li’s formula, known as a Gaussian copula function, looked like an unambiguously positive breakthrough, a piece of financial technology that allowed hugely complex risks to be modeled with more ease and accuracy than ever before. With his brilliant spark of mathematical legerdemain, Li made it possible for traders to sell vast quantities of new securities, expanding financial markets to unimaginable levels.”

    “His method was adopted by everybody from bond investors and Wall Street banks to ratings agencies and regulators. And it became so deeply entrenched—and was making people so much money—that warnings about its limitations were largely ignored.”

    “Then the model fell apart. Cracks started appearing early on, when financial markets began behaving in ways that users of Li’s formula hadn’t expected. The cracks became full-fledged canyons in 2008—when ruptures in the financial system’s foundation swallowed up trillions of dollars and put the survival of the global banking system in serious peril.”
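    For readers unfamiliar with the construction Wired describes: a Gaussian copula couples individual default probabilities through correlated normal variables, so that defaults cluster. A minimal Monte Carlo sketch of the idea, with illustrative numbers (5% marginal default probability, copula correlation 0.6) that are not taken from Li’s paper:

```python
import numpy as np
from math import sqrt

rng = np.random.default_rng(42)
n = 200_000   # Monte Carlo scenarios
p = 0.05      # marginal one-period default probability for each credit
rho = 0.6     # Gaussian copula correlation (illustrative)

# Correlated standard normals via a one-factor construction:
#   z_i = sqrt(rho) * m + sqrt(1 - rho) * e_i,  m = common factor
m = rng.standard_normal(n)
z1 = sqrt(rho) * m + sqrt(1 - rho) * rng.standard_normal(n)
z2 = sqrt(rho) * m + sqrt(1 - rho) * rng.standard_normal(n)

# A credit defaults when its latent normal falls below the p-quantile.
q = -1.6449  # Phi^{-1}(0.05), the 5% quantile of the standard normal
joint = np.mean((z1 < q) & (z2 < q))

print(f"independent joint default prob: {p * p:.4f}")
print(f"copula joint default prob:      {joint:.4f}")
```

    The copula multiplies the joint default probability several times over the independence case; the danger Wired describes is that the single correlation input was estimated from a few calm years of market data and then trusted for extreme events.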

    There are significant similarities between Li’s work and the GCMs that the AGWers, media and policy makers ignore at their peril. If they get their way, trillions of dollars will be squandered and our economy destroyed.

    The closing paragraphs from this article put all modelers, especially climate modelers, and their work in perspective:

    “Li has been notably absent from the current debate over the causes of the crash. In fact, he is no longer even in the US. Last year, he moved to Beijing to head up the risk-management department of China International Capital Corporation. In a recent conversation, he seemed reluctant to discuss his paper and said he couldn’t talk without permission from the PR department. In response to a subsequent request, CICC’s press office sent an email saying that Li was no longer doing the kind of work he did in his previous job and, therefore, would not be speaking to the media.”

    “In the world of finance, too many quants see only the numbers before them and forget about the concrete reality the figures are supposed to represent. They think they can model just a few years’ worth of data and come up with probabilities for things that may happen only once every 10,000 years. Then people invest on the basis of those probabilities, without stopping to wonder whether the numbers make any sense at all.”

  33. Anton
    Posted May 25, 2010 at 1:04 PM | Permalink

    Sorry to tell you this, Phil, but no one uses a “world of Normal and Poisson distributions” in complex finance. And these days most new statistical approaches come from finance: multi-PDE resolution, data-mapping optimization, non-linear models, Monte Carlo simulations, …
    And the example of David Li’s “Gaussian copula” as a key element of the current crisis is ridiculous and far from reality. The credit market collapse is not the collapse of “derivatives”, only the signal that risk was priced too cheaply in our over-indebted, zero-inflation, highly competitive bank/credit industry world.
    When the real estate market ceased, two years ago, to maintain its double-digit yearly growth trend, no complex quant model was needed to forecast the coming crisis.
    Put a 1% or 2% shift into the simple compounded-discount formula of any Mortgage-Backed Security vehicle with a 30-50 year maturity and you will face a 20%-25% up-front loss of value. And even if it is only an “actuarial” loss, once registered on the balance sheet of the vehicle, I let you imagine the consequences for the vehicle’s funding capability and the sustainability of the credit market…
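    Anton's arithmetic checks out. A minimal sketch with illustrative numbers (a level payment stream, annual compounding, a 2-point rate rise from 5% to 7%) shows the 20%-25% revaluation loss he describes:

```python
def annuity_pv(rate, years):
    """Present value of 1 per year for `years` years at annual `rate`."""
    return (1 - (1 + rate) ** -years) / rate

for years in (30, 50):
    pv0 = annuity_pv(0.05, years)   # valued at 5%
    pv1 = annuity_pv(0.07, years)   # revalued after a 2-point rate shift
    loss = 1 - pv1 / pv0
    print(f"{years}-year stream: value drops {loss:.1%} on a 2-point rate rise")
```

    A 30-year stream loses roughly 19% of its value and a 50-year stream roughly 24%: the longer the maturity, the more violently present value responds to a small change in the discount rate, no copula required.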

  34. phil wilson
    Posted Sep 27, 2008 at 2:50 PM | Permalink

    Re: eric (#18),

    Taleb does not address such specifics, however his “Black Swan” paradigm encompasses all such unexpected events. From memory, I recall that he calls the well-ordered world of normal and Poisson distributions “Mediocristan”, and most of the real world, where such methods should not be (but often are) applied, “Extremistan”. People (economists, financial traders) misuse statistics by applying scientific-seeming methods where they’re invalid. In his opinion a key factor in the financial meltdown was that we created complex financial instruments whose risks we under-estimated and did not understand, despite the PhD methodologies of the ‘quants’. I hope I haven’t butchered his well thought out iconoclastic thinking too badly.

    Read him any chance you get. He disses his own “Fooled by Randomness” book, saying read “The Black Swan” instead. The referenced essay is timely and worth spending a few thoughtful minutes to read.
