InfoGap Economics
I imagine that you have heard of Voodoo Economics, also known as Reaganomics.
But have you heard of InfoGap Economics?!
I suppose not, so here is the bottom line.
InfoGap Economics is a newly coined phrase designating a Voodoo Economic Theory that is based on (indeed is part of) InfoGap decision theory. InfoGap Economics claims to provide economists and policy analysts with a new method for tackling severe uncertainty.
The objective of this page is to explain what makes InfoGap economics a voodoo economic theory par excellence.
The first thing you need to do to uncover the facts about this theory is to peel away the thick layers of fog, spin and empty rhetoric that are the mainstay of InfoGap publications. Once the hollow verbiage about uncertainty, the place of human judgment in decision-making, etc. is cleared away, one discovers that InfoGap's prescription for the modeling, treatment, and management of severe uncertainty in effect boils down to this: ignore the severity of the uncertainty!
And what is more, one discovers that the purportedly new method that this theory offers to tackle severe uncertainty, namely InfoGap's robustness model, is in fact not new. Indeed, this model is a simple instance of Wald's famous Maximin model (circa 1940). This simple instance is known universally as Radius of Stability (circa 1960).
So, the only idea that is truly "new" in InfoGap Economics is the preposterous proposition that a local robustness analysis (which is in fact a Maximin analysis, in other words a worst-case analysis) confined entirely to the immediate neighborhood of a wild guess is the proper way to evaluate the robustness of decisions against severe uncertainty. That is, InfoGap Economics' thesis is that such an analysis provides decision makers with a sound basis for economic, monetary, environmental, etc. policy-making under conditions of severe uncertainty.
If this is not a voodoo decision theory and Voodoo Economics par excellence, what is?!
Overview
The specific trigger that prompted the creation of this page was this statement (see the PDF file):
Description. After every crisis economists and policy analysts ask: can better models help prevent or ameliorate such situations? This book is an answer. Yes, quantitative models can help if we remember that they are rough approximations to a vastly more complex reality. Models can help if we include realistic but simple representations of uncertainty among our models. Models can help if we retain the preeminence of human judgment over the churning of our computers.
Infogap theory is a new method for modelling and managing severe uncertainty. The core of the book presents detailed examples of infogap analysis of decisions in monetary policy, financial economics, environmental economics for pollution control and climate change, estimation and forecasting.
This description foreshadowed the publication by Palgrave of a new book entitled: InfoGap Economics: An Operational Introduction.
On the Palgrave website, the Product Description of this book is similar, with the last sentence making the following claim:
This book is essential reading for economic policy analysts and researchers.

It is important therefore to caution all those, especially economic policy analysts and researchers, who may find the sobriquet InfoGap Economics intriguing, that not only is the method advanced in this book fundamentally flawed, it gives a thoroughly distorted picture of the state-of-the-art in economic research and analysis, particularly with regard to the tools available to economic policy analysts and researchers for the treatment of severe uncertainty.
In what follows I justify my position.
However, before I can proceed to do this, I ought to make it clear that my critique of InfoGap decision theory in its entirety applies down to the last detail to InfoGap Economics. This is so because, as indicated by the statements advertising this book, the book's main business is to illustrate the application of InfoGap Decision Theory in economic analysis and research. In other words, all that InfoGap Economics actually comes down to is a discussion of a number of examples outlining how InfoGap decision theory is applied to certain problems. So, on the face of it, it may appear that a critique of InfoGap Economics is redundant given that a detailed critical analysis of InfoGap decision theory which includes detailed critiques of InfoGap applications (see my review reports) is already available.
My review report on the book is now available. The point of this page, then, is to urge the group of readers targeted by this book to read carefully my critique of InfoGap Decision Theory, as this is a prerequisite for getting a clear and accurate picture of what InfoGap Economics is all about.
InfoGap decision theory
For the benefit of readers who are not familiar with InfoGap Decision Theory, it ought to be pointed out that the trait that earns this theory the title voodoo decision theory is its turning a blind eye to the universal

GIGO Axiom: Garbage In, Garbage Out.

This means, of course, that InfoGap Decision Theory is in contravention of the well-known maxim:

GIGO Corollary: The results of an analysis are only as good as the estimates on which they are based.

Because the InfoGap rhetoric would have us believe that an analysis conducted in the immediate neighborhood of a wild guess can generate results that are ... meaningful, worthwhile, useful, reliable, etc.!!!
Also note that although InfoGap decision theory is claimed to seek decisions that are robust against severe uncertainty, not a single reference is made in the InfoGap literature to the immediately relevant thriving field of Robust Optimization.
This of course is incomprehensible and inexcusable!
To learn more about these facts I urge you to read the following pages:
 FAQs about InfoGap decision theory.
 A Second Opinion about InfoGap decision theory.
 Reviews of InfoGap publications.
You should also consult the articles and presentations on InfoGap decision theory that are listed at the bottom of this page.
InfoGap Economics
The question is then: what is InfoGap Economics?
To answer this question observe that InfoGap Economics claims to address the following challenging problem:
How should economic systems that are subject to severe uncertainty be managed?

More precisely, the question is this:

How should the severe uncertainty afflicting economic systems be modelled, analyzed and managed?

The prescription put forward by InfoGap Economics for this purpose is exceedingly simple:
 Ignore the severity of the uncertainty under consideration!
 Focus the analysis on the immediate neighborhood of a point estimate of the parameter of interest.
Of course you may wonder how such a recipe for managing severe uncertainty can be contemplated at all?! After all, isn't it clear to all that:
 Under severe uncertainty the point estimate of the parameter of interest is a poor indication of the true value of the parameter and is likely to be substantially wrong. Often the point estimate is a wild guess.
 The results of an analysis are only as good as the estimates on which they are based.
 It is important to include rare events in a risk analysis of economic systems that are subject to severe uncertainty.
As we shall see below, the absurdity of InfoGap Economics' approach to severe uncertainty becomes clear once you realize that the supposedly new and revolutionary robustness model deployed by InfoGap Economics (in fact a robustness model invented at least 50 years ago) was devised expressly to model the robustness of systems against small perturbations in the value of a parameter. This is the Radius of Stability model, which has been used extensively, for many years now, in numerical analysis, applied mathematics, control theory and parametric optimization.
In InfoGap Economics the "reinvented" Radius of Stability model has been stood on its head: instead of measuring robustness against small perturbations, it is employed to measure robustness against severe uncertainty.
I argue that this glaring misapplication of the Radius of Stability model earns InfoGap Economics the title Voodoo Economics.
I should point out that the concept "Radius of Stability" is not mentioned in the InfoGap Economics book. So, indications are that the author was unaware that his "new" robustness model is in fact a well-known local robustness model.
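To make the local, small-perturbation nature of this model concrete, here is a minimal sketch in Python. The toy system and all numbers are invented purely for illustration: the radius of stability of a nominal parameter value is the largest perturbation radius within which the system is guaranteed to remain stable, that is, the distance from the nominal value to the boundary of the stability region.

```python
def stable(a):
    # Invented toy system: x[k+1] = a * x[k] is stable iff |a| < 1.
    return abs(a) < 1.0

def radius_of_stability(nominal, holds, hi=10.0, tol=1e-9):
    """Largest r such that `holds` is satisfied at every parameter value
    within distance r of `nominal` (found by bisection on r)."""
    def ok(r):
        # In this one-dimensional toy the stability region is an interval,
        # so checking the endpoints of [nominal - r, nominal + r] suffices.
        return holds(nominal - r) and holds(nominal + r)
    lo = 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ok(mid):
            lo = mid
        else:
            hi = mid
    return lo

# The radius of stability of the nominal value 0.4 is its distance to the
# boundary of the stability region (-1, 1), i.e. approximately 0.6.
print(radius_of_stability(0.4, stable))
```

Note that everything in this computation happens in the immediate vicinity of the nominal value: parameter values far from it play no role whatsoever.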
And before I proceed to discuss the Radius of Stability connection, a few words about the local nature of the robustness analysis prescribed by InfoGap Economics.
Local Robustness
Let me point out that, as attested by Infogap publications including those dealing with economic questions, their authors are well aware that InfoGap's analysis actually comes down to this:
[Figure: a small red neighborhood around the estimate û, embedded in a vast black region labeled No Man's Land, spanning the complete region of uncertainty under consideration]

where
 û denotes the estimate of the parameter of interest,
 the black area represents the complete region of uncertainty under consideration,
 the red area around û represents the region of uncertainty that actually affects the results generated by InfoGap's robustness analysis,
 the vast No Man's Land represents that part of the complete region of uncertainty that has no impact whatsoever on the results generated by InfoGap's robustness model.
Yet, the claims made about InfoGap decision theory, hence about InfoGap Economics, are that it provides reliable means for achieving essential goals (color is mine):
If "rationality" means choosing an action which maximizes the best estimate of the outcome, as is assumed in much economic theory, then info-gap robust-satisficing is not rational. However, in a competitive environment, survival dictates rationality. In section 11.4 we will show that, for a wide range of situations, the robust-satisficer is more likely to survive than the direct optimizer. If "rationality" means selecting reliable means for achieving essential goals, then info-gap robust-satisficing is extremely rational.
Ben-Haim (2006, pp. 100-101)
This is yet another reason for InfoGap decision theory qualifying for the title Voodoo Decision Theory. The picture speaks for itself.
The point here is of course that such claims about InfoGap's robustness analysis effectively attribute to it the extraordinary ability to generate reliable robust decisions out of a wild guess.
Conventional Science:  wild guess --> Model --> wild guess

InfoGap Economics:  wild guess --> Robustness Model --> reliable robust decision

So, if we accept this claim, we may as well wind up the discipline of Decision-Making Under Severe Uncertainty and declare it redundant. For, dealing with decision-making problems subject to severe uncertainty would now amount to child's play:
1-2-3 foolproof recipe for decision-making under severe uncertainty
 Ignore the severity of the uncertainty.
 Focus instead on the neighborhood of your best estimate of the parameter of interest.
 Don't worry if you lack an estimate, a wild guess will do.**
** Should you need it, the recipe for obtaining a wild guess is simplicity itself:
 Wet your index finger and put it in the air.
 Think of a number and double it.
See it online at wiki.answers.com/Q/What_is_best_estimate_and_how_do_i_calculate_it.
As for rare events:
Consider this piece of vintage InfoGap rhetoric:
Rare events in probabilistic models are described by the tails of the distribution, while probability distributions are usually specified in terms of mean and mean-variation parameters. This makes probabilistic models risky design tools, since it is rare events, the catastrophic ones, which must underlie the reliable design.
Ben-Haim (2006, pp. 330-331)
In other words, InfoGap decision theory claims that probabilistic models are risky and unreliable design tools because the insufficient weight given in these models to catastrophic rare events distorts the robustness analysis.
Such statements clearly create the impression that  in sharp contrast to probabilistic models  InfoGap's robustness model indeed does incorporate catastrophic rare events in the robustness analysis.
The question is then: What is InfoGap decision theory's approach to rare events? How exactly does InfoGap decision theory incorporate catastrophic rare events in its robustness analysis?
And the answer to this is simple in the extreme: InfoGap decision theory's prescription for dealing with rare events is to simply ignore them.
The nice thing about this simple answer is that you need not be a mathematician to be able to put it across or to grasp it.
All you need to do is to consult again the picture presented above as it speaks volumes about the failings of InfoGap's robustness model.
[Figure repeated: a small red neighborhood around the estimate û, embedded in a vast black region labeled No Man's Land, spanning the complete region of uncertainty under consideration]

where, as you recall, û denotes the estimate of the parameter of interest, the black area represents the complete region of uncertainty under consideration, the red area around û represents the region of uncertainty that actually affects the results generated by InfoGap's robustness analysis, and the vast No Man's Land represents that part of the complete region of uncertainty that has no impact whatsoever on the results generated by InfoGap's robustness model.
The obvious question is of course: how can InfoGap's robustness analysis possibly deal with rare events if it is confined to the neighborhood of a given estimate? Are we to assume that rare events would be located in the neighborhood of the estimate û? Or, should we perhaps assume that the estimate itself represents a rare event?
The fact is, however, that raising these questions is pointless because far more basic questions about the estimate û, requiring a more urgent clarification, remain unattended to, hence unanswered. For one thing, InfoGap decision theory does not even bother to address the more basic question of how the value of the estimate û is determined. As you will recall, this estimate is the most crucial element of InfoGap's uncertainty, robustness, and opportuneness models. Yet, all that InfoGap does is simply to assume that the value of û is given. End of story.
In summary, InfoGap Economics is the exact antithesis of what an economic theory for decision-making under severe uncertainty ought to be.
The reinvention of the Radius of Stability
In view of the claims hailing InfoGap Economics as "new", it is important to keep in mind what its two core models are. To be precise:
 InfoGap's robustness model is a simple Maximin model (circa 1939).
 InfoGap's opportunity model is a simple Minimin model (circa 1950).
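For readers who may not be familiar with these two classical models, here is a minimal sketch in Python, with an invented cost table used purely for illustration. Stated for costs, Wald's Maximin model takes a minimax form (guard against the worst case), while the Minimin model bets outright on the best case:

```python
# Invented cost table: costs[d][s] = cost of decision d if state-of-nature s obtains.
costs = {
    "conservative": [3, 3, 3],
    "gamble":       [0, 5, 9],
}

# Wald's worst-case rule (Maximin, stated here for costs, hence minimax):
# pick the decision whose WORST-case cost is smallest.
worst_case_choice = min(costs, key=lambda d: max(costs[d]))

# Minimin rule: pick the decision whose BEST-case cost is smallest
# (pure best-case optimism).
best_case_choice = min(costs, key=lambda d: min(costs[d]))

print(worst_case_choice, best_case_choice)
```

On this invented table the worst-case rule picks the safe decision ("conservative", whose worst cost is 3 versus 9), while the best-case rule picks "gamble" (whose best cost is 0 versus 3), which is precisely the difference between the two models.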
As a matter of fact, if you are familiar with control theory, you'll notice that InfoGap's robustness is equivalent to the Radius of Stability of the feasible region associated with InfoGap's performance requirement. This is because, according to InfoGap decision theory, the robustness of a decision is equal to the distance between a wild guess of the parameter of interest and the boundary of the feasible region determined by the performance requirement.
Here is the picture:

[Figure: InfoGap's robustness model and the Radius of Stability model, side by side. Find the differences!]
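The equivalence is easy to check numerically. In the following sketch (a toy one-dimensional example; the performance requirement, the estimate and all numbers are invented for illustration), InfoGap's robustness of the estimate û, namely the largest horizon of uncertainty α such that the performance requirement holds for every u within distance α of û, comes out equal to the distance from û to the boundary of the feasible region:

```python
def requirement(u):
    # Invented performance requirement; the feasible region is [-2, 2].
    return u * u <= 4.0

u_hat = 0.5  # the point estimate of the uncertain parameter

def infogap_robustness(estimate, req, hi=100.0, tol=1e-9):
    """Largest alpha such that req(u) holds for ALL u with
    |u - estimate| <= alpha (found by bisection on alpha)."""
    def ok(alpha):
        # One-dimensional toy with an interval-shaped feasible region,
        # so it suffices to check the endpoints of the interval.
        return req(estimate - alpha) and req(estimate + alpha)
    lo = 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ok(mid):
            lo = mid
        else:
            hi = mid
    return lo

alpha_hat = infogap_robustness(u_hat, requirement)
distance_to_boundary = min(abs(u_hat - (-2.0)), abs(u_hat - 2.0))
print(alpha_hat, distance_to_boundary)  # both are 1.5 (up to the bisection tolerance)
```

In other words, the "robustness" computed by the model is nothing but the distance from the wild guess to the nearest point where the requirement is violated, a strictly local quantity.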
So, what is new?
Given the Maximin and Radius of Stability connections, you may well wonder: what is new here? What new ideas does InfoGap Economics put forward?
The answer to this is that, other than the rhetoric proclaiming InfoGap Decision Theory, hence InfoGap Economics, as new, all that is actually new here is (as indicated above) the absurd proposition to conduct a Maximin analysis in the immediate vicinity of a wild guess and to declare the result the basis for sound economic, monetary, etc. policy-making under severe uncertainty.
Conclusions
To sum up:
 Given that InfoGap's robustness model is a simple Radius of Stability model; and
 given that the Radius of Stability model is designed for the analysis of small perturbations in the value of a parameter,
the new InfoGap book entitled InfoGap Economics should have given us cogent arguments to answer the following two simple questions:
 In what way is the method proposed in the book new?
 On what grounds can one claim that InfoGap Economics actually addresses the difficulties posed by severe uncertainty?
We shall have to wait and see how the author and perhaps the publisher will handle this issue.
Stay tuned!
What's next?
Who knows?!
There seems to be no limit to what InfoGap's rhetoric can create out of a wild guess, so essentially ... the sky is the limit!
Indeed, the more prolix the rhetoric, the greater the chances of getting books based on such rhetoric published!
So if you wonder how it is that a flawed theory such as InfoGap decision theory keeps on going, the answer is rather simple. Apart from publishers falling into the trap of fog, spin and empty rhetoric, there are ongoing projects that must be completed, there are PhD/MSc dissertations in progress that must be written, and so on ....
And ... never underestimate the Band Wagon Effect: the appearance on the scene of a brand new Band Wagon, packed with the right buzzwords and rhetoric for decision-making under severe uncertainty, which bedazzles those who are not conversant with decision theory, operations research and related areas...
From WIKIPEDIA: Bandwagon effect, first proposed by David Luder, also known as "cromo effect" and closely related to opportunism, is the observation that people often do and believe things because many other people do and believe the same things. The effect is often called herd instinct. People tend to follow the crowd without examining the merits of a particular thing. The bandwagon effect is the reason for the bandwagon fallacy's success.

On the positive side, though, I can report that some InfoGap scholars now admit that InfoGap's robustness model is a Maximin model and that InfoGap's specific instance of this generic model is utterly unsuitable for decision-making under severe uncertainty. Also, some now realize that the field of Robust Optimization offers the appropriate literature for a non-probabilistic treatment of decision-making under severe uncertainty.
So only time will tell how long other InfoGap scholars will continue to keep their heads buried in the sand.
It will be interesting to see how economists will react to the spin and rhetoric in InfoGap Economics publications!
We shall have to wait and see!
Modern Alchemy, Freudian Slips, Quick-Fixes and Suchlike
If you are taking it for granted that the quest for a magic formula capable of transforming severe lack of knowledge / information into substantial knowledge was abandoned with the Enlightenment, I have news for you!
Apparently, against all scientific odds, InfoGap scholars were successful in imputing likelihood to results generated by a non-probabilistic model that is completely devoid of any notion of likelihood!
Recall that InfoGap decision theory prides itself on being non-probabilistic and likelihood-free. Yet, InfoGap scholars, the Father of InfoGap included, now claim that InfoGap's robustness model is capable of identifying decisions that are most likely to satisfy a given performance requirement.
Consider for instance the following quote from ACERA Endorsed Core Material (emphasis is mine):
Information-gap (henceforth termed 'info-gap') theory was invented to assist decision-making when there are substantial knowledge gaps and when probabilistic models of uncertainty are unreliable (Ben-Haim 2006). In general terms, info-gap theory seeks decisions that are most likely to achieve a minimally acceptable (satisfactory) outcome in the face of uncertainty, termed robust satisficing. It provides a platform for comprehensive sensitivity analysis relevant to a decision.
Burgman, Wintle, Thompson, Moilanen, Runge, and Ben-Haim (2008, p. 8).
Reconciling uncertain costs and benefits in Bayes nets for invasive species management
ACERA Endorsed Core Material: Final Report, Project 0601-0611.
(PDF file, downloaded on March 21, 2009)

This is a major scientific breakthrough.
For, until now we have been warned repeatedly by InfoGap scholars that no likelihood must be attributed to results generated by InfoGap decision models. Indeed, we have been advised that this would be deceptive and even dangerous (emphasis is mine):
However, unlike in a probabilistic analysis, r has no connotation of likelihood. We have no rigorous basis for evaluating how likely failure may be; we simply lack the information, and to make a judgment would be deceptive and could be dangerous. There may definitely be a likelihood of failure associated with any given radial tolerance. However, the available information does not allow one to assess this likelihood with any reasonable accuracy.
Ben-Haim (1994, p. 152)
Convex models of uncertainty: applications and implications
Erkenntnis, 4, 139-156.

This point is also made crystal clear in the second edition of the InfoGap book (emphasis is mine):
In info-gap set models of uncertainty we concentrate on cluster-thinking rather than on recurrence or likelihood. Given a particular quantum of information, we ask: what is the cloud of possibilities consistent with this information? How does this cloud shrink, expand and shift as our information changes? What is the gap between what is known and what could be known? We have no recurrence information, and we can make no heuristic or lexical judgments of likelihood.
Ben-Haim (2006, p. 18)
Info-Gap Decision Theory: Decisions Under Severe Uncertainty
Academic Press.

So the question is: have InfoGap scholars managed to accomplish a major feat in the area of decision-making under severe uncertainty?
Of course, the answer is that this new claim (Burgman et al. 2008) is not due to a breakthrough in decision-making under severe uncertainty, but rather to an unfortunate, blatant error of judgment.
My view on this episode, based as it is on numerous discussions with InfoGap scholars over the past five years, is that this new claim is simply, but not surprisingly, ... a Freudian slip.
The point is that (see my FAQs about InfoGap) without imputing some sort of "likelihood" to InfoGap's decision model, InfoGap decision theory is, and cannot escape being, a voodoo decision theory.
So, all that this Freudian slip manages to do is to extend the already existing error, an alternative that some InfoGap scholars seem to prefer to an admission of a mistake.
One can only wonder then: how long will it take other InfoGap scholars, such as Burgman et al. (2008), to reach this unavoidable conclusion?
Only time will tell (March 21, 2009).
The Black Swan
Only time will tell what impact (if any) Nassim Taleb's recent popular and controversial book The Black Swan: The Impact of the Highly Improbable will have on the field of decision-making under severe uncertainty.
I, for one, hope that the issues raised in this book and in its predecessor, Fooled by Randomness: The Hidden Role of Chance in the Markets and in Life, will be instrumental in helping decision-makers to identify voodoo decision theories, such as InfoGap decision theory, that promise robust decisions under severe uncertainty.
I fear though, in view of my experience of the past 40 years, that the huge success of The Black Swan will inspire a new wave of voodoo decision theories, purportedly capable of ... "domesticating" black swans and preempting the discovery of ... purple swans!
We shall have to wait and see.
For those who have "been in hiding" I should note that Taleb has become quite a celebrity. According to the Prudent Investor Newsletters (Tuesday, June 3, 2008):
 Mr. Taleb charges about $60,000 per speaking engagement and does about 30 presentations a year to "bankers, economists, traders, even to Nasa, the US Fire Administration and the Department of Homeland Security", according to Timesonline's Bryan Appleyard.
 He recently got $4million as advance payment for his next much awaited book.
 Earned $35-$40 MILLION on a huge Black Swan event: the biggest stock-market crash in modern history, Black Monday, October 19, 1987.
So, if you haven’t heard him in person you can easily find on the WWW numerous videos of his interviews.
Here is a link to a very short (2:45 min) clip, recorded by Taleb himself, apparently at Heathrow Airport, of 10 tips on how to deal with Black Swans, and life in general.
 Scepticism is effortful and costly. It is better to be sceptical about matters of large consequences, and be imperfect, foolish and human in the small and the aesthetic.
 Go to parties. You can't even start to know what you may find on the envelope of serendipity. If you suffer from agoraphobia, send colleagues.
 It's not a good idea to take a forecast from someone wearing a tie. If possible, tease people who take themselves and their knowledge too seriously.
 Wear your best for your execution and stand dignified. Your last recourse against randomness is how you act: if you can't control outcomes, you can control the elegance of your behaviour. You will always have the last word.
 Don't disturb complicated systems that have been around for a very long time. We don't understand their logic. Don't pollute the planet. Leave it the way we found it, regardless of scientific 'evidence'.
 Learn to fail with pride  and do so fast and cleanly. Maximise trial and error  by mastering the error part.
 Avoid losers. If you hear someone use the words 'impossible', 'never', 'too difficult' too often, drop him or her from your social network. Never take 'no' for an answer (conversely, take most 'yeses' as 'most probably').
 Don't read newspapers for the news (just for the gossip and, of course, profiles of authors). The best filter to know if the news matters is if you hear it in cafes, restaurants ... or (again) parties.
 Hard work will get you a professorship or a BMW. You need both work and luck for a Booker, a Nobel or a private jet.
 Answer emails from junior people before more senior ones. Junior people have further to go and tend to remember who slighted them.
It is interesting to juxtapose Prof. Taleb’s thesis in The Black Swan that severe uncertainty makes (reliable) prediction in the Socio/economic/political spheres impossible, with the polar position taken by his colleague, Prof. Bruce Bueno de Mesquita, who actually specializes in predicting the future.
One thing is for sure: sooner or later InfoGap scholars will find a simple reliable recipe for handling Black Swans!
Stay tuned!
And what do you know?! See Review 17.
It was bound to happen!
New Nostradamuses
Not only professionals specializing in "decision under uncertainty", but also the proverbial "man in the street", take it for granted that accurately predicting future events is one of the most onerous challenges facing humankind, especially for persons in authority and persons responsible for the management of business or economic organizations.
A notable exception to this rule is the "New Nostradamus": Prof. Bruce Bueno de Mesquita, a political science professor at New York University and Senior Fellow at the Hoover Institution, who, according to Good Magazine, specializes in predicting future events, at least in the area of international conflicts.
The claim is that this distinguished political scientist can actually predict the outcome of any international conflict!
To do this, Prof. Bueno de Mesquita does not use a Crystal Ball, but a thoroughly scientific method which, he claims, is based on a branch of applied mathematics called Game Theory.
According to GoodReads.com,
" ... Bruce Bueno de Mesquita is a political scientist, professor at New York University, and senior fellow at the Hoover Institution. He specializes in international relations, foreign policy, and nation building. He is also one of the authors of the selectorate theory.
He has founded a company, Mesquita & Roundell, that specializes in making political and foreignpolicy forecasts using a computer model based on game theory and rational choice theory. He is also the director of New York University's Alexander Hamilton Center for Political Economy.
He was featured as the primary subject in the documentary on the History Channel in December 2008. The show, titled Next Nostradamus, details how the scientist is using computer algorithms to predict future world events ..."
Here is an interview with Prof. Bueno de Mesquita (with Riz Khan  The art and science of prediction  09 Jan 08):
And here is a 20minute lecture on the ... future of Iran (TED, February 2009):
Apparently, all you need to accomplish this is a computer, expert knowledge on Iran, and game theory!
Some of the predictions attributed to Prof. Bueno de Mesquita are:
 The second Palestinian Intifada and the death of the Mideast peace process, two years before this came to pass.
 The succession of the Russian leader Leonid Brezhnev by Yuri Andropov, who at the time was not even considered a contender.
 The voting out of office of Daniel Ortega and the Sandinistas in Nicaragua, two years before this happened.
 The harsh crackdown on dissidents by China's hardliners four months before the Tiananmen Square incident.
 France's hairsbreadth passage of the European Union's Maastricht Treaty.
 The exact implementation of the 1998 Good Friday Agreement between Britain and the IRA.
 China's reclaiming of Hong Kong and the exact manner the handover would take place, 12 years before it happened.
Impressive, isn't it!
As might be expected, these and similar claims by Prof. Bueno de Mesquita have sparked a vigorous debate not only in the professional journals but also on the WWW. Interested readers can consult this material to see for themselves, whether Bueno de Mesquita's claims attest to a major scientific breakthrough or ... voodoo mathematics.
Also, in addition to consulting this material, you may want to have a look at a short video clip by Matt Brawn, which he compiled in response to a short note entitled This man can actually predict the future!.
Of particular interest is, of course, the "success" rate of Prof. Bueno de Mesquita's predictions: over 90%, yes, over ninety percent!
Here is Trevor Black's common sense reaction to this claim:
I am a little skeptical about anyone who claims to have a 90% success rate. I just don't buy it. Especially when they say that they can explain away a lot of the other 10%.

If you come to me and tell me you have a model that gets it right 60% or 70% of the time, I may listen. Skeptically, but I will listen. 90% and I start to smell something.
All I wish to add here is that Prof. Bueno de Mesquita makes his predictions under conditions of "severe uncertainty", which of course renders them hugely vulnerable to what Prof. Nassim Taleb dubs the Black Swan phenomenon.
Hence, the very proposition that such predictions can be made at all, let alone be reliable, is diametrically opposed to Nassim Taleb's categorical rejection of any such position. For his thesis is that Black Swans are totally outside the purview of mathematical treatment, especially by models that are based on expected utility theory and rational choice theory.
Interestingly, though, this is precisely the stuff that Prof. Bueno de Mesquita's method is made of: expected utility theory and rational choice theory!
Even more interesting is the fact that Nassim Taleb and Bueno de Mesquita are staff members of the same academic institution, namely New York University. So, all that's left to say is: Go figure!
As indicated above, the debate over Bueno de Mesquita's theories is not new. It has been ongoing, in the relevant academic literature, at least since the publication of his book The War Trap (1981).
For an idea of the kind of criticism sparked by his work, take a look at the quotes I provide from articles that are critical of Bueno de Mesquita's theories.
Of course, there are other New Nostradamuses around.
According to the Associated Press, the latest (2009, Mar 4, 4:39 AM EST) news from Russia about the future of the USA is that

" ... President Barack Obama will order martial law this year, the U.S. will split into six rump states before 2011, and Russia and China will become the backbones of a new world order ..."

Apparently this prediction was made by Igor Panarin, Dean of the Russian Foreign Ministry diplomatic academy and a regular on Russia's state-controlled TV channels (see full AP news report).
Regarding the future of Russia,
"You don't sound too hopeful".
"Hopeful? Please, I am Russian. I live in a land of mad hopes, long queues, lies and humiliations. They say about Russia we never had a happy present, only a cruel past and a quite amazing future ..."Malcolm Bradbury
To the Hermitage (2000, p. 347)We should therefore be reminded of J K Galbraith's (19082006) poignant observation:
There are two classes of forecasters: those who don't know and those who don't know they don't know.
And in the same vein,
The future is just what we invent in the present to put an order over the past.
Malcolm Bradbury
Doctor Criminale (1992, p. 328)
So, we shall have to wait and see.
And how about this more recent piece by Heath Gilmore and Brian Robins in the Sydney Morning Herald (March 27, 2009):
"... COUPLES wondering if the love will last could find out if theirs is a match made in heaven by subjecting themselves to a mathematical test.
A professor at Oxford University and his team have perfected a model whereby they can calculate whether the relationship will succeed.
In a study of 700 couples, Professor James Murray, a maths expert, predicted the divorce rate with 94 per cent accuracy.
His calculations were based on 15-minute conversations between couples who were asked to sit opposite each other in a room on their own and talk about a contentious issue, such as money, sex or relations with their in-laws.
Professor Murray and his colleagues recorded the conversations and awarded each husband and wife positive or negative points depending on what was said. ..."
Such interviews should perhaps be made mandatory for all couples registering their marriage.
More details on the mathematics of marriage can be found in The Mathematics of Marriage: Dynamic Nonlinear Models by J.M. Gottman, J.D. Murray, C. Swanson, R. Tyson, and K.R. Swanson (MIT Press, Cambridge, MA, 2002).
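For a sense of what such a model looks like, here is a minimal sketch of the kind of dynamic nonlinear model Gottman and Murray describe: each partner's next score is a baseline mood, plus inertia from their own previous score, plus an influence function of the other partner's score. The structure follows the book's general form; all parameter values below are made up for illustration, not the published estimates.

```python
# Illustrative sketch of a Gottman-Murray-style model of marital interaction:
# next score = baseline mood + inertia * own previous score
#              + influence(partner's previous score).
# All parameter values are invented for illustration.

def influence(score, up=0.3, down=0.5):
    """Bilinear influence function: a partner's positivity helps a little,
    while negativity hurts more (made-up slopes)."""
    return up * score if score >= 0 else down * score

def simulate(w, h, steps=20, baseline=0.1, inertia=0.6):
    """Iterate the coupled difference equations for `steps` exchanges,
    returning the final (wife, husband) scores."""
    for _ in range(steps):
        w, h = (baseline + inertia * w + influence(h),
                baseline + inertia * h + influence(w))
    return w, h

# A neutral start drifts toward a modestly positive steady state:
print(simulate(0.0, 0.0))
# A deeply negative start spirals further downward, because negativity
# carries more influence than positivity in this toy parameterization:
print(simulate(-2.0, -2.0))
```

The qualitative point of such models is exactly this asymmetry: whether a conversation settles into a positive steady state or a negative spiral depends on where it starts and on how strongly each partner's negativity pulls the other down.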
On a more positive note, though, here is an online Oracle from Melbourne (Australia: the land of the real Black Swan!).
You may wish to consult this friendly 24/7 facility about important "Yes/No" questions that you no doubt have about the future.
More on this and related topics can be found in the pages of the Worst-Case Analysis / Maximin Campaign, Severe Uncertainty, and the InfoGap Campaign.
Recent Articles, Working Papers, Notes
Also, see my complete list of articles
Sniedovich, M. (2012) Fooled by local robustness, Risk Analysis, in press.
Sniedovich, M. (2012) Black swans, new Nostradamuses, voodoo decision theories and the science of decision-making in the face of severe uncertainty, International Transactions in Operational Research, in press.
Sniedovich, M. (2011) A classic decision theoretic perspective on worst-case analysis, Applications of Mathematics, 56(5), 499-509.
Sniedovich, M. (2011) Dynamic programming: introductory concepts, in Wiley Encyclopedia of Operations Research and Management Science (EORMS), Wiley.
Caserta, M., Voss, S., Sniedovich, M. (2011) Applying the corridor method to a blocks relocation problem, OR Spectrum, 33(4), 815-929.
Sniedovich, M. (2011) Dynamic Programming: Foundations and Principles, Second Edition, Taylor & Francis.
Sniedovich, M. (2010) A bird's view of InfoGap decision theory, Journal of Risk Finance, 11(3), 268-283.
Sniedovich, M. (2009) Modeling of robustness against severe uncertainty, pp. 33-42, Proceedings of the 10th International Symposium on Operational Research, SOR'09, Nova Gorica, Slovenia, September 23-25, 2009.
Sniedovich, M. (2009) A Critique of InfoGap's Robustness Model. In: Martorell et al. (eds), Safety, Reliability and Risk Analysis: Theory, Methods and Applications, pp. 2071-2079, Taylor and Francis Group, London.
Sniedovich, M. (2009) A Classical Decision Theoretic Perspective on Worst-Case Analysis, Working Paper No. MS0309, Department of Mathematics and Statistics, The University of Melbourne. (PDF File)
Caserta, M., Voss, S., Sniedovich, M. (2008) The corridor method - a general solution concept with application to the blocks relocation problem. In: A. Bruzzone, F. Longo, Y. Merkuriev, G. Mirabelli and M.A. Piera (eds.), 11th International Workshop on Harbour, Maritime and Multimodal Logistics Modeling and Simulation, DIPTEM, Genova, 89-94.
Sniedovich, M. (2008) FAQs about InfoGap Decision Theory, Working Paper No. MS1208, Department of Mathematics and Statistics, The University of Melbourne. (PDF File)
Sniedovich, M. (2008) A Call for the Reassessment of the Use and Promotion of InfoGap Decision Theory in Australia (PDF File)
Sniedovich, M. (2008) InfoGap decision theory and the small applied world of environmental decision-making, Working Paper No. MS1108
This is a response to comments made by Mark Burgman on my criticism of InfoGap (PDF file)
Sniedovich, M. (2008) A call for the reassessment of InfoGap decision theory, Decision Point, 24, 10.
Sniedovich, M. (2008) From Shakespeare to Wald: modeling worst-case analysis in the face of severe uncertainty, Decision Point, 22, 8-9.
Sniedovich, M. (2008) Wald's Maximin model: a treasure in disguise!, Journal of Risk Finance, 9(3), 287-291.
Sniedovich, M. (2008) Anatomy of a Misguided Maximin Formulation of InfoGap's Robustness Model (PDF File)
In this paper I explain, again, the misconceptions that InfoGap proponents seem to have regarding the relationship between InfoGap's robustness model and Wald's Maximin model.
Sniedovich, M. (2008) The Mighty Maximin! (PDF File)
This paper is dedicated to the modeling aspects of Maximin and robust optimization.
Sniedovich, M. (2007) The art and science of modeling decision-making under severe uncertainty, Decision Making in Manufacturing and Services, 1-2, 111-136. (PDF File)
Sniedovich, M. (2007) Crystal-Clear Answers to Two FAQs about InfoGap (PDF File)
In this paper I examine the two fundamental flaws in InfoGap decision theory, and the flawed attempts to shrug off my criticism of InfoGap decision theory.
My reply (PDF File) to Ben-Haim's response to one of my papers. (April 22, 2007)
This is an exciting development!
Ben-Haim's response confirms my assessment of InfoGap. It is clear that InfoGap is fundamentally flawed and therefore unsuitable for decision-making under severe uncertainty.
Ben-Haim is not familiar with the fundamental concept of a point estimate. He does not realize that a function can be a point estimate of another function.
So when you read my papers, make sure that you do not misinterpret the notion of a point estimate. The phrase "A is a point estimate of B" simply means that A is an element of the same topological space that B belongs to. Thus, if B is, say, a probability density function and A is a point estimate of B, then A is a probability density function belonging to the same (assumed) set (family) of probability density functions.
Ben-Haim mistakenly assumes that a point estimate is a point in a Euclidean space and therefore that a point estimate cannot be, say, a function. This is incredible!
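To make the point concrete, here is a minimal sketch (my own toy example, with made-up sample data) of a point estimate that is a function: estimating an unknown density by a normal density with fitted mean and standard deviation. The estimate A is itself a probability density function, i.e. a single element ("point") of the assumed family of normal densities.

```python
# A "point estimate" need not be a point in a Euclidean space: here the
# estimate is itself a function (a probability density), i.e. one element
# of the same family of densities that the unknown true density belongs to.
import math

def normal_pdf(mu, sigma):
    """Return the density function of N(mu, sigma^2) -- a single element
    of the (assumed) family of normal densities."""
    def pdf(x):
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
    return pdf

# Suppose the unknown object B is some density in the normal family.
# A point estimate A of B is then simply one particular density,
# here fitted to a small (made-up) sample:
sample = [4.8, 5.1, 5.3, 4.9, 5.4]
mu_hat = sum(sample) / len(sample)
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in sample) / len(sample))

A = normal_pdf(mu_hat, sigma_hat)  # A is a function, yet it is a *point*
                                   # in the space of normal densities.
print(A(mu_hat))                   # value of the estimated density at its mean
```

Whether the space is a Euclidean space of numbers or a function space of densities, the logic of a point estimate is the same: one element standing in for the unknown true one.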
 A formal proof that InfoGap is Wald's Maximin Principle in disguise. (December 31, 2006)
This is a very short article entitled Eureka! InfoGap is Worst Case (maximin) in Disguise! (PDF File)
It shows that InfoGap is not a new theory but rather a simple instance of Wald's famous Maximin Principle dating back to 1945, which in turn goes back to von Neumann's work on Maximin problems in the context of Game Theory (1928).
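As a rough sketch of the correspondence (my notation, simplified from the paper): Wald's model maximizes the worst-case payoff over a state set, while InfoGap's robustness model maximizes the size of the neighborhood around the point estimate over which performance remains acceptable — a maximin in which the maximized "payoff" is the radius itself.

```latex
% Wald's Maximin model (classic format): decision x, state s,
% payoff f(x,s), worst case taken over the state set S(x):
z^{*} = \max_{x \in X} \; \min_{s \in S(x)} f(x, s)

% InfoGap's robustness model: the largest radius \alpha such that the
% performance requirement r_c is met everywhere in the neighborhood
% U(\alpha, \tilde{u}) around the point estimate \tilde{u}:
\hat{\alpha}(q, r_c) = \max \left\{ \alpha \ge 0 : r_c \le \min_{u \in U(\alpha, \tilde{u})} R(q, u) \right\}
```

The inner min over the neighborhood is precisely a worst-case (Maximin) analysis, only conducted locally around the estimate \tilde{u}.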
A proof that InfoGap's uncertainty model is fundamentally flawed. (December 31, 2006)
This is a very short article entitled The Fundamental Flaw in InfoGap's Uncertainty Model (PDF File).
It shows that because InfoGap deploys a single point estimate under severe uncertainty, there is no reason to believe that the solutions it generates are likely to be robust.
A math-free explanation of the flaw in InfoGap. (December 31, 2006)
This is a very short article entitled The GAP in InfoGap (PDF File).
It is a math-free version of the paper above. Read it if you are allergic to math.
A long essay entitled What's Wrong with InfoGap? An Operations Research Perspective (PDF File) (December 31, 2006).
This is a paper that I presented at the ASOR Recent Advances in Operations Research (PDF File) mini-conference (December 1, 2006, Melbourne, Australia).
Recent Lectures, Seminars, Presentations
If your organization is promoting InfoGap, I suggest that you invite me for a seminar at your place. I promise to deliver a lively, informative, entertaining and convincing presentation explaining why it is not a good idea to use — let alone promote — InfoGap as a decision-making tool.
Here is a list of relevant lectures/seminars on this topic that I gave in the last two years.
ASOR Recent Advances, 2011, Melbourne, Australia, November 16, 2011. Presentation: The Power of the (peer-reviewed) Word (PDF file).
Alex Rubinov Memorial Lecture: The Art, Science, and Joy of (mathematical) Decision-Making, November 7, 2011, The University of Ballarat (PDF file).
Black Swans, Modern Nostradamuses, Voodoo Decision Theories, and the Science of Decision-Making in the Face of Severe Uncertainty (PDF File).
(Invited tutorial, ALIO/INFORMS Conference, Buenos Aires, Argentina, July 6-9, 2010)
A Critique of InfoGap Decision Theory: From Voodoo Decision-Making to Voodoo Economics (PDF File).
(Recent Advances in OR, RMIT, Melbourne, Australia, November 25, 2009)
Robust decision-making in the face of severe uncertainty (PDF File).
(GRIPS, Tokyo, Japan, October 16, 2009)
Decision-making in the face of severe uncertainty (PDF File).
(KORDS'09 Conference, Vilnius, Lithuania, September 30 - October 3, 2009)
Modeling robustness against severe uncertainty (PDF File).
(SOR'09 Conference, Nova Gorica, Slovenia, September 23-25, 2009)
How do you recognize a Voodoo decision theory? (PDF File).
(School of Mathematical and Geospatial Sciences, RMIT, June 26, 2009)
Black Swans, Modern Nostradamuses, Voodoo Decision Theories, InfoGaps, and the Science of Decision-Making in the Face of Severe Uncertainty (PDF File).
(Department of Econometrics and Business Statistics, Monash University, May 8, 2009)
The Rise and Rise of Voodoo Decision Theory.
ASOR Recent Advances, Deakin University, November 26, 2008. This presentation was based on the pages on my website (voodoo.mosheonline.com).
Responsible Decision-Making in the Face of Severe Uncertainty (PDF File).
(Singapore Management University, Singapore, September 29, 2008)
A Critique of InfoGap's Robustness Model (PDF File).
(ESREL/SRA 2008 Conference, Valencia, Spain, September 22-25, 2008)
Robust Decision-Making in the Face of Severe Uncertainty (PDF File).
(Technion, Haifa, Israel, September 15, 2008)
The Art and Science of Robust Decision-Making (PDF File).
(AIRO 2008 Conference, Ischia, Italy, September 8-11, 2008)
The Fundamental Flaws in InfoGap Decision Theory (PDF File).
(CSIRO, Canberra, July 9, 2008)
Responsible Decision-Making in the Face of Severe Uncertainty (PDF File).
(OR Conference, ADFA, Canberra, July 7-8, 2008)
Responsible Decision-Making in the Face of Severe Uncertainty (PDF File).
(University of Sydney Seminar, May 16, 2008)
Decision-Making Under Severe Uncertainty: An Australian, Operational Research Perspective (PDF File).
(ASOR National Conference, Melbourne, December 3-5, 2007)
A Critique of InfoGap (PDF File).
(SRA 2007 Conference, Hobart, August 20, 2007)
What exactly is wrong with InfoGap? A Decision Theoretic Perspective (PDF File).
(MS Colloquium, University of Melbourne, August 1, 2007)
A Formal Look at InfoGap Theory (PDF File).
(ORSUM Seminar, University of Melbourne, May 21, 2007)
The Art and Science of Decision-Making Under Severe Uncertainty (PDF File).
(ACERA seminar, University of Melbourne, May 4, 2007)
What exactly is InfoGap? An OR perspective (PDF File).
ASOR Recent Advances in Operations Research mini-conference (December 1, 2006, Melbourne, Australia).
Disclaimer: This page, its contents and style, are the responsibility of the author (Moshe Sniedovich) and do not represent the views, policies or opinions of the organizations he is associated/affiliated with.