
Saturday, November 21, 2009

Brainstorming on How Complex Systems Fail

Richard I. COOK has written a must-read piece on how complex systems fail.
Here are his 18 bullet points, and my thoughts on the subject:

1. Complex systems are intrinsically hazardous systems.


All of the interesting systems (e.g. transportation, healthcare, power generation) are inherently and unavoidably hazardous by their own nature. The frequency of hazard exposure can sometimes be changed but the processes involved in the system are themselves intrinsically and irreducibly hazardous. It is the presence of these hazards that drives the creation of defenses against hazard that characterize these systems.

What comes to my mind: Stock Market, Regulation, Behavioural Finance, Bounded Rationality

2. Complex systems are heavily and successfully defended against failure

The high consequences of failure lead over time to the construction of multiple layers of defense against failure. These defenses include obvious technical components (e.g. backup systems, ‘safety’ features of equipment) and human components (e.g. training, knowledge) but also a variety of organizational, institutional, and regulatory defenses (e.g. policies and procedures, certification, work rules, team training). The effect of these measures is to provide a series of shields that normally divert operations away from accidents.

What comes to my mind: Risk Analysis, FED, Reserves, Regulation, Technical Analysis, Creative Destruction


3. Catastrophe requires multiple failures – single point failures are not enough.

The array of defenses works. System operations are generally successful. Overt catastrophic failure occurs when small, apparently innocuous failures join to create opportunity for a systemic accident. Each of these small failures is necessary to cause catastrophe but only the combination is sufficient to permit failure. Put another way, there are many more failure opportunities than overt system accidents. Most initial failure trajectories are blocked by designed system safety components. Trajectories that reach the operational level are mostly blocked, usually by practitioners.

What comes to my mind: Complexity of Finance, Hindsight Bias, Systemic Failures, Feedback loops, Power Laws, Black Swans
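
To see why layered defenses make single-point failures so rarely catastrophic, here is a toy Monte Carlo sketch in Python. All numbers are made up, and the layers are assumed independent, which is exactly the assumption that points 3 and 4 warn against in real systems:

```python
import random

# Toy "defense in depth" model: an accident trajectory becomes a
# catastrophe only if it slips through EVERY layer of defense, so a
# single-point failure is almost always blocked somewhere downstream.
# (Unrealistic simplification: layers are independent; real systems
# harbor latent, correlated flaws.)

rng = random.Random(1)
P_LAYER_FAILS = 0.05   # made-up chance that one defense layer fails on demand
N_LAYERS = 4
TRIALS = 1_000_000

catastrophes = sum(
    all(rng.random() < P_LAYER_FAILS for _ in range(N_LAYERS))
    for _ in range(TRIALS)
)
print(f"Per-layer failure rate: {P_LAYER_FAILS:.0%}")
print(f"Catastrophe rate: {catastrophes / TRIALS:.6f} "
      f"(theory: {P_LAYER_FAILS ** N_LAYERS:.6f})")
```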

4. Complex systems contain changing mixtures of failures latent within them


The complexity of these systems makes it impossible for them to run without multiple flaws being present. Because these are individually insufficient to cause failure they are regarded as minor factors during operations. Eradication of all latent failures is limited primarily by economic cost but also because it is difficult before the fact to see how such failures might contribute to an accident. The failures change constantly because of changing technology, work organization, and efforts to eradicate failures.

What comes to my mind: Optimization, Efficient Market Hypothesis, FAMA


5. Complex systems run in degraded mode


A corollary to the preceding point is that complex systems run as broken systems. The system continues to function because it contains so many redundancies and because people can make it function, despite the presence of many flaws. After-accident reviews nearly always note that the system has a history of prior ‘proto-accidents’ that nearly generated catastrophe. Arguments that these degraded conditions should have been recognized before the overt accident are usually predicated on naïve notions of system performance. System operations are dynamic, with components (organizational, human, technical) failing and being replaced continuously.

What comes to my mind: 



6. Catastrophe is always just around the corner.


Complex systems possess potential for catastrophic failure. Human practitioners are nearly always in close physical and temporal proximity to these potential failures – disaster can occur at any time and in nearly any place. The potential for catastrophic outcome is a hallmark of complex systems. It is impossible to eliminate the potential for such catastrophic failure; the potential for such failure is always present by the system’s own nature.


What comes to my mind: Illusion of Control, Great Moderation illusion, Uncertainty principle, Prudential regulation


7. Post-accident attribution to a ‘root cause’ is fundamentally wrong.


Because overt failure requires multiple faults, there is no isolated ‘cause’ of an accident. There are multiple contributors to accidents. Each of these is necessarily insufficient in itself to create an accident. Only jointly are these causes sufficient to create an accident. Indeed, it is the linking of these causes together that creates the circumstances required for the accident. Thus, no isolation of the ‘root cause’ of an accident is possible. The evaluations based on such reasoning as ‘root cause’ do not reflect a technical understanding of the nature of failure but rather the social, cultural need to blame specific, localized forces or events for outcomes.


What comes to my mind: Hindsight bias, Reductionism, KAHNEMAN, Subprime Crisis, Management Books


8. Hindsight biases post-accident assessments of human performance.


Knowledge of the outcome makes it seem that events leading to the outcome should have appeared more salient to practitioners at the time than was actually the case. This means that ex post facto accident analysis of human performance is inaccurate. The outcome knowledge poisons the ability of after-accident observers to recreate the view of practitioners before the accident of those same factors. It seems that practitioners “should have known” that the factors would “inevitably” lead to an accident. Hindsight bias remains the primary obstacle to accident investigation, especially when expert human performance is involved.


What comes to my mind: Forward Thinking, Opportunism, Survivorship Bias


9. Human operators have dual roles: as producers & as defenders against failure


The system practitioners operate the system in order to produce its desired product and also work to forestall accidents. This dynamic quality of system operation, the balancing of demands for production against the possibility of incipient failure, is unavoidable. Outsiders rarely acknowledge the duality of this role. In non-accident-filled times, the production role is emphasized. After accidents, the defense against failure role is emphasized. At either time, the outsider’s view misapprehends the operator’s constant, simultaneous engagement with both roles.


What comes to my mind: Complexity (and acknowledged lack of understanding) of Financial Instruments, VaR, Black Swans


10. All practitioner actions are gambles


After accidents, the overt failure often appears to have been inevitable and the practitioner’s actions as blunders or deliberate willful disregard of certain impending failure. But all practitioner actions are actually gambles, that is, acts that take place in the face of uncertain outcomes. The degree of uncertainty may change from moment to moment. That practitioner actions are gambles appears clear after accidents; in general, post hoc analysis regards these gambles as poor ones. But the converse, that successful outcomes are also the result of gambles, is not widely appreciated.


What comes to my mind: Reductionism of the complexity of Finance, Illusory importance given to Track Records, Survivorship Bias, Finance is considered too Serious.


11. Actions at the sharp end resolve all ambiguity


Organizations are ambiguous, often intentionally, about the relationship between production targets, efficient use of resources, economy and costs of operations, and acceptable risks of low and high consequence accidents. All ambiguity is resolved by actions of practitioners at the sharp end of the system. After an accident, practitioner actions may be regarded as ‘errors’ or ‘violations’ but these evaluations are heavily biased by hindsight and ignore the other driving forces, especially production pressure.


What comes to my mind:


12. Human practitioners are the adaptable element of complex systems


Practitioners and first line management actively adapt the system to maximize production and minimize accidents. These adaptations often occur on a moment by moment basis. Some of these adaptations include: (1) Restructuring the system in order to reduce exposure of vulnerable parts to failure. (2) Concentrating critical resources in areas of expected high demand. (3) Providing pathways for retreat or recovery from expected and unexpected faults. (4) Establishing means for early detection of changed system performance in order to allow graceful cutbacks in production or other means of increasing resiliency.


What comes to my mind: Risk Analysis, Management Reductionism, Holistic view of Business, Flaw of Averages




13. Human expertise in complex systems is constantly changing


Complex systems require substantial human expertise in their operation and management. This expertise changes in character as technology changes but it also changes because of the need to replace experts who leave. In every case, training and refinement of skill and expertise is one part of the function of the system itself. At any moment, therefore, a given complex system will contain practitioners and trainees with varying degrees of expertise. Critical issues related to expertise arise from (1) the need to use scarce expertise as a resource for the most difficult or demanding production needs and (2) the need to develop expertise for future use.


What comes to my mind:


"Too much time is spent trying to find out more and more about less and less, until we know everything about nothing," MONTIER says. "Rarely, if ever, do we stop and ask what we actually need to know."


14. Change introduces new forms of failure.


The low rate of overt accidents in reliable systems may encourage changes, especially the use of new technology, to decrease the number of low consequence but high frequency failures. These changes may actually create opportunities for new, low frequency but high consequence failures. When new technologies are used to eliminate well understood system failures or to gain high precision performance they often introduce new pathways to large scale, catastrophic failures. Not uncommonly, these new, rare catastrophes have even greater impact than those eliminated by the new technology. These new forms of failure are difficult to see before the fact; attention is paid mostly to the putative beneficial characteristics of the changes. Because these new, high consequence accidents occur at a low rate, multiple system changes may occur before an accident, making it hard to see the contribution of technology to the failure.


What comes to my mind: Backlash of the Rationalization Era, Pattern recognition illusion, VaR, Risk Hedging, Illusion of Forecasting Black Swans




15. Views of ‘cause’ limit the effectiveness of defenses against future events


Post-accident remedies for “human error” are usually predicated on obstructing activities that can “cause” accidents. These end-of-the-chain measures do little to reduce the likelihood of further accidents. In fact, the likelihood of an identical accident is already extraordinarily low because the pattern of latent failures changes constantly. Instead of increasing safety, post-accident remedies usually increase the coupling and complexity of the system. This increases the potential number of latent failures and also makes the detection and blocking of accident trajectories more difficult.


What comes to my mind: Inference is illusory, the increase in car-accident deaths after 9/11, Don't give credit to people who were right about past events


Do not repeat the tactics which have gained you one victory, but let your methods be regulated by the infinite variety of circumstances. SUN TZU


16. Safety is a characteristic of systems and not of their components


Safety is an emergent property of systems; it does not reside in a person, device or department of an organization or system. Safety cannot be purchased or manufactured; it is not a feature that is separate from the other components of the system. This means that safety cannot be manipulated like a feedstock or raw material. The state of safety in any system is always dynamic; continuous systemic change insures that hazard and its management are constantly changing.


What comes to my mind: Current atomicity of Financial regulation, FED, ECB, cooperation and NASH's Equilibrium


For should the enemy strengthen his van, he will weaken his rear; should he strengthen his rear, he will weaken his van; should he strengthen his left, he will weaken his right; should he strengthen his right, he will weaken his left. If he sends reinforcements everywhere, he will everywhere be weak. SUN TZU (again)


17. People continuously create safety


Failure free operations are the result of activities of people who work to keep the system within the boundaries of tolerable performance. These activities are, for the most part, part of normal operations and superficially straightforward. But because system operations are never trouble free, human practitioner adaptations to changing conditions actually create safety from moment to moment. These adaptations often amount to just the selection of a well-rehearsed routine from a store of available responses; sometimes, however, the adaptations are novel combinations or de novo creations of new approaches.


What comes to my mind: Moderation principle, Staying away from what you don't understand, Marathon runners and Sprinters / BUFFETT and Day Traders


18. Failure free operations require experience with failure


Recognizing hazard and successfully manipulating system operations to remain inside the tolerable performance boundaries requires intimate contact with failure. More robust system performance is likely to arise in systems where operators can discern the “edge of the envelope”. This is where system performance begins to deteriorate, becomes difficult to predict, or cannot be readily recovered. In intrinsically hazardous systems, operators are expected to encounter and appreciate hazards in ways that lead to overall performance that is desirable. Improved safety depends on providing operators with calibrated views of the hazards. It also depends on providing calibration about how their actions move system performance towards or away from the edge of the envelope.


What comes to my mind: Not finding books on Amazon telling the story of Failure.

Monday, November 2, 2009

The Subprime Flu - Acknowledging the Complexity of Finance -

I do not know, nor can I possibly imagine, where the current economic situation is going to lead us.
My only conviction is that such events spark theoretical debates over the economic policies or measures that should be implemented to prevent future crises. This is why I believe such times are worth living through: only in such periods is conventional knowledge questioned.


Most people are currently focusing on the "what went wrong" side of the problem. In my opinion, this approach is flawed.


Flocks of new theories and opinions regarding what led to the subprime crisis will appear in the forthcoming years. Some people (Peter Schiff, for example) will say "I told you so"; others will derive from data the possible causes of the financial collapse.
However, I seriously question our ability to understand the complexity of the system we designed, and therefore the patterns that will emerge from this brainstorming.
Does this mean I think explanations of the current crisis are wrong? I don't think they are. What I believe is that each individual explanation can only grasp one particular aspect of the system that led to the situation we know:
    Was the subprime crisis triggered by the mark-to-market valuation of assets? It certainly played a role; but one as important as the banks' skyrocketing debt/equity ratios, the incentives for brokers to sell subprime mortgages even to insolvent households, or the tendency of American consumers to see greener grass in Chinese products than in their American counterparts.
    Such drivers did contribute to the financial situation we know, but none accounts for the Tipping Point that brought us where we are.


    To illustrate the need to treat our system as complex, let's consider this sentence:


    "The subprime crisis is a function of the banks, the financial products they structured, and the environment of sheer confidence that home prices were ever-rising. "
    Does this describe the events that occured? Could these sentences be extracted from a recent piece of news?
    In fact, how you might rightfully suppose, they are not.


    This affirmation is inspired by Malcolm GLADWELL's delectable Tipping Point:


    "Epidemics are a function of the people who transmit infectious agents, the infectious agent itself, and the environment in which the infectious agent is operating. And when an epidemic tips, when it is jolted out of equilibrium, it tips because something has happened, some change has occured in one (or two or three) of those areas"
    I only switched the words "Epidemics", "Infectious Agents" and "Environment". If you want to give it a try, please consider this (non-exhaustive) list to play Make your own Financial Crisis explanation sentence:





    The lesson to be learnt from this little example does not have to do with Banking institutions, Global Imbalances or Confidence.
    It deals with the fact that financial crises, like epidemics, are Complex and Evolving Systems. Some epidemics have already been compared to the subprime crisis, the crack epidemic for example.


    Complex, because studying them implies studying not only Finance but also Psychology, Technology or Religion.
    Evolving, as the perpetual motion of the Financial Market Environment blurs our comprehension of the causes, transmission channels and impacts of Crises.


    Like epidemics, Financial Crises now spread like wildfire because of the flattening the World has undergone in the last 50 years. Like epidemics, they hit stakeholders via a network effect but, unlike them, there is no short-term geographical bulwark against contamination.





    The importance of the too-big-to-fail issue: switching from a strongly Centralized Network to a Distributed Network.


    "On Monday, March 17, 2008, global financial markets opened to news of a Federal Reserve-enabled rescue of Bear Stearns by JPMorgan Chase. We learned, in the days that followed, of a weekend marathon meeting conducted by Federal Reserve officials to find a buyer for Bear Stearns. Urgency was warranted such that the hyper-connected global financial system might escape the effects of a medium-sized U.S. investment bank filing for bankruptcy and risking reverberations to thousands, nay, millions, of counterparties that were connected to it. Speculation grew of which institutions could be next, and more importantly, of which institutions comprised the Federal Reserve’s “too connected to fail” list. In reality, there was no such list at the ready; however, we can think of several universal banks and investment banks that, by virtue of the network age, play a significantly connected role in global finance such that bankruptcy of one or more would multiply the effects on financial markets globally in a cross-defaulting negative feedback loop." PIMCO AUGUST 2008 Report



    It may seem a paradox, but Globalisation has fostered a Hub-and-Spoke Network Model in Financial Services where a few big banks (hubs) concentrate a meaningful share of customers (spokes) (see Network Model b).






    Hence, a model of the financial landscape would show similarities with that of an Epidemic (SARS in this example):
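
    To give a feel for why hub failures matter so much more than spoke failures in such a network, here is a minimal contagion sketch in Python. The network shape, the transmission probability and every other parameter are made up for illustration:

```python
import random

# Toy contagion on a hub-and-spoke financial network: one hub bank
# serves many spoke banks. Compare what happens when the hub fails
# first versus when a single spoke fails first.

def build_hub_and_spoke(n_spokes):
    # Node 0 is the hub; every spoke is connected only to the hub.
    graph = {0: list(range(1, n_spokes + 1))}
    graph.update({i: [0] for i in range(1, n_spokes + 1)})
    return graph

def simulate_contagion(graph, seed_node, p_transmit=0.4, rounds=10, seed=42):
    # Each round, every failed node drags down each still-healthy
    # neighbour with probability p_transmit (made-up dynamics).
    rng = random.Random(seed)
    failed = {seed_node}
    for _ in range(rounds):
        newly_failed = {nb for node in failed for nb in graph[node]
                        if nb not in failed and rng.random() < p_transmit}
        if not newly_failed:
            break
        failed |= newly_failed
    return failed

graph = build_hub_and_spoke(50)
print("hub fails first:  ", len(simulate_contagion(graph, seed_node=0)), "failed banks")
print("spoke fails first:", len(simulate_contagion(graph, seed_node=1)), "failed banks")
```

    With these made-up numbers, a failing hub drags down most of the network while a failing spoke usually stays contained, which is the too-big-to-fail (or rather too-connected-to-fail) point in miniature.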


    Globalisation has, more than anything, leveled the playing field and linked every domino in the game. In good times, a rising tide lifts all boats. In bad ones, a butterfly's wing flap can produce a worldwide recession.



    In more scientific terms, Globalisation has given birth to a Financial Ecosystem:
    • which is simultaneously robust and fragile - a property exhibited by other complex adaptive networks, such as tropical rainforests;
    • whose feedback effects under stress (hoarding of liabilities and fire-sales of assets) added to these fragilities - as has been found to be the case in the spread of certain diseases;
    • whose dimensionality and hence complexity amplified materially Knightian uncertainties in the pricing of assets - causing seizures in certain financial markets;
    • where financial innovation, in the form of structured products, increased further network dimensionality, complexity and uncertainty; and
    • whose diversity was gradually eroded by institutions’ business and risk management strategies, making the whole system less resistant to disturbance - mirroring the fortunes of marine eco-systems whose diversity has been steadily eroded and whose susceptibility to collapse has thereby increased.
    To understand the events leading to Financial Crises, researchers must look at long time series, as Market Cooperativity (as defined by SORNETTE in 2003: the growth of correlation between investors' decision processes, driven by feedback loops) builds up over the long term.
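
    As a crude illustration of what rising cooperativity could look like in data, the sketch below computes the average pairwise correlation of synthetic asset returns whose coupling to a common factor grows over time. All data and parameters are invented; actual studies rely on long historical return series:

```python
import random
import statistics

# Crude proxy for "market cooperativity": average pairwise correlation
# of asset returns, compared over an early and a late window. Each
# asset mixes idiosyncratic noise with a common factor whose weight
# rises over time, mimicking growing herding via feedback loops.

rng = random.Random(7)

def synthetic_returns(n_assets=5, n_days=600):
    days = []
    for t in range(n_days):
        common = rng.gauss(0, 1)
        w = t / n_days            # coupling to the common factor grows
        days.append([w * common + (1 - w) * rng.gauss(0, 1)
                     for _ in range(n_assets)])
    return days

def avg_pairwise_corr(window):
    n_assets = len(window[0])
    series = [[day[j] for day in window] for j in range(n_assets)]
    pairs = [(a, b) for a in range(n_assets) for b in range(a + 1, n_assets)]
    return statistics.mean(statistics.correlation(series[a], series[b])
                           for a, b in pairs)

data = synthetic_returns()
print("cooperativity, first 100 days:", round(avg_pairwise_corr(data[:100]), 2))
print("cooperativity, last 100 days: ", round(avg_pairwise_corr(data[-100:]), 2))
```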


     Let's consider the two most important types of transmission channels:


    • Common Shocks: Affecting Financial Sectors worldwide simultaneously, they were mostly linked with the overexposure of investors and Financial Institutions to securities tied to the US Sub-Prime Market. This is where the flawed VaR measurement of risk exposure certainly played a role (see the sketch after this list). Another form of Common Shock was the liquidity shortage on Financial Markets and the shared risk aversion of investors.





    • Spillover, or Contagion Effect: Financial shocks affecting one particular geographical location are likely to spread, or spill over, to other areas. Feedback loops between the real economy and worldwide financial markets are an example of the Spillover Effect: gloomier growth forecasts for the US economy will easily be transmitted to equity prices worldwide. Yet another example of the Spillover Effect is transmitted via arbitrage opportunities: changes in asset prices in one market entail portfolio adjustments by intermediaries and investors in other markets until one global price for the asset emerges. Foreign banks holding US mortgages and US mortgage-backed securities that were sold to them by Freddie Mac or Fannie Mae had to report significant losses, sometimes even before their American counterparts.
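
    As for the VaR point flagged above, here is a toy sketch of the flaw: fit a Gaussian 99% VaR to returns that are actually fat-tailed (a Student-t with 3 degrees of freedom) and count how often losses breach it. All numbers are made up for illustration:

```python
import math
import random
import statistics

# Why a Gaussian VaR can understate exposure: calibrate a normal 99%
# VaR on fat-tailed returns, then count the actual breaches.

rng = random.Random(0)

def student_t(df):
    # Student-t draw: standard normal over sqrt(chi-squared / df).
    z = rng.gauss(0, 1)
    chi2 = sum(rng.gauss(0, 1) ** 2 for _ in range(df))
    return z / math.sqrt(chi2 / df)

returns = [0.01 * student_t(df=3) for _ in range(50_000)]   # fat-tailed
mu = statistics.mean(returns)
sigma = statistics.stdev(returns)
var_99 = mu - 2.326 * sigma        # Gaussian 99% VaR threshold (1% tail z-score)

breaches = sum(r < var_99 for r in returns)
print(f"breaches expected under the Gaussian model: {0.01 * len(returns):.0f}")
print(f"breaches actually observed:                 {breaches}")
```

    The Gaussian model expects a 1% breach rate; the fat-tailed returns breach the threshold noticeably more often, which is one simple way such a measure understated true exposure.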







    The subprime crisis has shown every sign of an epidemic: "Spreading rapidly and extensively by infection and affecting many individuals in an area or a population at the same time"


    As R. SHILLER argues in The Subprime Solution:
    "Every disease has a contagion rate (the rate at which it is spread from person to person) and a removal rate (the rate at which individuals recover from or succumb to the illness and so are no longer contagious). If the contagion rate exceeds the removal rate by a necessary amount, an epidemic begins. The contagion rate varies through time because of a number of factors. For example, contagion rates for influenza are higher in the winter, when lower temperatures encourage the spread of the virus in airborne droplets after infected individuals sneeze. So it is in the economic and social environment. Sooner or later, some factor boosts the infection rate sufficiently above the removal rate for an optimistic view of the market to become widespread. There is an escalation in public knowledge of the  arguments that would seem to support that view, and soon the epidemic spirals up and out of control. Almost everyone appears to think—if they notice at all that certain economic arguments are more in evidence—that the arguments are increasingly heard only because of their true intellectual merit. The idea that the prominence of the arguments is in fact due to a social contagion is hardly ever broached, at least not outside university sociology departments."
    We know epidemics to be complex, and that several diseases can stem from a common core but not be cured by the same treatment.
    The Subprime Crisis may produce a paradigm shift, or it may not. The race to specialization, and the clustering of knowledge it fomented, certainly played a strong role in how the situation unfolded.
    Tackling Global issues, such as future financial crises, will certainly imply more Global thinking.