Quoting from John Sterman's authoritative book on system dynamics,

"A fundamental principle of system dynamics states that the structure of the system gives rise to its behavior. However, people have a strong tendency to attribute the behavior of others to dispositional rather than situational factors, that is, to character and especially character flaws rather than the system in which these people are acting. The tendency to blame the person rather than the system is so strong psychologists call it the 'fundamental attribution error' (Ross 1977). In complex systems, different people placed in the same structure tend to behave in similar ways. When we attribute behavior to personality we lose sight of how the structure of the system shaped our choices. The attribution of behavior to individuals and special circumstances rather than system structure diverts our attention from the high leverage points where redesigning the system or government policy can have significant, sustained, beneficial effects on performance (Forrester 1969, chap. 6; Meadows 1982). When we attribute behavior to people rather than system structure the focus of management becomes scapegoating and blame rather than the design of organizations in which ordinary people can achieve extraordinary results." (pp. 28-29)

Sterman's comment is especially relevant to the current debate on reforming and regulating our financial system. It is misguided to focus on greedy bankers and incompetent or compromised regulators. Bankers and regulators are merely adapting to the incentives presented to them by our current economic and political system.

In fact, the real question is why so few economic actors indulge in fraud or milk taxpayer guarantees when they have every incentive to do so. After all, choosing not to play the game means accepting lower returns if one is a shareholder, and accepting lower bonuses and possibly even dismissal for underperformance if one is a manager or a trader.

The answer is that our ethics prevent us from exploiting the situation. But our ethical standards do not remain constant. They can and will erode if a perverse system is in place for too long. This gradual erosion of ethical standards is the real risk we face if we do not reform our system and fix the incentives. We may not realise this until it is already too late, and reversing this process and rebuilding ethical standards and trust in an economic system will be no easy task.


Nick Gogerty

Understanding systems as larger than individual actors is key to mitigating risks. Dekker and Reason have written some great books on human error and designing around it. Bad agents, whether corrupt, unethical or simply incompetent, will show up in any system. The regulator's role as system designer is to minimise the impact of the failures these inevitabilities will induce. Human nature can't be changed much (even with incentives), but the operating parameters and boundaries of the systems in which people act can. You may enjoy http://www.amazon.com/Questions-About-Human-Error-Transportation/dp/0805847456/ref=sr_1_2?ie=UTF8&qid=1352305262&sr=8-2&keywords=human+factors+error


Nick - I have read both of them and most of the literature on resilience engineering. I quote James Reason extensively in this post: https://www.macroresilience.com/2011/12/29/people-make-poor-monitors-for-computers/