Idea of the Project

System Risk Assessment

The process of globalization has increased the interconnectedness and complexity of socio-economic systems (Taleb, 2012). While this has led to significant socio-economic development, it has also increased the importance of system risk in, for example, global financial markets, globally integrated supply chains, and transport, energy, and Internet infrastructure (e.g., Michaud, 2023). System risk can be interpreted as the risk that a specific, often low-probability event triggers irreversible damage to, or even the collapse of, such a system due to the existence of weakest links, bottlenecks, and other types of vulnerabilities (Taleb et al., 2014; Platje et al., 2023).

Vulnerabilities tend to become visible through events such as the financial crisis of 2008-2009, the Fukushima disaster (2011), the Chernobyl disaster (1986), the Challenger disaster (1986), and the Covid-19 pandemic (2020-2021) (Taleb, 2007; Ferguson, 2022). While these events have been called unexpected (so-called Black Swans), signals of their possibility existed at an earlier stage. These signals of potential system risks were ignored for various reasons (Acton and Hibbs, 2012; Wucker, 2016; Ferguson, 2022; Platje et al., 2022). Events such as COVID-19 and the war in Ukraine, while problematic in themselves, exposed the weaknesses and vulnerabilities in global supply systems. These weaknesses and vulnerabilities must be identified, as they create system risks. This requires a precautionary approach, as the impact or magnitude of a disturbing event tends to be large-scale and irreversible (Taleb et al., 2014).

The issue of system risk is exemplified by these crises, which reveal the interconnectedness of global markets and supply chains (see Taleb, 2012), as well as the strong dependency on fossil fuels, which account for about 80% of the global energy supply (Smil, 2022). The last decades have been a period of increasing interconnectedness in the globalizing economy, with a growing number of vulnerabilities that often remain unnoticed in the growing complexity of global trade. This makes the world more vulnerable to random events that can lead to irreversible damage (Mandelbrot and Hudson, 2008; Taleb, 2012; Taleb et al., 2014). Integrated, complex networks are characterized by system risks, which can lead to a system breakdown, threatening individual, regional, national, and transnational energy security (Smil, 2022).
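The role of weakest links in such networks can be illustrated with a toy simulation (our own sketch, not taken from any of the cited works; all names and parameters are illustrative assumptions). A single failure spreads through a random supply network under two failure rules: a "weakest-link" rule, where one failed supplier takes a firm down, and a "redundant" rule, where a firm survives as long as one supplier still works.

```python
import random

# Toy sketch (illustrative only, not from the source): two failure rules on
# the same random supply network show why weakest links create system risk.

def build_network(n=100, k=3, seed=7):
    rng = random.Random(seed)
    # deps[i] = the k firms that firm i depends on
    return {i: rng.sample([j for j in range(n) if j != i], k) for i in range(n)}

def cascade(deps, start, rule):
    failed = {start}
    changed = True
    while changed:
        changed = False
        for firm, suppliers in deps.items():
            if firm not in failed and rule(suppliers, failed):
                failed.add(firm)
                changed = True
    return failed

net = build_network()
# weakest-link rule: one failed supplier is enough to take a firm down
fragile = cascade(net, 0, lambda s, f: any(d in f for d in s))
# redundant rule: a firm survives as long as one supplier still works
robust = cascade(net, 0, lambda s, f: all(d in f for d in s))
print(f"weakest-link cascade: {len(fragile)} firms failed; "
      f"with redundancy: {len(robust)} firm(s) failed")
```

The same initial shock is contained under redundancy but can cascade system-wide under the weakest-link rule, which is the mechanism behind the vulnerabilities discussed above.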

A challenge is that system risk, owing to its relation to the complexity of systems, is difficult for human beings to grasp. People tend instead to think in simple cause-and-effect relations (Kahneman, 2011). Research on the concepts of individual and system risk is needed, as there remains a lack of clarity on the relation between risk perception, risk preparedness, and system risk. There is a wide range of ambiguous definitions of risk, risk perception, and the related concepts of crisis, collapse, system, system risk, etc. While, for example, Miceli et al. (2008: 165) define risk perception as “a complex process which encompasses both cognitive and affective aspects”, Wachinger et al. (2013: 1049) define it as “a process of collecting, selecting, and interpreting signals about uncertain impacts of events, activities, or technologies”. In turn, Lavigne et al. (2008: 273) see it as “the estimated probability people have that hazard will affect them,” while Slovic (1987) emphasizes “the intuitive risk judgment of individuals and social groups in the context of limited and uncertain information.”

Individuals tend to have difficulty understanding system risk at the macro level, as people find it hard to observe interactions in complex systems (Sterman, 2000; Kahneman, 2011). As a consequence, what is a virtue or a rational decision at the individual level can be a vice at the system level. This is well known in the case of an economic crisis, where individuals save more (precautionary saving), which reduces spending and in turn deepens the crisis through reduced production and sales (Keynes, 1936).
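This feedback loop can be made concrete with a minimal sketch (our illustration, not the author's model) of the Keynesian paradox of thrift: each round, households spend a fraction of their income, and that spending becomes the next round's income, so a higher saving rate, individually prudent, shrinks aggregate income for everyone.

```python
# Minimal sketch (illustrative assumption) of the Keynesian paradox of
# thrift: spending out of income feeds back into income, so the economy
# converges to Y = I / s, where s is the saving rate and I is investment.

def equilibrium_income(saving_rate, investment=100.0, rounds=200):
    income = investment
    for _ in range(rounds):
        spending = (1 - saving_rate) * income   # consumption out of income
        income = investment + spending          # demand determines income
    return income

low = equilibrium_income(saving_rate=0.20)   # converges to 100 / 0.2 = 500
high = equilibrium_income(saving_rate=0.40)  # converges to 100 / 0.4 = 250
print(f"saving 20%: income ≈ {low:.0f}; saving 40%: income ≈ {high:.0f}")
```

Doubling the saving rate halves equilibrium income in this toy economy: individually rational precaution amplifies the downturn at the system level.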

This, together with a wide range of other factors (such as the focus of individuals and companies on short-term benefits), forms the basis for the following working hypothesis:

Individual risk aversion leads to the amplification of system risk.

In the neglect or denial of system risks, worldviews (Meadows, 1998, 1999) may play an important role. For example, some basic vulnerabilities, such as the dependence on fossil fuels in global production chains, are well known (Smil, 2022). The optimism embraced in many theories of economic growth tends to foster the belief that technology and well-functioning markets will allow physical barriers in natural resource availability to be managed (Raworth, 2017) and that the global economic system is resilient (Mandelbrot and Hudson, 2008). This may lead to downplaying the importance of system risk. The discussion on limits to growth (Meadows et al., 1972) continues, and there remains a strong belief among economists that natural resources such as raw materials and oil will remain available at a proper price (e.g., Simon, 1981). This is questioned by biologists (Ehrlich, 1968; Ehrlich and Ehrlich, 1996) and ecological economists (e.g., Daly, 1996). The practice of, to mention just one example, energy transition seems to be based on the underlying paradigm that the economy can grow infinitely thanks to innovations that will make more renewable energy available (Jensen et al., 2021), ignoring the physical limits of natural resources (Boulding, 1966), the laws of thermodynamics (Georgescu-Roegen, 1971), as well as rebound effects, positive feedback loops, and cascading effects in dynamic systems (Jevons, 1866; Sterman, 2000).
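The rebound effect mentioned above can be sketched with a toy demand model (our illustration; the elasticity value and functional form are assumptions, not from the cited sources). Improving energy efficiency lowers the cost per unit of energy service; if demand for the service is elastic enough, total energy use rises rather than falls, the pattern Jevons observed for coal.

```python
# Toy sketch (our assumption) of the Jevons rebound effect: with a
# constant-elasticity demand curve, an efficiency gain cuts the cost per
# unit of service; if elasticity > 1, total energy use increases.

def energy_use(efficiency, price=1.0, elasticity=1.5, scale=100.0):
    cost_per_service = price / efficiency
    service_demand = scale * cost_per_service ** (-elasticity)
    return service_demand / efficiency  # energy used = service / efficiency

before = energy_use(efficiency=1.0)  # baseline: 100 energy units
after = energy_use(efficiency=2.0)   # doubled efficiency, elastic demand
print(f"energy use before: {before:.0f}, after doubling efficiency: {after:.0f}")
```

In this sketch, doubling efficiency raises total energy use from 100 to roughly 141 units, illustrating why efficiency gains alone need not reduce resource throughput.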


The innovativeness of the research carried out in the framework of this Research Centre lies in creating a framework and an application that allow for the identification of worldviews and cognitive biases that hamper the acknowledgement of system risk. Distorted risk perception or awareness tends to reduce people’s willingness to take precautions, leads them to ignore signals provided by Early Warning Systems, and/or leads to decisions that increase the probability, impact, and/or magnitude of system risk. Such a probability amplifier of system risk is an element that¹:

  • increases the probability of the system risk occurrence,
  • boosts its severity (impact, magnitude),
  • or brings it closer (affecting the system risk timeframe).
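The three dimensions of the amplifier definition above can be expressed as a small data model (a hypothetical sketch; all names, fields, and factor values are our assumptions, not part of the source definition):

```python
from dataclasses import dataclass

# Hypothetical sketch of the amplifier definition above: an amplifier raises
# the probability of a system risk, boosts its severity, or shortens its
# time horizon. All names and numbers here are illustrative assumptions.

@dataclass
class SystemRisk:
    probability: float    # chance of occurrence within the time horizon
    severity: float       # impact/magnitude if the event occurs
    horizon_years: float  # expected timeframe of the risk

def amplify(risk, prob_factor=1.0, severity_factor=1.0, horizon_factor=1.0):
    """Apply an amplifier along any of the three dimensions."""
    return SystemRisk(
        probability=min(1.0, risk.probability * prob_factor),
        severity=risk.severity * severity_factor,
        horizon_years=risk.horizon_years * horizon_factor,
    )

base = SystemRisk(probability=0.01, severity=100.0, horizon_years=20.0)
# e.g., ignoring early-warning signals might both raise the probability
# and bring the event closer in time
amplified = amplify(base, prob_factor=3.0, horizon_factor=0.5)
print(amplified)
```

The sketch only encodes the definition; which real-world factors act along which dimension is precisely what the empirical research is meant to establish.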

The scientific objective is to create and empirically verify a model of the determinants that amplify system risk, and to determine the capacity to create an Early Warning System. The resulting application can be used by individuals and organizations to identify and act on signals of threats to system and organizational sustainability and to the achievement of the Sustainable Development Goals. It enables the identification and discussion of the worldviews and cognitive biases underlying decision-makers’ ignorance of system risks, which leads to unsustainable practices.

Some research questions to be dealt with are:

  • RQ1. Does individual risk aversion amplify the ignorance of system risks?
  • RQ2. Which cognitive biases amplify individual risk aversion?
  • RQ3. Which worldviews amplify the probability of system risk?
  • RQ4. Which worldviews amplify the magnitude of the potential impact of system risk?
  • RQ5. Which worldviews strengthen cognitive biases regarding system risks?

The basis for the research lies in dealing with the following questions regarding system risk:

  • a) Does the system risk exist?
  • b) What are the determinants of the system risk?
  • c) What is the impact or magnitude of the system risk?
  • d) What should be done in order to deal with the system risk?
  • e) What is the impact of the proposed solution?

This text is based on the article in progress: Dominant Social Paradigm and ignorance of system risks – an empirical study (Joost Platje, Anna Motylska-Kuźma, Dorota Dyjakon, Jarl K. Kampen, Ynte K. van Dam, Nadar Shah).


¹ This definition was developed by Prof. Marek Krótkiewicz of the Wrocław University of Technology, Poland.

Acton, J.M., Hibbs, M. (2012), Why Fukushima was preventable; Carnegie Endowment for International Peace: Washington DC, https://carnegieendowment.org/files/fukushima.pdf [28.11.2018].

Boulding, K.E. (1966). The Economics of the Coming Spaceship Earth. In: Jarrett, H. (ed.). Environmental Quality in a Growing Economy, Essays from the Sixth RFF Forum: 3-14. Baltimore MD.

Daly, H.E. (1996), Beyond Growth, Beacon Press, Boston.

Ehrlich, P. R., Ehrlich, A. H. (1996). Betrayal of science and reason: How anti-environmental rhetoric threatens our future. Island Press.

Ehrlich, P.R. (1968), The population bomb, Ballantine Books, New York.

Ferguson, N. (2022), Fatum. Polityka i katastrofy współczesnego świata [Doom: The Politics of Catastrophe], Wyd. Literackie, Kraków.

Georgescu-Roegen, N. (1971), The entropy law and the economic process, Harvard Univ Press.

Jensen, D., Lierre, K., Wilbert, M. (2021), Bright Green Lies, Monkfish Book, Rhinebeck, NY.

Jevons, W. S. (1866), The coal question; an inquiry concerning the progress of the nation and the probable exhaustion of our coal-mines, Macmillan.

Kahneman, D. (2011), Thinking, Fast and Slow, Penguin Books, London.

Keynes, J.M. (1936), The general theory of employment, interest and money, Cambridge.

Lavigne, F., De Coster, B., Juvin, N., Flohic, F., Gaillard, J. C., Texier, P., Morin, J., Sartohadi, J. (2008). People’s behaviour in the face of volcanic hazards: Perspectives from Javanese communities, Indonesia. Journal of Volcanology and Geothermal Research, 172(3–4), 273–287.

Mandelbrot, B., Hudson, R.L. (2008), The (Mis)Behaviour of Markets, Profile Books, London.

Meadows, D. (1999), Leverage Points: Places to Intervene in a System, The Sustainability Institute.

Meadows, D. (1998), Indicators and Information Systems for Sustainable Development, The Sustainability Institute.

Meadows, D. H., Randers, J., Meadows, D. L. (2013). The limits to growth (1972). In The future of nature (pp. 101-116). Yale University Press.

Miceli, R., Sotgiu, I., Settanni, M. (2008). Disaster preparedness and perception of flood risk: A study in an alpine valley in Italy. Journal of Environmental Psychology, 28(2), 164–173.

Michaud, S.P. (2023), The resource balanced economy to meet the twin challenges of phasing out fossil fuels and self-sufficient supply of raw materials, BSR Policy Briefing 2/2023, Centrum Balticum.

Platje, J., Motylska–Kuźma, A., Will, M., van Dam, Y.K., Kampen, J.K. (2023), Dominant Social Paradigm and ignorance of system risks – an empirical study, paper in progress.

Platje, J., Will, M., Paradowska, M., van Dam, Y.K. (2022), Socioeconomic Paradigms and the Perception of System Risks: A Study of Attitudes towards Nuclear Power among Polish Business Students. Energies, 15, 7313.

Raworth, K. (2017). Doughnut Economics, Chelsea Green Publishing.

Simon, J. (1981). The Ultimate Resource, Princeton, Princeton University Press.

Slovic P. (1987). Perception of risk. Science, 236, 280–285.

Smil, V. (2022), How the world really works, Viking, UK.  

Sterman, J.D. (2000), Business Dynamics: Systems Thinking and Modeling for a Complex World, Irwin/McGraw-Hill, Boston.

Taleb, N.N. (2007), The Black Swan – the impact of the highly improbable, Penguin Books, London.

Taleb, N.N. (2012), Antifragile – things that gain from disorder, Penguin Books, London.

Taleb, N.N., Bar-Yam, Y., Douady, R., Norman, J., Read, R. (2014), The precautionary principle: Fragility and black swans from policy actions, NYU Extreme Risk Initiative Working Paper, pp. 1-24.

Wachinger, G., Renn, O., Begg, C., Kuhlicke, C. (2013). The risk perception paradox-implications for governance and communication of natural hazards. Risk Analysis, 33(6), 1049–1065.

Wucker, M. (2016), The gray rhino: How to recognize and act on the obvious dangers we ignore, Macmillan.

If you are interested in cooperating with the Research Centre of System Risk Management, please contact:

dr hab. Joost (Johannes) Platje, prof. UWSB Merito in Wrocław — Director of the Research Centre for System Risk Management