I recently gave a keynote at the World Organisation of Systems and Cybernetics (WOSC) congress in Ibagué, Colombia. This brought together some of my intellectual heroes. One was Fernando Flores, former Chilean finance minister, inventor, entrepreneur, philosopher and co-author of the best book I’ve ever read on entrepreneurship, ‘Disclosing New Worlds’.
Another was Humberto Maturana, biologist and philosopher, almost unique in the range of fields he has influenced, from science to poetry, sociology to art. There were also dozens of people applying systems thinking to everything from traffic management and water to poverty alleviation and economic growth.
I went because I’m convinced that the ability to think and act systemically is the greatest intellectual and practical challenge of this century – relevant to climate change, economic growth, water use, digital technology and just about everything else. Yet our dominant intellectual frameworks are largely un-systemic, and our most powerful institutions, from big companies to governments, are mainly designed to squeeze out systems thinking.
The tradition Maturana and others are part of traces its roots to the cybernetics of Norbert Wiener, through figures like Heinz von Foerster and Stafford Beer, to Niklas Luhmann – who remains one of the most extraordinary social theorists of recent times, and who integrated Maturana’s ideas into the core of Western social theory.
Its models of feedback loops and system dynamics have been absorbed into much of computing, as well as into the tools used to manage energy or transport systems, and it provides a way of thinking about everything from how a firm thinks to how water flows around a city. Concepts such as Ross Ashby’s ‘requisite variety’ – the idea that any controller needs at least as much variety as the system it is controlling – are widely understood. Several Nesta publications have tried to take stock of the field and where it’s heading.
Maturana’s main contribution was the concept of autopoiesis – the insight that living things create and maintain themselves, existing in order to go on existing. I wrote about the application of autopoiesis in a couple of books (notably ‘Connexity’) and was particularly excited to see the concept being made more rigorous and measurable. One presenter talked of the ‘life ratio’ as a measure of autopoiesis: how much of a system’s complexity is generated by the system itself, and how much is imposed by its environment.
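The precise formalism presented at the conference isn’t reproduced here, but as a rough sketch of the idea: if complexity is measured as Shannon entropy, a ‘life ratio’ could be taken as the share of a system’s total complexity that comes from its own internal dynamics rather than from its environment. The code below is purely illustrative, and the function names and example sequences are invented for the purpose.

```python
from collections import Counter
from math import log2

def shannon_entropy(events):
    """Shannon entropy (in bits) of a sequence of discrete events."""
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def life_ratio(internal_events, external_events):
    """Illustrative 'life ratio': the share of a system's overall complexity
    generated by its own internal dynamics rather than imposed by its
    environment. Near 1 suggests a largely self-producing (autopoietic)
    system; near 0, one driven almost entirely from outside."""
    internal = shannon_entropy(internal_events)
    external = shannon_entropy(external_events)
    total = internal + external
    return internal / total if total else 0.0

# A system cycling through many internal states in a monotonous environment
# scores high; swap the arguments and the ratio collapses.
print(life_ratio("abcdabcdabefgh", "xxxxxxxyxxxxxx"))
```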
A practical example of this was traffic light management: does traffic circulate best with a centrally controlled system, or with one where each traffic light makes decisions independently, for example in response to the relative weight of traffic in each direction? This is the sort of thing that can be modelled precisely, with detailed answers on what balance of centralisation and decentralisation works best under different conditions.
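As a toy illustration (not the model presented at the conference), the sketch below compares a fixed, centrally timetabled signal with a ‘responsive’ one that simply serves whichever queue is longer; the arrival and clearance rates are invented parameters.

```python
import random

# Toy comparison of two traffic-light policies at a single junction.
# Cars arrive at random on a north-south ("ns") and an east-west ("ew")
# approach; each step the green direction releases up to CLEAR_RATE cars.
ARRIVAL_RATE = 0.4   # probability of a car arriving per approach per step (invented)
CLEAR_RATE = 1       # cars released per step of green (invented)
STEPS = 10_000

def simulate(policy: str) -> float:
    """Return the average total queue length over the run."""
    random.seed(42)                               # same arrivals for both policies
    queues = {"ns": 0, "ew": 0}
    green = "ns"
    waiting = 0
    for t in range(STEPS):
        for d in queues:
            queues[d] += random.random() < ARRIVAL_RATE
        if policy == "fixed_cycle":               # centrally set timetable
            green = "ns" if (t // 20) % 2 == 0 else "ew"
        else:                                     # "responsive": local decision
            green = max(queues, key=queues.get)   # serve the longer queue
        queues[green] = max(0, queues[green] - CLEAR_RATE)
        waiting += sum(queues.values())
    return waiting / STEPS

for policy in ("fixed_cycle", "responsive"):
    print(f"{policy}: average queue length {simulate(policy):.2f}")
```

Even a toy like this makes the trade-off discussable in precise terms: the interesting question becomes under which traffic patterns each policy does better, rather than which is right in principle.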
Others linked systems thinking to the hermeneutic tradition – the interpretation of texts. Since all human organisation and cooperation depends on communication, it’s important to recognise that the person communicating cannot control how their messages will be interpreted. Most profound change is as much about changes in interpretation, and how we make sense of the world, as it is about changes in physical or monetary quantities.
My keynote presented some of Nesta’s work: new ways of organising feedback in systems using data; our work on innovation, operating simultaneously at multiple levels from analysis and policy to demonstration projects; and our current work on systems change. The latter addresses central questions for the systems field: how do you bring the parts of a system together so that they sense themselves as one system; how do you encourage the people in a system to share a diagnosis of what’s wrong, to design improvements, to rewire the connections between the parts, and then to make the leap to a new way of doing things?
This sort of change is always as much a question of emotion and relationships as it is of rational design. But it can be described, and to a degree planned and managed (my paper for the conference, which will be published shortly, sets this out in more detail).
Some of the discussion picked up on Cybersyn, the extraordinary project under Chile’s President Salvador Allende in the early 1970s to create a cybernetic network to manage the economy (WOSC is overseen by Raul Espejo, who had helped design Cybersyn). The inspiration for Cybersyn was the eccentric English consultant Stafford Beer, a man of cigars and Rolls-Royces, grandiose theory and flashes of brilliance. Eden Medina’s book ‘Cybernetic Revolutionaries’ describes him and the Cybersyn project brilliantly, as does a recent New Yorker article by Evgeny Morozov which presents it as the socialist precursor to big data.
It was an inspired project but also a rather mad one, given that at the time Chile was being undermined by a very overt conspiracy of the CIA, American multinationals and the military – a conspiracy that ended in a coup, the death of Allende and the killing of thousands of Chileans. To put it mildly, there were more pressing priorities than the creation of a cyber utopia.
The event was in some respects a reminder of the intellectual thinness of much discussion of the digital world. Many of the insights painfully learned decades ago about the interaction of machines and humans, and about the social dimension of technological change, have been forgotten. The main writers on digital technology have none of the subtlety of an earlier generation of systems thinkers, polarising into glib enthusiasts and uncompromising critics.
You can see the problem mirrored in current discussions of topics like algorithmic regulation. Many things are already managed algorithmically – and not just traffic systems. In the UK, algorithms have been widely used in GP surgeries (to predict the likelihood of an emergency hospital admission) and in criminal justice (to predict the likelihood of reoffending).
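Under the hood, such tools are mostly familiar statistical models. The sketch below – using entirely synthetic data and invented predictor names, not any real GP or justice system – shows the general shape: a logistic regression fitted to past outcomes, producing a probability score that can inform, though never replace, human judgement.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Entirely synthetic data: each row is a hypothetical individual described by
# two made-up predictors; the label is whether the outcome of interest
# occurred. Real tools use many more variables and far larger datasets.
rng = np.random.default_rng(0)
n = 1000
prior_events = rng.poisson(2, n)        # e.g. previous admissions or offences
age = rng.integers(18, 80, n)
# Synthetic ground truth: risk rises with prior events, falls with age.
logits = 0.8 * prior_events - 0.03 * age
outcome = rng.random(n) < 1 / (1 + np.exp(-logits))

X = np.column_stack([prior_events, age])
model = LogisticRegression().fit(X, outcome)

# Score a new hypothetical case: 3 prior events, aged 25.
risk = model.predict_proba([[3, 25]])[0, 1]
print(f"Predicted probability of the outcome: {risk:.2f}")
```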
We’re a long way from ‘Minority Report’, but we have tools at our disposal which can help make the world more manageable. Of course, each of these tools brings with it important political and ethical dilemmas. But most writing on the subject is either boosterism or paranoia: neither quite does justice to the challenge. At their best, the traditions of systems thinking and cybernetics have made processes and feedback loops transparent, making it easier to judge their virtues and vices.
Some of their intellectual tools should be the common currency of a connected world. But they’re not generally taught in schools, universities or civil service colleges. For that the field itself must take some of the blame, having often been inward-looking and imprisoned in its own jargon.
More than anything perhaps we now need some really good synthesisers and communicators to make this stuff the common sense it should be.