Emergence versus Entropy, By Julian Birkinshaw, London Business School Term Chair Professor of Strategy and Entrepreneurship.
“I have been puzzling over complexity in organisations for a while now, and I reckon there are three processes underway in organisations that collectively determine the level of actual complexity as experienced by people in the organisation.
1. There is a design process – the allocation of roles and responsibilities through some sort of top-down master plan. We all know how this works.
2. There is an emergent process – a bottom-up form of spontaneous interaction between well-intentioned individuals, also known as self-organising. This has become very popular in the field of management, in large part because it draws on insights from the world of nature, such as the seemingly spontaneous order exhibited by migrating geese and ant colonies. Under the right conditions, it seems, individual employees will come together to create effective coordinated action. The role of the leader is therefore to foster “emergent” order among employees without falling into the trap of over-engineering it.
3. Finally, there is an entropic process – the gradual trending of an organisational system towards disorder. This is where it gets a bit tricky. The disciples of self-organising often note that companies are “open systems” that exchange resources with the outside world, and this external source of energy is what helps to renew and refresh them. But the reality is that most companies are only semi-open. In fact, many large companies I know are actually pretty closed to outside influences. And if this is the case, the second law of thermodynamics comes into effect, namely that a closed system will gradually move towards a state of maximum disorder (i.e. entropy).
And here is the underlying conceptual point. The more open the organisation is to external sources of energy, the easier it is to harness the forces of emergence rather than entropy. What does this mean in practice? Things like refreshing your management team with outside hires, circulating employees, making people explicitly accountable to external stakeholders, collaborating with suppliers and partners, and conducting experiments in “open innovation”.”
Mr. Birkinshaw’s advice, while sound, obscures a deeper psychological and cultural aversion to even thinking about entropy. The people who run organisations are organised, and organised people don’t do entropy. There are people who live by entropy, but they don’t work their way up to the top of organisations. They take what serendipity comes to them and accept the losses of fate. They may achieve some notoriety in art or science, but only tangentially. The really successful artists and scientists are organised (Einstein’s work was organised by his first wife).
Organised people achieve incredible success in moments of low entropy, because when you take a system with profound imbalances and impose a few constraints, the results look miraculous. Think of the Hoover Dam. When it was constructed, it unleashed an immense amount of “free energy”, but that energy attracted populations to Arizona and California greater than the dam could sustain. Now the dam is just a small part of the energy portfolio, too big to shut down but too small to make a difference in anybody’s life. The entropy of the system has overwhelmed its relevance, and no organisation will now bring back the low entropy of a rural landscape with a flowing, falling river. The application of constraints appears to create free energy at first, but at some point additional constraints of the same kind only reduce the free energy further. The result is a social and cultural catastrophe with no logical answers. The same is true in Israel and everywhere else where resources have been allocated in logical but arbitrary orders.
Remember that entropy is not a law or a process. It is just an idea, a way of understanding what happens when the logic of causation breaks down. Statistical mechanics works for phenomena that emerge consistently by chance, not through any chain of cause and effect. Entropy is a way of understanding how systems evolve on their own initiative, unintentionally, and generally uncontrollably. The laws are conservation of momentum, conservation of information, and the constancy of the speed of light. Entropy emerges from the fact that those laws place no constraints on location, and within them nothing is prohibited. Information and momentum are free to go anywhere, so they go as many places as possible as soon as they can get there. The second law of thermodynamics might be restated without mention of entropy as “the energy and information in the configuration of matter will tend to distribute itself as evenly and randomly as possible throughout the available space because it has no reason not to.”
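That restatement can be made concrete with a toy simulation (my own illustrative sketch, not anything from the essay): put every particle in one box, let each one hop to any box at random – no force, no goal, just “no reason not to” – and watch the Shannon entropy of the distribution climb toward its maximum, the uniform spread.

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a distribution of particles over boxes."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

def diffuse(counts, steps, rng):
    """Each step, one particle (chosen at random) hops to a random box.
    Nothing prohibits any destination, so over time everything spreads out."""
    n_boxes = len(counts)
    for _ in range(steps):
        source = rng.choices(range(n_boxes), weights=counts)[0]
        counts[source] -= 1
        counts[rng.randrange(n_boxes)] += 1
    return counts

rng = random.Random(0)
# All 1000 particles start in box 0: a highly ordered, low-entropy state.
boxes = [1000] + [0] * 9
print(round(shannon_entropy(boxes), 3))   # 0.0 -- perfectly ordered
boxes = diffuse(boxes, 20000, rng)
print(round(shannon_entropy(boxes), 3))   # approaches log2(10), about 3.32
```

No rule in the hop logic says “increase entropy”; the increase is just what an unconstrained system does, which is the essay’s point.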
One curious result is that the entropy of one body is constrained by the entropy of everything around it, not by a law or limit of entropy. Curiouser still is the notion that Alice in Wonderland is not a book about entropy, but a book about information unfettered by entropy. What would happen if physical reality were governed by information? It would be arbitrary and capricious. While entropy is not logical or causative, logic and causation depend on the backstop of entropy for their meaning and consistency.