In recent years, it’s become fashionable, almost to the point of cliché, to begin every rumination on policy or business by sagely proclaiming that the environment is, or is becoming more, complex. But when terms like “complex” and “complexity” are loosely bandied about with no clear understanding of what they mean, the phrase “it is complex” becomes less an accurate description of a situation or environment, and more a lament or an excuse.
It becomes a lament, an apologia, because it is an expression of frustration at our inability to understand or explain the problems and challenges (which may or may not be complex in themselves) that confront us. Worse, it runs the real risk of becoming a convenient excuse not only for failures born of overconfidence, complacency, laziness and sloppiness, but also for inaction disguised as benign neglect in favour of hoped-for solutions.
So what is a complex system and what traits does it exhibit? For a start, let’s focus on the word “emergent”, usually encapsulated in the aphorism “the whole is more than the sum of its parts.” The implication is that organisations, situations and crises should be appreciated holistically. Not only that, they cannot be reduced to, or wholly explained by, their component parts. As the former US Federal Reserve Chairman Alan Greenspan put it, a single bank cannot explain the banking system. Neither can the actions or behaviours of a single, archetypal individual (no matter how representative) explain our social problems.
An important implication of this is that the conventional economic approach, which focuses on the calculus of a “representative agent” and then extrapolates it to predict the behaviour of the economic system, is, at best, a first approximation of how the economy – a complex system – works. At worst, it can lead to rather misleading conclusions and policy prescriptions.
At the same time, the whole is less than the sum of its parts, since the whole imposes constraints on the parts that make up the whole. Just think of individual components of a system that are of high quality. When the system as a whole fails, it is cold comfort to know that each individual component was reliable on its own. This is a useful reminder, especially for leaders, that the performance of a complex system cannot be reduced, in a linear fashion, to simply the existence of high quality components or, in the case of organisations, competent individuals.
Second, a complex system is often defined by fat-tailed behaviours, where rare events with large consequences occur much more often than would be predicted by normal probabilities. Malcolm Gladwell, among others, drew a distinction between a puzzle, the solution to which is known and knowable ahead of even trying to solve it (think jigsaw puzzle or Sudoku), and a mystery, which can be ambiguous, subject to different interpretations, and defies clear answers. Complex problems fall into the latter category.
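To make the fat-tail point concrete, here is a minimal, illustrative sketch (not from the original article) comparing how likely a five-standard-deviation event is under a thin-tailed normal distribution versus a heavy-tailed Student-t distribution with three degrees of freedom, used here purely as a stand-in for a fat-tailed process; it assumes Python with scipy available.

```python
# Illustrative sketch only: how much more often an "extreme" (five-sigma)
# event occurs under a fat-tailed distribution than under the normal one.
# The Student-t with 3 degrees of freedom is an arbitrary stand-in for a
# fat-tailed process; scipy is assumed to be installed.
from scipy import stats

threshold = 5.0  # a five-standard-deviation event

p_normal = stats.norm.sf(threshold)       # P(X > 5) under the normal distribution
p_fat_tail = stats.t.sf(threshold, df=3)  # P(X > 5) under a Student-t with 3 df

print(f"Normal tail probability:     {p_normal:.2e}")    # roughly 3 in 10 million
print(f"Fat-tailed tail probability: {p_fat_tail:.2e}")  # roughly 8 in 1,000
print(f"The fat-tailed model makes the extreme event about "
      f"{p_fat_tail / p_normal:,.0f} times more likely.")
```

The exact numbers matter less than the gap of several orders of magnitude: under fat tails, “rare” disasters are not nearly as rare as normal probabilities would suggest.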
Third, complex problems exhibit inherent uncertainty and unpredictability. What this means is that the relationship between cause and effect is weak and not repeatable. Ex-ante, this cause-effect relationship is not apparent. Ex-post, a “causal” relationship may appear to be the case. Even then, this is often a case of retrospective coherence rather than a stable, repeatable cause-effect relationship that can be taken for granted. This is what distinguishes complexity from complicatedness. In the case of the latter, the relationship between cause and effect is clear, even if it is clear only to technical experts. Complicated systems like a car or an aircraft can be engineered; they can be taken apart and put back together in exactly the same way. In contrast, rainforests and ant colonies are complex – they cannot be engineered or reconstructed in the same way that a car or an aeroplane can be.
Dealing with Complicated Problems
This distinction between complex and complicated problems is crucial for managers. Understanding the nature of their problems allows them to tailor their responses and solutions accordingly.
Complicated problems can be addressed by checklists, standard operating procedures (SOPs), and the sharing of information and best practices. With complicated problems, the risks are known or at least quantifiable. Experts know in advance of the problem occurring what can be done to minimise the risks of occurrence. It is also possible to identify best practices, and to require professionals to adhere to them.
Take the recent outbreak of Hepatitis C at the Singapore General Hospital as an example of a complicated, not complex, problem. The risks of infection are knowable and known. This was not a crisis borne out of complexity. Rather, as the Independent Review Committee (IRC) pointed out, the infections were the result of “deviations from standard procedures” and “inefficient workflow”. Poor infection control in an environment where patients were at higher risk of exposure and were more susceptible to infection created the perfect storm.
Medical professionals were also not in any doubt about what constituted best practices in infection control. Indeed, one of the biggest advances in healthcare in recent decades has been the introduction of medical protocols and checklists, especially in hospitals, to reduce the risks of infection and mistakes by professionals.
As Atul Gawande illustrates in The Checklist Manifesto: How to Get Things Right, the implementation of a five-point checklist in the intensive care unit at Johns Hopkins Hospital in 2001 virtually eliminated central line infections, and prevented an estimated 43 infections and eight deaths over a period of 27 months. The same checklist was tested in ICUs in Michigan, and reduced infections by 66 percent in three months, saving more than 1,500 lives in a year and a half. Failure in hospitals, Gawande concludes, is less the result of ignorance (not knowing what works), and more the result of ineptitude (not applying what we know works).
Not surprisingly, the IRC urged SGH to “review existing SOPs and practices on infection control, to further reduce risk of contamination of medical equipment and contact surfaces and to ensure adequate environmental cleaning and disinfection”, to “ensure adherence to standard precautions for infection control and adopt best practices”, and to “strengthen the monitoring and supervision framework for staff to ensure compliance to SOPs”.
Beyond Checklists: Dealing with Cognitive Limitations
But while establishing checklists and SOPs is a necessary first step, it is not sufficient. Just because professionals are dealing with problems for which best practices and established processes exist does not mean that they will always make good decisions. Even with known risks for which checklists exist, professionals may still fall prey to a variety of cognitive limitations and errors.
Compliance with established SOPs may not be as straightforward and effortless as we assume. This is especially so when professionals are often tired, distracted, or both.
It is well established that well-trained professionals, when tired, sometimes fail to stick strictly to established protocols or default into less-than-optimal behaviours. A 2011 study of eight Israeli judges considering more than 1,100 parole applications found that the judges were more likely to grant parole at the start of the day, and after breaks for a morning snack and lunch.[1] Shai Danziger, one of the co-authors of the study, argued that the combination of “choice overload” and repetitive decision-making led the judges to choose the “lazy” or default option of denying the parole application.
In the healthcare context, a 2014 study by researchers from the University of Pennsylvania and the University of North Carolina at Chapel Hill found that healthcare workers washed their hands on 42.6 percent of the occasions that they were supposed to at the start of a 12-hour shift, but on only 34.8 percent of such occasions in the last hour of the shift.[2] The study also found that workers were more likely to wash their hands after a longer time off between shifts.
What these studies suggest is that even well-trained professionals find it difficult to engage in careful, deliberate decision-making, and that they often rely on the default option (or the path of least resistance) when they are cognitively depleted. If so, the solution does not just lie in exhortations to hospital staff to “pay attention!”, or in increasing monitoring and supervision. Rather, it is to think creatively about how SOPs can be designed in a way that requires less cognitive effort from professionals.
Making SOPs simpler, more intuitive and more compatible with people’s cognitive limitations over a long working day is likely to be more effective in the long run than diktats from above. Having professionals themselves develop the SOPs they will follow is also a good practice.
A second factor that has been found to reduce professionals’ ability to comply with established procedures is distraction. Take medication errors, a common cause of harm to patients, as an example. In wards, nurses are frequently distracted and interrupted by requests from patients, instructions from doctors, phone calls to the ward, queries from visitors, searches for equipment, and so on.
While such interruptions and disruptions are common, they can be extremely dangerous when nurses are doing their drug rounds. Interruptions then may cause drugs to be administered wrongly or in the wrong dosages. One solution, which has been tried out and is relatively easy to implement, is to require nurses on drug rounds to put on a sign that says, “Drug round in progress – do not disturb!”[3]
Beyond Individual Limitations: Managing Organisational Biases
Beyond a professional’s cognitive limitations, it is also critical to understand the organisational context in which he works. In the case of medical professionals, understanding the hospital environment and its organisational biases produces more reasons why a singular emphasis on monitoring and supervision for compliance with established SOPs may not work as well as in, say, a military context.
Hospitals are not just large organisations; they are also sprawling ones. Within a hospital, one usually finds many specialisations, each with its sub-specialisations and, sometimes, its own norms and culture. Medical professionals in one specialisation are also usually reluctant to comment on, or review, the decisions or practices of their peers in another.
In addition, while there is a formal hierarchy in hospitals (as there is in an army brigade), it often matters less than the authority and respect that individual physicians command among their peers and juniors. The nature of a hospital’s work – 24/7 operations, numerous “transactions” that are usually not visible to managers – also makes it difficult for hospital managers to exercise close monitoring and supervision, or to ensure strict compliance with SOPs that have been agreed upon.
In recent years, the proliferation of well-designed and user-centred personal devices has made consumers appreciate the importance of a good interface between humans and machines for optimal performance. Consider the latest smartphones. Even though they contain highly complicated technology, users do not have to undergo special training or read manuals to make good use of them. They are mostly intuitive and easy to operate based on what users already know.
But the importance of a well-designed interface between humans and procedures is much less appreciated and established. This is an area where much more can be usefully done to improve human performance – particularly in high risk areas such as medical care.
Action in a complex world is essentially a wager, due to non-linearity, irreducible uncertainties, and emergence. By contrast, complicated problems lie squarely in the domain of “rocket science”. So really, it’s not complex; it’s rocket science, which we mastered a long time ago.
[1] Danziger S, et al. Extraneous Factors in Judicial Decisions, Proceedings of the National Academy of Sciences, 25 February 2011.
[2] Dai H, et al. The Impact of Time at Work and Time Off From Work on Rule Compliance: The Case of Hand Hygiene in Health Care, Journal of Applied Psychology, 3 November 2014.
[3] Scott J, et al. The Effectiveness of Drug Round Tabards in Reducing Incidence of Medication Errors, Nursing Times, 2010; 106(34): 13-15.
Written by Donald Low, Adrian Kuah and Lam Chuan Leong, this article first appeared in The Straits Times on 19 December 2016.