May 13, 2025
Topics: Policymaking

Imagine spending months and considerable resources developing a new initiative, only to realise halfway through that it’s no longer viable or aligned with strategic goals. Do you cut your losses and move on, or push ahead in the hope that your investment will eventually pay off? 

Many would choose the latter — an instinct driven by what behavioural economists call the sunk-cost fallacy. 

This is the tendency to keep investing time, money, or effort in something simply because you’ve already invested heavily in it, even when continuing is neither beneficial nor rational. Essentially, it’s about not wanting past efforts to feel wasted, which often leads to poor decisions.

When governments can’t let go

Many well-intentioned public policies start the same way: initial execution may be good, but after a while they can go off track, sometimes enormously. There are myriad examples in the annals of bad decisions, but perhaps the best known is the Concorde. The supersonic passenger jet was a hallmark of 1970s and 80s fame, wealth and power, but also a financial black hole that never stacked up economically. So much so, in fact, that the sunk-cost fallacy is also known as the Concorde fallacy.

Humans are famously irrational creatures, but policymakers are supposed to be more rational. Or are they? Professor Lawrence Jin, a behavioural economist at the Lee Kuan Yew School of Public Policy (LKYSPP), National University of Singapore (NUS), outlined some of the hidden psychological factors that policymakers can fall prey to, and how those biases might be mitigated.

First, the biases. The sunk-cost fallacy can be a particularly difficult bias for governments. There is an inherent tension between governments’ legitimate desire to undertake bold, transformative projects that can yield significant long-term public benefits, and the heightened risk of falling victim to the sunk-cost fallacy that such ambition often entails.

But, according to Professor Jin, it is not just policymakers and government decision-makers who are susceptible; public opinion can make decisions more political than rational.

“If the public is already heavily invested in an existing policy, then it is much harder for policymakers to make major adjustments, because sunk-cost fallacy will make the public more upset from these adjustments,” he says.  

Examples in Singapore could include the decision to continue with the city’s Circle Line after the Nicoll Highway collapse in 2004 despite increasing costs, or the government’s decision to persist with the public-private partnership developing the new National Stadium after the consortium ran into financial trouble following the global financial crisis in 2008.  

The illusion of control

Another bias is the illusion of control, where policymakers might overestimate their capacity to influence outcomes. Such a bias can lead to overly ambitious or misguided policies, and can hinder timely recognition of ineffective policies and the need to change course.

“It is tempting to believe that outcomes like economic growth, housing prices, and public health trends can be fully controlled through textbook policies. They can, to some degree, but often much less than we might expect,” said Professor Jin.  

“This is why we think we have a better chance of winning the lottery by picking lucky numbers; pro athletes perform rituals, for example, Rafael Nadal's meticulous routines on the tennis court; and parents over-protect their children with the belief that good parenting can guarantee good outcomes for the children.” 

Seeing what we want to see 

Confirmation bias is problematic in policy creation, because it leads policymakers – and the public – to seek information confirming their existing beliefs. 

This bias can exacerbate challenges, such as when tackling misinformation, as seen in public health issues like vaccine scepticism. 

“Policymakers prone to confirmation bias would be inclined to believe that their policies are working, even when they are not. They must also account for confirmation bias among the public, as it can often hinder the effectiveness of well-intentioned policies,” said Professor Jin. 

He said an example might be a policy that tries to tackle misinformation about vaccine side effects spreading on social media by mandating fact checkers, only to find that the fact checks don’t work because of confirmation bias among the public.

“Vaccine sceptics or climate change sceptics are unlikely to read fact checks and change their core beliefs. It can in fact potentially backfire by reinforcing their original views, including beliefs in conspiracy theories,” he said. 

The spotlight effect in policy 

Salience bias, the tendency to focus disproportionately on prominent issues that grab our attention, also significantly influences public policy, particularly in healthcare. Such selective attention can lead policymakers to overlook vital but less visible problems until it’s too late. 

“Salience bias can explain why we are quick to respond to salient health crises like the COVID-19 pandemic but not so much to less visible issues like mental health, an ageing population, and chronic disease prevention,” Professor Jin said.  

“This is why we fear plane crashes even though car crashes are far more likely to be fatal; fear shark attacks even though we're more likely to die from jellyfish or deer; and worry about nuclear energy even though wind and hydro power result in more fatalities each year.” 

Decision-making patterns in practice 

One of Professor Jin’s areas of research is what is known as path dependency in healthcare. His research has revealed that physicians often rely on recent experiences when making treatment decisions, a cognitive shortcut that can bias future decisions.

“After controlling for patient conditions, physicians exhibited a tendency to repeat the same treatment decision. This pattern was more pronounced when the current patient was similar to the previous patient in observable characteristics. However, when two consecutive patients differed unexpectedly, physicians sometimes exhibited a tendency to reverse the previous decision — more than was warranted by the current patient’s condition.”  

A similar dynamic might occur in policy decisions, he said, potentially leading to inappropriate repetition or reversal of policies due to superficial similarities or differences between issues. 

How to mitigate bias 

Policymakers can mitigate the influence of cognitive biases by first being aware of the role those biases play, and by taking steps to ensure decisions are made at the right time. Simple countermeasures, such as getting enough sleep, can help.

“We found that physicians are more prone to bias when fatigued. Using reminders to draw decision-makers’ attention to relevant information and encouraging open discussions, especially with external experts, can mitigate bias effectively,” Professor Jin said. 

Engaging external experts in open and critical discussions about policies is useful, he said, because “the policymaker may be prone to sunk-cost fallacy, but the external experts won't be.” 

Incentives can be used to help create an environment for unbiased decision-making, but they can also be counterproductive. There is the potential for over-correction, which can be a bias in and of itself, and individuals vary in which biases they are prone to in their decision-making.

“The magnitude of the bias varies across individuals and can change over time. A person not prone to sunk-cost fallacy might make policy adjustments too frequently if incentivised incorrectly.”  

Singapore’s Civil Service College has been actively highlighting the potential impact of biases on policymaking in its training and publications for many years.

When biases work 

Professor Jin also cautions that biases might not always be detrimental. Some studies have suggested that cognitive biases, like overconfidence, might have beneficial effects. 

Looking back at the Concorde: it lost money and the project ended in a tragic accident, but it remains an inspiration to many. Equally, the Sydney Opera House, another project that fell victim to the sunk-cost fallacy, went 1,400 per cent over budget, yet has gone on to become a global icon and arguably a priceless representation of Australia’s coming of age as an independent nation.

In other words, it pays to be conscious of cognitive biases in policymaking while also recognising a good thing when you see it.

