A challenge is taking shape in Singapore, one involving attention spans, screen time, and invisible dopamine loops. Earlier this year, the Singapore government introduced updated screen time guidelines for children, acknowledging growing concerns that excessive use of digital devices may be contributing to health and behavioural problems in younger users. That announcement came on the heels of a survey revealing that one in two Singaporeans shows signs of problematic smartphone use.
This concern is reflected in national data: a survey jointly conducted by CNA and the Institute of Policy Studies, which polled youths and their parents, found that Singaporean teenagers aged 13 to 19 spend nearly 8.5 hours daily on electronic devices. Additionally, 58.8 per cent of respondents reported using screen time as a coping mechanism for negative emotions, and 62.5 per cent of teens surveyed found it difficult to reduce their screen time — indicating an imbalanced and unhealthy relationship with technology.
Compulsive use of digital devices may lead to several adverse effects on physical and psychological health — from sleep disorders, poor posture, progressive myopia, and digital eye strain to an increased risk of obesity due to a sedentary lifestyle. Compulsive smartphone use has also been associated with symptoms of depression and anxiety. The question facing policymakers, therefore, is no longer whether digital overuse is a public concern, but what can be done about it.
Dr Araz Taeihagh — Assistant Professor at the Lee Kuan Yew School of Public Policy (LKYSPP), National University of Singapore (NUS) — believes the answer lies not just in behavioural change but also in the governance frameworks for the technologies shaping such behaviour. His research focuses on policy design and governance of emerging disruptive technologies.
“The rapid pace of development of these technologies often surpasses our ability to understand the implications of these technologies, which leads to significant risks and unintended consequences,” Dr Taeihagh said. “Due to a lack of information about these technologies and the networked nature of these technologies, traditional command and control and incentive-based policy tools are ill-equipped to deal with them.”
The smartphone, once a communications tool, has evolved into a portal for everything from entertainment and education to e-commerce and social life. The apps we use are no longer neutral — many are designed to maximise engagement, keeping users scrolling, swiping, or playing longer. This shift is not accidental. It’s the result of behavioural design: using psychological cues to hook users and form habits that can be hard to break.
But unlike other public health threats, such as tobacco or sugar, the harms of compulsive technology use are harder to detect and even harder to regulate. Platforms are constantly updating their algorithms. New apps go viral in weeks. And policymakers, already stretched across sectors, often lack the expertise or agility to respond in real time.
A recent study, “New policy tools and traditional policy models: better understanding behavioural, digital and collaborative instruments”, co-authored by Dr Taeihagh and published in Policy Design and Practice, adds further weight to this view. Governments increasingly rely on new policy tools — including behavioural ‘nudges’, digital platforms, and collaborative co-design processes — that sit outside the traditional regulatory toolbox. These tools, when deployed effectively, can reshape how individuals interact with technology. However, he emphasises that behavioural nudges alone may be insufficient for some high-risk populations, and that hybrid approaches combining nudges with targeted limitations may be necessary (e.g. soft nudges paired with automatic time-outs for under-12s).
As the research notes, deployment of these instruments presents a range of challenges. Addressing them requires a better understanding of their performance in different domains and contexts, the effects of capacity constraints on their design and use, as well as the concerns they raise about issues such as data governance, algorithmic bias, and fairness.
Dr Taeihagh said, “The government’s critical role in navigating disruptive technologies’ challenges means that they must adopt an adaptive and proactive approach and increase transparency and collaboration by drawing from public consultations and partnerships with academia, industry and civil society to benefit from a wide range of experiences and expertise to better understand disruptive technologies and address the challenges they bring.”
Some countries have already tried more traditional methods, with mixed success. South Korea, for instance, enacted a “Shutdown Law” in 2011, banning online gaming for minors between midnight and 6am. The policy was in place for a decade but was abolished in 2021, after tech-savvy youths found workarounds and questions emerged about its effectiveness.
In Singapore, while the government’s emphasis has so far leaned toward education and public awareness — and guidelines primarily target screen time for children — Dr Taeihagh’s research underscores that governance must extend beyond these measures to encompass design-level interventions across all user groups. The new screen time guidelines — developed with input from health professionals — offer age-specific advice for digital device use at home and in schools. But turning recommendations into real behavioural change will likely require more ambitious approaches.
One approach is to improve platform design transparency. Apps could be required to disclose when and how they use algorithms to increase engagement, or to offer opt-in features that encourage healthy usage — such as screen time reminders, time caps, or focus modes. Another possibility is to shift design incentives altogether, rewarding companies for promoting well-being rather than just attention metrics. Dr Taeihagh explained, “Promoting collaboration and partnerships can help encourage ethical and responsible innovation.”
He added, “Appropriate levels of regulations and provision of ethical guidelines can help ensure the responsible use of technology.” However, poorly designed regulations can have unintended consequences, such as pushing innovation to other jurisdictions or worsening inequalities in digital access. Any effective policy must aim to strike a balance between protection and participation — safeguarding public health without excluding communities from the benefits of technology.
Furthermore, as technologies evolve, tomorrow’s digital environments could be even more immersive — and potentially more addictive — than today’s. With virtual reality, generative AI, and even brain-computer interfaces entering the mainstream, the line between online and offline life is blurring, and policymakers need to prepare for the governance challenges that will come with it.
Ultimately, tech addiction is not just a matter of individual discipline or parental control, though both remain important. It is also about building a system of governance capable of responding to technologies designed to outpace regulation.