NOTE: This blog post was first written for the Oxford Internet Institute’s Connected Life conference. Find the original post here.

For decades, tech companies have competed to make devices that do more, faster, and with less user effort. In the current culmination of this trend, powerful smartphones enable us to do anything, anywhere, anytime. But a surprising opposition has emerged against the ruling wisdom that ‘convenience is king’. In the Guardian, the writer Jonathan Safran Foer complained that he, “like so many people I know”, felt that constant connectivity “distracted me, made concentration more difficult … [made me check] email while giving my kids a bath”.

The Economist recently ran a feature on addictive apps, and a whole market has emerged for apps that remove functionality from laptops and smartphones (e.g. Freedom and Newsfeed Eradicator), punish users for using Facebook when they shouldn’t (e.g. Forest), or provide users with easy ways to track and visualise how much they use their digital devices (e.g. RescueTime). A blog post on Medium that gave advice on which apps to remove and which parental restrictions to apply to make your iPhone ‘distraction-free’ amassed hundreds of thousands of reads in a few days.

Scientific research on self-regulation may explain what’s going on. As noted above, developers’ central goal has been to give users the easiest possible access to the widest range of functionality, leaving users to exercise willpower over how they use their devices. Yet a wealth of psychological research tells us that relying on ‘willpower’ is a terrible self-control strategy. Empirically, the difference between people who are good at self-control and people who are terrible at it isn’t that the former have stronger ‘willpower’. Rather, people high in self-control are better at structuring their environment so that they are less exposed to temptation (e.g. not keeping unhealthy foods in the house, going to a library to study, etc.).

Imagine one of your friends was attempting (and failing) to eat fewer cookies. As you try to help, you discover that he brings a huge supply of cookies with him wherever he goes. You probably wouldn’t advise him to simply try harder not to eat them. Rather, you might advise him not to always keep cookies with him, to keep only a limited amount, or perhaps to hide the cookies out of sight in a kitchen drawer. Yet when it comes to digital technology, the default design is to provide instant access to all functionality imaginable, without providing easily accessible ways for users to set limits for themselves – creating a situation analogous to carrying around an infinite supply of cookies with no ability to impose restrictions.

Some commentators on digital distraction suggest that people unplug, reclaim deep attention skills, and revolt against malevolent ‘attention merchants’ who profit from making digital devices as hard to let go of as possible. I think such Black Mirror-esque rhetoric should be left aside. Instead, we should see unwanted distraction from laptops and smartphones simply as a design problem. Hundreds of anti-distraction tools now exist, and they use a surprising range of strategies to help users align their actual and intended behaviour. Yet research in human-computer interaction and psychology has focused mostly on documenting distraction from information and communication technologies (ICTs), not on how to design against it.

We need a focused research effort here: Do the new anti-distraction tools actually change behaviour? If so, how? Which strategies are most powerful? Which are promising but haven’t been explored? Such a research effort could establish best practices for non-distracting ICT design. On that basis, we can then discuss how to ensure that mobile companies and app developers make it easy to configure our virtual environments in ways that help us manage our attention effectively.