By Dan Nixon
The battle for our attention is the defining problem of our time. It determines election results. It underpins the digital economy. It can erode our social competencies. And, most fundamentally, it shapes who we become as human beings.
Given the sheer volume of information available to us, our attention has become increasingly scarce. Our various news feeds, messages and social media notifications are in a constant battle for our attention – they are the competing forces in the so-called ‘attention economy’. Meanwhile, the news is filled with stories of shorter attention spans and fragmented minds: we are distracted nearly 50% of the time; overstimulation can lead to ‘technostress’; and there are increasing reports of digital addictions.
But looking beyond the various mental health angles, why is the ‘crisis of attention’ such an urgent issue for us to deal with – culturally, socially, politically and economically?
‘You are being programmed’: how is technology shaping us?
It’s hard to overstate the cultural shift that’s come about via the expansion of connected technologies into so many spheres of our lives. Needless to say, there are countless examples of digital technologies improving things for us. But as well as the benefits, the ‘digital turn’ in contemporary culture has brought with it some serious problems, too – and many of these can be seen through the lens of our human capacity to pay attention.
To understand what’s going on, you could start by noting down the various ways in which technologies intermediate our actions – we use email to send messages, we use spreadsheets to do the accounts. The assumption being: we put the technologies we create to work for us; we are in charge. But that assumption becomes less clear-cut in a world in which more and more of our day-to-day decisions are determined by complex algorithms that, as users, we don’t understand. These algorithms, written by tech giants like Google, Facebook and Amazon, are powered by huge amounts of data relating to our preferences, habits and vulnerabilities.
The pertinent question, then, is not so much how technology is shaping what we do but how is it shaping who we are? It was in this context that Chamath Palihapitiya, former Facebook vice president for user growth, recently warned: ‘You don’t realise it, but you are being programmed.’ There’s a fair bit to unpack in that statement, so let me try to explain how I see it.
First, there’s the point, widely acknowledged in Silicon Valley circles, that apps (such as Facebook) and devices (such as iPhones) are designed to be as addictive as possible. Drawing on findings from neuroscience and cognitive psychology, these technologies are designed to grab our attention and, in many cases, to hold it for as long as possible. Apps that use the ‘bottomless scrolling newsfeed’ are a good example: you want to keep scrolling just in case something good, or mildly entertaining, turns up. Tristan Harris, an ex-Google employee and founder of the Centre for Humane Technology, explains that, psychologically, this makes use of exactly the same mechanism (‘intermittent variable rewards’) used to get people hooked on slot machines. Listing various other features that help smartphones to hijack the mind, Harris talks of several billion people having ‘a slot machine in their pocket’.
So we’re being programmed to depend on our consumer technologies.
But in addition, there’s the issue that in general our goals are not aligned with those of the tech giants making the products. James Williams, also formerly at Google, notes the temptation to think of our smartphones as functioning as something like ‘GPS for life’: just as they help us get from A to B in a literal sense, they also help us perform a range of everyday tasks: ordering a new ink cartridge, messaging a friend. But when it comes to how we spend our time and attention online, Williams argues that the ‘GPS for life’ analogy offers a misleading picture of reality. The goals of tech giants like Google and Facebook, for instance, will include an array of ‘engagement metrics’ – time spent on site, the number of click-throughs, the number of Likes, and so on.
By contrast, spending 3½ hours on social media is hardly a goal we would set for ourselves when we wake up in the morning. Even while writing this, I pick up my phone to check something like the weather, only to notice a new WhatsApp message, or a post linking to a YouTube video. With the sheer volume of distractions channelled through one device, it can take a lot of discipline simply to stay with whatever I’m trying to focus on.
So our goals diverge from tech companies’ goals. But it’s those companies who write the algorithms that steer our behaviour. As this plays out, day by day, habits form. Certain patterns of behaviour become embedded in our neural pathways. The question is: how might this shape who we become as human beings?
One concern is the threat to our autonomy. Take the new world of digital advertising (on which, more below). Adverts, of course, aren’t new. But if our attention is increasingly ‘captured’ by algorithms that use big data to exploit our psychological vulnerabilities in a very targeted way, at what point do our decisions about what to click on or what to buy reflect the code as much as they do our ‘own’ preferences? Are we losing the ability to ‘want what we want to want’?
Another concern is that we slip into a more passive mode of being. As already noted, social media apps are addictive, designed to hold our attention: we scroll down a feed… click on a link… and before we know it, we have shifted into ‘autopilot’ mode (helped along, perhaps, by the ‘auto-play’ function used by the likes of YouTube and Netflix). The issue is: are we really ‘there’ when we spend hours online consuming media we hadn’t set out to give our attention to? What else follows from slipping increasingly often into autopilot mode? Does it put at risk our intellectual diversity? Does it make us less spontaneous?
It’s not that these concerns are entirely unique to recent digital inventions. Nor is it that every habit we develop that’s somehow influenced by these algorithms marks a change for the worse – far less that, habit formation aside, the algorithms don’t have many positive uses (right now, for example, I’m enjoying new music I wouldn’t otherwise have found, thanks to Spotify’s algorithms). Moreover, business strategies evolve: it’s the sort of criticism identified here that led Mark Zuckerberg to announce an overhaul of Facebook’s News Feed algorithm so that it will put greater weight on ‘meaningful social interactions’ (updates from friends and family) over ‘relevant content’ (media from third parties that Facebook predicts a user will like). While the social merits of this have been contested, it is at least a step in the right direction: Zuckerberg publicly acknowledges the problem. The CEO of Netflix, by contrast, has talked about ‘sleep’ being their main competitor – seemingly without irony.
But to return to Chamath Palihapitiya: it feels instructive that he doesn’t let his own kids use social media – and that this is apparently pretty normal in Silicon Valley circles. It’s well known, for instance, that Steve Jobs forbade tablets and smartphones at the dinner table. And according to a recent Ezra Klein interview with Jaron Lanier, the higher up someone sits in management at one of the tech giants, the more restrictive they tend to be about their own kids’ use of smartphones and social media.
Clearly, the ‘digital turn’ described above has also affected the nature of our social interactions. We’re told that through information and communication technologies we are ‘more connected than ever’. On the one hand, technology clearly can – and does – serve this role. I have WhatsApp to thank, for example, for making my relationship with my partner work a few years back, at a time when we were living on opposite sides of the world.
On the other hand, we can all relate to the frustration of speaking to someone who is semi-distracted by their phone, or to social events that seem to revolve more around capturing the moment on an iPhone than actually being there, in the present, to enjoy it.
When it comes to how we pay attention in social situations, I would argue that there’s a lot on the line – both in terms of the quality of our interactions and the development of social capacities such as empathy, friendship and intimacy.
Sherry Turkle, an expert in social connectivity at MIT, has written extensively on this topic. Her latest work, Reclaiming Conversation, focusses on how much is at stake when we replace, or interrupt, conversation with electronic communication. She cites a recent Pew Research Center study showing that around 90% of adults in the US took out a phone during their most recent social interaction; roughly 80% reported that doing so diminished the conversation.
She traces out the adverse effects of digital technologies in various conversational contexts – teachers and students, colleagues and clients, friends and family. She blames smartphone usage, for instance, for the steep decline in measures of empathy among university students over the past 20 years.
It’s parent-child relations, though, that she is most concerned by. She writes, for instance, that:
‘Parents give their children phones. Children can’t get their parents’ attention away from their phones, so children take refuge in their own devices. Then, parents use their children’s absorption with phones as permission to have their own phones out as much as they wish.’
In terms of child development, she warns, parents are playing with fire when they do things like text during breakfast and dinner as standard practice.
The issue of how (well) we pay attention in our interactions is clearly central here. As Jonathan Franzen observes, it’s through the conversational attention of parents that children acquire ‘a sense of enduring connectedness and a habit of talking about their feelings, rather than simply acting on them’. In the UK, meanwhile, thousands of children are receiving counselling for loneliness, with parents too often ‘distracted by smart phones and work pressures’ to pick up on clues that their children were suffering, according to Childline. Chats over the kitchen table, the charity warns, are becoming obsolete.
Interestingly, Turkle emphasises being comfortable with solitude as a key input into conversation. If we’re unable to be separated from our smartphones, she writes, we consume other people ‘in bits and pieces’. Conversation also carries the risk of moments of awkwardness or, worse still, boredom – conditions which smartphones have taught us to fear the most. Yet, as writers such as Matthew Crawford have noted, these are precisely the conditions in which patience and imagination are developed.
Echo chambers, ‘bots’ and data politics
Digital technologies, coupled with the dynamics of the attention economy, are also affecting the political landscape. Perhaps most obviously, pre-existing political views can be reinforced through social media ‘echo chambers’, with evidence that users are more likely to engage with people (and media sources) who share their political beliefs. This is concerning because the idea that we are increasingly seeing only things that we agree with challenges some of the fundamental principles that democracy thrives on. Alex Krasodomski-Jones, of the think tank Demos, argues that this challenges our capacity to compromise, to read (and accept) opposing views and at least to aim towards a shared concept of what is true.
More generally, recent campaigns have shown how the battle for our attention on social media has become central in shaping political outcomes. At the World Government Summit in Dubai earlier this month, Francis Fukuyama, famous for proclaiming ‘the end of history’ in 1992, spoke about the ‘weaponization of social media’, noting that while propaganda is nothing new, the speed of dissemination in a social media era is completely unprecedented. The Trump election campaign is a prime example of this, the outcome being, of course, the rise of a US president who famously has an attention span of less than 30 minutes.
The Trump campaign hired Cambridge Analytica, a UK-based firm which gathers huge swathes of online data from Facebook and Twitter – the ‘digital footprints’ we leave on social media each time we post, Like, and Retweet. They also purchased data from third-parties on subjects’ attitudes and consumer preferences. In total, they reportedly amassed around four or five thousand data points on every adult in the US.
Armed with this data, the firm built algorithmic models that could predict, alarmingly well, each subject’s personality profile (specifically, scores across the five OCEAN traits widely used by academic researchers in the field). And if you know the personality of the people you’re targeting then, as Alexander Nix, Cambridge Analytica’s CEO, explained in a 2016 speech, you’re in a position ‘[to] nuance your messaging to resonate more effectively with those key groups.’
Finally, to help these messages land, the Trump campaign employed armies of ‘bots’ on social media platforms to spread the messages; Trump’s bots supposedly outnumbered Clinton’s by five to one. Each day, according to a piece by Sean Illing, the Trump campaign would try 40,000 to 50,000 variants of these ads: the ones that got liked, shared, and retweeted the most were reproduced and redistributed based on where they were popular and who they appealed to.
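The try-many, keep-the-winners, redistribute loop described here is, at bottom, a simple evolutionary selection process: score every variant by engagement, keep the best performers, and clone them with small tweaks. The following Python sketch is purely illustrative – the `appeal` field and the simulated engagement scores are invented stand-ins for what, in a real campaign, would be measured likes, shares and retweets:

```python
import random

def engagement(variant, rng):
    """Simulated engagement score for one ad variant.
    In a real campaign this would be measured, not simulated."""
    return variant["appeal"] + rng.gauss(0, 0.1)

def run_round(variants, rng, keep=0.2):
    # Score every variant, keep the top fraction, and clone the
    # survivors (with small tweaks) to refill the pool.
    scored = sorted(variants, key=lambda v: engagement(v, rng), reverse=True)
    survivors = scored[: max(1, int(len(scored) * keep))]
    pool = list(survivors)
    while len(pool) < len(variants):
        parent = rng.choice(survivors)
        pool.append({"appeal": parent["appeal"] + rng.gauss(0, 0.05)})
    return pool

rng = random.Random(0)
variants = [{"appeal": rng.random()} for _ in range(100)]
for _ in range(10):
    variants = run_round(variants, rng)

# After repeated rounds, average appeal drifts upward: the pool has
# been optimised for whatever the engagement metric happens to reward.
print(sum(v["appeal"] for v in variants) / len(variants))
```

The point of the sketch is that nothing in the loop cares whether a message is true or socially healthy – only whether it scores well on the engagement metric.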
At the heart of this approach, then, was the use of data analytics to capture the attention of individual social media users; algorithms to re-optimise those messages; and armies of bots to propagate them via social media.
The data economy or the attention economy?
Finally, if we turn to the world economy, then a key trend to note is how it has become increasingly digital in nature.
One narrative for this centres on the data economy. ‘Data is the new oil‘, some say, and if anything I think this probably understates the matter. Data lies at the heart of the business models of Google, Amazon, Apple, Facebook and Microsoft – the five most valuable listed firms in the world. A recent report by leading software maker Oracle and MIT Technology Review argues that a major reason for the success of these companies is that they have embraced the mindset of ‘data as an asset’: they are able to see the value (and future revenue streams) in hoarding, commodifying, and monetising as much data as they can. To extract value from data, coders can write algorithms to predict demand: when a customer is ready to buy a particular product, say, or when a jet-engine is in need of servicing.
The impetus behind the data economy is likely to grow further. As more and more things – from watches to cars – connect to the internet, the volume of data to be harnessed for commercial use is increasing at an unprecedented rate. Industrial giants like GE and Siemens now sell themselves as ‘data’ firms, while some expect that we will soon be paying for things via Amazon and Facebook, rather than via banks.
But an alternative lens on the digital economy, at least the consumer-facing side of it, views it as centred around something still more human – namely the attention economy: a system that revolves around paying, receiving, and seeking what is ‘most intrinsically limited and not replaceable by anything else’, to quote Michael Goldhaber. A quick example of this: I recently signed up for a ‘free’ service, one that helps you manage email subscriptions and junk mail, only to be told, near the end of the registration process, that I could use the service provided, in exchange, I posted about it on Twitter or Facebook.
As we all know, we use Facebook, Google and Twitter for free and yet they make huge profits – because of their advertising revenue. The logic here is that the longer we spend on the site or app, the more likely we are to click through a banner and perhaps end up buying something. And the better targeted the ad – for instance if it uses data about our interests, purchase history etc – the more likely we are to click through, boosting how much advertisers will pay to tech giants for a few inches of prime (digital) real estate.
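The arithmetic behind this logic is simple expected value: an impression is worth the probability of a click multiplied by the price paid per click. A toy illustration in Python – the probabilities and prices below are made up purely to show the mechanism:

```python
def expected_revenue(click_prob, price_per_click):
    """Expected revenue from serving a single ad impression."""
    return click_prob * price_per_click

# Hypothetical numbers: targeting that draws on data about our
# interests and purchase history raises the click-through probability,
# which raises what each impression is worth to the platform - and
# hence what advertisers will pay for it.
untargeted = expected_revenue(click_prob=0.001, price_per_click=0.50)
targeted = expected_revenue(click_prob=0.010, price_per_click=0.50)
print(f"untargeted: ${untargeted:.4f} per impression")
print(f"targeted:   ${targeted:.4f} per impression")
```

On these invented numbers, a ten-fold improvement in targeting makes each impression ten times more valuable – which is exactly why every extra minute of our attention, and every extra data point about us, is worth fighting for.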
The problem is that this pushes everything we interact with online towards a single incentive: to grab and hold our attention on platforms that deliver advertising revenues for their owners. As Fred Stutzman argues, ‘it’s incentivised around making it hard for you to tear your eyeballs away.’
As noted above, this isn’t a good social outcome because our goals diverge significantly from those of the big tech companies. Critics such as Jaron Lanier and John Battelle see the ad-based business model as the source of the problem, and in need of reform if we want to move the system towards one that serves humanity and society.
‘The thing that is most one’s own’
Central to all the issues discussed so far – relating to our mental health, our culture, our society, our politics and our economy – is the changing way that we (are able to) pay attention to ourselves, to each other and the world around us. Time magazine’s technology writer, Lev Grossman, sums up the smartphone side of the problem. Grossman gave the iPhone some of its earliest rave reviews, but he now worries that as a society we haven’t understood or accepted how completely smartphones have distorted ‘our relationships with ourselves and with the reality around us.’ Or, to quote Arianna Huffington, also speaking at the World Government Summit this month, our current predicament sees us ‘drowning in data but starving in wisdom’.
What’s clear, then, is that the ramping up of efforts by others (with commercial and political interests) to capture our attention points to the need for robust ways to protect our attentional abilities – and respect other people’s.
Before turning to how we might achieve that, at this point we might pause to reflect on how precious and irreplaceable this thing we call ‘attention’ is. The American philosopher and psychologist William James wrote that ‘what we attend to is reality’. While in a sense utterly obvious, this point is also profound: what we pay attention to, and the manner in which we do so, serves to determine, for each of us, our own moment-to-moment reality.
What’s more, our capacity to pay attention, in turn, determines what we pay attention to – and how much we are able to resist being buffeted around by various stimuli versus direct our attention towards our goals and pursuits.
One way to cultivate our attentional capabilities that has become increasingly popular is mindfulness – the practice of paying attention to whatever is going on for us in the present moment, on purpose, and non-judgementally. We may think of mindfulness in terms of focussing on our breath or bodily sensations. But as Jon Kabat-Zinn, widely regarded as the father of the modern mindfulness movement, noted in a recent interview, mindfulness practice:
‘…[is] not really about the breathing, or the object of attention, but it’s the attending itself. We are so seduced by thinking and emotion and we don’t realize that awareness is at least as powerful of a function. It can hold any emotion, no matter how destructive, any thought, no matter how gigantic.’
He goes on to say that this is where the potential for transformation, through mindfulness, lies: by adding a measure of ‘deep introspection and perception’ to the most ordinary of experiences.
Personally, I’ve found it insightful to revisit the story of what’s going on in the new attention economy – for instance, the cultural and economic angles discussed above – in light of reflections such as Kabat-Zinn’s on the special, and intimate, nature of ‘paying attention’.
Take the fact, for instance, that the attention economy is driven by advertising revenues. To understand this better, we can introduce a model sometimes used by advertisers to describe the journey a consumer makes ahead of purchasing a particular product. The steps of that journey are: Attention, Interest, Desire, Action (AIDA). At the turn of the century, a marketing strategy for a billboard or TV ad, for instance, might conceivably have focussed on fostering an interest in the product, a desire to buy it, or indeed on nudging you to act so as to follow through on that desire. Of course, it may, too, have focused on grabbing your attention. But what we can be sure of is that with the explosion of information available at our fingertips over the past decade, marketing strategies today will (generally speaking) need to put greater weight on grabbing, and holding, our attention.
The reason to stress this point is that, as intimate to our ‘selves’ as our interests, desires and actions may be, our attention is even more intimate: our attention goes right to the core of who we are, as suggested by the William James quote above (and the fact that attention comes at the start of the AIDA journey). A shift in advertising strategies towards grabbing our attention, then, is of first-order significance.
How should we respond to the crisis of attention?
There is no shortage of advice about how we can fight off digital distractions as individuals.
Alan Wallace, among others, writes about cultivating attentional capabilities through meditation. Cal Newport writes lucidly about quitting social media, rationing time for email and, most of all, building our day around what’s most important to us – rather than fitting that in once we’ve ticked off everything else. Sherry Turkle, meanwhile, writes about ‘sacred spaces for conversation’, advising us not to put a phone between ourselves and the person we are sharing a meal with and to keep phones away from the kitchen and the dining room.
The focus of our Paying Attention initiative at Perspectiva, however, is on understanding the best responses at a structural, system-wide level. Further work is needed to get a deeper and more integrated understanding of the dynamics of the attention economy – dynamics which are causing huge problems for society. Work is also needed to explore tangible solutions to these problems.
Some questions for research therefore include…
To better understand the integral nature of paying attention to our humanity:
Why is the distinction between stimulus-driven and goal-orientated attention so important? Why do so many spiritual perspectives – e.g. Buddhist, Christian and secular ones – place special emphasis on attention? What is the danger of understanding ourselves through the prism of a ‘quantified self’? How does the crisis of attention fit with Iain McGilchrist’s narrative of ‘left hemisphere overreach’?
To better understand the cultural-economic forces driving the attention economy:
How exactly does the attention economy fit in with the data economy? Why is it problematic to discuss developments purely in terms of the latter? Is it helpful to develop theories in which attention serves as a means of exchange, a form of labour or a form of capital?
To make the case for a political economy that respects and protects individuals’ attention:
What are the negative externalities stemming from the attention economy as it currently operates? Is there a good case to be made that preferences are being engineered or that algorithmic decision-making is making us more passive? How can we best articulate the case for a political economy in which our political and business leaders and academics reappraise the value of individuals’ attention capabilities? Recalling the restrictive use of digital technologies among Silicon Valley elites, what should governments do to help all segments of society to understand the issues and take action to protect their attention?
To rebuild a digital economy that respects and protects individuals’ attention:
What are viable alternatives to the ad-based business models of Facebook, Google, Twitter etc? Would a clamp-down on online anonymity improve the environment for our attention? Which of the solutions proposed by the Centre for Humane Technology, set up recently by Tristan Harris, should be prioritised by political and business leaders in order to realign technology with humanity’s best interests? Looking ahead, how would widespread advances in general-purpose robotics, artificial intelligence techniques, virtual reality interfaces and the ‘Internet of Things’ affect our attentional capacities and the governance required to safeguard them?
Clearly, then, there is a lot to reflect on if we are to build a world that respects, and protects, our capacities to pay attention. The open questions listed here will bear relevance across many spheres – business, technology, education, politics. But perhaps the first step is to recognise more fully the true value of our attention – the thing which, after all, ‘is most one’s own’. It is easy to think of attention as just another ‘cognitive function’, but it isn’t: as Iain McGilchrist has observed, the nature of how we choose to pay attention alters the nature of the world that we experience, and governs what it is that we will find.