Introduction
A soft alarm rouses me from my sleep. The traffic on the road outside is coming alive, rumbles slipping through my double glazing. I’m feeling groggy. Thankfully, the smart speaker is on hand to brighten my morning with my favourite summer playlist – ‘Alexa™, play summer soundtrack’. That’s better.
Walking outside, I grab a coffee next door – my Apple Watch™ lights up to pay, and in my pocket I feel the buzz of my banking app notification: £2.45 debited for a large americano. Next stop is work. Time to get moving. Beep: the Oyster™ card taps seamlessly on the Number 36 bus headed up towards my office in London Victoria.
With my iPhone set to ‘do not disturb’ for transit, as on all other journeys, I enjoy ten minutes of peace. No notifications. No disturbances. Only Spotify™ streaming my favourite music as traffic glides by the window of the bus. We pull into the bus stop and off I hop. I scan into work, and the day commences in earnest.
By 9am, the character of our story has used his smartphone only once. Yet his four or five interactions with smart devices have already generated several hundred data points. Time stamps, transaction details, geo-tagged locations, and actioned preferences all feature.
These data points will likely be ‘copied millions of times by some algorithm somewhere designed to send an advertisement,’ and then added to huge databases that enable marketeers to create ‘scenarios’ and ‘outcomes’.[2]
Staggeringly, 2.5 quintillion bytes of data[3] are generated every day (a quintillion has eighteen zeros). That number will continue to grow. Search engines log around 6.4 billion searches per day.
‘Should I really be worried about this?’ you may ask. No single data point is especially significant, but such a substantial aggregation is of great value. This paper explores the scope, scale, and significance of that data capture.
Surveillance capitalism
Surveillance capitalism: a new economic order built around aggregating human experience as free raw material, for hidden commercial practices of prediction, behavioural manipulation, and sales resulting in unprecedented concentrations of wealth, knowledge, and power in the hands of private companies.[4]
How surveillance capitalism became a dominant force
Surveillance capitalism describes a pursuit of money and power by a handful of privately controlled businesses,[5] through increasingly intimate scrutiny of the lives of billions of people. These businesses have achieved market dominance through aggressive growth, regulatory weakness, and sustained lobbying. They now exert great influence over ideology and imagination across the whole democratic world. They have become fabulously wealthy by promoting ostensibly ‘free’ products and convincing the billions who use them to become, in effect, the product that is consumed.
How do surveillance capitalists make money? Much of their income comes from ‘digital advertising’. Between 2001 and 2020, digital advertising grew from 3.1 per cent of total advertising spend to over 44 per cent.[6] Google, Facebook, and Amazon together receive almost two-thirds of this revenue.
Figure 1 – Global Advertising Spend by Channel [7]
In 2001 Google’s annual income was $70 million; by 2020, Google generated $147 billion from advertising.[8] In 2020, 98 per cent of Facebook’s $86 billion income was from advertising. Amazon’s advertising income ($14.1 billion in 2019) is doubling every two years.[9]
Moving from prediction to modifying our behaviour
To grow, surveillance capitalists need to extend their influence over what we do. Explicitly, demonstrating links between predicted and actual outcomes (e.g. advertisement leads to sales) increases the value of the services they sell to advertisers. Implicitly, improved prediction accuracy depends on having access to more data and better algorithms. To gain more data, they must increase our ‘engagement’ with their platforms.
Nir Eyal’s book Hooked describes a widely used, algorithm-powered customer engagement model called the ‘hook model’.[10]
It depends on a perpetual cycle of:
- Trigger – a system-offered invitation to respond
- Action – the user’s response, which results in a …
- Variable reward – offered by the system, which provokes further user …
- Investment – resulting in another ‘trigger’
Based on the foundational work of B. F. Skinner and others,[11] this model incorporates addictive elements into its core design, mirroring methods used in casinos to keep people playing.[12] Teenagers and young adults are particularly vulnerable to these techniques, but almost anyone will find them hard to ‘beat’. Increasing engagement via the hook model provides both a constant source of behavioural data and a committed audience for advertisements.
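To make the cycle concrete, here is a minimal sketch, in Python, of how such a loop might be simulated. The function name, the probabilities, and the reward values are all invented for illustration; they are not drawn from Eyal’s book or from any real platform’s code.

```python
import random

# A toy sketch of the four-stage hook cycle described above
# (trigger -> action -> variable reward -> investment).
# All probabilities and numbers are invented for illustration.

def hook_cycle(engagement: float, rounds: int = 10) -> float:
    """Step one hypothetical user through a few hook cycles."""
    for _ in range(rounds):
        # Trigger: the system invites a response (e.g. a notification).
        # Action: the more engaged the user, the more likely they respond.
        if random.random() < engagement:
            # Variable reward: an intermittent, unpredictable payoff
            # (likes, comments), echoing Skinner's variable reinforcement.
            reward = random.choice([0, 0, 1, 3])
            # Investment: each reward nudges the user to put more in
            # (another post, more data), setting up the next trigger.
            engagement = min(1.0, engagement + 0.02 * reward)
        else:
            # Without action, the habit decays slightly.
            engagement = max(0.0, engagement - 0.01)
    return engagement

print(hook_cycle(engagement=0.3))
```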
These factors make surveillance capitalism much more than ‘improved advertising’ for the twenty-first century. Traditional marketing is hampered by having to choose between targeting accuracy and scale, whereas digital platforms flatten the cost to profile, classify, and target users. Traditional surveys have a relatively fixed cost per question per person: asking one hundred people one question costs about the same as asking one person one hundred questions, but asking one hundred people one hundred questions costs significantly more. Surveillance capitalism overcomes this limitation. Digital platforms depend on scale to achieve accuracy, recording behaviours to fuel algorithmically inferred opinions, desires, and expectations. In this model the cost per person is roughly flat, and the platforms need only maximise the volume of data recorded. By harvesting minute details of actions and responses from billions of people, the platform owners can both classify individuals with precision and predict accurately how individuals within any identified group will respond to a given input. The enormous population of active users, combined with the relatively flat cost per person reached, allows even very targeted advertisements to reach large numbers, significantly increasing the ability to influence individuals towards a particular outcome.
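As a rough illustration of this cost asymmetry, the short sketch below compares the two scaling models. The figures (one pound per survey answer, five pence per platform user) are purely hypothetical and are chosen only to make the scaling visible; they are not taken from the paper’s sources.

```python
# Illustrative arithmetic only: contrasting the cost scaling of a traditional
# survey with the roughly flat per-user cost of a digital platform.

def survey_cost(people: int, questions: int, cost_per_answer: float = 1.00) -> float:
    # Every extra question asked of every extra person adds cost.
    return people * questions * cost_per_answer

def platform_cost(people: int, cost_per_user: float = 0.05) -> float:
    # Cost is (roughly) flat per user, however many 'questions' the
    # platform effectively answers by observing behaviour.
    return people * cost_per_user

print(survey_cost(100, 1))     # 100.0   - 100 people, 1 question
print(survey_cost(1, 100))     # 100.0   - 1 person, 100 questions
print(survey_cost(100, 100))   # 10000.0 - 100 people, 100 questions
print(platform_cost(100))      # 5.0     - flat, regardless of observations
```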
Origins – grasping the power of data
This began at the turn of the millennium, as the technology-obsessed dot-com boom came to an end. Google discovered that the linked connections between websites provided a good approximation for user interest. They leveraged this massive data set to build a market-disrupting search engine that seemed capable of anticipating what searchers were looking for. They rapidly gained complete market dominance for search but struggled to monetise that disruption.
Google considered selling advertising space on search pages, but the founders, Larry Page and Sergey Brin, believed accepting payment for preferential positioning of results would compromise the integrity of their search engine. Ultimately, the need for profits overcame their scruples, perhaps accelerated by Google’s data scientists’ demonstration of how effective the predictive techniques could be in pointing users towards contextually relevant (potentially paid for) content. Surveillance capitalism was born.
Having taken this step, Google realised that more data, and therefore more insight, would come only from expanding into other areas of human life. They pioneered a systematic approach (labelled the ‘Dispossession Cycle’ by Zuboff) to secure their social ‘right’ to gather data previously considered private.
Figure 3 – Dispossession Cycle (after Zuboff, 2019)
Surveillance capitalism spreads
In 2008, Facebook had 150 million users but no user-generated revenue. Sheryl Sandberg was hired from Google as Facebook’s Chief Operating Officer to fix the ‘revenue problem’. Soon afterwards, Facebook asserted ‘ownership’ of all content hosted on their network, regardless of origin, and began selling ‘outcomes’: expected user responses to given stimuli. This led to a now-infamous moment for Facebook, when Cambridge Analytica demonstrated that large volumes of Facebook-originated data could effectively sell ideas, not just products. Cambridge Analytica is credited with significant ideological influence on both the 2016 US presidential election and the Brexit referendum.
Figure 4 – Authors’ representation of key brands associated with some leading technology companies (2020)
Surveillance capitalists have been eliminating competitors (through lawsuits and acquisitions) and extending their reach through complementary products and services. Amazon used its retail dominance to launch life-integration products such as the ‘Echo’ smart speaker and the ‘Ring’ home automation product line, establishing mass data-collection hubs at the centre of our domestic lives. Other surveillance companies are doing the same, either by using their own products or by integrating third-party devices seamlessly into their own platforms. Apple alone seems to be turning its back on these practices, yet it too receives billions of dollars for prioritising Google search on iOS devices.[13]
Surveillance capitalists have achieved a huge asymmetry of knowledge and power over consumers, and arguably over democratic governments. They are spending heavily to entrench that position, investing in academic research across the medical, social, and political sciences as well as the natural sciences and engineering. You don’t have to be a cynic to be concerned that these unregulated monopolies are spending more on political lobbying than any other sector.[14]
Three perspectives to consider
1 The delusions of purely technological hope
Beguiled by utopian dreams
Technology innovation reflects the values and ideals of its makers. Guy Brandon, in his book Digitally Remastered,[15] writes: ‘a technology like a social media platform is implicitly the expression of the spiritual values of its creators and users.’ What are the ideas, values, and aspirations of the authors of Big Tech, from Jeff Bezos and Steve Jobs to Larry Page and Mark Zuckerberg?
Eric Schmidt, the former CEO and executive chairman of Google, has remarked: ‘Our goal is to change the world…[and] monetization is a technology to pay for it.’[16] His comments resonate with the quasi-theological visions of the biggest technology companies: Facebook wants to ‘bring the world closer together’, Amazon offers ‘Everything from A to Z’, and Google aims to ‘change the world’. Data is the currency of this new world, fuelling the informational economy with real-time insights, actions, and preferences.
The technology-evangelists’ goal is to provide products and services that are so compelling, easy to access, and intuitive to use that we can’t help but adopt them. They want to offer a form of frictionless living, enabled by their products and built around their digital architecture, that encourages us to use their services frequently while enabling them to harvest our data. That data provides insight into human living, which in turn provides the means to exert influence over our lives. Yet these same leaders conflate a form of technicism (including the inevitability of technological advance) with consumerism. Monetisation is no longer a technology to support techno-utopian goals; it has become the goal, driving adoption, revenue, and profit.
The deception of democratised (digital) relationships
We are told that social media ‘give[s] people the power to build community’,[17] but research repeatedly demonstrates that social media rapidly and permanently polarises users.[18] Instagram and Facebook encourage parents to create ‘managed’ accounts for children younger than thirteen,[19] but industry executives (including former Facebook executives) fiercely shield their own children from social media.[20] Research into the harms of social media, particularly to young adults, is reflected in the sites’ own FAQs addressing abuse and eating disorders.[21] Despite this, these firms continue to harvest trillions of behavioural observations from billions of users every day. How do the big technology companies manage to preserve the simulacrum of relationships while being so anti-relational?
The design of social media platforms intentionally redefines common relational paradigms. Facebook transformed ‘friend’ from a noun to a verb: you now ‘friend’ (or ‘unfriend’) someone to open access to curated personal information. To maximise my friending ability, Facebook collates my friends’ information, so I don’t have to digitally ‘go’ anywhere to participate in the relationship. Twitter and Instagram go further, abandoning the pretence of symmetry by defining relationships in terms of ‘followers’, encouraging asymmetry and voyeurism. Unlike symmetric, two-way conversations, social media relationships involve one party posting an artefact (e.g., an image) and a multitude of public recipients asynchronously reacting (e.g., a comment or ‘like’). This is the infrastructure that supports growing addiction.
2 The hook model and addiction
Guy Brandon writes: ‘The spiritual danger posed by social media [is] that it almost subconsciously takes precedence over everything else in our lives’.[22]
How is this subliminal addiction achieved with so little resistance? This is the glory and shame of the hook model: having convinced us to accept digital relationships, the platforms now mediate those relationships in ways scientifically designed to maximise our engagement. Instagram does this by leveraging our strong visual bias. Instagram launched in 2010 as a ‘fast, beautiful, and fun’ way to ‘capture and share the world’s moments’[23] and currently has 1.1 billion users.[24] The basic premise is that you post a picture or video that is pushed to your followers, who get an alert and are encouraged to open the app. Your followers can respond with comments or ‘likes’, which are tracked and prominently displayed. As anyone who has written a letter to a newspaper editor or commented on an online article knows, there is a strong temptation to return and see how people have responded. Instagram makes that feedback nearly instantaneous: it gives users almost infinite capacity to post, generates the temptation to see how people respond, gratifies it immediately, and invites the user to post again. All the elements of the hook model are here: trigger, action, variable reward, and further investment.
Brandon calls this ‘sensitising the mind to distraction’ and warns that ‘this distractibility compromises our humanity’.[25] This addiction is formed early: in the West, the average 2–4-year-old spends 48 minutes per day on a digital device.[26] Over 70 per cent of all children and nearly 90 per cent of adolescents in the US sleep with a digital device connected to social media, which demonstrably reduces sleep quality and correlates with rises in depression.[27] This particularly affects young girls: the incidence of depression among 13–18-year-olds increased 65 per cent between 2010 and 2017 after decades of decline, directly corresponding with the availability of social media sites like Facebook and Instagram for this age group.[28] Adults are not exempt, though their addiction tends to play out in terms of polarisation[29] and marked declines in the ability to empathise.[30]
Beyond these immediate impacts, addiction to distraction opens the door to exploitation.
3 The costs of exploitation
At this point, we ask, ‘Why should I care? I get excellent, helpful services for free, and I never click the adverts. This seems like a good trade.’
In the movie The Matrix (The Wachowskis, 1999) humans are crops from which energy is harvested, and the ‘crop’ is maintained in a dream-like simulation that keeps them productive and more-or-less content. Most people find this a poor trade, however content the dream. Why? What has been taken from us?
Surveillance capitalists are more interested in keeping us ‘content’ and connected so that they can harvest information. Specifically, they observe and record millions of our behaviours and responses to (strategically) varied inputs to create something like an avatar (a virtual representation) of each of us that mimics our responses to given inputs. Creating accurate avatars requires maximising engagement with the company’s actual platforms or with its advertising networks, which uniquely identify and track us around the web even if we don’t have accounts with those companies.[31]
The more data they collect, the more refined our avatar, the more accurately they can test and select inputs to manage our responses. As an example, and as a piece of self-advertisement, Facebook published studies demonstrating their ability to selectively manipulate voter turnout[32] and user emotions[33] through messages and promoted content.
‘Advertising works by creating patterns of associations…through “low attention processing”.’ As discussed above, social media is designed for distraction, the ‘undirected mental state where images, music, and emotional responses pass into long-term memory without conscious learning.’[34] When these inputs have been refined against my avatar, what chance do I stand against the well-financed effort to nudge my behaviours, emotions, and beliefs in one direction or another? If we insist on defining what is being stolen, we might not be too far off the mark if we point to self-determination, intellectual freedom, choice, and, eventually, responsibility.
Elements of a Christian response
We have mentioned several well-informed critiques of surveillance capitalism,[35] which include some policy or regulatory recommendations to address the systemic abuses of technological monopolies. The concerned reader can find a voice among organisations raising these issues at all levels of government, such as the Electronic Frontier Foundation[36] and the Center for Humane Technology.[37]
Given that any societal reform will be slow and incomplete, and there is no comprehensive mechanism to opt out, how should we live within the system and architecture of surveillance capitalism?
Cultivating life beyond exploitation
During the pandemic, products such as Zoom sustained our friendships and our worship; however, such technologies are substitutes, not replacements. We are embodied creatures and we rightly long to interact face-to-face. This God-given longing is a compass to help us navigate the deceptive and exploitative redefinition of relationships described earlier.
Jesus himself is the preeminent example of choosing present, physical relationships in defiance of immediate ‘reach’ and ‘opportunity’.[38] Surely God could have broadcast his good news directly to the whole globe, but he chose to come as a baby, apprentice as a carpenter, and then spend three years focused on evangelising and discipling a few dozen men and women. We find similar themes in Paul’s correspondence. He is distressed when he hears that people he knows are suffering.[39] He writes to tell his readers that he longs to be with them,[40] and that he finds his joy in them.[41] John also desires to share joy ‘face to face’ rather than with ‘paper and ink’.[42]
The instruction in 1 Peter for Christians to consider their identity ‘as foreigners and exiles’[43] offers encouragement and motivation to refocus time and energy away from social media and towards loving our physical neighbours. The Israelite exiles in Babylon are told to ‘seek the peace and prosperity of the city to which I have carried you into exile. Pray to the Lord for it, because if it prospers, you too will prosper.’[44] They are called to local love, in direct contrast to their natural longing to be elsewhere.
Living by rhythms to counter addiction
In the story of Creation, God rested on the seventh day.[45] In the Ten Commandments, God’s people are commanded to keep the Sabbath holy by resting.[46] A first step away from digital slavery is to put our devices aside for one day each week. Perhaps we could take Andy Crouch’s advice:[47] set devices aside for one hour each day, one day each week, and one week each year.
Christians are commanded to be ‘very careful… how you live’.[48] One of the best defences against busyness and digital addiction may be an adaptation of the ancient practice of a ‘rule of life’.
The Rule of Saint Benedict[49] provides a guide for communal monastic living. The rule establishes regular rhythms of prayer, sleep, spiritual reading, and work. There has been a surge of interest in considering how aspects of Benedict’s ‘rule’ could be applied to individual as well as communal living.[50]
We see great benefit in establishing some fundamental low-tech relational rhythms and practices that reflect our most precious values and priorities. These can be combined into a personal ‘rule of life’. This offers a robust defence against the digital sprawl that threatens to inundate us and can help contain our working day. Without such practices, it is hard to ‘have life, and have it to the full’[51] in a world dominated by surveillance capitalism.
Challenging these forces alone is hard. Working together to develop practices of digital fasting, sabbath rest, and a personal rule of life allows us to establish countercultural patterns that can break the bondage of the consumerist technicism that holds us so tightly.
Restoring truth in a post-truth world
The Canadian philosopher Charles Taylor writes that it is the ‘prestige and aura that surround technology’[52] – with its glint of newness and promise of convenience – that casts the spell of enchantment on our society. We are beguiled by a simple aphorism: ‘the newest is the truest’.
Perhaps, rather than the rigid, mechanical world prophesied by George Orwell in his book Nineteen Eighty-Four,[53] our soft compliance and enchanted reliance on technology are more reminiscent of the world sketched by Aldous Huxley in the novel Brave New World.[54] Neil Postman writes: ‘what Orwell feared were those who would ban books. What Huxley feared was that there would be no need to ban a book, for there would be no one who wanted to read one… Orwell feared the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance.’[55]
Postman sided heavily with Huxley’s dystopian view of the future in Brave New World. As Huxley saw it, ‘people will come to love their oppression, to adore the technologies that undo their capacities to think’.[56] Postman echoes this tone, continuing: ‘in the age of advanced technology, spiritual devastation is more likely to come from an enemy with a smiling face than from one whose countenance exudes suspicion and hate.’ The warning is clear: a veritable wolf in sheep’s clothing.
Restoring truth is no easy quest. In their fight for our attention, the owners of social media platforms have willingly accepted societal polarisation as an unimportant side effect of their activities.[57] When we are fed ever more strident opinions that align with our own, common ground and alternative perspectives seem to evaporate. To rediscover truth, we must escape from the grip of these algorithmic tools. We have stepped too far into the looking-glass world of alternative truth, and we have to relearn the art of listening: to remember how to consider alternative perspectives, and to actively remind ourselves that the ‘facts’ we are holding on to have likely been deliberately handed to us by an algorithm designed to help entrench our position. Faith communities could go further by developing the notion of ‘generous conversations’ among people who have widely divergent views on contentious issues. These could be constructed to allow room for disagreement, to extend grace to errors, and to foster productive dialogue.
Conclusion
How then should we live? The Psalmists insist that the focus of our gaze is significant. Idols and false gods will always compete for our attention. Too easily, we allow our eyes to be drawn away from the living God, drawing from our own broken cisterns that cannot hold water.[58] ‘I lift up my eyes to the mountains – where does my help come from?’[59] begins Psalm 121. Mountaintop idols seem appealing, yet they distort our relationships by volatilising the real and obliterating lived experience.[60] These same idols can deceive our sensibilities by distancing us from God and foreclosing relational intimacy. They dismay and dissatisfy us in profound ways, leading us towards mirages and leaving us insatiably thirsty. But the Psalmist continues, ‘my help comes from the Lord, the Maker of heaven and earth’; he wisely shifts his frame from the mountain to its Maker, the foundation of what is real.
What is the potential idolatry of digital technology? Liking a photograph is fairly harmless. Endless scrolling perhaps less so. But surely it’s not idolatrous? As noted earlier, Guy Brandon highlights the spiritual danger associated with the way that social media tends to become dominant in many people’s lives.[61] We have become utterly distractible; as T. S. Eliot wrote, we are ‘distracted by distraction from distraction’.[62] Bottomless content and endless notifications ‘undermine our ability to focus and, implicitly, reduce our capacity to relate to each other – in the most basic terms, to love’.[63] Indeed, ‘sensitising the mind to distraction … compromises our humanity’.[64] It is this prospect for humanity that concerns us.
In this hyper-individualistic age, it is in embodied community that we can best ‘spur one another on towards love and good deeds’,[65] learning how to hear and respond to Jesus’ call over the digital cacophony. In community we can better ‘learn the unforced rhythms of grace’,[66] putting the technology that is intended to serve us in its proper place. It is in that community that we can flourish: to know and to be known.