Imagine you could walk into a room full of strangers and know immediately what makes each person tick. What they believe, what they like, what they fear, what they long for, what they’re comfortable saying out loud and what they’re really feeling underneath. Imagine the power that would give you. How easy it would be to manipulate them. To win their trust. To persuade them to do what you want.
It’s an advertisers’ dream. But over the past two decades, we’ve helped to make it a reality.
We’ve handed over swathes of personal data to social media platforms. Casually revealed our interests, passions, hobbies, political affiliations, sense of humor. Had private, sensitive conversations in online spaces, sharing our most intimate secrets through channels that can be aggregated, analyzed, and mined for insights and clues. We’ve begun to carry tiny computers in our pockets that monitor, literally, our every move, while using them to record our voices, images, and fingerprints.
People who want to influence us can buy information that tells them what we’re thinking, at whatever time of day or night… and then, thanks to targeted advertising, they can seize on our moods, whims, and fears, reaching us wherever we are.
But it’s still not enough. Marketers and governments want to know more. They want to know what we’re doing in our homes when we’re not scrolling through our phones. They want to know how we interact in the “real” world. What our medical records show. They want to know what’s happening inside our bodies, under our skin, in our brains.
They want to know all this so they know how to persuade us to do what’s best for them. That might mean buying a product. It might mean casting a vote. In extreme circumstances, it might mean instructing or intimidating us into falling in line.
In any case, this is data that companies are willing to collect, package and sell. Welcome to surveillance capitalism.
What is Surveillance Capitalism?
So what is surveillance capitalism? Let’s start with a short definition: surveillance capitalism is the practice of turning personal data and residual data (more on that in a moment) into a commodity that can be bought and sold.
The longer explanation is that surveillance capitalism treats private, personal human experiences as exploitable data points, or “data point capital”, that can be converted into behavioral data. This behavioral data is then purchased by third-party marketers and companies – not as the raw data itself, but as predictive products. Buyers typically use these to get a picture of what different groups of people are about to do, so they know how and where to direct their advertising resources. They may also use the data to train machine learning algorithms they have in development, for example, facial recognition or voice activation software.
Where Does the Term “Surveillance Capitalism” Come From?
The term was coined by Harvard Professor Shoshana Zuboff, whose 2019 book The Age of Surveillance Capitalism traces the growing trend of data capitalism over the past few decades – and how data control reshaped our lives.
Who Is Driving Data Capitalism?
Google invented the methods of surveillance capitalism, extracting and repackaging user data and manufacturing new markets for it. It can do that because of the sheer volume of data it processes: over 1.2 trillion searches every year.
Google knew it would never get users to explicitly agree to their private data being used in this way. Instead, it disguised efforts to capture more user data as enhancing user experience, making the search engine more convenient to use. People didn’t object because they didn’t know.
Today, Google is one of the big five data capitalists, along with Facebook, Amazon, Microsoft, and Apple. Between them, these behemoths accumulate vast swathes of data on our online activity, behavior, preferences, emotional responses, etc., and turn this into products and services for third-party buyers. It’s an incredibly lucrative trade, consolidating these corporations as the largest and most powerful companies in the world.
Surveillance capitalism and data control now underpin developments in every industry and sector, tied in with the Internet of Things (IoT) and smart devices that pepper our lives. It’s practically impossible to engage fully in the modern world – the way we socialize, interact, work, learn, access knowledge and news – without unwittingly handing over personal data that others commodify and capitalize on, in ways over which we have no control.
As Prof. Zuboff explained in a documentary interview:
“[These methods] have been engineered to be indecipherable, to be undetectable, to create ignorance in a vast group of all of us that they call users. Our ignorance is their bliss.”
How Does Surveillance Capitalism Work?
In short, surveillance capitalism works by transforming human behavior into data flows and predictive signals, then selling these to marketers so that they can try to leverage the insight to sell products and ideas to the most receptive groups.
Data is collected from the websites, apps, and devices we use each day. It’s shared or traded between vendors and may be combined with transaction histories, demographic information, medical data, and so on. In 2019, for instance, the Wall Street Journal found that apps like Instant Heart Rate: HR Monitor and Flo Period & Ovulation Tracker were sharing this personal medical information with Facebook. All of this feeds powerful machine learning algorithms that unveil connections and patterns.
The aim is to predict what people are likely to do next, when they are likely to do it and/or who advertisers should focus on.
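To make that concrete, here is a deliberately simplified sketch of the “behavior in, prediction out” idea. The event counts, the “bought a baby product” label, and the numbers are all invented for illustration; real ad platforms operate at vastly greater scale and with far richer signals, but the basic shape is the same: aggregate behavior into features, fit a model, and sell the resulting score.

```python
# A toy sketch of "behavior -> features -> predictive product".
# All data below is invented; this is not any real platform's pipeline.
from sklearn.linear_model import LogisticRegression

# Each row aggregates one hypothetical user's recent behavior:
# [ad_clicks, minutes_on_parenting_sites, stroller_searches]
behavior = [
    [0,  1, 0],
    [2, 30, 4],
    [1,  5, 0],
    [3, 45, 6],
    [0,  0, 0],
    [2, 25, 3],
]
bought_baby_product = [0, 1, 0, 1, 0, 1]  # what each user later did

model = LogisticRegression().fit(behavior, bought_baby_product)

# The "predictive product" an advertiser buys is essentially this score,
# attached to an audience segment rather than a named individual.
new_user = [[1, 20, 2]]
likelihood = model.predict_proba(new_user)[0][1]
print(f"Estimated likelihood of buying a baby product: {likelihood:.0%}")
```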
Targeted ads and personalized recommendations work like this. Sometimes, the link is obvious: if you talk online about how much you love The Sopranos, you might get ads for The Irishman. Other times, the link can seem obscure.
Take the notorious case of a teenage girl who switched to a more gently fragranced shampoo brand at Target, only for her family to be bombarded with coupons for baby products. Pregnant women often have a heightened sense of smell, so the algorithm made the logical leap… and was correct, though her father didn’t find out she was pregnant until after he had furiously confronted the store.
What’s sold to advertisers is usually eyeballs: they decide which people see their ads online, and when. One exception is Pokémon Go, in which players downloaded the app and were physically directed onto the premises of businesses that had paid for them to be there. That steering of real-world behavior is what made the Pokémon Go model different from what advertisers usually buy.
As our homes and lives become increasingly connected, sensors in our living spaces also turn our behavior into data point capital. Some of these we’re aware of, like Alexa. Some come as a shock. It wasn’t until Google announced that its Nest Secure home security system could now support voice activation that anyone realized the devices already had microphones installed inside, inexplicably omitted from the product’s published specifications. It’s near-impossible to know exactly what data our devices are collecting about us.
Can It Be a Good Thing?
There are, of course, benefits associated with surveillance capitalism, or people wouldn’t hand over their data.
Many of the platforms involved are free to use, including Facebook and Instagram (which Facebook owns), search engines like Google and Microsoft Bing, email providers like Gmail (part of the Google suite of products), Apple Maps and Google Maps. Most people today are aware that if they aren’t paying for the product, they are the product, but it’s a trade-off that many people are willing to accept. At least, in theory. More on that in a moment.
And then there is the fact that – along with all their other, more profitable, purposes – the big surveillance capitalists do also use the data they harvest to improve the way their products work. When Amazon suggests a novel you turn out to love based on your past purchases, when the Netflix algorithm introduces you to your new favorite show, when Facebook’s targeted advertising means you discover local events, bands, artists, or opportunities that genuinely excite you, you may be left thinking, “well, I guess the system works!”
Personalization can be very attractive. Having all your smart devices talk to each other is really convenient. Alexas, Siris and other voice-activated products are useful to have around. It’s handy to be able to check your home security from afar or turn on the heating while you’re on your way home from work. People do really love this stuff, or they wouldn’t use it.
The trouble is that, even if you willingly hand over precious personal information to a company to enable these products and services, you don’t get to control how else these companies will use that data or the predictive behavioral insights they glean from it. Many people would hate to think that an oppressive regime was able to benefit from the photos they post innocently on private social media accounts. Many people are deeply troubled that private companies and hostile governments can exploit data point capitalism to spread misinformation and influence the outcomes of major elections. But that’s a risk you take when you upload your data.
Once you have devices set up that listen to you, or you make it possible for these companies to track your every move online, you don’t get to pick and choose what they know about you. Your Alexa is listening to every private conversation you have, whether you like it or not. Anyone can snoop around your home using Google Maps and Street View. Anyone can create facial recognition software that means you can never be anonymous again, even when that puts you in danger.
Just look at the way governments around the world have leveraged facial recognition to intensify surveillance over the past few years. Sure, this can improve safety and security, making it easier to identify dangerous criminals. Right now, amid the global COVID-19 crisis, when it’s incredibly important for people to observe official curfews or self-isolation guidance designed to curb the spread of the pandemic, facial recognition can help save lives. In Russia, for example, widespread facial recognition has been used to identify people who breach the 14-day quarantine rule after returning from high-risk countries.
The problem is, of course, that facial recognition doesn’t selectively target these people. Everyone is caught up in it. If your government stops acting in your best interest, it still retains this power to identify and arrest you. Just look at Hong Kong, where pro-democracy protesters were painfully aware that the ubiquitous facial-recognition surveillance cameras would allow the Chinese government to track them down and lock them up with ease. Or Xinjiang in western China, which Zuboff describes as an open-air prison for the Uyghur ethnic minority. Here, half a million people’s faces are scanned each day, and alarms are automatically sent to police every time a small group of Uyghurs travels to another area.
As the historian Yuval Noah Harari warns, several governments (some with a shaky history of respecting human rights and individual privacy) are now using the coronavirus pandemic to push through extensive “emergency” surveillance packages that will have lasting effects on our day-to-day lives. These go further than simply identifying who you are when you’re out in public or tracking what you read online; in China, for example, smartphones are being used for biometric monitoring, to keep tabs on possible symptoms of COVID-19. Writing in the Financial Times, Harari said:
“It is crucial to remember that anger, joy, boredom, and love are biological phenomena just like fever and a cough. The same technology that identifies coughs could also identify laughs. If corporations and governments start harvesting our biometric data en masse, they can get to know us far better than we know ourselves, and they can then not just predict our feelings but also manipulate our feelings and sell us anything they want — be it a product or a politician. Biometric monitoring would make Cambridge Analytica’s data hacking tactics look like something from the Stone Age. Imagine North Korea in 2030, when every citizen has to wear a biometric bracelet 24 hours a day. If you listen to a speech by the Great Leader and the bracelet picks up the tell-tale signs of anger, you are done for.”
Can It Be Stopped?
So can you stop these companies from spying on what you do and profiting from it?
Regulators and privacy advocates have certainly tried. The EU curbed what companies can do with our data without permission, introducing the GDPR, which requires websites to ask for consent and tell you explicitly what they are about to do with your data. These rules apply not only to organizations headquartered in the EU itself but to any site that collects data from people in its 27 member states.
GDPR only goes so far, though.
For a start, do you actually read the T&Cs every time you open a new webpage? Or do you click “I accept” without really taking them in? Do you scrutinize every new change to Google’s user agreement? Google, after all, knows exactly what you’re doing online, all the time, if you’ve signed in to any Google service, from Gmail to YouTube to Google Docs.
Secondly, sites warn you that they share your data with “carefully selected third parties” and that their services won’t work properly if you refuse. They also say they aren’t responsible for what their partners do with your data. How can you possibly keep on top of all these arrangements, every time you go online or switch on a smart device?
(Sometimes these third parties overstep the mark and get away with it. Just look at the Cambridge Analytica scandal.)
Thirdly, and most importantly, the most valuable data to data capitalists isn’t your personal data, it’s what happens when you aggregate these data points into a bigger trend. Much of this comes from the traces you leave, the residual data, which they argue you don’t own.
A relatively harmless example is the “Did You Mean…?” feature in Google Search. Google analyzed billions of search patterns to figure out what people start typing and then delete, and which words are common misspellings. It then suggests or autofills what it thinks you meant.
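As a rough illustration of the idea (not Google’s actual system), a spelling suggester can be approximated with nothing more than a log of past queries and a string-similarity check. The query log below is invented:

```python
# A toy "Did You Mean...?" suggester built from an invented query log.
# Real systems mine billions of queries; the principle is similar.
import difflib
from collections import Counter

query_log = [
    "surveillance capitalism", "surveillance capitalism", "surveilance capitalism",
    "targeted advertising", "targeted advertising", "targetted advertising",
]

# Treat spellings that recur often enough as the "trusted" vocabulary.
vocabulary = [q for q, count in Counter(query_log).items() if count >= 2]

def did_you_mean(query: str):
    """Return the closest trusted query, or None if nothing is similar enough."""
    matches = difflib.get_close_matches(query, vocabulary, n=1, cutoff=0.8)
    return matches[0] if matches and matches[0] != query else None

print(did_you_mean("targetted advertising"))  # -> "targeted advertising"
```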
Now consider the implications of Google and Facebook tracking not only what you search for and comment on, but everything you start to type, even if you change your mind and delete it without hitting “publish”.
You don’t own that data. They do. They can use it to gauge knee-jerk emotional reactions. A prejudice you immediately suppress. Even a comment that, in some parts of the world, might constitute an illegal criticism of the government. Imagine if an organization buying insights from this data could predict what was likely going through someone’s head before they edited and softened what they ended up sharing on social media. From there, they could market to you in a way that inflames whatever made you angry. Or, at the other end of the spectrum, begin policing not only your words but your thoughts.
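To see why that residual data is so revealing, consider a toy sketch: diffing what someone typed against what they finally posted exposes exactly the words they chose to suppress. The draft and published strings here are invented, and no platform’s actual logging is shown.

```python
# A toy illustration of residual data: the gap between a draft and the
# published version. Both strings are invented for this example.
import difflib

draft = "this policy is an outrage and the minister should resign"
published = "i have some concerns about this policy"

# Words present in the draft but deleted before publishing.
deleted = [
    token[2:]
    for token in difflib.ndiff(draft.split(), published.split())
    if token.startswith("- ")
]
print("Suppressed before posting:", deleted)
```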
This is all totally legal. These companies don’t need your permission to exploit this data point capital. While you’re using their platform, you can’t stop them.
Can You Opt-Out?
“I was sitting with some Google executives and I asked the question, ‘How do I opt out of Google Earth?’ All of a sudden, the whole room goes silent. Marissa Mayer, [a Google vice president at the time], was sitting at a different table, but she turned around and looked at me and said, ‘Shoshana, do you really want to get in the way of organizing and making accessible the world’s information?’ It took me a few minutes to realize she was reciting the Google mission statement.”
– Prof. Shoshana Zuboff (Author, The Age of Surveillance Capitalism), in an interview with The Intercept.
Everywhere you go, you are being quietly watched and listened to. Google, Facebook, Apple, Microsoft, and Amazon have all worked hard to make that the case. Even if you hate social media, the chances are you’ll have to set up an account for work or school. You might never dream of getting an Alexa, but your friends’ smartwatches and phones will eavesdrop on you all the same. And as the whole world moves into self-isolation or lockdown in the midst of the global COVID-19 pandemic, even the most strictly analog people are now having to live their lives online.
So: no. Realistically, you can’t opt out of surveillance capitalism. But you can make it trickier for companies to piece together your online activity into a coherent pattern, or to determine your true identity.
Firstly, don’t link your accounts. When you get the option to sign up or sign in with your Facebook account, say no. If you really want to stay anonymous, generate a temporary email address from 10 Minute Mail, which is far harder to trace back to you. If you do use your own email address for Facebook, Google, iCloud, Amazon services, etc., make sure you sign out, so your activity can’t be linked as easily when you move between sites, devices or tabs.
Your IP address is the biggest giveaway, though, so you’ll need a VPN. This masks your identity by routing your connection to the internet through a different server, which could be anywhere in the world. If the connection is properly encrypted (and you don’t give away your identity some other way), very little of what you do online can be traced back to you.
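Conceptually, the effect is easy to demonstrate: a website only ever sees the address your request arrives from. The sketch below uses a public “what is my IP” service and a hypothetical proxy address standing in for a VPN exit server; a real VPN client does this at the operating-system level, for all of your traffic, inside an encrypted tunnel.

```python
# A conceptual sketch of IP masking, not a real VPN client.
# Requires: pip install requests[socks]
# The proxy address is a placeholder -- substitute your provider's details.
import requests

IP_ECHO = "https://api.ipify.org"  # replies with the IP address it sees
PROXY = {"https": "socks5h://vpn.example.com:1080"}  # hypothetical exit server

direct_ip = requests.get(IP_ECHO, timeout=10).text
tunneled_ip = requests.get(IP_ECHO, proxies=PROXY, timeout=10).text

print("Websites see this without a VPN:", direct_ip)
print("Websites see this through the tunnel:", tunneled_ip)
```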
It’s possible to find a VPN that’s completely free to use, but if we’ve learned anything from the data capitalism of the past 20 years it’s that sometimes it’s better to be sure you’re the customer, not the product. This aside, a reliable VPN boosts security and protects you from prying eyes on public WiFi, so it’s important you opt for a good one.
We recommend:
#1 ExpressVPN
ExpressVPN is a very fast, secure, reliable option that is capable of unblocking just about any type of content and works on just about any device. You can choose to connect from servers in multiple cities across 94 countries, with excellent speeds even when streaming video. It’s very stable, with few delays or interruptions.
When it comes to privacy, ExpressVPN deploys excellent encryption, incorporates DNS leak protection to prevent any identifying data creeping through, and offers a “stealth mode” option. There’s also a kill switch, which kicks into action if your Wi-Fi suddenly drops, ensuring you don’t go back online (and unwittingly give away your real IP address) until the VPN has reconnected.
The company is also committed to protecting the data it keeps on customers: it doesn’t keep user logs, has an excellent privacy policy and is based in the British Virgin Islands, which means the company can’t be compelled to keep or hand over information on what users get up to online. Read our full ExpressVPN review for more information.
#2 Surfshark
While less well known than ExpressVPN (and not always as speedy), Surfshark is another great VPN option. It unblocks all the big streaming services, including BBC iPlayer and Netflix US, and has 500 servers across 50 countries, so you have plenty of choices when it comes to where you connect from. It’s also popular for torrenting.
Even better are the unlimited simultaneous connections, allowing you to use the VPN from as many devices as you want, from one account, preserving your privacy across the board.
Like ExpressVPN, Surfshark doesn’t keep any logs, so you’re protected that way, too. Security is top-notch, and the split tunneling feature in the Windows and Android apps lets you exclude certain apps from the VPN without turning the whole thing off. That means you can, for example, keep using apps that require geolocation while browsing anonymously everywhere else. Read our full Surfshark review for more information.
Final Thoughts
The rise of the internet and the proliferation of connected devices was supposed to democratize knowledge, empower users and help us develop creative ways to solve problems. Sometimes it achieves those things. The trouble is, those same tools can be used to exploit, control and manipulate. In our current system, it’s extremely difficult to access one side without risking the other. Surveillance capitalism isn’t going anywhere, so unless you plan on logging off and going dark, you’ll need to find ways to manage its influence over your life.
- Did you find this look at surveillance capitalism interesting? Want to delve deeper? Get the book The Age of Surveillance Capitalism by Prof. Shoshana Zuboff.