Surveillance Capitalism: An Overview
Hau Phan / June 07, 2021
• 21 min read
This was my final essay for CS-E5480: Digital Ethics D. PDF
Introduction
The 21st century witnessed a rapid digital transformation of the political and socioeconomic landscape, made possible by the internet. Digital technology advanced at a pace no one had expected, engendering waves of transformation across multiple industries. As the burning flame of industrial innovation slowly died out, the imminent information revolution was on the horizon, waiting to be set ablaze.
The first decade of the century is remembered for the rise of the first tech giants: Google, Apple and Microsoft all experienced unprecedented growth during the period. These accomplishments were celebrated globally, most of all in the US, as new consumer products and digital services brought many conveniences and life improvements. Little did we know that during the same period, a group of individuals had invented an exploitative, totalitarian form of capitalism, an unprecedented event in the digital transformation. Only years later did the world start to realize the nature of these big tech companies and the practices they employed. Many research papers were published addressing antitrust law and the monopolistic practices of these companies, but few addressed the fundamental economic system underpinning all their operations. Zuboff was the first to recognize it, coining the term "surveillance capitalism" to characterize this rogue economic system. This essay aims to provide an overview of the foundations of surveillance capitalism, its components, operations, and consequences, and a simple path for exploring the topic.
Outline
Surveillance capitalism as a general economic concept was introduced in "A Digital Declaration" by Shoshana Zuboff in 2014. The paper marked the first publication on this mutation of capitalism and sparked many discussions on the technical practices of Big Tech companies such as Google, Facebook, and Microsoft. Subsequent scholarly articles built further upon the definition laid out in her original paper. In 2019, a major work on surveillance capitalism was published: "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power"1, in which she summarized:
"Surveillance capitalism is best described as a coup from above, not an overthrow of the state but rather an overthrow of the people's sovereignty and a prominent force in the perilous drift towards democratic deconsolidation that now threatens Western liberal democracies."1
These ominous consequences of surveillance capitalism call for a coherent ethical framework capable of encapsulating all of its complications. In this essay, I suggest one such framework, albeit simplified, consisting of three distinct but interconnected operations that constitute the primary behaviors observed in surveillance capitalism:
- The mining of "behavioral surplus" from user activities and experiences (extraction)
- The feeding of behavioral data into advanced analytical processes ("machine intelligence") to produce "prediction products" (manufacture)
- The exchange of prediction products on "behavioral futures markets" (commercialization)
Concrete examples will be shown in subsequent sections, where the three primary operations of surveillance capitalism are laid out and explored in detail. However, it is necessary that some fundamental theoretical concepts are clearly understood beforehand.
Extraction and Manufacture
Machine Intelligence
First, to understand the extraction and manufacturing processes of surveillance capitalism, a frequently used term must be clearly understood beforehand: machine intelligence. The term is used throughout Zuboff's "The Age of Surveillance Capitalism" as a means to bypass the technicalities of analytical practices1. In summary, machine intelligence is a generalization over the specialized computational tools used for data analysis and the manufacture of data-based products. It serves as an umbrella phrase covering the complex computational methods employed by surveillance capitalists, such as artificial intelligence, predictive analytics, and machine learning, allowing for easy reference from other areas of science. Its role is to concentrate our attention on the more important issues of surveillance capitalism, shifting our focus from the technology it employs to its objectives.
The Behavioral Value Reinvestment Cycle
A fundamental component of surveillance capitalism is the loop of behavioral data collection, analysis, service improvement and increased user activity: the behavioral value reinvestment cycle. It came into existence during the early invention of surveillance capitalism at Google, when engineers began to apply machine intelligence to massive collections of users' past queries. The outputs of these analyses were then translated into improvements of the user experience: better detection of typos, and more accurate and relevant query results and suggestions. In other words, the value created was reinvested directly into improving the user experience, hence the name. These enhancements in turn attracted more users onto the platform, producing even more behavioral data for analysis. This self-improving mechanism was so effective that only a year after Google's establishment, its search engine was handling more than seven million user requests each day. Note that user data was provided at no cost, but so were the enhancements of the platform's services for users.
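As a toy illustration of this feedback loop, consider the sketch below. Every constant and functional form in it is an invented assumption for illustration only; the point is simply that each pass through the loop feeds the next.

```python
# A toy simulation of the behavioral value reinvestment cycle. Every constant
# here (growth rates, data volumes) is an illustrative assumption, not data.

users = 10_000          # initial user base
service_quality = 1.0   # abstract measure of how good the service is

for year in range(1, 6):
    behavioral_data = users * 100                # usage leaves a data trail
    service_quality *= 1 + 0.05 * (behavioral_data / 1_000_000)  # analysis improves the service
    users = int(users * service_quality)         # a better service attracts more users
    print(f"year {year}: {users:,} users, quality {service_quality:.2f}")
```

Even with these made-up numbers, the output grows faster each year: data improves the service, the improved service draws more users, and more users produce more data.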
Before moving on, some misconceptions need to be pointed out. Since there is no economic exchange, no price and no profit, it is inaccurate to think of Google's users as customers. There are also no wages involved, nor any provision of the means of production: users are not paid for the data they produce, nor do they operate the web-crawling process or its enhancements. Consequently, it is also wrong to think of them as workers in the cycle. Finally, there is the common refrain that the user is the "product" of the system. This too is misleading, as many aspects of being a "product" are missing, and such expressions further confuse the issue rather than clarify it.
The cycle was once all there was to the operations of Google, the first practitioner of surveillance capitalism. While it differed from previous producer-consumer relationships by embodying a new mechanism for improving products and enlarging the user base, the behavioral value reinvestment cycle was not yet capitalism, at least not at this stage.
During Google's early days, there was the major issue of converting service usage into revenue: charging users for their searches seemed financially risky and counterproductive, while directly monetizing searches seemed a dangerous precedent, as much of Google's indexed information was taken without payment from content hosts. The cycle, while possessing the capacity to produce advanced technologies, was financially unsustainable and failed as a functional business model. For Google, providing advertising services was the only viable solution.
The decision to incorporate advertisements on the company's website faced much opposition at the start, mostly from the company's engineers and analysts. Many engineers on Google's AdWords team displayed antipathy toward ads, fearing that uncontrolled bias toward advertisers could steer the company away from users' needs and degrade the integrity of its searches2. However, financial pressure from the fallout of the dotcom bubble and intense market competition in the early 2000s eventually overrode these ethical ideals. To make more money, it was proposed that advertisements be automatically targeted to specific consumers, simplifying the process of choosing which keywords trigger which ads for advertisers using Google's online advertising platform. Finding the solution to this problem eventually led to Google's discovery of the centerpiece of surveillance capitalism: behavioral surplus.
Behavioral Surplus
Behavioral surplus can be described as the by-product of users' digital activities, usually existing in the form of behavioral patterns within collections of user data, ready to be extracted and transformed into prediction products. It is a component of the behavioral value reinvestment cycle, the "data exhaust" produced during users' digital activities and their analysis3. For example, the innocent act of searching the web for a keyword on Google produces a wake of collateral data such as search counts, phrasing, spelling, click patterns, dwell times and geographic locations. This excess behavioral data was once thought to be "exhaust material" devoid of meaningful value, and was thus either cached in massive data stores as backup or discarded entirely. For instance, in its early days of operation Google stored user query logs for archival purposes without much knowledge of their hidden predictive value.
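To make the notion of "data exhaust" concrete, the hypothetical log record below shows what a single search event might capture; the field names and values are invented for illustration. Only the query itself is needed to serve the user; everything else is surplus.

```python
# A hypothetical log record for one search event. Field names are invented;
# the split between "needed to serve the result" and "surplus" is the point.

search_event = {
    "query": "flu symptoms",         # required to produce the search result
    # -- everything below is behavioral surplus: not needed to answer --
    "timestamp": "2021-06-07T09:14:03Z",
    "typing_corrections": 2,         # hesitation while phrasing the query
    "dwell_time_seconds": 41,        # how long the user lingered on results
    "clicked_results": [3, 1],       # click pattern and order
    "location": (60.16, 24.93),      # approximate geographic position
}

surplus = {k: v for k, v in search_event.items() if k != "query"}
print(f"{len(surplus)} surplus signals collected from a single query")
```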
Later analysis of such seemingly unrelated, accidental data in massive quantities, or "data mining", revealed behavioral patterns representing sensitive aspects of human behavior such as emotions, moods, intentions and needs. Such insights gave Google a powerful advantage over its competitors in the advertising services market. For example, a rival search startup, Overture, had developed an online auction system to address the scaling problem of online targeted advertising. Compelled, Google developed a similar auction system but added a transformational functionality: probabilistic modeling of users' clicks on ads. The model produced a numeric score that could be used to compare the effectiveness of an advertiser's ads on a particular user, maximizing not only specificity and accuracy but also the number of advertisers Google could handle at any given time. Eventually, Google succeeded and came to hold a monopoly over web search, eliminating many companies in the same service space along the way.4
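A minimal sketch of this kind of auction ranking is shown below. The field names and the simple bid-times-predicted-click-probability scoring rule are illustrative assumptions, not Google's actual formula; the point is that a behavioral prediction, not the bid alone, decides which ad is shown.

```python
# A minimal sketch of auction ranking driven by a click-probability model,
# in the spirit of the system described above. Names and the scoring rule
# (bid * predicted CTR) are illustrative assumptions, not Google's formula.

from dataclasses import dataclass

@dataclass
class AdBid:
    advertiser: str
    keyword_bid: float    # price offered per click
    predicted_ctr: float  # model's estimate that this user clicks this ad

def rank_ads(bids: list[AdBid]) -> list[AdBid]:
    # Rank by expected revenue per impression: bid * probability of a click.
    # A high bid no longer wins on its own; the behavioral prediction decides.
    return sorted(bids, key=lambda b: b.keyword_bid * b.predicted_ctr, reverse=True)

auction = [
    AdBid("advertiser_a", keyword_bid=2.00, predicted_ctr=0.01),
    AdBid("advertiser_b", keyword_bid=0.50, predicted_ctr=0.08),
]
print([b.advertiser for b in rank_ads(auction)])  # advertiser_b wins: 0.04 > 0.02
```

Under this rule, the lower bidder wins because the model predicts its ad is far more likely to be clicked, which is exactly why accurate behavioral prediction became the commercially decisive asset.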
The discovery of behavioral surplus and its capacity for behavioral prediction marked a shift of priorities in Google's investment strategy. Under the hood, the behavioral value reinvestment cycle was rapidly subordinated to a much more complex system of operations, unbeknownst to users. While some of the data relevant to improving user services would still be reinvested for the benefit of the consumer, the focus was now placed on maximizing the extraction of behavioral data and on developing the machine intelligence and operations that derive value from it. The purpose of improving services slowly degenerated into keeping users engaged and the platform reliable, in service of extraction. For Google, this meant keeping users reliant on Google for online browsing and analyzing their queries for better-targeted ads. Note that targeted advertising is just one derivative of the prediction products made possible by surveillance capitalism, and not the only source of value for surveillance capitalists.
The discovery also induced another change in corporate mindset at Google: the company was now compelled to actively hunt for sources of behavioral surplus and better tools of extraction rather than wait for accidental patterns to emerge from user activities. Zuboff characterized this as the extraction imperative, in contrast to the production imperative of industrial capitalism. One example is Google's "senseless" $1.65 billion acquisition of YouTube at a time when the video-sharing startup was riddled with copyright-infringement lawsuits and had operated for a year without profit. Another is Facebook's "reckless" purchase of overvalued, unprofitable startups such as the virtual reality company Oculus ($2 billion) and the messaging platform WhatsApp ($19 billion). Only years later did it become clear that these seemingly ludicrous business decisions were deliberately aimed at acquiring potential sources of behavioral surplus, which evidently brought tremendous amounts of capital to these first movers of surveillance capitalism.
The Moat
In the discussion of "The moat around the castle"1, Zuboff laid out three main paths of exploration detailing how socio-political circumstances and the deliberate practices of surveillance capitalists obfuscate their operations and legitimize their exploitative practices. These include: (1) the pursuit and defense of corporate freedom and operational rights in unregulated space; (2) the sudden federal interest in the capabilities of behavioral surplus analytics after 9/11; and (3) the construction of fortifications in politics and academia to protect its practices and deflect scrutiny.
Right to Unregulated Space
The founders of Google instituted a corporate structure that allowed two opposites to coexist: total control over the market sphere and the pursuit of freedom in the public sphere. Such freedom was made possible by the unregulated nature of cyberspace, mostly due to its novelty as an arena of business activity and economic operations. Cyberspace was characterized by Eric Schmidt and Jared Cohen in their book "The New Digital Age" as the world's "largest ungoverned space", truly unbound by "terrestrial laws" and jurisdictions5. This lack of political institutions is what made cyberspace attractive to surveillance capitalists: a frictionless space where behavioral surplus extraction and manufacturing operations run smoothly and efficiently without socio-political hindrance. Such policy gaps are a direct translation of the speed gap between democratic institutions and big-tech corporations. As Schmidt admitted in his elaboration on his 2011 Senate testimony, the same antidemocratic measure of leveraging speed "also work[s] for Google", described as:
"This is an Andy Grove (Intel former CEO) formula.... "High tech runs three-times faster than normal businesses. And the government runs three-times slower than normal businesses. So we have a nine-times gap.... And so what you want to do is you want to make sure that the government does not get in the way and slow things down"
A Historical Circumstance
The 9/11 terror attacks caused significant shifts in mentality among government officials and in general sentiment toward public surveillance. This historical circumstance united the causes of public intelligence agencies and the early surveillance capitalist Google, producing a unique historical deformity: surveillance exceptionalism.
The terror attacks shifted the federal government's perception of online surveillance practices: from operations in violation of user privacy to mission necessities critical to public safety. Both institutions coveted certainty about user behaviors and were motivated to fulfill that craving in their respective domains at any cost. The circumstances lent surveillance capitalism shelter from scrutiny by slowly legitimizing its operations in the political sphere. Intelligence agencies were now motivated to replicate Google's means of extraction and manufacture, spreading surveillance capitalism's ideology to other sectors of power in society. For example, in 2006, General Keith Alexander outlined his vision for a search tool called ICREACH that would "allow unprecedented volumes of metadata to be shared and analyzed across the many agencies in the Intelligence Community"6. In 2007, two NSA analysts wrote an internal training manual on how to find information on the internet7. This craving slowly translated into reliance, as the government grew dependent on Silicon Valley to defend against security threats looming in cyberspace, deepening the relationship between governments and surveillance capitalists.
Fortification
The fortification strategies employed by surveillance capitalists, to my knowledge, consist of four main operations: providing competitive advantages in electoral politics, personnel migration to and from government sectors, aggressive lobbying, and manipulating public perception by influencing cultural conversation and academic publications. For example, the 2008 Obama presidential campaign had Eric Schmidt, the sitting CEO of Google, as one of its main directors, in charge of implementing state-of-the-art data strategies with the potential to overshadow traditional political campaigning with the science of behavioral prediction8. Personnel migration can be seen frequently through Google's years of operation: the Google Transparency Project found that by April 2016, 61 individuals had migrated from the Google sphere (company employees plus affiliates and law/lobbying firms) into government, and 197 government officials had moved in the opposite direction9. Lobbying is common practice for Google: in 2014, more than $17 million was spent on lobbying, and by 2018 that number had risen to more than $18 million9. To obfuscate its practices, Google manipulates information by means of financial pressure, influencing academic research and steering public opinion. It has been reported that since 2009, Google deliberately sought out and funded university professors to write policy papers in agreement with Google's positions.10
A Human Invention
It is important to emphasize that surveillance capitalism is an intentional creation, an invention made at a specific time and place by a group of individuals. It is neither an inevitable result of the digital transformation nor an expression of information capitalism. It was deliberately constructed to solve a business problem at a particular moment in history. Had there been no recession or dotcom crash, or had the people in charge made different decisions, the fire might never have started and surveillance capitalism might never have come into existence.
Many elements of online surveillance predate the creation of surveillance capitalism. For example, "cookies", small pieces of data stored on the user's computer by the web browser that allow websites to remember user information and activities, had already been introduced by Netscape in 199411. Other online browsing trackers and surveillance tools, such as "web beacons" or "web bugs", were well known among experts by the late 1990s12. However, it was Google that integrated a wide range of online surveillance mechanisms, from cookies to predictive analytics, instituting a new logic of accumulation by means of data extraction and analysis and establishing a new market for commercialized prediction products whose customers are businesses, not consumers.
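For readers unfamiliar with the mechanism, the sketch below shows in outline how a tracking cookie works, using a hypothetical ad server at ads.example.com; the field names and values are invented. Real trackers are more elaborate, but the principle is the same: assign a unique ID once, and receive it back from every page that embeds the tracker.

```python
# A minimal sketch of cookie-based tracking with a hypothetical ad server at
# ads.example.com. The server assigns a unique ID once; the browser then
# re-sends it with every request to that domain, from any embedding page.

import uuid

def handle_first_visit() -> dict:
    visitor_id = uuid.uuid4().hex  # unique, persistent identifier
    return {
        # The browser stores this and re-sends it to ads.example.com for a year.
        "Set-Cookie": f"uid={visitor_id}; Domain=ads.example.com; "
                      "Max-Age=31536000; Path=/"
    }

def handle_return_visit(cookie_header: str, page_url: str) -> None:
    # Every page embedding a resource from ads.example.com leaks the visitor's
    # ID plus the page they were on -- one more entry in a browsing profile.
    visitor_id = dict(kv.split("=", 1) for kv in cookie_header.split("; "))["uid"]
    print(f"{visitor_id} visited {page_url}")

print(handle_first_visit()["Set-Cookie"])
handle_return_visit("uid=deadbeef", "https://news.example.org/article")
```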
Commercialization
Prediction Product
Surveillance capitalism is a derivative of capitalism, and thus the exchange of products among its actors is one of its fundamental activities. Unlike familiar industrial capitalism, where commodities are manufactured goods traded on the open market, surveillance capitalism's goods are prediction products that forecast the future behaviors of users. These include, but are not limited to: thoughts, actions, emotions, moods, desires, physical needs, psychological needs, short-term intentions and, given sufficiently powerful behavioral data, possibly long-term intentions. It is the nature of prediction products that explains why Google constantly distances itself from the notion of selling personal data. Google does not sell the raw material; it sells the predictions. The claim of privacy purity is just a superficial excuse that conceals the backstage operations of the surveillance capitalism it employs.
Prediction products reduce uncertainty in their customers' operations, advising them where and when to allocate resources. The quality of a prediction product is a direct translation of its accuracy: how well it approximates reality. The more precise the prediction, the lower the risk and the higher the revenue. For the fledgling Google, targeted advertisements were the embodiment of prediction products. However, as Zuboff demonstrated, advertising is far from the end of the commodification of behavioral data.
Behavioral Market
Prediction products, after being fabricated by machine intelligence from massive collections of behavioral surplus, are sold on a new kind of market: the behavioral futures market. This market trades exclusively in knowledge of consumers' future behaviors. Although for most of the history of surveillance capitalism the dominant players in this new marketplace have been advertisers, there is no reason why such markets should be limited to this particular group.
The scope of behavioral futures markets has expanded alongside the advance of surveillance capitalism in modern society, both in potential customers and in the variety of traded products: once confined to online targeted advertising services, the products of surveillance capitalists may now comprise offline predictions of users' locations, emotions and actions; the automated tools that generate those predictions; and ultimately, behavioral modification tools that align user behavior with a business's means of profit-making. While surveillance capitalism is based on classical capitalism and shares many of its commercial dynamics, there are necessary distinctions between the two that are worth highlighting.
The classical producer-consumer relationship of capitalism is starkly different from this freshly formed variant. Classical capitalism allows for constructive relations between manufacturers and consumers, in which the former create supply and the latter induce demand. Manufacturers base their course of action on the state of the consumer market, adjusting product prices, quality and factory capacity accordingly, while for most consumers, purchase decisions are based on final price tags, determined primarily by the prices set by manufacturers.
In contrast, surveillance capitalism's relationships with consumers, the users of its services, are exploitative rather than constructive. The consumers of digital services have little to no influence on the operations of these services. More and more of our online activities are accompanied by machine intelligence beyond our understanding. In fact, these services have become behavioral modification tools purposely designed to herd consumers like sheep into areas of data extraction. For example, YouTube's video recommendation algorithms are programmed to maximize users' on-site time rather than satisfy their needs13. Another example is the clip-sharing platform TikTok, whose machine-learning-personalized content keeps young users hooked for hours14. Essentially, surveillance capitalists are trying to automate consumers' behaviors, stripping them of their decision rights. The transformative effects of this uncontrolled advancement of surveillance capitalism will be discussed in detail in the next section.
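The sketch below contrasts the two objectives named above. The scoring functions and numbers are invented for illustration and are not YouTube's or TikTok's actual systems; the point is that the optimization target, not the user's stated need, decides what gets shown.

```python
# A minimal sketch contrasting two ranking objectives for a video feed.
# Both scorers and all numbers are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    relevance_to_query: float      # how well it answers what the user asked
    expected_watch_minutes: float  # model's prediction of time-on-site

catalog = [
    Video("Direct answer to the user's question", 0.9, 2.0),
    Video("Autoplay-friendly outrage compilation", 0.2, 25.0),
]

def rank_for_user_need(videos: list[Video]) -> Video:
    return max(videos, key=lambda v: v.relevance_to_query)

def rank_for_engagement(videos: list[Video]) -> Video:
    # The extraction-oriented objective: maximize predicted time on site,
    # i.e. maximize the window for behavioral data collection and ads.
    return max(videos, key=lambda v: v.expected_watch_minutes)

print(rank_for_user_need(catalog).title)   # the direct answer
print(rank_for_engagement(catalog).title)  # the compilation wins
```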
This void of power over our own digital experience is what makes the behavioral value reinvestment loop run smoothly and the manufacture of prediction products efficient. Contrary to information capitalism, the services provided by surveillance capitalists are nothing but hooks that lure users into convenient areas of extraction. We are far from being the end consumers of surveillance capitalism; in reality, we are at the opposite end: the raw material, objects of an inescapable system of continuous extraction.
Instrumentarian Power
Competition in any capitalist economic system drives innovation in the means of production. However, in future renditions of surveillance capitalism, innovation may no longer equate to more efficient tools of manufacture and extraction. Zuboff suspected that future surveillance capitalists could discover that the best way to maximize their competitive advantage is to automate our behaviors directly. Rather than producing more accurate algorithms and better tools of extraction, they may modify user behaviors and align them with their customers' needs instead, maximizing the effectiveness of what are no longer prediction products but instruments of behavioral modification. As portrayed in her words:
" With this reorientation from knowledge to power, it is no longer enough to automate information flows about us; the goal now is to automate us."1
This realization among future surveillance capitalists might mark the birth of a new species of power: instrumentarianism. The means of manufacture in surveillance capitalism are replaced by means of behavioral modification. Players in the system become locked in a continuous loop of intensifying behavioral controls, enjoying the gathering might of the instrumentarian power these means endow.
Moreover, the means of behavioral modification may not be limited to the digital world. Competitive dynamics might nudge the expansion of behavioral futures markets beyond the digital sphere and into the physical world. The same foundational mechanisms used to lure and guide your online activities and decisions, such as liking posts, picking a product in an online store or watching a particular YouTube video, can be repurposed to physically modify your behavior in the real world. For example, Pokémon Go was Google's first publicly known experiment in physical behavioral modification in the real world1. The viral phenomenon attracted millions of users across the globe and became a tremendous financial success for Niantic Labs, which, surprisingly to most, began as an internal startup at Google. Most importantly for Google, it was proof that such an expansion of surveillance capitalism into the market of real-world behavioral modification is possible.
Conclusion
In this essay, I proposed a framework for exploring surveillance capitalism and its surrounding discussions. Starting from the fundamental operations of extraction and manufacture, we explored the concept of behavioral surplus and the cycle that generates it: the behavioral value reinvestment cycle. We then witnessed how Google, the inventor and first practitioner of surveillance capitalism, came up with a solution to the financial problem it faced, and followed the logical steps that led to the invention of surveillance capitalism. We also explored how different aspects of surveillance capitalists' operations are protected and obfuscated, and surveillance capitalism's nature as a man-made creation. Finally, we moved on to the commercial aspect of surveillance capitalism: prediction products and the market for their exchange and consumption. We ended with a brief discussion of the advancement of surveillance capitalism and its elevation from prediction products to behavioral modification as the primary means of production.
References
1. Shoshana Zuboff. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs; 2019.
2. Brin S, Page L. The Anatomy of a Large-Scale Hypertextual Web Search Engine. Computer Networks and ISDN Systems. 1998 Apr;30(1):107–17. Available from: https://www.sciencedirect.com/science/article/pii/S016975529800110X
3. Cukier K. Data, Data Everywhere. The Economist. 394(8671):3–5. Available from: https://search.informit.org/doi/abs/10.3316/agispt.20100985
4. Coy P. The Secret to Google's Success. Business Week. 2006;17.
5. Eric Schmidt, Jared Cohen. The New Digital Age: Transforming Nations, Businesses, and Our Lives. Knopf Doubleday Publishing Group; 2013.
6. Gallagher R. The Surveillance Engine: How the NSA Built Its Own Secret Google. The Intercept. 2014;25.
7. Bishop B. NSA Reveals Its Internet Search Tricks in the Recently Declassified "Untangling the Web". The Verge. 2013. Available from: https://www.theverge.com/2013/5/8/4313524/nsa-reveals-its-internet-search-tricks-in-the-recently-declassified-untangling-the-web
8. Eric Schmidt: Obama's Chief Corporate Ally. Tech Transparency Project. 2016. Available from: https://www.techtransparencyproject.org/articles/eric-schmidt-obamas-chief-corporate-ally
9. Google Transparency Report. Available from: https://transparencyreport.google.com/?hl=en
10. Mullins B, Nicas J. Paying Professors: Inside Google's Academic Influence Campaign. Wall Street Journal. 2017 Jul. Available from: https://www.wsj.com/articles/paying-professors-inside-googles-academic-influence-campaign-1499785286
11. Kristol DM. HTTP Cookies: Standards, Privacy, and Politics. ACM Transactions on Internet Technology. 2001 Nov;1(2):151–98. Available from: https://doi.org/10.1145/502152.502153
12. Smith RM. The Web Bug FAQ. 1999 Nov;11:4.
13. Bishop S. Anxiety, Panic and Self-Optimization: Inequalities and the YouTube Algorithm. Convergence. 2018 Feb;24(1):69–84. Available from: https://doi.org/10.1177/1354856517736978
14. Anderson KE. Getting Acquainted with Social Networks and Apps: It Is Time to Talk about TikTok. Library Hi Tech News. 2020 Jan;37(4):7–12. Available from: https://doi.org/10.1108/LHTN-01-2020-0001