Advances in the digital revolution can be dazzling. Data about where we are, where we’re going, how we’re feeling, and what we’re saying have become beacons of revenue, illuminating a new commercial prospect. The lights, bells, and whistles have blinded us to the ways high-tech giants exploit our personal data: corporations mine users’ information to predict and shape their behavior. Here we enter uncharted territory. The concept of privacy alone can no longer contain the assault on behavioral data. We face a new kind of challenge, one that threatens the political, ideological, and existential basis of the modern liberal order, built over centuries on principles of self-determination: the sanctity of the individual and social equality; the social development of identity and autonomy; the integrity of contracts and the freedom derived from making and keeping promises; the rules of collective agreement; the functions of market democracy; the political integrity of societies; and the future of democratic sovereignty. All of these are now under threat.
The phenomenon of surveillance capitalism developed through a largely hidden coupling between the vast powers of the digital and the commercial ambitions that have dominated global commerce for the last three decades. It was discovered and consolidated at Google, then adopted by Facebook and quickly diffused across the Internet. Google was the birthplace of a wholly new subspecies of capitalism, one characterized by the unilateral surveillance and modification of human behavior; this form of capitalism cannot be imagined apart from the digital milieu of the Internet and its successors. Society credits Google’s success to its advertising model; in fact, the discoveries that led to Google’s rapid rise in revenue and market capitalization are only incidentally related to advertising. Google’s success rests on its ability to forecast the future – specifically, the future of behavior. From the start, Google collected data on users’ search-related behavior as a by-product of query activity. These data logs were initially treated as ‘waste’, not even safely or methodically stored. Google eventually realized that the logs could be used to teach and continuously improve its search engine. At this stage the process was self-contained: delivering better search results “used up” all the value that users had incidentally supplied through their behavioral data, and the value created by users was reinvested in the user experience. Users were ends in themselves; the search engine needed their behavioral data as much as they needed the search function. The breakdown of this balance marked the beginning of the unethical practices that corporations would soon adopt.
We can now see that this shift in the use of behavioral data marked a historic turning point. Data on human behavior that had been abandoned and ignored were rediscovered as what is called behavioral surplus. Google’s achievement in “matching” ads to pages made clear that this surplus could generate revenue, converting what had been waste into capital. Behavioral surplus thus became a game-changing, zero-cost asset that could be diverted from service improvement to a genuine market exchange. The key to this formula is that the new exchange was not with users but with other companies that understood how to profit by predicting users’ future behavior. In this new context, user needs were no longer ends in themselves. Instead, a new kind of marketplace emerged in which users were neither buyers nor sellers; they became the source of free raw material feeding a new kind of manufacturing process.
What happened here was the discovery of a surprisingly profitable business equation – a series of lawful relationships that became progressively embedded in surveillance capitalism’s economic logic. Advertising dominated this new marketplace in the earliest part of its history, but there is no substantive reason such markets should be restricted to advertisers; Donald Trump’s election is one indication of this trend. The behavioral fortunes of individuals, groups, bodies, and things can be told and sold in a marketplace where any actor interested in monetizing or influencing probabilistic information about our behavior may do so. Capitalism has shifted accordingly: once profits came from products and services, then from speculation, and now from surveillance. Amid the exponential growth of digital technologies, this economic shift has been difficult to discern, because so much of the technology has been diverted into this unethical form of profit.
Surveillance capitalism’s claim on our behavioral future will not be altered without an uprising. Society must revoke its collective consent to the practices that enable the dispossession of behavior. To restore the primacy of an ethical order to this capitalist project, society must reimagine how to intervene in the specific mechanisms that produce surveillance profits. Government and society need to enact new interventions that interrupt, outlaw, and regulate:
- The use of behavioral surplus as free raw material
- The manufacture of prediction products
- The sale of prediction products
- The use of prediction products in third-party operations of modification, influence, and control
- Excessive concentrations of the new means of production
Ultimately, such interventions are necessary for society, and for people, to restore capitalism’s healthy evolution. As technology keeps advancing, the dilemma of our privacy will only deepen: how far might companies go with our privacy and information as technological advances allow them to gain still more power? Surveillance capitalism is a problem society should be indignant about, because it demeans human dignity. The resolution of this dilemma rests partly with ordinary people, who are drawn to these frontier practices, and partly with high-ranking authorities, elected officials, and policymakers who, grounding their authority in the foundational values of a democratic state, must confront these problems to protect the privacy of society. For compliance induced by dependency is no social contract, and freedom from uncertainty is not freedom.