“There’s no data like more data.” Robert Mercer (billionaire associated with funding Cambridge Analytica, Breitbart News, and Donald Trump’s 2016 presidential campaign).
Computer profiles generate objects for surveillance: they instruct or train the observer in what to watch and how to watch for it. Police, psychiatrists, educators, and physicians, to name just a few groups, increasingly use profiling technology for early or pre-identification of various traits within preselected populations. If you match enough elements of the profile, you can become a target even before any trait has manifested itself. To prepare the observer, to train the observer to see, and ultimately to serve as the observer’s “sharp eyes”: this is the imaginary of the simulation of surveillance, a future of precog algorithms that predict crime or political resistance before it happens.
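To make this logic concrete, the following is a deliberately crude sketch, in Python, of how profile matching can work in principle: a weighted checklist with a threshold. The field names, weights, and cutoff are invented for illustration and do not describe any actual agency’s system.

```python
# Hypothetical sketch of profile-based flagging: a weighted checklist plus a
# threshold. Field names, weights, and the cutoff are invented for illustration;
# real systems are opaque and far more elaborate.

PROFILE = {
    "visited_flagged_location": 2.0,
    "contact_with_flagged_person": 3.0,
    "purchased_flagged_item": 1.0,
    "matched_demographic_category": 1.5,
}
THRESHOLD = 4.0  # arbitrary cutoff: enough partial matches add up to a "target"

def profile_score(person: dict) -> float:
    """Sum the weights of every profile element the person matches."""
    return sum(weight for key, weight in PROFILE.items() if person.get(key))

def is_flagged(person: dict) -> bool:
    """A person can be flagged before any 'trait' manifests, purely by accumulation."""
    return profile_score(person) >= THRESHOLD

if __name__ == "__main__":
    person = {"visited_flagged_location": True, "contact_with_flagged_person": True}
    print(profile_score(person), is_flagged(person))  # 5.0 True
```

The point of the sketch is that no single element is incriminating; the “target” is produced by the accumulation of matches against a profile defined in advance by the observer.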
Facial recognition in Xinjiang, China
Surveillance technologies have enabled two major modes of control and manipulation. One is political and authoritarian; the other is a vector of consumerism and monopolistic market power. Both have been made possible by the vast amount of information collected on the web, by visual monitoring, and by new techniques for sorting that information, for example facial recognition software, or analysis of “big data” that can be used to predict or modify personal behavior. Big data is an umbrella term that encompasses both the scale of data collected, often from sources such as smart city sensors, and the technologies used to analyze it, including artificial intelligence, machine learning, and deep learning.
The most widely publicized authoritarian model has developed in China, while the United States is currently better known for the successes of its “surveillance capitalism”, concentrated in companies such as Facebook, Amazon, Apple, and Microsoft.
In China, an extensive system of cameras monitors the activity of people in cities. One of the most invasive and discriminatory uses of this system is in the monitoring of “sensitive peoples”, such as Uighurs and Tibetans, bringing together facial recognition, DNA information, and other data to profile and often detain members of these groups.
China’s project with AI is “to twist a technology that feeds on freewheeling information to fit neatly into China’s constrained information bubble.” In its internal documents, the CCP says that it will use AI to shape reality and tighten its grip on power within its borders: for political repression, surveillance, and the monitoring of dissent. It has already launched a chatbot trained entirely on Xi Jinping’s political and economic philosophy, “Xi Jinping Thought on Socialism with Chinese Characteristics for a New Era”. China’s vision for the future of AI is closed-source, tightly controlled, and available for export all around the world. “It is a warning. The digital curtain AI can build in our imaginations will be much more impenetrable than iron, making it impossible for societies to cooperate in a shared future.” (from Business Insider)
If the techniques of surveillance lend themselves to authoritarian control and political manipulation, the information that activity on the internet produces has also enabled the rise of “surveillance capitalism” and the market dominance of companies like Google, Facebook, Amazon, and Netflix, to name a few. Information is a commodity because it provides advantage and, coupled with categorisation, can generate new knowledges. The circulation of information provides sources of profit, and the databases that store this information are the apparatus of ‘new surveillance’ in a ‘control society’. “Dataveillance” is a term used to describe the automated storage and categorisation of information, which does not replace surveillance but amplifies it and allows for the ability to ‘mine’ data, making associations between different bits of information to create entirely new knowledge of the individual or group.
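As a minimal sketch of what ‘mining’ associations can look like in practice, the following hypothetical Python example joins two separately collected data sets (a purchase log and a location log, both invented here) on a shared identifier and derives a new attribute about a person that neither source contains on its own.

```python
# Hypothetical illustration of dataveillance-style "mining": two separately
# collected data sets, joined on a shared identifier, yield an inference that
# neither data set contains by itself. All data, fields, and rules are invented.

purchases = [  # e.g. from a loyalty-card programme
    {"user_id": "u42", "item": "prenatal vitamins"},
    {"user_id": "u42", "item": "unscented lotion"},
    {"user_id": "u07", "item": "coffee"},
]

locations = [  # e.g. from a phone app's location history
    {"user_id": "u42", "place": "maternity clinic"},
    {"user_id": "u07", "place": "gym"},
]

def infer_categories(purchases, locations):
    """Join the two sources on user_id and derive a new label per user."""
    profile = {}
    for row in purchases:
        profile.setdefault(row["user_id"], {"items": [], "places": []})["items"].append(row["item"])
    for row in locations:
        profile.setdefault(row["user_id"], {"items": [], "places": []})["places"].append(row["place"])

    inferred = {}
    for user, data in profile.items():
        # A crude rule: a particular combination of purchases and places produces
        # an entirely new (and possibly wrong) piece of "knowledge" about the person.
        if "prenatal vitamins" in data["items"] and "maternity clinic" in data["places"]:
            inferred[user] = "likely expecting a child"
    return inferred

print(infer_categories(purchases, locations))  # {'u42': 'likely expecting a child'}
```

The new ‘knowledge’ here exists in neither database; it is manufactured by the association itself, which is what makes dataveillance more than the sum of its records.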
In her book The Age of Surveillance Capitalism, Shoshana Zuboff documents the ways in which corporations monetized the “data exhaust”, the “breadcrumbs” left behind by user activity, into a massive source of wealth and power that they declared to be theirs for the taking. According to Zuboff, Google and others “asserted their rights to bypass our awareness, to take our experience and transform it into data, to produce strategies and tactics that keep us ignorant of their practices, and to insist on the conditions of lawlessness required for these operations.” Apparently useless information could be combined with powerful new analytic capabilities to produce predictions of user behavior (p. 338). Our behavior, once unobservable, was declared free for the taking, theirs to own, and theirs to decide how to use and how to profit from. Since then, “surveillance capitalists have shifted from using automated machine processes to know about your behavior to using machine processes to shape your behavior according to their interests.”
In an essay on “Platform Monopolies and the Political Economy of AI”, Nick Srnicek notes that, perhaps surprisingly, much of the latest AI research and software is open-source and freely available to anyone who wants to use it. But as he points out, “Any company may be able to acquire and develop the same basic face-recognition software as Facebook, but only Facebook has access to the billions of uploaded photos that can be used to train those algorithms into the most accurate recognition software available.”
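A toy experiment can make Srnicek’s point concrete. The sketch below (using scikit-learn and synthetic data, choices of mine rather than anything in his essay) trains an identical off-the-shelf classifier on progressively larger samples of the same task; the code never changes, only the volume of data does, and accuracy tends to rise with it.

```python
# Hypothetical sketch of the "data advantage": the same off-the-shelf model,
# trained on larger and larger slices of one task, generally keeps improving.
# The synthetic data and scikit-learn's LogisticRegression stand in for
# "the same basic software"; only the amount of training data changes.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# One fixed task, one fixed model.
X, y = make_classification(n_samples=50_000, n_features=40, n_informative=15,
                           random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.2,
                                                  random_state=0)

for n in (100, 1_000, 10_000, 40_000):
    model = LogisticRegression(max_iter=1_000)   # identical "open-source" model
    model.fit(X_pool[:n], y_pool[:n])            # the only variable: data volume
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>6} examples -> test accuracy {acc:.3f}")
```

Whoever controls the largest pool of training data wins this race even when everyone runs the same code, which is the heart of the platform monopoly argument.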
One form of resistance to surveillance has been called “sousveillance”, or watching from below. “It is a form of inverse surveillance in which people monitor the surveillors.” (Jan Fernback, Temple University)
A class-action suit brought against Facebook in Illinois, where privacy laws are strong, led to a multi-million-dollar settlement over Facebook’s scanning and tagging of faces without obtaining permission beforehand. However, this case seems to have been an exception.
A respirator mask that enables iPhone facial recognition