OCT 1 00:00-OCT 16 00:00
Aomori Contemporary Art Centre
As part of the exhibition Material and Mechanism
October 25 – December 14, 2014
“If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place.” – Eric Schmidt (CEO, Google)
The rise of data-collection devices and the big data industry, both in government surveillance and in private corporate data-mining markets, has in recent years radically restructured the individual's relation to the state, the public sphere and the corporate sector. These shifts have reframed individuals as generators of data first (via consumption, movement, social media and interaction with public and private markets) and citizens second. Data has emerged as the most valuable commodity to both states and corporate bodies in the new economy, allowing a radically new approach to understanding the world.

“What are people worried about? What is the problem? Are you doing something you're not supposed to?” – Trent Lott (Former U.S. Senator)

No longer is data acquired in small batches, forcing costly and complicated work by statisticians culling samples for correlations and patterns; it is now acquired en masse, reducing analysis to a matter of processing power. This shift has been facilitated by the restructuring of our private and public selves by emerging self-documentation technologies (such as the Fitbit and the Narrative Clip), as well as by social media platforms that have inculcated vast portions of the computer-using public into accepting a new paradigm of radical publicness.
Individuals on social media platforms now readily publish data that once required enormous financial and infrastructural investment by state and corporate bodies to obtain. Whatever data is not given freely is simply taken, through long and unreadable terms-of-service agreements or through constitutional and extra-constitutional intrusions by state bodies. The readiness to relinquish data about the self stems largely from a lack of understanding of the value and implications of such data, or from a mistaken belief that data surveillance programs target only ‘wrong-doers’. This is the ‘Nothing to Hide Argument’, which reframes privacy as secrecy, undoing a historically rooted notion of privacy as an inherent right: one concerned with keeping aspects of selfhood beyond the scrutiny and control of state bodies.
As Bruce Schneier, a data security expert and cryptographer, notes: “Privacy protects us from abuses by those in power, even if we're doing nothing wrong at the time of surveillance... For if we are observed in all matters, we are constantly under threat of correction, judgment, criticism, even plagiarism of our own uniqueness. We become children, fettered under watchful eyes, constantly fearful that -- either now or in the uncertain future -- patterns we leave behind will be brought back to implicate us, by whatever authority has now become focused upon our once-private and innocent acts. We lose our individuality, because everything we do is observable and recordable.”
Installation View Displaying Statistical Motion Analysis, Digital Print on Vinyl. Photo Credit: Tadasu Yamamoto Courtesy of: ACAC
While data may not be intrinsically harmful, its interpretation by analysts introduces layers of subjectivity and mediation, producing narratives coherent with worldviews that perpetuate the legitimation of securitization. Bruce Schneier aptly quotes Cardinal Richelieu: “If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged.” Examples have surfaced in Canada, for instance, where CSEC (Communications Security Establishment Canada) maintains surveillance of environmental and indigenous rights activists and advocates because their work is ideologically opposed to the federal mandate of resource extraction and the exploitation of the Tar Sands, an oil extraction project with profound environmental implications. It is the right of association and the right to hold contrary opinions, and through them democracy and civil society, that these surveillance and data extraction practices threaten. Beyond civil liberties, there is the question of how data acquired by private companies via social media, wearable technologies, etc. can be used to persecute individuals. If, for instance, a Fitbit user's data revealed increasing inactivity, a health insurance company could acquire that data, flag the user's elevated risk of health problems (due to inactivity), and raise their premiums.
Statistical Motion Analysis/Data Immersion
In the project OCT 1 00:00-OCT 16 00:00 I make use of various wearable technologies as methods of self-documentation, gathering vast amounts of data on my body and how it interfaces with digital platforms, and formulating a data self-portrait over a period of 15 days (Oct 1st 12:00am – Oct 16th 12:00am). The first method of data generation is the Fitbit Flex, a wearable technology worn on the wrist that monitors the body's activity 24 hours a day, 7 days a week. The Fitbit generates statistics on how many steps you take, the calories you burn, the floors you climb and your patterns of sleep. This data is supplemented by additional mobile apps and self-inputted variables such as water and food intake. Ostensibly the device is used to set personal fitness goals and monitor their achievement or failure. I use it to monitor two variables, calories burned (as an indication of movement) and sleep patterns (deep sleep, shallow sleep, wakefulness), and log these variables in a series of tables that 3D imaging software translates into 3D landscapes representing them over the passage of time. This data was then printed as two vinyl banner works titled Statistical Motion Analysis (one for sleep, one for movement), mounted on a central ‘data centre’, a scaffolding tower. The two landscapes are also incorporated as the terrain (movement) and sky (sleep) elements of an abstract computer game titled Data Immersion, allowing visitors to navigate the landscape of data generated by my body.
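The translation from logged tables to terrain can be pictured as a simple heightmap: each time sample becomes a row of the grid, and the recorded value sets that row's elevation. The sketch below is illustrative only; the project's actual 3D imaging software and its import format are not specified, and the sample values are invented, not drawn from the logged Fitbit data.

```python
# Sketch: turn a time series of logged values (e.g. calories burned per
# hour) into a heightmap grid that 3D software could render as terrain.
# The grid width and the sample data are illustrative assumptions.

def heightmap(samples, width=32):
    """Repeat each sample across a row so the series reads as ridges."""
    return [[float(value)] * width for value in samples]

calories_per_hour = [52, 48, 50, 95, 210, 180, 130, 75]  # invented data
terrain = heightmap(calories_per_hour, width=4)
print(len(terrain), len(terrain[0]))  # 8 rows x 4 columns
```

Peaks in such a grid then correspond directly to bursts of activity (or, for the second banner, phases of sleep), which is what lets a viewer read the vinyl landscapes as a record of the body over time.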
Data Immersion, Computer Game, Monitor, Mac mini, Joystick, and Headphones. [Download and Play]
Screenshot From Video Game Data Immersion [Download and Play]
The second data generation method is a wearable technology called Narrative Clip, a ‘narrative camera’ that is clipped to articles of clothing and photographs my ‘narrative’ (daily experiences) every 30 seconds, storing the vast quantity of images on servers for viewing via the Narrative Clip app. One cannot simply upload the photos to one's own computer; they must first pass through servers in an undisclosed location before the user can access them. Wearers thus give not only geolocational data but actual visual documentation of themselves to undisclosed parties, with the very real possibility (given historical and ongoing cases) of states and corporate bodies accessing this visual data set. For OCT 1 00:00-OCT 16 00:00 I wore a Narrative Clip from morning to night over the 15-day documentation period, and for the final installation presented a cell phone running the Narrative Clip app, showcasing the entirety of the images for visitors to peruse. I also gathered all of the metadata recorded by the device over the 15 days and printed it out in its entirety, affixing the massive print to the side of the ‘data centre’ tower so that visitors could visualize the sheer quantity of data gathered by this device.
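The scale of that archive follows from simple arithmetic. The exact wearing hours are not stated beyond "morning to night", so the 16 waking hours per day below is an assumption, not a figure from the project:

```python
# Rough count of Narrative Clip images over the documentation period.
# The clip fires every 30 seconds; 16 waking hours/day is an assumption.
shots_per_hour = 3600 // 30           # 120 photos per hour
shots_per_day = shots_per_hour * 16   # under the 16-hour assumption
total = shots_per_day * 15            # 15-day documentation period
print(total)
```

On that assumption the period yields tens of thousands of images, each with its own metadata record, which is what makes the printed metadata dump so massive.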
The third method of self-documentation compiles data generated by Lightbeam, a Mozilla Firefox add-on that monitors third-party access to your browsing and the storage of cookies on your computer. The work consists of compilations of screenshots from the program, profiling each and every third-party company that accessed my computer over the documentation period; each profile includes the company's name, location, date of access, and the primary websites that provided access. This series of company profiles is supplemented by a constellation view of all the primary websites I accessed and the causal linkages between them and the third-party companies. These visual catalogues of Lightbeam data are presented on a series of screens mounted to the ‘data centre’ tower, allowing audiences to scrutinize these pathways of data. The work extends the narrative of OCT 1 00:00-OCT 16 00:00 by charting not only the data willingly forfeited by myself via wearable technologies but also the channels by which data is surreptitiously mined from my movements and interactions with both online and IRL environments, bringing into question who can access the totality of the generated data.
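The constellation view is, in structure, a bipartite graph: first-party sites visited on one side, the third-party domains they caused the browser to contact on the other. A minimal sketch of that structure follows; the domain names are invented examples, not companies from the project's actual Lightbeam data:

```python
# Sketch of Lightbeam-style data: map each first-party site visited to
# the third-party domains it caused the browser to contact, then find
# third parties present on more than one site. All domains are invented.
from collections import Counter, defaultdict

connections = [                      # (visited site, third party) pairs
    ("news.example", "ads.tracker.example"),
    ("news.example", "cdn.metrics.example"),
    ("shop.example", "ads.tracker.example"),
]

graph = defaultdict(set)
for site, third_party in connections:
    graph[site].add(third_party)

# A third party reached from multiple sites can correlate visits across them.
counts = Counter(tp for _, tp in connections)
cross_site = sorted(tp for tp, n in counts.items() if n > 1)
print(cross_site)  # ['ads.tracker.example']
```

The cross-site set is the critical one: those are the companies positioned to assemble a browsing profile spanning otherwise unrelated sites, which is precisely the pathway the mounted screens invite audiences to scrutinize.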
Lightbeam Constellation View Screenshot, MOV files, Media Players and Monitors. Displaying primary websites accessed and the causal linkages between them and third-party companies that then gained access to data.
Hidden in Plain Sight
The final work is a wall text titled Hidden in Plain Sight, which quotes Eric Schmidt, the CEO of Google, illustrating the fallacious ‘Nothing to Hide’ argument: “If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place.” The statement is particularly questionable coming from an individual whose corporate model relies almost entirely on the trafficking of data. The wall text is rendered in the aesthetic of a CAPTCHA (an acronym for "Completely Automated Public Turing test to tell Computers and Humans Apart"), a type of challenge-response test used in computing to distinguish humans from bots, or rather, data-creators from data-miners.
These practices are protected by problematic legislation such as Bill C-51, which labels as terrorists individuals who interfere with ‘economic stability or critical infrastructure’, a field so broadly defined that it could easily be used to implicate activists.