boundary 2

Tag: Jussi Parikka

  • Zachary Loeb — Is Big Data the Message? (Review of Natasha Lushetich, ed., Big Data—A New Medium?)

    a review of Natasha Lushetich, ed. Big Data—A New Medium? (Routledge, 2021)

    by Zachary Loeb

    When discussing the digital, conversations can quickly shift towards talk of quantity. Just how many images are being uploaded every hour, how many meticulously monitored purchases are being made on a particular e-commerce platform every day, how many vehicles are being booked through a ride-sharing app at 3 p.m. on a Tuesday, how many people are streaming how many shows/movies/albums at any given time? The specific answer to the “how much?” and “how many?” will obviously vary depending upon the rest of the question, yet if one wanted to give a general response across these questions it would likely be fair to answer with some version of “a heck of a lot.” Yet from this flows another, perhaps more complicated and significant question, namely: given the massive amount of information being generated by seemingly every online activity, where does all of that information actually go, and how is that information rendered usable and useful? To this the simple answer may be “big data,” but this in turn just serves to raise the question of what we mean by “big data.”

    “Big data” denotes the point at which data begins to be talked about in terms of scale, not merely gigabytes but zettabytes. And, to be clear, a zettabyte represents a trillion gigabytes—and big data is dealing with zettabytes, plural. Beyond the sheer scale of the quantity in question, considering big data “as process and product” involves a consideration of “the seven Vs: volume” (the amount of data previously generated and newly generated), “variety” (the various sorts of data being generated), “velocity” (the highly accelerated rate at which data is being generated), “variability” (the range of types of information that make up big data), “visualization” (how this data can be visually represented to a user), “value” (how much all of that data is worth, especially once it can be processed in a useful way), and “veracity” (3) (the reliability, trustworthiness, and authenticity of the data being generated). In addition to these “seven Vs” there are also the “three Hs: high dimension, high complexity, and high uncertainty” (3). Granted, “many of these terms remain debatable” (3). Big data is both “process and product” (3); its applications range from undergirding the sorts of real-time analysis that make it possible to detect viral outbreaks as they are happening, to the directions app that suggests an alternative route before you hit traffic, to the recommendation software (be it banal or nefarious) that forecasts future behavior based on past actions.

    To the extent that discussions around the digital generally focus on the end results of big data, the means remain fairly occluded both from public view and from many of the discussants. And while some have largely accepted big data as an essential aspect of our digital lives, for many others it remains highly fraught.

    As Natasha Lushetich notes, “in the arts and (digital) humanities…the use of big data remains a contentious issue not only because data architectures are increasingly determining classificatory systems in the educational, social, and medical realms, but because they reduce political and ethical questions to technical management” (4). And it is this contentiousness that is at the heart of Lushetich’s edited volume Big Data—A New Medium? (Routledge, 2021). Drawing together scholars from a variety of different disciplines ranging across “the arts and (digital) humanities,” this book moves beyond an analysis of what big data is to a complex consideration of what big data could be (and may be in the process of currently becoming). In engaging with the perils and potentialities of big data, the book (as its title suggests) wrestles with the question of whether big data can be seen as constituting “a new medium.” Through engaging with big data as a medium, the contributors to the volume grapple not only with how big data “conjugates human existence” but also how it “(re)articulates time, space, the material and immaterial world, the knowable and the unknowable; how it navigates or alters, hierarchies of importance” and how it “enhances, obsolesces, retrieves and pushes to the limits of potentiality” (8). Across four sections, the contributors take up big data in terms of knowledge and time, use and extraction, cultural heritage and memory, as well as people.

    “Patterning Knowledge and Time” begins with a chapter by Ingrid M. Hoofd that places big data in the broader trajectory of the university’s attempt to make the whole of the world knowable. Considering how “big data renders its object of analysis simultaneously more unknowable (or superficial) and more knowable (or deep)” (18), Hoofd’s chapter examines how big data replicates and reinforces the ways in which that which becomes legitimated as knowable is precisely that which can be known through the university’s (and big data’s) techniques. Following Hoofd, Franco “Bifo” Berardi provocatively engages with the power embedded in big data, treating it as an attempt to assert computerized control over a chaotic future by forcing it into a predictable model. Here big data is treated as a potential constraint wherein “the future is no longer a possibility, but the implementation of a logical necessity inscribed in the present” (43), as participation in society becomes bound up with making one’s self and one’s actions legible and analyzable to the very systems that enclose one’s future horizons. Shifting towards the visual and the environmental, Abelardo Gil-Fournier and Jussi Parikka consider the interweaving of images and environments and how data impacts this. As Gil-Fournier and Parikka explore, as a result of developments in machine learning and computer vision “meteorological changes” are increasingly “not only observable but also predictable as images” (56).

    The second part of the book, “Patterning Use and Existence,” starts with Btihaj Ajana reflecting on the ways in which “surveillance technologies are now embedded in our everyday products and services” (64). By juxtaposing the biometric control of refugees with the quantified-self movement, Ajana explores the datafication of society and the differences (as well as similarities) between willing participation and forced participation in regimes of surveillance of the self. Highlighting a range of well-known gig-economy platforms (such as Uber, Deliveroo, and Amazon Mechanical Turk), Tim Christiaens examines the ways that “the speed of the platform’s algorithms exceeds the capacities of human bodies” (81). While offering a thorough critique of the inhuman speed imposed by gig-economy platforms/algorithms, Christiaens also offers a hopeful argument for the possibility that by making their software open source some of these gig platforms could “become a vehicle for social emancipation instead of machinic subjugation” (90). While aesthetic and artistic considerations appear in earlier chapters, Lonce Wyse’s chapter pushes fully into this area by looking at the ways that deep learning systems create the sorts of works of art “that, when recognized in humans, are thought of as creative” (95). Wyse provides a rich, and yet succinct, examination of how these systems function while highlighting the sorts of patterns that emerge (sometimes accidentally) in the process of training these systems.

    At the outset of the book’s third section, “Patterning Cultural Heritage and Memory,” Craig J. Saper approaches the magazine The Smart Set as an object of analysis and proceeds to zoom in and zoom out to show what is revealed and what is obfuscated at different scales. Highlighting that “one cannot arbitrarily discount or dismiss particular types of data, big or intimate, or approaches to reading, distant or close,” Saper’s chapter demonstrates how “all scales carry intellectual weight” (124). Moving away from the academic and the artist, Nicola Horsley’s chapter reckons with the work of archivists and the ways in which their intellectual labor and the tasks of their profession have been challenged by digital shifts. While archival training teaches archivists that “the historical record, on which collective memory is based, is a process not a product” (140), a lesson they seek to convey in their interactions with researchers, Horsley considers the ways in which the shift away from the physical archive and towards the digital archive (wherein a researcher may never directly interact with an archivist or librarian) means this “process” risks going unseen. From the archive to the work of art, Natasha Lushetich and Masaki Fujihata’s chapter explores Fujihata’s project BeHere: The Past in the Present and how augmented reality opens up the space for new artistic experience and challenges how individual memory is constructed. Through its engagement with “images obtained through data processing and digital frottage,” the BeHere project reveals “new configurations of machinically (rather than humanly) perceived existents” and thus can “shed light on that which eludes the (naked) human eye” (151).

    The fourth and final section of the volume begins with Dominic Smith’s exploration of the aesthetics of big data. While referring back to the “seven Vs” of big data, Smith argues that to imagine big data as a “new medium” requires considering “how we make sense of data” in regard to both “how we produce it” and “how we perceive it” (164), a matter which Smith explores through an analysis of the “surfaces and depths” of oceanic images. Though big data is closely connected with sheer scale (hence the “big”), Mitra Azar observes that “it is never enough as it is always possible to generate new data and make more comprehensive data sets” (180). Tangling with this in a visual register, Azar contrasts the cinematic point of view with that of the big-data-enabled “data double” of the individual (which is meant to stand in for that user). Considering several of his own artistic installations—Babel, Dark Matter, and Heteropticon—Simon Biggs examines the ways in which big data reveals “the everyday and trivial and how it offers insights into the dense ambient noise that is our daily lives” (192). In contrast to treating big data as a revelator of the sublime, Biggs discusses big data’s capacity to show “the infra-ordinary” and the value of seemingly banal daily details. The book concludes with Warren Neidich’s speculative gaze toward what the future of big data might portend, couched in a belief that “we are at the beginning of a transition from knowledge-based economics to a neural or brain-based economy” (207). Surveying current big data technologies and the trajectories they may suggest, Neidich forecasts “a gradual accumulation of telepathic technocueticals” such that “at some moment a critical point might be reached when telepathy could become a necessary skill for successful adaptation…similar to being able to read in today’s society” (218).

    In the introduction to the book, Natasha Lushetich grounds the discussion in a recognition that “it is also important to ask how big data (re)articulates time, space, the material and immaterial world, the knowable and the unknowable; how it navigates or alters, hierarchies of importance” (8), and over the course of this fascinating and challenging volume, the many contributors do just that.

    ***

    The term big data captures the way in which massive troves of digitally sourced information are made legible and understandable. Yet one of the challenges of discussing big data is trying to figure out a way to make big data itself legible and understandable. In discussions around the digital, big data is often gestured at rather obliquely as the way to explain a lot of mysterious technological activity in the background. We may not find ourselves capable, for a variety of reasons, of prying open the various black boxes of a host of different digital systems, but stamped in large letters on the outside of those boxes are the words “big data.” When shopping online or using a particular app, a user may be aware that the information being gathered from their activities is feeding into big data and that the recommendations being promoted to them come courtesy of the same. Or they may be obliquely aware that there is some sort of connection between the mystery-shrouded algorithms and big data. Or the very evocation of “big” when twinned with a recognition of surveillance technologies may serve as a discomforting reminder of “big brother.” Or “big data” might simply sound like a non-existent episode of Star Trek: The Next Generation in which Lieutenant Commander Data is somehow turned into a giant. All of which is to say that though big data is not a new matter, the question of how to think about it (which is not the same as how to use and be used by it) remains a challenging issue.

    With Big Data—A New Medium?, Natasha Lushetich has assembled an impressive group of thinkers to engage with big data in a novel way. By raising the question of big data as “a new medium,” the contributors shift the discussion away from considerations focused on surveillance and algorithms to wrestle with the ways that big data might be similar to, and distinct from, other mediums. While this shift does not represent a rejection of, or a move to ignore, the important matters related to issues like surveillance, the focus on big data as a medium raises a different set of questions. What are the aesthetics of big data? As a medium, what are the affordances of big data? And what does it mean for other mediums that in the digital era so many of them are themselves being subsumed by big data? After all, so many of the older mediums that theorists have grown so accustomed to discussing have undergone some not insignificant changes as a result of big data. And yet to engage with big data as a medium also opens up a potential space for engaging with big data that does not treat it as being wholly captured and controlled by large tech firms.

    The contributors to the volume do not seem to be fully in agreement with one another about whether big data represents poison or panacea, but the chapters are clearly speaking to one another instead of shouting over each other. There are certainly some contributions to the book, notably Berardi’s, with its evocation of a “new century suspended between two opposite polarities: chaos and automaton” (44), that seem a bit more pessimistic. Other contributors, such as Christiaens, engage with the unsavory realities of contemporary data-gathering regimes but envision the ways that these can be repurposed to serve users instead of large companies. And such optimistic and pessimistic assessments come up against multiple contributions that eschew such positive/negative framings in favor of an artistically minded aesthetic engagement with what it means to treat big data as a medium for the creation of works of art. Taken together, the chapters in the book provide a wide-ranging assessment of big data, one which is grounded in larger discussions around matters such as surveillance and algorithmic bias, but which pushes readers to think of big data beyond those established frameworks.

    As an edited volume, one of the major strengths of Big Data—A New Medium? is the way it brings together perspectives from such a variety of fields and specialties. As part of Routledge’s “studies in science, technology, and society” series, the volume demonstrates the sort of interdisciplinary mixing that makes STS such a vital space for discussions of the digital. Granted, this very interdisciplinary richness can be as much burden as benefit, as some readers will wish there had been slightly more representation of their particular subfield, or wish that the scholarly techniques of a particular discipline had seen greater use. Case in point: Horsley’s contribution will be of great interest to those approaching this book from the world of libraries and archives (and information schools more generally), and some of those same readers will wish that other chapters in the book had been equally attentive to the work done by archive professionals. Similarly, those who approach the book from fields more grounded in historical techniques may wish that more of the authors had spent time engaging with “how we got here” instead of focusing so heavily on the exploration of the present and the possible future. Of course, these are always the challenges with edited interdisciplinary volumes, and it is a major credit to Lushetich as an editor that this volume provides readers from so many different backgrounds with so much to mull over. Beyond presenting numerous perspectives on the titular question, the book is also an invitation to artists and academics to join in discussion about that question.

    Those who are broadly interested in discussions around big data will find much of significance in this volume, and will likely find their own thinking pushed in novel directions. That being said, this book will likely be most productively read by those who are already somewhat conversant in debates around big data, the digital humanities, the arts, and STS more generally. While the contributors are consistently careful in clearly defining their terms and referencing the theorists from whom they are drawing, from Benjamin to Foucault to Baudrillard to Marx to Deleuze and Guattari (to name but a few), they couch much of their commentary in theory, and a reader of this volume will be best able to engage with these chapters if they have at least some passing familiarity with those theorists themselves. Many of the contributors are also clearly engaging with arguments made by Shoshana Zuboff in The Age of Surveillance Capitalism, and this book can be very productively read as a critique of, and complement to, Zuboff’s tome. Academics in and around STS, and artists who incorporate the digital into their practice, will find that this book makes a worthwhile intervention into current discourse around big data. And though the book seems to assume a fairly academically engaged readership, it will certainly work well in graduate seminars (or advanced undergraduate classrooms)—many of the chapters will stand quite well on their own, though much of the book’s strength is in the way the chapters work in tandem.

    One of the claims that is frequently made about big data is that—for better or worse—it will allow us to see the world from a fresh perspective. And what Big Data—A New Medium? does is allow us to see big data itself from a fresh perspective.

    _____

    Zachary Loeb earned his MSIS from the University of Texas at Austin, an MA from the Media, Culture, and Communications department at NYU, and is currently a PhD candidate in the History and Sociology of Science department at the University of Pennsylvania. Loeb works at the intersection of the history of technology and disaster studies, and his research focusses on the ways that complex technological systems amplify risk, as well as the history of technological doom-saying. He is working on a dissertation on Y2K. Loeb writes at the blog Librarianshipwreck, and is a frequent contributor to The b2o Review Digital Studies section.

  • Tony D. Sampson and Jussi Parikka — The New Logics of Viral Media

    This essay is a part of the COVID-19 dossier, edited by the b2o editorial staff. 

    by Tony D. Sampson and Jussi Parikka

    Up until recently, work on a universal theory of virality seemed to always cut a somewhat marginal figure in media theory. In the early 2000s, when we first started to publish articles referring to digital contagions, immunology, epidemiology and viral networks, it was no surprise to us that although our claim to universality seemed significant, it would remain of ancillary concern to mainstream media theory. After all, media and communication studies were supposed to be about establishing connection, not the opposite of it! We were regularly questioned about our use of a ‘viral metaphor’ and what it meant to the development of a new model of digital media. The hyperbolic focus on viral marketing did not make it any easier for us to argue that there were deeper material levels of virality that required immediate attention.

    However, now, all of a sudden, unpredictably, and rather shockingly, viral media stands at the centre of contemporary issues materially, economically, and socially. In the wake of global uncertainty and anxiety caused by the uncontainable spread of Covid-19, there has been an abrupt move to the viral – from the margin to the middle. As we are all now discovering, Covid-19 is an epochal pandemic. The health and survival of massive-scale populations are at stake, engendering panicked political responses and exposing the underlying impact of years of austerity in public policy, not least in healthcare. Virality is, as such, both entirely relevant and resolutely non-metaphorical.

    This outbreak has also, understandably, drawn urgent attention to the workings of a viral logics that criss-crosses from biological to cultural, technological and economic contexts. We can now all see how, through sometimes direct experiences, universal virality becomes a techno-social condition of proximity and distance, accident and security, communication and communication breakdown. Indeed, it is in the current context of Covid-19 that our understanding of the movement of people and messages is framed by the logics of quarantine and confinement, security and prevention. Furthermore, virality automates affective reactions and imitative behaviours that relate to different visceral registers of experience compared to those assumed to inform the logic of the market. Which is to say, the mainstream cognitive models that are supposed to support the failing economic model of rational choice (if indeed anyone really ever believed in Homo Economicus) are replaced by seemingly irrational and uncontrollable financial contagion. Moreover, recent outbreaks of panic buying of toilet roll and paracetamol, some of which have been sparked by the global proliferation of Instagram images of empty supermarket shelves, are spreading alongside the early scenes of isolated Italians impulsively bursting into songs of solidarity and support from their balconies, followed by similar scenes in many other countries and cities. All of these are peculiar contagions because, it would seem, they are interwoven with contagions of psychological fear, anxiety, conspiracy and further financial turmoil, all triggered by the indeterminate spread of Covid-19.

    To think these contagions through in a media theory frame is, for a number of reasons, a complex task. We are, after all, dealing with an ecology of technological, biological, and affective realities moving about in strange feedback loops. Contagious agents are not simply biological; their agency always arrives in plurality.

    Future predictions are taking place against a backdrop of contested epidemiological models, reliant on, for example, the uncertain thresholds of herd immunity or total social lockdown. Certainly, following a sustained period of comparatively stable risk assessment, mostly based on known knowns and known unknowns, we have just entered a vital, possibly game changing phase in which unknown unknowns will prescribe the near future.

    We have to concede that, from the outset, the universality of our viral logics has itself been contested. There have been at least two other models of media virus that we know of. Whether or not it was the first to do so seems rather inconsequential now, but Douglas Rushkoff’s Media Virus, published back in 1994, proposed an early viral model that could be harnessed to manipulate the new media. The information-virus, and later concepts of spreadable media, perceptively challenged the assumed entrenchments of the old ideological state apparatus model of media, pointing toward a novel McLuhanesque participatory culture. We can, perhaps, in retrospect, trace the celebratory nature of this viral logics all the way to the fantasy of revolutionary social media contagions during the Arab Spring.

    The second media virus appeared in the early noughties. It was extracted from a few loose remarks made in the latter pages of Richard Dawkins’s neo-Darwinian Selfish Gene thesis of 1976. In Susan Blackmore’s neo-Darwinian Meme Machine, for example, we find a media virus which functions according to an evolutionary algorithm. The neo-Darwinian meme doctrine emerged in various millennial discourses, mostly those associated with the rhetoric of viral marketing and the computer viruses/antivirus arms race. As some viral marketers claimed, contagion may seem accidental, but the pass-on-power of a media message could be memetically encoded (and harnessed) to spread as determined.

    The universality of the third media virus – the one we proposed in the early 2000s – was intended to be more theoretically nuanced, certainly in regard to its approach to mechanisms and the question of whom or what does the harnessing. To begin with, our universal virus was more closely aligned to a viral event, or accident of contagion, than it was analogous to, or metaphorically related to, its biological counterpart. We could indeed learn more from the capriciousness of computer viruses than we would by merely looking for analogical relations. Accordingly, digital contagion provided insights into the modelling of the contagious behaviours of autonomous agents. Similarly, just as computer security became a core focus of digital media practices, the broader implications for virality in network culture also implied the shared legacy with epidemiology and its goal to simulate the spread of diseases. Multi-agent-based modelling was one context where contagions were initially allowed to spread, creating a bifurcated discursive formation between the burgeoning field of artificial life research, on one hand, and the tight link between measures of security and automation, on the other. Along these lines, then, early automated software processes were often grasped as artificial contagions that went beyond the human control of complex computational networks, requiring a further automated immunological response.

    Another aim of the universal virus was to reject biological or technological determinism in favour of a transversal contagion. In short, this meant that no one mechanism determined contagion since the relationality and accidentality of the viral event superseded deterministic thinking. Contagious behaviours are not solely  predetermined by an evolutionary code, as such. The universal virus also clearly relates to the complex array of unknown unknowns triggered by environmental interactions. Indeed, the vectors of contagion, and any subsequent security response to these environmental conditions, will prove to be effective only after the fact. These are paradoxical environments in which the mode of future predictions, based on existing models and reliant on historical data and assumptions, becomes at odds with the necessary open-ended nature of a shared communication network.

    Of course, the story of contagion modelling – either as epidemiological modelling or as conceptualising theoretical models – is not reducible to contemporary network culture. To better grasp the bizarre nature of the kinds of contagious loops we are experiencing with Covid-19, the universal virus also made significant references to nineteenth century contagion theory. Most notably we borrowed from Gabriel Tarde’s society of imitation thesis, which, like Paul Virilio’s work, focused on the accidents of mechanism, rather than a mechanism’s logic. Moreover, Tarde’s imitative social subjects were not the victims, but rather the products of contagion. It is, indeed, in the accidental relations of contagion, that Tarde’s subjects are continuously made and remade.

    Like the inexplicable behaviours of crazed shoppers panic buying toilet rolls in recent weeks, the subjectivities that are produced in Tarde’s society of imitation are conspicuously rendered docile sleepwalkers. However, Tarde’s many references to social somnambulism must not be misconstrued as an understanding of society founded entirely on collective stupidity. Importantly, his references to sleepwalking were informed by the absence, in his thinking, of any firm distinction between a biological nonconscious inclination and sociocultural tendencies to imitate. In other words, Tarde’s social subjects, including those that were supposed to be making rational economic judgements, are never self-contained. They are both, simultaneously, etched by the affect of others and leaking their own infectious affects. Again, following the logic of the universal virus, recent outbreaks of panic buying and seemingly irrational market trading are examples of further unpredictable automations of bodies and habits.

    Back in the early 2000s, we argued for a universal virus that made a resounding, yet subtle break from established media theory analysis of contagion, doggedly couched in representation. Viruses were not solely metaphorical, figurative or indeed myths that covered up an underlying ideological reality. Following the Covid-19 outbreak, the universal virus can certainly no longer be considered as a conjured-up fantasy, projection, or for that matter, in the current context, a crude biopolitical invention strategically placed to justify measures of containment. Although, for sure, there are multiple levels of political aims at play, not least in terms of the recurring question of immunological borders, the logic of this virus is now, for the time being, the overriding power dynamic. Far from providing a convenient allegory for action, the very real viral event of Covid-19 is currently producing its own reality according to which our habits and worlds must bend and adapt.

    Universal viruses are nonrepresentational in the sense that they make their own physical and metaphysical infrastructures of connectivity, while exposing the underlying social strata upon which – as epi–demos – they function. Along these lines, the legal theorist Andreas Philippopoulos-Mihalopoulos contends that Covid-19 presents a Spinozian contagion in terms of how bodies relate to each other and their environment. The “challenge of Covid” is, he argues, “monumentally ethical.” This is because the virus “demands of us to accept a quintessentially Spinozan ethics of positioning, of emplacing one’s body in a geography of awareness of how affects circulate between us and others.”[1] This viral patterning of habit and behaviour is no longer merely a question of homophilic identification (connecting to friends, parents, etc.), but radically expands to modes of connection and disconnection co-determined by collective bodies that are being positioned in relation to each other, to space, to borders, to containment, etc.

    The viral patterning of Covid-19 will continue to spur a range of actions, habits, behaviours and affects that might take a hold of bodies in more predictable or previously unimagined ways. Certainly, some of the pegs that fix the future of biopolitical movements of people and messages will no doubt produce more docile sleepwalkers. It is not surprising that the UK government initially opted for a neoliberal version of herd immunity in which collective obligation was pitched alongside business as usual. Even now, in its current state of belated lockdown, the UK’s unequal distribution of Covid testing sees leading political figures and royal family members prioritized over frontline health workers. In the US too, Trump’s reluctance to accept Covid-19’s utter disregard for capitalism seems to be making his country a deadly hub for infection. Indeed, what seems to unify the far-right at this moment is its propensity toward Covid-denial, exemplified by Trump and Bolsonaro’s regime in Brazil. Apparently, sales of guns and ammunition are soaring across the US as fears of Covid-19 prompt a bunker mentality and self-protection. It is also the case that the reported spread of the virus has been coupled to an intensification and extension of population racism. In the UK, again, the spread of so-called maskaphobia has led to many Chinese students having to choose between what sociologist Yinxuan Huang calls “two bad choices – insecurity (for coronavirus) and fear (for racism).”[2] Ultimately, urban spaces may well be redefined by state-controlled measures of social distancing, on one hand, or these kinds of fear-driven detachments, on the other; both of which clearly contrast with the themes of the classical sociology of cities, which grasped urban spaces as locales of dynamic collective density.

    The logic of the universal virus might also produce novel spatiotemporal realities for collective grassroots systems of care. In the wake of Covid-19, we are already witnessing more than the spontaneous emergence of songs of solidarity. Spain is currently nationalizing private hospitals; Iran is releasing political prisoners from jails. These are new spatiotemporal realities produced by Covid-19 that could counter the broader context of what Achille Mbembe has referred to as necropolitics. After the dark refrains of Trump, Brexit and subsequent intensifications of population racism, for example, the horror of Covid-19 might actually clear the way for some kind of large-scale radical reaction that addresses these recent corruptions of the global political scene and its role in quickening climate change and the biodiversity crisis. After the applause for brave health workers and the songs of the shutdown subside, painful social, economic and political struggles will inevitably follow the virus. How these struggles manifest against the shifting backdrop of disciplinary confinement and control by way of statistical inoculation and the abandonment of eradication remains to be seen.[3] New political assemblages might be triggered, at least temporarily. The question we need to ask now is: what are you doing after the lockdown? We do not mean this to be a catchy social media meme, or indeed a misquotation of Baudrillard, but instead we propose it to be the looming political question we must all face.[4]

    The French version of this text is published on AOC. You can find it here.

    Tony D Sampson is a critical theorist with an interest in digital media cultures. His publications include The Spam Book, coedited with Jussi Parikka (Hampton Press, 2009), Virality: Contagion Theory in the Age of Networks (University of Minnesota Press, 2012), The Assemblage Brain: Sense Making in Neuroculture (University of Minnesota Press, 2017) and Affect and Social Media: Emotion, Mediation, Anxiety and Contagion, coedited with Darren Ellis and Stephen Maddison (Rowman and Littlefield, 2018). His next book – A Sleepwalker’s Guide to Social Media – will be published by Polity in July 2020. Sampson also hosts the Affect and Social Media international conferences in east London and is co-founder of the community engagement initiative the Cultural Engine Research Group. He works as a reader in digital media cultures and communication at the University of East London.

    Jussi Parikka is Professor at the University of Southampton (Winchester School of Art) and Visiting Professor at FAMU at the Academy of Performing Arts, Prague, where he leads the project on Operational Images and Visual Culture (2019-2023). In 2019-2020, he is also Visiting Chair of Media Archaeology at the University of Udine, Italy. His work has touched on questions of virality and computer accidents in the book Digital Contagions: A Media Archaeology of Computer Viruses (2nd updated edition, 2016, Peter Lang Publishing) and he has addressed questions of ecology and media in books such as Insect Media (University of Minnesota Press, 2010) and A Geology of Media (University of Minnesota Press, 2015). The Lab Book, co-authored with Darren Wershler and Lori Emerson, is forthcoming in 2021 (University of Minnesota Press). Parikka’s site is at http://jussiparikka.net.

    [1] Andreas Philippopoulos-Mihalopoulos “Covid: The Ethical Disease”. Critical Legal Thinking: Law and the Political, 13 March 2020: https://criticallegalthinking.com/2020/03/13/covid-the-ethical-disease/

    [2] Sally Weale “Chinese students flee UK after ‘maskaphobia’ triggered racist attacks: Many say China feels safer than Britain amid coronavirus crisis and increasing abuse”. The Guardian, 17 Mar 2020: https://www.theguardian.com/education/2020/mar/17/chinese-students-flee-uk-after-maskaphobia-triggered-racist-attacks

    [3] Philipp Sarasin “Understanding the Coronavirus Pandemic with Foucault?” Foucault Blog, March 31, 2020: https://www.fsw.uzh.ch/foucaultblog/essays/254/understanding-corona-with-foucault?fbclid=IwAR0t0C9bY3D-j-gyjtxj1f6CDz-0kY0KtgnCUhj9LAuOwMc4r7CC0BxAjSc

    [4] See also Tuomas Nevanlinna “Poikkeustilan julistaminen on äärimmäistä vallankäyttöä, mutta ratkaiseva hetki koittaa kun se lakkautetaan (Declaring a state of emergency is an extreme exercise of power, but the crucial moment comes when it is lifted)”. Kulttuuricocktail, 26 March 2020: https://yle.fi/aihe/artikkeli/2020/03/28/tuomas-nevanlinna-poikkeustilan-julistaminen-on-aarimmaista-vallankayttoa-mutta

  • The Ground Beneath the Screens

    a review of Jussi Parikka, A Geology of Media (University of Minnesota Press, 2015) and The Anthrobscene (University of Minnesota Press, 2015)

    by Zachary Loeb

    Despite the aura of ethereality that clings to the Internet, today’s technologies have not shed their material aspects. Digging into the materiality of such devices does much to trouble the adoring declarations of “The Internet Is the Answer.” What is unearthed by digging is the ecological and human destruction involved in the creation of the devices on which the Internet depends—a destruction that Jussi Parikka considers an obscenity at the core of contemporary media.

    Parikka’s tale begins deep below the Earth’s surface in deposits of a host of different minerals that are integral to the variety of devices without which you could not be reading these words on a screen. This story encompasses the labor conditions in which these minerals are extracted and eventually turned into finished devices; it tells of satellites, undersea cables, and massive server farms; and it includes a dark premonition of the return to the Earth which will occur following the death (possibly a premature death due to planned obsolescence) of the screen at which you are currently looking.

    In a connected duo of new books, The Anthrobscene (referenced below as A) and A Geology of Media (referenced below as GM), media scholar Parikka wrestles with the materiality of the digital. Parikka examines the pathways by which planetary elements become technology, while considering the transformations entailed in the anthropocene and artistic attempts to render all of this understandable. Drawing upon thinkers ranging from Lewis Mumford to Donna Haraway and from the Situationists to Siegfried Zielinski, Parikka constructs a way of approaching media that emphasizes that it is born of the Earth, borne upon the Earth, and fated eventually to return to its place of origin. Parikka’s work demands that materiality be taken seriously not only by those who study media but also by all of those who interact with media – it is a demand that the anthropocene must be made visible.

    Time is an important character in both The Anthrobscene and A Geology of Media, for it provides the context in which one can understand the long history of the planet as well as the scale of the years required for media to truly decompose. Parikka argues that materiality needs to be considered beyond a simple focus upon machines and infrastructure, and should instead take into account “the idea of the earth, light, air, and time as media” (GM 3). Geology is harnessed as a method of ripping open the black box of technology and analyzing what the components inside are made of – copper, lithium, coltan, and so forth. The engagement with geological materiality is key for understanding the environmental implications of media, both in terms of the technologies currently in circulation and in terms of predicting the devices that will emerge in the coming years. Too often the planet is given short shrift in considerations of the technical, but “it is the earth that provides for media and enables it”; it is “the affordances of its geophysical reality that make technical media happen” (GM 13). Drawing upon Mumford’s writings about “paleotechnics” and “neotechnics” (concepts which Mumford had himself adapted from the work of Patrick Geddes), Parikka emphasizes that both the age of coal (paleotechnics) and the age of electricity (neotechnics) are “grounded in the wider mobilization of the materiality of the earth” (GM 15). Indeed, electric power is often still quite reliant upon the extraction and burning of coal.

    Parikka introduces the term “anthrobscene” as more than just a pithy neologism: it highlights the ecological violence inherent in “the massive changes human practices, technologies, and existence have brought across the ecological board” (GM 16-17), shifts that often go under the more morally vague title of “the anthropocene.” For Parikka, “the addition of the obscene is self-explanatory when one starts to consider the unsustainable, politically dubious, and ethically suspicious practices that maintain technological culture and its corporate networks” (A 6). Like a curse word beeped out by television censors, much of the obscenity of the anthropocene goes unheard even as governments and corporations compete with ever greater élan for the privilege of pillaging portions of the planet – Parikka seeks to reinscribe the obscenity.

    The world of high tech media still relies upon the extraction of metals from the earth and, as Parikka shows, a significant portion of the minerals mined today are destined to become part of media technologies. Therefore, in contemplating geology and media it can be fruitful to approach media using Zielinski’s notion of “deep time” wherein “durations become a theoretical strategy of resistance against the linear progress myths that impose a limited context for understanding technological change” (GM 37, A 23). Deploying the notion of “deep time” demonstrates the ways in which a “metallic materiality links the earth to the media technological” while also emphasizing the temporality “linked to the nonhuman earth times of decay and renewal” (GM 44, A 30). Thus, the concept of “deep time” can be particularly useful in thinking through the nonhuman scales of time involved in media, such as the centuries required for e-waste to decompose.

    Whereas “deep time” provides insight into media’s temporal quality, “psychogeophysics” presents a method for thinking through the spatial. “Psychogeophysics” is a variation of the Situationist idea of “the psychogeographical,” but where the Situationists focused upon the exploration of the urban environment, “psychogeophysics” (which appeared as a concept in a manifesto in Mute magazine) moves beyond the urban sphere to contemplate the oblate spheroid that is the planet. What the “geophysical twist brings is a stronger nonhuman element that is nonetheless aware of the current forms of exploitation but takes a strategic point of view on the nonorganic too” (GM 64). Whereas an emphasis on the urban winds up privileging the world built by humans, the shift brought by “psychogeophysics” allows people to bear witness to “a cartography of architecture of the technological that is embedded in the geophysical” (GM 79).

    The material aspects of media technology consist of many areas where visibility has broken down. In many cases this is suggestive of an almost willful disregard (ignoring exploitative mining and labor conditions as well as the harm caused by e-waste), but in still other cases it is reflective of the minute scales that materiality can assume (such as metallic dust that dangerously fills workers’ lungs after they shine iPad cases). The devices that are surrounded by an optimistic aura in some nations thus obtain this sheen at the literal expense of others: “the residue of the utopian promise is registered in the soft tissue of a globally distributed cheap labor force” (GM 89). Indeed, those who fawn with religious adoration over the newest high-tech gizmo may simply be demonstrating that nobody they know personally will be sickened in assembling it, or be poisoned by it when it becomes e-waste. An emphasis on geology and materiality, as Parikka demonstrates, shows that the era of digital capitalism contains many echoes of the exploitation characteristic of bygone periods – appropriation of resources, despoiling of the environment, mistreatment of workers, exportation of waste: these tragedies have never ceased.

    Digital media is excellent at creating a futuristic veneer of “smart” devices and immaterial-sounding aspects such as “the cloud,” and yet a material analysis demonstrates the validity of the old adage “the more things change the more they stay the same.” Despite efforts to “green” digital technology, “computer culture never really left the fossil (fuel) age anyway” (GM 111). But beyond relying on fossil fuels for energy, these devices can themselves be considered fossils-to-be as they go to rest in dumps wherein they slowly degrade, so that “we can now ask what sort of fossil layer is defined by the technical media condition…our future fossils layers are piling up slowly but steadily as an emblem of an apocalypse in slow motion” (GM 119). We may not be surrounded by dinosaurs and trilobites, but the digital media that we encounter are tomorrow’s fossils – which may be quite mysterious and confounding to those who, thousands of years hence, dig them up. Businesses that make and sell digital media thrive on a sense of time that consists of planned obsolescence, regular updates, and new products, but to take responsibility for the materiality of these devices requires that “notions of temporality must escape any human-obsessed vocabulary and enter into a closer proximity with the fossil” (GM 135). It requires a woebegone recognition that our technological detritus may be present on the planet long after humanity has vanished.

    The living dead that lurch alongside humanity today are not the zombies of popular entertainment, but the undead media devices that provide the screens for consuming such distractions. These devices are already fossils, bound to be disposed of long before they stop working, and it is vital “to be able to remember that media never dies, but remains as toxic residue,” and thus “we should be able to repurpose and reuse solutions in new ways, as circuit bending and hardware hacking practices imply” (A 41). We live with these zombies, we live among them, and even when we attempt to pack them off to unseen graveyards they survive under the surface. A Geology of Media is thus “a call for further materialization of media not only as media but as that bit which it consists of: the list of the geophysical elements that give us digital culture” (GM 139).

    It is not simply that “machines themselves contain a planet” (GM 139) but that the very materiality of the planet is becoming riddled with a layer of fossilized machines.

    * * *

    The image of the world conjured up by Parikka in A Geology of Media and The Anthrobscene is far from comforting – after all, Parikka’s preference for talking about “the anthrobscene” does much to set a funereal tone. Nevertheless, these two books by Parikka do much to demonstrate that “obscene” may be a very fair word to use when discussing today’s digital media. By emphasizing the materiality of media, Parikka avoids the thorny discussions of the benefits and shortfalls of various platforms to instead pose a more challenging ethical puzzle: even if a given social media platform can be used for ethical ends, to what extent is this irrevocably tainted by the materiality of the device used to access these platforms? It is a dark assessment which Parikka delivers without much in the way of optimistic varnish, as he describes the anthropocene (on the first page of The Anthrobscene) as “a concept that also marks the various violations of environmental and human life in corporate practices and technological culture that are ensuring that there won’t be much of humans in the future scene of life” (A 1).

    And yet both books manage to avoid the pitfall of simply coming across as wallowing in doom. Parikka is not pining for a primal pastoral fantasy, but is instead seeking to provide new theoretical tools with which his readers can attempt to think through the materiality of media. Here, Parikka’s emphasis on the way that digital technology is still heavily reliant upon mining and fossil fuels acts as an important counter to gee-whiz futurism. Similarly, Parikka’s mobilization of the notion of “deep time” and fossils acts as an important contribution to thinking through the lifecycles of digital media. Dwelling on the undeath of a smartphone slowly decaying in an e-waste dump over centuries is less about evoking a fearful horror than it is about making clear the horribleness of technological waste. The discussion of “deep time” seems like it can function as a sort of geological brake on accelerationist thinking, by emphasizing that no matter how fast humans go, the planet has its own sense of temporality. Throughout these two slim books, Parikka draws upon a variety of cultural works to strengthen his argument, ranging from the earth-pillaging mad scientist of Arthur Conan Doyle’s Professor Challenger, to the Coal Fired Computers of Yokokoji-Harwood (YoHa), to Molleindustria’s smartphone game “Phone Story,” which plays out on a smartphone’s screen the tangles of extraction, assembly, and disposal that are as much a part of the smartphone’s story as whatever purposes the finished device eventually serves. Cultural and artistic works, when they intend to, may be able to draw attention to the obscenity of the anthropocene.

    The Anthrobscene and A Geology of Media are complementary texts, but one need not read both in order to understand the other. As part of the University of Minnesota Press’s “Forerunners” series, The Anthrobscene is a small book (in terms of page count and physical size) which moves at a brisk pace; in some ways it functions as a sort of greatest-hits version of A Geology of Media – containing many of the essential high points, but lacking some of the elements that ultimately make A Geology of Media a satisfying and challenging book. Yet the two books work wonderfully together, with The Anthrobscene acting as a sort of primer – that a reader of both books will detect many similarities between the two is not a major detraction, for these books tell a story that often goes unheard today.

    Those looking for neat solutions to the anthropocene’s quagmire will not find them in either of these books – and as these texts are primarily aimed at an academic audience this is not particularly surprising. These books are not caught up in offering hope – be it false or genuine. When, at the close of A Geology of Media, Parikka discusses the need “to repurpose and reuse solutions in new ways, as circuit bending and hardware hacking practices imply” (A 41), this does not appear as a perfect panacea but as a way of possibly adjusting. Parikka is correct in emphasizing the ways in which the extractive regimes that characterized the paleotechnic continue on in the neotechnic era, and this is a point which Mumford himself made regarding the way that the various “technic” eras do not represent clean breaks from each other. As Mumford put it, “the new machines followed, not their own pattern, but the pattern laid down by previous economic and technical structures” (Mumford 2010, 236) – in other words, just as Parikka explains, the paleotechnic survives well into the neotechnic. The reason this is worth mentioning is not to challenge Parikka, but to highlight that the “neotechnic” is not meant as a characterization of a utopian technical epoch that has parted ways with the exploitation that had characterized the preceding period. For Mumford the need was to move beyond the anthropocentrism of the neotechnic period and move towards what he called (in The Culture of Cities) the “biotechnic,” a period wherein “technology itself will be oriented toward the culture of life” (Mumford 1938, 495). Granted, as Mumford’s later work and these books by Parikka make clear, instead of arriving at the “biotechnic” what we might get is instead the anthrobscene. And reading these books by Parikka makes it clear that one could not characterize the anthrobscene as being “oriented toward the culture of life” – indeed, it may be exactly the opposite. Or, to stick with Mumford a bit longer, it may be that the anthrobscene is the result of the triumph of “authoritarian technics” over “democratic” ones. Nevertheless, the truly dirge-like element of Parikka’s books is that they raise the possibility that it may well be too late to shift paths – that the neotechnic was perhaps just a coat of fresh paint applied to hide the rusting edifice of paleotechnics.

    A Geology of Media and The Anthrobscene are conceptual toolkits; they provide the reader with the drills and shovels they need to dig into the materiality of digital media. But what these books make clear is that along with the pickaxe and the archeologist’s brush, if one is going to dig into the materiality of media one also needs a gasmask to endure the noxious fumes. Ultimately, what Parikka shows is that the Situationist-inspired graffiti of May 1968, “beneath the streets – the beach,” needs to be rewritten in the anthrobscene.

    Perhaps a fitting variation for today would read: “beneath the streets – the graveyard.”
    _____

    Zachary Loeb is a writer, activist, librarian, and terrible accordion player. He earned his MSIS from the University of Texas at Austin, and is currently working towards an MA in the Media, Culture, and Communications department at NYU. His research areas include media refusal and resistance to technology, ethical implications of technology, infrastructure and e-waste, as well as the intersection of library science with the STS field. Using the moniker “The Luddbrarian,” Loeb writes at the blog Librarian Shipwreck. He is a frequent contributor to The b2 Review Digital Studies section.

    _____

    Works Cited

    Mumford, Lewis. 2010. Technics and Civilization. Chicago: University of Chicago Press.

    Mumford, Lewis. 1938. The Culture of Cities. New York: Harcourt, Brace and Company.