boundary 2


  • Eugene Thacker – Weird, Eerie, and Monstrous: A Review of “The Weird and the Eerie” by Mark Fisher


    by Eugene Thacker

    Review of Mark Fisher, The Weird and the Eerie (Repeater, 2017)

    For a long time, the horror genre was not generally considered worthy of critical, let alone philosophical, reflection; it was the stuff of cheap thrills, pulp magazines, B-movies. Much of this has changed in the ensuing years, as a robust and diverse critical literature has emerged around the horror genre, much of which considers the genre not only as a reflection of society, but also as an autonomous platform for posing far-reaching questions concerning the fate of the human species, the species that has named itself. These are sentiments that have preoccupied recent writing on the horror genre, much of which borrows from developments in contemporary philosophy and attempts to expand the confines of horror beyond the usual fixation on gore, violence, and shock tactics. This hasn’t always been the case. Even today, writing on genre horror often tends towards “list” books (of the type The Top 100 Italian Horror Films From 1977, Volume IV), or books that are basically print-on-demand databases (The Encyclopedia of Asian Ghost Stories from the Beginning of Time, and Before That). These are rounded out by a plethora of introductory textbooks and surveys, usually aimed at film studies undergraduates (e.g. Key Terms in Cultural Studies: Splatterpunk), and opaque academic monographs offering Lacanian psychoanalytic semiotic readings of horror film that themselves seem to be part of some kind of academic cult.

    While such books can be informative and helpful, reading them can be akin to the slightly woozy feeling one has after having gone down a combined Google/Wikipedia/YouTube rabbit-hole, emerging with bewildered eyes and terabytes of regurgitated data. However, recent writing on the horror genre takes a different approach, eschewing the poles of either the popular or the academic for a perhaps yet-to-be-named third space. One book that takes up this challenge is Mark Fisher’s The Weird and the Eerie, published this year. (Fisher is likely known to readers through his blog K-punk, which had been running for almost two decades before his untimely death.) What Fisher’s study shares with other like-minded books is an interest in expanding our understanding of the horror genre beyond the genre itself, and he does this by focusing on one of the deepest threads in the horror genre: the limits of human beings living in a human-centric world.

    As a case study, consider the opening passage from H.P. Lovecraft’s well-known short story “The Call of Cthulhu”:

    The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.

    With this – arguably the most foreboding opener ever written for a story – Lovecraft sets the stage for what is really an extended meditation on human finitude. Originally published in the February 1928 issue of the pulp magazine Weird Tales, “Cthulhu” ostensibly brings together the perspectives of deep time and deep space to reflect on the comparatively myopic and humble non-event that is human civilization – at least that’s how Lovecraft himself puts it. It is well known that Lovecraft took cues from the likes of Edgar Allan Poe, Algernon Blackwood, and Arthur Machen – influences he himself acknowledged. Equally well known is Lovecraft’s notorious xenophobia (often expressed in his correspondence as outright racism). Yet in spite of – or because of – this, Lovecraft remained unambiguous in his own approach to the horror genre. In his numerous essays, notes, and letters, he insists, with an unflinching misanthropy, that a horror story should evoke “an atmosphere of breathless and unexplainable dread of outer, unknown forces,” forces that point towards a “malign and particular suspension or defeat of those fixed laws of Nature which are our only safeguard against the assaults of chaos and the daemons of unplumbed space.” The “monsters” in such tales were far from the usual line-up of vampires, werewolves, zombies, and demons – all of which, for Lovecraft and his colleagues, end up serving as mere solipsistic reflections of human-centric hopes and fears. They are often described in abstract, elemental, almost primordial ways: “the colour out of space,” “the shadow out of time,” or simply “the lurking fear.”

    The story of “Cthulhu” itself – which details the discovery of a cult devoted to an ancient, malefic, Elder Deity vaguely resembling an oozing winged cephalopod emerging from a hidden tomb of impossibly-shaped Cyclopean black geometry foretelling not only the end of the world but the deeper futility of the entirety of human civilization – has since obtained a cult status among horror authors, critics, and fans alike. Like-minded tales of cosmic misanthropy were written in the early 20th century by Lovecraft contemporaries Clark Ashton Smith, Robert E. Howard, and Robert Bloch, and later by authors of the weird tale such as Ramsey Campbell, Caitlín Kiernan, China Miéville, and Junji Ito. Like a slow-moving, tentacular meme, the Cthulhu “mythos” has reached far beyond the confines of literary horror. Film adaptations abound (the term “straight-to-video” no longer applies, but is still apt here). Video games, which nearly always end in despair and/or death. Role-playing games, complete with impossibly-shaped 10-sided black dice. A visit to any Comic Con will yield a dizzying array of comics, ‘zines, artwork, posters, bumper stickers, hoodies, Miskatonic University course catalogs, editions of the dreaded Necronomicon, and even Cthulhu plushies for the Lovecraft toddler. An industry is born. Today, distant cousins of Cthulhu can be seen in the Academy Award-nominated Arrival (2016), and the distinctly un-nominated burlesque that is Independence Day: Resurgence (2016). Cthulhu, it seems, has gone mainstream.

    Amid all the fondness for such abysmal and tentacular monstrosities, it is easy to overlook the themes that run through Lovecraft’s short tale, themes at once disturbing and compelling, and which mark the tradition often referred to as “supernatural horror” or “cosmic horror.” When Lovecraft characters happen upon strange creatures like Cthulhu (or worse, the Shoggoths), they don’t have the typical reactions. “Fear” is too simple a term to describe it; it encompasses everything without saying anything. But neither are they overcome by the more literary affects of “terror” or “horror,” like the characters of an old gothic novel. They have neither the time nor the patience for the critical distance afforded by a psychoanalytic “uncanny,” or the literary structures of the “fantastic.” Confronted with Cthulhu, Lovecraft’s characters simply freeze. They become numb. They go dark. Frozen thought. They can’t wrap their heads around what is right before them. What they “feel” is exactly this “inability of the human mind to correlate all its contents.” Forget the fear of death: I’ve just discovered a primordial, other-dimensional, slime-ridden necropolis of obsidian blasphemy that throws into question all human knowledge on this now-forsaken speck of cosmic dust we laughably call “our” planet.

    Yet, in all their pulpy, melodramatic, low-brow seriousness, the questions raised by Lovecraft and other writers in Weird Tales are also philosophical questions. They are questions that address the limits of human knowledge in a rapidly-changing world, a world that seems indifferent to the machinations of science or the doctrinal exuberance of religion, impassive before the hubris of technological advance or the lures of political ideology – a cold “crawling chaos” lurking just beneath the fragile fabric of humanity. What the characters of such stories discover (aside from the usual train of madness, dread, and, well, death) is a kind of stumbling humbleness, the human brain discovering its own limit, enlightened only of its own hubris – the humility of thought.

    *

    This theme – the limits of what can be known, the limits of what can be felt, the limits of what can be done – is central to Fisher’s The Weird and the Eerie. This is markedly different from other approaches to horror, which, however critical they may seem, often regard the horror genre as having an essentially therapeutic function, enabling us to purge, cope with, or work through our collective fears and anxieties. This therapeutic view of horror often becomes polarized between reactionary readings (a horror story that promotes the establishing or re-establishing of norms) or progressive readings (a horror story that promotes otherness, difference, and transgression of norms). And yet, in the final analysis, it is also hard to escape the sense that there is a certain kind of solipsism to the horror genre, that it is we human beings that remain at the center of it all, who have either constructed boundaries and bunkers and have once again staved off another threat to our collective identity, or who have devised clever ways of creating hybrids, fusions, and monstrous couplings with the other, thereby extending humanity’s long dreamed-of share of immortality.

    Whether reactionary or progressive, both responses to the horror genre involve a strategy in which the world in all its strangeness is transformed into a world made in our own image (anthropomorphism), or a world rendered useful for us as human beings (anthropocentrism). In spite of all the horrifying things that happen to the characters in horror stories, there is a sense in which the horror genre is ultimately a kind of humanism, a panegyric to the limitless potential of human knowledge, the immeasurable capacity for human feeling, the infinite promise of human sovereignty. This is, of course, not surprising, given the somber didactics of even the most extreme zombie apocalypses, vampiric mutations, or demonic plagues. Species self-interest is at stake. Humanity may be brought to the brink of extinction, only so that that same humanity may extend its mastery (self-mastery and mastery over its environment), and even obtain some form of ascendancy over its own tenuous, existential status. Subtending the survivalist imperative of the horror genre and its pragmatic arsenal of mastering monsters of all kinds is another kind of mastery – a metaphysical mastery.

    But this is only one way of understanding the horror genre. The insight of books like Fisher’s is that the horror genre is also capable of chipping away at this species-specific sovereignty, taking aim at the twin pillars of anthropomorphism and anthropocentrism. Instead of being concerned with species self-interest and mastery, such horror stories tend more towards humility, hubris, and even, in their darkest moments, futility. It is a project that is doomed to failure, of course, and perhaps this is why so many of the characters in the tales of Lovecraft, Algernon Blackwood, or Izumi Kyoka find themselves in worlds that are both untenable and unlivable. They end up with nothing but a bit of useless quasi-wisdom, scribbling away madly in a darkened forest room trying to make sense of it all not making any sense. Or they detach themselves from the humdrum human world of plans and projects, finding themselves inexorably pulled headlong into the ambivalent abyss of self-abnegation. Or worse – they simply continue to exist. What results is what we might call a “bleak humanism” – a horror story interested in humanity only to the extent that humanity is defined by its uncertainties, its finitude, its doubts – the humility of being human.

    Fisher’s terms are relatively clear. “What the weird and the eerie have in common is a preoccupation with the strange.” For Fisher, the strange is, quite simply, “a fascination for the outside […] that which lies beyond standard perception, cognition and experience.” But the weird and the eerie are quite different in how they apprehend the strange. As Fisher writes, “the weird is constituted by a presence – the presence of that which does not belong.” There is something exorbitant, out-of-place, and incongruous about the weird. It is the part that does not fit into the whole, or the part that disturbs the whole – threshold worlds populated by portals, gateways, time loops, and simulacra. Fundamental presumptions about self, other, knowledge, and reality will have to be rethought. “The eerie, by contrast, is constituted by a failure of absence or by a failure of presence. There is something where there should be nothing, or there is nothing where there should be something.” Here we encounter disembodied voices, lapses in memory, selves that are others, revelations of the alien within, and nefarious motives buried in the unconscious, inorganic world in which we are embedded.

    The weird and the eerie are not exclusive to the more esoteric regions of cosmic horror; they are also embedded in and bound up with quotidian notions of selfhood and the everyday relationship between self and world. The weird and eerie crop up in those furtive moments when we suspect we are not who we think we are, when we wonder if we do not act so much as we are acted upon. When everything we assumed to be a cause is really an effect. The weird and eerie are, ultimately, inseparable from the fabric of the social, cultural, and political landscape in which we are embedded. Fisher: “Capital is at every level an eerie entity: conjured out of nothing, capital nevertheless exerts more influence than any allegedly substantial entity.” There is a sense in which, for Fisher, the weird and the eerie constitute the poles of our ubiquitous “capitalist realism,” prompting us to re-examine not only presumptions concerning human agency, intentionality, and control, but also inviting a darker, more disturbing reflection on the strange agency of the inanimate and impersonal materiality of the world around us and within us.

    Fisher’s interest in Lovecraft stems from this shift in perspective from the human-centric to the nonhuman-oriented – not simply a psychology of “fear,” but the unnerving, impersonal calm of the weird and eerie. As scholars of the horror genre frequently note, Lovecraft’s tales are distinct from genre fantasy, in that they rarely posit an other world beyond, beneath, or parallel to this one. And yet, anomalous and strange events do take place within this world. Furthermore, they seem to take place according to some logic that remains utterly alien to the human world of moral codes, natural law, and cosmic order. If such anomalies could simply be dismissed as anomalies, as errors or aberrations in nature, then the natural order of the world would remain intact. But they cannot be so easily dismissed, and neither can they simply be incorporated into the existing order without undermining it entirely. Fisher nicely summarizes the dilemma: “a weird entity or object is so strange that it makes us feel that it should not exist, or at least that it should not exist here. Yet if the entity or object is here, then the categories which we have up until now used to make sense of the world cannot be valid. The weird thing is not wrong, after all: it is our conceptions that must be inadequate.”

    *

    This dilemma (which literary critic Tzvetan Todorov called “the fantastic”) is presented in unique ways by authors of the weird tale and cosmic horror. Such authors refuse to identify the weird with the supernatural, and often refuse the distinction between the natural and supernatural entirely. They do so not via mythology or religion, but via science – or at least a peculiar take on science. In cosmic horror, the strange reality described by science is often far more unreal than any vampire, werewolf, or zombie. Fisher highlights this: “In many ways, a natural phenomenon such as a black hole is more weird than a vampire.” Why? Because the existence of the vampire, anomalous and transgressive as it may seem, actually reinforces the boundary between the natural order “in here” and a transcendent, supernatural order “out there.” “Compare this to a black hole,” Fisher continues, “the bizarre ways in which it bends space and time are completely outside our common experience, and yet a black hole belongs to the natural-material cosmos – a cosmos which must therefore be much stranger than our ordinary experience can comprehend.” Science, for all its explanatory power, inadvertently reveals the hubris of the explanatory impulse of all human knowledge, not just science.

    Authors such as Lovecraft were well aware of this shift in their approach to the horror genre. An oft-cited passage from one of Lovecraft’s letters reads: “…all my tales are based on the fundamental premise that common human laws and interests and emotions have no validity or significance in the vast cosmos-at-large.” To write the truly weird tale, Lovecraft notes, “one must forget that such things as organic life, good and evil, love and hate, and all such local attributes of a negligible and temporary race called mankind, have any existence at all.” So much for humanism, then. But Fisher is also right to note that Lovecraft’s tales are not simply horror tales. As Lovecraft himself repeatedly noted, the affects of fear, terror, and horror are merely consequences of human beings confronting an impersonal and indifferent non-human world – what Lovecraft once called “indifferentism” (which, as he jibes, wonders “whether the cosmos gives a damn one way or the other”). There is an allure to the unhuman that is, at the same time, opaque and obscure. As Fisher writes, “it is not horror but fascination – albeit a fascination usually mixed with a certain trepidation – that is integral to Lovecraft’s rendition of the weird…the weird cannot only repel, it must also compel our attention.”

    This reaches a pitch in Fisher’s writing on author Nigel Kneale and his series of Quatermass films and TV shows. The Quatermass and the Pit series, for instance, opens with the shocking discovery of an alien spaceship buried within the bowels of a London tube station (which station I will not say). The strange, quasi-insect remains inside the ship point to a form of life very different from terrestrial life. But the science indicates that the alien spaceship is actually a relic from the distant past. It seems that not only geology and cosmology, but also human history will have to be rethought. Gradually, the scientists learn that the alien relics are millions of years old, and in fact the remains of a distant, early progenitor of human beings. We, it turns out, are they – or vice-versa. The Quatermass series not only demonstrates the efficacy of scientific inquiry, it puts forth a further proposition: that science works too well. “Kneale shows that an enquiry into the nature of what the world is like is also inevitably an unraveling of what human beings had taken themselves to be…if human beings fully belong to the so-called natural world, then on what grounds can a special case be made for them?” Reality turns out to be weirder and more eerie than any fantastical world or alien civilization. This is what Fisher calls “Radical Enlightenment,” a kind of physics that goes all the way, a materialism to the nth degree, even at the cost of disassembling the self-aware and self-privileging human brain that conceives of it. Reversals and inversions abound. What if humanity itself is not the cause of world history but the effect of material and physical laws that we can only dimly intuit?

    This theme of Radical Enlightenment runs through Fisher’s book. While he does discuss works of fiction or film one would expect in relation to the horror genre (Lovecraft, Kubrick’s The Shining, David Lynch’s recent films), Fisher also offers ruminations on contemporary works (such as Jonathan Glazer’s 2013 film Under the Skin), as well as a number of evocative comparisons, such as a chapter on the weird effects of time loops in Rainer Werner Fassbinder’s film World on a Wire and Philip K. Dick’s novel Time Out of Joint. There are also several surprises, including a meditation on the strange “vanishing landscapes” in M.R. James’s ghost stories and Brian Eno’s 1982 ambient album On Land. Also welcome is Fisher’s attentiveness to under-appreciated works in the horror genre, including the disquieting short fiction of Daphne du Maurier. In the span of a few carefully-written pages, Fisher follows the twists and turns of his twin concepts one chapter at a time, one example at a time, until it is revealed exactly how enmeshed the weird and the eerie are in culture generally.

    *

    The Weird and the Eerie is an evocative and carefully-written short study in cultural aesthetics. Far from the familiar line-up of vampires, zombies, and demons, Fisher’s eclectic examples speak directly to one of the central themes of the horror genre: the limits of human knowledge, the metamorphic shapes of fear, and the blurriness of boundaries of all types. His simple conceptual distinction quickly gives way to reversals, permutations, and complications, ultimately refusing any notion of a monstrous or alien unhumanness “out there”; with Fisher, the unhuman is more likely to reside within the human itself (or as Lovecraft might write it, “the unhuman is discovered to reside within the human itself”).

    Many books on the horror genre are concerned with providing answers, using varieties of taxonomy and psychology to provide a therapeutic application to “our” lives, helping us to cathartically purge collective anxieties and fears. For Fisher, the emphasis is more on questions, questions that target the vanity and presumptuousness of human culture, questions regarding human consciousness elevating itself above all else, questions concerning the presumed sovereignty of the species at whatever cost – perhaps questions it’s better not to pose, at the risk of undermining the entire endeavor to begin with.

    I should let the reader decide which approach makes more sense, given the weird and/or eerie “Waldo-moment” in which we currently find ourselves. But the weird and the eerie are scalable, pervading broad cultural structures as well as the minutiae of personal ruminations. I’ve known Fisher as a colleague for some time. About a week after I had agreed to do this review, I heard via email of Fisher’s suicide. Someone I knew was previously there, over there, doing what they do, the way we so often presume a person’s presence in between moments of punctuated interaction. And then, suddenly, they’re not there. About a week after this, The Weird and the Eerie arrived in the mail. It was hard not to pick up the book and feel it had a kind of aura around it, as if it was some kind of final statement, a last communiqué. I had it on the table in a short stack with other books, and I kept half-expecting it to also vanish, as if its very presence there were incongruous. I would occasionally pick up the book and flip through it, as if secretly hoping to discover pages that weren’t there before. But my copy was the same as all the others. Besides, isn’t that essentially what a book is, a last word written by someone either long dead or who will die in the future? Maybe all books are eerie in this way.

    Eugene Thacker is the author of several books, including In the Dust of This Planet (Zero Books, 2011) and Cosmic Pessimism (Univocal, 2015).

  • Alexander R. Galloway — Brometheanism


    By Alexander R. Galloway
    ~

    In recent months I’ve remained quiet about the speculative turn, mostly because I’m reticent to rekindle the “Internet war” that broke out a couple of years ago mostly on blogs but also in various published papers. And while I’ve taught accelerationism in my recent graduate seminars, I opted for radio silence when accelerationism first appeared on the scene through the Accelerationist Manifesto, followed later by the book Inventing the Future. Truth is I have mixed feelings about accelerationism. Part of me wants to send “comradely greetings” to a team of well-meaning fellow Marxists and leave it at that. Lord knows the left needs to stick together. Likewise there’s little I can add that people like Steven Shaviro and McKenzie Wark haven’t already written, and articulated much better than I could. But at the same time a number of difficulties remain that are increasingly hard to overlook. To begin I might simply echo Wark’s original assessment of the Accelerationist Manifesto: two cheers for accelerationism, but only two!

    What’s good about accelerationism? And what’s bad? I love the ambition and scope. Certainly the accelerationists’ willingness to challenge leftist orthodoxies is refreshing. I also like how the accelerationists demand that we take technology and science seriously. And I also agree that there are important tactical uses of accelerationist or otherwise hypertrophic interventions (Eugene Thacker and I have referred to them as exploits). Still I see accelerationism essentially as a tactic mistaken for a strategy. At the same time this kind of accelerationism is precisely what dot-com entrepreneurs want to see from the left. Further, and ultimately most important, accelerationism is paternalistic and thus suffers from the problems of elitism and ultimately reactionary politics.

    Let me explain. I’ll talk first about Srnicek and Williams’ 2015 book Inventing the Future, and then address one of the central themes fueling the accelerationist juggernaut, Prometheanism. Well written, easy to read, and exhaustively footnoted, Inventing the Future is ostensibly a follow up to the Accelerationist Manifesto, although the themes of the two texts are different and the authors almost never mention accelerationism in the book. (Srnicek in particular is nothing if not shrewd and agile: present at the christening of #A, we also find him on the masthead of the speculative realist reader, and today nosing in on “platform studies.” Wherever he alights next will doubtless portend future significance.) The book is vaguely similar to Michael Hardt and Antonio Negri’s Declaration from 2012 in that it tries to assess the current condition of the left while also providing a set of specific steps to be taken for the future. And while the accelerationists have garnered significantly more attention of late, mostly because their work feels so fresh and new, Hardt and Negri’s is the better book (and interestingly Srnicek and Williams never cite them).

    Inventing the Future

    Inventing the Future has essentially two themes. The first consists in a series of denunciations of what they call “folk politics” defined in terms of Occupy, the Zapatistas, Tiqqun, localism, and direct democracy, ostensibly in favor of a new “hegemony” of planetary social democracy (also known as Leninism). The second theme concerns an anti-work polemic focused on the universal basic income (UBI) and shortening the work week. Indeed even as these two authors collaborate and mix their thoughts, there seem to be two books mixed together into one. This produces an interesting irony: while the first half of the book unabashedly denigrates anarchism in favor of Leninism, the second half of the book focuses on that very theme (anti-work) that has defined anarchist theory since the split in the First International, if not since time immemorial.

    What’s so wrong with “folk politics”? There are a few ways to answer this question. First the accelerationists are clearly frustrated by the failures of the left, and rightly so, a left debilitated by “apathy, melancholy and defeat” (5). There’s a demographic explanation as well. This is the cri de coeur of a younger generation seeking to move beyond what are seen as the sclerotic failures of postmodern theory with all of its “culturalist” baggage (which too often is a codeword for punks, queers, women, and people of color — more on that in a moment).

    Folk politics includes “the fetishization of local spaces, immediate actions, transient gestures, and particularisms of all kinds” (3); it privileges the “small-scale, the authentic, the traditional and the natural” (10). The following virtues help fill out the definition:

    immediacy…tactics…inherently fleeting…the past…the voluntarist and spontaneous…the small…withdrawal or exit…the everyday…feeling…the particular…the ethical…the suffering of the particular and the authenticity of the local (10-11)

    Wow, that’s a lot of good stuff to get rid of. Still, they don’t quit there, targeting horizontalism of various kinds. Radical democracy is in the crosshairs too. Anti-representational politics is out as well. All the “from below” movements, from the undercommons to the black bloc, anything that smacks of “anarchism, council communism, libertarian communism and autonomism” (26) — it’s all under indictment. This unceasing polemic culminates in the book’s most potent sentence, if not also its most ridiculous, where the authors dismiss all of the following movements in one fell swoop:

    Occupy, Spain’s 15M, student occupations, left communist insurrectionists like Tiqqun and the Invisible Committee, most forms of horizontalism, the Zapatistas…localism…slow-food (11-12)

    That scoops up a lot of people. And the reader is left to quibble over whatever principle of decision might group all these disparate factions together. But the larger point is clear: for Srnicek and Williams folk politics emerged because of an outdated Left (i.e. the abject failures of social democracy and communism) (16-), and an outmaneuvered Left (i.e. the rampant successes of neoliberalism) (19-). Thus their goal is to update the left with a new ideology, and overhaul its infrastructure, allowing it to modernize and scale up to the level of the planet.

    In the second half of the book, particularly in chapters 5 and 6, Srnicek and Williams elaborate their vision for anti-work and post-work. This hinges on the concept of full automation, and they provocatively assert that “the tendencies towards automation and the replacement of human labor should be enthusiastically accelerated” (109). Yet the details are scant. What kind of tech are we talking about? We get some vague references at the outset to “Open-source designs, copyleft creativity, and 3D printing” (1), then again later to “data collection (radio-frequency identification, big data)” and so on (110). But one thing this book does not provide is an examination of the technology of modern capitalism. (Srnicek’s Platform Capitalism is an improvement thematically but not substantively: he provides an analysis of political economy, but no tech audit.) Thus Inventing the Future has a sort of Wizard of Oz problem at its core. It’s not clear what clever devices are behind the curtain; we’re just supposed to assume that they will be sufficiently communistical if we all believe hard enough.

    At the same time the authors come across as rather tone deaf on the question of labor, bemoaning above all “the misery of not being exploited,” as if exploitation were some grand prize awarded to the subaltern. Further, they fail to address adequately the two key challenges of automation, both of which have been widely discussed in political and economic theory: first that automation eliminates jobs for people who very much want and need them, leading to surplus populations, unemployment, migration, and entrenched poverty; and second that automation transforms the organic composition of labor through deskilling and proletarianization, the offshoring of menial labor, and the introduction of technical and specialist labor required to design, build, operate, and repair those seemingly “automagical” machines. In other words, under automation some people work less, but everyone works differently. Automation reduces work for some, but changes (and in fact often increases) work for others. Marx’s analysis of machines in Capital is useful here, where he addresses all of these various tendencies, from the elimination of labor and the increase in labor, to the transformation of the organic composition of labor — the last point being the most significant. (And while machines might help lubricate and increase the productive forces — not a bad thing — it’s clear that machines are absolutely not revolutionary actors for Marx. Optimistic interpretations gleaned from the Grundrisse notwithstanding, Marx defines machines essentially as large batteries for value. I have yet to find any evidence that today’s machines are any different.)

    So the devil is in the details: what kind of technology are we talking about? But perhaps more importantly, if you get rid of the “folk,” aren’t you also getting rid of the people? Srnicek and Williams try to address this in chapter 8, although I’m more convinced by Hardt and Negri’s “multitude,” Harney and Moten’s “undercommons,” or even formulations like “the part of no part” or the “inoperative community” found scattered across a variety of other texts. By the end Srnicek and Williams out themselves as reticular pessimists: let’s not specify “the proper form of organization” (162), let’s just let it happen naturally in an “ecology of organizations” (163). The irony being that we’re back to square one, and these anti-folk evangelists are hippy ecologists after all. (The reference to function over form [169] appears as a weak afterthought to help rationalize their decision, but it re-introduces the problem of techno-fetishism, this time a fetishism of the function.)

    To summarize, accelerationism presents a rich spectrum of problems. The first stems from the notion that technology/automation will save us, replete with vague references to “the latest technological developments” unencumbered by any real details. Second is the question of capitalism itself. Despite the authors’ Marxist tendencies, it’s not at all clear that accelerationism is anti-capitalist. In fact accelerationism would be better described as a form of post-capitalism, what Zizek likes to mock as “capitalism with a friendly face.” What is post-capitalism exactly? More capitalism? A modified form of capitalism? For this reason it becomes difficult to untangle accelerationism from the most visionary dreams of the business elite. Isn’t this exactly what dot-com entrepreneurs are calling for? Isn’t the avant-garde of acceleration taking place right now in Silicon Valley? This leads to a third point: accelerationism is a tactic mistaken for a strategy. Certainly accelerationist or otherwise hypertrophic methods are useful in a provisional, local, which is to say tactical way. But accelerationism is, in my view, naïve about how capitalism works at a strategic level. Capitalism wants nothing more than to accelerate. Adding to the acceleration will help capitalism, not hinder it. Capitalism is this accelerating force, from primitive accumulation on up to today. (Accelerationists don’t dispute this; they simply disagree on the moral status of capitalism.) Fourth and finally is the most important problem revealed by accelerationism, the problem of elitism and reactionary politics. Given unequal technological development, those who accelerate will necessarily do so on the backs of others who are forced to proletarianize. Thus accelerationists are faced with a kind of “internal colonialism” problem, meaning there must be a distinction made between those who accelerate and those who facilitate acceleration through their very bodies. We already know who suffers most under unequal technological acceleration, and it’s not young white male academics living in England. Thus their skepticism toward the “folk” is all too often a paternalistic skepticism toward the wants and needs of the generic population. Hence the need for accelerationists to talk glowingly about things like “engineering consent.” It’s hard to see where this actually leads. Or, more to the point, who leads: if not Leninists, then who? Technocrats? Philosopher kings?

    *

    Accelerationism gains much inspiration from the philosophy of Prometheanism. If accelerationism provides a theory of political economy, Prometheanism supplies a theory of the subject. Yet it’s not always clear what people mean by this term. In a recent lecture titled “Prometheanism and Rationalism” Peter Wolfendale defines Prometheanism in such general terms that it becomes a synonym for any number of things: history and historical change; being against fatalism and messianism; being against the aristocracy; being against Fukuyama; being for feminism; the UBI and post-capitalism; the Enlightenment and secularism; deductive logic; overcoming (perceived) natural limits; technology; “automation” (which as I’ve just indicated is the most problematic concept of them all). Even very modest and narrow definitions of Prometheanism — technology for humans to overcome natural limits — present their own problems and wind up largely deflating the sloganeering of it all. “Okay so both the hydrogen bomb and the contraceptive pill are equally Promethean? So then who adjudicates their potential uses?” And we’re left with Prometheanism as the latest YAM philosophy (Yet Another Morality).

    Still, Prometheanism has a particular vision for itself and it’s worth describing the high points. I can think of six specific qualities. (1) Prometheanism defines itself as posthuman or otherwise antihuman. (2) Prometheanism is an attempt to transcend the bounds of physical limitation. (3) Prometheanism promotes freedom, for instance the freedom to change the body through hormone therapy. (4) Prometheanism sees itself as politically progressive. (5) Prometheanism sees itself as being technologically savvy. (6) Prometheanism proposes to offer technical solutions to real problems.

    But is any of this true? Interestingly Bernard Stiegler provided an answer to some of these questions already in 1994, and it’s worth returning to his book from that year, Technics and Time, 1: The Fault of Epimetheus, to fill out a conversation that has, thus far, been mostly one-sided. Stiegler’s book is long and complicated, and touches on many different things including technology and the increased rationalization of life, by way of some of Stiegler’s key influences including Gilbert Simondon, André Leroi-Gourhan, and Bertrand Gille. Let me focus however on the second part of the book, where Stiegler examines the two brothers Epimetheus and Prometheus.

    A myth about powers and qualities, the fable of Epimetheus and Prometheus is recounted by the sophist Protagoras starting at line 320c in Plato’s dialogue of that name. In Stiegler’s retelling of the story, we begin with Epimetheus, who, via a “principle of compensation” governed by notions of difference and equilibrium, hands out powers and qualities to all the animals of the Earth. For instance extra speed might be endowed to the gazelle, but only by way of balanced compensation given to another animal, say a boost in strength bestowed upon the lion. Seemingly diligent in his duties, Epimetheus nevertheless tires before the job is complete, abandoning his task before arriving at humankind, who is left relatively naked without a special power or quality of its own. To compensate humankind, Prometheus absconds with “the gift of skill in the arts and fire” — “τὴν ἔντεχνον σοφίαν σὺν πυρί” — captured from Athena and Hephaestus, respectively, conferring these two gifts on humanity (Plato, “Protagoras,” 321d).

    In this way humans are defined first not via technical supplement but through an elemental fault — this is Stiegler’s lingering poststructuralism — the fault of Epimetheus. Epimetheus forgets about us, leaving us until the end, and hence “Humans only occur through their being forgotten; they only appear in disappearing” (188). But it’s more than that: a fault followed by a theft, and hence a twin fault. Humanity is the “fruit of a double fault–an act of forgetting [by Epimetheus], then of theft [by Prometheus]” (188). Humans are thus a forgotten afterthought, remedied afterward by a lucky forethought.

    “Afterthought” and “forethought” — Stiegler means these terms quite literally. Who is Epimetheus? And who is Prometheus? Greek names often have etymological if not allegorical significance, as is the case here. Both names share the root “-metheus,” cognate with manthánō [μανθάνω], which means learning, study, or cultivation of knowledge. Hence a mathitís [μαθητής] is a learner or a student. (And in fact in a very literal sense “mathematics” simply refers to the things that one learns, not to arithmetic or geometry per se.) The two brothers are thus both varieties of learners, both varieties of thinkers. The key is which variety. The key is the Epi– and the Pro-.

    “Epi carries the character of the accidentally and artificial factuality of something happening, arriving, a primordial ‘passibility,’” Stiegler explains. “Epimetheia means heritage. Heritage is always epimathesis. Epimetheia would also mean then tradition-originating in a fault that is always already there and that is nothing but technicity” (206-207). Hence Epimetheus means something like “learning on the basis of,” “thinking after,” or, more simply, “afterthought” or “hindsight.” This is why Epimetheus forgets, why he is at fault, why he acts foolishly, because these are all the things that generate hindsight.

    Prometheus on the other hand is “foresight” or “fore-thought.” If Epimetheus means “thinking and learning on the basis of,” Prometheus means something more like “thinking and learning in anticipation of.” In this way, Prometheus comes to stand in for cleverness (but also theft), ingenuity, and thus technics as a whole.

    But is that all? Is the lesson simply to restore Epimetheus to his position next to Prometheus? To remember the Epimethean omission along with the Promethean endowment? In fact the old Greek myth isn’t quite finished, and, after initially overlooking the ending, Stiegler eventually broaches the closing section on Hermes. For even after benefiting from its Promethean supplement, humanity remains incomplete. Specifically, the gods notice that Man has a tendency toward war and political strife. Thus Hermes is tasked with implanting a kind of socio-political virtue, supplementing humanity with “the qualities of respect for others [αἰδώ] and a sense of justice [δίκη]” (Plato 322c). In other words, a second supplement is necessary, only this time a supplement not rooted in the identitarian logic of heterogeneous qualities. “Another tekhnē is required,” writes Stiegler, “a tekhnē that is no longer paradoxically…the privilege of specialists” (201). This point about specialists is key — all you Leninists take note — because on Zeus’s command Hermes delivers respect and justice generically and equally across all persons, not via the “principle of compensation” based on difference and equilibrium used previously by Epimetheus to divvy up the powers and qualities of the animals. Thus while some people may have a talent for the piano, and others might be gifted in some other way, justice and respect are bestowed equally upon all.

    This is why politics is always a question of the “hermeneutic community,” that is, the ad hoc translation and interpretation of real political dynamics; it comes from Hermes (201). At the same time politics also means “the community of those who have no community” because there is no adjudication of heterogeneous qualities, no truth or law stipulated in advance, except for the very “conditions” of the political (those “hermeneutic conditions,” namely αἰδώ and δίκη, respect and justice).

    To summarize, the Promethean story has three moments, not one, and all three ought to be given full voice:

    1. Default of origin (being forgotten about by Epimetheus/Hindsight)
    2. Gaining technicity (fire and skills from Prometheus/Foresight)
    3. Revealing the generic (“respect for others and a sense of justice” from Hermes)

    This strikes me as a much better way to think about Prometheanism overall, better than the narrow definition of “using technology to overcome natural limits.” Recognizing all three moments, Prometheanism (if we can still call it that) entails not just technological advancement, but also insufficiency and failure, along with a political consciousness rooted in generic humanity.

    And now would be a good time to pass the baton over to the Xenofeminists, who make much better use of accelerationism than its original authors do. The Xenofeminist manifesto provides a more holistic picture of what might simply be called a “universalism from below” — yes, that very folk politics that Srnicek and Williams seek to suppress — doing justice not only to Prometheus, but to Epimetheus and Hermes as well:

    Xenofeminism understands that the viability of emancipatory abolitionist projects — the abolition of class, gender, and race — hinges on a profound reworking of the universal. The universal must be grasped as generic, which is to say, intersectional. Intersectionality is not the morcellation of collectives into a static fuzz of cross-referenced identities, but a political orientation that slices through every particular, refusing the crass pigeonholing of bodies. This is not a universal that can be imposed from above, but built from the bottom up — or, better, laterally, opening new lines of transit across an uneven landscape. This non-absolute, generic universality must guard against the facile tendency of conflation with bloated, unmarked particulars — namely Eurocentric universalism — whereby the male is mistaken for the sexless, the white for raceless, the cis for the real, and so on. Absent such a universal, the abolition of class will remain a bourgeois fantasy, the abolition of race will remain a tacit white-supremacism, and the abolition of gender will remain a thinly veiled misogyny, even — especially — when prosecuted by avowed feminists themselves. (The absurd and reckless spectacle of so many self-proclaimed ‘gender abolitionists’’ campaign against trans women is proof enough of this). (0x0F)


    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2004), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.


  • Poetics of Control


    a review of Alexander R. Galloway, The Interface Effect (Polity, 2012)

    by Bradley J. Fest

    ~

    This summer marks the twenty-fifth anniversary of the original French publication of Gilles Deleuze’s seminal essay, “Postscript on the Societies of Control” (1990). A strikingly powerful short piece, “Postscript” remains, even at this late date, one of the most poignant, prescient, and concise diagnoses of life in the overdeveloped digital world of the twenty-first century and the “ultrarapid forms of apparently free-floating control that are taking over from the old disciplines.”[1] A stylistic departure from much of Deleuze’s other writing in its clarity and straightforwardness, the essay describes a general transformation from the modes of disciplinary power that Michel Foucault famously analyzed in Discipline and Punish (1975) to “societies of control.” For Deleuze, the late twentieth century is characterized by “a general breakdown of all sites of confinement—prisons, hospitals, factories, schools, the family.”[2] The institutions that were formerly able to strictly organize time and space through perpetual surveillance—thereby, according to Foucault, fabricating the modern individual subject—have become fluid and modular, “continually changing from one moment to the next.”[3] Individuals have become “dividuals,” “dissolv[ed] . . . into distributed networks of information.”[4]

    Over the past decade, media theorist Alexander R. Galloway has extensively and rigorously elaborated on Deleuze’s suggestive pronouncements, probably devoting more pages in print to thinking about the “Postscript” than has any other single writer.[5] Galloway’s most important work in this regard is his first book, Protocol: How Control Exists after Decentralization (2004). If the figure for the disciplinary society was Jeremy Bentham’s panopticon, a machine designed to induce a sense of permanent visibility in prisoners (and, by extension, the modern subject), Galloway argues that the distributed network, and particularly the distributed network we call the internet, is an apposite figure for control societies. Rhizomatic and flexible, distributed networks historically emerged as an alternative to hierarchical, rigid, centralized (and decentralized) networks. But far from being chaotic and unorganized, the protocols that organize our digital networks have created “the most highly controlled mass media hitherto known. . . . While control used to be a law of society, now it is more like a law of nature. Because of this, resisting control has become very challenging indeed.”[6] To put it another way: if in 1980 Deleuze and Félix Guattari complained that “we’re tired of trees,” Galloway and philosopher Eugene Thacker suggest that today “we’re tired of rhizomes.”[7]

    The imperative to think through the novel challenges presented by control societies and the urgent need to develop new methodologies for engaging the digital realities of the twenty-first century are at the heart of The Interface Effect (2012), the final volume in a trio of works Galloway calls Allegories of Control.[8] Guiding the various inquiries in the book is his provocative claim that “we do not yet have a critical or poetic language in which to represent the control society.”[9] This is because there is an “unrepresentability lurking within information aesthetics” (86). This claim for unrepresentability, that what occurs with digital media is not representation per se, is The Interface Effect’s most significant departure from previous media theory. Rather than rehearse familiar media ecologies, Galloway suggests that “the remediation argument (handed down from McLuhan and his followers including Kittler) is so full of holes that it is probably best to toss it wholesale” (20). The Interface Effect challenges thinking about mimesis that would place computers at the end of a line of increasingly complex modes of representation, a line extending from Plato, through Erich Auerbach, Marshall McLuhan, and Friedrich Kittler, and terminating in Richard Grusin, Jay David Bolter, and many others. Rather than continue to understand digital media in terms of remediation and representation, Galloway emphasizes the processes of computational media, suggesting that the inability to productively represent control societies stems from misunderstandings about how to critically analyze and engage with the basic materiality of computers.

    The book begins with an introduction polemically positioning Galloway’s own media theory directly against Lev Manovich’s field-defining book, The Language of New Media (2001). Contra Manovich, Galloway stresses that digital media are not objects but actions. Unlike cinema, which he calls an ontology because it attempts to bring some aspect of the phenomenal world nearer to the viewer—film, echoing Oedipa Maas’s famous phrase, “projects worlds” (11)—computers involve practices and effects (what Galloway calls an “ethic”) because they are “simply on a world . . . subjecting it to various forms of manipulation, preemption, modeling, and synthetic transformation. . . . The matter at hand is not that of coming to know a world, but rather that of how specific, abstract definitions are executed to form a world” (12, 13, 23). Or to take two other examples Galloway uses to positive effect: the difference can be understood as that between language, which describes and represents, encoding a world, versus calculus, which does or simulates doing something to the world; calculus is a “system of reasoning, an executable machine” (22). Though Galloway does more in Gaming: Essays on Algorithmic Culture (2006) to fully develop a way of analyzing computational media that privileges action over representation, The Interface Effect theoretically grounds this important distinction between mimesis and action, description and process.[10] Further, it constitutes a bold methodological step away from some of the dominant ways of thinking about digital media that simultaneously offers its readers new ways to connect media studies more firmly to politics.

    Further distinguishing himself from writers like Manovich, Galloway says that there has been a basic misunderstanding regarding media and mediation, and that the two systems are “violently unconnected” (13). Galloway demonstrates, in contrast to such thinkers as Kittler, that there is an old line of thinking about mediation that can be traced very far back and that is not dependent on thinking about media as exclusively tied to nineteenth and twentieth century communications technology:

    Doubtless certain Greek philosophers had negative views regarding hypomnesis. Yet Kittler is reckless to suggest that the Greeks had no theory of mediation. The Greeks indubitably had an intimate understanding of the physicality of transmission and message sending (Hermes). They differentiated between mediation as immanence and mediation as expression (Iris versus Hermes). They understood the mediation of poetry via the Muses and their techne. They understood the mediation of bodies through the “middle loving” Aphrodite. They even understood swarming and networked presence (in the incontinent mediating forms of the Eumenides who pursued Orestes in order to “process” him at the procès of Athena). Thus we need only look a little bit further to shed this rather vulgar, consumer-electronics view of media, and instead graduate into the deep history of media as modes of mediation. (15)

    Galloway’s point here is that the larger contemporary discussion of mediation that he is pursuing in The Interface Effect should not be restricted to merely the digital artifacts that have occasioned so much recent theoretical activity, and that there is an urgent need for deeper histories of mediation. Though the book appears to be primarily concerned with the twentieth and twenty-first centuries, this gesture toward the Greeks signals the important work of historicization that often distinguishes much of Galloway’s work. In “Love of the Middle” (2014), for example, which appears in the book Excommunication (2014), co-authored with Thacker and McKenzie Wark, Galloway fully develops a rigorous reading of Greek mediation, suggesting that in the Eumenides, or what the Romans called the Furies, resides a notable historical precursor for understanding the mediation of distributed networks.[11]

    In The Interface Effect these larger efforts at historicization allow Galloway to always understand “media as modes of mediation,” and consequently his big theoretical step involves claiming that “an interface is not a thing, an interface is an effect. It is always a process or a translation” (33). There are a variety of positive implications for the study of media understood as modes of mediation, as a study of interface effects. Principal amongst these are the rigorous methodological possibilities Galloway’s focus emphasizes.

    In this, methodologically and otherwise, Galloway’s work in The Interface Effect resembles and extends that of his teacher Fredric Jameson, particularly the kind of work found in The Political Unconscious (1981). Following Jameson’s emphasis on the “poetics of social forms,” Galloway’s goal is “not to reenact the interface, much less to ‘define’ it, but to identify the interface itself as historical. . . . This produces . . . a perspective on how cultural production and the socio-historical situation take form as they are interfaced together” (30). The Interface Effect firmly ties the cultural to the social, economic, historical, and political, finding in a variety of locations ways that interfaces function as allegories of control. “The social field itself constitutes a grand interface, an interface between subject and world, between surface and source, and between critique and the objects of criticism. Hence the interface is above all an allegorical device that will help us gain some perspective on culture in the age of information” (54). The power of looking at the interface as an allegorical device, as a “control allegory” (30), is demonstrated throughout the book’s relatively wide-ranging analyses of various interface effects.

    Chapter 1, “The Unworkable Interface,” historicizes some twentieth-century transformations of the interface, concisely summarizing a history of mediation by moving from Norman Rockwell’s “Triple Self-Portrait” (1960), through Mad Magazine’s satirization of Rockwell, to World of Warcraft (2004-2015). Galloway demonstrates, both here and in the last chapter, that viewed from the level of the interface, with all of its nondiegetic menus and icons and the way it erases the line between play and labor, World of Warcraft is a powerful control allegory: “it is not an avant-garde image, but, nevertheless, it firmly delivers an avant-garde lesson in politics” (44).[12] Further exemplifying the importance of historicizing interfaces, Chapter 2 continues to demonstrate the value of approaching interface effects allegorically. Galloway finds “a formal similarity between the structure of ideology and the structure of software” (55), arguing that software “is an allegorical figure for the way in which . . . political and social realities are ‘resolved’ today: not through oppression or false consciousness . . . but through the ruthless rule of code” (76). Chapter 4 extends such thinking toward a masterful reading of the various mediations at play in a show such as 24 (2001-2010, 2014), arguing that 24 is political not because of its content but “because the show embodies in its formal technique the essential grammar of the control society, dominated as it is by specific network and informatic logics” (119). In short, The Interface Effect continually demonstrates the potent critical tools that approaching mediation as allegory can provide, reaffirming the importance of a Jamesonian approach to cultural production in the digital age.

    Whether or not readers are convinced, however, by Galloway’s larger reworking of the field of digital media studies, by his emphasis on attending to contemporary cultural artifacts as allegories of control, or by his call in the book’s conclusion for a politics of “whatever being” probably depends upon what they make of his argument about the unrepresentability of today’s global networks in Chapter 3, “Are Some Things Unrepresentable?” His answer to the chapter’s question is, quite simply, “Yes.” Attempts to visualize the World Wide Web only result in incoherent repetition: “every map of the internet looks the same,” and as a result “no poetics is possible in this uniform aesthetic space” (85). In the face of such an aesthetic regime, what Jacques Rancière calls a “distribution of the sensible,”[13] he argues:

    The point is not so much to call for a return to cognitive mapping, which of course is of the highest importance, but to call for a poetics as such for this mysterious new machinic space. . . . Today’s systemics have no contrary. Algorithms and other logical structures are uniquely, and perhaps not surprisingly, monolithic in their historical development. There is one game in town: a positivistic dominant of reductive, systemic efficiency and expediency. Offering a counter-aesthetic in the face of such systematicity is the first step toward building a poetics for it, a language of representability adequate to it. (99)

    There are, to my mind, two ways of responding to Galloway’s call for a poetics as such in the face of the digital realities of contemporaneity.

    On the one hand, I am tempted to agree with him. Galloway is clearly signaling his debt to some of Jameson’s larger claims and is reviving the need “to think the impossible totality of the contemporary world system,” what Jameson once called the “technological” or “postmodern sublime.”[14] But Galloway is also signaling the importance of poesis for this activity. Not only is Jamesonian “cognitive mapping” necessary, but the totality of twenty-first century digital networks requires new imaginative activity, a counter-aesthetics commensurate with informatics. This is an immensely attractive position, at least to me, as it preserves a space for poetic, avant-garde activity, and indeed demands that, despite all evidence to the contrary, the imagination still has an important role to play in the face of societies of control. (In other words, there may be some “humanities” left in the “digital humanities.”[15]) Rather than suggesting that the imagination has been utterly foreclosed by the cultural logic of late capitalism—that we can no longer imagine any other world, that it is easier to imagine the end of the world than a better one—Galloway says that there must be a reinvestment in the imagination, in poetics as such, that will allow us to better represent, understand, and intervene in societies of control (though not necessarily to imagine a better world; more on this below). Given the present landscape, how could one not be attracted to such a position?

    On the other hand, Galloway’s argument hinges on his claim that such a poetics has not emerged and, as Patrick Jagoda and others have suggested, one might merely point out that such a claim is demonstrably false.[16] Though I hope I hardly need to list the significant cultural products across a range of media that have appeared over the last fifteen years that critically and complexly engage with the realities of control (e.g., The Wire [2002-08]), it is not radical to suggest that art engaged with pressing contemporary concerns has appeared and will continue to appear, that there are a variety of significant artists who are attempting to understand, represent, and cope with the distributed networks of contemporaneity. One could obviously suggest that Galloway’s argument is largely rhetorical, a device to get his readers to think about the different kinds of poesis that control societies, distributed networks, and interfaces call for, but such a blanket statement threatens to dismiss the vibrant activity going on all over the world commenting upon the contemporary situation. In other words, yes, we need a poetics of control, but why must the need for such a poetics hinge on the claim that there has not yet emerged “a critical or poetic language in which to represent the control society”? Is not Galloway’s own substantial, impressive, and important decade-long intellectual project proof that people have developed a critical language that is capable of representing the control society? I would certainly answer in the affirmative.

    There are some other rhetorical choices in the conclusion of The Interface Effect that, though compelling, deserve to be questioned, or at least highlighted. I am referring to Galloway’s penchant—following another one of his teachers at Duke, Michael Hardt—for invoking a Bartlebian politics, what Galloway calls “whatever being,” as an appropriate response to present problems.[17] In Hardt and Antonio Negri’s Empire (2000), in the face of the new realities of late capitalism—the multitude, the management of hybridities, the non-place of Empire, etc.—they propose that Herman Melville’s “Bartleby in his pure passivity and his refusal of any particulars presents us with a figure of generic being, being as such, being and nothing more. . . . This refusal certainly is the beginning of a liberatory politics, but it is only a beginning.”[18] Bartleby, with his famous response of “‘I would prefer not to,’”[19] has been frequently invoked by such substantial figures as Giorgio Agamben in the 1990s and Slavoj Žižek in the 2000s (following Hardt and Negri). Such thinkers have frequently theorized Bartleby’s passive negativity as a potentially radical political position, and perhaps the only one possible in the face of global economic realities.[20] (And indeed, it is easy enough to read, say, Occupy Wall Street as a Bartlebian political gesture.) Galloway’s response to the affective postfordist labor of digital networks, that “each and every day, anyone plugged into a network is performing hour after hour of unpaid micro labor” (136), is similarly to withdraw, to “demilitarize being. Stand down. Cease participating” (143).

    For Galloway, as for Hardt and Negri and so many others, “whatever being” is a response to the failures of twentieth-century emancipatory politics. He writes:

    We must stress that it is not the job of politics to invent a new world. On the contrary it is the job of politics to make all these new worlds irrelevant. . . . It is time now to subtract from this world, not add to it. The challenge today is not one of political or moral imagination, for this problem was solved ages ago—kill the despots, surpass capitalism, inclusion of the excluded, equality for all of humanity, end exploitation. The world does not need new ideas. The challenge is simply to realize what we already know to be true. (138-39)

    And thus the tension of The Interface Effect is between this call for withdrawal, to work with what there is, to exploit protocological possibility, etc., and the call for a poetics of control, a poesis capable of representing control societies, which to my mind implies imagination (and thus, inevitably, something different, if not new). If there is anything wanting about the book it is its lack of clarity about how these two critical projects are connected (or indeed, if they are perhaps the same thing!). Further, it is not always clear what exactly Galloway means by “poetics” nor how a need for a poetics corresponds to the book’s emphasis on understanding mediation as process over representation, action over objects. This lack of clarity may be due in part to the fact that, as Galloway indicates in his most recent work, Laruelle: Against the Digital (2014), there is some necessary theorization that he needs to do before he can adequately address the digital head-on. As he writes in the conclusion to that book: “The goal here has not been to elucidate, promote, or disparage contemporary digital technologies, but rather to draft a simple prolegomenon for future writing on digitality and philosophy.”[21] In other words, it seems like Allegories of Control, The Exploit: A Theory of Networks (2007), and Laruelle may constitute the groundwork for an even more ambitious confrontation with the digital, one where the kinds of tensions just noted might dissolve. As such, perhaps the reinvocation of a Bartlebian politics of withdrawal at the end of The Interface Effect is merely a kind of stop-gap, a place-holder before a more coherent poetics of control can emerge (as seems to be the case for the Hardt and Negri of Empire). Although contemporary theorists frequently invoke Bartleby, he remains a rather uninspiring figure.

    These criticisms aside, however, the conclusion of the larger project that is Allegories of Control reveals Galloway to be a consistently accessible and powerful guide to the control society and the digital networks of the twenty-first century. If the new directions in his recent work are any indication, and Laruelle is merely a prolegomenon to future projects, then we should perhaps not despair at all about the present lack of a critical language for representing control societies.

    _____

    Bradley J. Fest teaches literature at the University of Pittsburgh. At present he is working on The Nuclear Archive: American Literature Before and After the Bomb, a book investigating the relationship between nuclear and information technology in twentieth and twenty-first century American literature. He has published articles in boundary 2, Critical Quarterly, and Studies in the Novel; and his essays have appeared in David Foster Wallace and “The Long Thing” (2014) and The Silence of Fallout (2013). The Rocking Chair, his first collection of poems, is forthcoming from Blue Sketch Press. He blogs at The Hyperarchival Parallax.

    _____

    [1] Though best-known in the Anglophone world via the translation that appeared in 1992 in October as “Postscript on the Societies of Control,” the piece appears as “Postscript on Control Societies,” in Gilles Deleuze, Negotiations: 1972-1990, trans. Martin Joughin (New York: Columbia University Press, 1995), 178. For the original French see Gilles Deleuze, “Post-scriptum sur des sociétés de contrôle,” in Pourparlers, 1972-1990 (Paris: Les Éditions de Minuit, 1990), 240-47. The essay originally appeared as “Les sociétés de contrôle,” L’Autre Journal, no. 1 (May 1990). Further references are to the Negotiations version.

    [2] Ibid.

    [3] Ibid., 179.

    [4] Alexander R. Galloway, Protocol: How Control Exists after Decentralization (Cambridge, MA: MIT Press, 2004), 12n18.

    [5] In his most recent book, Galloway even goes so far as to ask about the “Postscript”: “Could it be that Deleuze’s most lasting legacy will consist of 2,300 words from 1990?” (Alexander R. Galloway, Laruelle: Against the Digital [Minneapolis: University of Minnesota Press, 2014], 96, emphases in original). For Andrew Culp’s review of Laruelle for The b2 Review, see “From the Decision to the Digital.”

    [6] Galloway, Protocol, 147.

    [7] Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, trans. Brian Massumi (Minneapolis: University of Minnesota Press, 1987), 15; and Alexander R. Galloway and Eugene Thacker, The Exploit: A Theory of Networks (Minneapolis: University of Minnesota Press, 2007), 153. For further discussions of networks, see Alexander R. Galloway, “Networks,” in Critical Terms for Media Studies, ed. W. J. T. Mitchell and Mark B. N. Hansen (Chicago: University of Chicago Press, 2010), 280-96.

    [8] The other books in the trilogy include Protocol and Alexander R. Galloway, Gaming: Essays on Algorithmic Culture (Minneapolis: University of Minnesota Press, 2006).

    [9] Alexander R. Galloway, The Interface Effect (Malden, MA: Polity, 2012), 98. Hereafter, this work is cited parenthetically.

    [10] See especially Galloway’s masterful first chapter of Gaming, “Gamic Action, Four Moments,” 1-38. To my mind, this is one of the best primers for critically thinking about videogames, and it does much to fundamentally ground the study of videogames in action (rather than, as had previously been the case, in either ludology or narratology).

    [11] See Alexander R. Galloway, “Love of the Middle,” in Excommunication: Three Inquiries in Media and Mediation, by Alexander R. Galloway, Eugene Thacker, and McKenzie Wark (Chicago: University of Chicago Press, 2014), 25-76.

    [12] This is also something he touched on in his remarkable reading of Donald Rumsfeld’s famous “unknown unknowns.” See Alexander R. Galloway, “Warcraft and Utopia,” Ctheory.net (16 February 2006). For a discussion of labor in World of Warcraft, see David Golumbia, “Games Without Play,” in “Play,” special issue, New Literary History 40, no. 1 (Winter 2009): 179-204.

    [13] See the following by Jacques Rancière: The Politics of Aesthetics: The Distribution of the Sensible, trans. Gabriel Rockhill (New York: Continuum, 2004), and “Are Some Things Unrepresentable?” in The Future of the Image, trans. Gregory Elliott (New York: Verso, 2007), 109-38.

    [14] Fredric Jameson, Postmodernism; or, the Cultural Logic of Late Capitalism (Durham, NC: Duke University Press, 1991), 38.

    [15] For Galloway’s take on the digital humanities more generally, see his “Everything Is Computational,” Los Angeles Review of Books (27 June 2013), and “The Cybernetic Hypothesis,” differences 25, no. 1 (Spring 2014): 107-31.

    [16] See Patrick Jagoda, introduction to Network Aesthetics (Chicago: University of Chicago Press, forthcoming 2015).

    [17] Galloway’s “whatever being” is derived from Giorgio Agamben, The Coming Community, trans. Michael Hardt (Minneapolis: University of Minnesota Press, 1993).

    [18] Michael Hardt and Antonio Negri, Empire (Cambridge, MA: Harvard University Press, 2000), 203, 204.

    [19] Herman Melville, “Bartleby, The Scrivener: A Story of Wall-street,” in Melville’s Short Novels, critical ed., ed. Dan McCall (New York: W. W. Norton, 2002), 10.

    [20] See Giorgio Agamben, “Bartleby, or On Contingency,” in Potentialities: Collected Essays in Philosophy, trans. and ed. Daniel Heller-Roazen (Stanford: Stanford University Press, 1999), 243-71; and see the following by Slavoj Žižek: Iraq: The Borrowed Kettle (New York: Verso, 2004), esp. 71-73, and The Parallax View (New York: Verso, 2006), esp. 381-85.

    [21] Galloway, Laruelle, 220.

  • Network Pessimism

    Network Pessimism

    By Alexander R. Galloway
    ~

    I’ve been thinking a lot about pessimism recently. Eugene Thacker has been deep in this material for some time already. In fact he has a new, lengthy manuscript on pessimism called Infinite Resignation, which is a bit of a departure from his other books in terms of tone and structure. I’ve read it and it’s excellent. Definitely “the worst” he’s ever written! Following the style of other treatises from the history of philosophical pessimism (Leopardi, Cioran, Schopenhauer, Kierkegaard, and others), the bulk of the book is written in short aphorisms. The language is very poetic, and some sections are driven by his own memories and meditations, all in an attempt to plumb the deepest, darkest corners of the worst the universe has to offer.

    Meanwhile, the worst can’t stay hidden. Pessimism has made it to prime time, to NPR, and even to right-wing media. Despite all this attention, Eugene seems to have little interest in showing his manuscript to publishers. A true pessimist! Not to worry, I’m sure the book will see the light of day eventually. Or should I say the dead of night? When it does, the book is sure to sadden, discourage, and generally worsen the lives of Thacker fans everywhere.

    Interestingly, pessimism also appears in a number of other authors and fields. I’m thinking, for instance, of critical race theory and the concept of Afro-pessimism. The work of Fred Moten and Frank B. Wilderson, III is particularly interesting in that regard. Likewise, queer theory has often wrestled with pessimism, be it the “no future” debates around reproductive futurity, or what Anna Conlan has simply labeled “homo-pessimism,” that is, the way in which the “persistent association of homosexuality with death and oppression contributes to a negative stereotype of LGBTQ lives as unhappy and unhealthy.”[1]

    In his review of my new book, Andrew Culp made reference to how some of this material has influenced me. I’ll be posting more on Moten and these other themes in the future, but let me here describe, in very general terms, how the concept of pessimism might apply to contemporary digital media.

    *

    A previous post was devoted to the reticular fallacy, defined as the false assumption that the erosion of hierarchical organization leads to an erosion of organization as such. Here I’d like to address the related question of reticular pessimism or, more simply, network pessimism.

    Network pessimism relies on two basic assumptions: (1) “everything is a network”; (2) “the best response to networks is more networks.”

    Who says everything is a network? Everyone, it seems. In philosophy, Bruno Latour: ontology is a network. In literary studies, Franco Moretti: Hamlet is a network. In the military, Donald Rumsfeld: the battlefield is a network. (But so too our enemies are networks: the terror network.) Art, architecture, managerial literature, computer science, neuroscience, and many other fields–all have shifted prominently in recent years toward a network model. Most important, however, is the contemporary economy and the mode of production. Today’s most advanced companies are essentially network companies. Google monetizes the shape of networks (in part via clustering algorithms). Facebook has rewritten subjectivity and social interaction along the lines of canalized and discretized network services. The list goes on and on. Thus I characterize the first assumption — “everything is a network” — as a kind of network fundamentalism. It claims that whatever exists in the world appears naturally in the form of a system, an ecology, an assemblage, in short, as a network.

    Ladies and gentlemen, behold the good news: postmodernism is definitively over! We have a new grand récit. As metanarrative, the network will guide us into a new Dark Age.

    If the first assumption expresses a positive dogma or creed, the second is more negative or nihilistic. The second assumption — that the best response to networks is more networks — is also evident in all manner of social and political life today. Eugene and I described this phenomenon at greater length in The Exploit, but consider a few different examples from contemporary debates… In military theory: network-centric warfare is the best response to terror networks. In Deleuzian philosophy: the rhizome is the best response to schizophrenic multiplicity. In autonomist Marxism: the multitude is the best response to empire. In the environmental movement: ecologies and systems are the best response to the systemic colonization of nature. In computer science: distributed architectures are the best response to bottlenecks in connectivity. In economics: heterogeneous “economies of scope” are the best response to the distributed nature of the “long tail.”

    To be sure, there are many sites today where networks still confront power centers. The point is not to deny the continuing existence of massified, centralized sovereignty. But at the same time it’s important to contextualize such confrontations within a larger ideological structure, one that inoculates the network form and recasts it as the exclusive site of liberation, deviation, political maturation, complex thinking, and indeed the very living of life itself.

    Why label this a pessimism? For the same reasons that queer theory and critical race theory are grappling with pessimism: Is alterity a death sentence? Is this as good as it gets? Is this all there is? Can we imagine a parallel universe different from this one? (Although the pro-pessimism camp would likely state it in the reverse: We must destabilize and annihilate all normative descriptions of the “good.” This world isn’t good, and hooray for that!)

    So what’s the problem? Why should we be concerned about network pessimism? Let me state clearly so there’s no misunderstanding: pessimism isn’t the problem here. Likewise, networks are not the problem. (Let no one label me “anti-network” or “anti-pessimism” — in fact I’m not even sure what either of those positions would mean.) The issue, as I see it, is that network pessimism deploys and sustains a specific dogma, confining both networks and pessimism to a single, narrow ideological position. It’s this narrow-mindedness that should be questioned.

    Specifically, I can see three basic problems with network pessimism: the problem of presentism, the problem of ideology, and the problem of the event.

    The problem of presentism refers to the way in which networks and network thinking are, by design, allergic to historicization. This exhibits itself in a number of different ways. Networks arrive on the scene at the proverbial “end of history” (and they do so precisely because they help end this history). Ecological and systems-oriented thinking, while admittedly always temporal by nature, gained popularity as a kind of solution to the problems of diachrony. Space and landscape take the place of time and history. As Fredric Jameson has noted, the “spatial turn” of postmodernity goes hand in hand with a denigration of the “temporal moment” of previous intellectual movements.

    Fritz Kahn, “Der Mensch als Industriepalast (Man as Industrial Palace)” (Stuttgart, 1926). Image source: NIH

    From Hegel’s history to Luhmann’s systems. From Einstein’s general relativity to Riemann’s complex surfaces. From phenomenology to assemblage theory. From the “time image” of cinema to the “database image” of the internet. From the old mantra always historicize to the new mantra always connect.

    During the age of clockwork, the universe was thought to be a huge mechanism, with the heavens rotating according to the music of the spheres. When the steam engine was the source of newfound power, the world suddenly became a dynamo of untold thermodynamic force. After full-fledged industrialization, the body became a factory. Technologies and infrastructures are seductive metaphors. So it’s no surprise (and no coincidence) that today, in the age of the network, a new template imprints itself on everything in sight. In other words, the assumption “everything is a network” gradually falls apart into a kind of tautology of presentism. “Everything right now is a network…because everything right now has been already defined as a network.”

    This leads to the problem of ideology. Again we’re faced with an existential challenge, because network technologies were largely invented as a non-ideological or extra-ideological structure. When writing Protocol, I interviewed some of the computer scientists responsible for the basic internet protocols, and most of them reported that they “have no ideology” when designing networks, that they are merely interested in “code that works” and “systems that are efficient and robust.” In sociology and philosophy of science, figures like Bruno Latour routinely describe their work as “post-critical,” merely focused on the direct mechanisms of network organization. Hence ideology as a problem to be forgotten or subsumed: networks are specifically conceived and designed as those things that are both non-ideological in their conception (we just want to “get things done”) and post-ideological in their architecture (in that they acknowledge and co-opt the very terms of previous ideological debates, things like heterogeneity, difference, agency, and subject formation).

    The problem of the event indicates a crisis for the very concept of events themselves. Here the work of Alain Badiou is invaluable. Network architectures are the perfect instantiation of what Badiou derisively labels “democratic materialism,” that is, a world in which there are “only bodies and languages.” In Badiou’s terms, if networks are the natural state of the situation and there is no way to deviate from nature, then there is no event, and hence no possibility for truth. Networks appear, then, as the consummate “being without event.”

    What could be worse? If networks are designed to accommodate massive levels of contingency — as with the famous Robustness Principle — then they are also exceptionally adept at warding off “uncontrollable” change wherever it might arise. If everything is a network, then there’s no escape, there’s no possibility for the event.

    Jameson writes as much in The Seeds of Time when he says that it is easier to imagine the end of the earth and the end of nature than it is to imagine the ends of capitalism. Network pessimism, in other words, is really a kind of network defeatism in that it makes networks the alpha and omega of our world. It’s easier to imagine the end of that world than it is to discard the network metaphor and imagine a kind of non-world in which networks are no longer dominant.

    In sum, we shouldn’t give in to network pessimism. We shouldn’t subscribe to the strong claim that everything is a network. (Nor should we subscribe to the softer claim that networks are merely the most common, popular, or natural architecture for today’s world.) Further, we shouldn’t think that networks are the best response to networks. Instead, we must ask the hard questions. What is the political fate of networks? Did heterogeneity and systematicity survive the twentieth century? If so, at what cost? What would a non-net look like? And does thinking have a future without the network as guide?

    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2004), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.

    _____

    Notes

    [1] Anna Conlan, “Representing Possibility: Mourning, Memorial, and Queer Museology,” in Gender, Sexuality and Museums, ed. Amy K. Levin (London: Routledge, 2010), 253-63.