boundary 2

  • Alexander R. Galloway — Big Bro (Review of Wendy Hui Kyong Chun, Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition)

    a review of Wendy Hui Kyong Chun, Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition (MIT Press, 2021)

    by Alexander R. Galloway

    I remember snickering when Chris Anderson announced “The End of Theory” in 2008. Writing in Wired magazine, Anderson claimed that the structure of knowledge had inverted. It wasn’t that models and principles revealed the facts of the world, but the reverse, that the data of the world spoke their truth unassisted. Given that data were already correlated, Anderson argued, what mattered was to extract existing structures of meaning, not to pursue some deeper cause. Anderson’s simple conclusion was that “correlation supersedes causation…correlation is enough.”

    This hypothesis — that correlation is enough — is the thorny little nexus at the heart of Wendy Chun’s new book, Discriminating Data. Chun’s topic is data analytics, a hard target that she tackles with technical sophistication and rhetorical flair. Focusing on data-driven tech like social media, search, consumer tracking, AI, and many other things, Chun sets out to exhume the prehistory of correlation, and to show that the new epistemology of correlation is not liberating at all, but instead a kind of curse recalling the worst ghosts of the modern age. As Chun concludes, even amid the precarious fluidity of hyper-capitalism, power operates through likeness, similarity, and correlated identity.

    While interleaved with a number of divergent polemics throughout, the book focuses on four main themes: correlation, discrimination, authentication, and recognition. Chun deals with these four as general problems in society and culture, but also interestingly as specific scientific techniques. For instance correlation has a particular mathematical meaning, as well as a philosophical one. Discrimination is a social pathology but it’s also integral to discrete rationality. I appreciated Chun’s attention to details large and small; she’s writing about big ideas — essence, identity, love and hate, what does it mean to live together? — but she’s also engaging directly with statistics, probability, clustering algorithms, and all the minutiae of data science.

    In crude terms, Chun rejects the — how best to call it — the “anarcho-materialist” turn in theory, typified by someone like Gilles Deleuze, where disciplinary power gave way to distributed rhizomes, schizophrenic subjects, and irrepressible lines of flight. Chun’s theory of power isn’t so much about tessellated tapestries of desiring machines as it is the more strictly structuralist concerns of norm and discipline, sovereign and subject, dominant and subdominant. Big tech is the mechanism through which power operates today, Chun argues. And today’s power is racist, misogynist, repressive, and exclusionary. Power doesn’t incite desire so much as stifle and discipline it. In other words George Orwell’s old grey-state villain, Big Brother, never vanished. He just migrated into a new villain, Big Bro, embodied by tech billionaires like Mark Zuckerberg or Larry Page.

    But what are the origins of this new kind of data-driven power? The reader learns that correlation and homophily, or “the notion that birds of a feather naturally flock together” (23), not only subtend contemporary social media platforms like Facebook, but were in fact originally developed by eugenicists like Francis Galton and Karl Pearson. “British eugenicists developed correlation and linear regression” (59), Chun notes dryly, before reminding us that these two techniques are at the core of today’s data science. “When correlation works, it does so by making the present and future coincide with a highly curated past” (52). Or as she puts it insightfully elsewhere, data science doesn’t so much anticipate the future as predict the past.

    If correlation (pairing two or more pieces of data) is the first step of this new epistemological regime, it is quickly followed by some additional steps. After correlation comes discrimination, where correlated data are separated from other data (and indeed internally separated from themselves). This entails the introduction of a norm. Discriminated data are not simply data that have been paired, but measurements plotted along an axis of comparison. One data point may fall within a normal distribution, while another strays outside the norm within a zone of anomaly. Here Chun focuses on “homophily” (love of the same), writing that homophily “introduces normativity within a supposedly nonnormative system” (96).
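
    To make the first two steps concrete, here is a minimal, hypothetical sketch in Python. It is my illustration, not an example from Chun’s book: the variables, the fabricated data, and the two-sigma cutoff are all invented. It shows how paired data yield a correlation coefficient and how a fitted norm then discriminates conforming points from points flagged as anomalous:

    ```python
    # Illustrative sketch only: "correlation" pairs two series and measures their
    # linear relation; "discrimination" fits a norm and scores deviation from it.
    # All names, data, and thresholds here are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)

    # A "highly curated past": two paired measurements for a population.
    income = rng.normal(50_000, 10_000, size=1_000)
    spending = 0.6 * income + rng.normal(0, 5_000, size=1_000)

    # Step 1, correlation: Pearson's r, the Galton/Pearson technique Chun historicizes.
    r = np.corrcoef(income, spending)[0, 1]

    # Step 2, discrimination: fit a least-squares regression line as the "norm,"
    # then place each point along an axis of comparison (its standardized residual).
    slope, intercept = np.polyfit(income, spending, deg=1)
    residuals = spending - (slope * income + intercept)
    z = (residuals - residuals.mean()) / residuals.std()

    # Points beyond an arbitrary cutoff stray outside the norm into a "zone of anomaly."
    anomalous = np.abs(z) > 2.0
    print(f"correlation r = {r:.2f}; flagged as anomalous: {anomalous.sum()} of {len(z)}")
    ```

    Whatever such a model then predicts for a new case is only ever a projection of the correlations already present in the curated past it was fit to, which is the sense in which, for Chun, these systems predict the past rather than anticipate the future.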

    The third and fourth moments in Chun’s structural condition, tagged as “authenticity” and “recognition,” complete the narrative. Once groups are defined via discrimination, they are authenticated as a positive group identity, then ultimately recognized, or we could say self-recognized, by reversing the outward-facing discriminatory force into an inward-facing act of identification. It’s a complex libidinal economy that Chun patiently elaborates over four long chapters, linking these structural moments to specific technologies and techniques such as Bayes’ theorem, clustering algorithms, and facial recognition technology.

    A number of potential paths emerge in the wake of Chun’s work on correlation, which we will briefly mention in passing. One path would be toward Shane Denson’s recent volume, Discorrelated Images, on the loss of correlated experience in media aesthetics. Another would be to collide Chun’s critique of correlation in data science with Quentin Meillassoux’s critique of correlation in philosophy, notwithstanding the significant differences between their two projects.

    Correlation, discrimination, authentication, and recognition are the manifest contents of the book as it unfolds page by page. At the same time Chun puts forward a few meta arguments that span the text as a whole. The first is about difference and the second is about history. In both, Chun reveals herself as a metaphysician and moralist of the highest order.

    First Chun picks up a refrain familiar to feminism and anti-racist theory, that of erasure, forgetting, and ignorance. Marginalized people are erased from the archive; women are silenced; a subject’s embodiment is ignored. Chun offers an appealing catchphrase for this operation, “hopeful ignorance.” Many people in power hope that by ignoring difference they can overcome it. Or as Chun puts it, they “assume that the best way to fight abuse and oppression is by ignoring difference and discrimination” (2). Indeed this posture has been central to political liberalism for a long time, as seen for instance in John Rawls’ derivation of justice via a “veil of ignorance.” For Chun the attempt to find an unmarked category of subjectivity — through that frequently contested pronoun “we” — will perforce erase and exclude those structurally denied access to the universal. “[John Perry] Barlow’s ‘we’ erased so many people,” Chun notes in dismay. “McLuhan’s ‘we’ excludes most of humanity” (9, 15). This is the primary crime for Chun, forgetting or ignoring the racialized and gendered body. (In her last book, Updating to Remain the Same, Chun reprinted a parody of a well-known New Yorker cartoon bearing the caption “On the Internet, nobody knows you’re a dog.” The posture of ignorance, of “nobody knowing,” was thoroughly critiqued by Chun in that book, even as it continues to be defended by liberals).

    Yet if the first crime against difference is to forget the mark, the second crime is to enforce it, to mince and chop people into segregated groups. After all, data is designed to discriminate, as Chun takes the better part of her book to elaborate. These are engines of difference and it’s no coincidence that Charles Babbage called his early calculating machine a “Difference Engine.” Data is designed to segregate, to cluster, to group, to split and mark people into micro identities. We might label this “bad” difference. Bad difference is when the naturally occurring multiplicity of the world is canalized into clans and cliques, leveraged for the machinations of power rather than the real experience of people.

    To complete the triad, Chun has proposed a kind of “good” difference. For Chun authentic life is rooted in difference, often found through marginalized experience. Her muse is “a world that resonates with and in difference” (3). She writes about “the needs and concerns of black women” (49). She attends to “those whom the archive seeks to forget” (237). Good difference is intersectional. Good difference attends to identity politics and the complexities of collective experience.

    Bad, bad, good — this is a triad, but not a dialectical one. Begin with 1) the bad tech posture of ignoring difference; followed by 2) the worse tech posture of specifying difference in granular detail; contrasted with 3) a good life that “resonates with and in difference.” I say “not dialectical” because the triad documents difference changing position rather than the position of difference changing (to paraphrase Catherine Malabou from her book on Changing Difference). Is bad difference resolved by good difference? How to tell the difference? For this reason I suggest we consider Discriminating Data as a moral tale — although I suspect Chun would balk at that adjective — because everything hinges on a difference between the good and the bad.

    Chun’s argument about good and bad difference is related to an argument about history, revealed through what she terms the “Transgressive Hypothesis.” I was captivated by this section of the book. It connects to a number of debates happening today in both theory and culture at large. Her argument about history has two distinct waves, and, following the contradictory convolutions of history, the second wave reverses and inverts the first.

    Loosely inspired by Michel Foucault’s Repressive Hypothesis, Chun’s Transgressive Hypothesis initially describes a shift in society and culture roughly coinciding with the Baby Boom generation in the late Twentieth Century. Let’s call it the 1968 mindset. Reacting to the oppressions of patriarchy, the grey-state threats of centralized bureaucracy, and the totalitarian menace of “Nazi eugenics and Stalinism,” liberation was found through “‘authentic transgression’” via “individualism and rebellion” (76). This was the time of the alternative, of the outsider, of the nonconformist, of the anti-authoritarian, the time of “thinking different.” Here being “alt” meant being left, albeit a new kind of left.

    Chun summons a familiar reference to make her point: the Apple Macintosh advertisement from 1984 directed by Ridley Scott, in which a scary Big Brother is dethroned by a colorful lady jogger brandishing a sledgehammer. “Resist, resist, resist,” was how Chun put the mantra. “To transgress…was to be free” (76). Join the resistance, unplug, blow your mind on red pills. Indeed the existential choice from The Matrix — blue pill for a life of slavery mollified by ignorance, red pill for enlightenment and militancy tempered by mortal danger — acts as a refrain throughout Chun’s book. In sum the Transgressive Hypothesis “equated democracy with nonnormative structures and behaviors” (76). To live a good life was to transgress.

    But this all changed in 1984, or thereabouts. Chun describes a “reverse hegemony” — a lovely phrase that she uses only twice — where “complaints against the ‘mainstream’ have become ‘mainstreamed’” (242). Power operates through reverse hegemony, she claims: “The point is never to be a ‘normie’ even as you form a norm” (34). These are the consequences of the rise of neoliberalism, fake corporate multiculturalism, Ronald Reagan and Margaret Thatcher but even more so Bill Clinton and Tony Blair. Think post-Fordism and postmodernism. Think long tails and the multiplicity of the digital economy. Think woke-washing at the CIA and Spike Lee shilling cryptocurrency. Think Hypernormalization, New Spirit of Capitalism, Theory of the Young Girl, To Live and Think Like Pigs. Complaints against the mainstream have become mainstreamed. And if power today has shifted “left,” then — Reverse Hegemony Brain go brrr — resistance to power shifts “right.” A generation ago the Q Shaman would have been a left-wing nut nattering about the Kennedy assassination. But today he’s a right-wing nut (alas still nattering about the Kennedy assassination).

    “Red pill toxicity” (29) is how Chun characterizes the responses to this new topsy-turvy world of reverse hegemony. (To be sure, she’s only the latest critic weighing in on the history of the present; other well-known accounts include Angela Nagle’s 2017 book Kill All Normies, and Mark Fisher’s notorious 2013 essay “Exiting the Vampire Castle.”) And if libs, hippies, and anarchists had become the new dominant, the election of Donald Trump showed that “populism, paranoia, polarization” (77) could also reemerge as a kind of throwback to the worst political ideologies of the Twentieth Century. With Trump the revolutions of history — ironically, unstoppably — return to where they began, in “the totalitarian world view” (77).

    In other words these self-styled rebels never actually disrupted anything, according to Chun. At best they used disruption as a kind of ideological distraction for the same kinds of disciplinary management structures that have existed since time immemorial. And if Foucault showed that nineteenth-century repression also entailed an incitement to discourse, Chun describes how twentieth-century transgression also entailed a novel form of management. Before it was “you thought you were repressed but in fact you’re endlessly sublating and expressing.” Now it’s “you thought you were a rebel but disruption is a standard tactic of the Professional Managerial Class.” Or as Jacques Lacan said in response to some young agitators in his seminar, vous voulez un maître, vous l’aurez. Slavoj Žižek’s rendering, slightly embellished, best captures the gist: “As hysterics, you demand a new master. You will get it!”

    I doubt Chun would embrace the word “hysteric,” a term indelibly marked by misogyny, but I wish she would, since hysteria is crucial to her Transgressive Hypothesis. In psychoanalysis, the hysteric is the one who refuses authority, endlessly and irrationally. And bless them for that; we need more hysterics in these dark times. Yet the lesson from Lacan and Žižek is not so much that the hysteric will conjure up a new master out of thin air. In a certain sense, the lesson is the reverse, that the Big Other doesn’t exist, that Big Brother himself is a kind of hysteric, that power is the very power that refuses power.

    This position makes sense, but not completely. As a recovering Deleuzian, I am indelibly marked by a kind of antinomian political theory that defines power as already heterogeneous, unlawful, multiple, anarchic, and material. However I am also persuaded by Chun’s more classical posture, where power is a question of sovereign fiat, homogeneity, the central and the singular, the violence of the arche, which works through enclosure, normalization, and discipline. Faced with this type of power, Chun’s conclusion is, if I can compress a hefty book into a single writ, that difference will save us from normalization. In other words, while Chun is critical of the Transgressive Hypothesis, she ends up favoring the Big-Brother theory of power, where authentic alternatives escape repressive norms.

    I’ll admit it’s a seductive story. Who doesn’t want to believe in outsiders and heroes winning against oppressive villains? And the story is especially appropriate for the themes of Discriminating Data: data science of course entails norms and deviations; but also, in a less obvious way, data science inherits the old anxieties of skeptical empiricism, where the desire to make a general claim is always undercut by an inability to ground generality.

    Yet I suspect her political posture relies a bit too heavily on the first half of the Transgressive Hypothesis, the 1984 narrative of difference contra norm, even as she acknowledges the second half of the narrative where difference became a revanchist weapon for big tech (to say nothing of difference as a bona fide management style). This leads to some interesting inconsistencies. For instance Chun notes that Apple’s 1984 hammer thrower is a white woman disrupting an audience of white men. But she doesn’t say much else about the hammer thrower being a woman, or about the rainbow flag that ends the commercial. The Transgressive Hypothesis might be the quintessential tech bro narrative but it’s also the narrative of feminism, queerness, and the new left more generally. Chun avoids claiming that feminism failed; but she’s also savvy enough to avoid saying that it succeeded. And if Sadie Plant once wrote that “cybernetics is feminization,” for Chun it’s not so clear. According to Chun the cybernetic age of computers, data, and ubiquitous networks still orients around structures of normalization: masculine, white, straight, affluent and able-bodied. Resistant to such regimes of normativity, Chun must nevertheless invent a way to resist those who were resisting normativity.

    Regardless, for Chun the conclusion is clear: these hysterics got their new master. If not immediately they got it eventually, via the advent of Web 2.0 and the new kind of data-centric capitalism invented in the early 2000s. Correlation isn’t enough — and that’s the reason why. Correlation means the forming of a general relation, if only the most minimal generality of two paired data points. And, worse, correlation’s generality will always derive from past power and organization rather than from a reimagining of the present. Hence correlation for Chun is a type of structural pessimism, in that it will necessarily erase and exclude those denied access to the general relation.

    Writing with narrative poignancy and an attention to the ideological conditions of everyday life, Chun highlights alternative relations that could, she hopes, replace the pessimism of correlation. Such alternatives might take the form of a “potential history” or a “critical fabulation,” phrases borrowed from Ariella Azoulay and Saidiya Hartman, respectively. For Azoulay potential history means to “‘give an account of diverse worlds that persist’”; for Hartman, critical fabulation means “to see beyond numbers and sources” (79). Though a slim offering covering only a few pages, these references to Azoulay and Hartman nevertheless indicate an appealing alternative for Chun, and she ends her book where it began, with an eloquent call to acknowledge “a world that resonates with and in difference.”

    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2004), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), Laruelle: Against the Digital (University of Minnesota, 2014), and most recently, Uncomputable: Play and Politics in the Long Digital Age (Verso, 2021).


  • Michael Miller — Seeing Ourselves, Loving Our Captors: Mark Jarzombek’s Digital Stockholm Syndrome in the Post-Ontological Age

    a review of Mark Jarzombek, Digital Stockholm Syndrome in the Post-Ontological Age (University of Minnesota Press Forerunners Series, 2016)

    by Michael Miller

    ~

    All existence is Beta, basically. A ceaseless codependent improvement unto death, but then death is not even the end. Nothing will be finalized. There is no end, no closure. The search will outlive us forever

    — Joshua Cohen, Book of Numbers

    Being a (in)human is to be a beta tester

    — Mark Jarzombek, Digital Stockholm Syndrome in the Post-Ontological Age

    Too many people have access to your state of mind

    — Renata Adler, Speedboat

    Whenever I read through Vilém Flusser’s vast body of work and encounter, in print no less, one of the core concepts of his thought—which is that “human communication is unnatural” (2002, 5)––I find it nearly impossible to shake the feeling that the late Czech-Brazilian thinker must have derived some kind of preternatural pleasure from insisting on the ironic gesture’s repetition. Flusser’s rather grim view that “there is no possible form of communication that can communicate concrete experience to others” (2016, 23) leads him to declare that the intersubjective dimension of communication implies inevitably the existence of a society which is, in his eyes, itself an unnatural institution. One can find all over Flusser’s work traces of his life-long attempt to think through the full philosophical implications of European nihilism, and evidence of this intellectual engagement can be readily found in his theories of communication.

    One of Flusser’s key ideas that draws me in is his notion that human communication affords us the ability to “forget the meaningless context in which we are completely alone and incommunicado, that is, the world in which we are condemned to solitary confinement and death: the world of ‘nature’” (2002, 4). In order to help stave off the inexorable tide of nature’s muted nothingness, Flusser suggests that humans communicate by storing memories, externalized thoughts whose eventual transmission binds two or more people into a system of meaning. Only when an intersubjective system of communication like writing or speech is established between people does the purpose of our enduring commitment to communication become clear: we communicate in order “to become immortal within others” (2016, 31). Flusser’s playful positing of the ironic paradox inherent in the improbability of communication—that communication is unnatural to the human but it is also “so incredibly rich despite its limitations” (26)––enacts its own impossibility. In a representatively ironic sense, Flusser’s point is that all we are able to fully understand is our inability to understand fully.

    As Flusser’s theory of communication can be viewed as his response to the twentieth-century’s shifting technical-medial milieu, his ideas about communication and technics eventually led him to conclude that “the original intention of producing the apparatus, namely, to serve the interests of freedom, has turned on itself…In a way, the terms human and apparatus are reversed, and human beings operate as a function of the apparatus. A man gives an apparatus instructions that the apparatus has instructed him to give” (2011, 73).[1] Flusser’s skeptical perspective toward the alleged affordances of human mastery over technology is most assuredly not the view that Apple or Google would prefer you harbor (not-so-secretly). Any cursory glance at Wired or the technology blog at Inside Higher Ed, to pick two low-hanging examples, would yield a radically different perspective than the one Flusser puts forth in his work. In fact, Flusser writes, “objects meant to be media may obstruct communication” (2016, 45). If media objects like the technical apparatuses of today actually obstruct communication, then why are we so often led to believe that they facilitate it? And to shift registers just slightly, if everything is said to be an object of some kind—even technical apparatuses––then cannot one be permitted to claim daily communion with all kinds of objects? What happens when an object—and an object as obsolete as a book, no less—speaks to us? Will we still heed its call?

    ***

    Speaking in its expanded capacity as neither narrator nor focalized character, the book as literary object addresses us in a direct and antagonistic fashion in the opening line to Joshua Cohen’s 2015 novel Book of Numbers. “If you’re reading this on a screen, fuck off. I’ll only talk if I’m gripped with both hands” (5), the book-object warns. As Cohen’s narrative tells the story of a struggling writer named Joshua Cohen (whose backstory corresponds mostly to the historical-biographical author Joshua Cohen) who is contracted to ghostwrite the memoir of another Joshua Cohen (who is the CEO of a massive Google-type company named Tetration), the novel’s middle section provides an “unedited” transcript of the conversation between the two Cohens in which the CEO recounts his upbringing and tremendous business success in and around the Bay Area from the late 1970s up to 2013 of the narrative’s present. The novel’s Silicon Valley setting, nominal and characterological doubling, and structural narrative coupling of the two Cohens’ lives make it all but impossible to distinguish the personal histories of Cohen-the-CEO and Cohen-the-narrator from the cultural history of the development of personal computing and networked information technologies. The history of one Joshua Cohen––or all Joshua Cohens––is indistinguishable from the history of intrusive computational/digital media. “I had access to stuff I shouldn’t have had access to, but then Principal shouldn’t have had such access to me—cameras, mics,” Cohen-the-narrator laments. In other words, as Cohen-the-narrator ghostwrites another Cohen’s memoir within the context of the broad history of personal computing and the emergence of algorithmic governance and surveillance, the novel invites us to consider how the history of an individual––or every individual, it does not really matter––is also nothing more or anything less than the surveilled history of its data usage, which is always written by someone or something else, the ever-present Not-Me (who just might have the same name as me). The Self is nothing but a networked repository of information to be mined in the future.

    While the novel’s opening line addresses its hypothetical reader directly, its relatively benign warning fixes reader and text in a relation of rancor. The object speaks![2] And yet tech-savvy twenty-first century readers are not the only ones who seem to be fed up with books; books too are fed up with us, and perhaps rightly so. In an age when objects are said to speak vibrantly and withdraw infinitely; processes like human cognition are considered to be operative in complex technical-computational systems; and when the only excuse to preserve the category of “subjective experience” we are able to muster is that it affords us the ability “to grasp how networks technically distribute and disperse agency,” it would seem at first glance that the second-person addressee of the novel’s opening line would intuitively have to be a reading, thinking subject.[3] Yet this is the very same reading subject who has been urged by Cohen’s novel to politely “fuck off” if he or she has chosen to read the text on a screen. And though the text does not completely dismiss its readers who still prefer “paper of pulp, covers of board and cloth” (5), a slight change of preposition in its title points exactly to what the book fears most of all: Book as Numbers. The book-object speaks, but only to offer an ominous admonition: neither the book nor its readers ought to be reducible to computable numbers.

    The transduction of literary language into digital bits eliminates the need for a phenomenological, reading subject, and it suggests too that literature––or even just language in a general sense––and humans in particular are ontologically reducible to data objects that can be “read” and subsequently “interpreted” by computational algorithms. As Cohen’s novel collapses the distinction between author, narrator, character, and medium, its narrator observes that “the only record of my one life would be this record of another’s” (9). But in this instance, the record of one’s (or another’s) life is merely the history of how personal computational technologies have effaced the phenomenological subject. How have we arrived at the theoretically permissible premise that “People matter, but they don’t occupy a privileged subject position distinct from everything else in the world” (Huehls 20)? How might the “turn toward ontology” in theory/philosophy be viewed as contributing to our present condition?

    ***

    Mark Jarzombek’s Digital Stockholm Syndrome in the Post-Ontological Age (2016) provides a brief, yet stylistically ironic and incisive interrogation into how recent iterations of post- or inhumanist theory have found a strange bedfellow in the rhetorical boosterism that accompanies the alleged affordances of digital technologies and big data. Despite the differences between these two seemingly unrelated discourses, they both share a particularly critical or diminished conception of the anthro- in “anthropocentrism” that borrows liberally from the postulates of the “ontological turn” in theory/philosophy (Rosenberg n.p.). While the parallels between these discourses are not made explicit in Jarzombek’s book, Digital Stockholm Syndrome asks us to consider how a shared commitment to an ontologically diminished view of “the human” that galvanizes both technological determinism’s anti-humanism and post- or inhumanist theory has found its common expression in recent philosophies of ontology. In other words, the problem Digital Stockholm Syndrome takes up is this: what kind of theory of ontology, Being, and to a lesser extent, subjectivity, appeals equally to contemporary philosophers and Silicon Valley tech-gurus? Jarzombek gestures toward such an inquiry early on: “What is this new ontology?” he asks, and “What were the historical situations that produced it? And how do we adjust to the realities of the new Self?” (x).

    A curious set of related philosophical commitments united by their efforts to “de-center” and occasionally even eject “anthropocentrism” from the critical conversation constitutes some of the realities swirling around Jarzombek’s “new Self.”[4] Digital Stockholm Syndrome provocatively locates the conceptual legibility of these philosophical realities squarely within an explicitly algorithmic-computational historical milieu. By inviting such a comparison, Jarzombek’s book encourages us to contemplate how contemporary ontological thought might mediate our understanding of the historical and philosophical parallels that bind the tradition of inhumanist philosophical thinking and the rhetoric of twenty-first century digital media.[5]

    In much the same way that Alexander Galloway has argued for a conceptual confluence that exceeds the contingencies of coincidence between “the structure of ontological systems and the structure of the most highly evolved technologies of post-Fordist capitalism” (347), Digital Stockholm Syndrome argues similarly that today’s world is “designed from the micro/molecular level to fuse the algorithmic with the ontological” (italics in original, x).[6] We now understand Being as the informatic/algorithmic byproduct of what ubiquitous computational technologies have gathered and subsequently fed back to us. Our personal histories––or simply the records of our data use (and its subsequent use of us)––comprise what Jarzombek calls our “ontic exhaust…or what data experts call our data exhaust…[which] is meticulously scrutinized, packaged, formatted, processed, sold, and resold to come back to us in the form of entertainment, social media, apps, health insurance, clickbait, data contracts, and the like” (x).

    The empty second-person pronoun is placed on equal ontological footing with, and perhaps even defined by, its credit score, medical records, 4G data usage, Facebook likes, and threefold of its Tweets. “The purpose of these ‘devices,’” Jarzombek writes, “is to produce, magnify, and expose our ontic exhaust” (25). We give our ontic exhaust away for free every time we log into Facebook because it, in return, feeds back to us the only sense of “self” we are able to identify as “me.”[7] If “who we are cannot be traced from the human side of the equation, much less than the analytic side. ‘I’ am untraceable” (31), then why do techno-determinists and contemporary oracles of ontology operate otherwise? What accounts for their shared commitment to formalizing ontology? Why must the Self be tracked and accounted for like a map or a ledger?

    As this “new Self,” which Jarzombek calls the “Being-Global” (2), travels around the world and checks its bank statement in Paris or tags a photo of a Facebook friend in Berlin while sitting in a cafe in Amsterdam, it leaks ontic exhaust everywhere it goes. While the hoovering up of ontic exhaust by GPS and commercial satellites “make[s] us global,” it also inadvertently redefines Being as a question of “positioning/depositioning” (1). For Jarzombek, the question of today’s ontology is not so much a matter of asking “what exists?” but of asking “where is it and how can it be found?” Instead of the human who attempts to locate and understand Being, now Being finds us, but only as long as we allow ourselves to be located.

    Today’s ontological thinking, Jarzombek points out, is not really interested in asking questions about Being––it is too “anthropocentric.”[8] Ontology in the twenty-first century attempts to locate Being by gathering data, keeping track, tracking changes, taking inventory, making lists, listing litanies, crunching the numbers, and searching the database. “Can I search for it on Google?” is now the most important question for ontological thought in the twenty-first century.

    Ontological thinking––which today means ontological accounting, or finding ways to account for the ontologically actuarial––is today’s philosophical equivalent of best practices for data management, except there is no difference between one’s data and one’s Self. In any case, any ontological difference that might have once stubbornly separated you from data about you no longer applies. Digital Stockholm Syndrome identifies this shift with the formulation: “From ontology to trackology” (71).[9] The philosophical shift that has allowed data about the Self to become the ontological equivalent to the Self emerges out of what Jarzombek calls an “animated ontology.”

    In this “animated ontology,” “subject position and object position are indistinguishable…The entire system of humanity is microprocessed through the grid of sequestered empiricism” (31, 29). Jarzombek is careful to distinguish his “animated ontology” from the recently rebooted romanticisms which merely turn their objects into vibrant subjects. He notes that “the irony is that whereas the subject (the ‘I’) remains relatively stable in its ability to self-affirm (the lingering by-product of the psychologizing of the modern Self), objectivity (as in the social sciences) collapses into the illusions produced by the global cyclone of the informatic industry” (28).[10] By devising tricky new ways to flatten ontology (all of which are made via po-faced linguistic fiat), “the human and its (dis/re-)embodied computational signifiers are on equal footing” (32). I do not define my data, but my data define me.

    ***

    Digital Stockholm Syndrome asserts that what exists in today’s ontological systems––systems both philosophical and computational––is what can be tracked and stored as data. Jarzombek sums up our situation with another pithy formulation: “algorithmic modeling + global positioning + human scaling + computational speed = data geopolitics” (12). While the universalization of tracking technologies defines the “global” in Jarzombek’s Being-Global, it also provides us with another way to understand the humanities’ enthusiasm for GIS and other digital mapping platforms as institutional-disciplinary expressions of a “bio-chemo-techno-spiritual-corporate environment that feeds the Human its sense-of-Self” (5).

    Mark Jarzombek, Digital Stockholm Syndrome in the Post-Ontological Age

    One wonders if the incessant cultural and political reminders regarding the humanities’ waning relevance have moved humanists to reconsider the very basic intellectual terms of their broad disciplinary pursuits. Why is it humanities scholars who are in some cases most visibly leading the charge to overturn many decades of humanist thought? Has the internalization of this depleted conception of the human reshaped the basic premises of humanities scholarship, Digital Stockholm Syndrome wonders? What would it even mean to pursue a “humanities” purged of “the human?” And is it fair to wonder if this impoverished image of humanity has trickled down into the formation of new (sub)disciplines?[11]

    In a late chapter titled “Onto-Paranoia,” Jarzombek finally arrives at a working definition of Digital Stockholm Syndrome: data visualization. For Jarzombek, data-visualization “has been devised by the architects of the digital world” to ease the existential torture—or “onto-torture”—that is produced by Security Threats (59). Security threats are threatening because they remind us that “security is there to obscure the fact that [its] whole purpose is to produce insecurity” (59). When a system fails, or when a problem occurs, we need to be conscious of the fact that the system has not really failed; “it means that the system is working” (61).[12] The Social, the Other, the Not-Me—these are all variations of the same security threat, which is just another way of defining “indeterminacy” (66). So if everything is working the way it should, we rarely consider the full implications of indeterminacy—both technical and philosophical—because to do so might make us paranoid, or worse: we would have to recognize ourselves as (in)human subjects.

    Data-visualizations, however, provide a soothing salve which we can (self-)apply in order to ease the pain of our “onto-torture.” Visualizing data and creating maps of our data use provide us with a useful and also pleasurable tool with which we locate ourselves in the era of “post-ontology.”[13] “We experiment with and develop data visualization and collection tools that allow us to highlight urban phenomena. Our methods borrow from the traditions of science and design by using spatial analytics to expose patterns and communicating those results, through design, to new audiences,” we are told by one data-visualization project (http://civicdatadesignlab.org/).  As we affirm our existence every time we travel around the globe and self-map our location, we silently make our geo-data available for those who care to sift through it and turn it into art or profit.

    “It is a paradox that our self-aestheticizing performance as subjects…feeds into our ever more precise (self-)identification as knowable and predictable (in)human-digital objects,” Jarzombek writes. Yet we ought not to spend too much time contemplating the historical and philosophical complexities that have helped create this paradoxical situation. Perhaps it is best we do not reach the conclusion that mapping the Self as an object on digital platforms increases the creeping unease that arises from the realization that we are mappable, hackable, predictable, digital objects––that our data are us. We could, instead, celebrate how our data (which we are and which is us) is helping to change the world. “’Big data’ will not change the world unless it is collected and synthesized into tools that have a public benefit,” the same data visualization project announces on its website’s homepage.

    While it is true that I may be a little paranoid, I have finally rested easy after having read Digital Stockholm Syndrome because I now know that my data/I are going to good use.[14] Like me, maybe you find comfort in knowing that your existence is nothing more than a few pixels in someone else’s data visualization.

    _____

    Michael Miller is a doctoral candidate in the Department of English at Rice University. His work has appeared or is forthcoming in symplokē and the Journal of Film and Video.

    _____

    Notes

    [1] I am reminded of a similar argument advanced by Tung-Hui Hu in his A Prehistory of the Cloud (2016). Encapsulating Flusser’s spirit of healthy skepticism toward technical apparatuses, the situation that both Flusser and Hu fear is one in which “the technology has produced the means of its own interpretation” (xixx).

    [2] It is not my aim to wade explicitly into discussions regarding “object-oriented ontology” or other related philosophical developments. For the purposes of this essay, however, Andrew Cole’s critique of OOO as a “new occasionalism” will be useful. “’New occasionalism,’” Cole writes, “is the idea that when we speak of things, we put them into contact with one another and ourselves” (112). In other words, the speaking of objects makes them objectively real, though this is only possible when everything is considered to be an object. The question, though, is not about what is or is not an object, but is rather what it means to be. For related arguments regarding the relation between OOO/speculative realism/new materialism and mysticism, see Sheldon (2016), Altieri (2016), Wolfendale (2014), O’Gorman (2013), and to a lesser extent Colebrook (2013).

    [3] For the full set of references here, see Bennett (2010), Hayles (2014 and 2016), and Hansen (2015).

    [4] While I concede that no thinker of “post-humanism” worth her philosophical salt would admit the possibility or even desirability of purging the sins of “correlationism” from critical thought altogether, I cannot help but view such occasional posturing with a skeptical eye. For example, I find convincing Barbara Herrnstein-Smith’s recent essay “Scientizing the Humanities: Shifts, Collisions, Negotiations,” in which she compares the drive in contemporary critical theory to displace “the human” from humanistic inquiry to the impossible and equally incomprehensible task of overcoming the “‘astro’-centrism of astronomy or the biocentrism of biology” (359).

    [5] In “A Modest Proposal for the Inhuman,” Julian Murphet identifies four interrelated strands of post- or inhumanist thought that combine a kind of metaphysical speculation with a full-blown demolition of traditional ontology’s conceptual foundations. They are: “(1) cosmic nihilism, (2) molecular bio-plasticity, (3) technical accelerationism, and (4) animality. These sometimes overlapping trends are severally engaged in the mortification of humankind’s stubborn pretensions to mastery over the domain of the intelligible and the knowable in an era of sentient machines, routine genetic modification, looming ecological disaster, and irrefutable evidence that we share 99 percent of our biological information with chimpanzees” (653).

    [6] The full quotation from Galloway’s essay reads: “Why, within the current renaissance of research in continental philosophy, is there a coincidence between the structure of ontological systems and the structure of the most highly evolved technologies of post-Fordist capitalism? [….] Why, in short, is there a coincidence between today’s ontologies and the software of big business?” (347). Digital Stockholm Syndrome begins by accepting Galloway’s provocations as descriptive instead of speculative. We do not necessarily wonder in 2017 if “there is a coincidence between today’s ontologies and the software of big business”; we now wonder instead how such a confluence came to be.

    [7] Wendy Hui Kyong Chun makes a similar point in her 2016 monograph Updating to Remain the Same: Habitual New Media. She writes, “If users now ‘curate’ their lives, it is because their bodies have become archives” (x-xi). While there is not ample space here to discuss the full theoretical implications of her book, Chun’s discussion of the inherently gendered dimension to confession, self-curation as self-exposition, and online privacy as something that only the unexposed deserve (hence the need for preemptive confession and self-exposition on the internet) in digital/social media networks is tremendously relevant to Jarzombek’s Digital Stockholm Syndrome, as both texts consider the Self as a set of mutable and “marketable/governable/hackable categories” (Jarzombek 26) that are collected without our knowledge and subsequently fed back to the data/media user in the form of its own packaged and unique identity. For recent similar variations of this argument, see Simanowski (2017) and McNeill (2012).

    I also think Chun’s book offers a helpful tool for thinking through recent confessional memoirs or instances of “auto-theory” (fictionalized or not) like Maggie Nelson’s The Argonauts (2015), Sheila Heti’s How Should a Person Be (2010), Marie Calloway’s what purpose did i serve in your life (2013), and perhaps to a lesser degree Tao Lin’s Richard Yates (2010), Taipei (2013), Natasha Stagg’s Surveys, and Ben Lerner’s Leaving the Atocha Station (2011) and 10:04 (2014). The extent to which these texts’ varied formal-aesthetic techniques can be said to be motivated by political aims is very much up for debate, but nonetheless, I think it is fair to say that many of them revel in the reveal. That is to say, via confession or self-exposition, many of these novels enact the allegedly performative subversion of political power by documenting their protagonists’ and/or narrators’ certain social/political acts of transgression. Chun notes, however, that this strategy of self-revealing performs “resistance as a form of showing off and scandalizing, which thrives off moral outrage. This resistance also mimics power by out-spying, monitoring, watching, and bringing to light, that is, doxing” (151). The term “autotheory,” which has been applied to Nelson’s The Argonauts in particular, takes on a very different meaning in this context. “Autotheory” can be considered as a theory of the self, or a self-theorization, or perhaps even the idea that personal experience is itself a kind of theory might apply here, too. I wonder, though, how its meaning would change if the prefix “auto” was understood within a media-theoretical framework not as “self” but as “automation.” “Autotheory” becomes, then, an automatization of theory or theoretical thinking, but also a theoretical automatization; or more to the point: what if “autotheory” describes instead a theorization of the Self or experience wherein “the self” is only legible as the product of automated computational-algorithmic processes?

    [8] Echoing the critiques of “correlationism” or “anthropocentrism” or what have you, Jarzombek declares that “The age of anthrocentrism is over” (32).

    [9] Whatever notion of (self)identity the Self might find to be most palatable today, Jarzombek argues, is inevitably mediated via global satellites. “The intermediaries are the satellites hovering above the planet. They are what make us global–what make me global” (1), and as such, they represent the “civilianization” of military technologies (4). What I am trying to suggest is that the concepts and categories of self-identity we work with today are derived from the informatic feedback we receive from long-standing military technologies.

    [10] Here Jarzombek seems to be suggesting that the “object” in the “objectivity” of “the social sciences” has been carelessly conflated with the “object” in “object-oriented” philosophy. The prioritization of all things “objective” in both philosophy and science has inadvertently produced this semantic and conceptual slippage. Data objects about the Self exist, and thus by existing, they determine what is objective about the Self. In this new formulation, what is objective about the Self or subject, in other words, is what can be verified as information about the self. In Indexing It All: The Subject in the Age of Documentation, Information, and Data (2014), Ronald Day argues that these global tracking technologies supplant traditional ontology’s “ideas or concepts of our human manner of being” and have in the process “subsume[d] and subvert[ed] the former roles of personal judgment and critique in personal and social beings and politics” (1). While such technologies might be said to obliterate “traditional” notions of subjectivity, judgment, and critique, Day demonstrates how this simultaneous feeding-forward and feeding back of data-about-the-Self represents the return of autoaffection, though in his formulation self-presence is defined as information or data-about-the-self whose authenticity is produced when it is fact-checked against a biographical database (3)—self-presence is a presencing of data-about-the-Self. This is all to say that the Self’s informational “aboutness”–its representation in and as data–comes to stand in for the Self’s identity, which can only be comprehended as “authentic” in its limited metaphysical capacity as a general informatic or documented “aboutness.”

    [11] Flusser is again instructive on this point, albeit in his own idiosyncratic way. Drawing attention to the strange unnatural plurality in the term “humanities,” he writes, “The American term humanities appropriately describes the essence of these disciplines. It underscores that the human being is an unnatural animal” (2002, 3). The plurality of “humanities,” as opposed to the singular “humanity,” constitutes for Flusser a disciplinary admission that not only the category of “the human” is unnatural, but that the study of such an unnatural thing is itself unnatural, as well. I think it is also worth pointing out that in the context of Flusser’s observation, we might begin to situate the rise of “the supplemental humanities” as an attempt to redefine the value of a humanities education. The spatial humanities, the energy humanities, medical humanities, the digital humanities, etc.—it is not difficult to see how these disciplinary off-shoots consider themselves as supplements to whatever it is they think “the humanities” are up to; regardless, their institutional injection into traditional humanistic discourse will undoubtedly improve both (sub)disciplines, with the tacit acknowledgment being that the latter has just a little more to gain from the former in terms of skills, technical know-how, and data management. Many thanks to Aaron Jaffe for bringing this point to my attention.

    [12] In his essay “Algorithmic Catastrophe—The Revenge of Contingency,” Yuk Hui notes that “the anticipation of catastrophe becomes a design principle” (125). Drawing from the work of Bernard Stiegler, Hui shows how the pharmacological dimension of “technics, which aims to overcome contingency, also generates accidents” (127). And so “as the anticipation of catastrophe becomes a design principle…it no longer plays the role it did with the laws of nature” (132). Simply put, by placing algorithmic catastrophe on par with a failure of reason qua the operations of mathematics, Hui demonstrates how “algorithms are open to contingency” only insofar as “contingency is equivalent to a causality, which can be logically and technically deduced” (136). To take Jarzombek’s example of the failing computer or what have you, while the blue screen of death might be understood to represent the faithful execution of its programmed commands, we should also keep in mind that the obverse of Jarzombek’s scenario would force us to come to grips with how the philosophical implications of the “shit happens” logic that underpins contingency-as-(absent) causality “accompanies and normalizes speculative aesthetics” (139).

    [13] I am reminded here of one of the six theses from the manifesto “What would a floating sheep map?,” jointly written by the Floating Sheep Collective, which is a cohort of geography professors. The fifth thesis reads: “Map or be mapped. But not everything can (or should) be mapped.” The Floating Sheep Collective raises in this section crucially important questions regarding ownership of data with regard to marginalized communities. Because it is not always clear when to map and when not to map, they decide that “with mapping squarely at the center of power struggles, perhaps it’s better that not everything be mapped.” If mapping technologies operate as ontological radars—the Self’s data points help point the Self towards its own ontological location in and as data—then it is fair to say that such operations are only philosophically coherent when they are understood to be framed within the parameters outlined by recent iterations of ontological thinking and its concomitant theoretical deflation of the rich conceptual make-up that constitutes “the human.” You can map the human’s data points, but only insofar as you buy into the idea that points of data map the human. See http://manifesto.floatingsheep.org/.

    [14]Mind/paranoia: they are the same word!”(Jarzombek 71).

    _____

    Works Cited

    • Adler, Renata. Speedboat. New York Review of Books Press, 1976.
    • Altieri, Charles. “Are We Being Materialist Yet?” symplokē 24.1-2 (2016): 241-57.
    • Calloway, Marie. what purpose did i serve in your life. Tyrant Books, 2013.
    • Chun, Wendy Hui Kyong. Updating to Remain the Same: Habitual New Media. The MIT Press, 2016.
    • Cohen, Joshua. Book of Numbers. Random House, 2015.
    • Cole, Andrew. “The Call of Things: A Critique of Object-Oriented Ontologies.” minnesota review 80 (2013): 106-118.
    • Colebrook, Claire. “Hypo-Hyper-Hapto-Neuro-Mysticism.” Parrhesia 18 (2013).
    • Day, Ronald. Indexing It All: The Subject in the Age of Documentation, Information, and Data. The MIT Press, 2014.
    • Floating Sheep Collective. “What would a floating sheep map?” http://manifesto.floatingsheep.org/.
    • Flusser, Vilém. Into the Universe of Technical Images. Translated by Nancy Ann Roth. University of Minnesota Press, 2011.
    • –––. The Surprising Phenomenon of Human Communication. 1975. Metaflux, 2016.
    • –––. Writings, edited by Andreas Ströhl. Translated by Erik Eisel. University of Minnesota Press, 2002.
    • Galloway, Alexander R. “The Poverty of Philosophy: Realism and Post-Fordism.” Critical Inquiry 39.2 (2013): 347-366.
    • Hansen, Mark B.N. Feed Forward: On the Future of Twenty-First Century Media. Duke University Press, 2015.
    • Hayles, N. Katherine. “Cognition Everywhere: The Rise of the Cognitive Nonconscious and the Costs of Consciousness.” New Literary History 45.2 (2014): 199-220.
    • –––. “The Cognitive Nonconscious: Enlarging the Mind of the Humanities.” Critical Inquiry 42 (Summer 2016): 783-808.
    • Herrnstein-Smith, Barbara. “Scientizing the Humanities: Shifts, Collisions, Negotiations.” Common Knowledge 22.3 (2016): 353-72.
    • Heti, Sheila. How Should a Person Be? Picador, 2010.
    • Hu, Tung-Hui. A Prehistory of the Cloud. The MIT Press, 2016.
    • Huehls, Mitchum. After Critique: Twenty-First Century Fiction in a Neoliberal Age. Oxford University Press, 2016.
    • Hui, Yuk. “Algorithmic Catastrophe—The Revenge of Contingency.” Parrhesia 23 (2015): 122-43.
    • Jarzombek, Mark. Digital Stockholm Syndrome in the Post-Ontological Age. University of Minnesota Press, 2016.
    • Lin, Tao. Richard Yates. Melville House, 2010.
    • –––. Taipei. Vintage, 2013.
    • McNeill, Laurie. “There Is No ‘I’ in Network: Social Networking Sites and Posthuman Auto/Biography.” Biography 35.1 (2012): 65-82.
    • Murphet, Julian. “A Modest Proposal for the Inhuman.” Modernism/Modernity 23.3 (2016): 651-70.
    • Nelson, Maggie. The Argonauts. Graywolf P, 2015.
    • O’Gorman, Marcel. “Speculative Realism in Chains: A Love Story.” Angelaki: Journal of the Theoretical Humanities 18.1 (2013): 31-43.
    • Rosenberg, Jordana. “The Molecularization of Sexuality: On Some Primitivisms of the Present.” Theory and Event 17.2 (2014): n.p.
    • Sheldon, Rebekah. “Dark Correlationism: Mysticism, Magic, and the New Realisms.” symplokē 24.1-2 (2016): 137-53.
    • Simanowski, Roberto. “Instant Selves: Algorithmic Autobiographies on Social Network Sites.” New German Critique 44.1 (2017): 205-216.
    • Stagg, Natasha. Surveys. Semiotext(e), 2016.
    • Wolfendale, Peter. Object Oriented Philosophy: The Noumenon’s New Clothes. Urbanomic, 2014.
  • Alexander R. Galloway — Brometheanism

    by Alexander R. Galloway
    ~

    In recent months I’ve remained quiet about the speculative turn, mostly because I’m reluctant to rekindle the “Internet war” that broke out a couple of years ago, largely on blogs but also in various published papers. And while I’ve taught accelerationism in my recent graduate seminars, I opted for radio silence when accelerationism first appeared on the scene through the Accelerationist Manifesto, followed later by the book Inventing the Future. Truth is I have mixed feelings about accelerationism. Part of me wants to send “comradely greetings” to a team of well-meaning fellow Marxists and leave it at that. Lord knows the left needs to stick together. Likewise there’s little I can add that people like Steven Shaviro and McKenzie Wark haven’t already written, and articulated much better than I could. But at the same time a number of difficulties remain that are increasingly hard to overlook. To begin I might simply echo Wark’s original assessment of the Accelerationist Manifesto: two cheers for accelerationism, but only two!

    What’s good about accelerationism? And what’s bad? I love the ambition and scope. Certainly the accelerationists’ willingness to challenge leftist orthodoxies is refreshing. I also like how the accelerationists demand that we take technology and science seriously. And I agree that there are important tactical uses of accelerationist or otherwise hypertrophic interventions (Eugene Thacker and I have referred to them as exploits). Still I see accelerationism essentially as a tactic mistaken for a strategy. At the same time this kind of accelerationism is precisely what dot-com entrepreneurs want to see from the left. Further, and most important, accelerationism is paternalistic and thus suffers from the problems of elitism and ultimately reactionary politics.

    Let me explain. I’ll talk first about Srnicek and Williams’ 2015 book Inventing the Future, and then address one of the central themes fueling the accelerationist juggernaut, Prometheanism. Well written, easy to read, and exhaustively footnoted, Inventing the Future is ostensibly a follow-up to the Accelerationist Manifesto, although the themes of the two texts are different and they almost never mention accelerationism in the book. (Srnicek in particular is nothing if not shrewd and agile: present at the christening of #A, we also find him on the masthead of the speculative realist reader, and today nosing in on “platform studies.” Wherever he alights next will doubtless portend future significance.) The book is vaguely similar to Michael Hardt and Antonio Negri’s Declaration from 2012 in that it tries to assess the current condition of the left while also providing a set of specific steps to be taken for the future. And while the accelerationists have garnered significantly more attention of late, mostly because their work feels so fresh and new, Hardt and Negri’s is the better book (and interestingly Srnicek and Williams never cite them).

    Inventing the Future

    Inventing the Future has essentially two themes. The first consists in a series of denunciations of what they call “folk politics” defined in terms of Occupy, the Zapatistas, Tiqqun, localism, and direct democracy, ostensibly in favor of a new “hegemony” of planetary social democracy (also known as Leninism). The second theme concerns an anti-work polemic focused on the universal basic income (UBI) and shortening the work week. Indeed even as these two authors collaborate and mix their thoughts, there seem to be two books mixed together into one. This produces an interesting irony: while the first half of the book unabashedly denigrates anarchism in favor of Leninism, the second half of the book focuses on that very theme (anti-work) that has defined anarchist theory since the split in the First International, if not since time immemorial.

    What’s so wrong with “folk politics”? There are a few ways to answer this question. First the accelerationists are clearly frustrated by the failures of the left, and rightly so, a left debilitated by “apathy, melancholy and defeat” (5). There’s a demographic explanation as well. This is the cri de coeur of a younger generation seeking to move beyond what are seen as the sclerotic failures of postmodern theory with all of its “culturalist” baggage (which too often is a codeword for punks, queers, women, and people of color — more on that in a moment).

    Folk politics includes “the fetishization of local spaces, immediate actions, transient gestures, and particularisms of all kinds” (3); it privileges the “small-scale, the authentic, the traditional and the natural” (10). The following virtues help fill out the definition:

    immediacy…tactics…inherently fleeting…the past…the voluntarist and spontaneous…the small…withdrawal or exit…the everyday…feeling…the particular…the ethical…the suffering of the particular and the authenticity of the local (10-11)

    Wow, that’s a lot of good stuff to get rid of. Still, they don’t quit there, targeting horizontalism of various kinds. Radical democracy is in the crosshairs too. Anti-representational politics is out as well. All the “from below” movements, from the undercommons to the black bloc, anything that smacks of “anarchism, council communism, libertarian communism and autonomism” (26) — it’s all under indictment. This unceasing polemic culminates in the book’s most potent sentence, if not also its most ridiculous, where the authors dismiss all of the following movements in one fell swoop:

    Occupy, Spain’s 15M, student occupations, left communist insurrectionists like Tiqqun and the Invisible Committee, most forms of horizontalism, the Zapatistas…localism…slow-food (11-12)

    That scoops up a lot of people. And the reader is left to quibble over whatever principle of decision might group all these disparate factions together. But the larger point is clear: for Srnicek and Williams folk politics emerged because of an outdated Left (i.e. the abject failures of social democracy and communism) (16-), and an outmaneuvered Left (i.e. the rampant successes of neoliberalism) (19-). Thus their goal is to update the left with a new ideology, and overhaul its infrastructure, allowing it to modernize and scale up to the level of the planet.

    In the second half of the book, particularly in chapters 5 and 6, Srnicek and Williams elaborate their vision for anti-work and post-work. This hinges on the concept of full automation, and they provocatively assert that “the tendencies towards automation and the replacement of human labor should be enthusiastically accelerated” (109). Yet the details are scant. What kind of tech are we talking about? We get some vague references at the outset to “Open-source designs, copyleft creativity, and 3D printing” (1), then again later to “data collection (radio-frequency identification, big data)” and so on (110). But one thing this book does not provide is an examination of the technology of modern capitalism. (Srnicek’s Platform Capitalism is an improvement thematically but not substantively: he provides an analysis of political economy, but no tech audit.) Thus Inventing the Future has a sort of Wizard of Oz problem at its core. It’s not clear what clever devices are behind the curtain; we’re just supposed to assume that they will be sufficiently communistical if we all believe hard enough.

    At the same time the authors come across as rather tone deaf on the question of labor, bemoaning above all “the misery of not being exploited,” as if exploitation were some grand prize awarded to the subaltern. Further, they fail to address adequately the two key challenges of automation, both of which have been widely discussed in political and economic theory: first that automation eliminates jobs for people who very much want and need them, leading to surplus populations, unemployment, migration, and entrenched poverty; and second that automation transforms the organic composition of labor through deskilling and proletarianization, the offshoring of menial labor, and the introduction of technical and specialist labor required to design, build, operate, and repair those seemingly “automagical” machines. In other words, under automation some people work less, but everyone works differently. Automation reduces work for some, but changes (and in fact often increases) work for others. Marx’s analysis of machines in Capital is useful here, where he addresses all of these various tendencies, from the elimination of labor and the increase in labor, to the transformation of the organic composition of labor — the last point being the most significant. (And while machines might help lubricate and increase the productive forces — not a bad thing — it’s clear that machines are absolutely not revolutionary actors for Marx. Optimistic interpretations gleaned from the Grundrisse notwithstanding, Marx defines machines essentially as large batteries for value. I have yet to find any evidence that today’s machines are any different.)

    So the devil is in the details: what kind of technology are we talking about? But perhaps more importantly, if you get rid of the “folk,” aren’t you also getting rid of the people? Srnicek and Williams try to address this in chapter 8, although I’m more convinced by Hardt and Negri’s “multitude,” Harney and Moten’s “undercommons,” or even formulations like “the part of no part” or the “inoperative community” found scattered across a variety of other texts. By the end Srnicek and Williams out themselves as reticular pessimists: let’s not specify “the proper form of organization” (162), let’s just let it happen naturally in an “ecology of organizations” (163). The irony being that we’re back to square one, and these anti-folk evangelists are hippy ecologists after all. (The reference to function over form [169] appears as a weak afterthought to help rationalize their decision, but it re-introduces the problem of techno-fetishism, this time a fetishism of the function.)

    To summarize, accelerationism presents a rich spectrum of problems. The first stems from the notion that technology/automation will save us, replete with vague references to “the latest technological developments” unencumbered by any real details. Second is the question of capitalism itself. Despite the authors’ Marxist tendencies, it’s not at all clear that accelerationism is anti-capitalist. In fact accelerationism would be better described as a form of post-capitalism, what Zizek likes to mock as “capitalism with a friendly face.” What is post-capitalism exactly? More capitalism? A modified form of capitalism? For this reason it becomes difficult to untangle accelerationism from the most visionary dreams of the business elite. Isn’t this exactly what dot-com entrepreneurs are calling for? Isn’t the avant-garde of acceleration taking place right now in Silicon Valley? This leads to a third point: accelerationism is a tactic mistaken for a strategy. Certainly accelerationist or otherwise hypertrophic methods are useful in a provisional, local, which is to say tactical way. But accelerationism is, in my view, naïve about how capitalism works at a strategic level. Capitalism wants nothing more than to accelerate. Adding to the acceleration will help capitalism, not hinder it. Capitalism is this accelerating force, from primitive accumulation on up to today. (Accelerationists don’t dispute this; they simply disagree on the moral status of capitalism.) Fourth and finally is the most important problem revealed by accelerationism, the problem of elitism and reactionary politics. Given unequal technological development, those who accelerate will necessarily do so on the backs of others who are forced to proletarianize. Thus accelerationists are faced with a kind of “internal colonialism” problem, meaning there must be a distinction made between those who accelerate and those who facilitate acceleration through their very bodies. We already know who suffers most under unequal technological acceleration, and it’s not young white male academics living in England. Thus their skepticism toward the “folk” is all too often a paternalistic skepticism toward the wants and needs of the generic population. Hence the need for accelerationists to talk glowingly about things like “engineering consent.” It’s hard to see where this actually leads. Or more to the point, who leads: if not Leninists then who, technocrats? Philosopher kings?

    *

    Accelerationism gains much inspiration from the philosophy of Prometheanism. If accelerationism provides a theory of political economy, Prometheanism supplies a theory of the subject. Yet it’s not always clear what people mean by this term. In a recent lecture titled “Prometheanism and Rationalism” Peter Wolfendale defines Prometheanism in such general terms that it becomes a synonym for any number of things: history and historical change; being against fatalism and messianism; being against the aristocracy; being against Fukuyama; being for feminism; the UBI and post-capitalism; the Enlightenment and secularism; deductive logic; overcoming (perceived) natural limits; technology; “automation” (which as I’ve just indicated is the most problematic concept of them all). Even very modest and narrow definitions of Prometheanism — technology for humans to overcome natural limits — present their own problems and wind up largely deflating the sloganeering of it all. “Okay so both the hydrogen bomb and the contraceptive pill are equally Promethean? So then who adjudicates their potential uses?” And we’re left with Prometheanism as the latest YAM philosophy (Yet Another Morality).

    Still, Prometheanism has a particular vision for itself and it’s worth describing the high points. I can think of six specific qualities. (1) Prometheanism defines itself as posthuman or otherwise antihuman. (2) Prometheanism is an attempt to transcend the bounds of physical limitation. (3) Prometheanism promotes freedom, as in for instance the freedom to change the body through hormone therapy. (4) Prometheanism sees itself as politically progressive. (5) Prometheanism sees itself as being technologically savvy. (6) Prometheanism proposes to offer technical solutions to real problems.

    But is any of this true? Interestingly Bernard Stiegler provided an answer to some of these questions already in 1994, and it’s worth returning to his book from that year, Technics and Time, 1: The Fault of Epimetheus, to fill out a conversation that has, thus far, been mostly one-sided. Stiegler’s book is long and complicated, and touches on many different things including technology and the increased rationalization of life, by way of some of Stiegler’s key influences, among them Gilbert Simondon, André Leroi-Gourhan, and Bertrand Gille. Let me focus however on the second part of the book, where Stiegler examines the two brothers Epimetheus and Prometheus.

    A myth about powers and qualities, the fable of Epimetheus and Prometheus is recounted by the sophist Protagoras starting at line 320c in Plato’s dialogue of that name. In Stiegler’s retelling of the story, we begin with Epimetheus, who, via a “principle of compensation” governed by notions of difference and equilibrium, hands out powers and qualities to all the animals of the Earth. For instance extra speed might be endowed to the gazelle, but only by way of balanced compensation given to another animal, say a boost in strength bestowed upon the lion. Seemingly diligent in his duties, Epimetheus nevertheless tires before the job is complete, shirking his task before arriving at humankind, who is left relatively naked without a special power or quality of its own. To compensate humankind, Prometheus absconds with “the gift of skill in the arts and fire” — “τὴν ἔντεχνον σοφίαν σὺν πυρί” — captured from Athena and Hephaestus, respectively, conferring these two gifts upon humanity (Plato, “Protagoras,” 321d).

    In this way humans are defined first not via technical supplement but through an elemental fault — this is Stiegler’s lingering poststructuralism — the fault of Epimetheus. Epimetheus forgets about us, leaving us until the end, and hence “Humans only occur through their being forgotten; they only appear in disappearing” (188). But it’s more than that: a fault followed by a theft, and hence a twin fault. Humanity is the “fruit of a double fault–an act of forgetting [by Epimetheus], then of theft [by Prometheus]” (188). Humans are thus a forgotten afterthought, remedied afterward by a lucky forethought.

    “Afterthought” and “forethought” — Stiegler means these terms quite literally. Who is Epimetheus? And who is Prometheus? Greek names often have etymological if not allegorical significance, as is the case here. Both names share the root “-metheus,” cognate with manthánō [μανθάνω], which means learning, study, or cultivation of knowledge. Hence a mathitís [μαθητής] is a learner or a student. (And in fact in a very literal sense “mathematics” simply refers to the things that one learns, not to arithmetic or geometry per se.) The two brothers are thus both varieties of learners, both varieties of thinkers. The key is which variety. The key is the Epi- and the Pro-.

    “Epi carries the character of the accidentally and artificial factuality of something happening, arriving, a primordial ‘passibility,’” Stiegler explains. “Epimetheia means heritage. Heritage is always epimathesis. Epimetheia would also mean then tradition-originating in a fault that is always already there and that is nothing but technicity” (206-207). Hence Epimetheus means something like “learning on the basis of,” “thinking after,” or, more simply, “afterthought” or “hindsight.” This is why Epimetheus forgets, why he is at fault, why he acts foolishly, because these are all the things that generate hindsight.

    Prometheus on the other hand is “foresight” or “fore-thought.” If Epimetheus means “thinking and learning on the basis of,” Prometheus means something more like “thinking and learning in anticipation of.” In this way, Prometheus comes to stand in for cleverness (but also theft), ingenuity, and thus technics as a whole.

    But is that all? Is the lesson simply to restore Epimetheus to his position next to Prometheus? To remember the Epimethean omission along with the Promethean endowment? In fact the old Greek myth isn’t quite finished, and, after initially overlooking the ending, Stiegler eventually broaches the closing section on Hermes. For even after benefiting from its Promethean supplement, humanity remains incomplete. Specifically, the gods notice that Man has a tendency toward war and political strife. Thus Hermes is tasked to implant a kind of socio-political virtue, supplementing humanity with “the qualities of respect for others [αἰδώ] and a sense of justice [δίκη]” (Plato 322c). In other words, a second supplement is necessary, only this time a supplement not rooted in the identitarian logic of heterogeneous qualities. “Another tekhnē is required,” writes Stiegler, “a tekhnē that is no longer paradoxically…the privilege of specialists” (201). This point about specialists is key — all you Leninists take note — because on Zeus’s command Hermes delivers respect and justice generically and equally across all persons, not via the “principle of compensation” based on difference and equilibrium used previously by Epimetheus to divvy up the powers and qualities of the animals. Thus while some people may have a talent for the piano, and others might be gifted in some other way, justice and respect are bestowed equally to all.

    This is why politics is always a question of the “hermeneutic community,” that is, the ad hoc translation and interpretation of real political dynamics; it comes from Hermes (201). At the same time politics also means “the community of those who have no community” because there is no adjudication of heterogeneous qualities, no truth or law stipulated in advance, except for the very “conditions” of the political (those “hermeneutic conditions,” namely αἰδώ and δίκη, respect and justice).

    To summarize, the Promethean story has three moments, not one, and all three ought to be given full voice:

    1. Default of origin (being forgotten about by Epimetheus/Hindsight)
    2. Gaining technicity (fire and skills from Prometheus/Foresight)
    3. Revealing the generic (“respect for others and a sense of justice” from Hermes)

    This strikes me as a much better way to think about Prometheanism overall, better than the narrow definition of “using technology to overcome natural limits.” Recognizing all three moments, Prometheanism (if we can still call it that) entails not just technological advancement, but also insufficiency and failure, along with a political consciousness rooted in generic humanity.

    And now would be a good time to pass the baton over to the Xenofeminists, who make much better use of accelerationism than its original authors do. The Xenofeminist manifesto provides a more holistic picture of what might simply be called a “universalism from below” — yes, that very folk politics that Srnicek and Williams seek to suppress — doing justice not only to Prometheus, but to Epimetheus and Hermes as well:

    Xenofeminism understands that the viability of emancipatory abolitionist projects — the abolition of class, gender, and race — hinges on a profound reworking of the universal. The universal must be grasped as generic, which is to say, intersectional. Intersectionality is not the morcellation of collectives into a static fuzz of cross-referenced identities, but a political orientation that slices through every particular, refusing the crass pigeonholing of bodies. This is not a universal that can be imposed from above, but built from the bottom up — or, better, laterally, opening new lines of transit across an uneven landscape. This non-absolute, generic universality must guard against the facile tendency of conflation with bloated, unmarked particulars — namely Eurocentric universalism — whereby the male is mistaken for the sexless, the white for raceless, the cis for the real, and so on. Absent such a universal, the abolition of class will remain a bourgeois fantasy, the abolition of race will remain a tacit white-supremacism, and the abolition of gender will remain a thinly veiled misogyny, even — especially — when prosecuted by avowed feminists themselves. (The absurd and reckless spectacle of so many self-proclaimed ‘gender abolitionists’ campaign against trans women is proof enough of this). (0x0F)


    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2006), Gaming: Essays in Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.


  • Ending the World as We Know It: Alexander R. Galloway in Conversation with Andrew Culp

    Ending the World as We Know It: Alexander R. Galloway in Conversation with Andrew Culp

    by Alexander R. Galloway and Andrew Culp
    ~

    Alexander R. Galloway: You have a new book called Dark Deleuze (University of Minnesota Press, 2016). I particularly like the expression “canon of joy” that guides your investigation. Can you explain what canon of joy means and why it makes sense to use it when talking about Deleuze?

    Andrew Culp, Dark Deleuze (University of Minnesota Press, 2016)

    Andrew Culp: My opening is cribbed from a letter Gilles Deleuze wrote to philosopher and literary critic Arnaud Villani in the early 1980s. Deleuze suggests that any worthwhile book must have three things: a polemic against an error, a recovery of something forgotten, and an innovation. Proceeding along those three lines, I first argue against those who worship Deleuze as the patron saint of affirmation, second I rehabilitate the negative that already saturates his work, and third I propose something he himself was not capable of proposing, a “hatred for this world.” So in an odd twist of Marx on history, I begin with those who hold up Deleuze as an eternal optimist, yet not to stand on their shoulders but to topple the church of affirmation.

    The canon portion of “canon of joy” is not unimportant. Perhaps more than any other recent thinker, Deleuze queered philosophy’s line of succession. A large portion of his books were commentaries on outcast thinkers that he brought back from exile. Deleuze was unwilling to discard Nietzsche as a fascist, Bergson as a spiritualist, or Spinoza as a rationalist. Apparently this led to lots of teasing by fellow agrégation students at the Sorbonne in the late ’40s. Further showing his strange journey through the history of philosophy, his only published monograph for nearly a decade was an anti-transcendental reading of Hume at a time in France when phenomenology reigned. Such an itinerant path made it easy to take Deleuze at his word as a self-professed practitioner of “minor philosophy.” Yet look at Deleuze’s outcasts now! His initiation into the pantheon even bought admission for relatively forgotten figures such as sociologist Gabriel Tarde. Deleuze’s popularity thus raises a thorny question for us today: how do we continue the minor Deleuzian line when Deleuze has become a “major thinker”? For me, the first step is to separate Deleuze (and Guattari) from his commentators.

    I see two popular joyous interpretations of Deleuze in the canon: unreconstructed Deleuzians committed to liberating flows, and realists committed to belief in this world. The first position repeats the language of molecular revolution, becoming, schizos, transversality, and the like. Some even use the terms without transforming them! The resulting monotony seals Deleuze and Guattari’s fate as a wooden tongue used by people still living in the ’80s. Such calcification of their concepts is an especially grave injustice because Deleuze quite consciously shifted terminology from book to book to avoid this very outcome. Don’t get me wrong, I am deeply indebted to the early work on Deleuze! I take my insistence on the Marxo-Freudian core of Deleuze and Guattari from one of their earliest Anglophone commentators, Eugene Holland, who I sought out to direct my dissertation. But for me, the Tiqqun line “the revolution was molecular, and so was the counter-revolution” perfectly depicts the problem of advocating molecular politics. Why? Today’s techniques of control are now molecular. The result is that control societies have emptied the molecular thinker’s only bag of tricks (Bifo is a good test case here), which leaves us with a revolution that only goes one direction: backward.

    I am equally dissatisfied by realist Deleuzians who delve deep into the early strata of A Thousand Plateaus and away from the “infinite speed of thought” that motivates What is Philosophy? I’m thinking of the early incorporations of dynamical systems theory, the ’90s astonishment over everything serendipitously looking like a rhizome, the mid-00s emergence of Speculative Realism, and the ongoing “ontological” turn. Anyone who has read Manuel DeLanda will know this exact dilemma of materiality versus thought. He uses examples that slow down Deleuze and Guattari’s concepts to something easily graspable. In his first book, he narrates history as a “robot historian,” and in A Thousand Years of Nonlinear History, he literally traces the last thousand years of economics, biology, and language back to clearly identifiable technological inventions. Such accounts are dangerously compelling due to their lucidity, but they come at a steep cost: android realism dispenses with Deleuze and Guattari’s desiring subject, which is necessary for a theory of revolution by way of the psychoanalytic insistence on the human ability to overcome biological instincts (e.g. Freud’s Instincts and their Vicissitudes and Beyond the Pleasure Principle). Realist interpretations of Deleuze conceive of the subject as fully of this world. And with it, thought all but evaporates under the weight of this world. Deleuze’s Hume book is an early version of this criticism, but the realists have not taken heed. Whether emergent, entangled, or actant, strong realists ignore Deleuze and Guattari’s point in What is Philosophy? that thought always comes from the outside at a moment when we are confronted by something so intolerable that the only thing remaining is to think.

    Galloway: The left has always been ambivalent about media and technology, sometimes decrying its corrosive influence (Frankfurt School), sometimes embracing its revolutionary potential (hippy cyberculture). Still, you ditch technical “acceleration” in favor of “escape.” Can you expand your position on media and technology, by way of Deleuze’s notion of the machinic?

    Culp: Foucault says that an episteme can be grasped as we are leaving it. Maybe we can finally catalogue all of the contemporary positions on technology? The romantic (computer will never capture my soul), the paranoiac (there is an unknown force pulling the strings), the fascist-pessimist (computers will control everything)…

    Deleuze and Guattari are certainly not allergic to technology. My favorite quote actually comes from the Foucault book in which Deleuze says that “technology is social before it is technical” (6). The lesson we can draw from this is that every social formation draws out different capacities from any given technology. An easy example is from the nomads Deleuze loved so much. Anarcho-primitivists speculate that humans learn oppression with the domestication of animals and settled agriculture during the Neolithic Revolution. Diverging from the narrative, Deleuze celebrates the horse people of the Eurasian steppe described by Arnold Toynbee. Threatened by forces that would require them to change their habitat, Toynbee says, they instead chose to change their habits. The subsequent domestication of the horse did not sow the seeds of the state, which was actually done by those who migrated from the steppes after the last Ice Age to begin wet rice cultivation in alluvial valleys (for more, see James C. Scott’s The Art of Not Being Governed). On the contrary, the new relationship between men and horses allowed nomadism to achieve a higher speed, which was necessary to evade the raiding-and-trading used by padi-states to secure the massive foreign labor needed for rice farming. This is why the nomad is “he who does not move” and not a migrant (A Thousand Plateaus, 381).

    Accelerationism attempts to overcome the capitalist opposition of human and machine through the demand for full automation. As such, it peddles a technological Proudhonism that believes one can select what is good about technology and just delete what is bad. The Marxist retort is that development proceeds by its bad side. So instead of flashy things like self-driving cars, the real dot-communist question is: how will Amazon automate the tedious, low-paying jobs that computers are no good at? What happens to the data entry clerks, abusive-content managers, or help desk technicians? Until it figures out who will empty the recycle bin, accelerationism is only a socialism of the creative class.

    The machinic is more than just machines–it approaches technology as a question of organization. The term is first used by Guattari in a 1968 paper titled “Machine and Structure” that he presented to Lacan’s Freudian School of Paris, a paper that would jumpstart his collaboration with Deleuze. He argues for favoring the machine over the structure. Structures transform parts of a whole by exchanging or substituting particularities so that every part shares in a general form (in other words, the production of isomorphism). An easy political example is the Leninist Party, which mediates the particularized private interests to form them into the general will of a class. Machines instead treat the relationship between things as a problem of communication. The result is the “control and communication” of Norbert Wiener’s cybernetics, which connects distinct things in a circuit instead of implanting a general logic. The word “machine” never really caught on but the concept has made inroads in the social sciences, where actor-network theory, game theory, behaviorism, systems theory, and other cybernetic approaches have gained acceptance.

    Structure or machine, each engenders a different type of subjectivity, and each realizes a different model of communication. The two are found in A Thousand Plateaus, where Deleuze and Guattari note two different types of state subject formation: social subjection and machinic enslavement (456-460). While it only takes up a few short pages, the distinction is essential to Bernard Stiegler’s work and has been expertly elaborated by Maurizio Lazzarato in the book Signs and Machines. We are all familiar with molar social subjection synonymous with “agency”–it is the power that results from individuals bridging the gap between themselves and broader structures of representation, social roles, and institutional demands. This subjectivity is well outlined by Lacanians and other theorists of the linguistic turn (Virno, Rancière, Butler, Agamben). Missing from their accounts is machinic enslavement, which treats people as simply cogs in the machine. Such subjectivity is largely overlooked because it bypasses existential questions of recognition or self-identity. This is because machinic enslavement operates at the level of the infra-social or pre-individual through the molecular operators of unindividuated affects, sensations, desires not assigned to a subject. Offering a concrete example, Deleuze and Guattari reference Mumford’s megamachines of surplus societies that create huge landworks by treating humans as mere constituent parts. Capitalism revived the megamachine in the sixteenth century, and more recently, we have entered the “third age” of enslavement marked by the development of cybernetic and informational machines. In place of the pyramids are technical machines that use humans at places in technical circuits where computers are incapable or too costly, e.g. Amazon’s Mechanical Turk.

    I should also clarify that not all machines are bad. Rather, Dark Deleuze only trusts one kind of machine, the war machine. And war machines follow a single trajectory–a line of flight out of this world. A major task of the war machine conveniently aligns with my politics of techno-anarchism: to blow apart the networks of communication created by the state.

    Galloway: I can’t resist a silly pun, cannon of joy. Part of your project is about resisting a certain masculinist tendency. Is that a fair assessment? How do feminism and queer theory influence your project?

    Culp: Feminism is hardwired into the tagline for Dark Deleuze through a critique of emotional labor and the exhibition of bodies–“A revolutionary Deleuze for today’s digital world of compulsory happiness, decentralized control, and overexposure.” The major thread I pull through the book is a materialist feminist one: something intolerable about this world is that it demands we participate in its accumulation and reproduction. So how about a different play on words: Sara Ahmed’s feminist killjoy, who refuses the sexual contract that requires women to appear outwardly grateful and agreeable? Or better yet, Joy Division? The name would associate the project with post-punk, its conceptual attack on the mainstream, and the band’s nod to the sexual labor depicted in the novella House of Dolls.

    My critique of accumulation is also a media argument about connection. The most popular critics of ‘net culture are worried that we are losing ourselves. So on the one hand, we have Sherry Turkle who is worried that humans are becoming isolated in a state of being “alone-together”; and on the other, there is Bernard Stiegler, who thinks that the network supplants important parts of what it means to be human. I find this kind of critique socially conservative. It also victim-blames those who use social media the most. Recall the countless articles attacking women who take selfies as part of a self-care regimen or teens who creatively evade parental authority. I’m more interested in the critique of early ’90s ‘net culture and its enthusiasm for the network. In general, I argue that network-centric approaches are now the dominant form of power. As such, I am much more interested in how the rhizome prefigures the digitally-coordinated networks of exploitation that have made Apple, Amazon, and Google into the world’s most powerful corporations. While not a feminist issue on its face, it’s easy to see feminism’s relevance when we consider the gendered division of labor that usually makes women the employees of choice for low-paying jobs in electronics manufacturing, call centers, and other digital industries.

    Lastly, feminism and queer theory explicitly meet in my critique of reproduction. A key argument of Deleuze and Guattari in Anti-Oedipus is the auto-production of the real, which is to say, we already live in a “world without us.” My argument is that we need to learn how to hate some of the things it produces. Of course, this is a reworked critique of capitalist alienation and exploitation, which is a system that gives to us (goods and the wage) only because it already stole them behind our back (restriction from the means of subsistence and surplus value). Such ambivalence is the everyday reality of the maquiladora worker who needs her job but may secretly hope that all the factories burn to the ground. Such degrading feelings are the result of the compromises we make to reproduce ourselves. In the book, I give voice to them by fusing together David Halperin and Valerie Traub’s notion of gay shame acting as a solvent to whatever binds us to identity and Deleuze’s shame at not being able to prevent the intolerable. But feeling shame is not enough. To complete the argument, we need to draw out the queer feminist critique of reproduction latent in Marx and Freud. Détourning an old phrase: direct action begins at the point of reproduction. My first impulse is to rely on the punk rock attitude of Lee Edelman and Paul Preciado’s indictment of reproduction. But you are right that they have their masculinist moments, so what we need is something more post-punk–a little less aggressive and a lot more experimental. Hopefully Dark Deleuze is that.

    Galloway: Edelman’s “fuck Annie” is one of the best lines in recent theory. “Fuck the social order and the Child in whose name we’re collectively terrorized; fuck Annie; fuck the waif from Les Mis; fuck the poor, innocent kid on the Net; fuck Laws both with capital ls and small; fuck the whole network of Symbolic relations and the future that serves as its prop” (No Future, 29). Your book claims, in essence, that the Fuck Annies are more interesting than the Aleatory Materialists. But how can we escape the long arm of Lucretius?

    Culp: My feeling is that the politics of aleatory materialism remains ambiguous. Beyond the literal meaning of “joy,” there are important feminist takes on the materialist Spinoza of the encounter that deserve our attention. Isabelle Stengers’s work is among the most comprehensive, though the two most famous are probably Donna Haraway’s cyborg feminism and Karen Barad’s agential realism. Curiously, while New Materialism has been quite a boon for the art and design world, its socio-political stakes have never been more uncertain. One would hope that appeals to matter would lend philosophical credence to topical events such as #blacklivesmatter. Yet for many, New Materialism has simply led to a new formalism focused on material forms or realist accounts of physical systems meant to eclipse the “epistemological excesses” of post-structuralism. This divergence was not lost on commentators in the most recent issue of October, which functioned as a sort of referendum on New Materialism. On the one hand, the issue included a generous accounting of the many avenues artists have taken in exploring various “new materialist” directions. Of those, I most appreciated Mel Chen’s reminder that materialism cannot serve as a “get out of jail free card” on the history of racism, sexism, ableism, and speciesism. On the other, it included the first sustained attack on New Materialism by fellow travelers. Certainly the New Materialist stance of seeing the world from the perspective of “real objects” can be valuable, but only if it does not exclude old materialism’s politics of labor. I draw from Deleuzian New Materialist feminists in my critique of accumulation and reproduction, but only after short-circuiting their world-building. This is a move I learned from Sue Ruddick, whose Theory, Culture & Society article on the affect of the philosopher’s scream is an absolute tour de force. And then there is Graham Burnett’s remark that recent materialisms are like “Etsy kissed by philosophy.” The phrase perfectly crystallizes the controversy, but it might be too hot to touch for at least a decade…

    Galloway: Let’s focus more on the theme of affirmation and negation, since the tide seems to be changing. In recent years, a number of theorists have turned away from affirmation toward a different set of vectors such as negation, eclipse, extinction, or pessimism. Have we reached peak affirmation?

    Culp: We should first nail down what affirmation means in this context. There is the metaphysical version of affirmation, such as Foucault’s proud title as a “happy positivist.” In this declaration in Archaeology of Knowledge and “The Order of Discourse,” he is not claiming to be a logical positivist. Rather, Foucault is distinguishing his approach from Sartrean totality, transcendentalism, and genetic origins (his secondary target being the reading-between-the-lines method of Althusserian symptomatic reading). He goes on to formalize this disagreement in his famous statement on the genealogical method, “Nietzsche, Genealogy, History.” Despite being an admirer of Sartre, Deleuze shares this affirmative metaphysics with Foucault, which commentators usually describe as an alternative to the Hegelian system of identity, contradiction, determinate negation, and sublation. Nothing about this “happily positivist” system forces us to be optimists. In fact, it only raises the stakes for locating how all the non-metaphysical senses of the negative persist.

    Affirmation could be taken to imply a simple “more is better” logic as seen in Assemblage Theory and Latourian Compositionalism. Behind this logic is a principle of accumulation that lacks a theory of exploitation and fails to consider the power of disconnection. The Spinozist definition of joy does little to dispel this myth, but it is not like either project has revolutionary political aspirations. I think we would be better served to follow the currents of radical political developments over the last twenty years, which have been following an increasingly negative path. One part of the story is a history of failure. The February 15, 2003 global demonstration against the Iraq War was the largest protest in history but had no effect on the course of the war. More recently, the election of democratic socialist governments in Europe has done little to stave off austerity, even as economists publicly describe it as a bankrupt model destined to deepen the crisis. I actually find hope in the current circuit of struggle and think that its lack of alter-globalization world-building aspirations might be a plus. My cues come from the anarchist black bloc and those of the post-Occupy generation who would rather not pose any demands. This is why I return to the late Deleuze of the “control societies” essay and his advice to scramble the codes, to seek out spaces where nothing needs to be said, and to establish vacuoles of non-communication. Those actions feed the subterranean source of Dark Deleuze‘s darkness and the well from which comes hatred, cruelty, interruption, un-becoming, escape, cataclysm, and the destruction of worlds.

    Galloway: Does hatred for the world do a similar work for you that judgment or moralism does in other writers? How do we avoid the more violent and corrosive forms of hate?

    Culp: Writer Antonin Artaud’s attempt “to have done with the judgment of God” plays a crucial role in Dark Deleuze. Not just any specific authority but whatever gods are left. The easiest way to summarize this is “the three deaths.” Deleuze already makes note of these deaths in the preface to Difference and Repetition, but it only became clear to me after I read Gregg Flaxman’s Gilles Deleuze and the Fabulation of Philosophy. We all know of Nietzsche’s Death of God. With it, Nietzsche notes that God no longer serves as the central organizing principle for us moderns. Important to Dark Deleuze is Pierre Klossowski’s Nietzsche, who is part of a conspiracy against all of humanity. Why? Because even as God is dead, humanity has replaced him with itself. Next comes the Death of Man, which we can lay at the feet of Foucault. More than any other text, The Order of Things demonstrates how the birth of modern man was an invention doomed to fail. So if that death is already written in sand about to be washed away, then what comes next? Here I turn to the world, worlding, and world-building. It seems obvious when looking at the problems that plague our world: global climate change, integrated world capitalism, and other planet-scale catastrophes. We could try to deal with each problem one by one. But why not pose an even more radical proposition? What if we gave up on trying to save this world? We are already awash in sci-fi that tries to do this, though most of it is incredibly socially conservative. Perhaps now is the time for thinkers like us to catch up. Fragments of Deleuze already lay out the terms of the project. He ends the preface to Difference and Repetition by assigning philosophy the task of writing apocalyptic science fiction. Deleuze’s book opens with lightning across the black sky and ends with the world swelling into a single ocean of excess. Dark Deleuze collects those moments and names them the Death of This World.

    Galloway: Speaking of climate change, I’m reminded how ecological thinkers can be very religious, if not in word then in deed. Ecologists like to critique “nature” and tout their anti-essentialist credentials, while at the same time promulgating tellurian “change” as necessary, even beneficial. Have they simply replaced one irresistible force with another? But your “hatred of the world” follows a different logic…

    Culp: Irresistible indeed! Yet it is very dangerous to let the earth have the final say. Not only does psychoanalysis teach us that it is necessary to buck the judgment of nature; the is/ought distinction at the philosophical core of most ethical thought also refuses to let natural fact define the good. I introduce hatred to develop a critical distance from what is, and, as such, hatred is also a reclamation of the future in that it is a refusal to allow what-is to prevail over what-could-be. Such an orientation to the future is already in Deleuze and Guattari. What else is de-territorialization? I just give it a name. They have another name for what I call hatred: utopia.

    Speaking of utopia, Deleuze and Guattari’s definition of utopia in What is Philosophy? as simultaneously now-here and no-where is often used by commentators to justify odd compromise positions with the present state of affairs. The immediate reference is Samuel Butler’s 1872 book Erewhon, a backward spelling of nowhere, which Deleuze also references across his other work. I would imagine most people would assume it is a utopian novel in the vein of Edward Bellamy’s Looking Backward. And Erewhon does borrow from the conventions of utopian literature, but only to skewer them with satire. A closer examination reveals that the book is really a jab at religion, Victorian values, and the British colonization of New Zealand! So if there is anything that the now-here of Erewhon has to contribute to utopia, it is that the present deserves our ruthless criticism. So instead of being a simultaneous now-here and no-where, hatred follows from Deleuze and Guattari’s suggestion in A Thousand Plateaus to “overthrow ontology” (25). Therefore, utopia is only found in Erewhon by taking leave of the now-here to get to no-where.

    Galloway: In Dark Deleuze you talk about avoiding “the liberal trap of tolerance, compassion, and respect.” And you conclude by saying that the “greatest crime of joyousness is tolerance.” Can you explain what you mean, particularly for those who might value tolerance as a virtue?

    Culp: Among the many followers of Deleuze today, there are a number of liberal Deleuzians. Perhaps the biggest stronghold is in political science, where there is a committed group of self-professed radical liberals. Another strain bridges Deleuze with the liberalism of John Rawls. I was a bit shocked to discover both of these approaches, but I suppose it was inevitable given liberalism’s ability to assimilate nearly any form of thought.

    Herbert Marcuse recognized “repressive tolerance” as the incredible power of liberalism to justify the violence of positions clothed as neutral. The examples Marcuse cites are governments that claim to respect democratic liberties because they allow political protest, even as they ignore protesters by labeling them a special interest group. For those of us who have seen university administrations calmly collect student demands, set up dead-end committees, and slap pictures of protestors on promotional materials as a badge of diversity, it should be no surprise that Marcuse dedicated the essay to his students. An important elaboration on repressive tolerance is Wendy Brown’s Regulating Aversion. She argues that imperialist US foreign policy drapes itself in tolerance discourse. This helps diagnose why liberal feminist groups lined up behind the US invasion of Afghanistan (the Taliban is patriarchal) and explains how a mere utterance of ISIS inspires even the most progressive liberals to support outrageous war budgets.

    Because of their commitment to democracy, Brown and Marcuse can only qualify liberalism’s universal procedures for an ethical subject. Each criticizes certain uses of tolerance but does not want to dispense with it completely. Deleuze’s hatred of democracy makes it much easier for me. Instead, I embrace the perspective of a communist partisan because communists fight from a different structural position than that of the capitalist.

    Galloway: Speaking of structure and position, you have a section in the book on asymmetry. Most authors avoid asymmetry, instead favoring concepts like exchange or reciprocity. I’m thinking of texts on “the encounter” or “the gift,” not to mention dialectics itself as a system of exchange. Still you want to embrace irreversibility, incommensurability, and formal inoperability–why?

    Culp: There are a lot of reasons to prefer asymmetry, but for me, it comes down to a question of political strategy.

    First, a little background. Deleuze and Guattari’s critique of exchange is important to Anti-Oedipus, which was staged through a challenge to Claude Lévi-Strauss. This is why they shift from the traditional Marxist analysis of mode of production to an anthropological study of anti-production, for which they use the work of Pierre Clastres and Georges Bataille to outline non-economic forms of power that prevented the emergence of capitalism. Contemporary anthropologists have renewed this line of inquiry, for instance, Eduardo Viveiros de Castro, who argues in Cannibal Metaphysics that cosmologies differ radically enough between peoples that they essentially live in different worlds. The cannibal, he shows, is not the subject of a mode of production but a mode of predation.

    Those are not the stakes that interest me the most. Consider instead the consequence of ethical systems built on the gift and political systems of incommensurability. The ethical approach is exemplified by Derrida, whose responsibility to the other draws from the liberal theological tradition of accepting the stranger. While there is distance between self and other, it is a difference that is bridged through the democratic project of radical inclusion, even if such incorporation can only be aporetically described as a necessary-impossibility. In contrast, the politics of asymmetry uses incommensurability to widen the chasm opened by difference. It offers a strategy for generating antagonism without the formal equivalence of dialectics and provides an image of revolution based on fundamental transformation. The former can be seen in the inherent difference between the perspective of labor and the perspective of capital, whereas the latter is a way out of what Guy Debord calls “a perpetual present.”

    Galloway: You are exploring a “dark” Deleuze, and I’m reminded how the concepts of darkness and blackness have expanded and interwoven in recent years in everything from afro-pessimism to black metal theory (which we know is frighteningly white). How do you differentiate between darkness and blackness? Or perhaps that’s not the point?

    Culp: The writing on Deleuze and race is uneven. A lot of it can be blamed on the imprecise definition of becoming. The most vulgar version of becoming is embodied by neoliberal subjects who undergo an always-incomplete process of coming more into being (finding themselves, identifying their capacities, commanding their abilities). The molecular version is a bit better in that it theorizes subjectivity as developing outside of or in tension with identity. Yet the prominent uses of becoming and race rarely escaped the postmodern orbit of hybridity, difference, and inclusive disjunction–the White Man’s face as master signifier, miscegenation as anti-racist practice, “I am all the names of history.” You are right to mention afro-pessimism, as it cuts a new way through the problem. As I’ve written elsewhere, Frantz Fanon describes being caught between “infinity and nothingness” in his famous chapter on the fact of blackness in Black Skin White Masks. The position of infinity is best championed by Fred Moten, whose black fugitive is the effect of an excessive vitality that has survived five hundred years of captivity. He catches fleeting moments of it in performances of jazz, art, and poetry. This position fits well with the familiar figures of Deleuzo-Guattarian politics: the itinerant nomad, the foreigner speaking in a minor tongue, the virtuoso trapped in-between lands. In short: the bastard combination of two or more distinct worlds. In contrast, afro-pessimism is not the opposite of the black radical tradition but its outside. According to afro-pessimism, the definition of blackness is nothing but the social death of captivity. Remember the scene of subjection mentioned by Fanon? During that nauseating moment he is assailed by a whole series of cultural associations attached to him by strangers on the street. “I was battered down by tom-toms, cannibalism, intellectual deficiency, fetishism, racial defects, slave-ships, and above all else, above all: ‘Sho’ good eatin’” (112). The lesson that afro-pessimism draws from this scene is that cultural representations of blackness only reflect back the interior of white civil society. The conclusion is that combining social death with a culture of resistance, such as the one embodied by Fanon’s mentor Aimé Césaire, is a trap that leads only back to whiteness. Afro-pessimism thus follows the alternate route of darkness. It casts a line to the outside through an un-becoming that dissolves the identity we are given as a token for the shame of being a survivor.

    Galloway: In a recent interview the filmmaker Haile Gerima spoke about whiteness as “realization.” By this he meant both realization as such–self-realization, the realization of the self, the ability to realize the self–but also the more nefarious version as “realization through the other.” What’s astounding is that one can replace “through” with almost any other preposition–for, against, with, without, etc.–and the dynamic still holds. Whiteness is the thing that turns everything else, including black bodies, into fodder for its own realization. Is this why you turn away from realization toward something like profanation? And is darkness just another kind of whiteness?

    Culp: Perhaps blackness is to the profane as darkness is to the outside. What is black metal if not a project of political-aesthetic profanation? But as other commentators have pointed out, the politics of black metal is ultimately telluric (e.g. Benjamin Noys’s “‘Remain True to the Earth!’: Remarks on the Politics of Black Metal”). The left wing of black metal is anarchist anti-civ and the right is fascist-nativist. Both trace authority back to the earth that they treat as an ultimate judge usurped by false idols.

    The process follows what Badiou calls “the passion for the real,” his diagnosis of the twentieth century’s obsession with true identity, false copies, and inauthentic fakes. His critique applies equally to Deleuzian realists. This is why I think it is essential to return to Deleuze’s work on cinema and the powers of the false. One key example is Orson Welles’s F for Fake. Yet my favorite is the noir novel, which he praises in “The Philosophy of Crime Novels.” The noir protagonist never follows in the footsteps of Sherlock Holmes and the other classical detectives, whose search for the real works by sniffing out the truth through a scientific attunement of the senses. Rather, the dirty streets lead the detective down enough dead ends that he proceeds by way of a series of errors. What noir reveals is that crime and the police have “nothing to do with a metaphysical or scientific search for truth” (82). The truth is rarely decisive in noir because breakthroughs only come by way of “the great trinity of falsehood”: informant-corruption-torture. The ultimate gift of noir is a new vision of the world whereby honest people are just dupes of the police because society is fueled by falsehood all the way down.

    To specify the descent to darkness, I use darkness to signify the outside. The outside has many names: the contingent, the void, the unexpected, the accidental, the crack-up, the catastrophe. The dominant affects associated with it are anticipation, foreboding, and terror. To give a few examples, H. P. Lovecraft’s scariest monsters are those so alien that characters cannot describe them with any clarity, Maurice Blanchot’s disaster is the Holocaust as well as any other event so terrible that it interrupts thinking, and Don DeLillo’s “airborne toxic event” is an incident so foreign that it can only be described in the most banal terms. Of Deleuze and Guattari’s many different bodies without organs, one of the conservative varieties comes from a Freudian model of the psyche as a shell meant to protect the ego from outside perturbations. We all have these protective barriers made up of habits that help us navigate an uncertain world–that is the purpose of Guattari’s ritornello, that little ditty we whistle to remind us of the familiar even when we travel to strange lands. There are two parts that work together, the refrain and the strange land. The refrains have only grown yet the journeys seem to have ended.

    I’ll end with an example close to my own heart. Deleuze and Guattari are being used to support new anarchist “pre-figurative politics,” which is defined as seeking to build a new society within the constraints of the now. The consequence is that the political horizon of the future gets collapsed into the present. This is frustrating for someone like me, who holds out hope for a revolutionary future that would put an end to the million tiny humiliations that make up everyday life. I like J. K. Gibson-Graham’s feminist critique of political economy, but community currencies, labor time banks, and workers’ co-ops are not my image of communism. This is why I have drawn on the gothic for inspiration. A revolution that emerges from the darkness holds the apocalyptic potential of ending the world as we know it.

    Works Cited

    • Ahmed, Sara. The Promise of Happiness. Durham, NC: Duke University Press, 2010.
    • Artaud, Antonin. To Have Done With The Judgment of God. 1947. Live play, Boston: Exploding Envelope, c1985. https://www.youtube.com/watch?v=VHtrY1UtwNs.
    • Badiou, Alain. The Century. 2005. Cambridge, UK: Polity Press, 2007.
    • Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham, NC: Duke University Press, 2007.
    • Bataille, Georges. “The Notion of Expenditure.” 1933. In Visions of Excess: Selected Writings, 1927-1939, translated by Allan Stoekl, Carl R. Lovitt, and Donald M. Leslie Jr., 167-81. Minneapolis: University of Minnesota Press, 1985.
    • Bellamy, Edward. Looking Backward: 2000-1887. Boston: Ticknor & Co., 1888.
    • Blanchot, Maurice. The Writing of the Disaster. 1980. Translated by Ann Smock. Lincoln, NE: University of Nebraska Press, 1995.
    • Brown, Wendy. Regulating Aversion: Tolerance in the Age of Identity and Empire. Princeton, N.J.: Princeton University Press, 2006.
    • Burnett, Graham. “A Questionnaire on Materialisms.” October 155 (2016): 19-20.
    • Butler, Samuel. Erewhon: or, Over the Range. 1872. London: A.C. Fifield, 1910. http://www.gutenberg.org/files/1906/1906-h/1906-h.htm.
    • Chen, Mel Y. “A Questionnaire on Materialisms.” October 155 (2016): 21-22.
    • Clastres, Pierre. Society against the State. 1974. Translated by Robert Hurley and Abe Stein. New York: Zone Books, 1987.
    • Culp, Andrew. Dark Deleuze. Minneapolis: University of Minnesota Press, 2016.
    • ———. “Blackness.” New York: Hostis, 2015.
    • Debord, Guy. The Society of the Spectacle. 1967. Translated by Fredy Perlman et al. Detroit: Black & Red, 1977.
    • DeLanda, Manuel. A Thousand Years of Nonlinear History. New York: Zone Books, 2000.
    • ———. War in the Age of Intelligent Machines. New York: Zone Books, 1991.
    • DeLillo, Don. White Noise. New York: Viking Press, 1985.
    • Deleuze, Gilles. Cinema 2: The Time-Image. 1985. Translated by Hugh Tomlinson and Robert Galeta. Minneapolis: University of Minnesota Press, 1989.
    • ———. “The Philosophy of Crime Novels.” 1966. Translated by Michael Taormina. In Desert Islands and Other Texts, 1953-1974, 80-85. New York: Semiotext(e), 2004.
    • ———. Difference and Repetition. 1968. Translated by Paul Patton. New York: Columbia University Press, 1994.
    • ———. Empiricism and Subjectivity: An Essay on Hume’s Theory of Human Nature. 1953. Translated by Constantin V. Boundas. New York: Columbia University Press, 1995.
    • ———. Foucault. 1986. Translated by Seán Hand. Minneapolis: University of Minnesota Press, 1988.
    • Deleuze, Gilles, and Félix Guattari. Anti-Oedipus. 1972. Translated by Robert Hurley, Mark Seem, and Helen R. Lane. Minneapolis: University of Minnesota Press, 1977.
    • ———. A Thousand Plateaus. 1980. Translated by Brian Massumi. Minneapolis: University of Minnesota Press, 1987.
    • ———. What Is Philosophy? 1991. Translated by Hugh Tomlinson and Graham Burchell. New York: Columbia University Press, 1994.
    • Derrida, Jacques. The Gift of Death and Literature in Secret. Translated by David Wills. Chicago: University of Chicago Press, 2007; second edition.
    • Edelman, Lee. No Future: Queer Theory and the Death Drive. Durham, N.C.: Duke University Press, 2004.
    • Fanon, Frantz. Black Skin White Masks. 1952. Translated by Charles Lam Markmann. New York: Grove Press, 1968.
    • Flaxman, Gregory. Gilles Deleuze and the Fabulation of Philosophy. Minneapolis: University of Minnesota Press, 2011.
    • Foucault, Michel. The Archaeology of Knowledge and the Discourse on Language. 1971. Translated by A.M. Sheridan Smith. New York: Pantheon Books, 1972.
    • ———. “Nietzsche, Genealogy, History.” 1971. In Language, Counter-Memory, Practice: Selected Essays and Interviews, translated by Donald F. Bouchard and Sherry Simon, 113-38. Ithaca, N.Y.: Cornell University Press, 1977.
    • ———. The Order of Things. 1966. New York: Pantheon Books, 1970.
    • Freud, Sigmund. Beyond the Pleasure Principle. 1920. Translated by James Strachey. London: Hogarth Press, 1955.
    • ———. “Instincts and their Vicissitudes.” 1915. Translated by James Strachey. In Standard Edition of the Complete Psychological Works of Sigmund Freud 14, 111-140. London: Hogarth Press, 1957.
    • Gerima, Haile. “Love Visual: A Conversation with Haile Gerima.” Interview by Sarah Lewis and Dagmawi Woubshet. Aperture, Feb 23, 2016. http://aperture.org/blog/love-visual-haile-gerima/.
    • Gibson-Graham, J.K. The End of Capitalism (As We Knew It): A Feminist Critique of Political Economy. Hoboken: Blackwell, 1996.
    • ———. A Postcapitalist Politics. Minneapolis: University of Minnesota Press, 2006.
    • Guattari, Félix. “Machine and Structure.” 1968. Translated by Rosemary Sheed. In Molecular Revolution: Psychiatry and Politics, 111-119. Harmondsworth, Middlesex: Penguin, 1984.
    • Halperin, David, and Valerie Traub. “Beyond Gay Pride.” In Gay Shame, 3-40. Chicago: University of Chicago Press, 2009.
    • Haraway, Donna. Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge, 1991.
    • Klossowski, Pierre. “Circulus Vitiosus.” Translated by Joseph Kuzma. The Agonist: A Nietzsche Circle Journal 2, no. 1 (2009): 31-47.
    • ———. Nietzsche and the Vicious Circle. 1969. Translated by Daniel W. Smith. Chicago: University of Chicago Press, 1997.
    • Lazzarato, Maurizio. Signs and Machines. 2010. Translated by Joshua David Jordan. Los Angeles: Semiotext(e), 2014.
    • Marcuse, Herbert. “Repressive Tolerance.” In A Critique of Pure Tolerance, 81-117. Boston: Beacon Press, 1965.
    • Mauss, Marcel. The Gift: The Form and Reason for Exchange in Archaic Societies. 1950. Translated by W. D. Halls. New York: Routledge, 1990.
    • Moten, Fred. In The Break: The Aesthetics of the Black Radical Tradition. Minneapolis: University of Minnesota Press, 2003.
    • Mumford, Lewis. Technics and Human Development. San Diego: Harcourt Brace Jovanovich, 1967.
    • Noys, Benjamin. “‘Remain True to the Earth!’: Remarks on the Politics of Black Metal.” In Hideous Gnosis: Black Metal Theory Symposium 1 (2010): 105-128.
    • Preciado, Paul. Testo-Junkie: Sex, Drugs, and Biopolitics in the Pharmacopornographic Era. 2008. Translated by Bruce Benderson. New York: The Feminist Press, 2013.
    • Ruddick, Susan. “The Politics of Affect: Spinoza in the Work of Negri and Deleuze.” Theory, Culture & Society 27, no. 4 (2010): 21-45.
    • Scott, James C. The Art of Not Being Governed: An Anarchist History of Upland Southeast Asia. New Haven: Yale University Press, 2009.
    • Sexton, Jared. “Afro-Pessimism: The Unclear Word.” In Rhizomes 29 (2016). http://www.rhizomes.net/issue29/sexton.html.
    • ———. “Ante-Anti-Blackness: Afterthoughts.” In Lateral 1 (2012). http://lateral.culturalstudiesassociation.org/issue1/content/sexton.html.
    • ———. “The Social Life of Social Death: On Afro-Pessimism and Black Optimism.” In Intensions 5 (2011). http://www.yorku.ca/intent/issue5/articles/jaredsexton.php.
    • Stiegler, Bernard. For a New Critique of Political Economy. Cambridge: Polity Press, 2010.
    • ———. Technics and Time 1: The Fault of Epimetheus. 1994. Translated by George Collins and Richard Beardsworth. Redwood City, CA: Stanford University Press, 1998.
    • Tiqqun. “How Is It to Be Done?” 2001. In Introduction to Civil War, translated by Alexander R. Galloway and Jason E. Smith. Los Angeles, Calif.: Semiotext(e), 2010.
    • Toynbee, Arnold. A Study of History. Abridgement of Volumes I-VI by D.C. Somervell. London: Oxford University Press, 1946.
    • Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books, 2012.
    • Viveiros de Castro, Eduardo. Cannibal Metaphysics: For a Post-structural Anthropology. 2009. Translated by Peter Skafish. Minneapolis, Minn.: Univocal, 2014.
    • Villani, Arnaud. La guêpe et l’orchidée. Essai sur Gilles Deleuze. Paris: Éditions de Belin, 1999.
    • Welles, Orson, dir. F for Fake. 1974. New York: Criterion Collection, 2005.
    • Wiener, Norbert. Cybernetics: Or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press, 1948; second revised edition.
    • Williams, Alex, and Nick Srnicek. “#ACCELERATE MANIFESTO for an Accelerationist Politics.” Critical Legal Thinking. 2013. http://criticallegalthinking.com/2013/05/14/accelerate-manifesto-for-an-accelerationist-politics/.

    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2004), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. He is a frequent contributor to The b2 Review “Digital Studies.”

    Andrew Culp is a Visiting Assistant Professor of Rhetoric Studies at Whitman College. He specializes in cultural-communicative theories of power, the politics of emerging media, and gendered responses to urbanization. His work has appeared in Radical Philosophy, Angelaki, Affinities, and other venues. He previously pre-reviewed Galloway’s Laruelle: Against the Digital for The b2 Review “Digital Studies.”


  • Alexander R. Galloway — From Data to Information

    Alexander R. Galloway — From Data to Information

    By Alexander R. Galloway
    ~

    In recent months I’ve been spending time learning Swift. As such, I’ve been thinking a lot about data structures. Swift has a nice spectrum of possible data structures to pick from — something that I’ll have to discuss another day — but what interests me here is the question of data itself. Scholars often treat etymology as a special kind of divination. (And philosophers like Heidegger made a career of it.) But I find the etymology of the word “data” to be particularly elegant and revealing.
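
    By way of a quick taste of that spectrum before setting it aside, here is a minimal sketch; the names and values are arbitrary illustrations of Swift’s built-in types, not drawn from any particular codebase.

    // Array: ordered, indexable, duplicates allowed
    let readings: [Double] = [1.0, 2.0, 3.0]
    // Set: unordered, duplicates collapse into a single element
    let tags: Set<String> = ["given", "datum", "given"]
    // Dictionary: key-value pairs
    let glossary: [String: String] = ["data": "the things having been given"]
    // A value type for a single "given": copied, not referenced, when passed around
    struct Datum {
        let value: Double
    }
    let givens = readings.map { Datum(value: $0) }
    print(givens.count, tags.count, glossary["data"] ?? "")  // 3 2 the things having been given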

    Data comes from the Latin dare, meaning to give. But it’s the form that’s most interesting. First of all, it’s in the neuter plural, so it refers to “things.” Second, data is a participle in the perfect passive form. Thus the word means literally “the things having been given.” Or, for short, I like to think of data as “the givens.” French preserves this double meaning nicely by calling data the données. (The French also use the word “data,” although I believe this is technically an anglicism imported from technical vocabulary, despite French being much closer to Latin than English.)

    Data are the things having been given. Using the language of philosophy, and more specifically of phenomenology, data are the very facts of the givenness of Being. They are knowable and measurable. Data display a facticity; they are “what already exists,” and as such are a determining apparatus. They indicate what is present, what exists. The word data carries certain scientific or empirical undertones. But more important are the phenomenological overtones: data refer to the neutered, generic fact of the things having been given.

    Even in this simple arrangement a rudimentary relation holds sway. For implicit in the notion of the facticity of givenness is a relation to givenness. Data are not just a question of the givenness of Being, but are also necessarily illustrative of a relationship back toward a Being that has been given. In short, givenness itself implies a relation. This is one of the fundamental observations of phenomenology.

    Chicago datum

    Even if nothing specific can be said about a given entity x, it is possible to say that, if given, x is something as opposed to nothing, and therefore that x has a relationship to its own givenness as something. X is “as x”; the as-structure is all that is required to demonstrate that x exists in a relation. (By contrast, if x were immanent to itself, it would not be possible to assume relation. But by virtue of being made distinct as something given, givenness implies non-immanence and thus relation.) Such a “something” can be understood in terms of self-similar identity or, as the scientists say, negentropy, a striving to remain the same.

    So even as data are defined in terms of their givenness, their non-immanence with the one, they also display a relation with themselves. Through their own self-similarity or relation with themselves, they tend back toward the one (as the most generic instance of the same). The logic of data is therefore a logic of existence and identity: on the one hand, the facticity of data means that they exist, that they ex-sistere, meaning to stand out of or from; on the other hand, the givenness of data as something means that they assume a relationship of identity, as the self-similar “whatever entity” that was given.

    The true definition of data, therefore, is not simply “the things having been given.” The definition must conjoin givenness and relation. For this reason, data often go by another name, a name that more suitably describes the implicit imbrication of givenness and relation. The name is information.

    Information combines both aspects of data: the root form refers to a relationship (here a relationship of identity as same), while the prefix in refers to the entering into existence of form, the actual givenness of abstract form into real concrete formation.

    Heidegger sums it up well with the following observation about the idea: “All metaphysics including its opponent positivism speaks the language of Plato. The basic word of its thinking, that is, of his presentation of the Being of beings, is eidos, idea: the outward appearance in which beings as such show themselves. Outward appearance, however, is a manner of presence.” In other words, outward appearance or idea is not a deviation from presence, or some precondition that produces presence. Idea is precisely coterminous with presence. To understand data as information means to understand data as idea, but not just idea, also a host of related terms: form, class, concept, thought, image, outward appearance, shape, presence, or form-of-appearance.

    As Lisa Gitelman has reminded us, there is no such thing as “raw” data, because to enter into presence means to enter into form. An entity “in-form” is not a substantive entity, nor is it an objective one. The in-form is the negentropic transcendental of the situation, be it “material” like the givens or “ideal” like the encoded event. Hence an idea is just as much subject to in-formation as are material objects. An oak tree is in-formation, just as much as a computer file is in-formation.

    All of this is simply another way to understand Parmenides’s claim about the primary identity of philosophy: “Thought and being are the same.”

    [Contains a modified excerpt from Laruelle: Against the Digital (University of Minnesota Press, 2014), pp. 75-77.]
    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2004), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.


  • Poetics of Control

    Poetics of Control

    a review of Alexander R. Galloway, The Interface Effect (Polity, 2012)

    by Bradley J. Fest

    ~

    This summer marks the twenty-fifth anniversary of the original French publication of Gilles Deleuze’s seminal essay, “Postscript on the Societies of Control” (1990). A strikingly powerful short piece, “Postscript” remains, even at this late date, one of the most poignant, prescient, and concise diagnoses of life in the overdeveloped digital world of the twenty-first century and the “ultrarapid forms of apparently free-floating control that are taking over from the old disciplines.”[1] A stylistic departure from much of Deleuze’s other writing in its clarity and straightforwardness, the essay describes a general transformation from the modes of disciplinary power that Michel Foucault famously analyzed in Discipline and Punish (1975) to “societies of control.” For Deleuze, the late twentieth century is characterized by “a general breakdown of all sites of confinement—prisons, hospitals, factories, schools, the family.”[2] The institutions that were formerly able to strictly organize time and space through perpetual surveillance—thereby, according to Foucault, fabricating the modern individual subject—have become fluid and modular, “continually changing from one moment to the next.”[3] Individuals have become “dividuals,” “dissolv[ed] . . . into distributed networks of information.”[4]

    Over the past decade, media theorist Alexander R. Galloway has extensively and rigorously elaborated on Deleuze’s suggestive pronouncements, probably devoting more pages in print to thinking about the “Postscript” than has any other single writer.[5] Galloway’s most important work in this regard is his first book, Protocol: How Control Exists after Decentralization (2004). If the figure for the disciplinary society was Jeremy Bentham’s panopticon, a machine designed to induce a sense of permanent visibility in prisoners (and, by extension, the modern subject), Galloway argues that the distributed network, and particularly the distributed network we call the internet, is an apposite figure for control societies. Rhizomatic and flexible, distributed networks historically emerged as an alternative to hierarchical, rigid, centralized (and decentralized) networks. But far from being chaotic and unorganized, the protocols that organize our digital networks have created “the most highly controlled mass media hitherto known. . . . While control used to be a law of society, now it is more like a law of nature. Because of this, resisting control has become very challenging indeed.”[6] To put it another way: if in 1980 Deleuze and Félix Guattari complained that “we’re tired of trees,” Galloway and philosopher Eugene Thacker suggest that today “we’re tired of rhizomes.”[7]

    The imperative to think through the novel challenges presented by control societies and the urgent need to develop new methodologies for engaging the digital realities of the twenty-first century are at the heart of The Interface Effect (2012), the final volume in a trio of works Galloway calls Allegories of Control.[8] Guiding the various inquiries in the book is his provocative claim that “we do not yet have a critical or poetic language in which to represent the control society.”[9] This is because there is an “unrepresentability lurking within information aesthetics” (86). This claim for unrepresentability, that what occurs with digital media is not representation per se, is The Interface Effect’s most significant departure from previous media theory. Rather than rehearse familiar media ecologies, Galloway suggests that “the remediation argument (handed down from McLuhan and his followers including Kittler) is so full of holes that it is probably best to toss it wholesale” (20). The Interface Effect challenges thinking about mimesis that would place computers at the end of a line of increasingly complex modes of representation, a line extending from Plato, through Erich Auerbach, Marshall McLuhan, and Friedrich Kittler, and terminating in Richard Grusin, Jay David Bolter, and many others. Rather than continue to understand digital media in terms of remediation and representation, Galloway emphasizes the processes of computational media, suggesting that the inability to productively represent control societies stems from misunderstandings about how to critically analyze and engage with the basic materiality of computers.

    The book begins with an introduction polemically positioning Galloway’s own media theory directly against Lev Manovich’s field-defining book, The Language of New Media (2001). Contra Manovich, Galloway stresses that digital media are not objects but actions. Unlike cinema, which he calls an ontology because it attempts to bring some aspect of the phenomenal world nearer to the viewer—film, echoing Oedipa Maas’s famous phrase, “projects worlds” (11)—computers involve practices and effects (what Galloway calls an “ethic”) because they are “simply on a world . . . subjecting it to various forms of manipulation, preemption, modeling, and synthetic transformation. . . . The matter at hand is not that of coming to know a world, but rather that of how specific, abstract definitions are executed to form a world” (12, 13, 23). Or to take two other examples Galloway uses to positive effect: the difference can be understood as that between language, which describes and represents, encoding a world, versus calculus, which does or simulates doing something to the world; calculus is a “system of reasoning, an executable machine” (22). Though Galloway does more in Gaming: Essays on Algorithmic Culture (2006) to fully develop a way of analyzing computational media that privileges action over representation, The Interface Effect theoretically grounds this important distinction between mimesis and action, description and process.[10] Further, it constitutes a bold methodological step away from some of the dominant ways of thinking about digital media that simultaneously offers its readers new ways to connect media studies more firmly to politics.

    Further distinguishing himself from writers like Manovich, Galloway says that there has been a basic misunderstanding regarding media and mediation, and that the two systems are “violently unconnected” (13). Galloway demonstrates, in contrast to such thinkers as Kittler, that there is an old line of thinking about mediation that can be traced very far back and that is not dependent on thinking about media as exclusively tied to nineteenth and twentieth century communications technology:

    Doubtless certain Greek philosophers had negative views regarding hypomnesis. Yet Kittler is reckless to suggest that the Greeks had no theory of mediation. The Greeks indubitably had an intimate understanding of the physicality of transmission and message sending (Hermes). They differentiated between mediation as immanence and mediation as expression (Iris versus Hermes). They understood the mediation of poetry via the Muses and their techne. They understood the mediation of bodies through the “middle loving” Aphrodite. They even understood swarming and networked presence (in the incontinent mediating forms of the Eumenides who pursued Orestes in order to “process” him at the procès of Athena). Thus we need only look a little bit further to shed this rather vulgar, consumer-electronics view of media, and instead graduate into the deep history of media as modes of mediation. (15)

    Galloway’s point here is that the larger contemporary discussion of mediation that he is pursuing in The Interface Effect should not be restricted to merely the digital artifacts that have occasioned so much recent theoretical activity, and that there is an urgent need for deeper histories of mediation. Though the book appears to be primarily concerned with the twentieth and twenty-first century, this gesture toward the Greeks signals the important work of historicization that often distinguishes much of Galloway’s work. In “Love of the Middle” (2014), for example, which appears in the book Excommunication (2014), co-authored with Thacker and McKenzie Wark, Galloway fully develops a rigorous reading of Greek mediation, suggesting that in the Eumenides, or what the Romans called the Furies, reside a notable historical precursor for understanding the mediation of distributed networks.[11]

    In The Interface Effect these larger efforts at historicization allow Galloway to always understand “media as modes of mediation,” and consequently his big theoretical step involves claiming that “an interface is not a thing, an interface is an effect. It is always a process or a translation” (33). There are a variety of positive implications for the study of media understood as modes of mediation, as a study of interface effects. Principal amongst these are the rigorous methodological possibilities Galloway’s focus emphasizes.

    In this, methodologically and otherwise, Galloway’s work in The Interface Effect resembles and extends that of his teacher Fredric Jameson, particularly the kind of work found in The Political Unconscious (1981). Following Jameson’s emphasis on the “poetics of social forms,” Galloway’s goal is “not to reenact the interface, much less to ‘define’ it, but to identify the interface itself as historical. . . . This produces . . . a perspective on how cultural production and the socio-historical situation take form as they are interfaced together” (30). The Interface Effect firmly ties the cultural to the social, economic, historical, and political, finding in a variety of locations ways that interfaces function as allegories of control. “The social field itself constitutes a grand interface, an interface between subject and world, between surface and source, and between critique and the objects of criticism. Hence the interface is above all an allegorical device that will help us gain some perspective on culture in the age of information” (54). The power of looking at the interface as an allegorical device, as a “control allegory” (30), is demonstrated throughout the book’s relatively wide-ranging analyses of various interface effects.

    Chapter 1, “The Unworkable Interface,” historicizes some twentieth century transformations of the interface, concisely summarizing a history of mediation by moving from Norman Rockwell’s “Triple Self-Portrait” (1960), through Mad Magazine’s satirization of Rockwell, to World of Warcraft (2004-2015). Viewed from the level of the interface, with all of its nondiegetic menus and icons and the ways it erases the line between play and labor, Galloway demonstrates both here and in the last chapter that World of Warcraft is a powerful control allegory: “it is not an avant-garde image, but, nevertheless, it firmly delivers an avant-garde lesson in politics” (44).[12] Further exemplifying the importance of historicizing interfaces, Chapter 2 continues to demonstrate the value of approaching interface effects allegorically. Galloway finds “a formal similarity between the structure of ideology and the structure of software” (55), arguing that software “is an allegorical figure for the way in which . . . political and social realities are ‘resolved’ today: not through oppression or false consciousness . . . but through the ruthless rule of code” (76). Chapter 4 extends such thinking toward a masterful reading of the various mediations at play in a show such as 24 (2001-2010, 2014), arguing that 24 is political not because of its content but “because the show embodies in its formal technique the essential grammar of the control society, dominated as it is by specific network and informatic logics” (119). In short, The Interface Effect continually demonstrates the potent critical tools approaching mediation as allegory can provide, reaffirming the importance of a Jamesonian approach to cultural production in the digital age.

    Whether or not readers are convinced, however, by Galloway’s larger reworking of the field of digital media studies, his emphasis on attending to contemporary cultural artifacts as allegories of control, or his call in the book’s conclusion for a politics of “whatever being” probably depends upon their thoughts about the unrepresentability of today’s global networks in Chapter 3, “Are Some Things Unrepresentable?” His answer to the chapter’s question is, quite simply, “Yes.” Attempts to visualize the World Wide Web only result in incoherent repetition: “every map of the internet looks the same,” and as a result “no poetics is possible in this uniform aesthetic space” (85). In the face of such an aesthetic regime, what Jacques Rancière calls a “distribution of the sensible,”[13] Galloway argues:

    The point is not so much to call for a return to cognitive mapping, which of course is of the highest importance, but to call for a poetics as such for this mysterious new machinic space. . . . Today’s systemics have no contrary. Algorithms and other logical structures are uniquely, and perhaps not surprisingly, monolithic in their historical development. There is one game in town: a positivistic dominant of reductive, systemic efficiency and expediency. Offering a counter-aesthetic in the face of such systematicity is the first step toward building a poetics for it, a language of representability adequate to it. (99)

    There are, to my mind, two ways of responding to Galloway’s call for a poetics as such in the face of the digital realities of contemporaneity.

    On the one hand, I am tempted to agree with him. Galloway is clearly signaling his debt to some of Jameson’s more important large claims and is reviving the need “to think the impossible totality of the contemporary world system,” what Jameson once called the “technological” or “postmodern sublime.”[14] But Galloway is also signaling the importance of poesis for this activity. Not only is Jamesonian “cognitive mapping” necessary, but the totality of twenty-first century digital networks requires new imaginative activity, a counter-aesthetics commensurate with informatics. This is an immensely attractive position, at least to me, as it preserves a space for poetic, avant-garde activity, and indeed, demands that, all evidence to the contrary, the imagination still has an important role to play in the face of societies of control. (In other words, there may be some “humanities” left in the “digital humanities.”[15]) Rather than suggesting that the imagination has been utterly foreclosed by the cultural logic of late capitalism—that we can no longer imagine any other world, that it is easier to imagine the end of the world than a better one—Galloway says that there must be a reinvestment in the imagination, in poetics as such, that will allow us to better represent, understand, and intervene in societies of control (though not necessarily to imagine a better world; more on this below). Given the present landscape, how could one not be attracted to such a position?

    On the other hand, Galloway’s argument hinges on his claim that such a poetics has not emerged and, as Patrick Jagoda and others have suggested, one might merely point out that such a claim is demonstrably false.[16] Though I hope I hardly need to list some of the significant cultural products across a range of media that have appeared over the last fifteen years that critically and complexly engage with the realities of control (e.g., The Wire [2002-08]), it is not radical to suggest that art engaged with pressing contemporary concerns has appeared and will continue to appear, that there are a variety of significant artists who are attempting to understand, represent, and cope with the distributed networks of contemporaneity. One could obviously suggest Galloway’s argument is largely rhetorical, a device to get his readers to think about the different kinds of poesis control societies, distributed networks, and interfaces call for, but this blanket statement threatens to shut down some of the vibrant activity that is going on all over the world commenting upon the contemporary situation. In other words, yes we need a poetics of control, but why must the need for such a poetics hinge on the claim that there has not yet emerged “a critical or poetic language in which to represent the control society”? Is not Galloway’s own substantial, impressive, and important decade-long intellectual project proof that people have developed a critical language that is capable of representing the control society? I would certainly answer in the affirmative.

    There are some other rhetorical choices in the conclusion of The Interface Effect that, though compelling, deserve to be questioned, or at least highlighted. I am referring to Galloway’s penchant—following another one of his teachers at Duke, Michael Hardt—for invoking a Bartlebian politics, what Galloway calls “whatever being,” as an appropriate response to present problems.[17] In Hardt and Antonio Negri’s Empire (2000), in the face of the new realities of late capitalism—the multitude, the management of hybridities, the non-place of Empire, etc.—they propose that Herman Melville’s “Bartleby in his pure passivity and his refusal of any particulars presents us with a figure of generic being, being as such, being and nothing more. . . . This refusal certainly is the beginning of a liberatory politics, but it is only a beginning.”[18] Bartleby, with his famous response of “‘I would prefer not to,’”[19] has been frequently invoked by such substantial figures as Giorgio Agamben in the 1990s and Slavoj Žižek in the 2000s (following Hardt and Negri). Such thinkers have frequently theorized Bartleby’s passive negativity as a potentially radical political position, and perhaps the only one possible in the face of global economic realities.[20] (And indeed, it is easy enough to read, say, Occupy Wall Street as a Bartlebian political gesture.) Galloway’s response to the affective postfordist labor of digital networks, that “each and every day, anyone plugged into a network is performing hour after hour of unpaid micro labor” (136), is similarly to withdraw, to “demilitarize being. Stand down. Cease participating” (143).

    Like Hardt and Negri and so many others, Galloway’s “whatever being” is a response to the failures of twentieth century emancipatory politics. He writes:

    We must stress that it is not the job of politics to invent a new world. On the contrary it is the job of politics to make all these new worlds irrelevant. . . . It is time now to subtract from this world, not add to it. The challenge today is not one of political or moral imagination, for this problem was solved ages ago—kill the despots, surpass capitalism, inclusion of the excluded, equality for all of humanity, end exploitation. The world does not need new ideas. The challenge is simply to realize what we already know to be true. (138-39)

    And thus the tension of The Interface Effect is between this call for withdrawal, to work with what there is, to exploit protocological possibility, etc., and the call for a poetics of control, a poesis capable of representing control societies, which to my mind implies imagination (and thus, inevitably, something different, if not new). If there is anything wanting about the book it is its lack of clarity about how these two critical projects are connected (or indeed, if they are perhaps the same thing!). Further, it is not always clear what exactly Galloway means by “poetics” nor how a need for a poetics corresponds to the book’s emphasis on understanding mediation as process over representation, action over objects. This lack of clarity may be due in part to the fact that, as Galloway indicates in his most recent work, Laruelle: Against the Digital (2014), there is some necessary theorization that he needs to do before he can adequately address the digital head-on. As he writes in the conclusion to that book: “The goal here has not been to elucidate, promote, or disparage contemporary digital technologies, but rather to draft a simple prolegomenon for future writing on digitality and philosophy.”[21] In other words, it seems like Allegories of Control, The Exploit: A Theory of Networks (2007), and Laruelle may constitute the groundwork for an even more ambitious confrontation with the digital, one where the kinds of tensions just noted might dissolve. As such, perhaps the reinvocation of a Bartlebian politics of withdrawal at the end of The Interface Effect is merely a kind of stop-gap, a place-holder before a more coherent poetics of control can emerge (as seems to be the case for the Hardt and Negri of Empire). Although contemporary theorists frequently invoke Bartleby, he remains a rather uninspiring figure.

    These criticisms aside, however, Galloway’s conclusion of the larger project that is Allegories of Control reveals him to be a consistently accessible and powerful guide to the control society and the digital networks of the twenty-first century. If the new directions in his recent work are any indication, and Laruelle is merely a prolegomenon to future projects, then we should perhaps not despair at all about the present lack of a critical language for representing control societies.

    _____

    Bradley J. Fest teaches literature at the University of Pittsburgh. At present he is working on The Nuclear Archive: American Literature Before and After the Bomb, a book investigating the relationship between nuclear and information technology in twentieth and twenty-first century American literature. He has published articles in boundary 2, Critical Quarterly, and Studies in the Novel; and his essays have appeared in David Foster Wallace and “The Long Thing” (2014) and The Silence of Fallout (2013). The Rocking Chair, his first collection of poems, is forthcoming from Blue Sketch Press. He blogs at The Hyperarchival Parallax.

    _____

    [1] Though best-known in the Anglophone world via the translation that appeared in 1992 in October as “Postscript on the Societies of Control,” the piece appears as “Postscript on Control Societies,” in Gilles Deleuze, Negotiations: 1972-1990, trans. Martin Joughin (New York: Columbia University Press, 1995), 178. For the original French see Gilles Deleuze, “Post-scriptum sur des sociétés de contrôle,” in Pourparlers, 1972-1990 (Paris: Les Éditions de Minuit, 1990), 240-47. The essay originally appeared as “Les sociétés de contrôle,” L’Autre Journal, no. 1 (May 1990). Further references are to the Negotiations version.

    [2] Ibid.

    [3] Ibid., 179.

    [4] Alexander R. Galloway, Protocol: How Control Exists after Decentralization (Cambridge, MA: MIT Press, 2004), 12n18.

    [5] In his most recent book, Galloway even goes so far as to ask about the “Postscript”: “Could it be that Deleuze’s most lasting legacy will consist of 2,300 words from 1990?” (Alexander R. Galloway, Laruelle: Against the Digital [Minneapolis: University of Minnesota Press, 2014], 96, emphases in original). For Andrew Culp’s review of Laruelle for The b2 Review, see “From the Decision to the Digital.”

    [6] Galloway, Protocol, 147.

    [7] Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, trans. Brian Massumi (Minneapolis: University of Minnesota Press, 1987), 15; and Alexander R. Galloway and Eugene Thacker, The Exploit: A Theory of Networks (Minneapolis: University of Minnesota Press, 2007), 153. For further discussions of networks see Alexander R. Galloway, “Networks,” in Critical Terms for Media Studies, ed. W. J. T. Mitchell and Mark B. N. Hansen (Chicago: University of Chicago Press, 2010), 280-96.

    [8] The other books in the trilogy include Protocol and Alexander R. Galloway, Gaming: Essays on Algorithmic Culture (Minneapolis: University of Minnesota Press, 2006).

    [9] Alexander R. Galloway, The Interface Effect (Malden, MA: Polity, 2012), 98. Hereafter, this work is cited parenthetically.

    [10] See especially Galloway’s masterful first chapter of Gaming, “Gamic Action, Four Moments,” 1-38. To my mind, this is one of the best primers for critically thinking about videogames, and it does much to fundamentally ground the study of videogames in action (rather than, as had previously been the case, in either ludology or narratology).

    [11] See Alexander R. Galloway, “Love of the Middle,” in Excommunication: Three Inquiries in Media and Mediation, by Alexander R. Galloway, Eugene Thacker, and McKenzie Wark (Chicago: University of Chicago Press, 2014), 25-76.

    [12] This is also something he touched on in his remarkable reading of Donald Rumsfeld’s famous “unknown unknowns.” See Alexander R. Galloway, “Warcraft and Utopia,” Ctheory.net (16 February 2006). For a discussion of labor in World of Warcraft, see David Golumbia, “Games Without Play,” in “Play,” special issue, New Literary History 40, no. 1 (Winter 2009): 179-204.

    [13] See the following by Jacques Rancière: The Politics of Aesthetics: The Distribution of the Sensible, trans. Gabriel Rockhill (New York: Continuum, 2004), and “Are Some Things Unrepresentable?” in The Future of the Image, trans. Gregory Elliott (New York: Verso, 2007), 109-38.

    [14] Fredric Jameson, Postmodernism; or, the Cultural Logic of Late Capitalism (Durham, NC: Duke University Press, 1991), 38.

    [15] For Galloway’s take on the digital humanities more generally, see his “Everything Is Computational,” Los Angeles Review of Books (27 June 2013), and “The Cybernetic Hypothesis,” differences 25, no. 1 (Spring 2014): 107-31.

    [16] See Patrick Jagoda, introduction to Network Aesthetics (Chicago: University of Chicago Press, forthcoming 2015).

    [17] Galloway’s “whatever being” is derived from Giorgio Agamben, The Coming Community, trans. Michael Hardt (Minneapolis: University of Minnesota Press, 1993).

    [18] Michael Hardt and Antonio Negri, Empire (Cambridge, MA: Harvard University Press, 2000), 203, 204.

    [19] Herman Melville, “Bartleby, The Scrivener: A Story of Wall-street,” in Melville’s Short Novels, critical ed., ed. Dan McCall (New York: W. W. Norton, 2002), 10.

    [20] See Giorgio Agamben, “Bartleby, or On Contingency,” in Potentialities: Collected Essays in Philosophy, trans. and ed. Daniel Heller-Roazen (Stanford: Stanford University Press, 1999), 243-71; and see the following by Slavoj Žižek: Iraq: The Borrowed Kettle (New York: Verso, 2004), esp. 71-73, and The Parallax View (New York: Verso, 2006), esp. 381-85.

    [21] Galloway, Laruelle, 220.

  • Something About the Digital

    Something About the Digital

    By Alexander R. Galloway
    ~

    (This catalog essay was written in 2011 for the exhibition “Chaos as Usual,” curated by Hanne Mugaas at the Bergen Kunsthall in Norway. Artists in the exhibition included Philip Kwame Apagya, Ann Craven, Liz Deschenes, Thomas Julier [in collaboration with Cédric Eisenring and Kaspar Mueller], Olia Lialina and Dragan Espenschied, Takeshi Murata, Seth Price, and Antek Walczak.)

    There is something about the digital. Most people aren’t quite sure what it is. Or what they feel about it. But something.

    In 2001 Lev Manovich said it was a language. For Steven Shaviro, the issue is being connected. Others talk about “cyber” this and “cyber” that. Is the Internet about the search (John Battelle)? Or is it rather, even more primordially, about the information (James Gleick)? Whatever it is, something is afoot.

    What is this something? Given the times in which we live, it is ironic that this term is so rarely defined and even more rarely defined correctly. But the definition is simple: the digital means the one divides into two.

    Digital doesn’t mean machine. It doesn’t mean virtual reality. It doesn’t even mean the computer – there are analog computers after all, like grandfather clocks or slide rules. Digital means the digits: the fingers and toes. And since most of us have a discrete number of fingers and toes, the digital has come to mean, by extension, any mode of representation rooted in individually separate and distinct units. So the natural numbers (1, 2, 3, …) are aptly labeled “digital” because they are separate and distinct, but the arc of a bird in flight is not because it is smooth and continuous. A reel of celluloid film is correctly called “digital” because it contains distinct breaks between each frame, but the photographic frames themselves are not because they record continuously variable chromatic intensities.
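
    To put the definition in the most literal terms, here is a minimal sketch in code (Swift, though any language would do): a smooth curve enters the digital only by being sampled into separate, countable values. The curve and the number of samples below are arbitrary stand-ins.

    import Foundation

    // A continuous trajectory exists at every instant; code can only hold it
    // as a list of discrete samples.
    func arc(_ t: Double) -> Double {
        sin(t)  // stand-in for any smooth, continuous curve
    }

    let sampleCount = 8
    let samples = (0..<sampleCount).map { arc(Double($0) / Double(sampleCount) * 2 * .pi) }
    // `samples` is digital in the sense above: eight distinct units, like frames on a reel of film.
    print(samples)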

    We must stop believing the myth, then, about the digital future versus the analog past. For the digital died its first death in the continuous calculus of Newton and Leibniz, and the curvilinear revolution of the Baroque that came with it. And the digital has suffered a thousand blows since, from the swirling vortexes of nineteenth-century thermodynamics, to the chaos theory of recent decades. The switch from analog computing to digital computing in the middle twentieth century is but a single battle in the multi-millennial skirmish within western culture between the unary and the binary, proportion and distinction, curves and jumps, integration and division – in short, over when and how the one divides into two.

    What would it mean to say that a work of art divides into two? Or to put it another way, what would art look like if it began to meditate on the one dividing into two? I think this is the only way we can truly begin to think about “digital art.” And because of this we shall leave Photoshop, and iMovie, and the Internet and all the digital tools behind us, because interrogating them will not nearly begin to address these questions. Instead look to Ann Craven’s paintings. Or look to the delightful conversation sparked here between Philip Kwame Apagya and Liz Deschenes. Or look to the work of Thomas Julier, even to a piece of his not included in the show, “Architecture Reflecting in Architecture” (2010, made with Cédric Eisenring), which depicts a rectilinear cityscape reflected inside the mirror skins of skyscrapers, just like Saul Bass’s famous title sequence in North by Northwest (1959).

    Liz Deschenes, “Green Screen #4” (2001)

    All of these works deal with the question of twoness. But it is twoness only in a very particular sense. This is not the twoness of the doppelganger of the romantic period, or the twoness of the “split mind” of the schizophrenic, and neither is it the twoness of the self/other distinction that so forcefully animated culture and philosophy during the twentieth century, particularly in cultural anthropology and then later in poststructuralism. Rather we see here a twoness of the material, a digitization at the level of the aesthetic regime itself.

    Consider the call and response heard across the works featured here by Apagya and Deschenes. At the most superficial level, one might observe that these are works about superimposition, about compositing. Apagya’s photographs exploit one of the oldest and most useful tricks of picture making: superimpose one layer on top of another layer in order to produce a picture. Painters do this all the time of course, and very early on it became a mainstay of photographic technique (even if it often remained relegated to mere “trick” photography), evident in photomontage, spirit photography, and even the side-by-side compositing techniques of the carte de visite popularized by André-Adolphe-Eugène Disdéri in the 1850s. Recall too that the cinema has made productive use of superimposition, adopting the technique with great facility from the theater and its painted scrims and moving backdrops. (Perhaps the best illustration of this comes at the end of A Night at the Opera [1935], when Harpo Marx goes on a lunatic rampage through the flyloft during the opera’s performance, raising and lowering painted backdrops to great comic effect.) So the more “modern” cinematic techniques of, first, rear screen projection, and then later chromakey (known commonly as the “green screen” or “blue screen” effect), are but a reiteration of the much longer legacy of compositing in image making.

    Deschenes’ “Green Screen #4” points to this broad aesthetic history, as it empties out the content of the image, forcing us to acknowledge the suppressed color itself – in this case green, but any color will work. Hence Deschenes gives us nothing but a pure background, a pure something.

    Allowed to curve gracefully off the wall onto the floor, the green color field resembles the “sweep wall” used commonly in portraiture or fashion photography whenever an artist wishes to erase the lines and shadows of the studio environment. “Green Screen #4” is thus the antithesis of what has remained for many years the signal art work about video chromakey, Peter Campus’ “Three Transitions” (1973). Whereas Campus attempted to draw attention to the visual and spatial paradoxes made possible by chromakey, and even in so doing was forced to hide the effect inside the jittery gaps between images, Deschenes by contrast feels no such anxiety, presenting us with the medium itself, minus any “content” necessary to fuel it, minus the powerful mise en abyme of the Campus video, and so too minus Campus’ mirthless autobiographical staging. If Campus ultimately resolves the relationship between images through a version of montage, Deschenes offers something more like a “divorced digitality” in which no two images are brought into relation at all, only the minimal substrate remains, without input or output.

    The sweep wall is evident too in Apagya’s images, only of a different sort, as the artifice of the various backgrounds – in a nod not so much to fantasy as to kitsch – both fuses with and separates from the foreground subject. Yet what might ultimately unite the works by Apagya and Deschenes is not so much the compositing technique, but a more general reference, albeit oblique but nevertheless crucial, to the fact that such techniques are today entirely quotidian, entirely usual. These are everyday folk techniques through and through. One needs only a web cam and simple software to perform chromakey compositing on a computer, just as one might go to the county fair and have one’s portrait superimposed on the body of a cartoon character.
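
    Indeed the trick fits in a few lines of code. What follows is only a minimal sketch, assuming two images of equal size stored as flat arrays of red-green-blue values between zero and one; the threshold is an arbitrary illustrative number, not any particular program’s default.

    struct Pixel {
        let r, g, b: Double
    }

    // Where the foreground pixel reads as "key green," reveal the background layer;
    // otherwise keep the foreground subject.
    func chromaKey(foreground: [Pixel], background: [Pixel],
                   greenThreshold: Double = 0.6) -> [Pixel] {
        return zip(foreground, background).map { (fg, bg) in
            let isKeyColor = fg.g > greenThreshold && fg.g > fg.r && fg.g > fg.b
            return isKeyColor ? bg : fg
        }
    }

    Two layers and one rule for deciding, pixel by pixel, which of them shows through: the compositing is nothing more than a repeated decision between this image and that one.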

    What I’m trying to stress here is that there is nothing particularly “technological” about digitality. All that is required is a division from one to two – and by extension from two to three and beyond to the multiple. This is why I see layering as so important, for it spotlights an internal separation within the image. Apagya’s settings are digital, therefore, simply by virtue of the fact that he addresses our eye toward two incompatible aesthetic zones existing within the image. The artifice of a painted backdrop, and the pose of a person in a portrait.

    Certainly the digital computer is “digital” by virtue of being binary, which is to say by virtue of encoding and processing numbers at the lowest levels using base-two mathematics. But that is only the most prosaic and obvious exhibit of its digitality. For the computer is “digital” too in its atomization of the universe, into, for example, a million Facebook profiles, all equally separate and discrete. Or likewise “digital” too in the computer interface itself which splits things irretrievably into cursor and content, window and file, or even, as we see commonly in video games, into heads-up-display and playable world. The one divides into two.

    So when clusters of repetition appear across Ann Craven’s paintings, or the iterative layers of the “copy” of the “reconstruction” in the video here by Thomas Julier and Cédric Eisenring, or the accumulations of images that proliferate in Olia Lialina and Dragan Espenschied’s “Comparative History of Classic Animated GIFs and Glitter Graphics” [2007] (a small snapshot of what they have assembled in their spectacular book from 2009 titled Digital Folklore), or elsewhere in works like Oliver Laric’s clipart videos (“787 Cliparts” [2006] and “2000 Cliparts” [2010]), we should not simply recall the famous meditations on copies and repetitions, from Walter Benjamin in 1936 to Gilles Deleuze in 1968, but also a larger backdrop that evokes the very cleavages emanating from western metaphysics itself from Plato onward. For this same metaphysics of division is always already a digital metaphysics as it forever differentiates between subject and object, Being and being, essence and instance, or original and repetition. It shouldn’t come as a surprise that we see here such vivid aesthetic meditations on that same cleavage, whether or not a computer was involved.

    Another perspective on the same question would be to think about appropriation. There is a common way of talking about Internet art that goes roughly as follows: the beginning of net art in the middle to late 1990s was mostly “modernist” in that it tended to reflect back on the possibilities of the new medium, building an aesthetic from the material affordances of code, screen, browser, and jpeg, just as modernists in painting or literature built their own aesthetic style from a reflection on the specific affordances of line, color, tone, or timbre; whereas the second phase of net art, coinciding with “Web 2.0” technologies like blogging and video sharing sites, is altogether more “postmodern” in that it tends to co-opt existing material into recombinant appropriations and remixes. If something like the “WebStalker” web browser or the Jodi.org homepage is emblematic of the first period, then John Michael Boling’s “Guitar Solo Threeway,” Brody Condon’s “Without Sun,” or the Nasty Nets web surfing club, now sadly defunct, are emblematic of the second period.

    I’m not entirely unsatisfied by such a periodization, even if it tends to confuse as many things as it clarifies – not entirely unsatisfied because it indicates that appropriation too is a technique of digitality. As Martin Heidegger signals, by way of his notoriously enigmatic concept Ereignis, western thought and culture was always a process in which a proper relationship of belonging is established in a world, and so too appropriation establishes new relationships of belonging between objects and their contexts, between artists and materials, and between viewers and works of art. (Such is the definition of appropriation after all: to establish a belonging.) This is what I mean when I say that appropriation is a technique of digitality: it calls out a distinction in the object from “where it was prior” to “where it is now,” simply by removing that object from one context of belonging and separating it out into another. That these two contexts are merely different – that something has changed – is evidence enough of the digitality of appropriation. Even when the act of appropriation does not reduplicate the object or rely on multiple sources, as with the artistic ready-made, it still inaugurates a “twoness” in the appropriated object, an asterisk appended to the art work denoting that something is different.

    Takeshi Murata, “Cyborg” (2011)

    Perhaps this is why Takeshi Murata continues his exploration of the multiplicities at the core of digital aesthetics by returning to that age-old format, the still life. Is not the still life itself a kind of appropriation, in that it brings together various objects into a relationship of belonging: fig and fowl in the Dutch masters, or here the various detritus of contemporary cyber culture, from cult films to iPhones?

    Because appropriation brings things together it must grapple with a fundamental question. Whatever is brought together must form a relation. These various things must sit side-by-side with each other. Hence one might speak of any grouping of objects in terms of their “parallel” nature, that is to say, in terms of the way in which they maintain their multiple identities in parallel.

    But let us dwell for a moment longer on these agglomerations of things, and in particular their “parallel” composition. By parallel I mean the way in which digital media tend to segregate and divide art into multiple, separate channels. These parallel channels may be quite manifest, as in the separate video feeds that make up the aforementioned “Guitar Solo Threeway,” or they may issue from the lowest levels of the medium, as when video compression codecs divide the moving image into small blocks of pixels that move and morph semi-autonomously within the frame. In fact I have found it useful to speak of this in terms of the “parallel image” in order to differentiate today’s media making from that of a century ago, which Friedrich Kittler and others have chosen to label “serial” after the serial sequences of the film strip, or the rat-ta-tat-tat of a typewriter.

    Thus films like Tatjana Marusic’s “The Memory of a Landscape” (2004) or Takeshi Murata’s “Monster Movie” (2005) are genuinely digital films, for they show parallelity in inscription. Each individual block in the video compression scheme has its own autonomy and is able to write to the screen in parallel with all the other blocks. These are quite literally, then, “multichannel” videos – we might even take a cue from online gaming circles and label them “massively multichannel” videos. They are multichannel not because they require multiple monitors, but because each individual block or “channel” within the image acts as an individual micro video feed. Each color block is its own channel. Thus, the video compression scheme illustrates, through metonymy, how pixel images work in general, and, as I suggest, it also illustrates the larger currents of digitality, for it shows that these images, in order to create “an” image, must first proliferate the division of sub-images, which themselves ultimately coalesce into something resembling a whole. In other words, in order to create a “one” they must first bifurcate the single image source into two or more separate images.
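
    A toy sketch makes the “massively multichannel” point concrete: divide a frame into 8×8 blocks and let each block update on its own, as if it were its own tiny video feed. (This is a schematic illustration of block-based decomposition, not how any real codec, or either of these films, actually works.)

        import numpy as np

        def step_blocks(frame, block=8, seed=0):
            """Advance each 8x8 block of a grayscale frame independently,
            as if every block were its own tiny video channel (a toy model
            of block decomposition, not any real compression standard)."""
            rng = np.random.default_rng(seed)
            h, w = frame.shape
            out = frame.astype(int)
            for y in range(0, h, block):
                for x in range(0, w, block):
                    # Each block receives its own independent update: here, a
                    # small random brightness drift applied to it and no other.
                    out[y:y + block, x:x + block] += rng.integers(-16, 17)
            return np.clip(out, 0, 255).astype(np.uint8)

        frame = np.full((64, 64), 128, dtype=np.uint8)   # a flat gray frame
        next_frame = step_blocks(frame)
        # Adjacent blocks now differ, even though the source was uniform.
        print(next_frame[:8, :8].mean(), next_frame[8:16, 8:16].mean())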

    The digital image is thus a cellular and discrete image, consisting of separate channels multiplexed in tandem or triplicate or, greater still, into nine, twelve, twenty-four, one hundred, or indeed into a massively parallel image of a virtually infinite visuality.

    For me this generates a more appealing explanation for why art and culture has, over the last several decades, developed a growing anxiety over copies, repetitions, simulations, appropriations, reenactments – you name it. It is common to attribute such anxiety to a generalized disenchantment permeating modern life: our culture has lost its aura and can no longer discern an original from a copy due to endless proliferations of simulation. Such an assessment is only partially correct. I say only partially because I am skeptical of the romantic nostalgia that often fuels such pronouncements. For who can demonstrate with certainty that the past carried with it a greater sense of aesthetic integrity, a greater unity in art? Yet the assessment begins to adopt a modicum of sense if we consider it from a different point of view, from the perspective of a generalized digitality. For if we define the digital as “the one dividing into two,” then it would be fitting to witness works of art that proliferate these same dualities and multiplicities. In other words, even if there was a “pure” aesthetic origin it was a digital origin to begin with. And thus one needn’t fret over it having infected our so-called contemporary sensibilities.

    Instead it is important not to be blinded by the technology, but rather to determine that, within a generalized digitality, there must be some kind of differential at play. There must be something different, and without such a differential it is impossible to say that something is something (rather than something else, or indeed rather than nothing). The one must divide into something else. Nothing less and nothing more is required, only a generic difference. And this is our first insight into the “something” of the digital.

    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2006), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.

    Back to the essay

  • Is the Network a Brain?

    Is the Network a Brain?

    a review of Andrew Pickering, The Cybernetic Brain: Sketches of Another Future (University of Chicago Press, 2011)
    by Jonathan Goodwin
    ~

    Evgeny Morozov’s recent New Yorker article about Project Cybersyn in Allende’s Chile caused some controversy when critics accused Morozov of not fully acknowledging his sources. One of those sources was sociologist of science Andrew Pickering’s The Cybernetic Brain. Morozov is quoted as finding Pickering’s book “awful.” It’s unlikely that Morozov meant “awful” in the sense of “awe-inspiring,” but that was closer to my reaction after reading Pickering’s 500+ pp. work on the British tradition in cybernetics. This tradition was less militarist and more artistic, among other qualities, in Pickering’s account, than is popularly understood. I found myself greatly intrigued—if not awed—by the alternate future that his subtitle and final chapter announces. Cybernetics is now a largely forgotten dead-end in science. And the British tradition that Pickering describes had relatively little influence within cybernetics itself. So what is important about it now, and what is the nature of this other future that Pickering sketches?

    The major figures of this book, which proceeds with overviews of their careers, views, and accomplishments, are Grey Walter, Ross Ashby, Gregory Bateson, R. D. Laing, Stafford Beer, and Gordon Pask. Stuart Kauffman’s and Stephen Wolfram’s work on complexity theory also makes an appearance.[1] Laing and Bateson’s relevance may not be immediately clear. Pickering’s interest in them derives from their extension of cybernetic ideas to the emerging technologies of the self in the 1960s. Both Bateson and Laing approached schizophrenia as an adaptation to the increasing “double-binds” of Western culture, and both looked to Eastern spiritual traditions and chemical methods of consciousness-alteration as potential treatments. The Bateson and Laing material makes the most direct reference to the connection between the cybernetic tradition and the “Californian Ideology” that animates much Silicon Valley thinking. Stewart Brand was influenced by Bateson’s Steps to an Ecology of Mind (183), for example. Pickering identifies Northern California as the site where cybernetics migrated into the counterculture. As a technology of control, it is arguable that this countercultural migration has become part of the ruling ideology of the present moment. Pickering recognizes this but seems to concede that the inherent topicality would detract from the focus of his work. It is a facet that would be of interest to the readers of this “Digital Studies” section of The b2 Review, however, and I will thus return to it at the end of this review.

    Pickering’s path to Bateson and Laing originates with Grey Walter’s and Ross Ashby’s pursuit of cybernetic models of the brain. Computational models of the brain, though originally informed by cybernetic research, quickly replaced it in Pickering’s account (62). He asks why computational models of the brain quickly gathered so much cultural interest. Rodney Brooks’s robots, with their more embodied approach, Pickering argues, are in the tradition of Walter’s tortoises and outside the symbolic tradition of artificial intelligence. I find it noteworthy that the neurological underpinnings of early cybernetics were so strongly influenced by behaviorism. Computationalist approaches, associated by Pickering with the establishment or “royal” science, here, were intellectually formed by an attack on behaviorism. Pickering even addresses this point obliquely, when he wonders why literary scholars had not noticed that the octopus in Gravity’s Rainbow was apparently named “Grigori” in homage to Gregory Bateson (439n13).[2] I think one reason this hasn’t been noticed is that it’s much more likely that the name was random but for its Slavic form, which is clearly in the same pattern of references to Russian behaviorist psychology that informs Pynchon’s novel. An offshoot of behaviorism inspiring a countercultural movement devoted to freedom and experimentation seems peculiar.

    One of Pickering’s key insights into this alternate tradition of cybernetics is that its science is performative. Rather than being as theory-laden as are the strictly computationalist approaches, cybernetic science often studied complex systems as assemblages whose interactions generated novel insights. Contrast this epistemology to what critics point to as the frequent invocation of the Duhem-Quine thesis by Noam Chomsky.[3] For Pickering, Ross Ashby’s version of cybernetics was a “supremely general and protean science” (147). As it developed, the brain lost its central place and cybernetics became a “freestanding general science” (147). As I mentioned, the chapter on Ashby closes with a consideration of the complexity science of Stuart Kauffman and Stephen Wolfram. That Kauffman and Wolfram largely have worked outside mainstream academic institutions is important for Pickering.[4] Christopher Alexander’s pattern language in architecture is a third example. Pickering mentions that Alexander’s concept was influential in some areas of computer science; the notion of “object-oriented programming” is sometimes considered to have been influenced by Alexander’s ideas.

    I mention this connection because many of the alternate traditions in cybernetics have become mainstream influences in contemporary digital culture. It is difficult to imagine Laing and Bateson’s alternative therapeutic ideas having any resonance in that culture, however. The doctrine that “selves are endlessly complex and endlessly explorable” (211) is sometimes proposed as something the internet facilitates, but the inevitable result of anonymity and pseudonymity in internet discourse is the enframing of hierarchical relations. I realize this point may sound controversial to those with a more benign or optimistic view of digital culture. That this countercultural strand of cybernetic practice has clear parallels with much digital libertarian rhetoric is hard to dispute. Again, Pickering is not concerned in the book with tracing these contemporary parallels. I mention them because of my own interest and this venue’s presumed interest in the subject.

    The progression that begins with some variety of conventional rationalism, extends through a career in cybernetics, and ends in some variety of mysticism is seen with almost all of the figures that Pickering profiles in The Cybernetic Brain. Perhaps the clearest example—and most fascinating in general—is that of Stafford Beer. Philip Mirowski’s review of Pickering’s book refers to Beer as “a slightly wackier Herbert Simon.” Pickering enjoys recounting the adventures of the wizard of Prang, a work that Beer composed after he had moved to a remote Welsh village and renounced many of the world’s pleasures. Beer’s involvement in Project Cybersyn makes him perhaps the most well-known of the figures profiled in this book.[5] What perhaps fascinates Pickering more than anything else in Beer’s work is the concept of viability. From early in his career, Beer advocated for upwardly viable management strategies. The firm would not need a brain; in his model, “it would react to changing circumstances; it would grow and evolve like an organism or species, all without any human intervention at all” (225). Mirowski’s review compares Beer to Friedrich Hayek and accuses Pickering of refusing to engage with this seemingly obvious intellectual affinity.[6] Beer’s intuitions in this area led him to experiment with biological and ecological computing; Pickering surmises that Douglas Adams’s superintelligent mice derived from Beer’s murine experiments in this area (241).

    In a review of a recent translation of Stanislaw Lem’s Summa Technologiae, Pickering calls the idea that natural adaptive systems are like brains, and can be harnessed for intelligence amplification, the most “amazing idea in the history of cybernetics” (247).[7] Despite its association with the dreaded “synergy” (the original “syn” of Project Cybersyn), Beer’s viable system model never became a management fad (256). Alexander Galloway has recently written here about the “reticular fallacy,” the notion that de-centralized forms of organization are necessarily less repressive than are centralized or hierarchical forms. Beer’s viable system model proposes an emergent and non-hierarchical management system that would increase the general “eudemony” (general well-being, another of Beer’s not-quite original neologisms [272]). Beer’s turn towards Tantric mysticism seems somehow inevitable in Pickering’s narrative of his career. The syntegric icosahedron, one of Beer’s late baroque flourishes, reminded me quite a bit of a Paul Laffoley painting. Syntegration as a concept takes reticularity to a level of mysticism rarely achieved by digital utopians. Pickering concludes the chapter on Beer with a discussion of his influence on Brian Eno’s ambient music.

    Laffoley, "The Orgone Motor"
    Paul Laffoley, “The Orgone Motor” (1981). Image source: paullaffoley.net.

    The discussion of Eno chides him for not reading Gordon Pask’s explicitly aesthetic cybernetics (308). Pask is the final cybernetician of Pickering’s study and perhaps the most eccentric. Pickering describes him as a model for Patrick Troughton’s Dr. Who (475n3), and his synaesthetic work in cybernetics with projects like the Musicolor is explicitly theatrical. A theatrical performance that directly incorporates audience feedback into the production, not just at the level of applause or hiss, but in audience interest in a particular character—a kind of choose-your-own-adventure theater—was planned with Joan Littlewood (348-49). Pask’s work in interface design has been identified as an influence on hypertext (464n17). A great deal of the chapter on Pask involves his influence on British countercultural arts and architecture movements in the 1960s. Mirowski’s review briefly notes that even the anti-establishment Gordon Pask was funded by the Office of Naval Research for fifteen years (194). Mirowski also accuses Pickering of ignoring the computer as the emblematic cultural artifact of the cybernetic worldview (195). Pask is the strongest example offered of an alternate future of computation and social organization, but it is difficult to imagine his cybernetic present.

    The final chapter of Pickering’s book is entitled “Sketches of Another Future.” What is called “maker culture” combined with the “internet of things” might lead some prognosticators to imagine an increasingly cybernetic digital future. Cybernetic, that is, not in the sense of increasing what Mirowski refers to as the neoliberal “background noise of modern culture” but as a “challenge to the hegemony of modernity” (393). Before reading Pickering’s book, I would have regarded such a prediction with skepticism. I still do, but Pickering has argued that an alternate—and more optimistic—perspective is worth taking seriously.

    _____

    Jonathan Goodwin is Associate Professor of English at the University of Louisiana at Lafayette. He is working on a book about cultural representations of statistics and probability in the twentieth century.

    Back to the essay

    _____

    [1] Wolfram was born in England, though he has lived in the United States since the 1970s. Pickering taught at the University of Illinois while this book was being written, and he mentions having several interviews with Wolfram, whose company Wolfram Research is based in Champaign, Illinois (457n73). Pickering’s discussion of Wolfram’s A New Kind of Science is largely neutral; for a more skeptical view, see Cosma Shalizi’s review.

    [2] Bateson experimented with octopuses, as Pickering describes. Whether Pynchon knew about this, however, remains doubtful. Pickering’s note may also be somewhat facetious.

    [3] See the interview with George Lakoff in Ideology and Linguistic Theory: Noam Chomsky and the Deep Structure Debates, ed. Geoffrey J. Huck and John A. Goldsmith (New York: Routledge, 1995), p. 115. Lakoff’s account of Chomsky’s philosophical justification for his linguistic theories is tendentious; I mention it here because of the strong contrast, even in caricature, with the performative quality of the cybernetic research Pickering describes.

    [4] Though it is difficult to think of the Santa Fe Institute this way now.

    [5] For a detailed cultural history of Project Cybersyn, see Eden Medina, Cybernetic Revolutionaries: Technology and Politics in Allende’s Chile (MIT Press, 2011). Medina notes that Beer formed the word “algedonic” from two words meaning “pain” and “pleasure,” but the OED notes an example in the same sense from 1894. This citation does not rule out independent coinage, of course. Curiously enough, John Fowles uses the term in The Magus (1966), where it could have easily been derived from Beer.

    [6] Hayek’s name appears neither in the index nor the reference list. It does seem a curious omission in the broader intellectual context of cybernetics.

    [7] Though there is a reference to Lem’s fiction in an endnote (427n25), Summa Technologiae, a visionary exploration of cybernetic philosophy dating from the early 1960s, does not appear in Pickering’s work. A complete English translation only recently appeared, and I know of no evidence that Pickering’s principal figures were influenced by Lem at all. The book, as Pickering’s review acknowledges, is astonishingly prescient and highly recommended for anyone interested in the culture of cybernetics.

  • Network Pessimism

    Network Pessimism

    By Alexander R. Galloway
    ~

    I’ve been thinking a lot about pessimism recently. Eugene Thacker has been deep in this material for some time already. In fact he has a new, lengthy manuscript on pessimism called Infinite Resignation, which is a bit of a departure from his other books in terms of tone and structure. I’ve read it and it’s excellent. Definitely “the worst” he’s ever written! Following the style of other treatises from the history of philosophical pessimism–Leopardi, Cioran, Schopenhauer, Kierkegaard, and others–the bulk of the book is written in short aphorisms. It’s very poetic language, and some sections are driven by his own memories and meditations, all in an attempt to plumb the deepest, darkest corners of the worst the universe has to offer.

    Meanwhile, the worst can’t stay hidden. Pessimism has made it to prime time, to NPR, and even right-wing media. Despite all this attention, Eugene seems to have little interest in showing his manuscript to publishers. A true pessimist! Not to worry, I’m sure the book will see the light of day eventually. Or should I say dead of night? When it does, the book is sure to sadden, discourage, and generally worsen the lives of Thacker fans everywhere.

    Interestingly pessimism also appears in a number of other authors and fields. I’m thinking, for instance, of critical race theory and the concept of Afro-pessimism. The work of Fred Moten and Frank B. Wilderson, III is particularly interesting in that regard. Likewise queer theory has often wrestled with pessimism, be it the “no future” debates around reproductive futurity, or what Anna Conlan has simply labeled “homo-pessimism,” that is, the way in which the “persistent association of homosexuality with death and oppression contributes to a negative stereotype of LGBTQ lives as unhappy and unhealthy.”[1]

    In his review of my new book, Andrew Culp made reference to how some of this material has influenced me. I’ll be posting more on Moten and these other themes in the future, but let me here describe, in very general terms, how the concept of pessimism might apply to contemporary digital media.

    *

    A previous post was devoted to the reticular fallacy, defined as the false assumption that the erosion of hierarchical organization leads to an erosion of organization as such. Here I’d like to address the related question of reticular pessimism or, more simply, network pessimism.

    Network pessimism relies on two basic assumptions: (1) “everything is a network”; (2) “the best response to networks is more networks.”

    Who says everything is a network? Everyone, it seems. In philosophy, Bruno Latour: ontology is a network. In literary studies, Franco Moretti: Hamlet is a network. In the military, Donald Rumsfeld: the battlefield is a network. (But so too our enemies are networks: the terror network.) Art, architecture, managerial literature, computer science, neuroscience, and many other fields–all have shifted prominently in recent years toward a network model. Most important, however, is the contemporary economy and the mode of production. Today’s most advanced companies are essentially network companies. Google monetizes the shape of networks (in part via clustering algorithms). Facebook has rewritten subjectivity and social interaction along the lines of canalized and discretized network services. The list goes on and on. Thus I characterize the first assumption — “everything is a network” — as a kind of network fundamentalism. It claims that whatever exists in the world appears naturally in the form of a system, an ecology, an assemblage, in short, as a network.

    Ladies and gentlemen, behold the good news: postmodernism is definitively over! We have a new grand récit. As metanarrative, the network will guide us into a new Dark Age.

    If the first assumption expresses a positive dogma or creed, the second is more negative or nihilistic. The second assumption — that the best response to networks is more networks — is also evident in all manner of social and political life today. Eugene and I described this phenomenon at greater length in The Exploit, but consider a few different examples from contemporary debates… In military theory: network-centric warfare is the best response to terror networks. In Deleuzian philosophy: the rhizome is the best response to schizophrenic multiplicity. In autonomist Marxism: the multitude is the best response to empire. In the environmental movement: ecologies and systems are the best response to the systemic colonization of nature. In computer science: distributed architectures are the best response to bottlenecks in connectivity. In economics: heterogeneous “economies of scope” are the best response to the distributed nature of the “long tail.”

    To be sure, there are many sites today where networks still confront power centers. The point is not to deny the continuing existence of massified, centralized sovereignty. But at the same time it’s important to contextualize such confrontations within a larger ideological structure, one that inoculates the network form and recasts it as the exclusive site of liberation, deviation, political maturation, complex thinking, and indeed the very living of life itself.

    Why label this a pessimism? For the same reasons that queer theory and critical race theory are grappling with pessimism: Is alterity a death sentence? Is this as good as it gets? Is this all there is? Can we imagine a parallel universe different from this one? (Although the pro-pessimism camp would likely state it in the reverse: We must destabilize and annihilate all normative descriptions of the “good.” This world isn’t good, and hooray for that!)

    So what’s the problem? Why should we be concerned about network pessimism? Let me state clearly so there’s no misunderstanding: pessimism isn’t the problem here. Likewise, networks are not the problem. (Let no one label me “anti-network” or “anti-pessimism” — in fact I’m not even sure what either of those positions would mean.) The issue, as I see it, is that network pessimism deploys and sustains a specific dogma, confining both networks and pessimism to a single, narrow ideological position. It’s this narrow-mindedness that should be questioned.

    Specifically I can see three basic problems with network pessimism: the problem of presentism, the problem of ideology, and the problem of the event.

    The problem of presentism refers to the way in which networks and network thinking are, by design, allergic to historicization. This exhibits itself in a number of different ways. Networks arrive on the scene at the proverbial “end of history” (and they do so precisely because they help end this history). Ecological and systems-oriented thinking, while admittedly always temporal by nature, gained popularity as a kind of solution to the problems of diachrony. Space and landscape take the place of time and history. As Fredric Jameson has noted, the “spatial turn” of postmodernity goes hand in hand with a denigration of the “temporal moment” of previous intellectual movements.

    Fritz Kahn, “Der Mensch als Industriepalast (Man as Industrial Palace)” (Stuttgart, 1926). Image source: NIH

    From Hegel’s history to Luhmann’s systems. From Einstein’s general relativity to Riemann’s complex surfaces. From phenomenology to assemblage theory. From the “time image” of cinema to the “database image” of the internet. From the old mantra always historicize to the new mantra always connect.

    During the age of clockwork, the universe was thought to be a huge mechanism, with the heavens rotating according to the music of the spheres. When the steam engine was the source of newfound power, the world suddenly became a dynamo of untold thermodynamic force. After full-fledged industrialization, the body became a factory. Technologies and infrastructures are seductive metaphors. So it’s no surprise (and no coincidence) that today, in the age of the network, a new template imprints itself on everything in sight. In other words, the assumption “everything is a network” gradually falls apart into a kind of tautology of presentism. “Everything right now is a network…because everything right now has been already defined as a network.”

    This leads to the problem of ideology. Again we’re faced with an existential challenge, because network technologies were largely invented as a non-ideological or extra-ideological structure. When writing Protocol I interviewed some of the computer scientists responsible for the basic internet protocols and most of them reported that they “have no ideology” when designing networks, that they are merely interested in “code that works” and “systems that are efficient and robust.” In sociology and philosophy of science, figures like Bruno Latour routinely describe their work as “post-critical,” merely focused on the direct mechanisms of network organization. Hence ideology as a problem to be forgotten or subsumed: networks are specifically conceived and designed as those things that are both non-ideological in their conception (we just want to “get things done”) and post-ideological in their architecture (in that they acknowledge and co-opt the very terms of previous ideological debates, things like heterogeneity, difference, agency, and subject formation).

    The problem of the event indicates a crisis for the very concept of events themselves. Here the work of Alain Badiou is invaluable. Network architectures are the perfect instantiation of what Badiou derisively labels “democratic materialism,” that is, a world in which there are “only bodies and languages.” In Badiou’s terms, if networks are the natural state of the situation and there is no way to deviate from nature, then there is no event, and hence no possibility for truth. Networks appear, then, as the consummate “being without event.”

    What could be worse? If networks are designed to accommodate massive levels of contingency — as with the famous Robustness Principle — then they are also exceptionally adept at warding off “uncontrollable” change wherever it might arise. If everything is a network, then there’s no escape, there’s no possibility for the event.
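
    The Robustness Principle (“be conservative in what you do, be liberal in what you accept from others”) can be seen in miniature in a sketch like the following – a hypothetical header parser, liberal to a fault, that absorbs sloppy or malformed input rather than rejecting it. This is the kind of built-in tolerance for contingency at issue here; the code is an illustrative assumption of mine, not drawn from any particular protocol implementation.

        def parse_header_line(raw):
            """A 'liberal in what you accept' header parser: tolerate stray
            whitespace, odd casing, and malformed lines rather than failing
            (a toy illustration of the Robustness Principle)."""
            if ":" not in raw:
                return None  # silently absorb the malformed line
            name, _, value = raw.partition(":")
            return name.strip().lower(), value.strip()

        # Messy input from a sloppy sender is accommodated without any error.
        for line in ["Content-Type : text/html", "X-BROKEN", "  host:example.org  "]:
            print(parse_header_line(line))
        # ('content-type', 'text/html') / None / ('host', 'example.org')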

    Jameson writes as much in The Seeds of Time when he says that it is easier to imagine the end of the earth and the end of nature than it is to imagine the ends of capitalism. Network pessimism, in other words, is really a kind of network defeatism in that it makes networks the alpha and omega of our world. It’s easier to imagine the end of that world than it is to discard the network metaphor and imagine a kind of non-world in which networks are no longer dominant.

    In sum, we shouldn’t give in to network pessimism. We shouldn’t subscribe to the strong claim that everything is a network. (Nor should we subscribe to the softer claim, that networks are merely the most common, popular, or natural architecture for today’s world.) Further, we shouldn’t think that networks are the best response to networks. Instead we must ask the hard questions. What is the political fate of networks? Did heterogeneity and systematicity survive the Twentieth Century? If so, at what cost? What would a non-net look like? And does thinking have a future without the network as guide?

    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2006), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.

    Back to the essay
    _____

    Notes

    [1] Anna Conlan, “Representing Possibility: Mourning, Memorial, and Queer Museology,” in Gender, Sexuality and Museums, ed. Amy K. Levin (London: Routledge, 2010), 253-263.

  • The Reticular Fallacy

    The Reticular Fallacy

    By Alexander R. Galloway
    ~

    We live in an age of heterogeneous anarchism. Contingency is king. Fluidity and flux win over solidity and stasis. Becoming has replaced being. Rhizomes are better than trees. To be political today, one must laud horizontality. Anti-essentialism and anti-foundationalism are the order of the day. Call it “vulgar ’68-ism.” The principles of social upheaval, so associated with the new social movements in and around 1968, have succeeded in becoming the very bedrock of society at the new millennium.

    But there’s a flaw in this narrative, or at least a part of the story that strategically remains untold. The “reticular fallacy” can be broken down into two key assumptions. The first is an assumption about the nature of sovereignty and power. The second is an assumption about history and historical change. Consider them both in turn.

    (1) First, under the reticular fallacy, sovereignty and power are defined in terms of verticality, centralization, essence, foundation, or rigid creeds of whatever kind (viz. dogma, be it sacred or secular). Thus the sovereign is the one who is centralized, who stands at the top of a vertical order of command, who rests on an essentialist ideology in order to retain command, who asserts, dogmatically, unchangeable facts about his own essence and the essence of nature. This is the model of kings and queens, but also egos and individuals. It is what Barthes means by author in his influential essay “The Death of the Author,” or Foucault in his “What is an Author?” This is the model of the Prince, so often invoked in political theory, or the Father invoked in psychoanalytic theory. In Derrida, the model appears as logos, that is, the special way or order of word, speech, and reason. Likewise, arkhe: a term that means both beginning and command. The arkhe is the thing that begins, and in so doing issues an order or command to guide whatever issues from such a beginning. Or as Rancière so succinctly put it in his Hatred of Democracy, the arkhe is both “commandment and commencement.” These are some of the many aspects of sovereignty and power as defined in terms of verticality, centralization, essence, and foundation.

    (2) The second assumption of the reticular fallacy is that, given the elimination of such dogmatic verticality, there will follow an elimination of sovereignty as such. In other words, if the aforementioned sovereign power should crumble or fall, for whatever reason, the very nature of command and organization will also vanish. Under this second assumption, the structure of sovereignty and the structure of organization become coterminous, superimposed in such a way that the shape of organization assumes the identical shape of sovereignty. Sovereign power is vertical, hence organization is vertical; sovereign power is centralized, hence organization is centralized; sovereign power is essentialist, hence organization, and so on. Here we see the claims of, let’s call it, “naïve” anarchism (the non-arkhe, or non-foundation), which assumes that repressive force lies in the hands of the bosses, the rulers, or the hierarchy per se, and thus after the elimination of such hierarchy, life will revert to a more direct form of social interaction. (I say this not to smear anarchism in general, and will often wish to defend a form of anarcho-syndicalism.) At the same time, consider the case of bourgeois liberalism, which asserts the rule of law and constitutional right as a way to mitigate the excesses of both royal fiat and popular caprice.

    reticular connective tissue
    source: imgkid.com

    We name this the “reticular” fallacy because, during the late Twentieth Century and accelerating at the turn of the millennium with new media technologies, the chief agent driving the kind of historical change described in the above two assumptions was the network or rhizome, the structure of horizontal distribution described so well in Deleuze and Guattari. The change is evident in many different corners of society and culture. Consider mass media: the uni-directional broadcast media of the 1920s or ’30s gradually gave way to multi-directional distributed media of the 1990s. Or consider the mode of production, and the shift from a Fordist model rooted in massification, centralization, and standardization, to a post-Fordist model reliant more on horizontality, distribution, and heterogeneous customization. Consider even the changes in theories of the subject, shifting as they have from a more essentialist model of the integral ego, however fraught by the volatility of the unconscious, to an anti-essentialist model of the distributed subject, be it postmodernism’s “schizophrenic” subject or the kind of networked brain described by today’s most advanced medical researchers.

    Why is this a fallacy? What is wrong about the above scenario? The problem isn’t so much with the historical narrative. The problem lies in an unwillingness to derive an alternative form of sovereignty appropriate for the new rhizomatic societies. Opponents of the reticular fallacy claim, in other words, that horizontality, distributed networks, anti-essentialism, etc., have their own forms of organization and control, and indeed should be analyzed accordingly. In the past I’ve used the concept of “protocol” to describe such a scenario as it exists in digital media infrastructure. Others have used different concepts to describe it in different contexts. On the whole, though, opponents of the reticular fallacy have not effectively made their case, myself included. The notion that rhizomatic structures are corrosive of power and sovereignty is still the dominant narrative today, evident across both popular and academic discourses. From talk of the “Twitter revolution” during the Arab Spring, to the ideologies of “disruption” and “flexibility” common in corporate management speak, to the putative egalitarianism of blog-based journalism, to the growing popularity of the Deleuzian and Latourian schools in philosophy and theory: all of these reveal the contemporary assumption that networks are somehow different from sovereignty, organization, and control.

    To summarize, the reticular fallacy refers to the following argument: since power and organization are defined in terms of verticality, centralization, essence, and foundation, the elimination of such things will prompt a general mollification if not elimination of power and organization as such. Such an argument is false because it doesn’t take into account the fact that power and organization may inhabit any number of structural forms. Centralized verticality is only one form of organization. The distributed network is simply a different form of organization, one with its own special brand of management and control.

    Consider the kind of methods and concepts still popular in critical theory today: contingency, heterogeneity, anti-essentialism, anti-foundationalism, anarchism, chaos, plasticity, flux, fluidity, horizontality, flexibility. Such concepts are often praised and deployed in theories of the subject, analyses of society and culture, even descriptions of ontology and metaphysics. The reticular fallacy does not invalidate such concepts. But it does put them in question. We cannot assume that such concepts are merely descriptive or neutrally empirical. Given the way in which horizontality, flexibility, and contingency are sewn into the mode of production, such “descriptive” claims are at best mirrors of the economic infrastructure and at worst ideologically suspect. At the same time, we cannot simply assume that such concepts are, by nature, politically or ethically desirable in themselves. Rather, we ought to reverse the line of inquiry. The many qualities of rhizomatic systems should be understood not as the pure and innocent laws of a newer and more just society, but as the basic tendencies and conventional rules of protocological control.


    _____

    Alexander R. Galloway is a writer and computer programmer working on issues in philosophy, technology, and theories of mediation. Professor of Media, Culture, and Communication at New York University, he is author of several books and dozens of articles on digital media and critical theory, including Protocol: How Control Exists after Decentralization (MIT, 2006), Gaming: Essays on Algorithmic Culture (University of Minnesota, 2006), The Interface Effect (Polity, 2012), and most recently Laruelle: Against the Digital (University of Minnesota, 2014), reviewed here earlier in 2014. Galloway has recently been writing brief notes on media and digital culture and theory at his blog, on which this post first appeared.

    Back to the essay