Author: boundary2

  • The Eversion of the Digital Humanities

    by Brian Lennon

    on The Emergence of the Digital Humanities by Steven E. Jones

    1

    Steven E. Jones begins his Introduction to The Emergence of the Digital Humanities (Routledge, 2014) with an anecdote concerning a speaking engagement at the Illinois Institute of Technology in Chicago. “[M]y hosts from the Humanities department,” Jones tells us,

    had also arranged for me to drop in to see the fabrication and rapid-prototyping lab, the Idea Shop at the University Technology Park. In one empty room we looked into, with schematic drawings on the walls, a large tabletop machine jumped to life and began whirring, as an arm with a router moved into position. A minute later, a student emerged from an adjacent room and adjusted something on the keyboard and monitor attached by an extension arm to the frame for the router, then examined an intricately milled block of wood on the table. Next door, someone was demonstrating finely machined parts in various materials, but mostly plastic, wheels within bearings, for example, hot off the 3D printer….

    What exactly, again, was my interest as a humanist in taking this tour, one of my hosts politely asked?1

    It is left almost entirely to more or less clear implication, here, that Jones’s humanities department hosts had arranged the expedition at his request, and mainly or even only to oblige a visitor’s unusual curiosity, which we are encouraged to believe his hosts (if “politely”) found mystifying. Any reader of this book must ask herself, first, if she believes this can really have occurred as reported; and, if the answer to that question is yes, whether such a genuinely unlikely and unusual scenario — the presumably full-time, salaried employees of an Institute of Technology left baffled by a visitor’s remarkable curiosity about their employer’s very raison d’être — warrants any generalization at all. For that is how Jones proceeds: by generalization, first of all from a strained and improbably dramatic attempt at defamiliarization, in the apparent confidence that this anecdote illuminating the spirit of the digital humanities will charm — whom, exactly?

    It must be said that Jones’s history of “digital humanities” is refreshingly direct and initially, at least, free of obfuscation, linking the emergence of what it denotes to events in roughly the decade preceding the book’s publication, though his reading of those events is tendentious. It was the “chastened” retrenchment after the dot-com bubble in 2000, Jones suggests (rather, just for example, than the bubble’s continued inflation by other means) that produced the modesty of companies like our beloved Facebook and Twitter, along with their modest social networking platform-products, as well as the profound modesty of Google Inc. initiatives like Google Books (“a development of particular interest to humanists,” we are told2) and Google Maps. Jones is clearer-headed when it comes to the disciplinary history of “digital humanities” as a rebaptism of humanities computing and thus — though he doesn’t put it this way — a catachrestic asseveration of traditional (imperial-nationalist) philology like its predecessor:

    It’s my premise that what sets DH apart from other forms of media studies, say, or other approaches to the cultural theory of computing, ultimately comes through its roots in (often text-based) humanities computing, which always had a kind of mixed-reality focus on physical artifacts and archives.3

    Jones is also clear-headed on the usage history of “digital humanities” as a phrase in the English language, linking it to moments of consolidation marked by Blackwell’s Companion to Digital Humanities, the establishment of the National Endowment for the Humanities Office of Digital Humanities, and higher-education journalism covering the annual Modern Language Association of America conventions. It is perhaps this sensitivity to “digital humanities” as a phrase whose roots lie not in original scholarship or cultural criticism itself (as was still the case with “deconstruction” or “postmodernism,” even at their most shopworn) but in the dependent, even parasitic domains of reference publishing, grant-making, and journalism that leads Jones to declare “digital humanities” a “fork” of humanities computing, rather than a Kuhnian paradigm shift marking otherwise insoluble structural conflict in an intellectual discipline.

    At least at first. Having suggested it, Jones then discards the metaphor drawn from the tree structures of software version control, turning to “another set of metaphors” describing the digital humanities as having emerged not “out of the primordial soup” but “into the spotlight” (Jones, 5). We are left to guess at the provenance of this second metaphor, but its purpose is clear: to construe the digital humanities, both phenomenally and phenomenologically, as the product of a “shift in focus, driven […] by a new set of contexts, generating attention to a range of new activities” (5).

    Change; shift; new, new, new. Not a branch or a fork, not even a trunk: we’re now in the ecoverse of history and historical time, in its collision with the present. The appearance and circulation of the English-language phrase “digital humanities” can be documented — that is one of the things that professors of English like Jones do especially well, when they care to. But “changes in the culture,” much more broadly, within only the last ten years or so? No scholar in any discipline is particularly well trained, well positioned, or even well suited to diagnosing those; and scholars in English studies won’t be at the top of anyone’s list. Indeed, Jones very quickly appeals to “author William Gibson” for help, settling on the emergence of the digital humanities as a response to what Gibson called “the eversion of cyberspace,” in its ostensibly post-panopticist colonization of the physical world.4 It makes for a rather inarticulate and self-deflating statement of argument, in which on its first appearance eversion, ambiguously, appears to denote the response as much as its condition or object:

    My thesis is simple: I think that the cultural response to changes in technology, the eversion, provides an essential context for understanding the emergence of DH as a new field of study in the new millennium.5

    Jones offers weak support for the grandiose claim that “we can roughly date the watershed moment when the preponderant collective perception changed to 2004–2008” (21). Second Life “peaked,” we are told, while World of Warcraft “was taking off”; Nintendo introduced the Wii; then Facebook “came into its own,” and was joined by Twitter and Foursquare, then Apple’s iPhone. Even then (and setting aside the question of whether such benchmarking is acceptable evidence), for the most part Jones’s argument, such as it is, is that something is happening because we are talking about something happening.

    But who are we? Jones’s is the typical deference of the scholar to the creative artist, unwilling to challenge the latter’s utter dependence on meme engineering, at least where someone like Gibson is concerned; and Jones’s subsequent turn to the work of a scholar like N. Katherine Hayles on the history of cybernetics comes too late to amend the impression that the order of things here is marked first by gadgets, memes, and conversations about gadgets and memes, and only subsequently by ideas and arguments about ideas. The generally unflattering company among whom Hayles is placed (Clay Shirky, Nathan Jurgenson) does little to move us out of the shallows, and Jones’s profoundly limited range of literary reference, even within a profoundly narrowed frame — it’s Gibson, Gibson, Gibson all the time, with the usual cameos by Bruce Sterling and Neal Stephenson — doesn’t help either.

    Jones does have one problem with the digital humanities: it ignores games. “My own interest in games met with resistance from some anonymous peer reviewers for the program for the DH 2013 conference, for example,” he tells us (33). “[T]he digital humanities, at least in some quarters, has been somewhat slow to embrace the study of games” (59). “The digital humanities could do worse than look to games” (36). And so on: there is genuine resentment here.

    But nobody wants to give a hater a slice of the pie, and a Roman peace mandates that such resentment be sublated if it is to be, as we say, taken seriously. And so in a magical resolution of that tension, the digital humanities turns out to be constituted by what it accidentally ignores or actively rejects, in this case — a solution that sweeps antagonism under the rug as we do in any other proper family. “[C]omputer-based video games embody procedures and structures that speak to the fundamental concerns of the digital humanities” (33). “Contemporary video games offer vital examples of digital humanities in practice” (59). If gaming “sounds like what I’ve been describing as the agenda of the digital humanities, it’s no accident” (144).

    Some will applaud Jones’s niceness on this count. It may strike others as desperately friendly, a lingering under a big tent as provisional as any other tent, someday to be replaced by a building, if not by nothing. Few of us will deny recognition to Second Life, World of Warcraft, Wii, Facebook, Twitter, etc. as cultural presences, at least for now. But Jones’s book is also marked by slighter and less sensibly chosen benchmarks, less sensibly chosen because Jones’s treatment of them, in a book whose ambition is to preach to the choir, simply imputes their cultural presence. Such brute force argument drives the pathos that Jones surely feels, as a scholar — in the recognition that among modern institutions, it is only scholarship and the law that preserve any memory at all — into a kind of melancholic unconscious, whence his objects return to embarrass him. “[A]s I write this,” we read, “QR codes show no signs yet of fading away” (41). Quod erat demonstrandum.

    And it is just there, in such a melancholic unconscious, that the triumphalism of the book’s title, and the “emergence of the digital humanities” that it purports to mark, claim, or force into recognition, straightforwardly gives itself away. For the digital humanities will pass away, and rather than being absorbed into the current order of things, as digital humanities enthusiasts like to believe happened to “high theory” (it didn’t happen), the digital humanities seems more likely, at this point, to end as a blank anachronism, overwritten by the next conjuncture in line with its own critical mass of prognostications.

    2

    To be sure, who could deny the fact of significant “changes in the culture” since 2000, in the United States at least, and at regular intervals: 2001, 2008, 2013…? Warfare — military in character, but when that won’t do, economic; of any interval, but especially when prolonged and deliberately open-ended; of any intensity, but especially when flagrantly extrajudicial and opportunistically, indeed sadistically asymmetrical — will do that to you. No one who sets out to historicize the historical present can afford to ignore the facts of present history, at the very least — but the fact is that Jones finds such facts unworthy of comment, and in that sense, for all its pretense to worldliness, The Emergence of the Digital Humanities is an entirely typical product of the so-called ivory tower, wherein arcane and plain speech alike are crafted to euphemize and thus redirect and defuse the conflicts of the university with other social institutions, especially those other institutions who command the university to do this or do that. To take the ambiguity of Jones’s thesis statement (as quoted above) at its word: what if the cultural response that Jones asks us to imagine, here, is indeed and itself the “eversion” of the digital humanities, in one of the metaphorical senses he doesn’t quite consider: an autotomy or self-amputation that, as McLuhan so enjoyed suggesting in so many different ways, serves to deflect the fact of the world as a whole?

    There are few moments of outright ignorance in The Emergence of the Digital Humanities — how could there be, in the security of such a narrow channel?6 Still, pace Jones’s basic assumption here (it is not quite an argument), we might understand the emergence of the digital humanities as the emergence of a conversation that is not about something — cultural change, etc. — as much as it is an attempt to avoid conversing about something: to avoid discussing such cultural change in its most salient and obvious flesh-and-concrete manifestations. “DH is, of course, a socially constructed phenomenon,” Jones tells us (7) — yet “the social,” here, is limited to what Jones himself selects, and selectively indeed. “This is not a question of technological determinism,” he insists. “It’s a matter of recognizing that DH emerged, not in isolation, but as part of larger changes in the culture at large and that culture’s technological infrastructure” (8). Yet the largeness of those larger changes is smaller than any truly reasonable reader, reading any history of the past decade, might have reason to expect. How pleasant that such historical change was “intertwined with culture, creativity, and commerce” (8) — not brutality, bootlicking, and bank fraud. Not even the modest and rather opportunistic gloom of Gibson’s 2010 New York Times op-ed entitled “Google’s Earth” finds its way into Jones’s discourse, despite the extended treatment that Gibson’s “eversion” gets here.

    From our most ostensibly traditional scholarly colleagues, toiling away in their genuine and genuinely book-dusty modesty, we don’t expect much respect for the present moment (which is why they often surprise us). But The Emergence of the Digital Humanities is, at least in ambition, a book about cultural change over the last decade. And such historiographic elision is substantive — enough so to warrant impatient response. While one might not want to say that nothing good can have emerged from the cultural change of the period in question, it would be infantile to deny that conditions have been unpropitious in the extreme, possibly as unpropitious as they have ever been, in U.S. postwar history — and that claims for the value of what emerges into institutionality and institutionalization, under such conditions, deserve extra care and, indeed, defense in advance, if one wants not to invite a reasonably caustic skepticism.

    When Jones does engage in such defense, it is weakly argued. To construe the emergence of the digital humanities as non-meaninglessly concurrent with the emergence of yet another wave of mass educational automation (in the MOOC hype that crested in 2013), for example, is wrong not because Jones can demonstrate that their concurrence is the concurrence of two entirely segregated genealogies — one rooted in Silicon Valley ideology and product marketing, say, and one utterly and completely uncaused and untouched by it — but because to observe their concurrence is “particularly galling” to many self-identified DH practitioners (11). Well, excuse me for galling you! “DH practitioners I know,” Jones informs us, “are well aware of [the] complications and complicities” of emergence in an age of precarious labor, “and they’re often busy answering, complicating, and resisting such opportunistic and simplistic views” (10). Argumentative non sequitur aside, that sounds like a lot of work undertaken in self-defense — more than anyone really ought to have to do, if they’re near to the right side of history. Finally, “those outside DH,” Jones opines in an attempt at counter-critique, “often underestimate the theoretical sophistication of many in computing,” who “know better than many of their humanist critics that their science is provisional and contingent” (10): a statement that will only earn Jones super-demerits from those of such humanist critics — they are more numerous than the likes of Jones ever seem to suspect — who came to the humanities with scientific and/or technical aptitudes, sometimes with extensive educational and/or professional training and experience, and whose “sometimes world-weary and condescending skepticism” (10) is sometimes very well-informed and well-justified indeed, and certain to outlive Jones’s winded jabs at it.

    Jones is especially clumsy in confronting the charge that the digital humanities is marked by a forgetting or evasion of the commitment to cultural criticism foregrounded by other, older and now explicitly competing formations, like so-called new media studies. Citing the suggestion by “media scholar Nick Montfort” that “work in the digital humanities is usually considered to be the digitization and analysis of pre-digital cultural artifacts, not the investigation of contemporary computational media,” Jones remarks that “Montfort’s own work […] seems to me to belie the distinction,”7 as if Montfort — or anyone making such a statement — were simply deluded about his own work, or about his experience of a social economy of intellectual attention under identifiably specific social and historical conditions, or else merely expressing pain at being excluded from a social space to which he desired admission, rather than objecting on principle to a secessionist act of imagination.8

    3

    Jones tells us that he doesn’t “mean to gloss over the uneven distribution of [network] technologies around the world, or the serious social and political problems associated with manufacturing and discarding the devices and maintaining the server farms and cell towers on which the network depends” — but he goes ahead and does it anyway, and without apology or evident regret. “[I]t’s not my topic in this book,” we are told, “and I’ve deliberately restricted my focus to the already-networked world” (3). The message is clear: this is a book for readers who will accept such circumscription, in what they read and contemplate. Perhaps this is what marks the emergence of the digital humanities, in the re-emergence of license for restrictive intellectual ambition and a generally restrictive purview: a bracketing of the world that was increasingly discredited, and discredited with increasing ferocity, just by the way, in the academic humanities in the course of the three decades preceding the first Silicon Valley bubble. Jones suggests that “it can be too easy to assume a qualitative hierarchical difference in the impact of networked technology, too easy to extend the deeper biases of privilege into binary theories of the global ‘digital divide’” (4), and one wonders what authority to grant to such a pronouncement when articulated by someone who admits he is not interested, at least in this book, in thinking about how an — how any — other half lives. It’s the latter, not the former, that is the easy choice here. (Against a single, entirely inconsequential squib in Computer Business Review entitled “Report: Global Digital Divide Getting Worse,” an almost obnoxiously perfunctory footnote pits “a United Nations Telecoms Agency report” from 2012. This is not scholarship.)

    Thus it is that, read closely, the demand for finitude in the one capacity in which we are non-mortal — in thought and intellectual ambition — and the more or less cheerful imagination of an implied reader satisfied by such finitude, become passive microaggressions aimed at another mode of the production of knowledge, whose expansive focus on a theoretical totality of social antagonism (what Jones calls “hierarchical difference”) and justice (what he calls “binary theories”) makes the author of The Emergence of the Digital Humanities uncomfortable, at least on its pages.

    That’s fine, of course. No: no, it’s not. What I mean to say is that it’s unfair to write as if the author of The Emergence of the Digital Humanities alone bears responsibility for this particular, certainly overdetermined state of affairs. He doesn’t — how could he? But he’s getting no help, either, from most of those who will be more or less pleased by the title of his book, and by its argument, such as it is: because they want to believe they have “emerged” along with it, and with that tension resolved, its discomforts relieved. Jones’s book doesn’t seriously challenge that desire, its (few) hedges and provisos notwithstanding. If that desire is more anxious now than ever, as digital humanities enthusiasts find themselves scrutinized from all sides, it is with good reason.
    _____

    Brian Lennon is Associate Professor of English and Comparative Literature at Pennsylvania State University and the author of In Babel’s Shadow: Multilingual Literatures, Monolingual States (University of Minnesota Press, 2010).
    _____

    notes:
    1. Jones, 1.

    2. Jones, 4. “Interest” is presumed to be affirmative, here, marking one elision of the range of humanistic critical and scholarly attitudes toward Google generally and the Google Books project in particular. And of the unequivocally less affirmative “interest” of creative writers as represented by the Authors Guild, just for example, Jones has nothing to say: another elision.

    3. Jones, 13.

    4. See Gibson.

    5. Jones, 5.

    6. As eager as any other digital humanities enthusiast to accept Franco Moretti’s legitimation of DH, but apparently incurious about the intellectual formation, career and body of work that led such a big fish to such a small pond, Jones opines that Moretti’s “call for a distant reading” stands “opposed to the close reading that has been central to literary studies since the late nineteenth century” (Jones, 62). “Late nineteenth century” when exactly, and where (and how, and why)? one wonders. But to judge by what Jones sees fit to say by way of explanation — that is, nothing at all — this is mere hearsay.

    7. Jones, 5. See also Montfort.

    8. As further evidence that Montfort’s statement is a mischaracterization or expresses a misunderstanding, Jones suggests the fact that “[t]he Electronic Literature Organization itself, an important center of gravity for the study of computational media in which Montfort has been instrumental, was for a time housed at the Maryland Institute for Technology in the Humanities (MITH), a preeminent DH center where Matthew Kirschenbaum served as faculty advisor” (Jones, 5–6). The non sequiturs continue: “digital humanities” includes the study of computing and media because “self-identified practitioners doing DH” study computing and media (Jones, 6); the study of computing and media is also “digital humanities” because the study of computing and digital media might be performed at institutions like MITH or George Mason University’s Roy Rosenzweig Center for History and New Media, which are “digital humanities centers” (although the phrase “digital humanities” appears nowhere in their names); “digital humanities” also adequately describes work in “media archaeology” or “media history,” because such work has “continued to influence DH” (Jones, 6); new media studies is a component of the digital humanities because some scholars suggest it is so, and others cannot be heard to object, at least after one has placed one’s fingers in one’s ears; and so on.

    (feature image: “Bandeau – Manifeste des Digital Humanities,” uncredited; originally posted on flickr.)

  • Democracy: An Unfinished Project

    Leaflet from Malayan Emergency
    by Susan Buck-Morss
    ~

    This essay criticizes Ahmet Davutoğlu’s proposal that Islamic civilization complete the “unfinished project of modernity” (Jürgen Habermas), by challenging the concept of civilization itself. As scholars in multiple disciplines have demonstrated, civilizations are hybrid constructions that cannot be contained within a uniform conceptual frame, such as Islamic “authenticity.” The past is shared, and the present is as well. The Arab Spring demonstrates that modernity confronts political actors with similar problems, whatever their background. The essay addresses successive paradoxes within the unfinished project of democracy: the contradiction between free markets (capitalist inequality) and free societies (political equality), the hierarchical relationship between the people and their leaders (Jacques Rancière’s The Ignorant Schoolmaster is discussed), and the lack of democracy between nations within the present world order.

    Read the full essay here.

    Summer 2014

    Feature Image: leaflet dropped on MNLA forces during the Malayan Emergency, offering $1,000 in exchange for the individual surrender of targeted MCP insurgents and the turning in of their Bren gun. The conflict was labeled an “emergency” rather than a “war,” it is suggested, for insurance purposes.

  • The Lenses of Failure

    The Art of Failure

    by Nathan Altice

    On From Software’s Dark Souls II and Jesper Juul’s The Art of Failure

    ~

    I am speaking to a cat named Sweet Shalquoir. She lounges on a desk in a diminutive house near the center of Majula, a coastal settlement that harbors a small band of itinerant merchants, tradespeople, and mystics. Among Shalquoir’s wares is the Silvercat ring, whose circlet resembles a leaping, blue-eyed cat.

    ‘You’ve seen that gaping hole over there? Well, there’s nasty little vermin down there,’ Shalquoir says, observing my window shopping. ‘Although who you seek is even further below.’ She laughs. She knows her costly ring grants its wearer a cat-like affinity for lengthy drops. I check my inventory. Having just arrived in Majula, I have few souls on hand.

    I turn from Shalquoir and exit the house ringless. True to her word, a yawning chasm opens before me, its perimeter edged in slabbed stonework and crumbling statues but otherwise unmarked and unguarded. One could easily fall in while sprinting from house to house in search of Majula’s residents. Wary of an accidental fall, I nudge toward its edge.

    The pit has a mossy patina, as if it was once a well for giants that now lies parched after drinking centuries of Majula’s sun. Its surface is smooth save for a few distant torches sawing at the dark and several crossbeams that bisect its diameter at uneven intervals. Their configuration forms a makeshift spiral ladder. Corpses are slung across the beams like macabre dolls, warning wanderers fool enough to chase after nasty little vermin. But atop the first corpse gleams a pinprick of ethereal light, both a beacon to guide the first lengthy drop and a promise of immediate reward if one survives.

    Silvercat ring be damned, I think I can make it.

    I position myself parallel to the first crossbeam, eyes fixed on that glimmering point. I jump.

    The Jump

    [Dark Souls II screenshots source: ItsBlueLizardJello via YouTube]

    For a breathless second, I plunge toward the beam. My aim is true—but my body is weak. I collapse, sprawled atop the lashed wooden planks, inches from my coveted jewel. I evaporate into a green vapor as two words appear in the screen’s lower half: ‘YOU DIED.’

    Decisions such as these abound in Dark Souls II, the latest entry in developer From Software’s cult-to-crossover-hit series of games bearing the Souls moniker. The first, Demon’s Souls, debuted on the PlayStation 3 in 2009, attracting players with its understated lore, intricate level design, and relentless difficulty. Spiritual successor Dark Souls followed in 2011, and its direct sequel Dark Souls II was released earlier this year.

    Each game adheres to standard medieval fantasy tropes: there are spellcasters, armor-clad knights, parapet-trimmed castles, and a variety of fire-spewing dragons. You select one out of several archetypal character classes (e.g., Cleric, Sorcerer, Swordsman), customize a few appearance options, then explore and fight through a series of interconnected, yet typically non-linear, locations populated by creatures of escalating difficulty. What distinguishes these games from the hundreds of other fantasy games those initial conditions could describe are their melancholy tone and their general disregard for player hand-holding. Your hero begins as little more than a voiceless, fragile husk with minimal direction and fewer resources. Merely surviving takes precedence over rescuing princesses or looting dungeons. The Souls games similarly reveal little about their settings or systems, driving some players to declare them among the worst games ever made while catalyzing others to revisit the game’s environs for hundreds of hours. Vibrant communities have emerged around the Souls series, partly in an effort to document the mechanics From Software purposefully obscures and partly to construct a coherent logic and lore from the scraps and minutiae the game provides.

    Dark Souls II Settings

    Unlike most action games, every encounter in Dark Souls II is potentially deadly, from the lowliest grunts to the largest boss creatures. To further raise the stakes, death has consequences. Slaying foes grants souls, the titular items that fuel both trade and character progression. Spending souls increases your survivability, whether you invest them directly in your character stats (e.g. Vitality) or a more powerful shield. However, dying forfeits any souls you are currently carrying and resets your progress to the last bonfire (i.e., checkpoint) you rested beside. The catch is that dying or resting resets any creatures you have previously slain, giving your quest a moribund, Sisyphean repetition that grinds impatient players to a halt. And once slain, you have one chance to recover your lost souls. A glowing green aura marks the site of your previous bereavement. Touch that mark before you die again and you regain your cache; fail to do so and you lose it forever. You will often fail to do so.

    What many Souls reviewers find refreshing about the game’s difficulty is actually a more forgiving variation of the death mechanics found in early ASCII-based games like Rogue (1980), Hack (1985), and NetHack (1987), wherein ‘permadeath’—i.e., death meant starting the game anew—was a central conceit. And those games were almost direct ‘ports’ of tabletop roleplaying progenitors like Dungeons & Dragons, whose early versions were skewed more toward the gritty realism of pulp literature than the godlike power fantasies of modern roleplaying games. A successful career in D&D meant accumulating enough treasure to eventually retire from dungeon-delving, so one could hire other hapless retainers to loot on your behalf. Death was frequent and expected because dungeons were dangerous places. And unless one’s Dungeon Master was particularly lenient, death was final. A fatal mistake meant re-rolling your character. In this sense, the Souls games stand apart from their videogame peers because of the conservatism of their design. Though countless games ape D&D’s generic fantasy setting and stat-based progress model, few adopt the existential dread of its early forms.

    Dark Souls II’s adherence to opaque systems and traditional difficulty has alienated players unaccustomed to the demands of earlier gaming models. For those repeatedly stymied by the game’s frustrations, several questions arise: Why put forth the effort in a game that feels so antagonistic toward its players? Is there any reward worth the frequent, unforgiving failure? Aren’t games supposed to be fun—and is failing fun?

    YOU DIED

    Games scholar Jesper Juul raises similar questions in The Art of Failure, the second book in MIT Press’s new Playful Thinking series. His central thesis is that games present players with a ‘paradox of failure’: we do not like to fail, yet games perpetually make us do so; weirder still, we seek out games voluntarily, even though the only victory they offer is over a failure that they themselves create. Despite games’ reputation as frivolous fun, they can humiliate and infuriate us. Real emotions are at stake. And, as Juul argues, ‘the paradox of failure is unique in that when you fail in a game, it really means that you were in some way inadequate’ (7). So when my character plunges down the pit in Majula, the developers do not tell me ‘Your character died,’ even though I have named that character. Instead the games remind us, ‘YOU DIED.’ YOU, the player, the one holding the Xbox 360 controller.

    The strength of Juul’s argument is that he does not rely on a single discipline but instead approaches failure via four related ‘lenses’: philosophy, psychology, game design, and fiction (30). Each lens has its own brief chapter and accompanying game examples, and throughout Juul interjects anecdotes from his personal play experience alongside lessons he’s learned co-designing a number of experimental video games. The examples range widely, from big-budget games like Uncharted 2, Meteos, and Skate 2 to more obscure works like Flywrench, September 12, and Super Real Tennis.

    Juul’s first lens (chapter 2) links up his paradox of failure to a longstanding philosophical quandary known as the ‘paradox of painful art.’ Like video games, art can elicit painful emotions from viewers, whether in a tragic stage play or a disturbing novel, yet contrary to the notion that we seek to avoid pain, people regularly pursue such art—even enjoy it. Juul provides a summary of positions philosophers have offered to explain this behavior, categorized as follows: deflationary arguments skirt the paradox by claiming that art doesn’t actually cause us pain in the first place; compensatory arguments acknowledge the pain, but claim that the sum of painful vs. pleasant reactions to art yields a net positive; and a-hedonistic arguments deny that humans are solely pleasure-seekers—some of us pursue pain.

    Juul’s commonsense response is that we should not limit human motivation to narrow, atemporal explanations. Instead, a synthesis of categories is possible, because we can successfully manage multiple contradictory desires based on immediate and long-term (i.e., aesthetic) time frames. He writes, ‘Our moment-to-moment desire to avoid unpleasant experiences is at odds with a longer-term aesthetic desire in which we understand failure, tragedy, and general unpleasantness to be necessary for our experience’ (115). In Dark Souls II, I faced a particularly challenging section early on when my character, a sorcerer, was under-powered and under-equipped to face a strong, agile boss known as The Pursuer. I spent close to four hours running the same path to the boss, dying dozens of times, with no net progress.

    Facing the Pursuer

    For Juul, my persistence did not betray a masochistic personality flaw (not that I didn’t consider it), nor would he trivialize my frustration (which I certainly felt), nor would he argue that I was eking out more pleasure than pain during my repeated trials (I certainly wasn’t). Instead, I was tolerating immediate failure in pursuit of a distant aesthetic goal, one that would not arrive during that game session—or many sessions to come. And indeed, this is why Juul calls games the ‘art of failure,’ because ‘games hurt us and then induce an urgency to repair our self-image’ (45). I could only overcome the Pursuer if I learned to play better. Juul writes, ‘Failure is integral to the enjoyment of game playing in a way that it is not integral to the enjoyment of learning in general. Games are a perspective on failure and learning as enjoyment, or satisfaction’ (45). Failure is part of what makes a game a game.

    Chapter 3 proceeds to the psychological lens, allowing Juul to review the myriad ways we experience failure emotionally. For many games, the impact can be significant: ‘To play a game is to take an emotional gamble. The higher the stakes, in terms of time investment, public acknowledgement, and personal importance, the higher are the potential losses and rewards’ (57). Failure doesn’t feel good, but again, paradoxically, we must first accept responsibility for our failures in order to then learn from them. ‘Once we accept responsibility,’ Juul writes, ‘failure also concretely pushes us to search for new strategies and learning opportunities in a game’ (116). But why can’t we learn without the painful consequences? Because most of us need prodding to be the best players we can be. In the absence of failure, players will cheese and cheat their way to favorable outcomes (59).

    Juul concludes that games help us grow—‘we come away from any skill-based game changed, wiser, and possessing new skills’ (59)—but his more interesting point is how we buffer the emotional toll of failure by diverting or transforming it. ‘Self-defeating’ players react to failure by lessening their efforts, a laissez-faire attitude that makes failure expected and thus less painful. ‘Spectacular’ failures, on the other hand, elevate negativity to an aesthetic focal point. When I laugh at the quivering pile of polygons clipped halfway through the floor geometry by the Pursuer’s blade, I’m no longer lamenting my own failure but celebrating the game’s.

    Chapter 4 provides a broad view of how games are designed to make us fail and counters much conventional wisdom about prevailing design trends. For instance, many players complain that contemporary games are too easy, that we don’t fail enough, but Juul argues that those players are confusing failure with punishment. Failure is now designed to be more frequent than in the past, but punishment is far less severe. Death in early arcade or console games often meant total failure, resetting your progress to the beginning of the game. Death in Dark Souls II merely forfeits your souls in-hand—any spent souls, found items, gained levels, or cached equipment are permanent. Punishment certainly feels severe when you lose tens of thousands of souls, but the consequences are far less jarring than losing your final life in Ghosts ’n Goblins.

    Juul outlines three different paths through which games lead us to success or failure—skill, chance, and labor—but notes that his categories are neither exhaustive nor mutually exclusive (75, 82). The first category is likely the most familiar for frequent game players: ‘When we fail in a game of skill, we are therefore marked as deficient in a straightforward way: as lacking the skills required to play the game’ (74). When our skills fail us, we only have ourselves to blame. Chance, however, ‘marks us in a different way…as being on poor terms with the gods, or as simply unlucky, which is still a personal trait that we would rather not have’ (75). With chance in play, failure gains a cosmic significance.

    Labor is one of the newer design paths, characterized by the low-skill, slow-grind style of play frequently maligned in Farmville and its clones, but also found in better-regarded titles like World of Warcraft (and RPGs in general). In these games, failure has its lowest stakes: ‘Lack of success in a game of labor therefore does not mark us as lacking in skill or luck, but at worst as someone lazy (or too busy). For those who are afraid of failure, this is close to an ideal state. For those who think of games as personal struggles for improvement, games of labor are anathema’ (79). Juul’s last point is an important lesson for critics quick to dismiss the ‘click-to-win’ genre outright. For players averse to personal or cosmic failure, games of labor are a welcome respite.

    Juul’s final lens (chapter 5) examines fictional failure. ‘Most video games,’ he writes, ‘represent our failures and successes by letting our performance be mirrored by a protagonist (or society, etc.) in the game’s fictional world. When we are unhappy to have failed, a fictional character is also unhappy’ (117). Beginning with this conventional case, Juul then discusses games that subvert or challenge the presumed alignment of player/character interests, asking whether games can be tragic or present situations where character failure might be the desired outcome. While Juul concedes that ‘the self-destruction of the protagonist remains awkward,’ complicity—a sense of player regret when facing a character’s repugnant actions—offers a ‘better variation’ of game tragedy (117). Juul argues that complicity is unique to games, an experience that is ‘more personal and stronger than simply witnessing a fictional character performing the same actions’ (113). When I nudge my character into Majula’s pit, I’m no longer a witness—I’m a participant.

    The Art of Failure’s final chapter focuses the prior lenses’ viewpoints on failure into a humanistic concluding point: ‘Failure forces us to reconsider what we are doing, to learn. Failure connects us personally to the events in the game; it proves that we matter, that the world does not simply continue regardless of our actions’ (122). For those who already accept games as a meaningful, expressive medium, Juul’s conclusion may be unsurprising. But this kind of thoughtful optimism is also part of the book’s strength. Juul’s writing is approachable and jargon-free, and the Playful Thinking series’ focus on depth, readability, and pocket-size volumes makes The Art of Failure an ideal book to pass along to friends and colleagues who might question your ‘frivolous’ videogame hobby—or, more importantly, to justify why you often spend hours swearing at the screen while purportedly in pursuit of ‘fun.’

    The final chapter also offers a tantalizingly brief analysis of how Juul’s lenses might refract outward, beyond games, to culture at large. Specifically targeting the now-widespread corporate practice of gamification, wherein game design principles are applied as motivators and performance measures for non-leisure activities (usually work), Juul reminds us that the technique often fails because workplace performance goals ‘rarely measure what they are supposed to measure’ (120). Games are ideal for performance measurement because of their peculiar teleology: ‘The value system that the goal of a game creates is not an artificial measure of the value of the player’s performance; the goal is what creates the value in the first place by assigning values to the possible outcomes of a game’ (121). This kind of pushback against digital idealism is an important reminder that games ‘are not a pixie dust of motivation to be sprinkled on any subject’ (10), and Juul leaves a lot of room for further development of his thesis beyond the narrow scope of videogames.

    For the converted, The Art of Failure provides cross-disciplinary insights into many of our unexamined play habits. While playing Dark Souls II, I frequently thought of Juul’s triumvirate of design paths. Dark Souls II is an exemplary hybrid—though much of your success is skill-based, chance and labor play significant roles. The algorithmic systems that govern item drops or boss attacks can often sway one’s fortunes toward success or failure, as many speedrunners would attest. And for all the ink spilled about Dark Souls II being a ‘hardcore’ game with ‘old-school’ challenge, success can also be won through skill-less labor. Summoning high-level allies to clear difficult paths or simply investing hours grinding souls to level your character are both viable substitutes for chance and skill.

    But what of games that do not fit these paths? How do they contend with failure? There is a rich tradition of experimental or independent artgames, notgames, game poems, and the like that are designed with no path to failure. Standout examples like Proteus, Dys4ia, and Your Lover Has Turned Into a Flock of Birds require no skills beyond operating a keyboard or mouse, do not rely on chance, and require little time investment. Unsurprisingly, games like these are often targeted as ‘non-games,’ and Juul’s analysis leaves little room for games that skirt these borderlines. There is a subtext in The Art of Failure that draws distinctions between ‘good’ and ‘bad’ design. Early on, Juul writes that ‘(good) games are designed such that they give us a fair chance’ (7) and ‘for something to be a good game, and a game at all, we expect resistance and the possibility of failure’ (12).

    There are essentialist, formalist assumptions guiding Juul’s thesis, leading him to privilege games’ ‘unique’ qualities at the risk of further marginalizing genres, creators, and hybrid play practices that already operate at the margins. To argue that complicity is unique to games or that games are the art of failure is to make an unwarranted leap into medium specificity and draw borderlines that need not be drawn. Certainly other media can draw us into complicity, a path well-trodden in cinema’s exploration of voyeurism (Rear Window, Blow-Up) and extreme horror (Saw, Hostel). Can’t games simply be particularly strong at complicity, rather than its sole purveyor?

    I’m similarly unconvinced that games are the quintessential art of failure. Critics often contend that video games are unique as a medium in that they require a certain skill threshold to complete. While it is true that finishing Super Mario Bros. is different from watching the entirety of The Godfather, we can use Juul’s own multi-path model to understand how we might fail at other media. The latter example certainly requires more labor—one can play dozens of Super Mario runs during The Godfather’s 175-minute runtime. Further, watching a film lauded as one of history’s greatest carries unique expectations that many viewers may fail to satisfy, from the societal pressure to agree on its quality to the comprehension necessary to follow its narrative. Different failures arise from different media—I’ve failed reading Infinite Jest more than I’ve failed completing Dark Souls II. And any visit to a museum will teach you that many people feel as though they fail at modern art. Tackling Dark Souls II’s Pursuer or Barnett Newman’s Onement I can be equally daunting.

    When scholars ask, as Juul does, what games can do, they must be careful that by doing so they do not also police what games can be. Failure is a compelling lens through which to examine our relationship to play, but we needn’t valorize it as the only criterion by which something counts as a game.
    _____


    Nathan Altice is an instructor of sound and game design at Virginia Commonwealth University and author of the platform study of the NES/Famicom, I AM ERROR (MIT, 2015). He writes at metopal.com and burns bridges at @circuitlions.

  • Adventures in Reading the American Novel

    Adventures in Reading the American Novel

    image

    by Sean J. Kelly

    on Reading the American Novel 1780-1865 by Shirley Samuels

    Shirley Samuels’s Reading the American Novel 1780-1865 (2012) is an installment of the Reading the Novel series edited by Daniel R. Schwarz, a series dedicated to “provid[ing] practical introductions to reading the novel in both the British and Irish, and the American traditions.” While the volume does offer a “practical introduction” to the American novel of the antebellum era—its major themes, cultural contexts, and modes of production—its primary focus is the expansion of the American literary canon, particularly with regard to nineteenth-century women writers. In this respect, Samuels’s book continues a strong tradition of feminist cultural and historicist criticism pioneered by such landmark studies as Jane Tompkins’s Sensational Designs: The Cultural Work of American Fiction 1790-1860 (1985) and Cathy N. Davidson’s Revolution and the Word: The Rise of the Novel in America (1986). Tompkins’s explicit goal was to challenge the view of American literary history codified by F.O. Matthiessen’s monumental work, American Renaissance: Art and Expression in the Age of Emerson and Whitman (1941). In particular, Tompkins was concerned with reevaluating what she wryly termed the “other American Renaissance,” namely the “entire body of work” 1 of popular female sentimental writers such as Harriet Beecher Stowe, Maria Cummins, and Susan Warner, whose narratives “offer powerful examples of the way a culture thinks about itself.” 2

    Recent decades have witnessed a growing scholarly interest in not only expanding the literary canon through the rediscovery of “lost” works by women writers such as Tabitha Gilman Tenney3 and P.D. Manvill4, to name a few, but also reassessing how the study of nineteenth-century sentimentalism and material culture might complicate, extend, and enrich our present understandings of the works of such canonical figures as Cooper, Hawthorne, and Melville. In this critical vein, Samuels asks, “what happens when a student starts to read Nathaniel Hawthorne’s The Scarlet Letter (1850), not simply in relation to its Puritan setting but also in relation to the novels that surround it?” (160). Reading the American Novel engages in both of these critical enterprises—rediscovery and reassessment of nineteenth-century American literature—by promoting what she describes as “not a sequential, but a layered reading” (153). In her “Afterword,” Samuels explains:

    Such a reading produces a form of pleasure layered into alternatives and identities where metaphors of confinement or escape are often the most significant. What produces the emergence of spatial or visual relations often lies within the historical attention to geography, architecture, or music as elements in this fiction that might re-orient the reader. With such knowledge, the reader can ask the fiction to perform different functions. What happens here? The spatial imagining of towns and landscapes corresponds to the minute landscape of particular bodies in time. Through close attention to the movements of these bodies, the critic discovers not only new literatures, but also new histories (153).

    It is this “richly textured” (2) type of reading—a set of hermeneutic techniques to be deployed tactically across textual surfaces (including primary texts, marginalia, geographical locations, and “particular bodies in time” [153])—that leads, eventually, to Samuels’s, and the reader’s, greatest discoveries. The reader may find Samuels’s approach to be a bit disorienting initially. This is because Reading the American Novel does not trace the evolution of a central concept in the way that Elizabeth Barnes, in States of Sympathy: Seduction and Democracy in the American Novel (1997), follows the development of seduction from the late eighteenth century to the domestic fiction of the 1860s. Rather, Samuels introduces a constellation of loosely-related motifs or what she later calls “possibilities for reading” (152)—“reading by waterways, by configurations of home, by blood and contract” (152)—that will provide the anchoring points for the set of disparate and innovative readings that follow.

    Samuels’s introductory chapter, “Introduction to the American Novel: From Charles Brockden Brown’s Gothic Novels to Caroline Kirkland’s Wilderness,” considers the development of the novel from the standpoint of cultural production and consumption, arguing that a nineteenth-century audience would have “assumed that the novel must act in the world” (4). In addition, Samuels briefly introduces the various motifs, themes, and sites of conflict (e.g. “Violence and the Novel,” “Nationalism,” “Landscapes and Houses,” “Crossing Borders,” “Water”) that will provide the conceptual frameworks for her layers of reading in the subsequent chapters. If her categories at first appear arbitrary, this is because, as Samuels points out, “the novel in the United States does not follow set patterns” (20). The complex conceptual topography introduced in Chapter 1 reflects the need for what she calls a “fractal critical attention, the ability to follow patterns that fold ideas into one another while admiring designs that appear to arise organically, as if without volition” (20).

    The second chapter of the book, “Historical Codes in Literary Analysis: The Writing Projects of Nathaniel Hawthorne, Elizabeth Stoddard, and Hannah Crafts,” examines the value of archival research by considering the ways in which “historical codes . . . includ[ing] abstractions such as iconography as well as the minutiae derived from historical research . . . are there to be interpreted and deciphered as much as to be deployed” (28). Samuels’s reading of Hawthorne, for example, links the fragmentary status of the author’s late work, The Dolliver Romance (1863-1864), to the more general “ideological fragmentation” (28) apparent in Hawthorne’s emotional exchange of letters with his editor, James T. Fields, concerning the representation of President Lincoln and his “increasing material difficulty of holding a pen” (25).

    Samuels’s third chapter, “Women, Blood, and Contract: Land Claims in Lydia Maria Child, Catharine Sedgwick, and James Fenimore Cooper,” explores the prevalence of “contracts involving women and blood” (45) in three early nineteenth-century historical romances, Child’s Hobomok (1824), Cooper’s The Last of the Mohicans (1826), and Sedgwick’s Hope Leslie (1827). In these works, Samuels argues, the struggle over national citizenship and westward expansion is dramatized against the “powerfully absent immediate context” (45) of racial politics. She maintains that in such dramas “the gift of women’s blood” (62)—often represented in the guise of romantic desire and sacrifice— “both obscures and exposes the contract of land” (62).

    Chapter four, “Black Rivers, Red Letters, and White Whales: Mobility and Desire in Catharine Williams, Nathaniel Hawthorne, and Herman Melville,” extends Samuels’s meditation on the figure of women’s bodies in relation to “the promise or threat of reproduction” (68) in the narrative of national identity; however, in her readings of Williams’ Fall River (1834), Hawthorne’s The Scarlet Letter (1850), and Melville’s Moby Dick (1851), the focus shifts from issues of land and contracts to the representation of water as symbolic of “national dispossession” (68) and “anxieties about birth” (68).

    Samuels’s fifth chapter, “Promoting the Nation in James Fenimore Cooper and Harriet Beecher Stowe,” returns to the question of the historical romance, critically examining how Cooper’s 1841 novel, The Deerslayer, might be read as evidence of “ambivalent nationalism” (102), as it links “early American nationalism and capitalism to violence against women and children” (109). Samuels then considers the possibility of applying such ambivalence to Stowe’s abolitionist vision for the future of America limned in Uncle Tom’s Cabin (1852), a vision founded, in part, on Stowe’s conceptual remapping of the Puritan jeremiad onto the abolitionist discourse of divine retribution and national apocalypse (111-112). Because Stowe “set out to produce a history of the United States that would have become obsolete in the moment of its telling” (111), Samuels argues that we witness a break in the development of historical fiction caused by the Civil War, a “gap” during which “the purpose of nationalism with respect to the historical novel changes” (113).

    Chapter six, “Women’s Worlds in the Nineteenth-Century Novel: Susan B. Warner, Elizabeth Stuart Phelps, Fanny Fern, E.D.E.N. Southworth, Harriet Wilson, and Louisa May Alcott,” and the book’s Afterword—in my opinion, the strongest sections of the book—survey a wide variety of nineteenth-century American women writers, including Warner, Fern, Southworth, Wilson, Alcott, Caroline Kirkland, and Julia Ward Howe, among others. These discussions explore the ways in which writing functions as a type of labor which “gives the woman a face with which to face the world” (145). Samuels seeks to challenge the over-simplification of “separate spheres” ideology (153) by offering careful critical attention to the ways in which the labor of writing shapes identities in a multiplicity of distinct cultural locations. Hence, Samuels writes: “It is difficult to summarize motifs that appear in women’s writing in the nineteenth century. To speak of women’s worlds in the novel raises the matter of: what women?” (143).

    Admittedly, there are moments when Samuels’s layered readings necessitate extended swaths of summary; the works that become the primary focus of Samuels’s analyses, such as Catharine Williams’ Fall River and the novels of Elizabeth Stuart Phelps and E.D.E.N. Southworth, may be unfamiliar to many readers. In other instances, the very intricacy, novelty, and ambitiousness of Samuels’s reading performances begin to challenge the reader’s desire for linear consistency. Her interpretive strategies, which prioritize reading at the margins, the textual rendering of historical codes, and provocative juxtapositions, produce, at times, a kind of tunneling effect. The reader is swept breathlessly along, relieved when the author pauses to say: “But to return to my opening question” (82). Ultimately, however, Samuels’s critical approaches throughout this book pose an important challenge to our conventional ways of assigning value and significance to nineteenth-century popular fiction. By reading canonical works such as Moby Dick and The Scarlet Letter with and against the popular crime novel Fall River, for example, she is able to map similarities between all three works in order to create “a more complete fiction” (83). All of these novels, she writes, “lure New Englanders to die. To read them together is to recover the bodies of laboring women and men from watery depths” (83). This type of creative reading, to invoke Ralph Waldo Emerson’s phrase, allows us potentially to tease out significant conflicts and tensions in well-known works that might have otherwise remained invisible in a conventional reading. “What happens,” she asks, “when we remember that Captain Ahab is a father?” (83).
Because Samuels offers not only insightful interpretations of nineteenth-century American novels but also introduces new and creative ways to read—and ways to think about the meaning of reading as a critical practice—Reading the American Novel must be viewed as a valuable addition to American literary scholarship.

    _____

    Sean J. Kelly is Associate Professor of English at Wilkes University. His articles on nineteenth-century American literature and culture have recently appeared in PLL, The Edgar Allan Poe Review, and Short Story.

    _____

    notes:
    1. Tompkins, Jane. Sensational Designs: The Cultural Work of American Fiction 1790-1860. New York: Oxford UP, 1985. 147.

    2. Ibid. xi.

    3. Tenney, Tabitha Gilman. Female Quixotism: Exhibited in the Romantic Opinions and Extravagant Adventures of Dorcasina Sheldon. 1801. Intro. Cathy N. Davidson. New York: Oxford UP, 1992.

    4. Manvill, P.D. Lucinda; Or, the Mountain Mourner: Being Recent Facts, in a Series of Letters, from Mrs. Manvill, in the State of New York, to Her Sister in Pennsylvania. 1807. Intro. Mischelle B. Anthony. Syracuse: Syracuse UP, 2009.

  • Transgender Studies Today: An Interview with Susan Stryker

    Transgender Studies Today: An Interview with Susan Stryker

    _____________________________________________________________________________________

    Petra Dierkes-Thrun interviews Susan Stryker, leader of an unprecedented initiative in transgender studies at the University of Arizona, and one of two founding co-editors of the new journal TSQ: Transgender Studies Quarterly (together with Paisley Currah). Stryker is Associate Professor of Gender and Women’s Studies, and Director of the Institute for LGBT Studies at the University of Arizona. The author or editor of numerous books and articles on transgender and queer topics for popular and scholarly audiences alike, she won an Emmy Award for the documentary film Screaming Queens: The Riot at Compton’s Cafeteria, a Lambda Literary Award for The Transgender Studies Reader, and the Ruth Benedict Book Prize for The Transgender Studies Reader 2.
    _____________________________________________________________________________________

    Transgender Studies initiative at the University of Arizona. Left to Right (Front): Paisley Currah, Susan Stryker, Monica Casper, Francisco Galarte; (Back): Eric Plemons, Max Strassfeld, Eva Hayward. Not pictured: TC Tolbert. Photo by Paisley Currah.


    DIERKES-THRUN:  The University of Arizona recently initiated an unprecedented cluster hire in transgender studies and is actively working towards a graduate degree program in transgender studies. Can you tell us a bit more about the history and the thinking behind this strong, coordinated move at your institution?

    STRYKER: After the University of Arizona (UA) recruited me away from my previous job to direct the Institute for LGBT Studies in 2011, I came in saying that I wanted to put equal emphasis on the “T” in that acronym, and they were supportive of that. But none of us anticipated that the T was going to become the tail that wagged the dog, so to speak. It would not have happened had I not been courted by another, much more prestigious university during my second year on the job. UA asked what it would take to retain me, and I said I wanted to do something unprecedented, something I would not be able to do at that other university, something that would transform my field, while also putting UA on the map in a bold new way. I said I wanted to launch a transgender studies initiative, which represents my vision of the field’s need to grow. The institution said yes to what I proposed, and to the upper administration’s credit, they saw an opportunity in what I pitched.

    The truly unprecedented institutional commitment came in the form of strategic hiring support for a transgender studies faculty cluster. As UA has been quick to point out to conservative critics of this initiative, no new funds were identified to create these faculty lines—they came from existing pools of discretionary funds, and represent a shift of faculty lines freed up by retirement or resignation towards emerging areas of study. That said, no university anywhere in the world has ever conducted a faculty cluster hire in transgender studies. Four lines were made available: two in the College of Social and Behavioral Sciences, and two in colleges elsewhere in the University. We wound up filling three of those positions last year—hiring in medical anthropology, feminist science and technology studies, and religious studies—and are in negotiations about where to place the remaining line.

    UA has a strong institutional culture of interdisciplinary collaboration, as well as a good track record of supporting LGBT issues, so this fit right in. They understand that transgender issues have a lot of cultural saliency at the moment, and that studying the rapid shifts in contemporary gender systems, including the emergence of historically new forms of gender expression, particularly in the context of the biomedical technologization of “life itself,” is a legitimate field of study and research. Pragmatically, they saw the initiative as a way to attract and retain innovative and diverse faculty members, to bring in out-of-state tuition dollars, to compete for external research grants, and to push back against the popular misconception that Arizona is only a politically reactionary place. From the institution’s perspective, there was no advocacy agenda at work here, just an opportunity to increase the bottom line by building on existing faculty and research strengths.

    The lowest-hanging fruit, which can be accomplished with relatively little bureaucracy, is a graduate concentration, minor, or designated emphasis in transgender studies, and there is definitely support for that. We hope to have that in place within a year. It is also possible that a currently existing MA program in Gender and Women’s Studies could be adapted relatively easily to accommodate a transgender studies emphasis, but that involves a lot of inside-the-ballpark negotiation with current GWS faculty. Actually creating a new, stand-alone graduate program at the state’s land grant university would require approval by the Arizona Board of Regents, and ultimately by the Governor’s Office, so that will be a longer and tougher row to hoe.

    The final element of the initiative is approval to pursue establishing a new research enterprise called the “Center for Critical Studies of the Body.” The rationale here was to provide a non-identitarian rubric that could bring transgender studies into dialog with other interdisciplinary fields, such as the study of disability, trauma, sports, medical humanities, etc. No funds were provided for this, just a green light for starting the process of cobbling a center together.

    Of course, it’s vital to ask the question why, in an era when the teaching of Chicano/a studies is literally being outlawed in Arizona public schools, when xenophobic attitudes inform the state’s border politics, attention to transgender identities and practices can appear palatable. How does institutional investment in transgender studies at this particular historical juncture play into a deep logic of “managing difference” through expert knowledges, or get positioned as less threatening than calls for racial and economic justice? As the person heading up this initiative, I want to be attentive to ways I can use trans studies to advance other concerns that currently have a harder time getting traction in Arizona. I think my deepest challenge in trying to spearhead this initiative lies in resisting the ways that transgender studies can be co-opted for neoliberal uses that fall short of its radical transformative potential.

    DIERKES-THRUN: The University of Arizona also provided financial and logistical support for the establishment of a new journal of record for the field of transgender studies, TSQ: Transgender Studies Quarterly, published by Duke University Press in 2014, with you and Paisley Currah (Professor of Political Science at Brooklyn College and the CUNY Graduate Center) as founding co-editors. How did that come about?

STRYKER: Launching this journal had been a long-term project of mine and Paisley’s and was already well underway before the opportunity to launch the broader transgender studies initiative came up, but it nevertheless constitutes an important element of what has become the bigger project. UA has significantly supported the establishment of TSQ by contributing about one-third of the start-up costs. Those funds were cobbled together from a lot of different institutional sources, including the Provost’s Office, the office of the Vice President for Research, the College of Social and Behavioral Sciences, the Department of Gender and Women’s Studies, and the Institute for LGBT Studies.

    DIERKES-THRUN: For our readers who are just now becoming acquainted with transgender studies as a diverse intellectual and academic field, how would you summarize its most important constants and changes over the past two decades? What are some important subareas and affiliated fields for transgender studies?

    STRYKER: I’d recommend taking a look at the tables of contents in the two volumes of The Transgender Studies Reader. The first volume, from 2006, offers a genealogy of field formation, highlighting historical ties to scientific sexology, feminism, and poststructuralist theory.

    It includes work from the “transgender moment” of the early 1990s that changed the conversation on trans issues and tackles many of the topics that were of interest in the field’s first decade—questions of self-representation, diversity within trans communities, the increasing visibility of trans-masculinities. The second volume, from 2013, showcases the rapid evolution of the field in the 21st century, which is self-consciously moving in strongly transnational directions away from the Anglophone North American biases of the field’s first decade. There has been much more attention paid to the relationship between transgender issues and other structural forms of inequality and injustice, and, post 9/11, to questions about borders, surveillance, and security—and the ways that non-conventionally gendered bodies experience heightened scrutiny and limitations on movement, and can be seen as posing a terroristic threat to the body politic. There are increasing affinities with posthumanist work, as well as with animal studies, critical life studies, and the so-called “new materialism.” The first several issues of TSQ suggest something of current directions in the field: they address decolonization, cultural production, population studies, transanimalities, higher education studies, archives, transfeminism, political economy, sex classification, translation, surgery, sinophone studies, and psychoanalytic theory.

    DIERKES-THRUN: Can you say something about the trans- and international context of transgender studies today? What are the most important challenges there and why should we be thinking about them?

STRYKER: The field has indeed been moving in a strongly transnational direction for more than a decade. I was particularly pleased that The Transgender Studies Reader 2 was awarded the 2013 Ruth Benedict Prize from the Association for Queer Anthropology/American Anthropological Association, precisely because the field of transgender studies challenges us to think anew about how we understand sex/gender/identity cross-culturally. I think one of the biggest intellectual challenges has to do with fully acknowledging that some of the fundamental categories that we use to understand “human being”—like man and woman—are not ontologically given, but rather are themselves historically and culturally variable and contingent. Translation is also a huge problem—how do we facilitate the exchange of knowledge across language and culture, when the very categories we use to organize and recognize our own being and that of others can be so deeply incommensurable?

    DIERKES-THRUN: In the introduction to the inaugural issue of TSQ, the editors write, “Transgender studies promises to make a significant intellectual and political intervention into contemporary knowledge production in much the same manner that queer theory did twenty years ago.” What are some of the most needed intellectual and political interventions that you anticipate transgender studies can and will make?

STRYKER: First and foremost, I see it creating more space for critical conversations that involve transgender speakers. Bringing trans studies into the academy is one way of bringing more trans people into the academy. Of course I’m not arguing that trans studies is something that only trans people can participate in. Far from it—anybody can develop an expertise in this area, or feel that they have some sort of stake in it. But just as disability activists said in the ’70s and ’80s, “nothing about us without us.” What’s most significant is creating an opportunity for the privileged and powerful kinds of knowledge production that take place in the academy (about trans topics or any other area that involves people) to be not just objectifying knowledge, what we might call “knowledge of,” but also “knowledge with,” knowledge that emerges from a dialog that includes trans people who bring an additional kind of experiential or embodied knowledge along with their formal, expert knowledges. It’s the same rationale for any kind of diversity hiring initiative. People have different kinds of “situated knowledges” that derive from how they live their bodily differences in the world. It’s important to have people in critical conversations who come from different perspectives based on race/ethnicity, gender, ability, national origin, first languages, etc. Transgender represents a different kind of difference that offers a novel perspective on how gender systems, and therefore society, work.

    DIERKES-THRUN: You also say, in the same TSQ introduction, that transgender studies “offers fertile ground for conversations about what the posthuman might practically entail (as well as what, historically, it has already been).” The posthuman is a topic of interest to many of our readers. Could you map out for us what specific or broader contributions transgender studies can make to past and future discussions of the posthuman?

STRYKER: The first thing we say of a new child is “It’s a girl” or “It’s a boy.” Through the operation of language, we move a body across the line that separates mere biological organism from human community, transforming the status of a nonhuman “it” into a person through the conferral of a gender status. It has been very difficult to think of the human without thinking of it through the binary gender schema. I think a lot of the violence and discrimination trans people face derives from a fundamental inability on the part of others to see us as fully human because we are considered improperly gendered, and thus lower on the animacy hierarchy, therefore closer to death and inanimacy, therefore more expendable and less valuable than other humans. A transgender will to life thus serves as a point from which to critique the human as a universal status attributed to all members of the species, and to reveal it instead as a narrower set of criteria wielded by some to dehumanize others.

    DIERKES-THRUN: The journal description announces that TSQ “will publish interdisciplinary work that explores the diversity of gender, sex, sexuality, embodiment, and identity in ways that have not been adequately addressed by feminist and queer scholarship.” What have been some of feminist and queer theory’s most important blind spots when it comes to thinking about the transgender experience?

STRYKER: Transgender Studies emerged as an interdisciplinary field in the early 1990s, at roughly the same time as queer theory. There’s been a robust conversation about the relationship between the two, especially given the simultaneous formation of what’s come to be called the “LGBT” community. I contend that trans studies, as it was first articulated, shared an agenda with queer studies in the sense that it critiqued heteronormative society from a place of oppositional difference. It argued that “queer” was not just a five-letter word for homosexual, but rather that queer encompassed a range of “different differences” that all had a stake in contesting various sorts of oppressive and coercive normativities related to sex, sexuality, identity, and embodiment. As queer theory developed, however, issues of sexuality really did remain in the forefront. From a transgender studies perspective, the whole distinction between homo and hetero sexualities depends on a prior agreement about what constitutes “sex,” on who’s a man and who’s a woman. Destabilizing those material referents, or needing to account for their sequentiality, their fuzzy boundaries, their historicity or cultural specificity, or their hybridity really opens up a whole different set of questions. In addition, trans studies is not organized primarily around issues of sexuality; equally important are questions of gender, bodily difference, health care provision, technology studies, and a host of other things that have not been central to queer studies. So the debate between queer and trans studies has been about whether they are different parts of the same big intellectual and critical project, employing the same transversal methodologies for bringing into analytical focus and contesting oppressive normativities, or whether they overlap with one another—sharing some interests but not others—or whether they are really two different enterprises, concerned with different objects of study.

    My personal answer is all of the above, sometimes. At its most radical, trans studies offers a critique of the ways in which gay and lesbian liberation and civil rights struggles have advanced themselves by securing greater access to citizenship for homosexuals precisely through the reproduction of gender normativities—the liberal “I’m just like a straight person except for who I have sex with” argument. What actually provides the commonality there between homo and hetero is an agreement about who is a man and who is a woman, and how we can tell the difference between the two. Trans studies puts pressure on that tacit agreement.

    With regard to feminism, I think the major innovation transgender studies offers has to do with how gender hierarchies operate. In the most conventional feminist frameworks, what has seemed most important is to better understand and thereby better resist the subordination of women to men. Without contesting that basic tenet, transgender studies suggests that it is also necessary to understand how contesting the hierarchized gender binary itself can increase vulnerabilities to structural oppression for those people who don’t fit in, or who refuse to be fixed in place. That is, in addition to needing to address power structures that privilege normatively gendered men and masculinity over normatively gendered women and femininity, we also need to address a wide range of gender nonnormativities, atypicalities, transitivities, and fluidities. I see this as extending, rather than challenging, fundamental feminist insights.

    DIERKES-THRUN: Many of our readers may not know this, but traditionally, the relationship between queer theory and transgender studies and activism has been quite contentious. Is the fact that there is now a separate academic journal for trans studies indicative of an ongoing divide with queer studies, despite what you call the recent “transgender turn”?

STRYKER: There’s a big enough and deep enough conversation on trans topics to merit and sustain an independent journal for the field, that’s all. There is more publishable scholarship on trans issues and topics than will ever fit into GLQ, given that journal’s broader scope, or that can ever fit into one-off special issues of disciplinary or interdisciplinary journals devoted to trans topics. Worrying that the advent of TSQ signals a divergence or parting of the ways between queer and trans studies is an overblown concern. Personally, I’d hate to see queer and trans studies drift further apart, because I feel strongly committed to both. I think trans studies is expansive enough to encompass a lot of queer scholarship on sex/gender nonnormativity, while also advancing scholarship on transgender-related topics that queer studies has never been particularly interested in.

    DIERKES-THRUN: As someone who has worked as a historian, social activist for trans rights and documentary filmmaker on trans history, how would you describe the state of our society’s understanding and attitudes towards transgender today? Does it feel like the tide has finally shifted?

STRYKER: I think it is a mixed bag. Pretty much everybody today knows that there is this thing called “transgender,” but they can’t say exactly what it is. They know if they want to be considered progressive they are supposed to be OK with it, even if they secretly feel squeamish or judgmental or confused. That’s an improvement over the situation in decades past, when pretty much everybody agreed that there were these sick people and freaks and weirdoes who wanted to cross-dress or take hormones or cut up their genitals, but they were not important, and society really didn’t have to pay any attention to such a marginal and stigmatized phenomenon. So yes, there has been a shift, but yes, there is still a long way to go.

    DIERKES-THRUN: Which projects are you working on now?

    STRYKER: I have a really heavy administrative load right now. I was already trying to run a research institute, teach, commute between my job in Tucson and my home in San Francisco, and launch a new peer-reviewed journal, before the trans studies initiative became a possibility. That has definitely been a “be careful what you ask for” lesson, in terms of workload. I feel like I don’t write anything these days that doesn’t start with the words “Executive Summary” and end with the words “Total Budget.” It will probably be like that for a couple more years, especially until I complete my agreed-upon term of service as director of the Institute for LGBT Studies at the end of 2016.

    But there are a couple of projects percolating along on the back burner. At the time I came to Arizona, I was working on an experimental media project called Christine in the Cutting Room, about the 1950s transsexual celebrity Christine Jorgensen, who burst onto the global stage when news of her sex-change surgery made headlines around the world. The project was sparked for me by a comment Jorgensen made in an interview with television journalist Mike Wallace. She was talking about her pre-fame job as a film cutter in the newsreel division at RKO Studios in New York, and said that she “used to work on one side of the camera” because she “didn’t know how to appear on the other side.” That gave me the idea of approaching the question of transsexuality from an aesthetic perspective, as a technique of visualization, accomplished through media manipulation. I saw Jorgensen using cinematic techniques of media cutting, suturing, image creation, and projection to move her from one side of the camera to the other, by moving herself from one kind of “cutting room” to another. I have always been interested in ways of exploring trans experience outside the pervasive psychomedical framework, and this project lets me do that. I mix archival audiovisual media of Jorgensen herself, found sound and images, electronic glitch music, and a scripted voice-over narration performed by an actress playing Jorgensen. At some point I hope to edit this material into a narrative film, but I have found it also works well as a multimedia installation in galleries and clubs.

    I am also trying to write a book. I’ve finally hit on a way to piece together into one overarching argument lots of fragments of abandoned or incomplete projects on embodiment and technology, the early Mormons, members of San Francisco’s elite Bohemian Club, transsexuals, urban history, and popular music. My working title is Identity is a War Machine: The Somatechnics of Gender, Race, and Whiteness. It’s about the processes through which we incorporate—literally somaticize—culturally specific and historically revisable categories of individual identity within biopolitical regimes of governmentality. I won’t say any more about it at this time, because this book itself could be one of my many unfinished projects.

    DIERKES-THRUN: Transgender as a topic of public curiosity seems to be everywhere in U.S. media culture these days, from Laverne Cox and Orange Is the New Black to Chelsea Manning, Andreja Pejic and others. (There is also a lot of naïve conflation with drag and cross-dressing, as the media treatment of Conchita Wurst illustrates.) Do you worry about the glamorization and commodification of certain kinds of trans bodies in the media and the silence around others? Are famous celebrity spokespeople like Laverne Cox or Janet Mock good or bad for the movement, from your perspective?

STRYKER: In the wake of the repeal of the U.S. military’s Don’t-Ask-Don’t-Tell policy regarding homosexual service members, and after the Supreme Court decisions on marriage equality, transgender has emerged in some quarters as the “next big thing” in minority rights. I have a lot of problems with that way of framing things, and am very leery of the ways that story functions as a neoliberal progress narrative, and of the ways in which protecting trans people (now that gays have been taken care of) can exemplify the values of inclusivity and diversity, so that the US or the West can use support for trans rights to assert influence over other parts of the world that purportedly do not do as good a job on this front. What is truly amazing to me, after having been out as trans for nearly a quarter century, is the extent to which it is now becoming possible for some trans people to access what I call “transnormative citizenship,” while at the same time truly horrific life circumstances persist for other trans people. Race really does seem to be the dividing line that allows some trans people to be cultivated for life, invested in, recognized, and enfolded into the biopolitical state, while allowing others to be consigned to malignant neglect or lethal violence. The contemporary celebrity culture of transgender plays to both sides of this dichotomy. It’s increasingly possible to see trans people represented as successful, beautiful, productive, or innovative (and I salute those trans people who have accomplished those things). At the same time, you see people like Laverne Cox and Janet Mock using their platform to call attention to the persistence of injustices, particularly for trans women of color. I am truly inspired by the way they both speak out on race, classism, the prison-industrial complex, and sex work.

  • June Fourth at 25: Forget Tiananmen, You Don’t Want to Hurt the Chinese People’s Feelings – and Miss Out on the Business of the New “New China”!

    June Fourth at 25: Forget Tiananmen, You Don’t Want to Hurt the Chinese People’s Feelings – and Miss Out on the Business of the New “New China”!

    by Arif Dirlik

    ~
Twenty-five years ago, in the early hours of June 4, the people’s government in Beijing turned its guns on the people of the city who had risen in protests that spring to express their frustration with Party despotism and corruption. The refusal to this day to acknowledge the crime is matched by continued criminalization of those who still live under the shadow of Tiananmen, and with courage continue to pursue the goals it had put on the political agenda – some from within the country, others from exile. The Tiananmen democracy movement brought to a head the contradictions of “reform and opening” that had acquired increasing sharpness during the decade of the 1980s. The successful turn to global capitalism in the aftermath of the suppression has been at least as important as the censorship of memories in the “forgetting” of Tiananmen among the PRC population. In historical perspective, Tiananmen appears as one of a series of popular uprisings around the globe that have accompanied the globalization of neo-liberal capitalism. The discussion throughout stresses foreign complicity – including that of foreign China scholars and educational institutions – in covering up this open sore on so-called “socialism with Chinese characteristics”.

    Read the full article here.
    in International Journal of China Studies
Vol. 5, No. 2, June/August 2014, pp. 295-329

  • Michael Hays as Interim Book Review Editor

Hays for Review Editor

Thanks to Michael Hays for assuming the post of Interim Book Review Editor of boundary 2 effective today. Michael has been a long-serving member of the b2 collective. He is an independent scholar, recently retired as Dean of Soka University in California and Professor at Cornell. He has also taught at NYU and Columbia. You can reach Michael at boundary2@gmail.com.

  • Summer 2014: Volume 41, Number 2

    Summer 2014: Volume 41, Number 2

    In Memoriam of Stuart McPhail Hall

    Each crisis provides an opportunity to shift the direction of popular thinking instead of simply mirroring the right’s populist touch or pursuing short-term opportunism. The left…must adopt a more courageous, innovative, “educative” and path-breaking strategic approach if they are to gain ground.
    –Stuart Hall and Alan O’Shea, “Common-sense Neoliberalism”


    Intervention / Mandela’s Reflections

    Editor’s Note from Paul Bové:
    …We decided to gather responses to Mandela as a political figure. b2 issued a call for very brief papers from several spots on the globe and from different generations. Our contributors have given us reason to feel this attempt was a success.

    Preface by Anthony Bogues

    Mbu ya Ũrambu: Mbaara ya Cuito Cuanavale / The Cry of Hypocrisy: The Battle of Cuito Cuanavale by Ngũgĩ wa Thiong’o

    Discomforts by Hortense Spillers

    The Mandela Enigma by Wlad Godzich

    Mandela, Charisma, and Compromise by Joe Cleary

    Nelson Mandela on Nightline; or, How Palestine Matters by Colin Dayan

    Or, The Whale by Jim Merod

    Malaysian Mandela by Masturah Alatas

    Mandela, Tunisia, and I by Mohamed-Salah Omri

    Nelson Mandela by Ruth Y. Y. Hung


    Mandela Memories: An African Prometheus by Ngũgĩ wa Thiong’o

    Nelson Mandela: Decolonization, Apartheid, and the Politics of Moral Force by Anthony Bogues

    Mandela’s Wholeness, Perhaps Infinite by Dawn Lundy Martin

    [untitled] by Gayatri Chakravorty Spivak

    Mandela’s Gift by Sobia Saleem

    _____

    Three Models of Emergency Politics by Bonnie Honig

    Democracy: An Unfinished Project by Susan Buck-Morss

    The Future of Reading? Memories and Thoughts toward a Genealogical Approach by Hans Ulrich Gumbrecht

    _____

    b2 Interview
    History Unabridged: An Interview with Stefan Collini with Jeffrey J. Williams

    _____

    Articles
    King Kong in America by Arif Dirlik

    How Global Capitalism Transforms Deng Xiaoping by Ruth Y. Y. Hung

    Is Dasein People? Heidegger According to Haugeland by Taylor Carman

    It’s Only the End of the World by Ben Conisbee Baer

    Passive Aggressive: Scalia and Garner on Interpretation by Andrew Koppelman

  • There's a Riot Going on: From Haiti to Tunisia

    There's a Riot Going on: From Haiti to Tunisia

    by R. A. Judy

“The true scandal is not in the proposition of analogy between the Haitian and Tunisian revolutions, but in this epistemological failure, which perpetuates the refusal to recognize that they are not derivative analogues of the French Revolution or the European Spring of Revolution, but are distinctive events of social transformation, which while in part stimulated by a certain set of Enlightenment concepts and institutions, have taken a course that cannot be charted according to the dominant mapping of our common modernity…”

Given on May 17, 2014, at The Tunisian Revolution: Causes, Course and Aftermath Conference, St Antony’s College, University of Oxford, Oxford, UK

    ©Ronald A. Judy
    (Do not quote or cite without the author’s express permission)

I should like to take full measure of the gravitas of my charge today, which is to locate the Tunisian Revolution in its international dimension. In doing so, I begin by pressing on two of the crucial terms of that charge: locate and international. Without appealing to the rather authoritative force of etymology—itself a mode of placement and so begging the question—I’ll merely point out that locating something is to place it within some set of boundaries and so settle it; to situate it. How does one situate or settle revolution except to stabilize it, as the National Convention did in 1795 when, having just repressed the last uprising of the Revolutionary Parisian sans-culottes, it yielded power to the Directory? Nor is it a trivial fact for our purposes here that chief among the institutions of stability was the comprehensive public education law enacted in October of that year, establishing the Institut national des sciences et des arts (National Institute of Sciences and Arts), whose expressed mission was indeed to advise the Directory about intellectual work, both scientific and literary, in France and abroad, which might have been of use in stabilizing the energies of the revolution—in other words, their management for the glory of the republic. This was perhaps most successfully realized in the work of the Institut’s second class, the Classe des sciences morales et politiques (Class of Moral and Political Sciences), in which de Tracy’s Idéologues held considerable sway; a heuristic of some of the pitfalls involved in the academicization of revolution well worth attending to now. Nonetheless, it warrants pointing out that in its voluminous work of memoirs, the Institut national des sciences et des arts achieved a corpus of psychological social science, including theories of mind as well as ethics, all focused on the well-tempered individual as the proper embodiment of revolutionary force, that still contributes to our understanding of proper social order in change. And that is precisely why we cannot “locate” the Tunisian Revolution, per se. Even if we were to locate it in the seemingly straightforward geo-political sense of placement, I should still dissent, because it is not merely circumscribed within the ambit of the Arab World in any easy way, and it remains porous both northerly and southerly in a way that severely troubles the distinguishing boundaries of Europe, the Mediterranean, and Africa.

    So, then, rather than locating the Tunisian Revolution in its international dimension, I raise, and will try to address the question of what and how it is meaningful as an earthly historic human event. The most succinct answer to this question is that the Tunisian Revolution, “which we have seen unfolding in our day, whether it may succeed or miscarry . . . finds in the hearts of all spectators (who are not engaged in the game themselves) a wishful participation that borders closely on enthusiasm, the very expression of which is fraught with danger; this sympathy, therefore, can have no other cause than a moral predisposition in the human race.” The last sentence sums things up: this revolution is evidence that humanity can progress of its own accord. That, I think, is the significance of the Tunisian Revolution of Dignity in all the details of its events beginning in Gafsa in 2008 up through to the moment. And, in that regard, it is far more analogous to the events that shook the Caribbean Island of Saint Domingue from 1791-1804, when the independent republic of Haiti was established,1 than it is to either those that transformed France from 1789 to 95, or those from 1848 to 71, which ushered in the hegemony of the European bourgeois liberal nation-state. I know this seems like a radical provocation. I do not, however, intend it as a scandalous remark, but rather as a serious proposition aimed at getting us to think something else. Its seeming scandalous has to do with its incomprehensibleness, which in turn has to do with a failure of knowledge regarding those events of Haiti that, as the Haitian anthropologist, Michel-Rolph Trouillot, asserted in 1990, persist as “‘unthinkable’ facts . . . 
for which one has no adequate instruments to conceptualize.”2 The true scandal is not in the proposition of analogy between the Haitian and Tunisian revolutions, but in this epistemological failure, which perpetuates the refusal to recognize that they are not derivative analogues of the French Revolution or the European Spring of Revolution, but are distinctive events of social transformation, which while in part stimulated by a certain set of Enlightenment concepts and institutions, have taken a course that cannot be charted according to the dominant mapping of our common modernity.

    What I am proposing, then, is that in order to address the significance of the Tunisian Revolution, to seriously ask what this is as an earthly historic human event, we need another historiography of revolution, one that not only makes use of alternative archives but also deploys an alternative anthropology. In addressing the question what is this, we need ask what does it look like; hence, my answer: Haiti. To the extent that this entails locating the Tunisian Revolution within an international milieu, it means situating it in the lineage of, to put it bluntly, “other-than-European” popular revolution. This does not mean non-European, which would assume that the question of Europe itself is settled; which it is not, remaining instead the principal conundrum of modern political science, as well as human sciences: What are we and how can we see ourselves in common? The incomprehensibleness of the commonality of the Haitian and Tunisian revolutions to the current political and sociological analysis is indicative of the utter failure of these sciences to adequately address that question. In the case of Haiti, this is expressed as an outright hostility to the possibility of there ever being let alone ever have been a revolution. In the case of Tunisia, it is manifested as an equally assertive indifference. Both responses have a similar effect: the blockage of destructive neglect of the revolutionary momentum. There are two specific points of analogy to which I wish to draw attention today. The first has to do with why both Haiti and Tunisia are incomprehensible as revolutions in their own right. The second has to do with, incomprehensibleness notwithstanding, the Haitian and Tunisian revolutions’ function in common as actual catalysts for worldwide revolution. Both are emblematic of the movement of les damnes of modernity to realize the better aspirations of humanist modernity: Universal human dignity and rights. 
This has certainly been so for Haiti historically: for 200 years it has been an emblem of radical revolutionary freedom among radicals, and not just Black radicals, despite, no, precisely because of, the efforts of the great powers to erase it. Tunisia may perhaps, and this is the aspirational bit, come to be the same for our era.

    Taking up the first point, I’ll remark on what I am sure many of you have already noted, which is that my proposition that the Tunisian Revolution is evidence humanity can progress of its own accord is a paraphrase of Immanuel Kant’s assessment of the French Revolution given in his treatise on education, Der Streit der Fakultäten (The Conflict of the Faculties). Kant’s pronouncements on revolution have come under considerable scrutiny among political philosophers of late, in keeping with a renewed investment in his conception of cosmopolitanism, the reason having to do with the idea that we may indeed be approaching such a world order. Of course, Kant is notoriously counterrevolutionary, precisely because, as Lewis Beck and even Chris Surprenant have pointed out, his theory of the deontological foundation for the origins of civil society dictates an absolute prohibition on violent rebellion. Nonetheless, he did publicly express enthusiasm for the French Revolution, seeing in the events of 1789 to 1798, when he wrote The Conflict, a mode of thinking—we might best call it, daring to correct him, an emergent intelligence—that “demonstrates a character of the human race at large and all at once.” That this should have emerged all at once, spontaneously, among the populace, without the benefit of the discipline, Zucht, achieved through cultured pedagogy, tending toward the institution of a civil constitution, is precisely what recommends it as evidence of human progress. It was evidence of the inherent universal human tendency toward progressive change, where the movement is towards realizing a common association of life and living. The fact that even though, for Kant, this is expressly a communicative association in reason, its conceptual schema is principally a function of imagination need not concern us here.
I merely want to mark it as a useful insight for understanding the eventfulness of Abou el-Kacem Chebbi’s 1933 poem, “If the People One Day Will to Live,” in the spontaneity of the Tunisians’ popular uprisings and their manifestation of a certain sort of sovereignty as self-conscious autopoiesis; and that it is precisely the unlawfulness of such collective imagination that inclined Kant to view the events unfolding on Saint Domingue during the same time as those in France as the purest instance of collective irrational emotion—in the sense of ill-directed public commotion and unrest: riots—acting against moral-reason, and so as an absolutely illegitimate eruption of violence against not only government but also civil society. By the same token, I’ll not rehearse Kant’s account of the origins of civil society, with its complicated elaboration of the duties of right—virtue to the self and justice to others—and his notion of authorized reciprocal coercion, which lays the foundation for his views on revolution. It suffices to remark here that his account turns on the postulate that humankind is composed of individuals who, even in the state of nature, are all rational, autonomous beings. These two aspects of Kant’s thinking are key reasons why all he could see happening in Saint Domingue was a Negro slave rebellion. It is crucial we understand that this was not a failure of personal morals, or some kind of irrational reaction to human difference. It was a fundamental function of Kant’s transcendental deduction, which is to say his account of what our reality is and how we have it, and so of what it means to be a free human subject capable of enlightenment, of warranting the motto Sapere Aude. In his assessment of all that, the Negro is a type of hominid firmly situated in the natural domain of things governed by physical law, but not so fully within the supranaturalistic domain of persons governed by the rational moral law.
In that light, the basis of the Haitian Revolution’s incomprehensibleness that Trouillot references has precisely to do with the priority of the individual in the tradition of European political philosophy; it is because the Negro cannot be admitted into the ranks of rational cosmopolitan individuals, and so cannot be a generator of civil society, that the prospect of a revolution forming a republic—that is, constituting a civil society—is unfathomable, indeed nearly unimaginable. My point here—and it is a complicated one that I shall have to make quickly yet, I hope, coherently—is not about race. Rather, what gets expressed in Haiti’s case as a problem of race is indicative of a more fundamental problem of anthropological psychology and philosophy. That is the long-enduring premise that only one mode of subjectivity drives the history of knowledge, as well as history itself, and that it has a definitive singular formation.

    The Haitian Revolution’s being a contradictory corrective to this premise was announced by Jean-Jacques Dessalines on April 28, 1804, when he justified the recriminatory violence that had just taken place against the island’s whites with the words: “We have paid these true cannibals back in full; war for war, crime for crime, outrage for outrage. . . . I have saved my country. I have avenged America.” Just four months earlier, Dessalines had declared the establishment of the Republic of Hayti, in his capacity as its first president. Naming the new country with the assumed Taino term for the island of Hispaniola—the very first place to see the arrival of Iberian colonists and the emergence of Europeans on the world stage—was a symbolically powerful statement, as was his reversal of the accusation of cannibalism that had long justified the autochthonous people’s enslavement and murder. Dessalines’ April 28 statement signified an act of solidarity with not only all the oppressed populations, les damnés, of the Western hemisphere, but also the entire world, as was made explicit in the language of the 1804 constitution. One is inclined to agree with Nick Nesbitt and recognize in that constitution the first attempt to construct a society in accordance with the radical Enlightenment axioms of universal emancipation and universal human autonomy, in which all human subjects retain their autonomous constituent power. Dessalines thus defined the Haitian Revolution as a war of worlds, one that in “saving” Haiti from colonial slavery had avenged an entire hemisphere. In so doing, he expressly took up the Radical Enlightenment, further radicalizing in turn that very Enlightenment, which had refused to address anyone other than Whites as full subjects of human rights.
As Nesbitt characterizes it, the Haitian Revolution amounted to an “invention of an egalitarian freedom unknown in the North Atlantic.” One might quibble with the term “invention,” preferring manifestation, yet concur fully with the assessment of the revolution’s scope, articulating a distinctive historical subjectivity—that is, one distinct in its formation from that of the bourgeoisie of the Enlightenment. This articulation was remarked by the first properly Haitian theorist and polemicist for the revolution, Pompée-Valentin, baron de Vastey, in his An Essay on the Causes of the Revolution and Civil Wars of Hayti, where he writes of a population that only twenty-five years earlier was “in slavery and the most profound ignorance,” with “no idea of human societies, no thought of happiness, no kind of energy,” yet through massive spontaneous individual autodidactic effort—“many of them learned to read and write of themselves without an instructor. They walked about with books in their hands, inquired of persons whom they met, whether they could read; if they could, they were then desired to explain the meaning of such a particular sign, or such a word”—produced in the span of one generation a corps of indigenous Haitian notaries, barristers, judges, and statesmen that “astonished every one by the solidity of their judgment.” Even more significant than this being a direct contradiction of Kant’s dismissal of the Negro as an inferior, more natural hominid is that the facts of Haitian autodidacticism are evidence for his theory of humankind’s capacity for autopoietic progression, and that, even more than the French Revolution, the Haitian Revolution proves this.
So what the incomprehensibleness of the events of the Haitian Revolution clearly indicates is not merely that they are unthinkable in accordance with the reigning cosmology, but that the cosmology is woefully incapable, on its fundamental premises, of yielding any truly adequate knowledge about the eventfulness of humankind, about how the societies in which we actually live are as they are. Which is to say, it is a long way from giving a full picture of how humanity lives life in our world.

    To see how this problem of incomprehensibleness and contradiction relates to the Tunisian Revolution, and so underscore this point about the resemblance between the events begun at Bois Caïman on August 28, 1791 and those that began at Sidi Bou Zid on December 17, 2010, we need merely recall Alain Badiou, just five days after the fall of Ben Ali, designating the events in Tunisia les émeutes en Tunisie, the riots in Tunisia. What struck Badiou about these events was that they contradicted the «fin de l’histoire» (end of history) thesis of globalization, which postulates “the end of eventful history (la fin de l’événementialité historique), the end of a moment where the organization of power could be overthrown in favor of, as Trotsky said, ‘the masses entering on the stage of history.’” Precisely such events as those in Tunisia were supposedly no longer possible. For the past thirty years, neoliberal globalization has been, as Badiou says, “the only tenable norm of general subjectivity (la seule norme tenable de la subjectivité générale).”

    Once again, we are held captive by a powerful idealist concept of things—and especially so when it is touted as a reductive behavioralism or functionalism—that interferes with our capacity to see what is unfolding before us. Certainly, this subjectivité, this person, becoming the global norm has been the meaning of globalization until now. It has been a globalization from above, one that we have called, in the French mode, “Américanisation,” underscoring its association with imperialism, or more consistently “neoliberalism,” which is characterized by the premise that market values — the dynamics of high capitalist finance — are the absolute measure not just of human progress but of existence as well. And so the economy of consumption and desire, desire and consumption, has been the sole determinant of what we are. Until now. I say until now, because what the Tunisian émeutes have unleashed is another mode of globalization, one expressly based on a set of values — dignity, liberty, and social justice — the very same ones espoused by the Haitians. And as with them, these values were not espoused by the intellectuals of the elite classes (whether bourgeoisie or petty bourgeoisie) functioning as an avant-garde to the masses, but by the masses on their own. “What is fascinating above all else in the Tunisian events,” according to Badiou, “is their historicity: they demonstrate that the capacity to create new forms of collective organization is intact (la mise en évidence d’une capacité intacte de création de nouvelles formes d’organisation collective).”

    I draw your attention here to this distinction between the processes of market-driven subjectivity and the capacity of the Tunisian revolution to create new forms of collectivity. In both instances, we are talking about some process of individuation that has the practical and very material function of socialization, of creating a certain type of individual suitable for a certain type of sociality. The individuation process of the capitalist market — and I mean throughout its history, from the early commodity markets of tenth-century Europe to the current neoliberal market of global finance — may indeed have engendered the normative subjectivity of the market through its endless refashioning and management of desire and imagination, but it also engendered something else, as is evidenced by the Tunisian Revolution. This something else is what Zygmunt Bauman termed an aesthetic sociality: the condensation of spontaneous subjective feeling into volatile and unpredictable occasions of consensus. As he says: “The instantaneous sociality of the crowd is a counter-structure to socialization’s structures.” We can understand by this that the cumulative institutionalized practices of disciplining normality, the genealogies of which Foucault elaborated under the loose rubric of biopolitics—to which Bauman adds the legislative rationality of cognitive space, thereby referencing the methodological practices of the human sciences in the university—are interrupted by the faceless agency of the crowd.

    On this point, I emphasize the importance of the Tunisian Revolution’s displaying the very real capacity of the spontaneous intelligence of the people to create, to generate new forms of sociality independent of the market-based processes of socialization. These “émeutes” hold the promise of what Frantz Fanon referred to 43 years ago, in his hopeful analysis of the potential of the Algerian Revolution, as “doing something new,” which gets paraphrased as neo-humanism but which I prefer to call radical humanism. What I mean by this is a humanism predicated on something other than the processes of bourgeois or even proletarian individuation; that is to say, its values are not reducible to matters of exchange, or even to the practical, in the Kantian or the pragmatic sense of the term — matters related to exchange-value. I am brought, thus, to the second point of analogy I wish to mark today. Both the Tunisian and Haitian revolutions give manifest expression to a type of human intelligence articulating a self-consciousness that is not identical to the transcendental self behind subjective as well as objective idealism. It is, in distinction from that subjectivity, an articulation of being among things in the world. It is a figure for a distinctly different epistemology than that of the bourgeoisie, even in the latter’s revolutionary articulations.

    Such was highlighted early on in the revolution by Mongi Rahoui who, just one month after Ben Ali’s flight from power, during a symposium convened at the Temimi Foundation, proclaimed: “I personally do not belong to any party or any association; I have my personal affiliation—I belong to myself . . . I want to be a member of ‘a stone in a larger dam,’ paying the revolution forward together and giving attention to its accomplishments, saying it is from beginning to end a revolution of freedom and dignity.” With this blunt assertion of the self, and his identification of this self-awareness as the fundamental revolutionary project of actualizing a free society, Rahoui raises to prominence the question of ethical relations: How am I engaged in ethical relation with others? He has publicly insisted on the centrality of this question in the political process of the revolution in his role as the representative of Jendouba in the National Constituent Assembly, which was charged with drafting the new constitution. Just this January, when the constitution was being finalized, Rahoui became embroiled in a pivotal debate with Habib Ellouz, a founding member of the Nahda, over the language of Article 1 of the newly drafted constitution, a debate resulting in the language in Article 6 expressly prohibiting charges of apostasy (تكفير/takfir) and incitement to hatred and violence—as clear an indication as any that this revolution, whatever it is, is not theocratic.
It is not inconsequential that Rahoui’s debate with Ellouz garnered considerable attention in Tunisia and the Arab World, precisely because it is a heuristic of the struggle between the native secularism expressed in the spontaneous prolonged insurrection of the streets that began in December 2010 and continued well into September 2013, and the Islamist agenda to impose what the Tunisian activist and philosopher Muhsin al-Khouni calls a utopian fiction of the Islamic heritage: their conception of sharī‘a. Nor is it inconsequential in that regard that Rahoui is now the sole member of the leftist Mouvement des Patriotes Démocrates (Democratic Patriots’ Movement, or MOUPAD) to hold a seat in the National Constituent Assembly. Ideologically Marxist and ardently secular and anti-Islamist, MOUPAD was part of the Popular Front formed in October 2012, bringing together various leftist and progressive parties into an effective political bloc. It was the assassination of MOUPAD’s Secretary-General, Chokri Belaïd, by a Salafist in February 2013 that precipitated the national crisis in which the coordinated efforts of the Popular Front, the UGTT, and street demonstrations eventually led to the Nahda government’s collapse this January.

    Rahoui’s persistence in emphasizing the Tunisian Revolution’s fundamental insistence on individual responsibility for life in association with others in the political reformation of Tunisia gives a certain actuality to what was initially signaled by the multitude in the streets with the slogan كرامة الإنسان (kāramat-ul-insān, “human dignity”) during the initial insurrection, and was fiercely defended by the syndicalists during the Nahda government. It is a manifestation of what the late Chokri Belaïd spoke about as the “Tunisian intelligence” (al-dhikā al-tunisī/الذكاء التونسي), by which he meant a critical mass of educated subjects, including the labor movement and the various institutions of civil society, formed through a specific educational system and a confluence of historical and geographic factors unique to the country. That intelligence, he argued, is both what would save the nation, having sparked the revolution, and what the emerging constitutional order should invest in and strive to preserve.

    Belaïd’s designation and description of Tunisian intelligence, Rahoui’s activism, and especially the vernacular invocation of human dignity are all indicative of a particular process of individuation that was not so much inaugurated by the postcolonial Bourguiba government’s enactment of the education reform law number 58-118 of November 1958 as traceable back to the older Tanzimat-style reforms implemented by Khaïreddine al-Tunsi in the nineteenth century at al-Zaytouna University and Collège Sadiki, reforms to which the 1958 law gave a more popular institutionalization and instrumentality. This process of individuation can be regarded as resonant with Gramsci’s fundamental focus on the relationship between the material conditions of life-practices and the institutions of human intelligence, so that the popular Tunisian intelligence Belaïd described is an emergent formation—it is a moment of subalternity, the precise moment when a set of life-practices gives expression to a set of intellectual practices of reflection and organization that articulate a narrative of historical constitution and change. Mahmud al-Mas‘adi, who undertook the institutional execution of the 1958 reform as Secretary of State for Education, Youth and Sports, designated this condition “restlessness” (على قلق/‘ala qalqin), describing a mode of sociality in which each individual accepts the responsibility, as well as the risk, of living life in relation and in common with others. In effect, the Bourguiba/Mas‘adi reform engendered a population that is قلوق/qalūq (restless), capable of an ongoing open-ended practice of discovery, which is precisely what Fanon was describing with the term individuation.
The Tunisians’ identification of this restlessness with كرامة الإنسان (kāramat-ul-insān) is akin to what Tony Bogues has recently designated as “common association” in his attempt to think the centrality of artistic and poetic expression in the Haitian people’s effort to actualize a free revolutionary subject in the immediate aftermath of the 1804 revolution. Indeed, the 1958 Bourguiba law was as extensive in scope as were the education laws promulgated by Henri Christophe when he became King of Haiti in 1811, after the dissolution of Dessalines’ imperium with his death in 1806 had precipitated the division of the country into warring northern and southern realms, and then again in 1816 by Pétion, who in the southern republic established an extensive system of education, including a national school of secondary education for girls in Jacmel, as well as the Pensionnat National des Demoiselles in Port-au-Prince, declaring: “Education should be the fundamental basis of any program in a true democracy, because education raises man to the dignity of his being.” In that vein, the human condition both the Haitian and Tunisian revolutions describe as well as enact is perennially transitional, or to use an older language, metabolic. This, I think, is what is at stake in Tunisia right now, expressed in the eloquent local metaphor شريعة الثورة/shari‘at-u-thawra. I translate this in deliberate deviation as “the ethics of the revolution,” rather than the more conventional “law [as in Sharia] of the revolution,” to remain in solidarity with the Youth of the Revolution in their ambition to sustain an open-ended possibility for a myriad of ways of taking care of the self, an unending restlessness.

    Arguably, the spontaneity with which the people of Kasserine established structures of order amid all the chaos during those dark days of early January 2011 is illustrative of such restlessness as a societal force. And when those events are considered in light of Mohamed-Salah Omri’s claim that a constant of Tunisian social life is the culture of dialogue and what may be called institutionalism, we must seriously ponder the hard question of whether the Tunisian events of this moment, like the Haitian events of the long nineteenth century, do not so much announce a new paradigm of revolutionary transformation as manifest a history of individuation in modernity that escapes comprehension from a certain perspective. This is a matter of the seer and the seen. And, in that regard, the assessment of the Tunisian revolutionary unionist and theorist Mouldi Guessoumi is extremely pertinent: “This is a revolution that has not affected Tunisia’s mode of production, or the overall structure of its society, or even the political consciousness and reasoning. Rather, it has been a surgical intervention undertaken by the citizenry in the daily life practices of society.” Perhaps the clearest, although not simplest, illustration of this is the insistence of the people in Sidi Bou Zid that they be able to eat bread without having to beg. Calling this كرامة الإنسان (kāramat-ul-insān), human dignity, they aim at achieving a society in which one’s desire is not the instrument of one’s exploitation.

    notes:
    1. Note that when President Boyer secured France’s recognition of the republic in 1825, at a devastating cost, he effectively ended the revolution’s political expression.

    2. He made this assertion in Haiti: State against Nation. The Origins and Legacy of Duvalierism, reiterating what he had already set out in his landmark 1977 work, Ti difé boulé sou Istoua Ayiti, which was the first book-length monograph in Haitian Creole on the origins of the Haitian Revolution.

  • Towards Alternative Archives

    Towards Alternative Archives

    Between 2010 and 2012, Anthony Bogues and Geri Augusto convened a critical global humanities summer institute at Brown University. As part of that program, Bogues was invited to Addis Ababa, Ethiopia, to continue these conversations. This short documentary records those conversations in Addis Ababa, where Ethiopian scholars discuss their own practical and theoretical approaches to humanistic work, approaches that draw on African thought and experience.

    Video by the Watson Institute for International Studies.