Category: b2o: an online journal

b2o: an online journal is an online-only, peer-reviewed journal published by the boundary 2 editorial collective, with a standalone Editorial Board.

  • Arne De Boever — Realist Horror — Review of “Dead Pledges: Debt, Crisis, and Twenty-First-Century Culture”

    by Arne De Boever

    Review of Annie McClanahan, Dead Pledges: Debt, Crisis, and Twenty-First-Century Culture (Stanford: Stanford University Press, 2017)

    This essay has been peer-reviewed by the boundary 2 editorial collective.

    The Financial Turn

    The financial crisis of 2007-8 has led to a veritable boom of finance novels, that subgenre of the novel that deals with “the economy”.[i] I am thinking of novels such as Jess Walter’s The Financial Lives of the Poets (2009), Jonathan Dee’s The Privileges (2010), Adam Haslett’s Union Atlantic (2010), Teddy Wayne’s Kapitoil (2010), Cristina Alger’s The Darlings (2012), John Lanchester’s Capital (2012), David Foster Wallace’s The Pale King (2012),[ii] Mohsin Hamid’s How To Get Filthy Rich in Rising Asia (2013), Nathaniel Rich’s Odds Against Tomorrow (2013), Meg Wolitzer’s The Interestings (2013)—and those are only a few.

    Literary criticism has followed suit. Annie McClanahan’s Dead Pledges: Debt, Crisis, and Twenty-First-Century Culture (published in the Post-45 series edited by Kate Marshall and Loren Glass) studies some of those novels. It follows on the heels of Leigh Claire La Berge’s Scandals and Abstraction: Financial Fiction of the Long 1980s (2015) and Anna Kornbluh’s Realizing Capital: Financial and Psychic Economies in Victorian Form (2014), both of which deal with earlier instances of financial fiction. By 2014, McClanahan had already edited (with Hamilton Carroll) a “Fictions of Speculation” special issue of the Journal of American Studies. At the time of my writing, Alison Shonkwiler’s The Financial Imaginary: Economic Mystification and the Limits of Realist Fiction has just appeared, and no doubt many more studies will follow. In the Coda to her book, La Berge mentions that scholars have begun to speak of a “critical studies of finance” that would gather these developments into a thriving field.

    Importantly, Dead Pledges looks not only at novels but also at poetry, conceptual art, photography, and film. Indeed, the “financial turn” involves more than fiction: J.C. Chandor’s Margin Call (2011), Costa-Gavras’ Capital (2012), Martin Scorsese’s The Wolf of Wall Street (2013), and Adam McKay’s The Big Short (2015) were all released in the aftermath of the 2007-8 crisis. American Psycho, the musical, premiered in London in 2013 and moved on to New York in 2016.

    All of this contemporary work builds on and explicitly references earlier instances of thinking and writing about the economy, so this interest in the economy is not in itself anything new. However, given its name one could argue that while the genre of the finance novel—understood more broadly as any novel about the economy—precedes the present, it is only during the financial era, which began in the early 1970s, and especially since the financial crisis of 2007-8, that it has truly come into its own. For the specific challenge that is now set before the finance novel is precisely to render the historic formation of “finance” into fiction. Critics have noted that such a rendering cannot be taken for granted. While capitalism has traditionally been associated with the realist novel (as La Berge and Shonkwiler point out at the outset of their edited collection Reading Capitalist Realism[iii]), literary scholars consider that capitalism’s intensification into financial or finance capitalism—or finance tout court—also intensifies the challenge to realism that some had already associated with global capitalism.[iv] Abstract and complex, finance exceeds what Julia Breitbach has observed to be some of the key characteristics of realism: “narration”, associated with “readable plots and recognizable characters”; “communication”, allowing “the reader to create meaning and closure”; “reference”, or “language that can refer to external realities, that is, to ‘the world out there’”; and “ethics”, “a return to commitment and empathy”.[v]

    In the late 1980s, just before the October 19, 1987 “Black Monday” stock market crash, Tom Wolfe may still have thought that to represent finance, one merely had to flex one’s epistemological muscle: all novelists had to do, Wolfe wrote, was report—to bring “the billion-footed beast of reality” to terms.[vi] By the time Bret Easton Ellis’s American Psycho arrived, however, that novel presented itself as an explicit response to Wolfe,[vii] proposing a financial surrealism or what could perhaps be called a “psychotic realism” (Antonio Scurati) to capture the lives that finance produces. If (as per a famous analysis) late capitalism’s aesthetic was not so much realist as postmodernist, late late capitalism or just-in-time capitalism has only intensified those developments, leading some to propose post-postmodernism as the next phase in this contemporary history.[viii]

    At the same time, realism seems to have largely survived the postmodernist and post-postmodernist onslaughts: in fact, it too has been experiencing a revival,[ix] one that is visible in, and in some cases dramatized in, the contemporary finance novel (which thereby exceeds the kind of financial realism that Wolfe proposes). Indeed, one reason for this revival could be that in the aftermath of the financial crisis, novelists have sought precisely to render finance legible and comprehensible through literature—to bring a realism to finance’s abstract and complex world.

    Given realism’s close association with capitalism, and its post- and post-postmodern crisis under late capitalism and finance, none of this should come as a surprise. Rather, it means that critics can consider the finance novel in its various historical articulations as a privileged site to test realism’s limits and limitations.

    Finance, Credit, Mortgage

    If Karl Marx’s celebrated formula of capital—M-C-M’, with money leading to money that is worth more via the intermediary of the commodity—is quasi-identified with the realist novel, the formula’s shortened, financial variation—M-M’, money leading to money that is worth more without the intermediary of the commodity[x]—has come to mark its challenges. Perhaps in part reflecting this narrative (though this is not explicitly stated in the book), Dead Pledges’ study of the cultural representations of finance starts with a discussion of the realist novel but quickly moves beyond it, looking elsewhere for representations of finance.

    McClanahan’s case studies concern the early twenty-first century, specifically the aftermath of the 2007-8 crisis. The historical-theoretical framework of Dead Pledges, however, focuses on credit and debt, and it extends some forty years further back, to the early 1970s and the transformations of the economy that were set in motion then. Dead Pledges thus takes up the history of financialization, which is usually dated back to that time. Neoliberalism, which is sometimes confused with finance and shares some of its history, comes up a few times in the book’s pages but is not a key term in the analysis.

    One could adduce various reasons for the periodization that McClanahan adopts, including—though with some important caveats—the Nixon administration’s unilateral decision in 1971 to abolish the gold standard, thus ultimately ending the Bretton Woods international exchange agreements that had been in place since World War Two and propelling the international markets into the so-called “Nixon shock.” However, in his key text “Culture and Finance Capital” Fredric Jameson already warned against the false suggestion of solidity and tangibility that such a reference to the gold standard (which was really “an artificial and contradictory system in its own right”, as Jameson points out[xi]) might bring. Certainly for McClanahan, who focuses on credit and debt and is not that interested in money, it would make sense to abandon so-called commodity theories and fiat theories of money—which locate the origins of money in the exchange of goods or in a sovereign fiat—for the credit or debt theory of money, which, as per the revisionist analyses of David Graeber and Felix Martin, for example,[xii] has exposed those other theories’ limitations. Indeed, McClanahan’s book explicitly mentions Graeber and other contemporary theorists of credit and debt (Richard Dienst, Maurizio Lazzarato, Angela Mitropoulos, Fred Moten and Stefano Harney, Miranda Joseph, Andrew Ross) as companion thinkers, even if none of those writers is engaged in any detail in the book.

    Since the 1970s, consumer debt has exploded in the United States, and Dead Pledges ultimately zooms in on a particular form of credit and debt, namely the home mortgage. McClanahan inherits this focus from the collapse of the home mortgage market, which triggered the 2007-8 crisis. She rehearses the history of this collapse, including its complicated technical detail, at various moments throughout the book. Although this history is likely more or less familiar to readers, the repetition of its technical detail (from various angles, depending on the focus of each of McClanahan’s chapters) is welcome. As McClanahan points out, home mortgages used to be “deposit-financed” (6). While there was always a certain amount of what Marx in Capital: Vol. 3 called “fictitious capital”[xiii] (“fiktives Kapital”) in play—banks can write out more mortgages than they actually have money for, based on their credit-worthy borrowers’ promises to repay (with interest)—the amount of fictitious capital has increased exponentially since the 1970s. More and more frequently, mortgages are funded not through deposits but “through the sale of speculative financial instruments” (6)—basically, through the sale of a borrower’s promise to repay. This development is enabled by the practice of securitization: many mortgages are bundled together into a pool, which is divided into slices, or tranches, and sold as financial instruments—mortgage-backed securities (MBSs) or collateralized debt obligations (CDOs). These kinds of instruments, so-called derivatives, are the hallmark of what in Giovanni Arrighi’s terms we can understand as the phase of capitalism’s financial expansion (see 14). This refers to an economic cycle during which value is produced not so much through the making of commodities as through value’s “extraction” (as Saskia Sassen puts it[xiv]) beyond what can be realized in the commodity—in this particular case, through the creation and especially the circulation of bundles of mortgages.

    As McClanahan explains, securitization is about “creating a secondary market” (6) for the sale of debt. The value of those kinds of debt-backed “commodities” (if we can still call them that) does not so much come from what they are worth as products—indeed, their value is dubious, since the already mentioned tranches, for example, will include both triple-A-rated mortgages (mortgages with a low risk of default) and subprime mortgages (like the infamous NINJA mortgages that were granted to people with No Income, No Jobs, No Assets). Nevertheless, those MBSs or CDOs often still received a high rating, based on the flawed idea that the risk of value-loss was lessened by mixing low-risk mortgages with high-risk ones. What seemed to matter most was not so much the value of an MBS or CDO as a product but its circulation, which is the mode of value-generation that Jasper Bernes, among others, has deemed central to the financial era. Ultimately, while these instruments brought the global financial system to the edge of collapse, they also generated extreme value for those who shorted them. And shorted them big, as Adam McKay’s The Big Short would have it (Paramount, 2015; based on Michael Lewis’s 2010 book of the same title). By betting against them, the protagonists of Lewis’s and McKay’s story made an immense profit while everyone else suffered catastrophic losses.

    “Dematerialization” alone, and cognate understandings of finance as “performative” and “linguistic”[xv]—in other words, this story as it could be told with the abolition of the gold standard as its central point of reference—cannot tell the whole truth here, especially since credit and debt can actually be found at the origin of money. Through those historico-economic developments, however, there emerges a transformed role for credit and debt in our societies, from a “form of exchange that reinforces social cohesion” (185) to “a regime of securitization and exploitable risk, of expropriation and eviction” (182). Dematerialization—or, better perhaps, various rematerializations: from gold or real estate to securitized mortgage debt, for example—is important, but without the material specifics of the history that McClanahan recounts it does not tell us all that much.

    Echoing David Harvey’s description of the need for “new markets for [capital’s] goods and less expensive labor to produce them” as a “spatial fix” (Harvey qtd. 12), McClanahan reads the history summarized above as a “temporal fix” because “it allows capital to treat an anticipated realization of value as if it had already happened” (13). In 2007-8, of course, that fix turned out to be an epic fuck-up. McClanahan recalls Arrighi’s periodization (after Fernand Braudel) of capitalism as alternating “between epochs of material expansion (investment in production and manufacturing) and phases of financial expansion (investment in stock and capital markets)” (14) and notes that the 2007-8 crisis seems to have marked the end of the phase of financial expansion.

    In Arrighi’s view, that would mean the time has come for the emergence of a new superpower, one that will step in for the U.S. as the global hegemon. A restoration of American (U.S.) greatness through a return to an era of material expansion (as the current U.S. President Donald J. Trump is proposing) appears unlikely within this framework: at best, it will have some short-lived, anachronistic success before the new hegemon arrives. But will that new hegemon arrive? According to some—and McClanahan appears to align herself with them—the current crisis of the system “will not lead to the emergence of a new regime of capitalist accumulation under a different imperial superpower” (15). “Instead, it heralds something akin to a ‘terminal crisis’ in which no renewal of capital profitability is possible” (15). Does this then lead to an eternal winter, as Joshua Clover has already asked?[xvi] Or are we finally done with those phases, and ready for something new?

    The Novel: Scale and Character

    If all of this has been rather theoretical so far, Dead Pledges’ four chapters stand out first of all as nuanced readings of works of contemporary culture. As McClanahan sees it, culture is the best site at which to understand debt as a “ubiquitous yet elusive social form” (2). By that, she does not mean we should forget about economic textbooks; but to understand debt as a “social form”, culture is the go-to place. McClanahan’s inquiry starts out traditionally, with a chapter about the contemporary realist novel. In it, she takes on behavioral economics, a subfield of microeconomics. Unlike macroeconomics, microeconomics focuses on individual human decisions. Whereas microeconomic models tend to assume rational agents, behavioralism does not: non-rational human decisions might cause, or result from, a market crisis.

    What caused the 2007-8 crisis? There are multiple answers, and McClanahan shows that they are in tension with one another. One answer—the macroeconomic one—is that the crisis was the result of an abstract and complex financial system that caved in on itself. Such an explanation tends to avoid individual responsibility. Microeconomics, on the other hand, and behavioralism in particular, blames the crisis on the bad decisions of a few individuals, which exculpates institutions. The latter seems to have been the dominant mode of explanation. In this explanation too, however, the buck seemed to stop nowhere: how many bankers went to jail for the catastrophic market losses they caused? This leads to a larger question: how should one negotiate, in economics, between the macro and the micro, between the individual and the system—how should one assign blame and enforce accountability? How should one regulate? How should one even think, and represent, the connections between systems and individuals?

    One cultural form that has been particularly good at this negotiation is the novel, which tends to tell a macro-story through its representation of the micro, and so seeks “to capture the reality of a structural, even impersonal, economic and social whole” (24) while also considering “individual investors’ ‘personal impulses’” (31). This is what McClanahan finds in Martha McPhee’s Dear Money (2010), Adam Haslett’s Union Atlantic, and Jonathan Dee’s The Privileges. These novels marry the macroeconomic and the microeconomic; they accomplish what McClanahan presents as a scalar negotiation. One should note, however, that in doing so they keep the behavioralist model intact—for they suggest that individual bad decisions lie at the origin of macroeconomic events. Yet McClanahan shows that as novels Dear Money, Union Atlantic, and The Privileges take on that behavioralist remainder—in other words, the novel’s characteristic “focus on subjective experience and the meaningfulness of being a subject” (33)—through their awareness of their place in the genealogy of the novel. McClanahan’s readings ultimately reveal that the novels she looks at cannot save the individual from what she terms “a kind of ontological attenuation or even annulment” (33) that comes with their account of the 2007-8 crisis. Out go the full characters of the realist novel. The crisis demands it.

    What is left? The chapter culminates in a reading of Dee’s novel in which McClanahan cleverly suggests that the novel explores “the formal limits of sympathetic identification” and tells “money’s” story rather than the story of Adam and Cynthia “Morey” (51), the novel’s main characters. The novel, then, is not so much about behavioralist psychology as about money itself. Capital is remade in the novel, McClanahan argues, “in the image of the human” (52), creating the uncanny effect of human beings who are merely stand-ins for money. Adam Morey/Money “has no agency, and he is all automaton, no autonomy. He has no interiority” (53). McClanahan does not note that this description places Adam in line with American Psycho’s “automated teller”[xvii] Patrick Bateman, who in a famous passage observes that while

    there is an idea of a Patrick Bateman, some kind of abstraction, … there is no real me, only an entity, something illusory, and though I can hide my cold gaze and you can shake my hand and feel flesh gripping yours and maybe you can even sense our lifestyles are probably comparable: I simply am not there.[xviii]

    Like Bateman’s narrative voice, which echoes the abstraction of finance, The Privileges’ voice is that of “investment itself” (52), which swallows human beings up whole.

    If the neoliberal novel, as per Walter Benn Michaels’ analysis (from which McClanahan quotes; 53), reduces society to individuals (and possibly their families, following Margaret Thatcher’s claim), The Privileges as a finance novel goes beyond that and “liquidat[es]” (53) individuals themselves. We are encountering here the terminal crisis of character that writes, in the guise of the realist novel, our financial present. Rich characterization is out. The poor character is the mark of financial fiction.

    Yet, such depersonalization does not capture the full dynamic of financialization either. In Chapter 2, McClanahan draws this out through a discussion of the credit score and its relation to contemporary literature. Although one’s credit score is supposed to be objective, the fact that one can receive different credit scores from different agencies demonstrates that an instability haunts it—and resubjectifies, if not repersonalizes, it. McClanahan starts out with a reading of an ad campaign for a website selling credit reports that quite literally personalizes the scores one can receive. It probably comes as no surprise that one’s ideal score is personalized as a white, tall, and fit young man; the bad score is represented by a short balding guy with a paunch. He also wears a threatening hockey mask.

    McClanahan suggests that what structures the difference here between the objective and the subjective, the impersonal and the personalized, is the difference between neutral credit and morally shameful debt. The former is objective and impersonal; the latter is subjective and personalized. The problem with this distinction, however, is not only that the supposedly objective credit easily lets the subjective slip back in (as is evident from the ad campaign McClanahan discusses); discussions of subjective debt also often lack quantitative and material evidence (when they ignore, for example, “the return in debt collection to material coercion rather than moral persuasion”; 57). Rather than showing how the personal can become “a corrective for credit’s impersonality” and how “objectivity [can become] a solution to the problem of debt’s personalization” (57)—debt always operates on the side of both—McClanahan considers how contemporary literature and conceptual art have turned those issues into “a compelling set of questions to be pursued” (57).

    If rich characterization is out in the finance novel, a question arises: what alternatives emerge for characterization at the crossroads of “credit, debt, and personhood” (57)? As McClanahan points out, this question has a history, in the fact that “the practice of credit evaluation borrowed the realist novel’s ways of describing fictional persons as well as the formal habits of reading and interpretation the novel demanded” (59). The relation went both ways: “the realist novel drew on the credit economy’s models of typification … to produce socially legible characters” (59). Because “quantitative or systematized instruments for evaluating the fiscal soundness” of borrowers were absent, creditors used to rely “on subjective evaluations of personal character” to assess “a borrower’s economic riskiness” (59). Such evaluations used to take a narrative form; in other words, the credit report used to be a story. It provided the kind of characterological detail that readers of literature would know how to interpret. The novel—the information it provided, the interpretation it required—was the model for the credit report.

    Enter the quantitative revolution: in the early 1970s the credit report becomes a credit score, the result of “an empirical technique that uses statistical methodology to predict the probability of repayment by credit applicants” (63). Narrative and character go out the window; the algorithmically generated score is all that counts. It is the end of the person in credit. As McClanahan is quick to point out, however, the credit score nevertheless cannot quite leave the person behind, as the “creditworthiness” that the credit score sums up ultimately “remains a quality of individuals rather than of data” (65). Therefore, the person inevitably slips back in, leading for example to the behavioralist models that McClanahan discusses in Chapter 1. Persons become numbers, but only to inevitably return as persons. McClanahan’s reading of the credit score negotiates this interchange.

    One can find some of this in Gary Shteyngart’s Super Sad True Love Story (2010). If critics have faulted the novel for its caricatures and stereotypes, which “[decline] the conventions of characterization associated with the realist novel” (68), McClanahan argues that Shteyngart’s characters are in fact “emblematic of the contemporary regime of credit scoring” (68). Shteyngart’s use of caricature “captures the creation of excessively particular data-persons”; his “use of stereotype registers the paradox by which a contemporary credit economy also reifies generalized social categories” (71). While the credit score supposedly does not “discriminate by race, gender, age, or class” (71), in fact it does. McClanahan relies in part on Frank Pasquale’s important work in The Black Box Society to note that credit scoring systematizes bias “in hidden ways” (Pasquale qtd. 72)—hidden because black-boxed. This leads McClanahan back to the ad campaign with which she opened her chapter, now noting “its racialization” (72). The chapter closes with a discussion of how conceptual art and conceptual writing about credit and debt have negotiated the issue of personalization (and impersonalization). If “the personal” in Chapter 1 was associated first and foremost with microeconomics and behavioralism (which McClanahan criticizes), McClanahan shows that it can also do “radical work” (77) in response to credit’s impersonalization as “a simultaneously expanded and denaturalized category … representing social relations and collective subjects as if they were speaking persons and thus setting into motion a complex dialectic between the personal and the impersonal” (77). She does this through a discussion of the work of conceptual artist Cassie Thornton and the photographs of the “We are the 99%” Tumblr. Mathew Timmons’ work of conceptual writing, CREDIT, on the other hand, plays with the impersonal to “provide an account of what constitutes the personal in the contemporary credit economy” (89).

    Although McClanahan does not explicitly state this, I read the arc of her Chapters 1 and 2 as a recuperation of the personal from its negative role in behavioralism (as well as from its naturalized, racist role in the credit scoring discussed in Chapter 2), and more broadly from microeconomics. Following Thornton in particular (whose art also features on the cover of Dead Pledges), McClanahan opens up the personal onto the macro of the social and the collective. In Dead Pledges, the novel, and especially the realist novel, turns out to be a productive site for such a project due to the scalar negotiation and rich characterization that are typical of the genre—and in the credit-crisis novel both of those are under pressure. If the novel gradually disappears from Dead Pledges to give way to photography and film in Chapters 3 and 4, the concern with realism remains. Indeed, McClanahan’s book ultimately seems to want to tease out a realism of the credit-crisis era, and it is to that project that I now turn.

    Foreclosure Photography and Horror Films

    In Chapters 3 and 4, once the novel is out of the way, McClanahan’s brilliance as a cultural studies scholar finally shines. Dead Pledges’ third chapter looks at post-crisis photography, and “foreclosure photography” in particular. The term refers to photography of foreclosed homes but evokes the very practice of photography itself, which depends on a shutter mechanism that closes—or rather opens very quickly—in order to capture a reality. This signals a complicity between foreclosure and photography that McClanahan’s chapter explores, for example in a discussion of photographs of forced eviction by John Moore and Anthony Suau, which allow McClanahan to draw out the complicities between photography and the police—but not just the police. She notes, for example, that “[t]he photographer’s presence on the scene is underwritten by the capacity of both the state and the bank to violate individual privacy” (114). Dead Pledges ties that violation of individual privacy to a broader cultural development towards what McClanahan provocatively calls “unhousing” (115), evident for example in how various TV shows allow the camera to enter into the private sanctuary of the home to show how people live. Here, “the sanctity of domestic space [is defended] precisely by violating it” (115). In parallel, the “sacred” financial security of domestic property has been transformed—violated—by the camera seeking to record foreclosure. The home now represents precarity. This development was driven by the creation of mortgage-backed securities, which turned real estate into liquidity and the home into an uncanny abode.

    The chapter begins with a comparative discussion of photographs in which the home is “rendered ‘feral’—overrun by nature” (103). McClanahan considers the narratives that such photography evokes: one is that of the disintegration of civilization into a lawless zone of barbarism—the story of the home gone wild. Looking at the mobilization of this narrative in representations of Detroit, she discusses its biopolitical, racial dimensions. Often the economic hardship that the photographs document is presented as something that happens to other people. But debt today is everywhere—in other words, the “othering” of the harm it produces (its location “elsewhere”) has become impossible. So even though the photographs McClanahan discusses “represent the feral houses of the crisis as the signs of racial or economic Otherness, these photographs ultimately reveal that indebtedness is a condition more widely shared than ever before, a condition that can no longer be banished to the margins of either national space or of collective consciousness” (113). It is us—all of us.

    The last two sections of the chapter deal with the uncanny aspects of foreclosure photography—with the foreclosed home as the haunted home and the uncanny architectural landscape as the flipside of the financial phase that was supposed to “surmount” (135) the crisis of industrial production but actually merely provided a temporal fix for it. Ghost cities in China—cities without subjects, cities whose assets have never been realized, marking the failed anticipation of credit itself—are the terminal crisis of capital. The uncanny, in fact, becomes a key theoretical focus of this chapter and sets up the discussion of horror films in the next: real estate (in other words, the familiar and secure) becomes the site where the foreign and unstable emerges, and as such the uncanny becomes a perfect concept for McClanahan to discuss the home mortgage crisis.

    Far from being real estate, the house, and in particular the mortgaged home, is haunted by debt; so-called “homeowners” are haunted by the fact that the bank actually “owns” their home. Property is thus rendered unstable and becomes perceived as a real from which we have become alienated. In McClanahan’s vision, it even becomes a hostile entity (see 127). At stake here is ultimately not just the notion of property, but a criticism of property and “the inhospitable forms of domestic life produced by it” (105), an undermining of property—and with it a certain kind of “family”—as the cornerstone of liberalism. If McClanahan is critical of our era’s sanctification of the private through a culture of unhousing, her response is not to make the case for housing but rather to use unhousing to expose the fundamental uncanniness of property. With that comes the profanation (as opposed to the sanctification) of the private (as a criticism of inhospitable forms of domestic life). The domestic is not sacred. Property is not secure. Time to get out of the fortress of the house and the violence it produces. If the housing crisis has produced the precarization of the house, let us use it to reinvent domestic life.

    Given the horror film’s long-standing relationship with real estate—think of the haunted house—it was only a matter of time before the 2007-8 crisis appeared in contemporary horror films. And indeed, in the films that McClanahan looks at, it does appear—as “explicit content” (151). One has to appreciate here McClanahan’s “vulgar” approach: she is interested in the ways in which the horror films she studies “speak explicitly to the relationship between speculation, gentrification, and the ‘opportunities’ presented to investors by foreclosure” (151). Unlike, for example, American Psycho, which borrows a thing or two from the horror aesthetic, McClanahan’s horror flicks do not shy away from the nuts and bolts of finance; instead, they “almost [obsessively include] figures and terminology of the speculative economy in real estate” (151). This leads McClanahan to suggest that as horror films, they have “all the power of reportage”: they offer “a systematic account rendered with all the explicit mimetic detail one would expect of a realist novel” (151). At the same time, they do not do the kind of reporting Tom Wolfe was advocating back in the 1980s: indeed, “they draw on the particular, uncanny capacity of the horror genre to defamiliarize, to turn ideological comfort into embodied fear” (151). McClanahan emphasizes, with a nod to Jameson (and his appropriation of Lévi-Strauss’ account of myth[xix]), that this is not just a performance of the “social contradictions” that always haunt narrative’s “imaginary solutions” (151). Instead, the films “oscillate between the imagined and the real or between ‘true stories’ and ‘crazy’ nightmares” (151). There are contradictions here at the levels of both form and content—both representational and material, McClanahan writes—and they remain without resolution. The credit-crisis era requires this sort of realism.

    Darren Lynn Bousman’s Mother’s Day (Anchor Bay, 2010), for example, a remake of Charles Kaufman’s 1980 film, oscillates between competing understandings of property: “as labor and sentimental attachment”; “as nontransferable value and the site of hospitality”; “as temporal and personal”; “as primarily a matter of contingent need” (157). If those all contradict each other, McClanahan points out that what they have in common is that “they are all incompatible with the contemporary treatment of the house as fungible property and liquid investment” (157). Upkeep, sentimental investment, and use all become meaningless when a hedge fund buys up large quantities of foreclosed homes to make a profit from renting them. Such a development marks the end of “ownership society ideology in the wake of the crisis” (158). Like Crawlspace (Karz/Vuguru, 2013), another film McClanahan discusses, Mother’s Day reveals a strong interest in the home as a fixed asset, and in the changes that this asset has undergone due to securitization. By contrast, the two other films that McClanahan looks at, Drag Me to Hell (Universal, 2009) and Dream Home (Edko, 2010), are “more specifically interested in real estate as a speculative asset and in the transformation of uncertainty into risk” (161-2).

    By the time Dream Home ends, with an explicit reference—from its Hong Kong setting—to “America’s subprime mortgage crisis” (170), it is hard not to be entirely convinced that with the horror film, McClanahan has uncovered the perfect genre and medium for the study of the representation of the home mortgage crisis. It is here that realism undergoes its most effective transformation into a kind of horrific realism, or what I propose to call realist horror, an aesthetic that, like so much else when it comes to finance, cannot be easily located but instead oscillates between different realms. Indeed, if Dream Home provides key insights into the home mortgage crisis in the U.S., it is worth noting that it does so from its Chinese setting, which McClanahan takes to indicate that many of the changes that happened as part of financialization from the 1970s to the present in the U.S. in fact “occurred first in Asia” (174). This opens up the American (U.S.) focus of McClanahan’s book onto the rest of the world, raising some questions about the scope of the situation that Dead Pledges analyzes: how global is the gloomy, even horrific picture that McClanahan’s book paints? This seems particularly important when it comes to imagining, as McClanahan does in the final part of her book, political responses to debt.

    Debt and the Revolution

    While the home mortgage is McClanahan’s central concern, Dead Pledges closes with a political Coda about student debt. If McClanahan returns here to student loans (a topic that she had already addressed in Chapter 2), it is because they are perhaps the representative example of the securitized debt markets that she has discussed. Given the staggering amount of student debt, the low-regulation environment of student loans, and the default rate on student loans, it is likely that the next major market crash will result from the collapse of the securitized student debt market. It is worth noting, indeed, that some are already shorting this market in the hopes of making a profit from its collapse a few years down the line (The Bigger Short, anyone?). In this situation, McClanahan proposes “sabotage”: like several others, most prominently the Strike Debt movement, she is calling on students to stop paying their debts. As the Strike Debt movement puts it: “If you owe the bank $10,000, you’re at the mercy of the bank. If you owe the bank $10 million, the bank is at your mercy”.[xx] Today, banks are at the mercy of students through the massive amounts of student credit that have been extended.

    McClanahan arrives at this politics of sabotage through her discussion of the collapse of the home mortgage market, and specifically of foreclosure. In the first part of her Coda, she discusses how people have responded to their homes being foreclosed by “acts of vandalism”, like “punch[ing] holes in the walls”, leaving “dead fish to rot in basements”, or breaking “pipes to flood their houses with water or sewage”, which she singles out as a “clever” way of “turning the problem of their home’s asset illiquidity on its head” (186). If these are acts of sabotage, it is because they “[remove] commodities from circulation or [block] the paths by which they (and money) circulate” (186). McClanahan embraces this tactic. From this vantage point, one can understand why, as someone reported to me recently after a visit to Greece, the banks there are holding off on foreclosing on those who have defaulted on their mortgages: by keeping the owners in their homes, the banks are trying to guarantee the protection of their assets—this is clearly the better option especially in view of the absence of renter or buyer demand for the apartments or homes whose owners have defaulted. For the moment, the banks in Greece are paying their borrowers for the maintenance of the bank’s assets.

    A couple of things are worth noting: first, “vandalism” or the destruction of an object does not necessarily coincide with the destruction of that object as a commodity. Indeed, if to destroy the object as commodity is to take it out of circulation—as McClanahan, following Bernes (following Marx), argues (186)—then the question is first and foremost how to block that circulation—and that might involve acts of vandalism, or not. In fact, one might imagine the situation in Greece, which involves labor being invested in the upkeep of a home, ultimately leading to a property claim—to taking the home out of the circulation that makes the bank its money. McClanahan considers such an understanding of property in her reading of Mother’s Day in Chapter 4. However, McClanahan is taking aim at the root of property (as becomes clear in both Chapters 3 and 4), and so the latter might not be a satisfactory solution since it keeps the notion of property intact. In addition, one might want to ask whether the home is the appropriate target for the vandalism. Why not sabotage the bank’s plumbing instead? Leave some fish to rot in the bank’s basement?

    Secondly, in the case of student loans, what is the asset to vandalize? The asset that students acquire through loans is “education.” It is an asset that the bank cannot reclaim, although of course the diploma that formalizes it can be and has been taken away. But it is not inconceivable that, if the home mortgage crisis is the model here, the institutions and people providing an education will be vandalized: universities, professors, administrators—rather than the banks. And some (Trump University comes to mind) would certainly deserve it. At my own (private arts) institution, where tuition is currently set at a whopping $46,830, I have seen posters in which students bitterly thank the university president for their student debt or claim that the only thing that unites them as students is their debt. If the students were to look at the institution’s budget more closely, they would see that it is tuition-driven: specifically, the pie chart clearly shows that (debt-based) tuition pays the faculty’s salaries. This pits the students not only against the university president or other administrators (whose salaries, needless to say, far exceed those of the faculty) but ultimately against the faculty. McClanahan notes that faculty retirement may also be involved: Student Loan Asset Backed Securities (or, in finance’s inexhaustible generation of acronyms, SLABS) are “tranched and sold to institutional investors, including many pension funds” and so “it’s possible for a professor at a university to be indirectly invested in the debt of her own students” (189). Not just in the present, through their salary, but also in the future, for their retirement.

    It is important to argue about student debt, and some faculty—like McClanahan—are bringing that argument into their classrooms. But it will be interesting to see how that develops once the student debt market collapses and faculty salaries and retirement implode. Kant’s answer to the question “What is Enlightenment?” was famously split between “Argue all you please” and “Obey.” What happens if, in this particular case, the students stop obeying? Unless they identify the agent of their subjection correctly—faculty? administrators? university president? university? bank? government? President?—it might ring the death knell of the U.S. university system. Of course, that may have been the project all along—now with the students as its driving force.

    It is the political dimension of McClanahan’s book—somewhat disavowed in the introduction, where McClanahan notes early on that “Dead Pledges is not a work of political economy” (15), but prominent in the Coda—that may leave some readers frustrated. This is, on the one hand, because the Coda makes a comparative move from home mortgages to student loans that does not come with the nuanced discussion of economics that McClanahan develops elsewhere in the book (there is no consideration, for example, of how CDOs and SLABS differ: does it make sense to short SLABS? Why? Why not?). Yet the economic specifics may be important when trying to decide on the most effective political response. The specific response that Dead Pledges offers—sabotage—may prove equally frustrating. While sabotage can be effective as a politics that would break financialization’s extraction of value through circulation, it remains, ultimately, a negative intervention that temporarily interrupts or (perhaps in some cases definitively) destroys its targets. But it seems obvious that, as far as politics goes, that response can hardly be sufficient; some kind of positive engagement would be required to imagine the world that will come after it. One would need to ask, in other words, about the “affirmative”[xxi] dimension of the sabotage that is proposed here.

    In a review[xxii] of Wendy Brown’s book Undoing the Demos: Neoliberalism’s Stealth Revolution,[xxiii] McClanahan has criticized Brown on a number of counts: first, because of her largely negative description of the collective as something that neoliberalism destroys; and second, because through that description Brown uncritically projects a pre-neoliberal collective that was somehow unaffected by economic pressures. Sarah Brouillette, with whom McClanahan recently teamed up to respond to the Yale hunger strike,[xxiv] has made a similar point.[xxv] As far as positive descriptions of collectivity go, however, McClanahan’s sabotage may also leave one dissatisfied. Furthermore, one may wonder whether the turn to sabotage as a politics is not partly a consequence of Dead Pledges’ focus on the United States. When considering political responses to the debt crisis, it might be the limits and limitations of that focus—a version of the “there is no alternative” often associated with neoliberalism—that prevent any consideration of, say, the state’s potentially positive roles in processes of financial regulation or even wealth redistribution. Is sabotage the only politics that the left has left in the U.S.? Might not other parts of the world—certain countries in Europe, certain countries in Latin America—offer alternatives from which the left in the U.S. could learn? I am not being naïve here about what I am proposing: it would require fundamental political changes in the U.S. for this to come about. But again, are those changes entirely beyond the American (U.S.) left—so much so that the political imaginary stops at sabotage? Who was it again that rhymed “sabotage” with “mirage”? Sabotage should target the mirage, to be sure; but does the rhyme also evoke sabotage’s complicity with the mirage?
Has leftist politics really come down to leaving dead fish to rot in the basements of what used to be our homes? Of course, it may be unfair to expect that those who are defaulting on their mortgages become the agents of the leftist revolution. But what about the students who emerge as the political subjects of our time at the end of McClanahan’s book? Let us focus, post-sabotage, on what other universities they might imagine—what other states.

    I am thinking of what another revolutionary says during that famous rooftop conversation in Gillo Pontecorvo’s The Battle of Algiers (Rizzoli, Rialto Pictures, 1966):

    It’s hard to start a revolution. Even harder to continue it. And hardest of all to win it. But, it’s only afterwards, when we have won, that the true difficulties begin.

    Work in critical finance studies often recalls how it has become easier for us to imagine the end of the world than the end of capitalism.[xxvi] Point taken. But Pontecorvo’s film can help one adjust this position: yes, it is hard to imagine the end of capitalism; but it is even harder to imagine the world that will come after it.

    There is probably no point in worrying, as I will admit I do, about that world and the “true difficulties” that it will bring. Such worrying may prevent one from starting a revolution in the first place. Best to focus on the battle at hand. Certainly, McClanahan’s Dead Pledges provides the perfect impetus.

    I would like to thank Paul Bové and Sarah Brouillette for their generous editing of this review. 

    Notes

    [i] In an article titled “The Plutocratic Imagination”, Jeffrey J. Williams notes for example that “[s]ince the mid-2000s there has also been a spate of novels foregrounding finance” (Williams, “The Plutocratic Imagination.” Dissent 60:1 (2013): 96).

    [ii] David Foster Wallace may appear to be the odd one out in this list but Jeffrey Severs’ recent David Foster Wallace’s Balancing Books: Fictions of Value (New York: Columbia UP, 2017) justifies his inclusion.

    [iii] La Berge, Leigh Claire, and Alison Shonkwiler, eds. Reading Capitalist Realism. Iowa City: U of Iowa P, 2014. 1.

    [iv] One can think here of Alberto Toscano and Jeff Kinkle’s book Cartographies of the Absolute (Winchester: Zero Books, 2015), which took its inspiration from Fredric Jameson’s work on these issues.

    [v] Breitbach, Julia. Analog Fictions for the Digital Age: Literary Realism and Photographic Discourses in Novels after 2000. Rochester: Camden House, 2012. 8.

    [vi] Wolfe, Tom. “Stalking the Billion-Footed Beast: A Literary Manifesto for the New Social Novel.” Harper’s Magazine Nov. 1989, 45-56. Here 52. Using a nickname that was used on the Salomon Brothers trading floor to refer to those who had made a monster bonds trade, Michelle Chihara aptly termed this kind of realism “big swinging dick realism” in a review of La Berge and Kornbluh’s books about financial fiction. See: Chihara, Michelle. “What We Talk About When We Talk About Finance.” Los Angeles Review of Books, 09/18/2015, accessible: https://lareviewofbooks.org/article/what-we-talk-about-when-we-talk-about-finance/.

    [vii] See, for example: La Berge, Leigh Claire. “The Men Who Make the Killings: American Psycho and the Genre of the Financial Autobiography”. In: La Berge, Scandals and Abstraction: Financial Fiction of the Long 1980s. Oxford: Oxford UP, 2015. 113-147. Here in particular 139.

    [viii] Nealon, Jeffrey T. Post-Postmodernism: Or, The Cultural Logic of Just-In-Time Capitalism. Stanford: Stanford UP, 2012. The famous analysis evoked in the previous part of the sentence is of course Fredric Jameson’s.

    [ix] Kornbluh’s book, among others, testifies to this: Kornbluh, Anna. Realizing Capital: Financial and Psychic Economies in Victorian Form. New York: Fordham UP, 2014.

    [x] Note that Marx already singled out this shorter version as the formula for “interest-bearing capital”, a situation in which money begets more money without the intermediary of the commodity: Marx, Karl. Capital: Vol. 1. Trans. Ben Fowkes. New York: Penguin, 1990. 257. A discussion of M-M’ as the financial variation of the general formula of capital can be found for example in: Marazzi, Christian. Capital and Language: From the New Economy to the War Economy. Trans. Gregory Conti. Los Angeles: Semiotext(e), 2008.

    [xi] Jameson, Fredric. “Culture and Finance Capital.” In: The Cultural Turn: Selected Writings on the Postmodern 1983-1998. New York: Verso, 2009. 154.

    [xii] Graeber, David. Debt: The First 5,000 Years. New York: Melville House, 2011; Martin, Felix. Money: The Unauthorized Biography—From Coinage to Cryptocurrencies. New York: Vintage, 2015.

    [xiii] Marx, Karl. Capital: Vol. 3. Trans. David Fernbach. New York: Penguin, 1991. 596.

    [xiv] Sassen, Saskia. Expulsions: Brutality and Complexity in the Global Economy. Cambridge (MA): Harvard UP, 2014.

    [xv] See, for example, the already mentioned book by Marazzi or also: Berardi, Franco “Bifo.” Precarious Rhapsody: Semiocapitalism and the Pathologies of the Post-Alpha Generation. London: Minor Compositions, 2009; Berardi, The Uprising: Poetry and Finance. Los Angeles: Semiotext(e), 2012.

    [xvi] Clover, Joshua. “Autumn of the System: Poetry and Financial Capital.” JNT: Journal of Narrative Theory 41:1 (2011): 34-52.

    [xvii] This is how La Berge has perceptively analyzed American Psycho’s mode of narration: La Berge, Scandals, 136.

    [xviii] Ellis, Bret Easton. American Psycho. New York: Vintage, 1991. 376-377.

    [xix] See: Jameson, Fredric. The Political Unconscious: Narrative as a Socially Symbolic Act. Ithaca: Cornell UP, 1981.

    [xx] McKee, Yates. “DEBT: Occupy, Postcontemporary Art, and the Aesthetics of Debt Resistance.” South Atlantic Quarterly 112:4 (2013): 784-803. Here 788.

    [xxi] I borrow the notion of “affirmative sabotage” from Gayatri Chakravorty Spivak. See, for example: Evans, Brad (interview with Gayatri Spivak), “When Law is Not Justice”, 07/13/2016, accessible: https://www.nytimes.com/2016/07/13/opinion/when-law-is-not-justice.html?_r.

    [xxii] McClanahan, Annie. “On Becoming Non-Economic: Human Capital Theory and Wendy Brown’s Undoing the Demos.” Theory & Event, forthcoming.

    [xxiii] Brown, Wendy. Undoing the Demos: Neoliberalism’s Stealth Revolution. New York: Zone Books, 2016.

    [xxiv] Brouillette, Sarah, Annie McClanahan, and Snehal Shingavi. “Risk and Reason/The Wrong Side of History: On the Yale University Unionization Efforts”, 05/16/2017, accessible: http://blog.lareviewofbooks.org/essays/risk-reasonthe-wrong-side-history-yale-university-unionization-efforts/.

    [xxv] Brouillette, Sarah. “Neoliberalism and the Demise of the Literary.” In: Huehls, Mitchum and Rachel Greenwald-Smith, eds. Neoliberalism and Contemporary Literature. Baltimore: Johns Hopkins UP, forthcoming. The uncorrected page proofs with which I am working are numbered 277-290.

    [xxvi] The statement is usually attributed to Fredric Jameson.

    Arne De Boever teaches American Studies in the School of Critical Studies at the California Institute of the Arts, where he also directs the MA Aesthetics and Politics program. He is the author of States of Exception in the Contemporary Novel (2012), Narrative Care (2013), and Plastic Sovereignties (2016), and a co-editor of Gilbert Simondon (2012) and The Psychopathologies of Cognitive Capitalism (2013). He edits Parrhesia and the Critical Theory/Philosophy section of the Los Angeles Review of Books and is a member of the boundary 2 collective. His new book, Finance Fictions, is forthcoming with Fordham University Press.

  • Racheal Fest — What Will Modernism Be?

    Racheal Fest — What Will Modernism Be?

    by Racheal Fest

    This essay has been peer-reviewed by the boundary 2 editorial collective. 

    The absence of imagination had itself to be imagined.

    — Wallace Stevens, “The Plain Sense of Things”

    US academics have expanded “modernism.” In a founding PMLA article, Douglas Mao and Rebecca Walkowitz (2008) gather under the rubric “The New Modernist Studies” (NMS) a range of contemporary scholarly activities they argue expand both modernism’s canon and the methods scholars employ when they examine it. More recently, Sean Latham and Gayle Rogers (2015) consolidate these practices and give them a history in Modernism: The Evolution of an Idea. As a look at these documents of self-presentation reveals, scholars loosely affiliated with NMS often imagine their academic work opposes from the left contemporary forces that produce inequality in the US and beyond. This essay reviews these documents and asks whether or not the expansionist methods the New Modernist Studies endorses can fulfill the political desires its practitioners share. It takes up a version of the self-reflexive project Raymond Williams urged upon a previous generation of oppositional academics. “[C]ultural theory,” Williams wrote in 1986, “which takes all other cultural production as its appropriate material, cannot exempt itself from the most rigorous examination of its own social and historical situations and formations, or from a connected analysis of its assumptions, propositions, methods, and effects” (Williams [1986] 1989, 163). Williams encouraged critics, scholars, and historians of culture who believed they carried out radical work to train their field’s critical resources upon their own activities.

    The New Modernist Studies deserves attention of this kind not only because its practitioners claim they have transformed the study of early-twentieth-century literary and cultural texts. NMS also typifies some of the guiding methods, values, and goals that animate contemporary literary studies across subfields. Because the study of literature in US universities emerged at once alongside and by way of the poems and novels we associate with modernism, NMS’s practitioners perform again for the present what has become a familiar scholarly gesture. To reflect upon the nature and value of modernism, critical histories of the term indicate, has been to reflect upon—and to make a case for one view of—the nature and value of academic and critical literary activity itself.[1] Although critics and scholars devoted primarily to this period no longer lead the profession, modernists share with others across subfields (and perhaps, disciplines) the hope that US academic activity might have broader social and political effects. Many also share the sense that a primary way to produce desired effects is to expand canons and revise conservative methods previous generations of literary critics established. If these common assumptions sometimes serve, rather than counter, the state and market interests that perpetuate contemporary inequality across economic and identity categories, as I suggest in what follows they may, the field might embrace alternative approaches across areas of specialization.[2] A troubling gulf separates the progress narratives left academics proliferate for a privileged audience of peers and students inside the US university from the narratives of increasing inequality that today pervade other domains of life in the US and beyond.[3] Recognizing this gulf might encourage oppositional critics to think beyond the self-regulating and self-justifying habits of professional life.

    The New Modernist Studies

    The New Modernist Studies, according to Mao and Walkowitz (2008), describes as “modernist” an increasingly broad set of materials. Over “the past decade or two,” they explain, “all period-centered areas of literary scholarship have broadened in scope,” and so “modernist literary scholarship” has likewise expanded in “temporal, spatial, and vertical directions” (737). Along a “temporal” axis, such scholars as Susan Stanford Friedman extend modernism’s reach beyond the late nineteenth and early twentieth centuries.[4] Jahan Ramazani and others associated with a “transnational turn” (744) attempt to “make modernism less Eurocentric by including or focusing on literary production outside Western Europe” (739). Still others—those who expand scholarship along society and culture’s “vertical” axes—no longer understand modernism as “a movement by and for a certain kind of high (cultured mandarins) as against a certain kind of low” (738). These scholars examine “reportage,” “propaganda,” and “news” alongside artworks and objects of mass culture (746).[5]

    Mao and Walkowitz suggest these diverse practices together constitute a common oppositional project. The New Modernist Studies aims to “disrupt” and alter the conservative methods for organizing and evaluating literary texts that dominated US literary studies in the past (738). When Mao and Walkowitz celebrate monographs that emphasize “modernism’s entanglement . . . with . . . feminism, socialism, nationalism, and other programs of social change” (737) or colleagues who “encounter with fresh eyes and ears” artworks “by members of marginalized social groups” (738), they indicate that powerful desires for social, political, and economic equality, at home and abroad, drive the disciplinary transformations NMS sanctions.

    As some of the major studies Mao and Walkowitz cite make clear, many NMS scholars hope their expansive activities will serve broader left agendas of this kind not only within the discipline of literary studies, but also, outside of the university. A moment in Jahan Ramazani’s acclaimed study, A Transnational Poetics (2009), exemplifies this desire. Ramazani gives new expression to the anti-nationalist and anti-colonial dreams such modernist writers as Claude McKay, Aimé Césaire, and Frantz Fanon first voiced when he describes what motivates his book:

    I write from within the early twenty-first-century US academy, when the most consequential nationalism in the world is American, when assumptions about civilizational differences sometimes underwrite political discourse and even projections of US military forces abroad. Under these circumstances . . . the usefulness of . . . pluralizing and creolizing our models of culture and citizenship, should not be underestimated. . . . A nuanced picture of cross-national and cross-civilizational fusion and friction is badly needed today, and denationalized disciplines in the humanities may help provide it, however limited their extra-institutional reach. (48–49)

    Ramazani hopes his scholarship contributes to vital efforts contemporary state violence requires of those who would combat it. He wants to counter imperial logics that devalue difference across the globe and in so doing license the US state to ruthlessly pursue its own interests. Literary scholars, he argues, might serve this project for equality by expanding, diversifying, and “denationalizing” their own disciplines inside US universities. This moving call for political change represents NMS’s determination to produce from inside of literary studies the new ways of thinking and being contemporary conditions demand.

    At the same time, however, Ramazani registers an anxiety that today pervades both the New Modernist Studies and literary studies in general. Ramazani is confident increasingly plural “models of culture and citizenship,” such as those he finds in the poems of the past and present, can counter the ways of thinking he believes perpetuate global inequities. And yet, he wonders whether or not he and other academics can finally contribute to this “extra-institutional” project when they revise disciplinary practices. When Ramazani emphasizes his position “within the early-twenty-first-century US academy”—he works inside a department (University of Virginia’s Department of English) and within one or more subfields (“modernist” and “postcolonial” poetry) of an already specialized area of study (literary studies)—he does so in order to at once identify his sphere of influence and to express doubts about the final significance of the activities he carries out within it. He speaks passionately for a disciplinary change his political commitments inspire, but he also worries about the restricted reach of the change he proposes.

    If we take seriously this consummate anxiety—and the urgency of the social and economic inequalities critics want to redress demands we do—we might pick up where Ramazani leaves off and investigate its sources more fully. To do so, I turn now to Modernism: The Evolution of an Idea, NMS’s longest and most ambitious document of self-presentation. Latham and Rogers’s book at once introduces the series, “New Modernisms,” which the authors edit for Bloomsbury’s academic imprint, and tells a story about professional progress that culminates in the New Modernist Studies. It develops an extended version of the narrative of expansion Mao and Walkowitz first sketched and fills in the academic history necessary to understand it. I believe a critical reading of this history, which tracks alongside NMS’s celebrated expansion a tandem movement of contraction, helps explain literary studies’ broader disquiet.

    What “Modernism” Was and Is

    Modernism: The Evolution of an Idea is the most recent contribution to the special genre of articles and monographs academics have dedicated to defining modernism.[6] It also gives an overview of this genre. The book describes and organizes the twentieth century’s many accounts of modernism before endorsing in conclusion the New Modernist Studies’ expanded vision of it. The writers display deep and wide expertise as they move nimbly over more than a century’s worth of fraught material. They offer students and colleagues a thorough overview of the debates that have constituted the field they call “modernist studies.”

    A new version of the genre’s definitional question—first posed by Harry Levin (1960) in the essay “What Was Modernism?”—guides the book. In their introduction and conclusion, Latham and Rogers ask: “What is modernism?” (1). Posing the question this way prepares them to develop a response importantly different from those previous critics generated. “Modernism” is no longer a proper noun, as it was for Levin’s generation, so readers know right away the authors will not try to describe a period’s dominant style and make big claims about Western life based upon it. By asking what modernism is, Latham and Rogers remind readers the term shares with all such constructions its perpetually unfinished character, and critics will always have to define it anew to serve present interests. They thus break with an earlier generation of critics Maurice Beebe (1974) typifies when he extends to readers mourning “the passing of the greatest literary age since the Renaissance” this small comfort: “we can now define Modernism with confidence that we shall not have to keep adjusting our definition in order to accommodate new visions and values” (1076).

    In order to answer their question anew for twenty-first-century readers—as Latham and Rogers do in their fourth and final chapter on the New Modernist Studies—the authors tell us first what modernism used to be. They begin with the term’s emergence in the late nineteenth and early twentieth centuries. At first, they explain, “modernism” circulated widely, freely, and polemically among “writers, artists, and thinkers around the world,” all of whom “believed that something was happening, that the established conventions of realism, representation, and poetic form seemed to be failing in the face of new experiences, new audiences, and new things” (8). Usual suspects T. S. Eliot, Ezra Pound, James Joyce, and others argued in this period over modern art’s nature and value, in part, Latham and Rogers emphasize, as a way to secure a legacy for their own experimental works.

    Latham and Rogers next describe what we might understand to be the original contraction upon which their narrative of expansion depends. In chapter two, “Consolidation,” academics step in to settle artists’ charged, vital, and international quarrels. By the mid-twentieth century, the authors explain, the so-called New Critics moved modernism’s artworks out of the “bohemian garrets and ateliers” from which they had emerged and installed them in “college classrooms and student anthologies” (19). Borrowing a figure from Joyce, Latham and Rogers say this generation of critics understood modernism to be “a ‘strandentwining cable’ that weaves together a distinct group of writers and artists around shared aesthetic practices” (7). The New Critics and their kin, in other words, revered an exclusionary canon of difficult, formally sophisticated, and willfully apolitical literary works (mostly) white European and American men composed. In so doing, they “silenc[ed] the voices of artists marginalized by gender, race, sexuality, and geography” (207).

    “Iron Filings,” Latham and Rogers’s third chapter (named for a figure they take from Pound), maps the slow demise of this conservative vision. The authors explain how critics writing in the 1970s and 1980s—Edward W. Said, Raymond Williams, Fredric Jameson, and others populate their account—first challenged from the left the modernist canon and its attendant sense of art’s autonomy. The chapter glosses work by “feminists,” “Marxists,” “black modernists,” and “postmodernists” (103–49). These groups, Latham and Rogers argue, began “to move modernism away from the relative autonomy of aesthetic difficulty and toward a broader engagement with political and social issues that inhere within an increasingly global modernity” (14). Scholars and critics came to examine diverse texts and develop worldly and historical views of art. Latham and Rogers laud these efforts and find in them the origins for the work the New Modernist Studies advances.[7] These earlier oppositional efforts do not satisfy them, however. This generation, they argue, still focused too often upon the virtues of difficult, formally experimental texts elites composed, failed to privilege works for “identitarian” reasons (8), or promulgated esoteric theories of language with dubious claims to legitimacy (14).

    Enter the New Modernist Studies. This loosely affiliated movement, Latham and Rogers explain in their final chapter, emerged in the 1990s to overcome these failures and complete the oppositional project. NMS of course does so by expanding modernism’s materials along the spatial, temporal, and vertical axes Mao and Walkowitz name. Contemporary scholars let speak, on syllabi and in academic journals, those diverse voices literary studies once silenced. They devote increased attention to “women’s experiences of modernity” (161), promote “new awareness of the multiple ways in which homosexuality and queerness defined and constituted many of the works we now call ‘modernist’” (163), and treat race as a vital “part of a larger network of forces, practices, and identities” (168). As part of the same effort to displace elite texts, NMS makes new archives available to period specialists. It “attempt[s] to synthesize rather than to bracket or isolate forms of cultural expression across multiple media and throughout the world” (149–50). Examining a range of media forms, NMS scholars believe, unseats literature as an exclusive activity and affirms that other texts deserve critical attention. NMS scholars also continue to explore art’s many entanglements with history’s forces.

    When Latham and Rogers ask themselves one last time the book’s guiding question—“what is modernism?”—they answer it in a way they believe does justice to the radical openness these expanded practices affirm. They leave readers with this “desultory, if nevertheless provocative answer: ‘We don’t know’” (206). The New Modernist Studies, they say, accepts that “there is, finally, no right way to define modernism, just as there is finally no right way to carve up the rich multiplicity of human expression” (207–8). Because the New Modernist Studies is neither a movement nor a method, but rather “the collective work of thousands of scholars,” it generates conclusions that have been and are likely to be in the future “ultimately incommensurable” (149). The book’s final Whitmanian gesture accepts these contradictions in order to applaud expansion itself as a final good. NMS dispenses with the canon, the period container, and the category of the literary as identifiable features of the object it investigates. In so doing, contemporary scholars believe they fulfill a narrative of advancement earlier critics set in motion, but could not complete. According to the New Modernist Studies, fundamental indeterminacy itself constitutes a decisive victory for the left.

    This is a happy story. US academics have today completed a project decades in the making, and the left has at last triumphed inside humanities departments. And yet, as canons, periods, and materials have expanded inside US literary studies, the same narrative of inclusion and progress has not unfolded outside the university, as the 2016 US election made clear. If NMS’s practitioners hope the transformations for which they work within their field can contribute to broader political, social, and economic projects for equality, the radical divergence of these two chronologies might provoke oppositional scholars to examine anew the conviction that indeterminacy is itself a self-evident and absolute good. (This is not to suggest literary studies produced, or alone might have prevented, current emergencies. The profession’s progressive victory narrative simply sounds an eccentric note against the right’s rise.)

    A figure Latham and Rogers select to represent the New Modernist Studies helps us identify one possible source for this distressing incongruity and thus points the way to alternative projects. In their final chapter, the authors describe the new core exhibition Catherine Grenier curated for Paris’s Centre Pompidou in 2013. Under Grenier’s direction, they write, the Pompidou has traded its “canonical and almost exclusively Eurocentric understanding of modernism”—and the modes of display conventional to it—for a new logic of exhibition:

    Crucially, the museum abandons a narrative of development and opts instead simply to display as diverse an array of materials on the walls as it can. Picassos rub shoulders with architectural models from Brazil, Japanese prints, and paintings by the Moroccan artist Farid Belkahia—all placed against wallpaper made from hundreds of little magazines. (150–51)

    This exhibition style, Latham and Rogers believe, represents something essential about the current state of their field. It signals “we are in an ‘interrogatory’ moment that invites us to ask anew about the range, constitution, and value of modernism” (151). A viewer standing before this display, in other words, stands in the figural space the contemporary modernist scholar occupies.

    This figure should be familiar to expert readers of modern discourses. A genealogy of artists, critics, and philosophers proliferated versions of it in the late nineteenth and early twentieth centuries. (Marx and Nietzsche give two of the most famous accounts[8]). When Latham and Rogers invoke it here, they remind readers contemporary US academics confront under their own peculiar circumstances the prototypical dilemma “modern” minds face. The scholar stands before his materials as Walter Benjamin’s ([1940] 1968) “angel” stands before history’s ruins or as Wallace Stevens’s ([1942] 1997b) “Man on the Dump” straddles culture’s dross. To be “modern,” figures of this kind suggest, is to be aware one is a historical being that creates a future out of a past by evaluating materials in the present. It is also to face perpetually the crushing problems proper to this condition, among them, the knowledge that whatever sense one makes of the past will itself one day end up on history’s junk heap.

    The figure Benjamin invents to exemplify this dilemma, the “angel of history,” differs from the cheerful twenty-first-century modernist Latham and Rogers find in the museum, gazing raptly at the walls. The contrast is instructive. In “Theses on the Philosophy of History,” Benjamin’s angel, a trope for the radical or oppositional historian of culture, experiences the modern subject’s constituting crisis. He looks back upon a past that fills his entire perceptual field, a past he perceives as a “pile of debris” that “grows skyward” (257). As he gazes upon history’s ruins, he experiences a deep and awful longing. He wants nothing more than to “stay, awaken the dead, and make whole what has been smashed” (257–58). Confronted with a disorganized mass of cultural relics that overwhelms him, the radical historian must decide what to do with these materials in order to serve present needs. He wants desperately to make sense of, and in so doing redeem for the present, the violent and destructive chaos of human activity we call history.

    Tragically, though, a twofold danger frustrates the oppositional historian’s efforts. He knows, first, that the objects of the past that appear before him, many of which other historians regard as evidence of progress, do not enter his field of attention untouched by powerful interests. On the contrary, the same conditions of brutal inequality he hopes to oppose produce and pass along the “cultural treasures” others believe signal advancement (256). The historian therefore regards with suspicion both privileged works and the means by which they are “transmitted from one owner to another” (256). He believes that “even the dead will not be safe” from ruling interests, so he tries to wrest from them both revered and disdained objects (255).

    At the same time as the radical historian struggles to protect the dead, he also struggles to protect himself. While the angel attempts to recover out of the past resources for the present, a “storm irresistibly propels” him “into the future to which his back is turned” (258). The angel cannot easily reinterpret or redeem the ruins because, catastrophically, he is enmeshed himself within the very history he wants to grasp and transform. Just as cruel interests produce, organize, and preserve history’s materials for their own purposes, so too do present conflicts and conditions always over-determine the radical historian’s work. As Benjamin puts it, the “same threat hangs over both [the content of the tradition and its receivers]: that of becoming a tool of the ruling classes” (255). (This is in part why Benjamin imagines only a messianic figure, who stops time, can complete the revolutionary historian’s effort.[9])

    As Latham and Rogers’s version of this figure indicates, the contemporary scholar of modernism faces with satisfaction the conditions the angel meets with horror and yearning. This is in part because, by its own account, the New Modernist Studies believes it has fulfilled Benjamin’s charge to protect the dead from ruling interests. While Latham and Rogers are critical of Theodor Adorno and those friends of Benjamin’s who “effectively helped build the modernist canon and affirmed its terms,” they are grateful to Benjamin because he “offered a set of tools and perspectives for undoing that work” (106–7). The New Modernist Studies believes it has secured, in the figurative space of the institution the museum signifies, what Benjamin’s angel desperately wanted—time and venue to stay and awaken the dead, to recover and let sound out of the past’s ruinous violence excluded songs.

    If the New Modernist Studies protects the dead from ruling interests, however, it does not protect itself. NMS does not recognize, as Benjamin insists oppositional critics and scholars must, that it faces the same danger as do its objects. As the New Modernist Studies fulfills Latham and Rogers’s disciplinary progress narrative of expansion, it also completes the book’s corresponding narrative of contraction. The story about modernism’s enlargement, Latham and Rogers explicitly say, is also a story about its total “institutionalization and professionalization” (134). While some artists and critics in the early twentieth century “conceptualized [modernism] as a site of resistance to modernity’s regulatory and routinizing practices,” Latham and Rogers write, modernism has by 2015 “become part of an institutional system” (15). Today, modernism is, among other things, “an institutionalized profession, self-regulating and fitted somewhat uncomfortably between the nineteenth century and the always-moving present” (207). NMS finds “its strongest support and articulations in the institutions of academia: conferences, journals, scholarly organizations, and course catalogs” (156). The profession, in other words, with its self-directed procedures for formal training, publication, and credentialing, furnishes the domain within which NMS’s progress narrative can register as meaningful.

    Attention to the contraction upon which expansion depends reveals a profound contradiction that legitimates the New Modernist Studies. As scholars have worked to extend modernism’s materials and to abandon dated claims about art’s independence from political and economic forces, they have at the same time embraced the apparent autonomy the profession seems to tender those (increasingly few) humanities academics universities employ. (Latham and Rogers [2015] note the number of tenure-track positions for specialists in modernism US universities advertise has declined in recent years [157]). The profession creates a seemingly sovereign space in which a fortunate few can freely play over an extended set of materials.

    Inside this apparently secure and exclusive domain, the fundamental indeterminacy Latham and Rogers hail as itself an achievement for the left performs another function entirely. Undirected expansion turns out to be a condition for the possibility of professional activity in the present. “Modernist studies,” Latham and Rogers explain, “has been strengthened by the lack of resolution over what exactly modernism is. A perpetual ‘definitional crisis’ has been a boon, in other words, to the wide-ranging debates about the field’s nature, boundaries, and contents” (151). This permanent emergency enables academics to produce scholarship an audience of like-minded period specialists will value. The authors celebrate the remarkable volume of discourse academics continue to publish out of the field’s authorizing crisis: “Even in the troubled world of academic publishing, studies of modernism, anthologies of modernist texts, introductions to the movement, essay collections on modernism and its formation, and other such texts have flourished since the mid-1990s, far outpacing the analogous publications in the 1960s and 1970s that helped entrench the field in universities” (156). The New Modernist Studies finally presents itself as an interminable (and profitable) set of classificatory squabbles elites with common aims perpetuate, but need not resolve, inside protected institutions.

    This insular vision of US intellectual activity is not exactly new, and its consequences are not newly dangerous. In the well-known essay “Reflections on American ‘Left’ Literary Criticism,” Edward W. Said (1983) warned literary critics that the so-called “culture wars” of the 1970s and 1980s might not produce the outcomes across culture and society rival factions on the right and left desired.[10] Drawing upon Antonio Gramsci’s prison writings, Said (1983) argued universities, as institutions located within “civil society,” cannot furnish a protected vantage point from which critics on the left might attack state and market interests (175). The concept of “culture,” as Raymond Williams (1983) has demonstrated, emerged in tandem with and as an instrument of the nation-state. Therefore, Said argues, a critic “acting entirely” within the traditionally restricted humanist “domain” of the “literary specialist” does not destroy, but rather “confirms the culture and the society enforcing those restrictions” (175). This “confirmation,” Said writes, “acts to strengthen the civil and political societies whose fabric is culture itself” (175). When academics conceive of literary criticism as an adversarial activity one can pursue within an autonomous professional space, then, culture’s indissoluble relationship to power ensures that activity paradoxically reinforces “the whole enterprise of the State” (175). The autonomous view of literary studies NMS propagates is an updated version of the one Said challenges.[11]

    Benjamin’s fable suggests oppositional scholars and critics who want to promote contemporary change should not be satisfied with this limited view of intellectual activity. To renew a vision of modernism responsive to contemporary inequality, scholars would have to expand more than their visions of the past. They would also have to expand their views of the present.[12] An expansion of this kind would multiply modernist studies’ materials along two new horizons. In addition to past artworks, modernist studies would explicitly consider, first, how ruling interests produce inequality in the present, and second, how its own relationship to those interests influences its activity. A disciplinary program such as NMS would have to begin with and attend to the logics, structures, and institutions that contribute to ongoing inequality, violence, and injustice, not only inside the discipline, but also more broadly, and then ask how its specialized activities might best transform these. Because liberating voices inside elite spaces has not countered inequalities the consequences of which those excluded from those spaces feel most acutely, literary studies might now begin with its expanded materials and ask anew what, more specifically, scholars might do with them.

    If these expanded practices guided the field’s historical work, modernist studies might be better positioned to pose and respond to its constituting question—what is modernism? Right now, the field’s leading experts do not believe they need to resolve among themselves answers to it. Perhaps this is because the question is not today an urgent one for radical or progressive movements, or worse, perhaps ruling interests have already seized the question in its moment of danger. We might ask then, not what modernism is or was, but instead what modernism would have to be for it to matter again what it is. How might we look anew at modernism in a way that will best serve our oppositional desires in the present so that we might shape the more equitable futures we want? What vision of modernism can help us best respond to our world? What will modernism be?

    What “Modernism” Might Be

    Because Benjamin encourages us to take a more expansive view of the conditions that produce contemporary inequality both within and outside of the university when we pose enormous questions of this kind, I want to develop one tentative response by adopting the approach he recommends. After the financial collapse of 2008, many critics of arts and culture writing from the left have come to use the word “neoliberalism” to describe the forces that produce inequality today in the US and beyond.[13] The term has its strengths and limitations. It is simultaneously capacious and specific, so it can name both contemporary economic and political conditions and the popular ways of thinking that fabricate them. At the same time, it often circulates too capaciously. Philosopher of economics Philip Mirowski (2013) reproaches left intellectuals, for instance, who “bandy about attributions of ‘neoliberalism’ as a portmanteau term of abuse when discussing grand phenomena often lumped together under the terminology of ‘globalization’ and ‘financialization’ and ‘governmentality’” (29). In an attempt to avoid this practice, I want to define this abstraction more precisely before I consider how it might help us reevaluate NMS’s progress narrative and develop a revised sense of what modernism might need to be. To do so, I rely upon the more particular sense of the word Mirowski (2013) offers in his recent account of the financial crisis and its aftermath, Never Let a Serious Crisis Go to Waste. Because Mirowski places at the center of his definition a view of epistemology he argues helps produce contemporary inequality, his account is of special interest to those who hope academic activities might counter ruling forces.

    For Mirowski (2013), neoliberalism is both a “program” right-wing intellectuals and elites operating across a network of public and private institutions developed over the course of the twentieth century (29) and a “worldview [that] has sunk its roots deep into everyday life, almost to the point of passing as the ‘ideology of no ideology’” (28). His bracingly critical and deeply historical book-length account of this program and worldview includes the familiar tenets we most often associate with the term. Neoliberalism, Mirowski explains, insists “market society must be treated as a ‘natural’ and inexorable state of mankind” (55); it “redefine[s] the shape and functions of the state” to better serve market interests (56); it regards “inequality of economic resources and political rights not as an unfortunate by-product of capitalism, but as a necessary functional characteristic of [an] ideal market system” (63); it maintains “corporations can do no wrong” (64); and so on. This program produces inequalities that cut across economic and identity categories. It sanctions the strong domestic police state activists hold responsible for the mass incarceration and frequent extra-judicial killings of African-American men, for instance (Mirowski 65–66).[14]

    Mirowski argues the specific “epistemological commitments” that ensured this program’s ascendancy continue to guarantee its future, even in the wake of the devastating global crisis that should have delegitimized it (333). In service of the view that markets best organize human life, Mirowski argues, elites “deploy ignorance as a political tool” (12). He offers this interpretation of the role ignorance plays in economist and neoliberal pioneer Friedrich Hayek’s worldview:

    For Hayek, the conscious attempt to conceive of the nature of public interest is the ultimate hubris, and to concoct strategies to achieve it is to fall into Original Sin. True organic solidarity can obtain only when everyone believes (correctly or not) they are just following their own selfish idiosyncratic ends, or perhaps don’t have any clear idea whatsoever of what they are doing, when in fact they are busily (re)producing beneficent evolutionary regularities beyond their ken and imagination. Thus, ignorance promotes social order, or as he said, “knowledge and ignorance are relative concepts.” (81) 

    Because Hayek and those who share his views believe markets establish a transcendentally sanctioned order human reason, imagination, and will can only complicate and destroy, Mirowski argues they “strive to preserve and promote doubt and ignorance,” as many economists unwittingly did after 2008 (81). Motivated by this view of knowledge’s nature and value, recent policies have started to eliminate or weaken such knowledge-producing institutions as the university by “put[ting] them on commercial footing” (82).[15] Doing so undercuts the critical, theoretical, and imaginative activities in which the humanities (and, just as vitally, the sciences[16]) conventionally offer training. These activities now seem, from this popular perspective, deleterious to omnipotent economic systems, and therefore to human life. Policies of this kind deny “that it is even possible to speak truth to power, or that one can rationally plan social goals and their attainment” (Mirowski 82).

    At the same time, and paradoxically, Mirowski argues elites themselves have relied over the course of the twentieth century upon precisely those modes of knowledge production, theoretical planning, critical rigor, and imagination they denounce in order to construct market-friendly policies and to build cultures of consent around their notions of freedom, human life, and education. Friedman, Hayek, George Stigler, and others associated with the influential and international Mont Pèlerin Society cultivated robust, diffuse, and persistent networks for pursuing creative and epistemological activity inside think tanks, universities, corporations, and state institutions (37–38). As a tactic for consolidating power, neoliberal policies strategically deny opponents access to those resources they utilize to gain and safeguard influence (83).

    This epistemological paradigm, experts in modernism will recognize, imperils Benjamin’s figure. The very historical self-awareness writers and artists working in the late-nineteenth and early-twentieth centuries associated with being “modern,” in other words, today threatens to disappear. The idea that we might understand and evaluate present political and economic conditions and invent together ways to transform them is under pressure. While US and European elites continue to deploy such terms as “modernization” and “modernity” in what Fredric Jameson (2002) calls “a fundamental political discursive struggle” to guarantee free-market capitalism seems reality’s natural telos (9), they also tactically foreclose certain so-called “modern” ways of thinking others might use to resist current realities. As Mirowski argues, contemporary discourses in part shore up power by denying above all that human activity—be it political, imaginative, or intelligent—can help shape better futures.

    A range of practices inside knowledge-producing institutions such as the university contribute to this popular view. US economics, for instance, leaves historical circumstances out of its models, as Thomas Piketty (2014) argues (573–74), or psychology joins with evolutionary biology to prove timeless drives motivate men to purchase luxury vehicles (Sundie et al. 2010).[17] Scholars of culture might counter these tactics from within literary studies if we imagine we are in conversation, not only with our colleagues and our field’s bygone giants, but also with other producers of knowledge across epistemological institutions.[18] Work of this kind would complement interdisciplinary research contemporary scholars already pursue—Latham and Rogers emphasize an “interdisciplinary foundation” grounds the New Modernist Studies (168)—but it would also differ importantly from it. In addition to adopting approaches other fields generate, as many interdisciplinary projects now do, literary studies might challenge the epistemological assumptions that license inequality and violence across fields and identify instead the alternate views of those creative, imaginative, and intelligent human activities neoliberalism attempts to monopolize and conceal that humanities traditions hold out to us.

    Some such views, of course, contribute to transcendental worldviews new versions of which continue to foster inequality. Scholars therefore would not be able to return to the romantic or classically humanist ways of thinking about art theorists of the posthuman warn us are dangerously outmoded.[19] Rather, critics might recover and defend, before they disappear, literary visions of the tandem powers and limits of human activity historically conceived, in Benjamin’s sense. Professional readers of modernist texts are uniquely suited to contribute to projects of this kind because late-nineteenth- and early-twentieth-century writers and artists conceived in increasingly secular, material, and historical ways precisely those creative and imaginative capabilities popular discourses currently deny.[20]

    I turn in conclusion to one such conception. Over the course of a long career, Wallace Stevens developed in verse and prose a potent vision of the capabilities and limitations of human imagination. I want to conclude with Stevens because the demise of the canon NMS achieves—a necessary and vital destruction—enables us to look anew not only at previously excluded materials. It also invites us to see in new ways those now liberated from their advantaged places within a hierarchy Raymond Williams (1987) worried had captured imagination’s radical potential. Because many of us share the sense that lesser known works recently recovered (or, as in the case of the heretofore unknown Claude McKay novel a graduate student at Columbia University found in the archives, discovered[21]) deserve more robust attention, I want to demonstrate how the alternative mode of expansion I am proposing can also help us see previously favored figures in newly apposite ways. (As a tradition of African American writing that moves from Frederick Douglass to Toni Morrison and Claudia Rankine emphasizes, violent hierarchies also disfigure, though differently, those who claim a place at the top.[22]) The field’s pervasive view of Stevens has long been over-determined by such popular misreadings of his poems and essays as those Harold Bloom published in the 1970s.[23] Bloom misrepresents Stevens by insisting he adheres to the willfully ahistorical, autonomous, and unworldly understanding of art Bloom is one of the last US critics to prefer.

    Stevens offers one version of his vision of imagination in a poem he composed on the eve of the Second World War, “Extracts from Addresses to the Academy of Fine Ideas.” The poem gives an early sense of what later works—most famously, Notes Toward a Supreme Fiction and “The Noble Rider and the Sound of Words”—elaborate more fully, but it has the virtue, for my purposes, of lovingly antagonizing the same institutional audience I have just suggested literary studies might imagine for itself. “Extracts” assembles scraps taken from lectures and notes a speaker addresses to an audience with a stake in epistemological questions.

    The poem’s wry title, as usual, opens onto a subject that turns out to be deadly serious. In the first section, Stevens’s speaker establishes before an academic audience of bearded “Messieurs” a dichotomy readers of Stevens will recognize is fundamental to his project. The speaker contrasts a “wrinkled ros[e]” made of “paper” (227) with “the blood-rose living in its smell” (228). He entreats his audience to consider the relationship between the two categories of being for which these flowers stand, categories which go elsewhere in Stevens’s oeuvre by the familiar names “imagination” and “reality.” At first, the speaker seems melancholy as he remarks the differences between the blooms. The paper rose is “false” and it is “dust,” even if it makes for us “brilliant” sounds (228). The blood-rose might be “silent,” but it is vibrant, pungent, and alive in the “sun and rain” (228).

    Immediately, though, we realize Stevens does not establish this difference in order to privilege plant over paper, or reality over imagination, and his elegy gives way to affirmation. Ours, he tells the academy, “is an artificial world,” and the “rose of / Paper is of the nature of its world” (228). What we might call reality—the “sea,” the “mountains,” and the “sky”—is “so many written words” (228). We cannot, then, experience a world of necessity unmediated by or independent of the language we use to describe and know it, because this language shapes our perceptions of what we encounter. We must therefore accept that “the false and the true are one” (228).

    For Stevens, who here differs from such contemporaries as Eliot (a villain in the poem), understanding the interdependence of these two categories need not engender melancholy. The very notion that we can know the blood-rose, or the real, without exercising our human faculties seems to Stevens a dangerous fantasy, one he sees emerge out of transcendental traditions. (This essay’s epigraph formulates most simply this insight.) “The rainy rose belongs / To naked men, to women naked as rain,” and we have never truly been these men and women (228). “Where,” the speaker asks, “is that summer warm enough to walk / . . . Beyond the knowledge of nakedness, as part / Of reality, beyond the knowledge of what / Is real, part of a land beyond the mind?” (228). This rhetorical question suggests humans never could access the paradise of ignorance Christian traditions project into the species’ distant past. This is not because we sinners once traded for knowledge’s paltry spoils the immortality ignorance guaranteed. It is rather because the difficult environments we inhabit on earth—cold, poisonous, dirty—require finite, self-aware beings to know them, and change them, and change ourselves to suit them. In order to do so, the speaker makes clear, we have relied upon what the paper rose represents: intelligence and imagination.

    Stevens’s speaker thus asks the academy to renounce any fiction that requires its acolytes to cleave epistemological and creative human activity from “reality” and its imagined fulfillments. He entreats his interlocutors to repudiate promises that ignorance can produce a paradise of the real. As the sections that follow demonstrate, Stevens has in mind Plato’s idealism, monarchy’s divine right, and the old world’s monotheisms, systems that make the same seductive promises contemporary “free-market fundamentalism” does (Krugman 2010). Stevens at once challenges these monumental metaphysical systems and suggests we attempt to better understand the character and purpose of the human faculties by which we invented them, faculties without which we can neither know, nor make, reality.

    The final section of “Extracts” models such an attempt. Here is the speaker’s closing plea to the institution of fine ideas:

    If earth dissolves
    Its evil after death, it dissolves it while
    We live. Thence come the final chants, the chants
    Of the brooder seeking the acutest end
    Of speech: to pierce the heart’s residuum
    And there to find music for a single line,
    Equal to memory, one line in which
    The vital music formulates the words.

    Behold the men in helmets borne on steel,
    Discolored, how they are going to defeat. (233–34)

    Stevens concludes the poem with a careful vision of the tandem possibilities and limitations of human creative power. Earth, here a figure for the conditions of necessity the constraints of time and space produce, “dissolves evil” when death erases, and does not oblige an everlasting soul to harbor forever, life’s accumulated injuries. If we accept our own finitude in this way—“Be tranquil in your wounds,” Stevens (after Whitman) bids us (229)—we can turn our attention to the earthly powers we do possess, powers that help us “dissolve evil . . . while we live.” These are our “final chants,” the songs, stories, and ideas we make out of the conditions of mortality we cannot transcend. We compose and perfect these chants, not only because we are intelligent, brooding over what words will satisfy the mind, but also because we are sensuous. Sounds please us. When we hear “the vital music,” we know we have found the material for beliefs that “pierce,” and thereby shape, us (234).

    Yet, even as the poem rises in the end to this fever pitch of human celebration, its final chant leaves us with the brutal image of soldiers “going to defeat.” This concluding volta serves a composite function. It warns us, as Stevens (1997) will again in the coda to Notes Toward a Supreme Fiction, that the songs of belief and knowledge we invent can stir us to violence. (“How gladly with proper words the soldier dies, / If he must, or lives on the bread of faithful speech,” the Notes concludes [352]). In so doing, Stevens undercuts the good/evil dichotomy he has developed and emphasizes we can use our saving faculties to produce the same pain they can alleviate. The final couplet also leaves us with an image of precisely that from which the poem suggests we cannot turn away. Our chants comfort us while we live, but we must keep before us our own mortality in order to truly understand what we are and can do. This tempered conclusion at once affirms human creative power and admits, with humility, our profound and irreducible limitations. Stevens neither elevates to divine status intelligence and imagination, as some romantics did, nor denies these faculties influence our lives on earth, as do some contemporary discourses.

    This vision cautions us to remain wary of explanations promising that an unknowable set of forces operating beyond our control can best organize our lives, and it insists instead that humans are historical beings. Within limits, in other words, we shape out of the past, by way of our creative and critical activities, both the selves we are and the worlds we know. By affirming this vision (which Stevens is only one among many modernists, canonical and marginal, to leave us), and by sharpening it against those views that oppose it, we can seize, at the moment it threatens to disappear, a historical sense of ourselves. When we privilege this historical view of the human, we need not nostalgically return to and affirm the destructive and arrogant humanism that long licensed the West’s colonial violence and initiated environmental devastation. Rather, views such as Stevens’s can help us pursue in revitalized ways the increasingly material and historical search for self-understanding modernist genealogies value. Because a posthuman view of the species would still have to be able to explain the species’ historical activities, writers who describe these seem as important as ever.

    As Benjamin’s vision of the angel warns, oppositional criticism cannot be programmatic, so reading Stevens this way offers no final, reproducible answer to this essay’s title question. It is merely one attempt to mobilize in the face of the conditions that produce inequality today the resources of the past. Because the New Modernist Studies is satisfied simply to expand its store of past materials, it does not encourage scholars to open out of modernism’s discourses specific and identifiable ways of thinking the left might rely upon when it tries to oppose from within the university the forces that produce social and economic disparity. Indeterminacy ensures NMS can continue as an influential, autonomous, and relatively lucrative institutional force, in part because it does not encourage critics to oppose power. Its foundational indeterminacy (“we don’t know”) seems to complement and mirror, rather than to contest, the broader attitude toward epistemological and creative human activity upon which ruling interests strategically insist. When elite discourses attempt to control and conceal the critical and creative practices humanities disciplines previously cultivated, academic trends that do not value these practices can come to suit elite interests.

    To ameliorate these shortcomings, contemporary scholars need not necessarily flee the university or contritely devote themselves to public outreach projects. All institutional work is not identical. Mirowski’s epistemological reading of contemporary inequality suggests one of the most oppositional acts a scholar or critic can today perform is to insist—from inside and across the creative, critical, and knowledge-producing fields currently under attack—that historical activity is ongoing and vital.


    References

    Altieri, Charles. 2012. “How the ‘New Modernist Studies’ Fails the Old Modernism.” Textual Practice 26, no. 4: 763–82.

    Anderson, Chris. 2008. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” Wired. June 23. https://www.wired.com/2008/06/pb-theory/.

    Beebe, Maurice. 1974. “What Modernism Was.” Journal of Modern Literature 3, no. 5: 1065–84.

    Benjamin, Walter. (1940) 1968. “Theses on the Philosophy of History.” In Illuminations, edited by Hannah Arendt, translated by Harry Zohn, 253–64. New York: Schocken Books.

    Bloom, Harold. 1976. Poetry and Repression: Revisionism from Blake to Stevens. New Haven, CT: Yale University Press.

    Bové, Paul A. 2010. “Misprisions of Utopia: Messianism, Apocalypse, and Allegory.” Field Day Review 6: 71–93.

    Brzezinski, Max. 2011. “The New Modernist Studies: What’s Left of Political Formalism?” Minnesota Review 76: 109–25.

    Capehart, Jonathan. 2015. “From Trayvon Martin to ‘Black Lives Matter.’” Washington Post. February 27. www.washingtonpost.com/blogs/post-partisan/wp/2015/02/27/from-trayvon-martin-to-black-lives-matter/.

    Churchill, Suzanne W. 2006. The Little Magazine Others and the Renovation of Modern American Poetry. Burlington, VT: Ashgate.

    Coleridge, Samuel Taylor. (1817) 1985. Biographia Literaria. Vol. 7 of The Collected Works of Samuel Taylor Coleridge. Edited by James Engell and W. Jackson Bate. Princeton: Princeton University Press.

    Friedman, Susan Stanford. 2001. “Definitional Excursions: The Meanings of Modern/Modernity/Modernism.” Modernism/modernity 8, no. 3: 493–513.

    ———. 2015. Planetary Modernisms: Provocations on Modernity Across Time. New York: Columbia University Press.

    Harvey, David. 2005. A Brief History of Neoliberalism. New York: Oxford University Press.

    Howarth, Peter. 2012. “Autonomous and Heteronomous in Modernist Form: From Romantic Image to the New Modernist Studies.” Critical Quarterly 54, no. 1: 71–80.

    Jameson, Fredric. 2002. A Singular Modernity: Essays on the Ontology of the Present. New York: Verso.

    Jauss, Hans Robert. (1970) 2005. “Modernity and Literary Tradition.” Translated by Christian Thorne. Critical Inquiry 31, no. 2: 329–64.

    Josipovici, Gabriel. 2010. Whatever Happened to Modernism? New Haven, CT: Yale University Press.

    Kermode, Frank. 1986. “Modernisms.” London Review of Books 8, no. 9: 3–6.

    Krugman, Paul. 2010. “When Zombies Win.” “The Opinion Pages.” The New York Times. December 19. www.nytimes.com/2010/12/20/opinion/20krugman.html.

    Latham, Sean and Gayle Rogers. 2015. Modernism: The Evolution of an Idea. New York: Bloomsbury.

    Lee, Felicia R. 2012. “New Novel of Harlem Renaissance is Found.” New York Times. September 14. www.nytimes.com/2012/09/15/books/harlem-renaissance-novel-by-claude-mckay-is-discovered.html.

    Levin, Harry. 1960. “What Was Modernism?” The Massachusetts Review 1, no. 4: 609–30.

    Lichtblau, Eric. 2016. “US Hate Crimes Surge 6%, Fueled by Attacks on Muslims.” New York Times. November 14. http://www.nytimes.com/2016/11/15/us/politics/fbi-hate-crimes-muslims.html.

    Mao, Douglas and Rebecca Walkowitz, eds. 2006. Bad Modernisms. Durham: Duke University Press.

    ———. 2008. “The New Modernist Studies.” PMLA 123, no. 3: 737–48.

    Marx, Karl. (1852) 1963. The Eighteenth Brumaire of Louis Bonaparte. New York: International Publishers.

    Mirowski, Philip. 2013. Never Let a Serious Crisis Go to Waste: How Neoliberalism Survived the Financial Meltdown. New York: Verso.

    Morrison, Toni. 1987. Beloved. New York: Vintage.

    Nietzsche, Friedrich. (1876) 1997. “On the Uses and Disadvantages of History for Life.” In Untimely Meditations, translated by R. J. Hollingdale, 57–124. New York: Cambridge University Press.

    Piketty, Thomas. 2014. Capital in the Twenty-First Century. Translated by Arthur Goldhammer. Cambridge, MA: Belknap Press of Harvard University Press.

    Ramazani, Jahan. 2009. A Transnational Poetics. Chicago: University of Chicago Press.

    Readings, Bill. 1997. The University in Ruins. Cambridge, MA: Harvard University Press.

    Robbins, Bruce. 1985. “Modernism and Professionalism: The Case of William Carlos Williams.” In On Poetry and Poetics, edited by Richard Waswo, 191–205. Tubingen: Gunter Narr Verlag.

    Said, Edward W. 1983. “Reflections on American ‘Left’ Literary Criticism.” In The World, the Text, and the Critic, 158–177. Cambridge, MA: Harvard University Press.

    ———. 1993. Culture and Imperialism. New York: Vintage.

    ———. 2000. “Presidential Address 1999: Humanism and Heroism.” PMLA 115, no. 3: 285–91.

    ———. 2004. Humanism and Democratic Criticism. New York: Columbia University Press.

    Stevens, Wallace. (1942) 1997a. “Extracts from Addresses to the Academy of Fine Ideas.” Parts of a World, in Collected Poetry and Prose, 227–34.

    ———. (1942) 1997b. “The Man on the Dump.” Parts of a World, in Collected Poetry and Prose, 184–85.

    ———. (1951) 1997. “The Noble Rider and the Sound of Words.” The Necessary Angel, in Collected Poetry and Prose, 643–65.

    ———. (1955) 1997. “The Plain Sense of Things.” The Rock, in Collected Poetry and Prose, 428.

    ———. 1997. Collected Poetry and Prose. Edited by Frank Kermode and Joan Richardson. New York: Library of America.

    Sundie, Jill M., Douglas T. Kenrick, Vladas Griskevicius, Joshua M. Tybur, Kathleen D. Vohs, and Daniel J. Beal. 2010. “Peacocks, Porsches, and Thorstein Veblen: Conspicuous Consumption as a Sexual Signaling System.” Journal of Personality and Social Psychology. November 1.

    Toomer, Jean. (1923) 2011. Cane. Edited by Rudolph P. Byrd and Henry Louis Gates, Jr. New York: W. W. Norton.

    United States Department of Labor. 2016. “Equal Pay.” December 12. https://www.dol.gov/featured/equalpay.

    V21 Collective. 2016. “Manifesto of the V21 Collective: Ten Theses.” December 12. http://v21collective.org/manifesto-of-the-v21-collective-ten-theses/.

    Wellek, René. 1985. “Literary Modern?” Review of Genealogy of Modernism, by Michael Levenson. The New Criterion 3, no. 9: 76.

    Whitman, Walt. (1855) 1996. Leaves of Grass. In Whitman: Poetry and Prose, edited by Justin Kaplan, 1–146. New York: Library of America.

    Wicke, Jennifer. 2001. “Appreciation, Depreciation: Modernism’s Speculative Bubble.” Modernism/modernity 8, no. 3: 389–403.

    Williams, Raymond. 1983. Culture and Society: 1780–1950. New York: Columbia University Press.

    ———. (1986) 1989. “The Uses of Cultural Theory.” In The Politics of Modernism: Against the New Conformists, 163–76.

    ———. (1987) 1989. “When Was Modernism?” In The Politics of Modernism: Against the New Conformists, 31–35.

    ———. 1989. The Politics of Modernism: Against the New Conformists. New York: Verso.

    Williams, William Carlos. (1923) 1970. Spring and All. In Imaginations, 85–151. New York: New Directions.

    Epigraph taken from Wallace Stevens ([1955] 1997).

    Notes 

    [1] For a range of representative instances, see Robbins (1985), “Modernism and Professionalism: the Case of William Carlos Williams”; Williams ([1987] 1989), “When Was Modernism?”; Stanford Friedman (2001), “Definitional Excursions: The Meanings of Modern/Modernity/Modernism”; Jameson (2002), A Singular Modernity: Essays on the Ontology of the Present; and Josipovici (2010), Whatever Happened to Modernism?

    [2] A number of critics have challenged the New Modernist Studies and its assumptions from various perspectives. See Wicke (2001), “Appreciation, Depreciation: Modernism’s Speculative Bubble”; Jameson (2002); Brzezinski (2011), “The New Modernist Studies: What’s Left of Political Formalism?”; Altieri (2012), “How the ‘New Modernist Studies’ Fails the Old Modernism”; Howarth (2012), “Autonomous and Heteronomous in Modernist Form: From Romantic Image to the New Modernist Studies.”

    [3] In the US, for instance, inequality is today pervasive across categories of class, race, gender, and sexuality. Piketty (2014) compares rates of income disparity in the US in the early 2010s to those “in France and Britain during the Ancien Regime” (263). A 2010–11 survey indicates “the top decile own 72 percent of America’s wealth” (257). Capehart (2015) tracks recent instances of race violence in the US and the emergence of activist counter-movements. The United States Department of Labor (2016) reports US “women working full time only make about 79% of what men earn,” indicating one ongoing gender disparity liberal feminist movements often target.

    [4] Friedman’s (2015) book, Planetary Modernisms: Provocations on Modernity Across Time, pursues this “expansive” tendency to the limits of its logic. Friedman argues “modernism” might describe all “aesthetic movements or specific instances that innovatively engage with the specific modernities of their space/time/culture, particularly . . . those whose forms as well as content push against or reinvent inherited conventions” (190). She suggests critics might consider modernist such figures as the sixth-century Chinese poet Du Fu, whose formal innovations responded to changing political and economic conditions under the Tang Dynasty.

    [5] See Churchill (2006) and Mao and Walkowitz, eds. (2006).

    [6] For key works in this definitional genre, see Levin (1960), “What Was Modernism?”; Maurice Beebe (1974), “What Modernism Was”; Williams ([1987] 1989), “When Was Modernism?”; Friedman (2001), “Definitional Excursions: The Meanings of Modern/Modernity/Modernism”; Josipovici (2010), Whatever Happened to Modernism?

    [7] When Latham and Rogers (2015) rely upon a language of “networks” as a way to explain art’s place in the “world,” for instance, they indicate Edward W. Said is one important influence for NMS (149). Said (1983) encouraged critics with radical ambitions to scrutinize any “art-for-art’s-sake theory” that insists “the world of culture and aesthetic production subsists on its own, away from the encroachments of the State and authority” and to study instead the “network” of “affiliation” that “enables a text to maintain itself as a text” (169, 174).

    [8] Marx ([1852] 1963) describes historical consciousness and its challenges this way: “Men make their own history, but they do not make it just as they please; they do not make it under circumstances chosen by themselves, but under circumstances directly encountered, given and transmitted from the past. The tradition of all the dead generations weighs like a nightmare on the brain of the living” (15). A few years later, Nietzsche ([1876] 1997) writes: A “human being … cannot learn to forget but clings relentlessly to the past: however far and fast he may run, this chain runs with him” (61). See Jauss ([1970] 2005) for a critical etymology of the term “modern.” Jauss traces the different modes of historical consciousness it has named over the course of Western history.

    [9] See Paul A. Bové (2010), “Misprisions of Utopia: Messianism, Apocalypse, and Allegory,” for a challenge to the utopian, messianic element fundamental to Benjamin’s vision of history.

    [10] Said (1983) characterizes his moment—acerbically—this way: “Indeed, what distinguishes the present situation is, on the one hand, a greater isolation than ever before in recent American cultural history of the literary critics from the major intellectual, political, moral, and ethical issues of the day and, on the other hand, a rhetoric, a pose, a posture (let us at last be candid) claiming not so much to represent as to be the afflictions entailed by true adversarial politics. A visitor from another world would surely be perplexed were he to overhear a so-called old critic calling the new critics dangerous. What, this visitor would ask, are they dangers to? The state? The mind? Authority?” (160).

    [11] Said’s later work responds explicitly to these transformations. See, for instance, Said (1993; 2000; 2004).

    [12] US academics specializing in Victorian literature and culture have recently called for “presentist” approaches. See V21 Collective (2016).

    [13] Critics regularly rely upon the vision of neoliberalism anthropologist David Harvey (2005) develops in his rigorous and accessible A Brief History of Neoliberalism. The term has a long history, as Harvey demonstrates, but its popularity as an explanatory cipher for current political and economic conditions among intellectuals and activists who are not specialists in economics increased after 2008.

    [14] For a timeline of recent events, see Capehart (2015).

    [15] For an early account of the transformations corporate interests have inaugurated within the university, see Readings (1997).

    [16] The same ways of thinking are transforming disciplinary paradigms in the social and natural sciences. See Anderson (2008).

    [17] Sundie et al. (2010) claim to prove “conspicuous consumption is driven by men who are following a lower investment (vs. higher investment) mating strategy and is triggered specifically by short-term (vs. long-term) mating motives” (1).

    [18] During the “culture wars,” conservative humanists opposed critics on the left who wanted to expand the canon and privilege politics. Although this conservative position has virtually disappeared within humanities departments, contemporary scholars continue to claim as their primary antagonists the New Critics and the deconstructionists, figures from literary studies’ past. It remains vital to reflect upon professional practices so that our methods serve the projects we value—and again, historical self-consciousness teaches us this labor will be perpetual—but literary critics might better accomplish this if we cultivate simultaneously a more critical view of our discipline within a system of other disciplines, many of which endorse and promulgate views of the human and of history radically different from those many experts in culture often sanction.

    [19] A number of complementary and overlapping discourses put pressure on the category of the “human” as a means of pursuing a radical or progressive politics for democracy, liberty, and equality. These include the “posthumanist” projects we associate with Michel Foucault, Gilles Deleuze, and their inheritors, which attempt to destroy transcendental and ontotheological humanisms, and “posthuman” projects we associate with critics such as Donna Haraway, N. Katharine Hayles, and Ursula K. Heise, which assume humans have entered a new stage of being defined by technological innovation, biological change, and environmental catastrophe. These very different discursive formations both attempt to conceive the human anew in increasingly material terms and to trade anthropocentric models of the universe for more complex ones.

    [20] Samuel Taylor Coleridge’s ([1817] 1985) Biographia Literaria is an originary text for an Anglophone genealogy of poetry and poetics preoccupied with the nature and function of human imagination and intelligence. For a few key texts that pursue these questions in the US, see Walt Whitman’s ([1855] 1996) Leaves of Grass; Jean Toomer’s ([1923] 2011) Cane; and William Carlos Williams’s ([1923] 1970) Spring and All.

    [21] See Lee (2012).

    [22] Toni Morrison (1987) renders this violence in the novel, Beloved (234). Stevens also uses racist language in some of his letters and poems. See Hayes (2014) for a nuanced engagement with Stevens’s failures.

    [23] Bloom (1976) presents Stevens as an American transcendentalist in Poetry and Repression: Revisionism from Blake to Stevens.

  • Olivier Roy — French elections: Catholics vote Catholic, Muslims vote secular

    Olivier Roy — French elections: Catholics vote Catholic, Muslims vote secular

    by Olivier Roy

    Two days before the first round of France’s presidential elections, a terrorist attack on the Champs-Elysées, claimed by the Islamic State, sent a shock wave through the media: such an attack would surely play into the hands of the “anti-Islam” candidates—namely, the conservative François Fillon and the populist Marine Le Pen. In fact, nothing of the sort happened. Instead, the victor was centrist Emmanuel Macron, who said that France should learn to live with terrorism. The fear of Islam did not work. But religion did play a role, though not in the way that many would have predicted.

    Since the recognition of France’s secular Republic by the Catholic Church in 1890 (Cardinal Lavigerie, on behalf of Pope Leo XIII, made a toast “A la République Française!” after an official banquet in Algiers), there has never been an avowedly Catholic political party in France. The Church rejected the idea, instead opting to promote its values by “secularizing” them and disseminating them through non-religious political actors. For instance, opposition to same-sex marriage was couched in 2012 by Cardinal Barbarin (archbishop of Lyon) as a refusal to change the “anthropological paradigm” on which society is based; he referred to natural law and not to the will of God.

    But the effort to reach out to secular circles and even other religious groups, including Jews, Protestants, and Muslims, failed in this case. Even the moderate right wound up endorsing same-sex marriage. As a consequence, militant Catholics took to the streets under their own flag (and cross). The movement, called la Manif pour tous (“the Demo for All”), which took shape in 2013, became autonomous from the clerical hierarchy by entering politics. By 2016, it had developed its own political branch, called Sens Commun (“common sense”), which brought together some militants of Les Républicains, the “Gaullist” center-right party of Chirac and Sarkozy, in order to push the agenda of the Manif pour tous inside the party. It scored a major success with Fillon’s primary victory over Alain Juppé, the favorite. Although Fillon did not explicitly promise to rescind the law on same-sex marriage, he pledged to rewrite it and prevent full adoption by gay couples. Fillon was the only credible presidential candidate since the 1958 constitution to present himself as a practicing Catholic, eager to promote Christian identity and values (by contrast, De Gaulle, also a devout Catholic, was a strong defender of the separation of Church and State).

    This sudden breakthrough of militant Catholicism took place at a time when the traditional right, in France and throughout Western Europe, had finally, if reluctantly, endorsed liberal values like feminism, sexual freedom, abortion, gay rights, even animal rights. Even the populist extreme right has endorsed liberal values where family and sexuality are concerned. Neither the Netherlands’ Geert Wilders, nor Marine Le Pen, nor Austria’s Heinz-Christian Strache is known for attending church, advocating Christian sexual and family norms, or upholding Christian teachings on love and hospitality. Their definition of Christian identity is purely ethnic and folkloric, not rooted in the teachings of the Church.

    French society is strongly secular—a fact that Le Pen wove into the identity of her National Front (FN) party some time ago. Although the FN remains steeped in its anti-immigrant, anti-Muslim fundamentals, from the start of the campaign she endorsed laïcité—“political secularism,” the official term for the separation of church and state—over Christianity as the template for French identity. Of course, her version of laïcité is directed against Islam, including banning the veil and halal food from the public space. But she has also extended it to the visible signs of all other religions in the public space, including the yarmulke and kosher food.

    Nevertheless, this approach helped Le Pen finish second. But to defeat the centrist Macron in the runoff, she will have to attract both the Catholic constituency of Fillon and the anti-globalization, anti-capitalist, secularist electorate of Jean-Luc Mélenchon, a neo-communist and “third-worldist” who has supported Hugo Chávez, Fidel Castro, and the Palestinian people; like Le Pen, he has also been accused of anti-Semitism. The former might be attracted by her stance against Islam, the latter by her anti-European, anti-establishment position.

    Mélenchon, a staunch opponent of religious signs in the public sphere, offered perhaps the first round’s biggest surprise: he was the most popular candidate among Muslim voters, of whom there are between 2 and 4 million, depending on whether we count believers or people of Muslim origin. Some attribute this to his support for the Palestinians and his open, controversial backing of Syrian leader Bashar al-Assad and Russia’s Vladimir Putin. But Palestine did not come up during the campaign. Moreover, Mélenchon backs Assad because of his war against Salafist rebels; it’s difficult to see how this would appeal to pro-Salafist French Muslims living on the margins of French society—the youth of destitute neighborhoods, the born-again of all kinds, and converts. Traditionally, Salafists avoid political participation. In fact, Mélenchon never addressed the concerns of faithful Muslims.

    The problem in understanding Muslim support for Mélenchon is that most people tend to think that Muslims vote as a single, undifferentiated faith community. For years, the debate over Islam in France has been oversimplified, reduced to an idea commonly known as communautarisation: by returning to a conservative and normative practice of Islam, the Muslim community is enforcing its own forms of social control in “the lost territories of the republic”—namely, the destitute neighborhoods. That move would lead to some sort of separation from mainstream society. But whether this has actually occurred is far from clear.

    Muslim support of Mélenchon likely had far more to do with class and social exclusion.

    There are, of course, both well-off and less-well-off French Muslims: those stuck in low-wage jobs in the destitute neighborhoods where their contract-labor forefathers settled in the 1960s and ’70s, and those who have managed to move into the middle class. France does not collect voting data by ethnic or religious group, so we cannot say for certain how these groups voted; but many middle-class Muslims likely voted for Macron or the socialist Benoît Hamon in the first round, and are likely to vote for Macron in the second, because those candidates represent middle-class aspirations.

    We do know the voting patterns of less-well-off Muslims, by contrast, because they are concentrated in certain electoral precincts. Mélenchon came first in the department of Seine-Saint-Denis, which has the highest percentage of migrants in France, taking 37 percent of the vote; in Dreux, another city with a large migrant population, he also captured 37 percent, with a peak of 57 percent in the precinct with the highest percentage of Muslims. This general pattern was confirmed by an IFOP poll after the first round, which indicated that 37 percent of French Muslims voted for Mélenchon, far more than voted for any other candidate.

    The first round of the presidential elections thus showed no political expression or symptom of such religious separatism: these voters chose Mélenchon, a neo-Marxist. On the contrary, despite the ban on voting declared by many Salafists, and despite young people’s traditional disaffection with elections, participation increased compared with the last elections. Mélenchon likely won the Muslim vote on social issues: the exclusion, joblessness, and precariousness attributed to capitalism, the free market, globalization, and Europe. Muslims—poorer ones, at least—voted according to their social situation, not their religious convictions, choosing a candidate who based his campaign on social issues while supporting laïcité and opposing the veil.

    Ahead of the second round, it is striking that while the Catholic hardliners have made a more or less explicit call to vote for the FN, Le Pen is openly trying to court Mélenchon’s electorate without making any reference to the important proportion of Muslims within it. While Mélenchon made it clear that he himself would vote for Macron, he refused to join the “Republican Front” against the extreme right and “fascism,” letting his supporters decide. Will some poor Muslims vote for Le Pen because they support the FN’s populist agenda? That seems unlikely, since the FN is still racist. Will they vote for Macron to fight racism? Not necessarily, since Macron embodies, according to both Mélenchon and Marine Le Pen, the global world of finance. The most probable option is that they will abstain, as many of them told me in Dreux.

    Olivier Roy is a political scientist, professor at the European University Institute in Florence, Italy. His most recent book is Holy Ignorance: When Religion and Culture Part Ways (Columbia University Press, 2010).


  • Alexander R. Galloway — An Interview with McKenzie Wark

    Alexander R. Galloway — An Interview with McKenzie Wark

    by Alexander R. Galloway

    This interview has been peer-reviewed by the boundary 2 online editorial collective. 

    Alexander R. Galloway: Critical theory tends to subdue biography, but I’d like you to reflect on your own trajectory as a thinker. Your last few books all fit together. How do you conceive of the project that began with The Beach Beneath the Street (2011), and continues through The Spectacle of Disintegration (2013), up to Molecular Red (2015)? It’s a story about the Situationist International, to be sure, but your story is both broader and longer than the specific locus of the S.I. Did you set out to rewrite the history of radical modernity? What stories do you want to tell next?

    McKenzie Wark: I would include A Hacker Manifesto (2004) and Gamer Theory (2007) in that trajectory. Those books are already about the mode of production after capitalism that runs on information. The former was a more optimistic book about the new kinds of class conflict that could shape it; the latter a more pessimistic one about its new modes of incorporation and control. But I felt that nobody was quite getting the alternate path through the archive those books implied. So I decided to write some more pedagogic books that laid out the resources one could use to “leave the twenty-first century.”

    That led to the three books you mentioned plus another to come that are indeed a cycle about rewriting radical modernity. Not that this is the only alternate path through the archive, but it’s an attempt to suggest a different relation to the archive in general, to see it as a labyrinth rather than an apostolic succession; a kind of “no-dads” theory, but full of queer uncles and batty aunts.

    Molecular Red has a bit about the moment of the October Revolution, rethought through Bogdanov and Platonov. Then, second in the sequence, would come the one I haven’t finished, about the British scientific left, the original accelerationists and cyberfeminists. That covers the 1930s – ’50s. Then The Beach Beneath the Street, which reads the situationists as radical theory, not art, and expands the story beyond Guy Debord. The Spectacle of Disintegration continues that dérive through the archive by way of the post-’68 moment. What to do when the revolution fails? As a book-end, there is the last part of Molecular Red about Donna Haraway, but read as a marxist as well as feminist thinker, a reading I then take through a cluster of people with Haraway-affinities.

    My job at The New School is really not ideal as far as doing research is concerned. So these are more writerly than scholarly books. They are meant to legitimate spaces in which others might do more thorough work. I want to leave nice, big attractive spaces for grad students or artists or activists to go set up camp. And people do, which makes me happy. In my small way I think I enabled some of the new work on Bogdanov, the Situationists, Haraway in relation to Marx, and so on.

    I find it enervating when people simply try to squeeze the present into the old patterns set by Walter Benjamin or whomever, and add just a tiny bit of novelty to how we read such a canonic figure. Why not read other people, or read the present more in its own terms? Ironically, to best honor Marx or Benjamin one should not simply be their exegetes. So my job is to corrupt other people’s grad students. To be the odd uncle (or auntie) who whispers that one can dissent from the great academic patriarchy (and even its subsidiary matriarchy) where one only succeeds through obedience to the elders and the reproduction of their thought.

    AG: Can you also reflect on your move to the United States, where you’ve lived now for over fifteen years? I know you’ve commented on how disconnected American academia is from other parts of the world, particularly Australia–an observation that could be spun negatively or positively. (American schools are tuition-driven and hyper capitalist, yet ironically still largely free of neoliberal bureaucratization along the lines of Britain’s onerous Research Excellence Framework.) And you’ve also mentioned in the past how you received a rather unique political education in Australia. Can you say more about your life during the Twentieth Century?

    MW: I had a great education of the provincial petit-bourgeois kind. I learned at the feet of labor movement militants and later from various self-invented avant-gardes and proto-queer bohemias. Things were already going badly in Australia in the nineties so I wrote what I think of as my “popular front” books. The Virtual Republic (1997) was about the culture wars, in the spirit of Lyotard’s differend. I wrote another one about two versions of the popular: social democratic and hyperreal. But then I fell in love with a New Yorker, so I gave up tenure, moved to New York, and started over. Probably a lucky escape, as so far the diversity of economic models has kept American universities in better shape than in state systems such as the UK or Australia.

    In Australia I was part of what Mark Gibson called the “republican school” of cultural studies. Republican in the sense of the res publica, the public thing, or more figuratively of cutting to the heart of a problem and exposing it. We weren’t interested in cultural policy or simply doing critique from the sidelines, but in trying to affect the national-popular space of cultural conflict itself. But I was already a bit critical of the superstructural turn cultural studies represented, its bracketing off of questions of media form, and with them of the mode of production and historical stage. In Virtual Geography (1994) I had already wanted something more like a theory of the media vector as shaping a certain kind of space of action.

    It just seemed untenable to do anything like the same sort of work in America, where I had no access to the public sphere. I was nobody. And I already wanted to move away from the post-Marxism of say Hall and Laclau and Mouffe, that turn to either the cultural or political as autonomous or even ontological. That did not make a lot of sense if you looked around at the big world a bit. So I went back to my earlier formation in classical and western Marxism as well as in the avant-gardes, and wrote A Hacker Manifesto. That was my first “American” book, even though it came out of participating in the transnational digital avant-gardes of the nineties, something of interest to me alongside the “popular front” work I was doing at that time.

    AG: Is it fair to say that you have a reticence toward high theory and big thinkers, figures like Alain Badiou with his intricate if not onerous systems? You are not a system builder, if I may speak plainly. Instead you are proud to pursue a kind of “low theory.” Provisionality, tactical intervention, tinkering and recombination, intellectual creativity, but also impurity–although I can never tell if you are a pragmatist or an idealist! Can you comment on the fascinating mixture that constitutes low theory?

    MW: I was formed by the labor movement, and I remain in solidarity with it even though it is in a sense a god that died. So how does one keep living and working after defeat? There’s something to be said for knowing one is of a defeated people. One is free from the silly chatter of optimism. And one knows who one’s real comrades are. They are the ones you still have after the defeat. I still retain that side, which for me is a kind of decision that can’t be revoked, a picket line never to cross.

    On the other hand, my other commitment is not to the community of labor but the community of non-labor, or bohemia. Its expression is not the organized labor movement but the disorganized avant-garde. It’s not uncommon to combine these things, of course. But most often they are combined in the form of (sometimes rather dreary) Marxist theories about the avant-garde, which nevertheless remain very conventional in form. It seemed to me self-evident that one should also reverse the procedure, and apply avant-garde techniques to the writing of theory itself. Hence A Hacker Manifesto uses Situationist détournement and Gamer Theory uses Oulipo-style constraints.

    Low theory refers to the organic conceptual apparatus a milieu composes for itself, at least partly outside of formal academic situations. Both the labor movement and the avant-garde did that. I think it is useful to have that base, even if it is an attenuated and defeated one. It’s useful to have some perspective outside of the criteria of success of academia itself. After all, many of the “greats” of low theory–Spinoza, Marx, Darwin, Freud–they were not philosophers.

    AG: So low theory means anti-philosophy? I’ve noticed that some commentators prefer to define anti-philosophy as a kind of anti-rationalism (that being Badiou’s gripe) or even some type of a mystical romanticism. But these definitions of anti-philosophy never made sense to me.

    MW: Badiou thinks philosophy has a monopoly on a certain kind of reason, but more out of institutionalized habit than anything else. You could think of low theory as what organic intellectuals do. It’s defined by who does it and why, rather than by any particular cognitive style. I’m interested in how, after the organic intellectuals of labor, there are organic intellectuals of social movements, everyday life, the experience of women or the colonized, and of new kinds of activity that are not traditional labor in fields like media and computation. Concepts get formed differently and are meant to do different things when you are trying to think through your own action in the world rather than when you are a scholar of action in the world.

    AG: I’m also intrigued by what you say about form, since this always struck me as the central question for Marxism, if not for all attempts to think and act politically. There’s the critique of the commodity of course, where form takes a beating. But at the same time form–particularly as idea or concept–seems absolutely crucial to me, not as the thing to be avoided, but as a scaffolding to propel people forward. Do you think idea, concept, or form has a place in Marxism?

    MW: As extracts or abstracts from practice, concepts attempt to grasp a range of practical particulars within a conceptual form. The concept is only going to be slightly true, but about a lot of situations. As opposed to a fact, which is mostly true, but about a particular. Concepts are handles for grabbing a lot of facts. The thing to avoid however is the temptation to think the concept is more true than the practice. As if it were some underlying essence or ontology. I’d call that the “philosophical temptation.”

    I think one has to wear one’s concepts lightly, and I think Marx did that, if not consistently. His concepts modify over time as he gets further along in thinking practice, and of course as the experience of living within capitalism changes. Capitalism isn’t an eternal essence with changing appearances. This is of course a mere thumbnail sketch of an epistemology, but then it ought not to be too big a distraction. Knowledge practices are experiments. There’s no royal road to science.

    Form is however a rather larger question, particularly as forms, unlike concepts, are embodied and implanted in social life itself. The commodity form, for example. But they are still not essences. The commodity form mutates, and not least in contact with other forms: the property form; technical forms. I’m particularly interested in how the information form (a redundant phrase, I know!) and the commodity form mutually transform each other. It is not that information was subsumed within the commodity form, which remains the same essence. Rather each changes the other. Which is maybe why this is not our grandparents’ capitalism, if it is still capitalism at all. It may be a worse ensemble of forms, including what Randy Martin calls the derivative form.

    AG: You always pull me back from the precipice of the concept! I value that about your work. Although I can’t help but question the notion of “no royal road,” and am reminded of larger discussions about the critique of method, or the notion that method can’t or shouldn’t exist. Wouldn’t you agree that the rejection of method is an ideology in itself, an ideology that, in fact, can be isolated very precisely around a certain Anglo-American configuration of empiricism, pragmatism, and realism?

    MW: The shadow of not one, but two historical exclusions hangs over our received ideas about all this. Certainly there is a Cold War in western knowledge that has to do with suppressing anything that is not empiricism, pragmatism and realism. Recall how the CIA funded Michael Polanyi’s efforts to construct a philosophy of science that saw science as functioning best as a “free market of ideas”–and at the very time it was becoming the exact opposite, entangled as it was in the military industrial complex. This calls for some detours into the archive to see what was excluded.

    The new left rescued various philosophical alternatives, most notably what came to be constructed as “western Marxism.” But it neglected certain other suppressed traditions, the scientific socialism of Waddington, Bernal, Needham and Haldane being just one of them. So I think there’s still a project there to reclaim some other missing resources.

    But there was another exclusion, which happened earlier, and within Marxism itself. That was the suppression of the “Machists.” Both the Russian and German strands of Machism, despite a lot of political and theoretical differences, had one argument in common: that a merely philosophical materialism is no materialism at all. A merely philosophical materialism will simply reify and take as first principles some metaphors drawn from the science and industry of its time. Rather, materialism ought to open philosophy to the world, to other practices of knowledge and action, including those that generate low theory. Philosophy can’t be sovereign. It has to accept comradely relations with other practices, not one of command.

    The decisions for or against a given configuration of knowledge tend to be infused with the politics of their time. And sometimes one has to go back and revise those decisions. One has to reverse the decision in the early part of the twentieth century by the Leninists in favor of a dogmatic (and supposedly “dialectical”) materialism. But one also has to reverse or at least qualify the decision of the new left in favor of philosophy as a sovereign discourse. Neither has the resources needed for the times. Neither is adequate to understanding what the forces of production are about today, as expressed in earth science, biology and information science.

    AG: Can you also elaborate information form and its relation to commodity form? I recall how the shape of information played an important role in your book A Hacker Manifesto.

    MW: Information, as a sort of real abstraction at work in the world, is one of the key phenomena of our times. Obviously it is partly ideological, but then all forms are. They are never pure. That there could be a method of purifying concepts out of social-historical forms was the great fantasy of western Marxism.

    Information emerges historically. The key moment is the war and wartime logistics. World War II demanded unprecedented scales of production, and information emerged as a means of control for that production. At the time it was understood to be a complex mode of production that included state politics, military command, and vast business enterprises. After the war it continued in much the same manner. The great postwar boom known as Fordism is in part a state socialist achievement. Only later in the history of the development of the forces of production does information become a means of radically transforming the commodity form itself, and enabling new relations of production and reproduction.

    Rather than think of the commodity form philosophically, as a kind of eternal essence of capital, I think it is more interesting to think about how the information form comes into contact with the commodity form and forces it to mutate. What emerges is a commodity form far more abstract than anything hitherto, a derivative form, one that does not need any particular material being at all, even though it is in no sense immaterial. Rather, the fact that information can have an arbitrary relation to materiality infects the commodity form itself. Property is no longer a thing. Whole new relations of production have to be concocted to canalize information as a force of production into some new exploitative economy, one now based in the first instance on asymmetries of information. The “business model” of any contemporary corporation is to extract surplus information from both labor and non-labor.

    So it might be timely to think about what information actually is. How it came to be. How it is ideological, and yet like all ideologies, actual as well. A formal force in the world. Marx got as far as thinking through the implications of thermodynamics for a low theory from the labor point of view. But information did not even exist in his time in the sense we mean the word now, and in the way it works now. So we have to reopen theory’s dialog with other ways of knowing and acting in the world, in order to understand information.

    AG: Or as you sometimes ask: what if we’re no longer living in capitalism, but instead living in something much worse? I’m thinking of how you gave a name to the “Carbon Liberation Front.” Is capitalism more avant-garde than the avant-garde?

    MW: Yes, one might argue that this is a new mode of production: not capitalism but worse. “Not capitalism but better” is a quadrant of ideological space already covered by “the post-industrial” and other cold war intellectual products. But I thought “not capitalism but worse” was worth exploring. People who think this is capitalism have very impoverished resources for thinking historically. Either it is transformed into communism–and good luck with that–or capitalism just goes on eternally. Capitalism stays the same in essence, but its appearances change. Modifiers are thus attached: cognitive capitalism, semio-capitalism, platform capitalism, postfordist capitalism, neoliberal capitalism; but these are non-concepts. The thing itself is not really thought through. It is like adding epicycles to an Earth-centric view of the universe.

    Still, I’m reluctant to concede that whatever this mode of production is would then supplant the avant-garde, even if it has now fully ingested the historical avant-gardes. Social formations change through conflict. Struggles over information shape the new mode of production, not the “genius” of the ruling class or some intrinsic élan vital of capital. I associate that with Nick Land’s position. And for Land, a certain kind of Marxism only has itself to blame. Such Marxists treated capital as an unfolding essence, and forgot all about labor’s struggle in and against a nature it only perceives retroactively, through the inhuman prosthesis of technology. They forgot all about the specifics of how the forces of production develop. And while I think we can have concepts about science and technology rather than just empirical descriptions–our shared premise in Excommunication (2013)–I don’t think they are concepts of philosophy.

    Commodification always comes late to the game, wrapping its form around labors of one kind or another. Commodification turns qualitative practices into exchange value. It is pushed and mutated by social forces external to it. One was labor; another was, in fact, the avant-gardes, including the one you and I once belonged to, which tried to do a punk-rock refunctioning of the digital to make a new commons. Well, we lost, like all avant-gardes. But we gave it a try. Like Dada or the Situationists, we were not simply absorbed into the commodity form; it had to adapt and mutate to swallow us. History advances bad side first, as Marx said.

    AG: “Fear of handling shit is a luxury a sewerman cannot necessarily afford.” That old line from Hans Magnus Enzensberger often comes to mind when reading your work. Political thinkers, Marxists among them, have long struggled with questions of perfection and purity if not cleanliness. How to form a more perfect union? How to envision utopia? Fossil fuel pollution has brought on global catastrophe. At the same time one might wish to shun “pristine environments,” as Heather Davis calls them, clean environments like those Roundup Ready fields, which of course are also dirty in a different sense. The clean and the dirty, how do you determine which is which?

    MW: It is a misunderstanding of the utopian strain to think it was always interested in perfection and purity. Maybe Plato’s Republic is like that. Morris’ News From Nowhere isn’t. Parliament is used for storing horse manure, if I remember rightly. And from Wells onwards, including Bogdanov, utopians had to deal with evolutionary time, in which there can be no final and perfect form. JBS Haldane was probably the first to think this on a very, very long time frame, where the human evolves and devolves and some other sentient species evolves in our place.

    So I don’t think utopia is about perfection. And in Fourier, it’s specifically about shit. Compared to the emerging bourgeois novel of his time, Fourier was a realist. He wanted to know who dealt with the poo. Shit and dirt and waste were real problems for him. In short, I don’t see the utopian as “cognitive estrangement” that posits realistic-detailed but ideal worlds. I see the utopian as deeply pragmatic and realistic, particularly about entropy, waste, impurity and so on. And of course the utopias all came true and are all more or less functional. Not true as representations. The details look different. But true as diagrams. We live in them as we propose new ones.

    AG: Thinking more about utopia, I wonder if you have thoughts on Fredric Jameson’s recent piece on the “universal army”? He’s also someone who refuses to build grand systems; yet today he offers a modest proposal for how to build communism in America. Indeed from a certain perverse point of view the U.S. is already an advanced socialist economy, given the size of the military and its socialist or quasi-socialist organization (single-payer health care, job security, pension, subsidized education, etc.). Is the army the jobs program we always needed?

    MW: I’m fond of counter-intuitive ways of categorizing or narrating things. I think it is worth arguing that the post-war American economy was successful because of socialism. Not “socialism” in some ideal or perfect dream-form, but socialism as a practical, existing set of social organizations. Certainly, the great technical advances mostly came from the socialized science and engineering of the war effort, for example. The capitalist part of the economy built Fordism because there was a great reservoir of socialization to back it up, from education to highway-construction. Capitalism is one of the affordances of socialism, not vice-versa.

    The kind of crash-course socialization of science and labor that made D-Day possible might also be the only way we’re going to do anything about climate disruption. It is an astonishing story, how the allies built artificial harbors to make possible the greatest seaborne invasion of all time. At the moment I’m quite interested in the communist, socialist and left-liberal scientists and intellectuals involved in that effort, the ones who came away from D-Day with a strong sense of what socialized labor, science and tech could achieve, because they were the ones doing it. The very people Hayek targeted his theories against had actually achieved what his theory said couldn’t be done.

    AG: Who are you thinking about?

    MW: Hayek’s The Road to Serfdom is an armchair polemic. One of the people he rants against is Conrad Waddington, a significant figure in biology. He coined the concept of epigenetics. He was also involved in wartime operations research and was a figure on the scientific left. Waddington published a wartime book called The Scientific Attitude. It was published by Penguin, who were instrumental at the time in publicizing a progressive scientific politics in connection with the war effort. Waddington’s book is not the best expression of the “attitude” of the times, and so Hayek was picking low-hanging fruit. But the fact is, people like Waddington were involved in an immense effort to deploy a partly socialized economy, which brought together the forces of science and labor, to defeat fascism.

    AG: As you mentioned earlier, Donna Haraway figures prominently in Molecular Red. Her mantra “stay with the trouble” might be your mantra as well. What interests you most about “Haraway’s California,” as you call it?

    MW: It started as making good on a missing footnote to A Hacker Manifesto. A decade or so after that book, I was ready to assess more seriously my relation to Haraway, starting with her “A Manifesto for Cyborgs.” Looking back, that text was already a strong and rhetorically keen refutation of what Richard Barbrook called the “California ideology,” that synthesis of Ayn Rand, hippy effluvia and computation, pumped up on military-industrial-complex money. There’s a tiresome line peddled now that anyone who ever took an interest in technology must be a dupe of Silicon Valley and its “techno-utopianism.” Haraway was so far ahead of that game.

    And there’s a Marxist strand to her work. It’s less visible over time, but it’s there. Partly it stems from Marcuse and the reception of western Marxism. But she is also a reader of Joseph Needham’s synthesis of Darwin, Marx and Whitehead. Needham gets a whole chapter in her first book. That aspect of Haraway is about keeping open the question of how nature and culture are related, how to be careful about importing undetected metaphors from one to the other, and how other, more enabling metaphors are smuggled in. I saw that as a sort of reinvention, out of materials at hand, of Bogdanov’s project.

    Engels had realized that the fortunes of capitalism rested in part on the development of the forces of production. That in turn depended on the sciences. So one needs to know something about the sciences. He tried too hard to fit them into a schematic version of dialectical materialism. But the basic strategy was sound. Science is part of the labor of knowing and producing the world. That became a somewhat neglected tradition in some quarters. One thinks of Lukács’ absurd claim, based on nothing but philosophical arrogance, that science is “reification” and nothing more. The connection between Marxist thought and the sciences was repressed in the West by the Cold War. One of the more interesting exceptions to that lacuna is Haraway, who then so usefully connects it to feminist questions about science, particularly the life sciences.

    AG: Joseph Needham, yes, I’m thinking of another reference too, Norman Brown and Love’s Body, which helps Haraway return to eros and intimacy as a necessary precondition for subjectivity. Haraway is a child of the sixties, to be sure, but while reading Haraway’s defense of canine discipline–that it’s unethical not to discipline your dog–I was reminded how Haraway was never a peacenik in the feel-good hippy sense. She was never interested in conceiving the world without power, like some new-age Pollyanna. Still, it’s somewhat disorienting to read a feminist advocating dominion, if not domination, over other creatures, even if such dominion is guided by health and a sense of “flourishing,” a word that appears a few times in Haraway’s “Companion Species Manifesto.” The left used to write about structure and hierarchy, although today it is more common to write about ethics and care. Does hierarchy still matter? Or is it more important to address the ethical than structure per se?

    MW: This connects back to something we touched on earlier: my instinctive distrust of Badiou. If one reads some Darwin, one really has to give up on the belief in formal or absolute equality as the meaning of communism. That really starts to look like nothing more than a theological residue. (Here I am an acommunist just as one might be an atheist). Indeed, one of Haldane’s books is called The Inequality of Man. If one is a Marxist after Darwin, and Haldane is one of the great figures in that dispersal–a term I prefer to camp or lineage–then one has to confront inequality. And not just as an ethics. What’s a politics, or a political economy even, of non-equality, and not just of “man” but of multi-species being? Particularly if one has thus abandoned the apartheid that separates the human from the non-human, considered as another kind of theological residue. How can we all flourish in our differences?

    Haraway is useful here, as one of the few inheritors of the Marx after Darwin dispersal. Although one might want to connect her also to John Bellamy Foster, not to mention Stephen Jay Gould and others who survived the Cold War by treading very gently where overt political and philosophical affiliations were concerned. Multi-species being can’t really be conceived via formal, abstract, or absolute equality. Particularly if one accepts that domesticated animals have to be thought as part of our multi-species being and not as either part of a pure nature or simply as individual animals. So you end up having to think a political economy, or a nature-culture as Haraway says, of many species together.

    I once took the kids to a zoo that had a collection of domesticated animals that had become endangered: chickens and sheep and so forth. Which made me think, provocatively perhaps, that veganism can’t be ethical, because if one made it a categorical imperative, then all these and many other domesticated species are condemned to extinction. Of course the majority of species may be condemned to extinction at the moment, so this may be the least of our worries, but surely this is the great challenge the Anthropocene throws at theory. Theory’s dominant traditions, which treat some version of the human or the social or the historical as giving rise to concepts that can have an autonomous existence apart from what the earth science and natural sciences describe–all of that is just obsolete. I think we have to start over from elsewhere in the archive, as existing critical theory owes too much to an a priori separation of culture from nature.

    Latour is unfortunately right about that, to the extent that one considers our impoverished, Cold-War deformed inheritance from the archive as in any way representative of what Marxism and critical theory really have to offer. But Latour would steer us back to theology by another path, a post-Catholic one, a sophisticated one in which “all things bright and beautiful” are equally divine. And Haraway participates a bit in that too, even as she resists the somewhat providential celebration of Gaia in Latour or Stengers. Her world is more tentacular. For tactical reasons I have offered something of a détournement of Haraway, pushing her off that path and back to Marx, as it were.

    AG: Catholic indeed. And Haraway herself doesn’t hide her own Catholic formation. Another way to stay with the trouble? I take it you are fairly skeptical of the whole Christian turn in recent theory, Badiou’s Saint Paul, Zizek’s Book of Job, Laruelle’s Christ, Agamben’s theodicy, etc?

    MW: I take the theological turn to be a covert admission of exhaustion. A certain kind of philosophy can no longer stand on its own. But rather than go backwards to theology, I wanted to go forward. What if some of our received ideas about the sciences are simply out of date? How does climate science work as a simulation science? That way one gets away from the transcendent God lurking in the theological turn. But then various flavors of an immanent God re-emerge, whether it be in the so-called new materialism, in speculative realism, or in actor-network theory. There again Haraway is useful, because she consistently takes a hard line against revivals of vitalism, for which Deleuze should cop some of the blame.

    But then as Bogdanov might point out, one just generalizes from the metaphors one inherits, the metaphors that give shape to one’s labors, inflating them into a worldview. Bogdanov’s observation is as true of me as of anyone else. In my case, it’s third generation protestant atheism, with an understanding of labor that comes from experiencing the transition from analog to digital, and with an education marked by immersion in the tail end of the old labor movement, the new social movements, and so on. The key is not to take one’s particular worldview generated from one’s particular experience as a universal valid for everyone, while still maintaining its universality for one’s particular experience. It may have component parts that work for others; others may have parts that work for me. So, fine, others will find theo-critical theory explains their world. It can be locally useful. The bigger problem is an organization of labor that can share and mix and coexist using all such worldviews as can be considered functional for life.

    AG: Haraway is a Westerner as well, a Colorado kid who moved to California. That hadn’t registered for me in the past, but it clicked after reading some of her recent interviews. She has a bit of country outlaw in her. At the same time she’s quick to acknowledge the bloody history of manifest destiny and settler colonialism; the real world is ideologically messy and that’s not a bad thing, as she might say. I wonder if there is an American regionalism at play here? For example, this city where we’ve both migrated to, New York, might be the center of the art world, and perhaps the center of finance capital, but it’s never had a monopoly on intellectual production, far from it. Do you still believe in regional knowledges? Or have globalization and the Internet done away with all of that?

     MW: One has to look at two things there. As far as history goes: how do the trans-regional relations of war, trade and migration retroactively produce regionalism? If one tracks not just the settling of people and their moving but also the movement of commodities and information, one ends up with a much less contained sense of place. But then that history rests on top of a geography, even a geology. To really understand place means to abandon romantic notions of a people and their place. Place is a non-human thing, made on very large scales and times.

    Of course one’s answer to the question of the regional and the global depends not just on which region but which “global” one is from. I was very influenced by the Australian Marxist art historian Bernard Smith’s work, particularly European Vision and the South Pacific. Smith argues that James Cook’s voyages in the Pacific yielded information that exceeded the categories in which English scientists and intellectuals expected to put it. The Great Chain of Being fell apart, and in its place went a more flexible relationship between category and content, a relationship that holds for geology, flora, fauna and “native” peoples. That book is a neglected masterpiece, revealing the significance of the 18th Century naval vector.

    I was also influenced by Eric Michaels, Stephen Muecke and others who were breaking with anthropological studies of Aboriginality. They were interested in particular Aboriginal practices of communication and philosophy, respectively. It is interesting how certain Aboriginal peoples came to treat information as value to be shared in very selective gift practices. And how those practices could have a kind of error correction procedure that seems like it has worked pretty well for some thousands of years. Then there was Vivien Johnson’s work on secret, sacred Aboriginal information that was being used as “designs” on tea-towels and the like, because there was no “copyright” on it. That really broke open for me all the assumptions of the postmodern era, of appropriation and unoriginality. The postmodern worldview was completely incompatible with this other, indigenous one. I became less interested in differences against the totality and more interested in totalities against each other.

    AG: A revealing comment, particularly since I so closely associate you with appropriation and unoriginality–not that your work is unoriginal! I’m thinking of détournement, and your affection for Situationist tactics of all kinds. “Plagiarism is necessary. Progress demands it.” Debord said it, but so have you. Or am I wrong? Have you soured on appropriation?

    MW: The western desert Aboriginal world Michaels studies was as modern as any other, but it was based on oral transmission. His whole project was to introduce video within the existing cultural forms, to strengthen rather than obliterate them. It was a great lesson in the possibility that, even with standard media tech, maybe someone could build really different kinds of relations. Questions of copy, original, ownership, asymmetry and so forth could play out very differently. Which was also one of the lessons of Situationist theory and practice: that the ownership of information was a late and only partial accretion on top of quite other practices–of which détournement was only one. Détournement did, however, target what Marx took to be crucial, the property question.

    Détournement was the dialectical complement to spectacle in Debord. It was the means to abolish private property at least in the sphere of information. I developed that into a class analysis in A Hacker Manifesto. What intellectual property obscures is the difference between being the class that makes information and the class that owns it.

    But at the time it was not entirely clear how détournement was to be recuperated. There was indeed a social movement in all but name that freed information from property, but the leading edge of the vectoralist class worked out how to adapt. The vectoralist class built vectors for precisely that free information, while retaining the keys for themselves. They said, in essence: You can have the data, but not the meta-data. You can have the information of your most personal desires, but in exchange we will retain the totality of those desires. So one must shift from being data punks to meta-data punks in order to continue the struggle in and against a mode of production based not in the first instance on surplus value, but on asymmetries of information.

    AG: Yes of course I agree–but all data is meta-data! We know this from examining the nested structure of the protocols. It’s meta-data all the way down (and up too). That’s something that I never understood about the Snowden revelations: skim people’s data, no one cares; but call it a theft of “meta” data, and people start to balk. The meta seems scarier, or somehow more real; it’s a very modern problem. Or am I being overly pedantic? Maybe these kinds of technical analyses of data infrastructure are disconnected from everyday politics?

    MW: Well, this might be what the slogan “meta-data punk” is about. Or in old-fashioned post-structuralist terms, you could think of it as reversing the relation between data and meta-data, and making meta-data primary and data derivative. But in any case I think understanding how data infrastructure actually works would be an excellent project, to which your own Protocol was a signal contribution. Data infrastructure is now a key component of the forces of production, which have already pushed the mode of production into some weird new shape. So rather than do the “quantitative” digital humanities we might do the “qualitative” digital humanities, which is about understanding phenomena at the level of form rather than content. (And here our old friend “form” returns again…)

    I’m surprised anyone was surprised by Snowden. You may remember in the ’90s there was a story going around nettime.org and rhizome.org about Project Echelon, an inter-agency project to scrape, archive and search everybody’s emails, news of which allegedly leaked in New Zealand. I have no idea if that story was true, and it doesn’t matter anyway. All that matters is that it was technically feasible at the time. With the rapid drop in cost of digital storage, one could expect that eventually all signals of all kinds would be collected, archived and searched. If a technology is technically feasible, one should assume the security state has the technology at their disposal.

    One should assume the ruling class has it as well, although people seem less concerned about that. It makes sense to assume that all major corporations are now in the “meta-data business.” On the other hand, we’re no longer simply individual subjects to be disciplined until we internalize the law. We’re not even split subjects caught between drive and desire. We are, as Hiroki Azuma says, “database animals.” Power is now about seeking advantage from asymmetries of information in a volatile and noisy world, in which the human is just another random bag of attributes resonant in disparate fields of information.

    AG: Also, any indictment of the NSA entails an indictment of Web 2.0 and social media companies. Google and the NSA perform the same basic function: they both build secondary graphs from primary ones (ours). And they both do it under dubious conditions of “permission,” even if Google still has the public’s trust if not always its confidence. It’s a PR game; the NSA is bad at it, but Google is better, at least so far. One of the key reasons why it has been so hard to critique, much less curtail, the NSA–hard psychically, I mean–is that people implicitly understand the hypocrisy in slamming the NSA while loving Twitter. The result is that both organizations get a pass. The theme is similar to my previous question: what happens when an argument bumps into a desire? We used to solve that problem via critique. But today critique is passé.

    I love your point about asymmetries of information. One of the great myths of distributed networks is that they are “smooth” or “flat” or otherwise equitable. In reality, they are nothing but an accumulation of asymmetries, of difference itself congealed into infrastructure. Is this what you meant, in A Hacker Manifesto, by vectors and the vectoral class?

    MW: One of the reasons to spend so much time writing about Bogdanov’s Proletkult, the Situationists, and Haraway and her kith is that I think these were examples of how to be critical and inventive at the same time. Bogdanov thought that ideologies–or what he preferred to call worldviews–were an inevitable substitution outwards from our forms of organization to assumptions regarding the workings of the world. But he also thought worldviews were what motivated people emotionally to work together. (He was already doing a bit of affect theory!) So it is a matter of inventing the worldview best suited to our organizational practices while at the same time maintaining a critique of those that don’t grow organically out of our labors.

    So what’s the worldview of people who don’t do labor in the strict sense? They don’t work against the clock, filling a form with content. Their job is to design the form. There may still be deadlines, but there isn’t an assembly line. What they produce isn’t actually a product. It is a “unique” arrangement of information–unique enough to be considered a distinct piece of property under intellectual property law. If what they came up with is very valuable, they probably won’t get most of the value out of it, even if they retain ownership, as they own just the intellectual property, not the means of production. What class is this? I called them the hacker class, but it involves anyone whose efforts produce intellectual property.

    In retrospect, A Hacker Manifesto leaned more on an understanding of law, something superstructural, than on understanding what had happened to the forces of production. I’m a law school drop-out, but I read my Evgeny Pashukanis and critical legal theory. I sensed that the rapid evolution of intellectual property law in the late Twentieth Century probably corresponded to significant changes in the mode of production. It relied more and more on a new kind of effort that wasn’t quite labor, that of the hacker class. It gave rise to a new class of owners of the means of production, what I called the vectoralist class.

    “Vector” I got from Paul Virilio. It is a shorthand way of describing technical relations that have specific affordances. In geometry, a vector is a line of fixed length but of no fixed position. So it is a kind of “technological constructionism,” in that a given techne does indeed have a determining form, but also some openness as well. Critical media theory is about understanding both at the same time. A Hacker Manifesto rested on this very general theory of the technical relation. And regarding the openness of a given vector, one can ask: what shuts down any particular affordances that may exist? The information vector, product of a particular historical moment in the development of the forces of production, reveals an ontological property of information: that it can exist without scarcity.

    The hacker class is producing something that for the first time can really be common, while the vectoralist class has to stuff it back into the property form to survive, by means of legal and technical coercion. Or, it can concede the battle, and let a portion of information flow freely, but win the war through control of the infrastructure in which it is shared. That’s about where we are now: the commodification of the information produced by non-labor as free shared activity. Just as capitalism is an affordance of socialism, so vectoral commodified information is an affordance of the abstracted gift practices of the information commons.

    AG: I remember reading versions of A Hacker Manifesto that you would post to the Nettime email list, and getting very jealous that I hadn’t written it! There’s a lot more I want to ask you about, but let’s skip to, why not, the chapter on “Revolt.” There you contrast a “representational” politics with an “expressive” politics, the latter being a stateless politics or an escape from politics as such. What does that mean exactly, and have your thoughts on stateless politics evolved at all in the intervening years?

     MW: It turns out something similar to what I called an expressive post-politics was being thought as exodus or self-valorization by the Italians. I never liked their somewhat idealist take on “general intellect” and “immaterial labor,” but it was interesting to see these ideas of forms of organization outside politics taking off there. In General Intellects (forthcoming) I look at both theorists of exodus and hegemony (or “representation”). The shorthand would be that both are going on simultaneously, but perhaps the belief in the political is evaporating. Another stage in the endless rediscovery of the fact that god is dead. It is no accident that attempts to revive political theory overlap with the theological turn in critical theory. Both illustrate a longing for what’s passing.

    Starting in Virtual Geography I was interested in the vector as something that distributes information, globally but not equally, and which gives rise to turbulence and noise. One of my case studies was Tiananmen square in 1989, a sometimes overlooked precursor to the “movement of the squares.” Another was the Black Monday stock market crash, again a precursor. I think I was already sensing in a partial way the rise of a new vectoral infrastructure that bypasses the old envelopes of the state form. The new infrastructure both erodes the old state form, and also paradoxically allows it to return in a hard and reactive way.

    In Gamer Theory I was thinking of this as a movement from topography to topology, where geo-strategic and geo-commodified space can no longer be mapped on a plane, but rather, as in topology, they appear more like vectors that can bend space and connect points, points which on the surface of a planar Earth appear far apart. (This idea has also been picked up by Benjamin Bratton.) I think we’re a long way from being able to think topological space, where points on the surface of the Earth can be connected and disconnected. It is quite different to any kind of political conception of power. It is what I call vectoral power. We still have simulations of politics, or for that matter culture, but perhaps they are things of the past now. This is of course not to be optimistic about technology: what replaced them is probably worse.

    AG: We’ve been having this dialog over email for a few days now, but today is November 9, 2016 and Trump is president-elect. As a final question, what are your thoughts on American fascism? It’s an old theme, in fact…

    MW: It’s curious that the political categories of liberal, conservative and so forth are treated as trans-historical, but you are not supposed to use the category of fascism outside of a specific historical context. There are self-described neoconservatives, and even supposed Marxists have taken the neoliberals at their word and used their choice of name without much reflection, calling this “neoliberal capitalism.” But somehow there’s resistance to talking about fascism outside of its historical context. I have often been waved off as hysterical for wanting to talk about it as a living, present term.

    Even if it is admitted to the contemporary lexicon, it is treated as something exceptional. Maybe we should treat it not as the exception but the norm. What needs explaining is not fascism but its absence. What kinds of popular front movements can restrain it, and for how long? Or, we could see it as a “first world” variant of the normal colonial state, and even of many variants of what Achille Mbembe calls the “postcolony.”

    Further along those lines: maybe fascism is what happens when the ruling class really wins. When it no longer faces an opponent in whose struggle against it the ruling class can at least recognize itself. And when it no longer knows itself, it can only discover itself again through excess, opulence, vanity, self-regard. Our ruling class of today is like that. They not only want us to recognize their business acumen, but also that they are thought leaders and taste makers and moral exemplars. They want to occupy the whole field of mythic avatars. But our recognition doesn’t quite do the trick because we’re just nobodies. So they heap more glory on themselves and more violence on someone else.

    Maybe any regime of power is necessarily one of misrecognition. All it can perceive is shaped by its own struggles. But the fascist regime, the default setting of modernity and its successors, is doubly so. It can recognize neither its real enemies nor itself. There is some small irony in an election being won because Florida voted Republican, when the Republican plan to accelerate the shit out of climate disruption may start putting Florida under water in our lifetime. I’m reminded of a line from Cool Hand Luke: “What we have here is a failure to communicate.” Fascism keeps punching away at the other but never finds even its own interests in the process. Hence its obsession with poll numbers and data surveillance. The ruling class keeps heaping up data about us, but because it has expunged our negativity from its perceptual field, it cannot find itself mediated by any resistance.

     

  • Robert T. Tally Jr. — The Southern Phoenix Triumphant: Richard Weaver, or, the Origins of Contemporary U.S. Conservatism


    by Robert T. Tally Jr. 

    This essay has been peer-reviewed by the boundary 2 editorial board. 

    I

    The 1950 U.S. Senate race in North Carolina was fiercely contested, featuring what even then was understood by many to be the opposed ideological trajectories of Southern politics: that of a seemingly progressive, “New South,” characterized by its support for modernization, industry, and above all civil rights (or, at least, improvements to a system of racial inequality) on the one hand, and that of a profoundly conservative tradition resistant to such change, particularly with respect to civil rights, on the other. The unelected incumbent, Frank Porter Graham, appointed by the governor after the death of Senator J. Melville Broughton a year earlier, was notoriously progressive, the former president of the University of North Carolina and a proponent of desegregation. The challenger was Willis Smith, mentor to later longtime conservative senator Jesse Helms, who was himself an active campaigner for Smith in this race. At the time, this election was viewed as a turning point in North Carolinian, and perhaps even Southern, politics, so starkly was the ideological division drawn. The primary election—this being 1950, the Democratic primary was, in effect, the election, since no Republican nominee could possibly offer meaningful competition in November—was remarkably vitriolic, as Smith supporters played on the fears of bigots at every turn. (For example, one widely disseminated pro-Smith flyer announced “Frank Graham Favors Mingling of the Races.”) On May 26, a Graham supporter, the idealistic young mayor of Fayetteville, took to the airwaves to castigate the Smith campaign for its repulsive rhetoric and divisive tactics:

    Where the campaign should have been based on principles, they have attempted to assault personalities. Where the people needed light, they have brought a great darkness. Where they should have debated, they have debased. … Where reason was needed, they have goaded emotion. Where they should have invoked inspiration, they have whistled for the hounds of hate.

    Decades before “dog-whistle politics” became a de facto political strategy throughout the South (and elsewhere, of course), J. O. Tally Jr. lamented the motives, and no doubt the effectiveness, of such an approach, which had made this the “most bitter, most unethical in North Carolina’s modern history.”[1]

    That was my grandfather, then an ambitious, 29-year-old lawyer and politician, who must have seen himself as fairly representative of a New South intellectual and statesman. A graduate of Duke University with a law degree from Harvard, Joe Tally had returned from distinguished overseas service in the navy during World War II to teach law at Wake Forest University and to practice at the family firm before running for office in his hometown. His own career in electoral politics ended with a failed 1952 run for Congress, during which his moderate views on segregation likely amounted to an unpardonable sin for many voters in southeastern North Carolina, and he settled for alternative forms of civic and professional service, such as the Kiwanis Club, of which he later became international president. Others of Tally’s political circle had better fortunes with the voters. Terry Sanford, for example, went on to become the governor of North Carolina, then long-time president of Duke University, before entering the U.S. Senate in 1987 as perhaps the most liberal of the Southern senators. (Ah, to recall the time when an Al Gore was considered quite conservative!) Tally’s ex-wife, my grandmother Lura S. Tally, went on to serve five terms in the N.C. House and six in the Senate from 1973 to 1994, where she represented that liberal wing of the old Democratic Party, promoting legislation especially in support of elementary education, the environment, and the state’s Museum of Natural History. However, during the same period, former Smith acolyte Jesse Helms carried that banner into the U.S. Senate in 1973, immediately becoming one of the most conservative members of Congress, hawkish in foreign policy, parsimonious in his domestic policy, and ever ready to protect the public from unsavory art in his attacks on the National Endowment for the Arts.
Perhaps it is part of the legacy of the 1950 Senate campaign, but North Carolina had always seemed rather bipolar in its politics, often maintaining both a far-right-wing and a relatively liberal contingent in the U.S. Congress. That is, until recently. In the past decade, North Carolina, like all of the South and much of the country, has lurched ever rightward in politics and policies. Today, the spirit of the old conservatives of Willis Smith’s era reigns triumphant.

    The same year that the Smith campaign allegedly “whistled for the hounds of hate” in order to secure an election over a liberal vanguard dead set on undermining traditional Southern values, another native North Carolinian lamented that those espousing belief in such values had been forced out of the South. Speaking of the paradoxical fact that so many Southern Agrarians (including himself) had fled from their ancestral homeland to the urban North, there colonizing institutions like the University of Chicago, Richard M. Weaver proclaimed them “Agrarians in exile,” who had been rendered “homeless,” for “[t]he South no longer had a place for them, and flight to the North but completed an alienation long in progress.” Weaver explained that “the South has not shown much real capacity to fight modernism,” and added that “a large part of it is eager to succumb.”[2] For Weaver, the great Agrarians of the I’ll Take My Stand generation had been compelled to retreat in the face of those, like my grandparents, who in their “disloyalty” to “their section” of the United States exhibited “the disintegrative effects of modern liberalism.”[3] Contrary to appearances, Weaver found that the Southern values which undergirded his preferred form of cultural and political conservatism were under assault, and perhaps even waning, in the South. The baleful liberalism he saw as all but indomitable in the industrial North and Midwest was, in Weaver’s view, ineluctably encroaching on the sacred soil of the former Confederacy.

    It is strange to look upon this scene from the vantage of the present. With the defeat of Senator Mary Landrieu in Louisiana’s December 6, 2014, run-off, there were no longer any Democrats from the Deep South in the U.S. Senate. And, as the 114th Congress convened in 2015, the U.S. House of Representatives contained no white Democrats from the Deep South, this for the first time in American history. Of course, the once “solid South” has been steadily trending ever more toward the Republicans since Brown v. Board of Education, Governor Wallace’s “segregation forever,” and Richard Nixon’s notorious Southern strategy of the late 1960s. Native conservatism, gerrymandering, demographics, racial attitudes, and other factors have come into play, and the shift is therefore not wholly surprising, but the domination of the states of the former Confederacy by the Republican Party represents a sea-change in U.S. electoral politics. Furthermore, the hegemony of a certain Southern-styled conservatism within the Republican Party and, increasingly, within social, political, and cultural conservatism more generally marks a decisive movement away not only from the mid-century liberalism against which many Agrarians like Weaver railed, but also from the worldly neoconservatives like the elder President Bush, whose embrace of a “new world order” elicited such fear and loathing from members of his own party in the early 1990s. The dominant strain of twenty-first-century political discourse in the United States is thus a variation on a sort of neo-Confederate, anti-modernist theme of the Agrarians,[4] or, rather, of Weaver, perhaps their greatest philosophical champion.

    In this essay, I want to revisit the ideas of this mid-twentieth-century conservative theorist in an attempt to shed light on the origins of this distinctively American brand of conservatism in the twenty-first century. Weaver’s agrarian conservatism today seems at once quaint, even old-fashioned, and disturbingly timely, as the rhetorical and intellectual force of his ideas seems all-too-real in the present social and political situation in the United States. Weaver’s mythic vision of the South, ironically, has come to symbolize the nation as a whole, at least from the perspective of many of the most influential conservative politicians and policy-makers today. As a result of what might be called the australization of American politics in recent years—that is, a political worldview increasingly coded according to identifiably “Southern” themes and icons, not to mention the growing influence of Southern and Southwestern politicians at the level of national government—we can see more clearly now the degree to which Weaver’s seemingly eccentric, often fantastic views have become not only mainstream, but perhaps even taken for granted, in 2015.[5] The “Southern Phoenix,” celebrated by Weaver for its ability to survive its own immolation and re-emerge from the ashes, now appears triumphant to a degree that the original Fugitives and Vanderbilt Agrarians could not have dreamed possible. And, as is so often the case when fantasies come to life, the result may be more frightening than even their worst nightmares foreboded.

    II

    Outside of certain tightly circumscribed spaces of formally conservative thought such as that of the Liberty Fund, Richard M. Weaver may no longer be a household name. However, his writings and his legacy have been profoundly influential on conservative thinking, and he has been viewed as a sort of founding father or patron saint of the movement. The Heritage Foundation, for example, adopted the title of his totemic, 1948 critique of modern industrial society, Ideas Have Consequences, as its official motto when founded in 1973. A devoted student, literally and metaphorically, of the Southern Agrarians of the I’ll Take My Stand generation, Weaver embraced a certain “lost cause” view of the old Confederacy that informed his wide-ranging criticism of twentieth-century American and Western civilization. He viewed the antebellum South as the final flourishing of an idealized feudalism, doomed to fail as the forces of industry, science, and technology, together with ideological liberalism, secularism, and “equalitarianism,” undermined and ultimately destroyed its foundations. Weaver’s critique of modernity, like J. R. R. Tolkien’s, thus took the form of an almost fairy-story approach to history, in which a mythic past functioned as an exemplary model and as a foil to the lurid spectacle of the present cultural configuration, a balefully “modern” society characterized especially by its secularism, its embrace of scientific rationality, and its ineluctable process of industrialization. Weaver’s jeremiad is thus both dated, redolent of a certain pervasive interwar and postwar malaise, and enduring, as his rhetoric remains audible in social and political discourse today, particularly in all those election-year panegyrics to a “simpler” America, a paradisiacal place just over the temporal horizon, now most known to us by its mourned absence.

    Weaver was born in Asheville, North Carolina, in 1910, but he moved to Lexington, Kentucky, as a small child, where he grew up “in the fine ‘bluegrass’ country,” as Donald Davidson noted,[6] and later received his bachelor’s degree from the University of Kentucky. In his autobiographical essay, pointedly titled “Up from Liberalism,” Weaver described the faculty there as “mostly earnest souls from the Middle Western universities, and many of them […] were, with or without knowing it, social democrats.”[7] This information is apparently supplied in order to explain Weaver’s own brief flirtation with the American Socialist Party upon graduation in 1932. Weaver then enrolled in graduate school at Vanderbilt, birthplace of I’ll Take My Stand in 1930 and thus ground zero of the literary or cultural movement by then known simply as “the Agrarians.” At Vanderbilt, Weaver studied directly under John Crowe Ransom, to whom The Southern Tradition at Bay was later dedicated, and he wrote a master’s thesis (“The Revolt Against Humanism: A Study of the New Critical Temper”), which criticized the “new” humanism of Irving Babbitt and Paul Elmer More, among others.[8] After receiving his M.A. degree, Weaver briefly taught at Texas A&M, but was repelled by its “rampant philistinism, abetted by technology, large-scale organization, and a complacent acceptance of success as the goal of life.”[9] Weaver entered graduate school at Louisiana State University, where his teachers included two other giants of the Agrarian and American literary traditions, Robert Penn Warren and Cleanth Brooks. 
The latter served as director for Weaver’s dissertation, a lengthy investigation and celebration of post-Civil War Southern literature and culture, evocatively (and provocatively) titled “The Confederate South, 1865–1910: A Study in the Survival of a Mind and Culture.” This book was released posthumously in 1968 as The Southern Tradition at Bay: A History of Postbellum Thought, and it may well be considered Weaver’s magnum opus, as I will discuss further below. After receiving his Ph.D., Weaver taught briefly at N.C. State University, before embracing “exile” at the University of Chicago, where he spent the remainder of his professional life, not counting the summers during which he returned to western North Carolina, apparently to replenish his reserves of authentic agrarian experience and to recapture the “lost capacity for wonder and enchantment.”[10] As it happens, Weaver’s celebratory vision of Southern culture comports all too well with that of a fantasy world.

    Legend has it that the virulent anti-modernist eschewed such new-fangled technology as the tractor, yet he seemed to have little compunction about enjoying the convenience of the railroad and other amenities made possible by modern industrial societies. “Every spring, as soon as the last term paper was graded, he traveled by train to Weaverville [North Carolina, just north of Asheville], where he spent summers writing essays and books and plowing his patch of land with only the help of a mule-driven harness. Tractors, airplanes, automobiles, radios (and certainly television)—none of these gadgets of modern life were for Richard Weaver,” writes Joseph Scotchie, admiringly.[11] Yet Weaver also speaks of drinking coffee with pleasure, well aware that Appalachia is hardly known for cultivating that crop. As with so much of the fantastic critique of modernity by reactionaries, there is an unexamined (perhaps even unseen) principle of selection that allows one to choose which parts of the modern world to tacitly accept, and which to ostentatiously jettison.

    III

    Weaver’s most significant and influential work published during his lifetime is undoubtedly Ideas Have Consequences, a title given by his editor at the University of Chicago Press but which Weaver had intended to call The Fearful Descent.[12] It is one of only three books Weaver himself saw into print; the others are The Ethics of Rhetoric (1953) and a textbook titled simply Composition: A Course in Writing and Rhetoric (1957). Weaver recalled that Ideas Have Consequences originated in his own rather despondent musings about the state of Western Civilization in the waning months of World War II, as he experienced “progressive disillusionment” over the way the war had been conducted, and he began to wonder “whether it would not be possible to deduce, from fundamental causes, the fallacies of modern life and thinking that had produced this holocaust and would insure others.”[13] Weaver’s bold, perhaps bizarre, premise was that the civilizational crisis in the twentieth century could be traced to a much earlier philosophical turning point in the trajectory of Western thought, namely the proto-scientific nominalism of William of Occam. Weaver draws a direct line from Occam’s Razor to the most deleterious effects (in his view) of modern empiricism, materialism, and egalitarianism.

    For Weaver, humanity took a wrong turn in the fourteenth century when it allegedly embraced Occam’s Razor as the guiding principle of all logical inquiry, thus condemning mankind to a sort of secular, narrow, bean-counting approach to both the natural and social worlds. Referring obliquely to Macbeth’s encounter with the weird sisters in Shakespeare’s tragedy, Weaver asserts that

    Western man made an evil decision, which has become the efficient and final cause of other evil decisions. Have we forgotten our encounter with the witches on the heath? It occurred in the late fourteenth century, and what the witches said to the protagonist of this drama was that man could realize himself more fully if he would only abandon his belief in the existence of transcendentals. The powers of darkness were working subtly, as always, and they couched this proposition in the seemingly innocent form of an attack upon universals. The defeat of logical realism in the great medieval debate was the crucial event in the history of Western culture; from this flowed those acts which issue now in modern decadence.[14]

    What follows from this is a lengthy, somewhat disjointed analysis of “the dissolution of the West,” which includes not only critiques of philosophical tendencies and declining moral codes, but also attacks on egotism in art, jazz music, and other forms of popular entertainment. It is almost a right-wing version of the near-contemporaneous Dialectic of Enlightenment, except that Weaver would not have imagined “Enlightenment” to have suggested anything other than “disaster triumphant” to begin with, and Horkheimer and Adorno were all too wary of the latent and manifest significance of the jargon of authenticity as enunciated by writers like Weaver.[15]

    Although Ideas Have Consequences is not overtly “Southern” in any way, Weaver’s medievalism, which was developed not according to any deeply philological study of premodern texts (à la Tolkien) but rather from his own sense of that late flowering of chivalry in the antebellum South, indicates the degree to which his discussion of the West’s decline is actually tied to his view of the lost cause of the Confederacy. The first six chapters of Ideas Have Consequences constitute a fairly scattershot series of observations on “the various stages of modern man’s descent into chaos,” which began with his having yielded to materialism in the fourteenth century, and which in turn paved the way for the “egotism and social anarchy of the present world.”[16] The final three chapters, by contrast, are intended as restorative. That is, in them Weaver attempts to delineate the ways that modern man might resist these tendencies, reversing the movement of history, and reaping the rewards of a legacy that would presumably have flourished had only the pre-Occam metaphysical tendency ultimately prevailed. In a 1957 essay in the National Review, Weaver claimed that, contrary to the assertions of liberals, the conservatives were not so much in favor of “turning the clocks back” as “setting the clocks right.”[17] Not surprisingly, Weaver’s three prescriptions in Ideas Have Consequences would neatly align with the fantastic, medieval, or feudal system he had imagined as the dominant form of social organization in the antebellum South, although he does not highlight his regional allegiance in this, a work purportedly devoted to the study of (Western) civilization as a whole.

    The first is the principle of private property, which Weaver takes to be “the last metaphysical right” available to modern man. That is, while “the ordinances of religion, the prerogatives of sex and of vocation” were “swept away by materialism” (specifically, the Reformation, changing social values, and so on), “the relationship of a man to his own has until the present largely escaped attack.”[18] Weaver calls the right to private property a “metaphysical right” because “it does not depend on social usefulness. […] It is a self-justifying right, which until lately was not called upon to show in the forum how its ‘services’ warranted its continuance in a state dedicated to collective well-being.”[19] Private property, which Weaver likens to “the philosophical concept of substance,” is depicted as providing a foundation for a renewed sense of self and being in the world. The second principle is “the power of the word”: “After securing a place in the world from which to fight, we should turn our attention to the matter of language.”[20] Weaver offers a critique of semantics as itself simply a form of nominalism, arguing that an education in poetics and rhetoric is necessary to reclaim one’s connection to the absolute while remaining critical of the abuses of language in modern culture. 
Finally, Weaver concludes with a chapter on “piety and justice,” in which he argues that piety, “a discipline of the will through respect,” makes justice possible by allowing man to transcend egotism with respect to three things: nature, other people, and the past.[21] Fundamentally, for Weaver, this piety issues from a chivalric tradition that he imagines as the only real hope for the reformation of a twentieth century blasted by war, spiritually desolate, and (he does not shrink from using the term) “evil.” What is needed, Weaver concludes in the book’s final line, is “a passionate reaction, like that which flowered in the chivalry and spirituality of the Middle Ages.”[22]

    As it happened, there was a place in the United States which had previously held, and in 1948 perhaps still maintained, this medieval worldview. Weaver’s beloved South, even though it was under siege from without by the forces of modernity and in peril from within by a generation of would-be modernizers, retained the virtues of an evanescing feudal tradition, which might somehow be recovered and brought into the service of civilization itself. Indeed, Weaver’s first book-length work, which only appeared in print after his death, was an elaborate examination and strident defense of this chivalric culture that once flourished beneath the Mason and Dixon line. If only its message could be distilled and disseminated, this Southern tradition might redeem the entirety of the West.

    IV

    The Southern Tradition at Bay occupies a unique and important place in Weaver’s corpus. Based on his doctoral thesis but published five years after his death, the book can be read as being representative of his “early” thinking on the subject and as a sort of summa of his entire literary and philosophical program at the same time. Many of the ideas that Weaver here identifies as Southern are clearly connected to those he celebrates in Ideas Have Consequences. For example, Weaver’s elaboration of the “mind” of the Old South focused on four distinctive but interrelated characteristics: the feudal system, the code of chivalry, the education of the gentleman, and the older religiousness, by which Weaver meant a non-creedal religiosity. Combined, these four factors distinguished the unique culture of the “section,” clearly differentiating its heritage from that of other parts of the United States.[23]

    Weaver’s medievalism, as I mentioned before, is not rooted in the formal study of the history, philology, or philosophy of the European Middle Ages, although he draws upon certain imagery from its time and place. One might argue that Weaver’s project is literally quixotic, inasmuch as he figuratively dons the rusty armor of a bygone age to tilt at windmills which he imagines to be giants, all in an effort “in this iron age of ours to revive the age of gold or, as it is generally called, the golden age.”[24] Weaver’s tone is simultaneously elegiac and recalcitrant, mourning the lost cause or the waning of a glorious past and ardently defending its values in the present, fallen state of the world. Methodologically, Weaver’s approach is to selectively gather then-contemporary accounts, including public proclamations and individual diaries—or, often, a combination of the two, in the form of published memoirs—as well as more recent historical studies, then add his own assessments of their currency (i.e., in 1943) as evidence of an enduring, twentieth century “Mind of the South.”[25] Weaver somewhat disingenuously cautions that,

    In presenting evidence that this is the traditional mind of the South, I am letting the contemporaries speak. They will seldom offer whole philosophies, and sometimes the trend of thought is clear only in the light of context; yet together they express the mind of a religious agrarian order in struggle against the forces of modernism.[26]

    Needless to say, perhaps, such a collective “mind” would likely not be discovered were the historian to cast the nets of his research more widely.[27] By identifying only those “true” Southerners whose opinions can thereafter be identified as authentic, Weaver anticipates all of our current politicians and pundits who seem to be forever deferring to these mythical “real Americans” whose viewpoints are curiously at odds with the actual history of the present. After laying out the feudal heritage which characterizes the mind and culture of the South in the opening chapter, Weaver by turns examines the antebellum and postbellum defense of the Southern way of life, the perspectives of Confederate soldiers and the reminiscences of others during the Civil War (or “the second American Revolution”), the work of selected Southern fiction writers, and then the reformers or internal critics who, in Weaver’s view, effectively managed to take the fight out of the “fighting South.”[28]

    Weaver concedes by the end that “the Old South may indeed be a hall hung with splendid tapestries in which no one would care to live; but from them we can learn something of how to live.”[29] It is a disturbing and prophetic line, suggestive of how much the Southern heritage might be abstracted, idealized, and then transferred to distant places and times. Comparing his own situation to that of a Henry Adams, who, “wearied with the plausibilities of his day, looked for some higher reality in the thirteenth-century synthesis of art and faith,” Weaver imagines that the old Confederacy, with its feudal hierarchies and chivalric cultural values, may yet become a model for the social formations to come. Calling the Old South “the last non-materialist civilization in the Western world,” Weaver concludes:

    It is this refuge of sentiments and values, of spiritual congeniality, of belief in the word, of reverence for symbolism, whose existence haunts the nation. It is damned for its virtues and praised for its faults, and there are those who wish its annihilation. But most revealing of all is the fear that it gestates the revolutionary impulse of our future.[30]

    Behind this elevated rhetoric lies the hoary old dream, indistinct threat, and rebel yell: the South will rise again!

    The title of The Southern Tradition at Bay is provocatively descriptive. Since its purview is the period of American history between 1865 and 1910, following the crushing defeat of the former Confederacy and the disastrous period of Reconstruction—not to mention advances in science, the rise of a more industrial mode of production, and the emergence of modernism in the arts and culture—the study’s elaboration of a cognizable “Southern Tradition” rooted in unreconstructed agrarianism and adherence to the ideals of the old Confederacy is intended to establish it as a preferred counter-tradition to that of the victorious North and to the United States in general. Moreover, the phrase “at bay” is suggestive not of defeat or conquest, but of temporary inconvenience; it refers especially to being momentarily held up, kept at a distance, but by no means out of the game. Such an accomplished rhetor as Weaver would no doubt be aware that the phrase derives from the French abayer, “to bark,” and that it probably referred to dogs that were prevented from approaching further to attack and that were thus relegated to merely barking at their prey. (The image of a group of Southerners barking at an uncomprehending North may be all too appropriate when revisiting I’ll Take My Stand, come to think of it.) In other words, The Southern Tradition at Bay’s title nicely encapsulates two powerful aspects of its argument: that the Southern Tradition exists, present tense, long after its ancien régime was disrupted by war and by modernization; and that it was not ever defeated, much less destroyed, but merely kept in abeyance from the then dominant, though less creditable national culture. Weaver’s vision of the South does not imagine a residual or emergent social formation, to mention Raymond Williams’s well-known formulation,[31] but rather another dominant, yet somehow suppressed or isolated, form which remained in constant tension with the only apparently victorious North. 
Weaver’s mood is sometimes melancholy, befitting his sense of the “lost cause,” but his conviction that the South ought to rise again, whether he believed it was practically feasible or not, is clear throughout.

    Thus, the idea of a distinctively Southern tradition being temporarily held “at bay” suits Weaver’s argument well. However, this was not the original title of the study. When he presented it as his doctoral dissertation at Louisiana State University, where his thesis advisor was Cleanth Brooks,[32] Weaver gave it a much more provocative and politically charged title: “The Confederate South, 1865–1910: A Study in the Survival of a Mind and Culture.” The difference is not particularly subtle. Here it is asserted that the “Confederate South,” not just a tradition, itself exists outside of the more limited lifespan of the C.S.A., and that its mind and culture—not merely those of a South, a recognizable section of the United States, but those of the Confederacy—survived the aftermath of the Civil War, a conflict which Weaver dutifully names the “second American Revolution.”[33] Weaver submitted the manuscript to the University of North Carolina Press in 1943, but it was summarily rejected. I have found no evidence one way or another, but I like to think that the publishing arm of the university presided over by Frank Porter Graham declined to publish the execrable apologia of the Confederacy’s “survival,” with its idyllic portrait of human bondage and of racial bigotry, on not only academic but also political grounds. The story is probably less interesting than that, for although the book makes a passionate case for a certain worldview, the dissertation’s extremely selective portrayal of the postbellum culture of the South almost certainly rendered its conclusions dubious from the perspective of academic historians and philosophers. Most likely, Weaver’s omissions, as well as his renunciation of any sense of objectivity or nonpartisanship, led to the study’s remaining unpublished during his lifetime. In any case, its eventual publication in 1968, a transformative moment in U.S. politics and society, makes for a rather intriguing, if unhappy, coincidence. 
The “Southern strategy,” conceived by Harry Dent and launched by the Nixon campaign that very year, had in The Southern Tradition at Bay its historico-philosophical touchstone.

    V

    It is all too noteworthy that the “mind and culture” that Weaver identifies as surviving in the aftermath of the Civil War is, at once, generalized so as to extend to the entirety of the American South and limited to a fairly tiny slice of that section’s actual population. Weaver makes no bones about the fact that he wanted to consider only the elite members of that society as representative of this tradition. Asserting that “it is a demonstrable fact that the group in power speaks for the country,” Weaver unapologetically writes that, “[i]n assaying the Southern tradition, therefore, I have taken the spirit which dominated,” thus ignoring Southern abolition societies, for example.[34] He also ignores the majority of the people. In order to make his case, Weaver pays little attention to white people who are not aristocratic lords of their own fiefdoms or soldiers who fought in the Civil War, which is to say, Weaver largely overlooks the poor multitudes who vastly outnumbered the wealthy planters, military leaders, and governors. Not unexpectedly, the black population, a not inconsiderable percentage of the populace in these states, is treated far worse in this account; black Southerners are not ignored, but rather singled out for special treatment in assessing their significant role in making this culture and its tradition possible.[35]

    Indeed, Weaver refers to blacks in the South as “the alien race,” as if he cannot understand that persons of African descent are no more or less alien to the lands of the Americas than are those of European descent. “Alien” cannot here mean “foreign,” since Weaver highlights the Southerner’s kinship to the Europeans, whether genealogically or with respect to social values. Weaver almost blames the black servants for being “inferior,” the mere fact of which could itself lead to abuse and thereby reflect badly on the moral constitution of the white superiors. For example, after praising the idyllic state of paternalism in which “[t]he master expected of his servants loyalty; the servants of the master interest and protection,” and going so far as to note that even at present, “so many years after emancipation,” the Southern plantation owner will routinely “defray the medical expenses of his Negroes” and “get them out of jail when they have been committed for minor offenses,” Weaver concedes that

    This is the spirit of feudalism in its optative aspect; some abuses were inevitable, and in the South lordship over an alien and primitive race had less favorable effects upon the character of the slaveowners. It made them arrogant and impatient, and it filled them with boundless self-assurance. Even the children, noting the deference paid to their elders by the servants, began at an early age to take on airs of command. […] These traits [i.e., irritability, impatience, vengefulness], which were almost invariably noted by Northerners and by visiting Englishmen, gave Southerners a reputation away from home which they thought baseless and inspired by malice.[36]

    Weaver never asks whether the feudal paternalism of the plantation owner, pre- or post-emancipation, toward “his Negroes” would have appeared quite so optative in its aspect to the servants themselves. Informed readers, regardless of their own political views, cannot help but question this formulation.

    In Weaver’s view, all servants—almost exclusively understood to be members of an “alien race” as well as being a subaltern class—on a Southern plantation are either happy and loyal or hopelessly deluded. During the Civil War, for example, “the alien race, which numbered about four millions in the South, kept its accustomed place, excepting those who through contact with the Federal armies were won away from adherence to ‘massa’ and ‘ol’ mistis’.”[37] This appears in a section called “The Negroes in Transition,” within a long chapter titled “Diaries and Reminiscences of the Second American Revolution,” and Weaver’s unmistakable conclusion is that the black population of the South was almost entirely better off under the system of slavery. Indeed, from his blinkered perspective, the African Americans under consideration would be better off as slaves precisely because they are more naturally suited to that condition. This position constitutes not merely an apologia of human bondage but also a casual acceptance of the most foul racial bigotry. Weaver cannot seem to imagine a reasonable reader who would question white supremacy, which he and the authorities he approvingly cites take to be a matter of fact. “The Northern conception that the Negro was merely a sunburned white man, ‘whose only crime was the color of his skin,’ found no converts at all among the people who had lived and worked with him.”[38] Weaver thus intimates that those, such as the Northerners, who believed otherwise were merely ignorant of the facts familiar to any and all with the least bit of experiential knowledge. Similarly, when Weaver writes that “[m]ore than one writer took the view that it was impossible for the two races to dwell together unless the blacks remained in a condition approximating slavery,” he offers not a word to gainsay the view, and he tacitly endorses it throughout the book.[39]

    Weaver’s somewhat disingenuous assertion that he is “letting the contemporaries speak” for themselves is hardly an excuse for this profoundly racist account. Even if he relied only on direct quotations, which he certainly does not, Weaver had already conceded that he was rather selective in how he would approach his project. Needless to say, perhaps, but “The Negroes in Transition” section makes no reference whatsoever to any black authorities; in fact, Weaver here seems to rely entirely on the remembrances of Southern belles, as the footnotes in this section refer exclusively to autobiographies or memoirs written by white women, including one titled A Belle of the Fifties.[40] (The suggestion that free blacks represented a threat to white women is not so subtly hinted at in the pages.) Weaver quotes liberally from the women’s writings, but he frequently editorializes and supplements their mostly first-person perceptions with an almost scientific assessment, expounding on the laws governing society and nature.[41] For example, having just mentioned both slavery and race, and therefore leaving no doubt in the mind of the readers as to the racial criteria by which a social hierarchy of the type he is endorsing would be established, Weaver asserts: “[o]ut of the natural reverence for intellect and virtue there arises an impulse to segregation, which broadly results in coarser natures, that is, those of duller mental and moral sensibility, being lodged at the bottom and those of more refined at the top.”[42] Indeed, Weaver goes so far as to credit the endemic racism of the Southerner with a kind of moral superiority over those who lack this good sense. He argues that, in the Southerner’s “endeavor to grade men by their moral and intellectual worth,” his defense of slavery and racial hierarchy “indicates an ethical awareness” missing from many Northerners’ perspectives.[43]

    That politics in the United States has, since 1968, become increasingly characterized by racial division is both controversial and indubitable. The “post-racial” America presided over by Barack Obama has witnessed some of the most acrimonious, racially inflected public discourse and debate in years. Yet open appeals to racial prejudice or to discriminatory practices are considered gauche. As I mentioned at the beginning of this essay, a form of “dog-whistle politics” has infiltrated nearly all political rhetoric in recent decades. Perhaps the most infamous example of this “dog-whistle” political strategy can be found in Lee Atwater’s remarkably candid revelation in a 1981 interview. A former Strom Thurmond acolyte who went on to serve in the Reagan White House, then as George H. W. Bush’s 1988 campaign manager, and finally as chairman of the Republican National Committee, Atwater is acknowledged as one of the most astute political strategists of his generation. In speaking (anonymously, at the time) of the Reagan campaign’s far more elegant and effective version of the Southern strategy, Atwater explained:

    You start out in 1954 by saying, “Nigger, nigger, nigger.” By 1968, you can’t say “nigger”—that hurts you. Backfires. So you say stuff like forced busing, states’ rights and all that stuff. You’re getting so abstract now [that] you’re talking about cutting taxes, and all these things you’re talking about are totally economic things and a byproduct of them is [that] blacks get hurt worse than whites. And subconsciously maybe that is part of it. I’m not saying that. But I’m saying that if it is getting that abstract, and that coded, that we are doing away with the racial problem one way or the other. You follow me—because obviously sitting around saying, “We want to cut this,” is much more abstract than even the busing thing, and a hell of a lot more abstract than “Nigger, nigger.”[44]

    The fact that abstract economic issues, which presumably would affect both whites and blacks in the relatively poor Southern states in more-or-less equal measure, are so effective as code words for traditional, race-baiting tactics of a previous generation—the era of Willis Smith, in fact—demonstrates the degree to which Weaver’s feudal hierarchies maintain themselves, now in an utterly fantastic way as a vague threat, well into the late twentieth century or early twenty-first. As Atwater suggested, Southern white voters are willing to endorse policies that actually harm them, so long as a byproduct of those policies is that “blacks get hurt worse than whites.” This too, it seems, has much to do with the survival of a mind and culture in the aftermath of slavery and war, and so it is not altogether surprising that Weaver’s examination of the Southern tradition “at bay” focuses so intently on demonstrating why the black population of the South ought to remain subjugated to the white population as the era of civil rights, desegregation, and modernization dawns on the region.[45]

    VI

    In his appreciative remembrance of I’ll Take My Stand, written on the occasion of the thirtieth anniversary of its publication, Weaver invoked the image of the “Southern Phoenix,” a mythic reference to a being that had regenerated itself from the ashes following its own fiery destruction. Weaver uses this figure not only to recall how the Agrarians whose work constituted that epochal text had themselves gone on to greatness, even though the volume had been ridiculed and dismissed by Northern critics in the 1930s. He is also thinking of the tenets and values of the Old South, those that the Vanderbilt Fugitives and Agrarians embraced and promoted, which must have seemed retrograde, even malignant to so many in 1930, but which had reemerged and flourished amid an ascendant conservatism just beginning to take shape nationally in 1960. Yet, for all its usefulness as a metaphor, the phoenix is probably also an apt figure for Weaver’s own conservative vision, since—like an imaginary creature taken from the provinces of mythology—Weaver’s image of the Southern tradition, whether at bay or on the offensive, is profoundly fantastic. This imaginary tradition is rooted in a world that almost certainly never existed, not on a wide scale at any rate, and the polemical forces of Weaver’s argument are directed at a foe that has been conceived as an immense Leviathan, but which we today know to have been largely chimerical.

    At times, this argument becomes almost comical. In explaining the importance of “the last metaphysical right,” private property, for example, Weaver cites the example of Thoreau,[46] although the latter’s notorious experiment in living deliberately required him to purchase, not build, a prefabricated hut, then to place it and himself on property owned by another (Emerson, in fact), but which he was permitted to dwell upon rent-free. Far from demonstrating the self-sufficiency and resolve of the individual, Thoreau’s experiment might be taken as exemplary of a kind of localized welfare system; one need not punch the clock at the local factory if one lives off the generosity and largess of family and friends. However, as we have seen increasingly in the United States in recent years, the receipt of corporate and other forms of welfare in no way prevents the recipients from bashing the government for offering support to others. The Republican Party’s adoption of the “We Built It” slogan in 2012 offers a tellingly Thoreauvian fantasy, one where it is possible to accept the public’s funding while insisting upon absolute independence from the commonweal.

    Given the importance of a sense of place and of community to Weaver’s fantastic vision of a medieval heritage, such rampant individualism—an ideology subtending the basic neoliberal projection of free markets and autonomous economic actors—seems quite foreign. Indeed, it is odd to talk about Weaver as a forebear to contemporary conservatism. Certainly the economic neoliberalism which celebrates unfettered free markets and the geopolitical neoconservatism which glories in globalization and preemptive military engagements are a far cry from Weaver’s fanciful nostalgia for an idealistic feudalism founded upon rigid social hierarchies, chivalric codes of ethics, and a powerful, culture-shaping religion or religiosity.[47] In his own writings, we can see Weaver’s strong aversion to the emergent globalization and even nationalization, which he views as corrupting the properly regionalist values he favored. Weaver’s worldview would not have allowed him to embrace the preemptive war strategies championed by Dick Cheney, Donald Rumsfeld, and Paul Wolfowitz during the various military conflicts of the past 30 years. Moreover, Weaver’s ardent defense of the humanities—recall his loathing for the educational and cultural aura of Texas A&M, now home to the George [H. W.] Bush Presidential Library—is entirely at odds with the views on higher education, the arts, philosophy, and “high” culture held by the most prominent and visible members of the G.O.P. today. Yet the sectarianism of Weaver’s view paved the way for contemporary neoconservative politics and policies. Weaver’s well-nigh Schmittian, Us-versus-Them antagonism requires us to envision not merely a Western civilization opposed to its non-Western rivals but a truer, more valuable “Southern” civilization against the putatively uncivilized rest of the United States. 
The loathsome, omnipresent discourse about “real” Americans and what constitutes them is a legacy of the Southern Agrarian traditions apotheosized by Weaver’s philosophy.

    Indeed, the particular labels—conservative, neoconservative, neoliberal, and so forth—are not necessarily helpful in understanding the dominant political and cultural discourses in the United States in the twenty-first century. As Paul A. Bové has observed, “[m]any critics of the Far Right movement conservatism mischaracterize it. It is not an epiphenomenon of neoliberalism. In fact, the popular elements of this movement, of its electoral coalition, resent the economic and cultural consequences of neoliberalism and globalization in politics and culture.”[48] To many of the policies and even most of the ideas of neoconservatives like Wolfowitz, Cheney, and both Presidents Bush, Weaver and his beloved Agrarians would almost certainly object. However, the cultural and intellectual foundations of the neoconservatives’ positions, not to mention the fact of their being elected or appointed to offices of great power in the first place, owe much to an ideological transformation of U.S. intellectual culture whose fons et origo may be found in the fantastic vision of a distinctively Southern exceptionalism.

    One might well name this the australization of American politics, as the Southern section’s purportedly unique culture has tended, since the 1960s, to be more and more representative of a national conservative movement. This movement, which has become perhaps the most influential force within the Republican Party at a moment when conservative politics has itself become more prominent in the United States, thus tends to be the dominant force in national, and increasingly international, politics as well. It should not be forgotten that the rightward shift even in the Democratic Party can itself be linked to this increasingly australized politics, as both Georgia’s Jimmy Carter and Arkansas’s Bill Clinton emerged nationally as the preferable, because more conservative, candidates who would stand up to the old-fashioned liberals in their own party (inevitably symbolized by Ted Kennedy, Mario Cuomo, or Jesse Jackson).[49] In their commitment to economic growth, particularly that made possible by increasingly corporate or industrial development, these conservative Southern Democrats would have earned the agrarian-minded Weaver’s contempt, but their rhetorical and ideological commitments align far better with the agrarian discourse than did the expansive liberalism of the New Deal or the Great Society. Weaver would undoubtedly decry the rapid growth of the South’s population in recent decades, since that growth has been generated in large part by ever more industrial or urban development, but he would probably delight in seeing the rust of the Rust Belt as unionization, heavy industry, and traditional urbanism have declined in the North and Northeast. The shifting numbers of electoral votes in favor of Southern states are also a real consideration for any political or cultural program interested in preserving or expanding Southern “values” in the United States. The fall of the hated North, in this view, is almost as sweet as the South’s rising again.

    The costs of this australization of American politics are incalculable, as may be inferred from the increasingly vicious public discourse with respect to all manner of things, including welfare and taxation, education, science, the environment, individual rights, foreign adventures, war, domestic surveillance (a form of paternalism), and so forth. As far back as 1941, W. J. Cash had concluded his study of The Mind of the South by noting the “characteristic vices” of that culture:

    Violence, intolerance, aversion and suspicion toward new ideas, an incapacity for analysis, an inclination to act from feeling rather than from thought, an exaggerated individualism and too narrow concept of social responsibility, attachment to fictions and false values, above all too great attachment to racial values and a tendency to justify cruelty and injustice in the name of those values, sentimentality and a lack of realism—these have been its characteristic vices in the past. And, despite changes for the better, they remain its characteristic vices today.[50]

    Taken out of their original context, these words seem all too timely in the twenty-first century, with the events of Ferguson, Missouri, in 2014 or Baltimore, Maryland, in 2015, among many other less spectacular and more pervasive examples, resounding throughout the body politic. In Cash’s final lines, he abjured any temptation to play the role of prophet, declaring that it would be “a brave man” who would venture definite prophecies, and it would be “a madman who would venture them in the face of the forces sweeping over the world in 1940.”[51] Bravery or madness notwithstanding, Cash likely could not have imagined the degree to which the characteristic vices of the South in his time could become so widespread as to become the characteristics of a national American “mind” tout court in the next century.

    Moreover, as should be obvious, the australization of American politics is not simply a matter of political leaders or voters residing in the southern parts of the United States. The pervasiveness of certain identifiably Southern cultural signifiers within mainstream political discourse, particularly among the more conservative members of the Republican Party but also throughout the public policy and electioneering rhetoric of both major parties, signals a victory for that fantastic or idealistic “mind and culture” so celebrated by Weaver and his Agrarian forebears. It is a terrifying prospect for many, but the vision of the intransigent Southern traditionalist now operating from a position of broad-based cultural and political power on a national, indeed an international, stage might be the apotheosis of Weaver’s grand historical investigation into the region’s purportedly distinctive past. As Weaver put it in a 1957 essay,

    It may be that after a long period of trouble and hardship, brought on in my opinion by being more sinned against than sinning, this unyielding Southerner will emerge as a providential instrument for saving this nation. […] If that time should come, the nation as a whole would understand the spirit that marched with Lee and Jackson and charged with Pickett.[52]

    For most people residing in the United States, including many of us in the South (like me, some of whose ancestors did march with these men in the early 1860s), the prospect of a neo-Confederate savior of the nation or world is horrifying, like a mythological monster assuming worldly power. Sifting through the ashes of the triumphant Southern Phoenix, we are likely to find much of value has been destroyed.

    Notes

    [1]  Quoted in Julian M. Pleasants and Augustus M. Burns III, Frank Porter Graham and the 1950 Senate Race in North Carolina (Chapel Hill: University of North Carolina Press), 183. On the term “dog-whistle politics,” see Ian Haney López, Dog Whistle Politics: How Coded Racial Appeals Have Reinvented Racism and Wrecked the Middle Class (Oxford: Oxford University Press, 2014).

    [2] Richard M. Weaver, “Agrarianism in Exile,” in The Southern Essays of Richard M. Weaver, ed. George M. Curtis III and James J. Thompson Jr. (Indianapolis: Liberty Press, 1987), 40, 44.

    [3] Weaver, “The Southern Phoenix,” in The Southern Essays of Richard M. Weaver, 17.

    [4]  See Paul A. Bové, “Agriculture and Academe: America’s Southern Question,” in Mastering Discourse: The Politics of Intellectual Culture (Durham, NC: Duke University Press, 1991), 113–142.

    [5]  The recent removal of the “Confederate Flag,” the notorious symbol of racism wielded by the KKK and others, from state capitols and other official sites in the South appears to be a surprising turn of events, although cynics could argue that, in turning attention away from gun violence and particularly violence against black citizens and other minorities, the flag issue has provided a convenient cover, allowing the media to ignore more urgent social problems in the wake of the Charleston massacre. Still, symbols are powerful, and the removal of this symbol is itself a hopeful sign, as even conservative politicians and pundits have realized, all too late, what the embrace of the lost Confederacy has cost them on a moral level. See, e.g., Ross Douthat, “For the South, Against the Confederacy,” New York Times blog (June 24, 2015): http://douthat.blogs.nytimes.com/2015/06/24/for-the-south-against-the-confederacy/?_r=0.

    [6]  Donald Davidson, “The Vision of Richard Weaver: A Foreword,” in Richard M. Weaver, The Southern Tradition at Bay: A History of Postbellum Thought, eds. George Core and M. E. Bradford (New Rochelle, NY: Arlington House, 1968), 17.

    [7] Weaver, “Up from Liberalism” [1958–59], in The Vision of Richard Weaver, ed. Joseph Scotchie (New Brunswick, NJ: Transaction Publishers, 1995), 20.

    [8]  See Fred Douglas Young, Richard M. Weaver, 1910–1963: A Life of the Mind (Columbia, MO: University of Missouri Press, 1995), 56–58.

    [9] Weaver, “Up from Liberalism,” 23.

    [10]  Ibid., 28.

    [11]  Joseph Scotchie, “Introduction: From Weaverville to Posterity,” in The Vision of Richard Weaver, 9–10.

    [12]  Ibid., 9.

    [13]  Weaver, “Up from Liberalism,” 31. Notwithstanding the use of the word “holocaust,” Weaver makes no mention of the Nazis or the concentration camps in this essay; rather, his example is “the abandonment of Finland by Britain and the United States” (31).

    [14]  Weaver, Ideas Have Consequences (Chicago: University of Chicago Press, 1948), 2–3.

    [15]  See Max Horkheimer and Theodor W. Adorno, Dialectic of Enlightenment, trans. John Cumming (New York: Continuum, 1987), 3. See also Adorno, The Jargon of Authenticity, trans. Knut Tarnowski and Frederic Will (Evanston, IL: Northwestern University Press, 1973).

    [16]  Weaver, Ideas Have Consequences, 129.

    [17]  Weaver, “On Setting the Clock Right,” In Defense of Tradition: Collected Shorter Writings of Richard M. Weaver, ed. Ted J. Smith III (Indianapolis: Liberty Fund, 2000), 559–566.

    [18]  Weaver, Ideas Have Consequences, 131.

    [19]  Ibid., 132.

    [20]  Ibid., 148.

    [21]  Ibid., 172.

    [22]  Ibid., 187.

    [23]  In a later essay, Weaver compares the difference between the American North and the South to that between the United States and England, France, or China. In the same essay, Weaver adds that “The South […] still looks among a man’s credentials for where he’s from, and not all places, even in the South, are equal. Before a Virginian, a North Carolinian is supposed to stand cap in hand. And faced with the hauteur of an old family from Charleston, South Carolina, even a Virginian may shuffle his feet and look uneasy.” See “The Southern Tradition,” in The Southern Essays of Richard M. Weaver, 210, 225.

    [24]  Cervantes, Don Quixote, trans. J. M. Cohen (New York: Penguin, 1950), 149. Apparently, many conservatives would not object to such a comparison. For example, in his history of the right-wing Intercollegiate Studies Institute, Lee Edwards approvingly begins by saying of its founder, “Frank Chodorov had been tilting against windmills all his life.” See Edwards, Educating for Liberty: The First Half-Century of the Intercollegiate Studies Institute (Washington, DC: Regnery Publishing, 2003), 1.

    [25]  At no point does Weaver cite Cash’s The Mind of the South (originally published in 1941), which in this context must be seen as a sort of “absent presence” for Weaver and others who carried the torch for the Agrarians in the 1940s and beyond. The Mind of the South appeared while Weaver was working on his dissertation, and Weaver’s own study might even be seen as a tactical critique of, or at least alternative to, Cash’s celebrated work. See W. J. Cash, The Mind of the South (New York: Vintage, 1991). Although these two native North Carolinian authors identify some of the same characteristics and even arrive at similar conclusions about the “mind of the South,” they also maintain rather different social and political positions. For one thing, Cash does not see a feudal or aristocratic Southern character as praiseworthy, whereas Weaver’s entire defense of the Southern tradition rests on his admiration for and allegiance toward the aristocratic virtues of the archetypal Southerner.

    [26]  Weaver, The Southern Tradition at Bay, 44.

    [27]  One legitimate critique of Cash’s The Mind of the South was that it focused primarily on the attitudes and customary habits associated with Cash’s own Piedmont region of North Carolina (which happens to be my native region as well), thus underestimating the divergences to be found in the Tidewater zones to the east or the “Deep South” below and to the west. Weaver’s Southern Tradition at Bay does not limit its approach by regions, giving more or less equal space to views from all parts of the South, but it does severely restrict itself to materials best suited to make its argument with respect to a feudal system. Hence, Weaver tends to ignore the experiences of those who did not live on large estates or plantations, which is to say, Weaver omits the experiences of the vast majority of Southerners. If Cash’s study could be faulted for its Mencken-esque journalistic techniques—Cash’s original article, “The Mind of the South,” did appear in H. L. Mencken’s American Mercury, after all—and its lack of intellectual rigor, Weaver’s more academic study (it was a PhD dissertation, of course), in its questionable method and especially in its selectivity, also raises doubts about the “mind” it purports to lay bare.

    [28]  Weaver, The Southern Tradition at Bay, 387.

    [29]  Ibid., 396.

    [30]  Ibid., 391.

    [31]  See Raymond Williams, Marxism and Literature (Oxford: Oxford University Press, 1977), 121–127.

    [32]  Weaver’s advisor had been the cultural historian, literary critic, and biographer Arlin Turner, but Brooks stepped in only near the end to serve as the head of Weaver’s thesis committee. In his biography, Young reports that “Weaver was in the final stages of writing his dissertation when Turner left LSU to take a position at Duke University; Cleanth Brooks became his advisor at that point and oversaw the work to its conclusion” (78). However, as far as I can tell, Turner did not arrive at Duke until 1953, ten years after Weaver received his Ph.D. degree. The more likely reason for the change in advisor, as Fred Douglas Young writes, was that Turner was “called up for service in the U.S. Navy,” which is why Weaver asked Brooks to serve as dissertation director at the last minute (see Young, Richard M. Weaver, 67). I am not prepared to speculate on the relationship between teacher and student, but I might note that Turner, a native Texan who wrote a well-regarded biography of Nathaniel Hawthorne and later became the editor of American Literature, likely did not share his former student’s strictly sectarian views with respect to the opposed and irreconcilable cultures of the North and the South.

    [33]  See Weaver, The Southern Tradition at Bay, 41, 231–275.

    [34]  Ibid., 30.

    [35]  Although it lies well outside the scope of the present essay, it would be interesting to consider the other side of Weaver’s celebratory medievalism by looking at Eugene D. Genovese’s Roll, Jordan, Roll: The World the Slaves Made (New York: Random House, 1974). Genovese also identifies a patriarchal, paternalistic society in which religion or religiosity played a crucial role, but he focuses attention on the essential contributions of the slaves in forming this distinctively Southern culture. Genovese, then a Marxist historian influenced by Gramsci, among others, later became a notoriously conservative thinker in his own right, a shift that coincided—perhaps not coincidentally?—with his growing interest in the Agrarians of the I’ll Take My Stand era, which culminated in a book whose title could have come directly from Weaver’s own pen: see Genovese, The Southern Tradition: The Achievement and Limitations of an American Conservatism (Cambridge, MA: Harvard University Press, 1994).

    [36]  Weaver, The Southern Tradition at Bay, 55–57.

    [37]  Ibid., 259. Being “won away” is, for Weaver, a sign of the servant’s delusion. Indeed, this line follows directly from a section which concluded that “the blacks suffered as much maltreatment as the whites, the [Union] soldiery being as ready to snatch the silver watch of the slave as the gold one of his master” (258).

    [38]  Ibid., 261.

    [39]  Ibid., 173. Weaver lists a number of postbellum incidents, including “disturbing reports of Negro voodooism,” as evidence that Southern blacks, now lacking the beneficial effects of a civilizing servitude, would “soon relapse into savagery” (261–262).

    [40]  Incidentally, Weaver’s overall assessment of women’s rights is not much more salutary than his position on civil rights for persons of color, at least with respect to the decline of the West. In Ideas Have Consequences, Weaver laments that, although “[w]omen would seem to be the natural ally in any campaign to reverse” the anti-chivalric modern trends that have rendered Western civilization so spiritually vacant, in fact, they have not. “After the gentlemen went, the lady had to go too. No longer protected, the woman now has her career, in which she makes a drab pilgrimage from two-room apartment to job to divorce court” (180). Without chivalry, Weaver concludes, there can be no ladies.

    [41]  See The Southern Tradition at Bay, 268.

    [42]  Ibid., 36–37.

    [43]  Ibid., 35.

    [44]  Quoted in Alexander P. Lamis, “The Two-Party South: From the 1960s to the 1990s,” in Southern Politics in the 1990s, ed. Alexander P. Lamis (Baton Rouge: Louisiana State University Press, 1999), 8.

    [45]  Contrast this view with the lament by which Albert D. Kirwan chooses to conclude his near-contemporaneous, 1951 study of postbellum Mississippi politics: “As for the Negro, whose presence in such large numbers in Mississippi has given such a distinctive influence to its politics, his lot did not change throughout this period. No one thought of him save to hold him down. No one sought to improve him. […] He was and is the neglected man in Mississippi, though not the forgotten man.” See Kirwan, The Revolt of the Rednecks: Mississippi Politics, 1875–1925 (Gloucester, MA: Peter Smith, 1964), 314.

    [46]  Weaver, Ideas Have Consequences, 132. Thoreau seems to be the one Yankee whom Weaver is willing to consider a non-barbarian. See also The Southern Tradition at Bay, 41: “Southerners apply the term ‘Yankee’ as the Greeks did ‘barbarian.’ The kinship of ideas cannot be overlooked.”

    [47]  Space does not permit a full consideration of the matter, but Weaver’s embrace of a certain Southern “non-creedal religiosity” would not necessarily seem to fit easily with the rise of the religious right in the 1980s and beyond, particularly when considering the prominence of certain denominations and organizations, like the Southern Baptist Convention, in political and cultural debates of recent decades. However, one might also recognize the apparently Southern accent with which much of the new political religiosity has been voiced on a national level, which suggests another aspect of the australization of American politics.

    [48]  Paul A. Bové, A More Conservative Place: Intellectual Culture in the Bush Era (Hanover, NH: Dartmouth College Press, 2013), 10.

    [49]  Bill Clinton, then Governor of Arkansas, made his name nationally as the Chairman of the Democratic Leadership Council, an organization founded in the aftermath of the 1984 Reagan re-election landslide. The D.L.C. was established with the express aim of promoting more conservative policies within the Party and nationally, and its leadership largely consisted of Southerners, not coincidentally.

    [50]  Cash, The Mind of the South, 428–429.

    [51]  Ibid., 429.

    [52]  Weaver, “The South and the American Union,” in The Southern Essays of Richard M. Weaver, 256.

  • David Thomas — The End of History, In Memoriam

    David Thomas — The End of History, In Memoriam

    by David Thomas

    The West welcomed East Germany to the end-of-history by flying in David Hasselhoff for Berlin’s New Year celebrations. From the top of the fallen wall, clad in a pulsing light-spangled jacket, Hasselhoff regaled half a million people with the period’s unofficial anthem, “Looking for Freedom.” It is still, to date, one of Germany’s bestselling songs.

    Two years earlier, as Margaret Thatcher closed in on her second re-election, Hot Chocolate’s former front man, Errol Brown, had also lent his weight to the liberal cause. During the 1987 Conservative Party Conference, the disco hitmaker stepped up to the podium and led the entire caucus in a rousing rendition of John Lennon’s pop-socialist anthem “Imagine” (Wilson 2013: 41):

       Imagine no possessions

    I wonder if you can

    No need for greed or hunger

    A brotherhood of man

    Imagine all the people

    Sharing all the world

    You, you may say I’m a dreamer,

    But I’m not the only one

    I hope some day you’ll join us

    And the world will live as one           

    Little wonder that irony was the order of the day. Confident in their unassailable position, and wryly indulgent of the counterculture’s crabby idealism, the architects of globalization rested content on their laurels. There was much to celebrate. Liberal democracy had vaulted over the last great hurdle on its pathway to perpetual peace. Francis Fukuyama captured the mood as he sketched out the lineaments of his wildly popular end-of-history thesis:

    What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government. (Fukuyama 1989: 4)

    The course of world events allowed Fukuyama to stand by this claim for almost thirty years. Writing in 2007, on the very cusp of the 2008 financial crisis, he offered a confident and largely unqualified reassertion of his fundamental argument:

    I believe that the European Union more accurately reflects what the world will look like at the end of history than the contemporary United States. The EU’s attempt to transcend sovereignty and traditional power politics by establishing a transnational rule of law is much more in line with a ‘post-historical’ world than the Americans’ continuing belief in God, national sovereignty, and their military. (Fukuyama 2007)

    Yet ten years on – as Fukuyama himself has conceded – the same claim has begun to leave behind a bad taste in the mouth. Walls are in vogue again. And as a fragile European Union teeters on the brink of fragmentation, the new political trajectory of the United States seems the more reliable harbinger of the political futures that await us, futures overshadowed by a resurgence of rightwing authoritarianism and a reactive cascade of unpredictable international statecraft. Indeed, as Trump has ridden a wave of anti-establishment discontent into the White House, “Americans’ continuing belief in God, national sovereignty, and their military” now looms large over the old effort to “transcend sovereignty and traditional power politics by establishing a transnational rule of law.”

    Even prior to Brexit and Trump’s election, the new tenor of the times was starting to emerge. As armed guards rolled out the checkpoints and the razor wire, as miles of ad hoc barrier unwound across Europe, it became apparent that the open borders of globalization belonged to another era. The Schengen Agreement was a thing of the past, a bitterly regretted utopian folly. And as the specter of Trump’s “great, great wall” hovered at the outer edge of possibility, it was becoming clearer that other dreams were dying too: You could keep your hungry, your poor, your tired, there was no room for them here.

    The truth was, however, these high-walled fever dreams had been under construction for some time. The work had never really stopped. Long before the “great wall” was rumored, and even as the old Iron Curtain came crashing down, the world’s wealthy had peered out from behind their battlements in the sky. This is Mike Davis writing in the mid-1980s:

    According to its advance publicity, Trump Castle will be a medievalized Bonaventure, with six coned and crenellated cylinders, plated in gold leaf, and surrounded by a real moat with drawbridges. These current designs for fortified skyscrapers indicate a vogue for battlements not seen since the great armoury boom that followed the Labour Rebellion of 1877. In so doing, they also signal the coercive intent of postmodernist architecture in its ambition, not to hegemonize the city in the fashion of the great modernist buildings, but rather to polarize it into radically antagonistic spaces. (Davis 1985: 112-3)

    When Davis wrote, he would probably have been surprised to learn that the proposed owner-occupant of this “medievalised Bonaventure” would one day descend in his golden elevator to lead a populist insurgency against “out-of-touch elites.” He would probably have been less surprised to hear that when he came, he came promising the Rust Belt’s dispossessed fortifications and battlements of their own.

    To understand the success of the strategy, and to understand the strange mismatch of class interests that defines the Trump mandate, one has to dig beneath the concrete partitions of the “postmodern” city and search within the rusting husk of the American factory system to grasp the hidden economic imperatives and political contingencies that produced deindustrialization and financialization as coeval phenomena. Here – as many of us are belatedly beginning to realize – Robert Brenner’s history of postwar economic development proves an indispensable resource.

    Much of Brenner’s later work has focused on the turbulent transition from the global economy’s belle époque – the period of unprecedented economic dynamism that lasted from the close of the war to the early 1970s – to the “long downturn” that has followed in its wake. His work has drawn attention to a progressive reallocation of capital investments away from industrial manufacturing and into the so-called FIRE (Finance, Insurance and Real Estate) sector. In Brenner’s account of this history, as emerging industrial economies began contesting American manufacturing dominance, the struggle for market share reached such a pitch of intensity that companies became accustomed to operating with radically reduced profit margins.

    In response to a corresponding decline in rates of return, investors became shyer of the manufacturing sector and sought to diversify their portfolios. And as fallow capital sought new routes to profit the structural import of the FIRE sector intensified. From the late 1970s through to the first decade of the new millennium the finance industry’s signature methods and investment strategies became more and more computationally sophisticated and systemically pervasive. Among the symptoms of the FIRE sector’s new significance was the rapid reconstitution of urban space that Davis describes. And it was, of course, in the context of the FIRE sector’s expansion that Trump emerged as the popular face of the resurgent real estate industry, whose towering skylines – so adored by Hollywood – became totemic of American greatness in its autumnal phase.

    For the better part of three decades, this increasingly baroque financial system succeeded in restoring dynamism to the world economy, propelling the US beyond a sputtering Soviet regime into the position of uncontested global hegemon. And as the oil and stagflation crises retreated from view, it is perhaps not surprising that faith in the FIRE sector’s new methods grew so strong that creditors, believing themselves cybernetically insured against loss, lent in increasingly blithe and unstinting fashion.

    It was in this climate of technocratic hubris that Fukuyama sketched out his thesis, one that expressed the dominant structure of feeling then prevailing in elite circles: The common sense of the period all but dictated that a universal “evolutionary pattern” was just then culminating in the globalization of liberal democracy, as “technologically driven capitalism … free[ed itself] of internal contradictions” (Fukuyama 1993: 91; xi).

    These dreams seem all the more ironic now that one of the FIRE sector’s most notorious figures has begun to wreak havoc with the signal institutions of liberal democracy, questioning the accuracy of the ballot, wielding executive orders in madcap and draconian fashion, intimidating the judiciary, and attempting to bludgeon the free press into abject submission. The ironies deepen when we recall that it was the increasingly risky expansion of the FIRE sector that triggered the 2008 financial crisis, as the securitization of subprime mortgages opened the door to that last round of dispossession that underlies so much of today’s anti-establishment discontent.

    The ramifying consequences of the 2008 financial crisis were evident in this electoral cycle, as we saw the two candidates periodically breaking away from the familiar battery of appeals to the middle-class homeowner to address a dispossessed and precariously employed “working class” that had swelled to such an extent that its political significance could no longer be ignored.

    Yet in its resurrection the figure of the worker seemed to have undergone a subtle transformation. In its return to the mainstage of electoral politics, talk of the worker functioned less as a metonym for the workers’ movement, and more as a shorthand for the plight of the downwardly mobile “worker-citizen,” one who could no longer count on social state protections, whose stake in the real estate market was gone or imperiled, but who was still the bearer of the full legal rights and privileges of the citizen.

    To understand the increasingly reactionary disposition of this citizen-worker we have to grasp the long downturn’s ongoing effects on the technical and demographic composition of the real economy. This effort again finds us tacking back to the 1970s, to the last great crisis in the capitalist world system, when attempts to restore dynamism to the global economy saw industrial capital fighting to break free of the constraints that social democracy had placed on its agency, unleashing a two-pronged assault on labor. Capital flight saw manufacturing plants flee the blue-collar heartlands, as industry reconstituted its industrial base in the emerging economies where it could exploit a labor force that enjoyed far fewer legal protections. And at one and the same time as the Thatcher and Reagan administrations drove through the legislation that facilitated and policed this new round of capital flight – a series of legislative actions that also undergirded the FIRE sector’s emergence as the new motor of economic dynamism – manufacturers also fended off the labor movement from within, introducing a new round of automation that saw labor’s relative share in the productive process decline, further securing factories against sabotage and slowdown.

    Research collective Endnotes sums up the prevailing political fallout of this double-fronted assault:

    Industrial output continues to swell, but is no longer associated with rapid increases in industrial employment … In this context, masses of proletarians, particularly in countries with young workforces, are not finding steady work; many of them have been shunted from the labour market, surviving only by means of informal economic activity (Endnotes 2015)

    The use of the term “shunted” here evokes the language of Stuart Hall’s Policing the Crisis. And in tracking the early effects of this “shunting from the labour market,” Hall observed that the policing of deindustrialization’s dispossessed broke differentially along racial lines. As the economy contracted, white Britons closed ranks, consigning immigrant communities to a greater share of the joblessness that capital flight was leaving behind it. Writing that race was “one of the main mechanisms, by which, inside and outside the work-place itself, th[e] reproduction of an internally divided labour force [was] accomplished” (Hall et al. 1982 [1978]: 346), Hall detailed the advantages that the dominant classes gleaned from these divisions:

    The ‘benefits’ … must therefore be reckoned to include not only the direct and indirect exploitation of the colonial economies overseas, and the vital supplement which this colonial work-force made to the indigenous labour force in the period of economic expansion, but also the internal divisions and conflicts which have kept that labour force segregated along racial lines in a period of economic recession and decline – at a time when the unity of the class as a whole, alone, could have pushed the country into an economic ‘solution’ other than that of unemployment, short-time, cuts in the wage packet and the social wage. (Hall et al. 1982: 346)

    Having identified these developing tendencies – and their role in keeping the “unity of the class as a whole” at bay – Hall went on to explore how black Britons had begun to adapt to their entrapment in the grey and black economies. Explaining the concept of hustling for the benefit of a predominantly white readership, he wrote:

    The hustle is as common, necessary and familiar a survival strategy for ‘colony’ dwellers as it is alien and strange to those who know nothing of it … Hustlers live by their wits. So they are obliged to move around from one terrain to another, to desert old hustles and set up new ones in order to stay in the game. From time to time, ‘the game’ may involve rackets, pimping, or petty theft. But hustlers are also the people who sustain the connections and keep the infrastructure of ‘colony’ life intact. They are people who always know somebody, who can get things done, have access to scarce goods, who can ‘deal’ and service the less-respectable ‘needs’ of the respectable end of ‘colony’ society. They hang out around the clubs, organise the blues parties, set the domino game up, know what day the illegal white rum distilleries produce. They work the system; they also make it work … When the going is good, hustlers are men about the street with style, visibly displaying their temporary good fortune: ‘cool cats.’ (Hall et al. 1982: 351-2)

    Of course in the days since he offered this account, hip hop’s rise to pop cultural dominance has made the swaggering resourcefulness of the hustler part of the cultural fabric of millennial experience. Few under the age of forty are not intimately familiar with hip hop’s virtuosic chronicling of Black America’s experience of the racialized policing of the long downturn. Part of the enduring value of Hall’s work lies in its ability to tie hip hop’s signature tropes and stances to the determinations against which they emerged, as the policing of capital’s real movement subjected the black proletariat to the worst effects of this new round of capital flight and automation.

    Indeed, in retrospect, it seems that NWA, rather than Hasselhoff, would have been a fitter avatar of the inequities that globalization scattered in its wake as it ground the labor movement beneath its heel. For while Hasselhoff’s words at the Berlin Wall implied that freedom had descended on the former Soviet bloc in a moment of decisive apotheosis, in NWA’s language freedom was difficult to attain; indeed, it was wrestled from the system in the context of an unremitting struggle with occupying powers determined to maintain existing inequities:

    Fucking with me cause I’m a teenager

    With a little bit of gold and a pager

    Searching my car, looking for the product

    Thinking every nigga is selling narcotics

    You’d rather see me in the pen

    Than me and Lorenzo rolling in a Benz-o

    In the crosshairs of the carceral state it was more than evident that history and the struggle for emancipation were far from over.

    Still, when Hall sketched out his typology of the hustler he, like Davis, would probably have been surprised by the uncanniness of subsequent developments. For at one and the same time as policing practices trended along the lines he identified – with a massively disproportionate number of black Americans subject to incarceration and unemployment – we also saw hip hop’s hustler swagger its way deep into the heart of the American culture industry. So deep was the penetration that one of hip hop’s most celebrated dons would one day take to the stage of Carnegie Hall, backed by a 36-piece orchestra, in a full tuxedo and tie, to make a boast of a rags-to-riches tale that dwarfed anything that Dickens ever conceived:

    Momma ain’t raised no fool

    Put me anywhere on God’s green earth,

    I’ll triple my worth

    Motherfucker, I, will, not, lose

    I sell ice in the winter, I sell fire in hell

    I am a hustler baby, I’ll sell water to a well

    I was born to get cake, move on and switch states

    Cop the coupe with the roof, gone and switch plates

    The epic scale of Jay Z’s biography – from resourceful street kid slinging rocks on the corner, to owner of a music streaming service valued at $600 million – charts one self-defined hustler’s traversal of the vast wealth disparities that have characterized the global economy in the wake of the belle époque.

    We might pause for a moment here to consider the sociological significance of hip hop’s massive contemporary appeal. Indeed, it might not be too much of a stretch of the imagination to suggest that part of what has driven these accounts of life in the game to the top of the Hot 100 is precisely the more widespread generalization of the conditions of precarity and disenfranchisement that this genre has spent the bulk of its existence recounting and resisting. As the state continues to scale back on the welfarist commitments of the postwar order, and as yet another wave of automation sees the world system further unable to absorb labour into the productive process, a life in the black or grey economy is a very real prospect for increasing numbers of the world’s people. Endnotes write at another juncture:

    The social links that hold people together in the modern world, even if in positions of subjugation, are fraying, and in some places, have broken entirely. All of this is taking place on a planet that is heating up, with concentrations of greenhouse gases rising rapidly since 1950. The connection between global warming and swelling industrial output is clear. The factory system is not the kernel of a future society, but a machine producing no-future. (Endnotes 2015)

    It is a poignant statement in light of the last electoral cycle, during which the Trump campaign implicitly configured the 1950s “factory system” as the locus of America’s lost greatness, promising to “return” the US to a weird Disneyland recapitulation of its Fordist heyday. And as Trump’s executive actions against environmental and energy agencies have demonstrated in the weeks since his inauguration, this back-to-the-future ride will not tolerate any slowdown or inhibition of its propulsive thrust toward “no-future.”

    Endnotes write that today’s left is prone to approach the workers’ movement with the “latecomers’ melancholy reverence” (Endnotes 2015) – a striking phrase that eerily anticipates Trump’s appeal to America’s erstwhile greatness. And these affinities point to a key problematic, one that handed the Rust Belt over to Trump, and the British postindustrial zones to the Leave Campaign. For while the architects of globalization succeeded in decimating the workers’ movement, they were markedly less successful in their efforts to subordinate the sovereignty of the nation state to the rigors of transnational law. And thus as citizen-workers look for protection against immiseration, many seem increasingly willing to approve statist measures to both expel noncitizens who “unfairly compete” for scarce jobs, and introduce protectionist regimes designed to shield the nativist worker from the threat of international competition. Unable to organize themselves in a united internationalist front against exploitation, the downwardly mobile worker-citizens of today are, unsurprisingly, willing instead to fall back on the state’s promises to negotiate favorable “deals” on their behalf.

    The background to these tendencies seems to be the declining viability of the global development narrative that has attended postwar international policymaking since the Bretton Woods Agreement. In the context of secular stagnation and economic contraction, advanced economies have been forced to rely on our era’s signature admixture of debt and austerity, scaling back on welfarist provisions even as the nation state continues to function as a macroeconomic stimulator and a guarantor of private property. And increasingly, as development of the world system’s “peripheral” regions also stalls (Barone 2015), the core economies appear to be bracing themselves to resist a rising tide of economic and climate migration. It seems that population growth, economic growth, and industrial productivity have fallen out of sync to such a profound extent that we are increasingly “experiencing modernization of industry without modernity’s attendant social forms: without, that is, the institutional, social, cultural features associated with development, such as universal public education, democratic state institutions” and the humanitarian defense of human rights (Brouillette and Thomas 2016: 511).

    Yet rather than identify the newness of this geopolitical situation, political discourse on these matters more often coalesces around an introverted and melancholic nationalism that understands the immiseration of the worker-citizen in relation to vague but impassioned narratives of national decline. It is telling that Trump’s notorious baseball cap evokes the popular affluence of the belle époque through the allusive figure of “greatness,” a sleight of hand that evades the tricky question of how exactly one goes about turning back the clock on the technological development of the forces of production. For even if Trump’s protectionist policies do manage to lure back some manufacturing plants, the fixed to variable capital ratio will not be as favorable as it was back in the days when America was “great,” which is to say, when it was Fordist.

    Appeals to the figure of the precarious citizen-worker – more often figured in the guise of Thatcher’s “individual, and his family” – thus became a common feature of both campaigns. Yet in Clinton’s case we witnessed the strange spectacle of an establishment standard-bearer attempting to patch together, ad hoc, a fuzzily defined platform that alternately gestured toward the maintenance of the status quo, and toward the construction of a newly “social democratic” pluralism. Trump, meanwhile, staked out a clearly defined appeal to white nativist protectionism, one that was capable of uniting a large cross section of white America around the prospect of a Fordist “restoration,” one that sought to assert the rights and security of the white worker-citizen in the face of intensifying global economic malaise.

    In so doing, the Trump campaign amplified a strategy that has been a mainstay of advanced economies in times of crisis throughout the postwar period. Hall describes this strategy in relation to the structural function that migrant workers performed for British industry from the early 1950s to the mid-1970s:

    In the early 1950s, when British industry was expanding and undermanned, labour was sucked in from the surplus labour of the Caribbean and Asian subcontinent. The correlation in this period between numbers of immigrant workers and employment vacancies is uncannily close. In periods of recession, and especially in the present phase, the numbers of immigrants have fallen; fewer are coming in, and a higher proportion of those already here are shunted into unemployment. In short, the ‘supply’ of black labour in employment has risen and fallen in direct relation to the needs of British capital. (Hall et al. 1982: 343)

    On the one hand, Trump’s wall, Brexit, and the broader European resistance to the Schengen Agreement faithfully reproduce the pattern that Hall identified, as global conditions of economic contraction have triggered a rising tide of anti-immigration policies.

    Yet what separates the dynamics that Hall describes from those unfolding around us now is the extent to which Trump’s anti-immigration policies are also a feature of a larger ethnonationalist protectionist program, one that signals a full-blooded return to the so-called “beggar-thy-neighbor” economic strategies that last openly prevailed prior to the advent of the Bretton Woods Agreement.

    Writing in 1937, British economist Joan Robinson argued that “in times of worldwide unemployment, it is indeed possible for one country to increase its employment and total output by increasing its trade balance at the expense of other countries”; she “coined the phrase ‘beggar-thy-neighbor’ to describe such policies” (Pasinetti 2008 [1987]). Robinson itemized four beggar-thy-neighbor economic weapons: wage reductions, officially induced exchange depreciation, export subsidies, and import restrictions. In the last month the US government has publicly evoked most, if not all, of these weapons, either accusing other governments of using them against the US, or signaling its intention to use them itself.[i] It is worth stressing that this is more of an escalation of already-existing dynamics than a complete bolt out of the blue – i.e. beggar-thy-neighbor strategies have been quietly on the rise for a decade or more (Barone 2015) – but the additional level of unvarnished aggression that Trump has introduced into the picture cannot but result in further escalations. The underlying dynamic is one in which – under the prevailing conditions of secular stagnation – economic growth risks becoming a zero-sum game, such that the growth of one nation is always taking place at the expense of another.

    Here Robinson’s explanation of the likely geopolitical fallout of beggar-thy-neighbor economics is worth remembering. She wrote that “as soon as one country succeeds in increasing its trade balance at the expense of the rest, others retaliate,” and among the economic effects of this cycle of retaliation is a reduced volume of international trade (Robinson 1947: 156). There are affective consequences to these cycles of retaliation and these increasingly isolationist tendencies. Indeed, Robinson cautions that such policies can “add fuel to the fire” of economic nationalism, as trade wars push nations to the brink of open hostilities (Robinson 1947: 157).

    It should be noted that the intellectual circles that fostered this new ethnonationalism are not averse to an escalation of international armed conflict. Indeed, Steve Bannon is on record as thinking a war with China “inevitable” within the next ten years (Haas 2017). Much like Russia’s Alexander Dugin, Bannon subscribes to a world-historical vision that anticipates the onset of another great war, one that will serve as the crucible from which a revived Judeo-Christian culture will emerge victorious:

    But I strongly believe that whatever the causes of the current drive to the caliphate was — and we can debate them, and people can try to deconstruct them — we have to face a very unpleasant fact. And that unpleasant fact is that there is a major war brewing, a war that’s already global. It’s going global in scale, and today’s technology, today’s media, today’s access to weapons of mass destruction, it’s going to lead to a global conflict that I believe has to be confronted today. Every day that we refuse to look at this as what it is, and the scale of it, and really the viciousness of it, will be a day where you will rue that we didn’t act. (qtd. Feder 2016)

    We are, at this juncture, a long way from the condition of universal post-historical secularism that Fukuyama anticipated. Indeed, the surprisingly pervasive appeal of Dugin and Bannon’s millennial creeds seems to have done much to consolidate Trump’s white American base, where a paranoiac strain of conservative religiosity has gained a powerful foothold. In a widely circulated address to a Vatican conference, Bannon appealed to the old counter-reformation concept of the “Church Militant” in an effort to recruit foot soldiers for an apocalyptic culture war, one that was to unfold, simultaneously, on domestic and geopolitical fronts:

    And we’re at the very beginning stages of a very brutal and bloody conflict, of which if the people in this room, the people in the church, do not bind together and really form what I feel is an aspect of the church militant, to really be able to not just stand with our beliefs, but to fight for our beliefs against this new barbarity that’s starting, that will completely eradicate everything that we’ve been bequeathed over the last 2,000, 2,500 years. (qtd. Feder 2016)

    On the international stage, the prime bête noire was the “new barbarity” of “jihadist Islamic fascism” (qtd. Feder 2016). Yet in the same speech in which he made use of this profoundly ironic terminology, Bannon also trained his ire on the architects of globalization, producing a strange taxonomy of “capitalisms” that distinguished between the “enlightened capitalism” of the “Judeo-Christian West,” and the new “crony capitalism” of Davos, one that a “younger generation” had “gravitate[d] to under this kind of rubric of personal freedom” (qtd. Feder 2016).

    As Bannon’s remarks make plain, conservative America’s Christianity and its post-Fordist nostalgia are now all tangled up in each other in ways that speak both to the impact that automation and deindustrialization have had on traditional gender norms, and to the much more widely pervasive and ambient sense of melancholy that results from living under the setting sun of a declining hegemon. Indeed, there is a case to be made that the extinction of the blue-collar “oedipal wage,” and the corresponding structural obsolescence of the “traditional” nuclear family, have been key catalysts of the US’s culture wars. For while conservatives have targeted changing gender norms as a “dangerous” symptom of liberalism’s lapsarian hubris, what is actually taking place is arguably much better understood as another case of all that was solid melting into air. Sarah Brouillette puts the matter this way:

    In our current situation of economic turmoil and stagnation, the reproduction of productive labor in couple-based households is no longer a necessity everywhere – indeed, in some countries, the difficulty of keeping people working and keeping the unemployed engaged in work-like activities worries governments greatly, hence conversations about the possibility of a Universal Basic Income. At the same time, an expanding service sector handles some of what used to keep people too busy to develop multiple relationships: housekeeping, childrearing, and elder care, for example. Under these circumstances, it is more possible than ever for ‘alternative’ ways of being to come to the fore, with some even achieving mainstream respectability: think gay marriage, affective disinvestment in parenting, non-couple coparenting, moving back in with your parents, and ‘conscious uncoupling.’ Flip the coin, and the dwindling of the blue-collar industrial workforce, the expansion of domestic and affective caring work in the service sector, and the creeping obsolescence of the traditional nuclear family, have been crucial drivers of the hyper-conservative Men’s Rights Activist or ‘alt-right’ masculinist backlash against changing norms. (Brouillette 2017)

    This is not, however, how this would-be Church Militant understands the situation. And under the leadership of figures like Bannon it has taken up the project of trying to discipline gender norms back into alignment with its particular hierarchy of values and prohibitions, a project that reveals a latently theocratic dimension to the American and Russian branches of ethnonationalism.

    Indeed, insofar as the Trump administration continues to signal its commitment to a new counter-reformation – via the metonyms of its opposition to abortion, trans rights, and gay marriage, and its virulent hostility to Islam – it can count on a large base of support among white American Christians, many of whom seem willing to overlook the refugee camps, the prisons, and the ecocide, just so long as gender norms and national holidays are better aligned with the niceties of canon law. The Trump era thus seems set to catalyze a struggle for the soul of Christianity, as clericalist traditionalists – ever vulnerable to the allure of state power – do battle with a charismatic and decidedly anti-clericalist, anti-capitalist Pope:

    In one of the cardinal’s antechambers, amid religious statues and book-lined walls, Cardinal Burke and Mr. Bannon – who is now President Trump’s anti-establishment eminence – bonded over their shared worldview. They saw Islam as threatening to overrun a prostrate West weakened by the erosion of traditional Christian values, and viewed themselves as unjustly ostracized by out-of-touch political elites. ‘When you recognize someone who has sacrificed in order to remain true to his principles and who is fighting the same kind of battles in the cultural arena, in a different section of the battlefield, I’m not surprised there is a meeting of hearts,’ said Benjamin Harnwell, a confidant of Cardinal Burke who arranged the 2014 meeting. (Pierce 2017)

    And thus as the Trump administration attempts to consolidate its base, it seems set to lean hard on the far right flank of conservative Christianity, as it positions its reactionary, xenophobic, and ecocidal mandate as a noble crusade to save “the West” from external and internal enemies. Early signs suggest that attacks on the press and the liberal academy will intensify in the coming days, as Bannon and company turn their attention to the “enemies within.” The structural position of the press and the academy thus seems set to undergo a seismic shift. Accustomed to offering ambivalent but compliant critiques of neoliberal globalism, the principal organs of the bourgeois sociolect now find themselves thrust into a battle that they had thought consigned to the annals of history, undertaking harried resistance on terrain that is not of their own choosing. The radicalization of the press is already underway as we pass through the looking glass into a world where Teen Vogue and Cosmopolitan join The Guardian in drumming up support for a general strike. And as the Trump administration’s cuts to the NEA and NEH budget take hold, it appears that the liberal humanist academy will have little choice but to join the fray, drawing its post-critical turn to a panicked conclusion.

    It seems that for those who have been shielded from the harsher effects of globalization, it is hard not to feel nostalgic for the days when it was possible to indulge – however ambivalently – in Fukuyama’s dream. Where we go from here is extremely unclear, and one obviously feels dwarfed by the massive scale of these developments. But I suspect we should not spend too long grieving the loss of Fukuyama’s political horizon, for it helped to bring us to the place where we now stand, occluding the intensifying inequities that have resulted from the long downturn, developments that have opened the door to this new round of ethnonationalist insurgency. The Trump administration’s erratic brand of nascent fascism is a very real and present danger, one that we would be naive not to resist by whatever means we are able, but we also have to keep firmly in mind that this administration is not the sole source of our problems. We are in the grip of another of the violent and eruptive crises that capitalism has, throughout its long history, repeatedly thrust upon us.

    And if we find ourselves still burning a candle for some vestige of that dream of unity that Lennon cribbed from the chauvinistic language of the postwar reconstruction, then that may entail putting ourselves on the line in ways that we have been little accustomed to do in recent years. The fight is coming to us whether we like it or not. And in fighting back we will need to rediscover political horizons that extend far beyond the concerns of the individual and his family.

    The value of an individual life a credo they taught us

    to instill fear, and inaction, ‘you only live once’

    a fog in our eyes, we are

    endless as the sea, not separate, we die

    a million times a day, we are born

    a million times, each breath life and death:

    get up, put on your shoes, get

    started, someone will finish     (di Prima 2007 [1971]: 8)

    David Thomas is a Joseph-Armand Bombardier Canada Graduate Scholar in the Department of English at Carleton University. His thesis explores narrative culture in post-workerist Britain, and unfolds around the twin foci of class and climate change.

    References

    Barone, Barbara. “Protectionism in the G20.” Policy Department, Directorate-General for External Policies. European Union, Belgium: 2015: https://www.academia.edu/12266435/Protectionism_in_G20_2015_

    Brenner, Robert. Economics of Global Turbulence. London: Verso, 2006.

    Brouillette, Sarah. “A feminist communist killjoy reads Future Sex,” Public Books. Forthcoming 6 March 2017.

    Brouillette, Sarah, and David Thomas. “Forum: Combined and Uneven Development,” Comparative Literature Studies 53, no. 3, 2016.

    Carter, Shawn. U Don’t Know. New York: Roc-A-Fella Records, 2001.

    Carter, Shawn. “U Don’t Know Lyrics,” Genius, 24 Feb 2017: https://genius.com/Jay-z-u-dont-know-lyrics

    Davis, Mike. “Urban Renaissance and the Spirit of Postmodernism,” New Left Review, No. 151, May – June 1985: https://newleftreview.org/I/151/mike-davis-urban-renaissance-and-the-spirit-of-postmodernism

    Di Prima, Diane. Revolutionary Letters. San Francisco: Last Gasp of San Francisco, 2007.

    Endnotes. “A History of Separation,” Endnotes 4, October 2015: https://endnotes.org.uk/issues/4/en/endnotes-preface

    Feder, J. Lester. “This Is How Steve Bannon Sees The Entire World,” Buzzfeed, 16 Nov 2016: https://www.buzzfeed.com/lesterfeder/this-is-how-steve-bannon-sees-the-entire-world?utm_term=.yxR8RK2gV#.sh1Q6P02E

    Fukuyama, Francis. “The End of History?” The National Interest, No. 16, 1989: 3-18.

    Fukuyama, Francis. The End of History and The Last Man. New York: Harper Perennial, 1993.

    Fukuyama, Francis. “The History at the End of History,” Guardian, 3 April 2007: https://www.theguardian.com/commentisfree/2007/apr/03/thehistoryattheendofhist

    Haas, Benjamin. “Steve Bannon: We’re going to war in the South China Sea … no doubt,” Guardian, 02 Feb 2017: https://www.theguardian.com/us-news/2017/feb/02/steve-bannon-donald-trump-war-south-china-sea-no-doubt

    Hall, Stuart, Chas Critcher, Tony Jefferson, John Clarke, and Brian Roberts. Policing the Crisis: Mugging, the State, and Law and Order. Hong Kong: Macmillan Press, 1982 [1978].

    Jackson, O’Shea, Andre Romell Young, Lorenzo Jerald Patterson, Harry Lamar III Whitaker. Fuck Tha Police. Los Angeles: Priority Records, 1988.

    Jackson, O’Shea, Andre Romell Young, Lorenzo Jerald Patterson, Harry Lamar III Whitaker. “Fuck Tha Police Lyrics,” Genius, 24 Feb 2017: https://genius.com/Nwa-fuck-tha-police-lyrics

    Lennon, John, and Barrie Carson Turner. Imagine. EMI, 1971.

    Pasinetti, Luigi L. “Robinson, Joan Violet (1903–1983).” The New Palgrave Dictionary of Economics. Second Edition. Eds. Steven N. Durlauf and Lawrence E. Blume. Palgrave Macmillan, 2008. The New Palgrave Dictionary of Economics Online. Palgrave Macmillan. 14 February 2017: http://www.dictionaryofeconomics.com/article?id=pde2008_R000166 doi:10.1057/9780230226203.1450

    Pierce, Charles C. “For His Next Trick, Steve Bannon Will Undermine the Pope.” Esquire. 07 Feb 2017: http://www.esquire.com/news-politics/politics/news/a52901/bannon-pope-francis/

    Robinson, Joan. “Beggar-My-Neighbor Remedies for Unemployment,” Essays in the Theory of Employment. Oxford: Basil Blackwell & Mott, 1947: 156-172.

    Wilson, Scott. “Violence and Love (in Which Yoko Ono Encourages Slavoj Zizek to Give Peace a Chance).” Violence and the Limits of Representation. Eds. Graham Matthews and Sam Goodman. New York: Palgrave Macmillan, 2013: 28-48.

    Notes

    [i] For examples of the Trump administration’s remarks on currency devaluation see:

    http://asia.nikkei.com/Markets/Currencies/Trump-singles-out-Japan-China-Germany-for-currency-attack; and for their current approach to export subsidies, and import restrictions consult: http://www.economist.com/blogs/economist-explains/2017/02/economist-explains-9 and http://www.bbc.co.uk/news/world-us-canada-38764079

     

  • Christian Thorne — A South Wind Blowing from the East

    Christian Thorne — A South Wind Blowing from the East

    by Christian Thorne

    This paper was written for “The South and the South,” a conference hosted by Vanderbilt University in May 2016. Its opening questions were generated by Hortense Spillers and other members of the boundary 2 collective.

    This essay was peer-reviewed by the editorial board of b2o: an online journal

    What comes to mind when a writer says that he means to comment upon “the South”? Anyone sitting in North America is likely to hear that term, if not further specified, as referring to the southern United States, what we might for now call “Alabama etcetera,” though this is hardly the phrase’s only possible designatum. The other region now routinely denominated “the South”—the other region, I mean, that routinely earns that otherwise ungrammatical capital S—isn’t actually a region at all, but a name for what used to be called “the Third World” or “the developing countries” or “the colonies”: the Global South. The American South, the Global South—as soon as one sees those two terms in the same paragraph, questions start humming. Why does the former Third World bear the same name as Georgia and the Carolinas? Do these have anything to do with one another, conceptually or concretely? Do our perceptions of one bleed into our perceptions of the other? In what sense are they all southern? What are we attributing to a region when we call it southern? Is there such a thing as southness?

    With these questions in front of us, I’d like to state a few propositions forthrightly—propositions, in the first instance, about the US South, which might or might not open up to include the global South, too. There are two propositions that I suspect I can get a person to agree with directly, without coaxing, and then a third that will in all likelihood require further elaboration and reflection. I’m going to share a few observations about “the South,” but with the proviso that I mean the phrase and not the place. What I’m wondering is what it means to call some expanse of territory “the South.”

    What I need us to see first is that the word “South” is, in the US context and probably most others besides, entirely optional. You might imagine yourself reading these words in Tennessee somewhere, west of the Appalachians. We often refer to that patch of the planet as “the South,” but we could and do call it other things. A person might for instance, feel a certain attachment to the region marked out in burnt orange here:

    —or to the dusty pink region here:

    Such a person would have a few different choices about what to call either of those contiguities. One of them he could call the slave states; the other he could call the Confederacy or the former Confederacy; or Dixie or Dixieland or maybe the Southeast.

    So again, the word South is optional. Second example, in one sentence:

    Have you ever noticed that Biafra is only ever referred to as eastern Nigeria?

    Third example. Anyone interested in the American South would still do well to read that group of writers variously referred to as the Vanderbilt Agrarians and the Nashville Agrarians and the Tennessee Agrarians and the Southern Agrarians.[1] A small admission: Only Edmund Wilson seems to have called them the Tennessee Agrarians, though that is what he called them.[2] Vanderbilt, Nashville, Tennessee, Southern. The nested redundancy of that list makes especially plain, I think, the elective quality of the word “southern.” Other designations are always available. Nor are these words plain synonyms—not at all. The word “Southern” is doing something slightly different in this formulation than would the words “Vanderbilt” or “Nashville”—it has different effects—and it falls to us to say what these are.

    The point that jumps out, of course, is that the South stands here at the end of a spectrum of widening scope, from the pinpoint of Vanderbilt to the eleven-state sectional vastness of the South. But let’s whittle the list back to Nashville vs. Southern, since these are the two main rivals when it comes to placing the Agrarians. We’ll want to register the localizing specificity of the one vs. the relative under-determination of the other. What we must not overlook here is that option #4 does not merely name the widest among the four spaces—from campus to city to state to region. It’s that the word “Southern” doesn’t actually name a place in the same way that these others do.

    All I mean is that the South is a non-specific term—the only word on that list that is not a proper name. I would readily grant that even proper names aren’t as specifying as we usually take them to be, but the word “Southern” doesn’t even make a pretense of particularity, or whatever pretense it makes is easily dissipated. The word can’t help but refer beyond itself. A group of scholars, gathered at Vanderbilt in the spring of 2016 to talk about the American South and the global South, called their meeting “The South and the South.” Not all terms repeat so consequentially. There is a Nashville in Michigan, population 1600, but I doubt that a dozen professors have ever gathered to talk about “Nashville and Nashville.” What would it get you? The South and the South, however, gets you Alabama and Ghana. Or Ghana and Alabama. Of course, there’s no way to confirm the sequence. Which, after all, is the base South and which the South prime? The US South and the global South—when we hear the phrase “the South and the South,” do you and I mentally put them in the same order? And couldn’t there be other Souths beyond two? Southern Italy used to be organized into large estates worked by subjugated labor and regarded by Northern reformers as both immoral and an obstacle to national unity. So why not organize a conference on “The South and the South and the South?” Couldn’t il Mezzogiorno be our South double prime? But then where would that leave Okinawa or le Midi or the Saharan Maghreb?

    So I’m wondering what happens when we conceptualize a region as “the South.” What happens for that matter when we conceptualize it as “a region”? What I think I can show you is that these terms are both commonplace and problem-laden.

    The core claim in all Southern regionalist thinking is that the US South is a distinctive place. This sometimes goes hand in hand with the idea that the North has become a non-place, featureless, generic, that the North is ambient and all-purpose America. And if you believe this last, then the opposition North-versus-South is not exactly a battle between two places, because the South has a special claim on place-ness as such.

    What I want to show you is that this last claim is completely untenable, and I want to demonstrate this on something other than empirical grounds, though I’m sure there are ways of making the argument empirically, as well. But my claim is not a factual one—to the effect that Southern distinctiveness has eroded before mass media and the Internet and the strip-mall monoculture. That’s probably true, but it’s a truth you don’t need, because even before you get to the statistics and the comparative sociology, a case is there to be made on conceptual and discursive grounds.

    Here, then, is my non-empirical point: When Southern regionalists rise to talk about the South, they inevitably find themselves talking about something else, somewhere else, other places. The South won’t stay fixed. I’m not just saying that it’s hard to tell which states are and aren’t the South, though I’m sure that’s true. Nor am I saying that the South is internally differentiated, such that there are many Souths: the Piedmont, the Black Belt, the Appalachians, the Chesapeake. I’m saying that the South won’t stay still, that the South has a way of leaving the South or of sliding all over the map.

    When you go to talk about a region, anything other than a proper name will introduce into your discussion a degree of abstraction, inserting the region into a set, an overlay of universalizing claims, and these will erode the very particularism that you have undertaken to defend. And the South, we know, is not a proper name. Let’s just take a sentence like this one: “The Southern Agrarians were regionalists.” It’s an unexceptional, declarative sort of thing—very much the kind of sentence you could find in an undergraduate essay or in Paul Conkin’s book on the Vanderbilt gang or in Dorman’s Revolt of the Provinces.[3] Again: “The Southern Agrarians were regionalists.” Now that sentence is both descriptively true and a total mess. It contains three semiotically robust terms—regionalist, agrarian, and southern—and if you fix your gaze successively upon each one, it will crumble before you.

    The trouble with the word “agrarian” is easily explained. It is both a universal and in its own way a qualifier, referring indiscriminately to any society in which peasants or small farmers predominate, including the great many such societies that have existed outside of Virginia and Louisiana. Just as important, within the immensity of the US South, it refers only to such formations. It doesn’t matter what any given writer thinks he is doing, the drift of the word “agrarian” has always been to link certain subregions of the South to a great many regions outside the South. To any attempt to promote the South qua South, it thus introduces non-identity on two fronts. The word will always point beyond the South, putting pressure on the Vanderbilt poet to prefer Wisconsin (with its 76,000 farms) over North Carolina (with its 48,000). Already springloaded into the word “agrarian,” then, is Donald Davidson’s shock upon realizing that he preferred Vermont to a great many places in the old Confederacy. It harbors the joy of countrysides not your own.

    The word “regionalist,” meanwhile, introduces non-identity in even more arid a form. There are two different routes by which the concept of regionalism can betray itself. If I take as my project the defense of some region—the South—then I am declaring myself indifferent to the vast and open-ended list of other localities that could hypothetically command my regard: the Southwest, the Mountain West, the Plains. Each region will negate all other regions, striving to constitute itself as a set of one, hence to cast off the concept that is its double and its defeat. The South exists as a distinct “region” only until you name it as such, at which point it gets crammed into the lumpy sack of places radically unlike itself. Any one region comes at the expense of countless other provinces and districts, and so, too, does the word “region.” Alternately, then, I can vow to defend regionalism as such, any regionalism, just by virtue of its being regional, in which case, of course, I have ensured that I will never, in fact, defend any particular region at all—that’s the second betrayal: regionalism with all its options open. The Southern partisan who calls himself a regionalist has resolved in advance to make common cause with Massachusetts.

    Nor does the word “Southern” fare any better. Here are three arguments made on behalf of Southern distinctiveness, trying above all to account for the divergence between North and South.

    1) One hears that the South is—or that it was at one point—the only North American instance of a traditional society, the continent’s only guardian of the non-bourgeois virtues. We can sharpen this: Below the Potomac, American society was once dominated by non-commercial and semi-commercial agriculture, and because of this, it incubated a set of priorities that by the standards of Boston or New York seemed downright anti-bourgeois: a society that preferred leisure to work; that preferred aesthetics to efficiency; a society united by myth or religion or shared belief, and not by the contract; a society premised on care and mutuality and not on competition; a society that promoted humility rather than striving and conversation rather than consumption.[4]

    2) That’s one argument. Here’s a second. One also hears that the South diverged from the North early on because most of its white immigrants came not from England, but from England’s Celtic periphery. The core of white Southern culture, to which other groups have largely assimilated, has always been Irish and Scottish and Scots-Irish. Even English migrants to the South were more likely to come from the English borderlands and hill country—they were honorary Celts, easily absorbed by a Scottish majority. The task, then, is to give up on our sense of the white South as WASP—to learn to tell the difference between white people, and indeed, the difference between Anglos, whereupon we will be in a position to see that a great many of these latter weren’t actually Anglos at all.[5]

    I want to say a few words about why this argument matters, if right. People often wonder why the American myth of national origins gets routed exclusively through New England and the Pilgrims, when the Jamestown settlers beat the godly Northerners to the East Coast by a full decade. And the answer to that question has been easy to find—which is that early Virginia was a kakocracy of all-male, corporate-military misrule, wholly devoid of any principle or mission that didn’t include subordinating the Indians and hoping for gold.[6] If you want the US to have a purpose or program, you have to go through Plymouth Plantation. The usual line is that the first Southerners were just the rapacious side of the English establishment.

    But the Celtic thesis modifies this view in a big way. If this line were ever to gain general acceptance, which seems unlikely, then there is much in our histories of colonial North America that would have to change. The claim here is that Southern society hosted an alternate separatism—the separatism of the Celtic diaspora, in a period when the English were stepping up the colonization of their own periphery. It’s not just about Ulster, which is the place around which this argument sometimes gets truncated. The claim, rather, is that pre-enclosure Scotland and un-improved Ireland enjoyed an afterlife in the Southern backcountry—that the older Celtic formations survived longer in the US than they did in Britain or much of Ireland. That idea will then turn some of the major regional conflicts in early US history into a battle of competing British minorities, with neither section forced to play the role of the Anglo-mainstream.

    There’s also a whisper of political economy to the Celtic thesis, which begins not by emphasizing culture or folkways, but by emphasizing the unusual features of the backcountry’s pastoralist economy. Settlers in the upland South farmed like the Scots or the Irish or the northwest English, which is to say that they hardly farmed at all, keeping herds on open ranges, resigned in advance to some going missing and others dying, not bothering with barns or winter fodder or market crops or even gardens of any ambition. One could find that claim deeply challenging, since it blows apart generations of talk about Jeffersonian yeomen and the commitments of American republicanism. If this account accurately describes the small farmers of the Southern hill country, then the crackers weren’t yeomen. They were barely even farmers.

    3) A third argument holds that the South is different from the North because for much of its history the former has been an economic dependency of the latter. The South was an internal colony, its fortunes largely dictated by Northern capital, hence underdeveloped in the familiar colonial manner: handed over to extractive industries and monocrop exports, its economy staffed by hyper-exploited labor, its railroads and forests and mines largely owned by outsiders who systematically diverted profits out of the region.[7]

    Those, then, are three common accounts of what has made the American South distinctive: The South is or was a traditional society, the South is or was Celtic, the South is or was an internal colony.

    Let’s go back to the first argument and ask: What happens when you claim that the South was different because it housed a “traditional society”? What does that claim do to the South as a concept? We can answer that question by looking at the footnote that Allen Tate inserted at the very beginning of his contribution to I’ll Take My Stand:

    The writer is constrained to point out (with the permission of the other contributors) that in his opinion the general title of this book is not quite true to its aims. It emphasizes the fact of exclusiveness rather than its benefits; it points to a particular home of a spirit that may also have lived elsewhere and that this mansion, in short, was incidentally made with hands.[8]

    These two sentences deserve to be restated. Tate is determined to explain right up front that he doesn’t think that he and his fellows should really be focusing on the South. That approach, he says, is too “exclusive,” too tied to a “particular home”; it won’t recruit a general readership. The whole point is that what Tate has consented to misdescribe as the Southern way “may also have lived elsewhere.”

    Tate’s words allow us to say something important. Even in the core texts of Southern regionalist thinking, the South is shadowed by an elsewhere, and it is this elsewhere that grows up in the gap between the concept of “the South,” understood as the old Confederacy, and the concept of “traditional society,” which isn’t bound to any region. So this argument, too, hands the Southern intellectual over to a non-regional regionalism. Even in the founding documents of Southern regionalism, its partisans were not, in fact, defending the South. But we can get more specific about this. For intellectuals like Tate, the “elsewhere” has a name.

    Tate: “We must be the last Europeans—there being no Europeans in Europe at present.”

    Ransom: “The South is unique on this continent for having founded and defended a culture which was according to the European principles of culture.”

    Davidson: “The cause of the South was and is the cause of Western civilization itself.”[9]

    The pattern is hard to miss: the last Europeans, European principles, Western civilization. Europe, Europe, the West. Richard Weaver was arguing in the 1950s that the Nashville Agrarians were in large part the product of the Rhodes Scholarship. In England, he said, a “suspicion began to dawn that the society they had grown up with in the South was in the main tradition of Western European civilization.”[10] Agrarianism, in other words, was interested in Virginia and Kentucky, true enough, but it was interested in those places as mediated through England. The Vanderbilt Twelve weren’t just Southern, they were Oxbridge Southern.

    Let’s not worry too much about the group biography of the Agrarians, though, because conceptually the point is even more compelling, since what we see here is that the South has begun sliding around the compass; it is refusing to stay anchored at 6:00. This entire discourse is what we can call the South as West, and it yields what I would like us to consider a general point: That Southern regionalism typically works by way of geographical conflations of this kind. The non-identity of the South is plainest when it borrows the names of other headings or when one cardinal point reinvents itself as a second. In the Dixie Theogony, the south wind can blow from any direction.

    The reader might need more convincing on that last point, so I’ll bring forward another instance. What is the geographical unconscious of the Celtic thesis? It will be enough to scan a passage from the standard citation on that topic:

    To understand regional differences in the US, one has to grasp that…

     …by virtue of historical accident, the American colonies south and west of Pennsylvania were peopled during the seventeenth and eighteenth centuries mainly by immigrants from the ‘Celtic fringe’ of the British archipelago—the western and northern uplands of England, Wales, the Scottish Highlands and Borders, the Hebrides, and Ireland—and that the culture these people brought with them and to a large extent retained in the New World accounts in considerable measure for the differences between them and the Yankees of New England, most of whom originated in the lowland southeastern half of the island of Britain.[11]

    You can see the transposition there. From north to south and south to north. The south was peopled by immigrants from the northern uplands. The Yankees originated in the southern half. What the Celtic thesis celebrates in the South is its Northernness, and this, of course, yields the most complete of the regionalist flippings—a proper dialectical Umschlag, in fact: The South as North. From a certain neo-Confederate perspective, the problem with New England is that it is insufficiently northern. One defends the South in order to safeguard the old boreal ways.

    The South as North—of course, that idea is available in a second version, as well. Before the Civil War, the South sent out freelance imperialists in all directions. The planter leadership was not just promoting westward expansion along the lines set down by the Missouri Compromise. Many in the South embraced a more general program of Southern, slave-power expansion. Southerners operating independently of the US government tried repeatedly to invade Cuba. A native of Nashville conquered Nicaragua and appointed himself president of that country. Militant planter-politicians had thoughts of uniting with Brazil or perhaps of seizing large sections of the Amazon. A native of Tennessee led the US occupation of the Dominican Republic in 1916. The Secretary of the Navy who presided over the invasion of Haiti in 1915 was from North Carolina. All I mean to say is that the South is a relational term, and that once you pass Key West, the South transposes at once into the north, an imperial north in which there are only Yankees. Any Southerners reading these sentences will have to face up to this. Proud not to be Yankees, they are Yanquis all the same.

    So what, then, of the claim that the South has been colonized, that it is the proximate victim of New England imperialism? First, we’ll want to take stock of a sentence that John Crowe Ransom wrote when he was in his 40s: “By poets, religionists, Orientals, and sensitive people, nature is feared and loved.”[12] That sentence appeared in 1929; I’ll Take My Stand was published the following year, and read side by side, it is easy to see that the achievement of that second volume was to add Southerners to the list of nature’s adorers—or indeed, to let the white Southerner absorb those other four positions into himself: the poetic Southerner, the sensitive Southerner, the religious Southerner, the Oriental Southerner. I’ll Take My Stand gives us the Virginian as Chinese aesthete and anchorite or the Tarheel as mandala-painting monk. Just as curious, this appetite for self-Orientalizing is what the new Southern studies most inherits from the Agrarians, precisely when it takes itself to be critical.

    There is more than a passing resemblance between some writers of the Southern Renaissance and the characters of novelist Chinua Achebe, who are caught up in the cultural conflict triggered by the growing Westernization of African life.[13]

    We’ll want to note: That last sentence was written by James Cobb, a historian at the University of Georgia and one-time president of the Southern Historical Association. Cobb uses the word “Westernization,” from which we need merely work backwards. Faulkner is the writer of the Westernization of the South, which makes of the South a colonized East or generic Orient. Or there’s this, by the Duke historian John Cell:

    By the mid-1870s, however, it was clear that the strategy of direct rule [by the North in the defeated South] had failed. But as other imperialists [other than the Yankees]—notably the British in the Indian princely states, Malaya, Nigeria, and elsewhere—have also discovered, the goal of economic hegemony could be achieved as well, and with much less trouble and expense, through subtler forms of indirect political domination.[14]

    Or there’s this, by a senior historian at Clemson:

    In the following passages by Said, I have substituted the words northern or Yankee for Said’s European, West, and the like, and the words South or southern for Orient, Arab, (Mid)East, and so on.[15]

    This last is especially telling, since it is entirely candid about the search-and-replace quality of the whole endeavor. Regionalism as a master discourse will attach its claims indiscriminately to any compass point. The arguments remain the same; you just swap in fresh coordinates.

    The South as West, the South as North, the South as East. Traveling the rim of the compass, we end up back at South as South, but now this last formulation appears to us in changed form, internally riven, no longer simply itself. To wit:

    Woodward: “Like republics below the Rio Grande the South was limited largely to the role of a producer of raw materials, a tributary of industrial powers, an economy dominated by absentee owners.”

    McWhiney: “John Morgan Dederer claims that ‘the tribal Celtic-Southerner’s culture and folk traits were so compatible with those of the Africans that it took little adaptation for slaves to fit Celtic characteristics around their African practices.’ Many of the Indians of the Old South practiced lifestyles quite similar to those of their Celtic neighbors.”[16]

    This is, of course, a second version of the South as colony, but now without the language of the East or the Orient. In this variant, the South gets to keep its name, but is nonetheless doubled by other Souths. In a fit of tribalism and primitivism, the old Confederacy enlists in the ranks of the darker nations. This last should help us see what is most distinctive about Southernism as a discourse, that it produces a contradiction or particular intertwining, enrolling itself simultaneously in the West and the non-West, naming itself the South while also claiming to run true north. The South positions itself as the repository of all Western values, as civilization’s last American garrison, while also embracing an unlikely anti-colonialism. It is in the discourse of the twofold South that the colonizers learn to wrap themselves in the pathos of the colonized. We can spot this most efficiently in the words of Thomas Fleming, a classicist trained at UNC:

     The original Klan was a national liberation army made up of [those] who refused to accept their status as a subjugated people.[17]

    That is the idiom of white-supremacist Third Worldism, which is the position made possible by the indistinct and never-regional word “South.” The South and the South: Or, the masters enroll themselves in the coalition.

    Notes

    [1] See I’ll Take My Stand: The South and the Agrarian Tradition (New York: Harper & Brothers, 1930). The title page attributes the book to “Twelve Southerners.” For background, see Paul Conkin’s Southern Agrarians (Knoxville: University of Tennessee Press, 1988).

    [2] Edmund Wilson, “Tennessee Agrarians,” New Republic, 29 July 1931.

    [3] In addition to Conkin, see Dorman’s Revolt of the Provinces: The Regionalist Movement in America, 1920-1945 (Chapel Hill: University of North Carolina Press, 1993).

    [4] See, for instance, Allen Tate’s “Remarks on the Southern Religion” in I’ll Take My Stand, pp. 155 – 175.

    [5] See Grady McWhiney’s Cracker Culture: Celtic Ways in the Old South (Tuscaloosa: University of Alabama Press, 1988).

    [6] See Edmund Morgan’s American Slavery, American Freedom: The Ordeal of Colonial Virginia (New York: W. W. Norton, 1975).

    [7] See, among many others, Natalie Ring’s The Problem South: Region, Empire, and the New Liberal State, 1880 – 1930 (Athens: University of Georgia Press, 2012).

    [8] Tate, p. 155.

    [9] Tate’s letter to Donald Davidson is quoted in Paul Murphy’s Rebuke of History: Southern Agrarians and American Conservative Thought (Chapel Hill: University of North Carolina Press, 2001), p. 66; Ransom in I’ll Take My Stand, p. 3; Davidson, writing in Southern Writers in the Modern World, qtd in Murphy, p. 114.

    [10] Richard Weaver, “Agrarianism in Exile” in The Sewanee Review, 58.5 (1950), pp. 586 – 606, quotation p. 588.

    [11] McWhiney, p. xxi.

    [12] Ransom, God Without Thunder (New York: Harcourt, Brace, 1929), p. 31.

    [13] James C. Cobb, Away Down South: A History of Southern Identity (Oxford: Oxford University Press, 2005), p. 139.

    [14] John Cell, The Highest Stage of White Supremacy (1981) (Cambridge: Cambridge University Press, 1982), p. 146.

    [15] Orville Burton, “The South as ‘Other,’ the Southerner as Stranger,” Journal of Southern History, 79 (Feb. 2013), pp. 7–50, quotation p. 12.

    [16] C. Vann Woodward, Origins of the New South, 1877 – 1913 (No city given: Louisiana State University Press, 1951), p. 311; McWhiney, p. 21.

    [17] Fleming qtd in Murphy, p. 241.

     

  • Video Essay: All That Is Solid Melts Into Data

    Video Essay: All That Is Solid Melts Into Data

    dir. Ryan S. Jeffery and Boaz Levin

    This film is posted in anticipation of boundary 2‘s upcoming special issue –– Bernard Stiegler: Amateur Philosophy (January 2017).

    Equal parts building and machine, library and public utility, data centers are the unwitting monuments that the digital turn has raised to knowledge production. This film traces the historical evolution of these structures that make up “the cloud,” the physical repositories for the exponentially growing amount of human activity and communication taking form as digital data. While our “smart tools” and devices for communication become ever smaller, thinner, and sleeker, the digital sphere they require grows larger, demanding an ever-expanding physical infrastructure that affects and shapes our physical landscape. This film looks to the often-overlooked materiality of networked technologies in order to elucidate their social, environmental, and economic impact, and calls into question the structures of power that have developed out of the technologies of global computation.

  • Bruce Robbins: On the Non-representation of Atrocity

    Bruce Robbins: On the Non-representation of Atrocity

    by Bruce Robbins

    The closing day featured a formal keynote address by Bruce Robbins, followed by responses.  While the keynote practices a rousing, engaged, presentist, theoretical Victorian studies, the responses by Zach Samalin and Molly Clark Hillard, and the heated discussions at the symposium, point to other futures.  Elaine Hadley integrated a number of the arcs of discussion while also highlighting what remains to be argued.  We are grateful to b2o for providing this catalyst for yet more.    

    This essay was peer-reviewed by the editorial board of b2o: an online journal.

    Toward the end of Michael Ondaatje’s novel The English Patient (1992), the young Canadian ex-nurse Hana writes in a letter home to her stepmother: “From now on, I believe the personal will forever be at war with the public” (Ondaatje 1992, 292).

    Hana has just heard about the bombing of Hiroshima and Nagasaki, news that has shocked her Sikh lover Kip into leaving both her and the anti-Nazi war effort.  The unending war between the public and the personal that Hana dates “from now on” is the result of what we have come to call an atrocity: an act of extreme cruelty that is collective, unnecessary, and indiscriminate, the latter two adjectives judged to apply because (here I quote Jacques Sémelin’s definition of “massacre” in his book Purify and Destroy) it is “aimed at destroying non-combatants” (Sémelin 2007: 4). I will withhold comment for now on whether the war between the public and the personal (which echoes a vocabulary put in play a few years earlier by Fredric Jameson) is as new as Hana thinks; it sounds pretty Victorian to me.  But the atom bomb was definitely new.  And as a concept, the atrocity is also pretty new.  The idea of the “non-combatant” dates only from the Napoleonic Wars.  Both “non-combatant” and “atrocity” would seem to require the modern weakening of membership–the still recent assumption that individuals should not be held responsible for actions taken by the families or nations to which they belong.  “Cruel” and “fierce,” the meanings of “atrox,” the Latin source word for “atrocity,” did not begin their lives as pejoratives, but picked up pejorative meanings only as physical violence came to seem a less dependable aspect of ordinary lives, something that generally could and should be avoided.  The re-classification of violence as out of the ordinary is again associated, perhaps only wishfully, with modernity.

    But you only feel how very modern Ondaatje’s naming of the atom bomb as atrocity is when you add one more element.  Kip and Hana are recoiling from an action performed by their own side.  This is a moment of civilizational self-accusation.  It belongs to the very special subset of atrocity-response in which “we” accuse ourselves of doing something outrageously cruel, collective, and indiscriminate to “others.”

    Yes, Ondaatje is a Canadian and a Sri Lankan; Kurt Vonnegut’s Slaughterhouse Five might have been a better as well as an earlier example.  And yes, to play up the look-at-us-admitting-the-terrible-things-we-did-to-others criterion, as I’m preparing to do, could be seen as a celebratory re-write of Enlightenment self-scrutiny, in other words as a way of once again giving credit to the modern West for a virtue on which it has often prided itself, perhaps excessively.  Undeterred by these objections, I am going to forge ahead, assigning atrocity as self-accusation an important part in the long-term moral history of humankind and indicating a desire, at least, to place the novel within that larger history.  This of course assumes there exists such a thing as the long-term moral history of humankind.  It assumes that history need not be understood as a finer and finer discrimination of differences (a habit that I think the V21 group has very usefully expressed its impatience with) but can also be thought of as a series of experiments in the synthesis of differences—bold generalities, even “grand narratives.”

    It’s from the perspective of the long-term moral history of humankind that the question of atrocity is most interesting, and most humbling, for specialists in nineteenth-century British literature.  In the late 1970s, the editors of the journal New Left Review conducted a book-length series of interviews with Raymond Williams.  The interview that hit me hardest at the time dealt with Williams’ admiration for the novels of the 1840s, about which I had just heard him lecture.[i]  “In that decade,” the interviewers say,

    there occurred a cataclysmic event, far more dramatic than anything that happened in England, a very short geographical distance away, whose consequences were directly governed by the established order of the English state.  That was of course the famine in Ireland—a disaster without comparison in Europe.  Yet if we consult the two maps of either the official ideology of the period or the recorded subjective experience of its novels, neither of them extended to include this catastrophe right on their doorstep, causally connected to socio-political processes in England. (Williams 1981, 170)

    If this is true for catastrophic events in Europe, how much more true is it, the interviewers ask, for more distant colonies like India, where events were again directly affected by the imperial system?

    The NLR interviewers are asking us to imagine that even the English literature of the 1840s we most admire today was unable to represent disasters or cataclysmic events for which England was itself responsible, directly or indirectly.  It does not seem implausible that atrocity-representation in the narrow, self-accusatory sense might simply be missing from the history of the 19th century novel.  If you think of its greatest works, direct representations of any atrocity are certainly not the first things that come to mind.  We know our authors could express horror at the 1857 Mutiny in India or the Bulgarian Atrocities (committed by the Ottomans) or King Leopold’s mischief in the Congo or the occasional scene of mob violence.  But perhaps they simply could not summon up any English equivalent to Vonnegut’s horror at the Allied bombing of Dresden.  Perhaps the English could not imagine accusing themselves, at least not from the viewpoint of the non-English, at least not when the accusation would have been damning.  Were we to accept this hypothesis, which I offer up here as nothing more than a hypothesis, it seems clear that some of the going rationales for nineteenth-century studies, and maybe even for literary criticism in general, would be in jeopardy.

    In self-defense, we could of course argue that the criterion of self-accusation is unacceptably presentist. How could one expect the great epoch of European realism to “do” atrocity in the particular, self-accusing sense? Arguably such representations only became possible after European civilization had been shocked out of its pre-Copernican complacency by, for example, the Holocaust and the rise of anti-colonial movements. In the nineteenth century, those shocks were still to come. It would therefore be anachronistic to expect European literature to have re-set its default settings, which were presumably nationalist or at least national, and to have experimented even intermittently with cosmopolitan self-consciousness. Another field-defensive move would be to focus on the canon’s experimental outliers. As some of you probably know, there exists a body of scholarship qualifying the claim that outside Ireland the Irish Famine did indeed go unrepresented. Much of that scholarship deals with minor works by Trollope. To me, those works seem both aesthetically and politically uninspiring. But perhaps one can do better. More inspiring, among the potential counter-examples, would be Multatuli’s 1860 novel Max Havelaar: Or the Coffee Auctions of the Dutch Trading Company, which has been credited with starting the anti-colonial movement in Indonesia.  Or Tolstoy’s final work of fiction, Hadji Murat.

    Hadji Murat is set during the mid-19th century Russian conquests of the East that Tolstoy himself participated in as a young man and that so neatly mirror the genocide of the Native Americans that the US was carrying out in the same years in the American West.  At one point it describes the destruction of an indigenous village in the Caucasus in what would now be called Chechnya.  Tolstoy shows us the army’s burning of the Chechen village through the eyes of a Russian soldier.  The Russian’s mind is elsewhere, preoccupied with a theme that could not be more conventional for people like him: money he has lost at cards.  For him it is an unremarkable day, so the reader sees nothing remarkable: “War presented itself to him only as a matter of subjecting himself to danger, to the possibility of death, and thereby earning awards, and the respect of his comrades here and of his friends in Russia. . . . The mountaineers [he does not call them Chechens] presented themselves to him only as dzhigit horsemen from whom one had to defend oneself” (Tolstoy 2009, 78). Given this failure of imagination on the Russian side, the narrator must step in and, somewhat intrusively, make a connection on the next page that no one within the novel’s world is there to make:

    The aoul devastated by the raid was the one in which Hadji Murat had spent the night before his coming over to the Russians. . . . When he came back to his aoul, [Sado, at whose house we have seen Hadji Murat greeted hospitably in the novel’s first scene despite the extreme danger the host is in] found his saklya destroyed: the roof had fallen in, the door and posts of the little gallery were burned down, and the inside was befouled. His son, the handsome boy with shining eyes who had looked rapturously at Hadji Murat, was brought dead to the mosque on a horse covered by a burka. He had been stabbed in the back with a bayonet. (Tolstoy 2009, 79)

    The sentence about the child bayoneted in the back does not end the paragraph.  There is no pause for the drawing of conclusions, moral or otherwise.  It’s as if, from a Russian point of view, a Chechen child who has been bayoneted in the back is not ethically or emotionally forceful enough to interrupt the flow of narration, not enough to justify even the briefest of hesitations.  It’s not surprising that Tolstoy could not get that book published in full in his country in his lifetime.  It’s surprising that he left this record at all.

    Something could no doubt be said about the depiction in nineteenth-century literature of the poor and the homeless as internal aliens, hence sufficiently “other” to count as victims of atrocity in my limited sense.  I’m thinking of, say, Victor Hugo (the army firing on the barricades in Les Misérables) or Bleak House’s description of the death of Jo: “And dying thus around us every day” (Dickens 1998, 677).  One could also go back to the criterion that the NLR interviewers apply to Raymond Williams (and that Williams himself does not dispute): the premise that criticism should aim to reconstruct, through literature, “the total historical process at the time” (Williams 1981, 170).  Who says the novelists of the 1840s were obliged to talk about the Irish Famine, a (to them) invisible part of the (to us) larger causal system?[ii] Perhaps this is asking for something the novel simply could not and even cannot deliver. Perhaps we should content ourselves with what it can deliver, even if that seems a humbler thing.  This line of thinking may have encouraged some critics to urge a dialing back of the political and ethical claims we make.  A modest anti-presentism of this sort would certainly make it easier for those 19th century specialists who are professionally uncomfortable with atrocity to return to what they were already doing, undisturbed by any nagging sense of responsibility to imperatives they see as coming from outside the field.

    My own impulse is not to back down from “the total historical process” criterion.  Which means I’m stuck with atrocity, however presentist the topic may seem.  What I’d like to try out therefore is a different negotiation between present imperatives and period loyalties, between history as the proliferation of differences (differences that may turn out to be trivial) and history as synthesis (synthesis that avoids triviality but could seem to lack rigor as the field defines it).

    The concept of atrocity may be new, but the thing of course is not. It seems admirable to me that much new scholarship is willing to hold off on the familiar nominalist-historicist move (there is no true history but the recent history of the name, the concept) and instead to take on the deeper history of as yet unnamed things, a trans-historical history much of which (like the atrocity) is inescapably pre-modern.  I’m thinking for example of the thunderous “no!” to periodization itself that is proclaimed in Susan Stanford Friedman’s Planetary Modernisms and the challenge to “periodizing divisions between premodern and modern” in the introduction that Sahar Amer and Laura Doyle wrote to “Reframing Postcolonial and Global Studies in the Longer Durée,” a special section of the latest PMLA (Friedman 2015, 331). Both texts accuse conventional periodization of sustaining Eurocentrism.  It seems to me that both share important concerns with the V21 manifesto and its impatience with period-centered thinking.

    I hope you agree that the V21 project belongs in the context of a broader acknowledgment that learning to work in an enlarged, trans-period time scale is no longer optional.  The reasons behind this new temporal common sense are not unfamiliar, but it may be helpful to gather a few of them together. Among the best known is the emergence of the term “anthropocene” to mark the salience of an ecological perspective at the level of the planet.  Among the least known is the emergence of an international movement of indigenous peoples, one premise of which is that colonialism is not something done solely by European settlers or done solely after 1492.  Joining the two are books like Pekka Hämäläinen’s The Comanche Empire, which gives the Comanches credit, if that’s the right word, for themselves practicing colonialism, and justifies their conduct (again, if justifying remains a pertinent concept) in terms of their superior ecological adaptation.  Logically enough, the new sub-disciplines of “world” history and “big” history are notable for an impulse, sometimes conscious and sometimes not, to do without moral judgment entirely. Some declare that to arrange history around the values of “democracy,” for example, would be inexcusably teleological and provincial. The same vector appears in another important zone of temporal stretching: the postcolonial critique of Eurocentrism. Here of course it seems even more paradoxical, dependent as postcolonial studies has been on a politicized model of European core, non-European periphery. But as Alexander Beecroft has argued, this model, useful enough for the recent past, simply doesn’t apply to most of the world’s cultures during most of the world’s history. China and India two or three or four thousand years ago were in no sense peripheries to Europe’s core.
It would be temporally parochial, therefore, to take the particular inequalities and injustices of the recent past as a guide to the interpretation of Indian or Chinese culture. Thus the cosmopolitanism with which we are most familiar, call it cosmopolitanism in space, brings with it a corresponding cosmopolitanism in time, and this temporal cosmopolitanism ends up undermining habits of ethico-political judgment based on an outmoded core-periphery geography. Here I am re-describing the emergence of a somewhat depoliticized “world literature” out of a very political “postcolonial studies.” For better or worse, re-describing it in this way makes it harder to complain about.

    Expanding our time-frame seems inevitable. As does some evening out of the blame for imperialism, which can no longer seem the moral burden of Europe alone. The long-term question for V21, it seems to me, is how to manage this expansion beyond the period while sustaining the moral and political commitments that make the critical enterprise worth doing at all.  The immediate question is where in this revisionist scale and sense of history I can find a home for my interest in atrocity, an interest that takes for granted the centrality of critique.

    From this perspective, the first thing I notice about interesting new work on an expanded time-scale is that atrocity tends to get left out. For Amer and Doyle, the familiar European version of imperialism was only one in a long series of imperialisms before 1500, many of them non-European. Rather than insisting that the presence or absence of capitalism made all the difference, they suggest, we need to find a way of talking about European and non-European imperialisms in the same breath. That seems right. But what this can mean in practice is that imperialism’s violence is omitted, perhaps because it is assumed that moral critique of imperialism would be anachronistic and/or Eurocentric or because blaming has come to seem pointless and irrelevant.  Hence there is no vocabulary for atrocities. Historically speaking, Amer and Doyle are gradualists. The premodern for them was already modern; the difference is merely a matter of detail and degree. From their moderate anti-periodization position, anything that looks like violent rupture, such as modernity, is actually always the result of small, slow accretions.  It’s as if their distaste for violent rupture at the level of periodization is duplicated in a distaste for violence as social content. Violence exists for them, of course, but not as a conundrum; it’s not interesting enough to demand interpretation. What’s interesting about the world’s interconnectedness is commercial contact and cultural exchange. There are empires, but when it’s pre-moderns or (especially) non-Europeans who are doing the slaughtering and conquering, what suddenly kicks in is a great deal of respect for the empire-builders and for the cultural consequences of their empire- building.  Coercion is not absolutely forgotten, but it’s rarely stage center. 
This is arguably just as presentist as the older focus on domination and atrocity, but it’s presentist in a different way: a projection onto the past of globalization’s smug, all-cultures-are-equal case, a case which does not harp on inequalities of economic and political power.

    The closest Susan Stanford Friedman comes to a statement on imperial coercion is as follows: “empires typically intensify the rate of rupture and accelerate change in ways that are both dystopic and utopic” (Friedman 2015, 337). What she calls “brutalities” can of course be recognized, but only as a general phenomenon that 1) is balanced in advance by the “utopic” aspects of empire, and in part for that reason, 2) is in no way interesting or worthy of being investigated (Friedman 2015, 337). The problem here is not the reluctance of a sluggish, fuddy-duddy field to innovate.  The problem is the innovation, an anti-rupture position that makes things like atrocity harder to see, or to teach.  Sometimes that seems to be the whole point of innovating. I think for example of Rita Felski’s mobilizing of Actor Network Theory against “the rhetoric of negativity that has dominated literary studies in recent years: a heavy reliance on critique and the casting of aesthetic value in terms of negation, subversion, and rupture.”

    Neither history’s narrative form nor its social content can be all rupture all the time.  But unless it has rupture in it, it’s not history at all.  And even those of us who are most impatient with the restrictiveness of existing periodization should not want, finally, to give up on history as such.  Laura Doyle notes that there were slave revolts in the Abbasid Empire of the 9th century just as there were “anticolonial movements” in the twentieth century (Doyle 2015, 345). This observation only becomes genuinely historical if one goes on to ask whether the slave revolts of the 9th century might have been different in kind–more precisely, whether they were in fact anti-colonial or anti-imperialist.  They may have been, and they may not have been. These may have been slaves who not unreasonably preferred to have slaves rather than to be slaves.  The difference is important.  In order to know, you would have to be interested not just in the history of imperialism, but in the history of anti-imperialism.  You would have to decide that anti-imperialism has a history.  It’s the difference between asking when people were merely complaining that we suffer under imperial rule (probably as long as there have been conquests) and when they began saying that others may have suffered under our rule–a universalizing moment that is probably more recent and more rare. This would bring us back to the representation of atrocity as self-accusation.

    If there was a moment when the feeling “I am angry at your country for conquering mine and ruling it by a harsher standard than you apply to your own” metamorphosed into something like “it is wrong for any country, including my own, to conquer any other,” wouldn’t we want to know something about it?  It might turn out that this only occurs with or after that violent rupture we call modernity.  As a historical fact, wholesale raping, pillaging, plundering, and slaughtering are of course characteristic of many if not most pre-modern societies.  I think for example of the ethnic cleansing of the Midianites in the Old Testament, which raises a red flag for Moses only because his troops left the very old and the very young Midianites alive, alongside the nubile maidens, and therefore had to be told to go back and finish the job.  The chapter on the ancient Near East and classical Greece in David Johnston’s magisterial history of justice concludes that “commitments to freedom and equality” are “nowhere to be seen” in the domestic laws of the ancient world, but it doesn’t even bother to ask about foreign policy–about the possible existence of scruples as to, say, violence against members of other groups, tribes, nations (Johnston 2011, 15).  For “our” treatment of “them,” there were no rules.  As Michael Freeman says in the entry for “Genocide” in the Dictionary of Ethics, Theology and Society: “Genocide was not a moral problem for the ancient world.  It is for the modern world because moral and political values have changed” (Clarke and Linzey 1996, 403). As everyone knows, the Greek word from which we get apology, apologia, “does not involve an acknowledgement of transgression and, thus, needs no request for pardon or forgiveness” (Lazare 2004, 24). Atrocity is everywhere in ancient times, but not (to my knowledge) as representation.
In the West, at any rate—I can’t speak for other cultures, and I have some trouble pretending to speak for the West—it is only when “moral and political values have changed” that one can expect to see representations of atrocity.  If we say that the atrocity is a construct, one thing we would mean is that in order for it to be discussed, a moral norm that it violates first had to emerge or be invented.  It’s in this sense that, even if representations of atrocity are indeed missing from the great literature of the 19th century, the atrocity is also a nineteenth century topic.

    I am not talking here about Steven Pinker’s highly questionable argument that modernity is in some fundamental way opposed to violence.  (This from someone whose book has no entry in its index for “colonialism”!)  I am talking only about the emergence of moral norms, whether or not those norms were violated in practice.  This story is untellable without the nineteenth century.  You know the moments of emergence I have in mind: the transfer of Jacobin ideals to the Haitian Revolution, Burke on Warren Hastings, Marx on the British in India, Henri Dunant deciding at Solferino that warfare had to be regulated, Tolstoy deciding that the Chechens should be permitted to survive as Chechens, and so on. I think it’s also a story that we could find, if we chose to look, entangled in the forms of the 19th century canon.

    What would it say about us if, for fear of falling into Whiggish triumphalism, we turned out to be incapable of acknowledging even that moral history, partial and incomplete and unsatisfying as it is?  One thing it would say is that we prefer to leave atrocity without a history.  I hope we don’t.  There is of course a deep, largely unacknowledged tension between the working assumptions of the humanities and the idea of progress—progress even as a possibility.  Any admission of possible progress threatens the value of canonical texts. That’s arguably why we have been so eager to prostrate ourselves before Walter Benjamin’s Angel of History rather than asking, in a secular and open-minded way, whether what we see before us is really nothing but an ever-increasing accumulation of ruins.

    According to Helen Small’s definition in The Value of the Humanities, the humanities “respect the products of past human endeavors in culture, even when superseded” (Small 2013, 57).  “Even when superseded” is a phrase you don’t hear much in literature departments.  To admit that cultural products and endeavors might ever be “superseded” is to call into question our presumptive respect or rather reverence for them, which Small is trying here to affirm, and that is a prospect that critics less courageous than she is would prefer not to recognize.  And yet there are moments when, like Helen Small, we are all brave enough to admit to some progressive thinking.  About our assumptions on race, class, gender, and sexuality, which we assume (correctly) to have improved.  Or about “our own work.”

    In her book The Deaths of the Author, Jane Gallop notices that when Gayatri Spivak talks about her work as the author of A Critique of Postcolonial Reason, she uses the word “progress,” as in the sentence, “My book charts a practitioner’s progress” (Gallop 2011, 130). “‘Progress,’” Gallop goes on, “does not seem like a word one would expect Spivak to use.  The word ‘progress’ generally denotes the most triumphant relation to temporality.  ‘Progress’ here represents the least troubled or troubling, the most positive version of a writer’s change over time” (Gallop 2011, 130).  In fact, she concludes, this somewhat conventional phrasing is “quite atypical of the book” (Gallop 2011, 131).[iii]

    A similar inconsistency pops up in Max Weber’s famous lecture Wissenschaft als Beruf (Scholarship as a Vocation).  It is the strong argument of that lecture that we have fallen into what Weber calls polytheism, a somewhat melancholic condition in which progress is impossible because each collectivity follows its own gods and there is no commonly shared membership, no overarching religious or political principle that would adjudicate among them or mark out any course of action as an advance over any other.  And yet Weber also says that scholars-to-be must resign themselves to seeing their work rendered obsolescent by those researchers who come afterwards.  Unlike art, where “there is no progress,” Weber says, scholarship or Wissenschaft (the translation calls it “science”) “is chained to the course of progress” (Weber 1946, 137).  “In science,” as a result, “each of us knows that what he has accomplished will be antiquated in ten, twenty, fifty years.  That is the fate to which science is subjected; it is the very meaning of scientific work … Every scientific ‘fulfilment’ raises new ‘questions’; it asks to be ‘surpassed’ and outdated.  Whoever wishes to serve science has to resign himself to this fact” (Weber 1946, 138). If our work will be surpassed and outdated, that is not just something to which we have to resign ourselves; it’s not just a grim fate to which we are “chained.” It’s also a fact that ought to give us a certain satisfaction. It means we belong to a collectivity which recognizes the value of our work, takes advantage of it, and builds on it. The suggestion here is that you would need to feel you belong to a relatively tight collectivity in order to be able to experience progress. So there is such a thing as progress after all—progress at the level of research, progress within the community of scholars, provided that the community of scholars really is in a strong sense a community.

    I have made a little collection of instances like these in which a scholar will deny progress in general but affirm it within the domain of scholarship.  The point is not to poke fun.  This apparent contradiction can be explained, I think, without any indignity to the scholars concerned.  The reason we can acknowledge progress within scholarship is that as scholars we feel ourselves to belong to a collectivity. As citizens, on the other hand, we do not tend to experience collectivity of this sort on a regular basis or indeed to seek it out. At a recent conference on Stuart Hall, I found myself saying that if Hall defended the now old-fashioned-sounding idea of “theoretical gains,” it was because he thought of himself first and foremost not as a writer and scholar but as a member of a movement. If you are a member of a movement, you have a rough measure by which progress can be calculated. Progress is no longer unthinkable or embarrassing.  Hall’s example is worth contemplating, and not just so as to achieve consistency. I don’t see why those of us who think of ourselves as progressives–and there are a lot of us–are so reluctant to seek real-world equivalents for the scholarly experience of collectivity, thereby permitting us to recognize in the world we write about more of the progress we sometimes recognize in our own writing.

    I’m not trying to encourage Whiggish or Eurocentric complacency.  At present, all I really have is questions and areas for further research. I for one would like to know how it was possible for Ishikawa Tatsuzo’s 1938 novel Soldiers Alive to document atrocities committed by his fellow Japanese against Chinese civilians within months of the 1937 Rape of Nanjing.[iv] Were there precedents in the Japanese literature of the 19th century that prepared for this extraordinary feat?  Or perhaps earlier?  I’m sure there is more than one path leading to national self-accusation, both on the global scale and within the various European traditions.  At whatever risk to the hypotheses advanced thus far, I would like to know more about Grimmelshausen’s Simplicissimus, with its extraordinary accounts of the atrocities committed during the Thirty Years’ War, or before that Bartolomé de las Casas, with his extraordinary accounts of atrocities committed during the Spanish conquest of the Americas, or before that Euripides’s Trojan Women.  It seems odd to me that no one considered it essential to my education–that I was not taught, and still don’t know, when North Americans became conscious that there might be an ethical problem with the genocide of the Native Americans. I’m convinced that with a little work, we could come up with trans-periodic constellations of both research and pedagogy that would link earlier and later texts, and would do so in a way that is concretely rather than abstractly respectful of the past—that is, would take the past as something more than an empty figure of resistance to a present about which all we need to know is that we are against it.

    The 19th century’s failure to produce representations of atrocity as self-accusation, if that is indeed the case, can be explained by the non-existence in the 19th century of a “public” on an international scale, a public capable of demanding or enforcing scrutiny of ourselves from outside.  Incomplete as it may be, it seems to me there is a story here about the emergence of such a public.  Publics get constructed. The process of construction takes time: alien voices must be gathered and listened to.  It also takes an attitude toward time.  We cannot imagine ourselves as engaged in the process of constructing anything if we see every “chain of events” as (you will recognize the quotation) “one single catastrophe, which keeps piling wreckage on wreckage” (Benjamin 1969, 257).  What we ask our fellow specialists to join is a story with a future.

    References

    Benjamin, Walter. 1969. Illuminations. Edited by Hannah Arendt. Translated by Harry Zohn. New York: Schocken Books.

    Clarke, Paul A. B., and Andrew Linzey. 1996. Dictionary of Ethics, Theology and Society. London: Routledge.

    Dickens, Charles. 1998. Bleak House. Edited by Stephen Gill. Oxford: Oxford University Press.

    Doyle, Laura. 2015. “Inter-Imperiality and Literary Studies in the Longer Durée.” PMLA 130, no. 2: 336-347.

    Felski, Rita. n.d. “Comparison, Translation, and Actor-Network Theory.” Manuscript available from the author.

    Friedman, Susan Stanford. 2015. Planetary Modernisms: Provocations on Modernity across Time. New York: Columbia University Press.

    Gallop, Jane. 2011. The Deaths of the Author: Reading and Writing in Time. Durham, NC: Duke University Press.

    Johnston, David. 2011. A Brief History of Justice. Chichester, West Sussex: Wiley-Blackwell.

    Lazare, Aaron. 2004. On Apology. New York: Oxford University Press.

    Ondaatje, Michael. 1992. The English Patient. New York: Vintage.

    Sémelin, Jacques. 2007. Purify and Destroy: The Political Uses of Massacre and Genocide. London: Hurst & Company.

    Small, Helen. 2013. The Value of the Humanities. Oxford: Oxford University Press.

    Tolstoy, Leo. 2009. Hadji Murat. Translated by Richard Pevear and Larissa Volokhonsky. New York: Vintage.

    Weber, Max. 1946. From Max Weber: Essays in Sociology. Edited by Hans Heinrich Gerth and C. Wright Mills. New York: Oxford University Press.

    Williams, Raymond. 1981. Politics and Letters: Interviews with New Left Review. London: Verso.

    Notes

    [i] I realized how hard the Williams/NLR interview hit me only after noticing, while preparing this essay, that I had already used it to begin one of my own early publications, an essay on Bleak House written in the 1980s and published in Homi Bhabha’s collection Nation and Narration.

    [ii] Perhaps this is not the proper or precise sense in which novels belong to history, and history belongs in novels.

    [iii] This and the following paragraph appear in my article “Hope,” Political Concepts: A Critical Lexicon, posted November 2015, www.politicalconcepts.org/hope-bruce-robbins.

    [iv] Ishikawa was arrested by the Japanese authorities and convicted, but then released and allowed to return to China on condition that he never write anything like that again.  He didn’t.  Despite my complete ignorance, I have the fantasy of trying to create a global counter-history of such moments of national self-critique.

     

  • Zachary Samalin: Genealogies of Self-Accusation

    Zachary Samalin: Genealogies of Self-Accusation

    by Zachary Samalin

    Response to Bruce Robbins: On the Non-representation of Atrocity

    This essay was peer-reviewed by the editorial board of b2o: an online journal.

    In his V21 symposium keynote lecture, “Atrocity in the Novel, Atrocity in History,” Bruce Robbins asks whether it is reasonable or instead “unacceptably presentist” to “expect the great epoch of European realism to ‘do’ atrocity in the particular, self-accusing sense” he is interested in examining, in which “‘we’ accuse ourselves of doing something outrageously cruel, collective, and indiscriminate to ‘others.’” “Arguably,” Robbins continues, “such representations only became possible after European civilization had been shocked out of its pre-Copernican complacency by the Holocaust and the rise of anti-colonial movements. In the nineteenth century, those shocks were still to come” (Robbins 2016: 4-5). Perhaps not surprisingly in a room full of Victorian literature specialists, the response to Robbins’ lecture during the question and answer session produced a long list of 19th century works that audience members thought would complicate, enrich, trouble or outright repudiate Robbins’ hypothesis that the literature of the 19th century had yet to achieve a certain form of critical self-consciousness, and so was incapable of indicting political brutality and violence. To the contrary, this audience response seemed to suggest, the archive of 19th century literature is rife with examples of just what Robbins is looking for.

    In the following response to Robbins’ lecture, I want to theorize more specifically the tension between these two seemingly irreconcilable positions, by examining one of Robbins’ central theses about the entwinement of politics and aesthetics—namely, that literature can and perhaps ought to lay claim to a privileged role in the articulation of “civilizational self-accusation,” especially in the context of the atrocities of modern imperialism. The notion that the literary has the capacity to register unwanted self-implication in destructive sociopolitical processes is extremely compelling; unlike Robbins, however, I have come to associate this aesthetic innovation with various currents in 19th century literature. And yet, as half a century of postcolonial literature and theory has helped us to see, this sophisticated innovation, which allowed for the registration, in narrative form, of undesired conditions of immanence, did little to turn the critical gaze of the 19th century novel outwards, that is, towards the ongoing atrocity of the British empire. When we read the literature of the mid- to late-19th century—Little Dorrit (1857), Notes from Underground (1864), The Belly of Paris (1873)—we don’t find a journalistic subjectivity reporting on the turbulent decades of perpetual war in Algeria, Persia, the Crimea, India, Burma, Vietnam, and China; but we do encounter a complex structure of feeling, beginning to emerge as something articulable, that conceived of modernity as a process of regressive self-destruction and of civilization as something unwanted that would soon sour itself from the inside out. In this respect, the question that Robbins’ lecture raises is to my mind not whether it is too ‘presentist’ to expect Flaubert or Dickens to have offered a critique of atrocity, but rather the enduring, perhaps more disturbing question of what specific forms of ideological blindness kept the novel form from extending the implications of its own socially critical and ethico-political insights to the imperial context.

    The first point to make is that, when it came to its atrocities, 19th century Britain left behind an indisputably immense non-literary paper trail. Certain brutal events in the maintenance of the empire—such as the violent responses to the Morant Bay rebellion (1865) and the Indian revolt (1857-8)—were not only voluminously documented, but debated publicly and at length, and did much to bring to the fore the question of what it means to participate in a putatively modern and morally enlightened national culture. More often than not, as has been well established, such debates served to mask the violence intrinsic to imperialism and capitalism, focusing instead on the extent to which particular episodes of brutality and exploitation represented local failures and setbacks in the ongoing civilizing project of the British Empire. Thus while Governor Eyre came under fire in the aftermath of Morant Bay, the terms of public debate set by the Jamaica Committee did little to overturn the entrenched patterns of racist thought and economic opportunism which helped to prop up the central premises of imperial exploitation (see Holt 1992: 278-312). Like a good deal of the public and official reaction to the documentation of torture at Abu Ghraib prison in our own day, Morant Bay provided a space for a limited articulation of civilizational self-accusation in British public discourse—‘we don’t do that’—but only within a larger self-serving framework of disidentification, disavowal and civilizational (which is to say racial and cultural) arrogance that helped keep the inherent injustice of imperial occupation from taking center stage. Indeed, one limitation of framing critique in reference to specific atrocities made apparent through these examples is that the focus on the event of cruelty and violence runs the risk of obscuring patterns of ongoing or systemic exploitation.

    Yet in their most trenchant form, 19th century critiques of imperialist violence did approach the form of self-critique that Robbins holds up as a more modern ideal. Marx’s criticism of the 1855 Report of the Commissioners for the Investigation of Alleged Cases of Torture in the Madras Presidency is exemplary in this respect (see Rao 2001). The report sought to establish the prevalence of physical torture and brutality as a systemic means of extracting tax revenue within British India for the profit of the East India Company, only to disavow responsibility for that violence and to condemn it, with characteristic outrage and condescension, in the racialized language of barbarism. “Our aim,” the report concludes, “is to guard the Natives against themselves” (Report 1855: 70). As Marx summarized the report, “The universal existence of torture as a financial institution of British India is thus officially admitted, but the admission is made in such a manner as to shield the British Government itself” (Marx [1857]1975: 66). Yet as Marx goes on to observe, “a few extracts from the evidence on which the Madras Report professes to be founded, will suffice to refute its assertion that ‘no blame is due to Englishmen,’” and to document instead the systematically exploitative nature of capitalist imperialism. Far from evidencing the need for colonial paternalism, Marx thought the report ought to raise for the “dispassionate and thoughtful men” of Europe the more self-implicating question of “whether a people are not justified in attempting to expel the foreign conquerors who have so abused their subjects” (Marx 1975: 69). Marx’s indictment of the Madras Report may not be precisely what Robbins has in mind when he argues for the cosmopolitan modernity of civilizational self-accusation as a “very special subset of atrocity-response in which ‘we’ accuse ourselves of doing something outrageously cruel, collective, and indiscriminate to ‘others’” (Robbins 2016: 2)—but if not, it is certainly a close relative.

    While Marx’s writings on India often lapse into a more rigidly developmental-teleological mode, according to which capitalism represents the first step necessary for Asian civilizations to catch up with world history, his observations about the Madras Report do more to highlight the complex ways that the question of identification came in this period to animate the representational dynamic of critique. The difference between the critical language of civilizational self-accusation, as Robbins formulates it, and the exculpatory language of civilizational disavowal, as exemplified by the Madras Report, hinges precisely on such vectors of identification—that is, on a speaker’s imagined participation in a particular ideological community. In this respect, while Robbins observes that “the modern weakening of membership” is a prerequisite for the distance needed to understand atrocity as such, I would argue that the unwanted (but inescapable) identification with destructive processes is in fact the crucial psychosocial component he ought to pursue, rather than the fraying of communal bonds more customarily associated with the onset of modernity (Robbins 2016: 1). Due in large part to a post-Enlightenment legacy that idealizes disinterestedness and objective distance, we have yet to provide even the basic outline of a history for this capacity for unwanted identification.

    Understanding how these two opposite movements—towards a desirable disinterest and an undesired involvement—were fused to one another throughout the 19th century is a significant and unfinished task for scholars of the period, in the first place because their fusion accounts for the antithetical attachments to the impulse to document violence and atrocity that I have been describing. The imperialist impulse to represent violence in order to disavow it as something always perpetrated by an other, or to frame it as an exceptionality that justifies rule, cannot be fully distinguished from the self-implicating impulse to expose that violence as immanent to modernity. This is in part because they share the same language, as reflected by Marx’s insistence that blue books are the only evidence of systemic violence one needs. Though we often think of Marxist thought as working to fill in the gaps in the official discourse, I am suggesting instead that we attend to what Marx presupposes is the radical transparency of the language of domination—the presupposition that violence and exploitation had become self-evident, and were written brazenly on the surface of things in the language of the perpetrators. We might therefore take Robbins’ call to place the writing of atrocity within a longue durée of moral development as an invitation to theorize this intersection of the genealogy of self-accusation and unwanted identification with the historical transformations which allowed atrocity to be written legibly and out in the open, rather than hidden or buried in secret.

    At the same time that we see extensive evidence of such a complex public discourse for engaging atrocity in 19th century Britain, we also know that in different national and cultural contexts, literary and artistic production began to develop a wide array of aesthetic strategies for representing atrocity throughout the 19th century while simultaneously problematizing the presumed security of the disinterested observer. Goya’s Disasters of War come to mind, as does the archive of 19th century photographs that Nathan Hensley and Zahid Chaudhary have recently written about; indeed Hensley has helped us to see precisely how these hermeneutic questions about the representation of violence and its implied spectators remain unanswered in the aftermath of empire (see Chaudhary 2012; Hensley 2013). Similarly, slave narrative and abolitionist literature in the United States—which of course tended to focus not only on specific atrocities but on the systemic and juridical nature of slavery under capitalism—bear directly on Robbins’ claims about the 19th century’s representational capacity for moral indictment. However, I present these not so much as counter-examples, but rather as indices of the more particular absence that Robbins has helped us to identify. We know that British imperial atrocities were voluminously documented and often publicly debated as potentially undermining the civilizational project; we know that the 19th century saw the development of a more radical social scientific and socially critical discourse of self-accusation that sprouted up out of an official discourse of disavowal; and, finally, we know as well that other aesthetic traditions in other cultural contexts have done a better job than the British novel at representing atrocities through some form of self-accusation or communal indictment.

    So then one question: What to call this kind of ideological absence or moral-aesthetic caesura? How does it work, and how can we grasp its psychosocial dynamics? I put the question this way, since we have previously relied on the vocabulary of symptom and repression to elaborate precisely these absences. And yet it seems clear, today, as it has for some time, that the tools afforded by the vocabulary of cultural neurosis don’t quite satisfy here, given that we are not dealing with an occluded or concealed discourse of atrocity that “returns” from its repression in the interstices of the literary text, but rather with the more disjointed, more deranged fact that this proliferating and public discourse did not find its fullest expression in the exemplary aesthetic form of the period, that is, in the novel. Why not? My sense is that we still need to sharpen and refine our historical account of the ways in which representation functions vis-à-vis the intolerable, the unwanted, the atrocious, and the unrepresentable—a newly sharpened account of the writing of the disaster that takes into account the different species of blindness and specific patterns of resistance endemic to modern literary forms.

    These caesuras in the political consciousness of the Victorian novel become all the more jarring when we consider that, over the 19th century, literary texts, and perhaps the novel in particular, emerged as the cultural laboratory for testing out Enlightenment ideals and for exposing them as violent or vacuous, as cruelty in themselves—whether in the name of reactionary sentiment or liberalizing social critique or some impulses more nihilistic than either of those. I am thinking of earlier works like Juliette and Gulliver’s Travels just as much as later, increasingly socially engaged texts such as Our Mutual Friend, La Terre, Notes from Underground and Jude the Obscure. Considered from this angle, the literary domain in the 19th century was a sophisticated and complex arena for elaborating a deeply affective experience of unwanted self-implication and inevitable participation in a destructive order, founded on tenuous, inverted values.

    Even if the 19th century did not “possess a public capable of demanding or enforcing scrutiny of ourselves from outside” (Robbins 2016: 24), it is clear to my mind that later authors as diverse as Achebe, Vallejo and Sebald returned to this more nihilistic 19th century conception of literature as a privileged space for giving voice to an unwanted relation of immanence in the destructive processes of modernity. Indeed, the outraged self-accusation Robbins describes, in order to transcend mere bad faith or ressentiment, needs to involve a more disturbing set of identifications than simply seeing oneself as though from without. A literary genealogy of civilizational self-accusation, then, might follow unpredictable lines back through unexpected pages, from the mushroom clouds of the 20th century Robbins begins with to the storm-clouds of the 19th. How can we further specify and describe this negative structure of feeling in the novel, give it a longer history that doesn’t stop and start according to the arbitrary constraints of post-hoc periodization, and which attends to its ever-shifting blind spots and its insights alike?

    References

    Chaudhary, Zahid. 2012. Afterimage of Empire: Photography in Nineteenth-Century India. Princeton, NJ: Princeton University Press.

    Hensley, Nathan. 2013. “Curatorial Reading and Endless War.” Victorian Studies 56, no.1: 59-83.

    Holt, Thomas. 1992. The Problem of Freedom: Race, Labor, and Politics in Jamaica and Britain, 1832-1938. Baltimore: Johns Hopkins University Press.

    Marx, Karl. (1857) 1975. “Investigations of Tortures in India.” Reprinted in Marx, The First Indian War of Independence, 1857-1859. Moscow: Progress Publishers.

    Rao, Anupama. 2001. “Problems of Violence, States of Terror: Torture in Colonial India.” Interventions 3, no. 2: 186-205.

    Report of the Commissioners for the Investigation of Alleged Cases of Torture in the Madras Presidency. 1855. Madras: Fort St. George Gazette Press.

    Robbins, Bruce. 2016. “Atrocity as Self-Accusation.”

     

    CONTRIBUTOR’S NOTE

    Zachary Samalin is Assistant Professor of English at the University of Chicago. He is currently working on a manuscript, The Masses Are Revolting: Victorian Culture and the Aesthetics of Disgust.