David Golumbia and David Simpson begin a conversation, inviting comment below or via email to boundary 2:
What are we talking about when we talk about drones? Is it that they carry weapons (true of only a small fraction of UAVs), that they have remote, mobile surveillance capabilities (true of most UAVs, but also of many devices not currently thought of as drones), or that they have or may someday have forms of operational autonomy (a goal of many forms of robotics research)? Is it the technology itself, or the fact that it is currently being deployed largely by the world’s dominant powers, or the way it is being deployed? Is it the use of drones in specific military contexts, or the existence of those military conflicts per se (that is, if we endorsed a particular conflict, would the use of drones in that scenario be acceptable)? Is it that military use of drones leads to civilian casualties, despite the fact that other military tactics almost certainly lead to many more casualties (the total number of all persons, combatant and non-combatant, killed to date by US drone operations worldwide is estimated at under 4000; the number of civilian casualties in the Iraq conflict alone even by conservative estimates exceeds 100,000 and may be as many as 500,000 or even more), a reduction in total casualties that forms part of the arguments used by some military and international law analysts to suggest that drone use is not merely acceptable but actually required under international law, which mandates that militaries use the least amount of lethal force available to them that will effectively achieve their goals? If we object to drones based on their use in targeted killings, do we accept their use for surveillance? If we object only to their use in targeted killing, does that objection proceed from the fact that drones fly, or do we actually object to all forms of automated or partly-automated lethal force, along the lines of the Stop Killer Robots campaign, whose scope goes well beyond drones, and yet does not include non-lethal drones?
How do we define drones so as to capture what is objectionable about them on humanitarian and civil society grounds, given how rapidly the technology is advancing and how difficult it already is to distinguish some drones from other forms of technology, especially for surveillance? What do we do about the proliferating “positive” use cases for drones (journalism, remote information about forest fires and other environmental problems, for example), which are clearly being developed in part so as to sell drone technology in general to the public, but at least in some cases appear to describe vital functions that other technology cannot fulfill?
What resources can we call upon, invent or reinvent in order to bring effective critical attention to the phenomenon of drone warfare? Can we revivify the functions of witness and testimony to protest or to curtail the spread of robotic lethal violence? What alliances can be pursued with the radical journalism sector (Medea Benjamin, Jeremy Scahill)? Is drone warfare inevitably implicated in a seamlessly continuous surveillance culture wherein all information is or can be weaponized? A predictable development in the command-control-communication-intelligence syndrome articulated some time ago by Donna Haraway? Can we hope to devise any enforceable boundaries between positive and destructive uses of the technology? Does it bring with it a specific aesthetics, whether for those piloting the drones or those on the receiving end? What is the profile of psychological effects (disorders?) among those observing and then killing at a distance? And what are the political obligations of a Congress and a Presidency able to turn to drone technology as arguably the most efficient form yet devised for deploying state terrorism? What are the ethical obligations of a superpower (or indeed a local power) that can now wage war with absolutely no risk to its own combatants?
a review of Christian Fuchs, Social Media: A Critical Introduction
by Zachary Loeb
~ Legion are the books and articles describing the social media that has come before. Yet the tracts focusing on Friendster, LiveJournal, or MySpace now appear as throwbacks, nostalgically immortalizing the internet that was and is now gone. On the cusp of the next great amoeba-like expansion of the internet (wearable technology and the “internet of things”) it is a challenging task to analyze social media as a concept while recognizing that the platforms being focused upon—regardless of how permanent they seem—may go the way of Friendster by the end of the month. Granted, social media (and the companies whose monikers act as convenient shorthand for it) is an important topic today. Those living in highly digitized societies can hardly avoid the tendrils of social media (even if a person does not use a particular platform it may still be tracking them), but this does not mean that any of us fully understand these platforms, let alone have a critical conception of them. It is into this confused and confusing territory that Christian Fuchs steps with his Social Media: A Critical Introduction.
It is a book ostensibly targeted at students, though when it comes to social media—as Fuchs makes clear—everybody has quite a bit to learn.
By deploying an analysis couched in Marxist and Critical Theory, Fuchs aims not simply to describe social media as it appears today, but to consider its hidden functions and biases, and along the way to describe what social media could become. The goal of Fuchs’s book is to provide readers—the target audience is students, after all—with the critical tools and proper questions with which to approach social media. While Fuchs devotes much of the book to discussing specific platforms (Google, Facebook, Twitter, WikiLeaks, Wikipedia), these case studies are used to establish a larger theoretical framework which can be applied to social media beyond these examples. Affirming the continued usefulness of Marxist and Frankfurt School critiques, Fuchs defines the aim of his text as being “to engage with the different forms of sociality on the internet in the context of society” (6) and emphasizes that the “critical” questions to be asked are those that “are concerned with questions of power” (7).
Thus a critical analysis of social media demands a careful accounting of the power structures involved, not just in specific platforms but in the larger society as a whole. So though Fuchs regularly returns to the examples of the Arab Spring and the Occupy Movement, he emphasizes that the narratives dubbing these “Twitter revolutions” often come from a rather non-critical and generally pro-capitalist perspective, one that fails to situate uses of digital technology adequately in their larger contexts.
Social media is portrayed as an example, like other media, of “techno-social systems” (37) wherein the online platforms may receive the most attention but where the oft-ignored layer of material technologies is equally important. Social media, in Fuchs’s estimation, developed and expanded with the growth of “Web 2.0” and functions as part of the rebranding effort that revitalized (made safe for investment) the internet after the initial dot-com bubble. As Fuchs puts it, “the talk about novelty was aimed at attracting novel capital investments” (33). What makes social media a topic of such interest—and invested with so much hope and dread—is the degree to which social media users are considered active creators of content rather than simply consumers of it (Fuchs follows much recent scholarship and industry marketing in using the term “prosumers” to describe this phenomenon; the term originates in the 1970s business-friendly futurology of Alvin Toffler’s The Third Wave). Social media, in Fuchs’s description, represents a shift in the way that value is generated through labor, and as a result an alteration in the way that large capitalist firms appropriate surplus value from workers. The social media user is not laboring in a factory, but with every tap of a button they are performing work from which value (and profit) is skimmed.
Without disavowing the hope that social media (and by extension the internet) has liberating potential, Fuchs emphasizes that such hopes often function as a way of hiding profit motives and capitalist ideologies. It is not that social media cannot potentially lead to “participatory democracy” but that “participatory culture” does not necessarily have much to do with democracy. Indeed, as Fuchs humorously notes: “participatory culture is a rather harmless concept mainly created by white boys with toys who love their toys” (58). This “love their toys” sentiment is part of the ideology that undergirds much of the optimism around social media—which allows for complex political occurrences (such as the Arab Spring) to be reduced to events that can be credited to software platforms.
What Fuchs demonstrates at multiple junctures is the importance of recognizing that the usage of a given communication tool by a social movement does not mean that this tool brought about the movement: intersecting social, political and economic factors are the causes of social movements. In seeking to provide a “critical introduction” to social media, Fuchs rejects arguments that he sees as not suitably critical (including those of Henry Jenkins and Manuel Castells), arguments that at best have been insufficient and at worst have been advertisements masquerading as scholarship.
Though the time people spend on social media is often portrayed as “fun” or “creative,” Fuchs recasts these activities as work in order to demonstrate how that time is exploited by the owners of social media platforms. By clicking on links, writing comments, performing web searches, sending tweets, uploading videos, and posting on Facebook, social media users are performing unpaid labor that generates a product (in the form of information about users) that can then be sold to advertisers and data aggregators; this sale generates profits for the platform owner that do not accrue back to the original user. Though social media users are granted “free” access to a service, it is their labor on that platform that gives the platform its value—Facebook and Twitter would not have a commodity to sell to advertisers if they did not have millions of users working for them for free. As Fuchs describes it, “the outsourcing of work to consumers is a general tendency of contemporary capitalism” (111).
While miners of raw materials and workers in assembly plants are still brutally exploited—and this unseen exploitation forms a critical part of the economic base of computer technology—the exploitation of social media users is given a gloss of “fun” and “creativity.” Fuchs does not suggest that social media use is fully akin to working in a factory, but that users carry the factory with them at all times (a smart phone, for example) and are creating surplus value as long as they are interacting with social media. Instead of being a post-work utopia, Fuchs emphasizes that “the existence of the internet in its current dominant capitalist form is based on various forms of labour” (121) and the enrichment of internet firms is reliant upon the exploitation of those various forms of labor—central amongst these being the social media user.
Fuchs considers five specific platforms in detail so as to illustrate not simply the current state of affairs but also to point towards possible alternatives. Fuchs analyzes Google, Facebook, Twitter, WikiLeaks and Wikipedia as case studies of trends to encourage and trends of which to take wary notice. In his analysis of the three corporate platforms (Google, Facebook and Twitter) Fuchs emphasizes the ways in which these social media companies (and the moguls who run them) have become wealthy and powerful by extracting value from the labor of users and by subjecting users to constant surveillance. The corporate platforms give Fuchs the opportunity to consider various social media issues in sharper relief: labor and monopolization in the case of Google, surveillance and privacy issues with Facebook, and the potential for an online public sphere with Twitter. Despite his criticisms, Fuchs does not dismiss the value and utility of what these platforms offer, as is captured in his claim that “Google is at the same time the best and the worst thing that has ever happened on the internet” (147). The corporate platforms’ successes are owed at least partly to their delivering desirable functions to users. The corrective for which Fuchs argues is increased democratic control of these platforms—for the labor to be compensated and for privacy to pertain to individual humans instead of to businesses’ proprietary methods of control. Indeed, one cannot get far with a “participatory culture” unless there is a similarly robust “participatory democracy,” and part of Fuchs’s goal is to show that these are not at all the same.
WikiLeaks and Wikipedia both serve as real examples that demonstrate the potential of an “alternative” internet for Fuchs. Though these Wiki platforms are not ideal they contain within themselves the seeds for their own adaptive development (“WikiLeaks is its own alternative”—232), and serve for Fuchs as proof that the internet can move in a direction akin to a “commons.” As Fuchs puts it, “the primary political task for concerned citizens should therefore be to resist the commodification of everything and to strive for democratizing the economy and the internet” (248), a goal he sees as at least partly realized in Wikipedia.
While the outlines of the internet’s future may seem to have been written already, Fuchs’s book is an argument in favor of the view that the code can still be altered. A different future relies upon confronting the reality of the online world as it currently is and recognizing that the battles waged for control of the internet are proxy battles in the conflict between capitalism and an alternative approach. In the conclusion of the book Fuchs eloquently condenses his view and the argument that follows from it in two simple sentences: “A just society is a classless society. A just internet is a classless internet” (257). It is a sentiment likely to spark an invigorating discussion, be it in a classroom, at a kitchen table, or in a café.
* * *
While Social Media: A Critical Introduction is clearly intended as a textbook (each chapter ends with a “recommended readings and exercises” section), it is written in an impassioned and engaging style that will appeal to anyone who would like to see a critical gaze turned towards social media. Fuchs structures his book so that his arguments will remain relevant even if some of the platforms about which he writes vanish. Even the chapters in which Fuchs focuses on a specific platform are filled with larger arguments that transcend that platform. Indeed, one of the primary strengths of Social Media is that Fuchs skillfully uses the familiar examples of social media platforms as a way of introducing the reader to complex theories and thinkers (from Marx to Habermas).
Whereas Fuchs accuses some other scholars of subtly hiding their ideological agendas, no such argument can be made regarding Fuchs himself. Social Media is a Marxist critique of the major online platforms—not simply because Fuchs deploys Marx (and other Marxist theorists) to construct his arguments, but because of his assumption that the desirable alternative for the internet is part and parcel of a desirable alternative to capitalism. Such a sentiment can be found at several points throughout the book, but is made particularly evident by lines such as these from the book’s conclusion: “There seem to be only two options today: (a) continuance and intensification of the 200-year-old barbarity of capitalism or (b) socialism” (259)—it is a rather stark choice. It is precisely due to Fuchs’s willingness to stake out, and stick to, such political positions that this text is so effective.
And yet, it is the very allegiance to such positions that also presents something of a problem. While much has been written of late—in the popular press as well as by scholars—regarding issues of privacy and surveillance, Fuchs’s arguments about the need to consider users as exploited workers will likely strike many readers as new, and thus worthwhile in their novelty if nothing else. Granted, fully going along with Fuchs’s critique requires readers already to be in agreement, or at least relatively sympathetic, with his political and ethical positions. This is particularly true as Fuchs excels at making an argument about media and technology, but devotes significantly fewer pages to ethical argumentation.
The lines (quoted earlier) “A just society is a classless society. A just internet is a classless internet” (257) serve as much as a provocation as a conclusion. For those who subscribe to a similar notion of “a just society,” Fuchs’s book will likely function as an important guide to thinking about the internet; to those whose vision of “a just society” is fundamentally different from his, however, it may be less than convincing. Social Media does not present a complete argument about how one defines a “just society.” Indeed, the danger may be that Fuchs’s statements in praise of a “classless society” will lead some to dismiss his arguments regarding the way in which the internet has replicated a “class society.” Likewise, it is easy to imagine the retort that the new platforms of “the sharing economy” represent the birth of this “classless society” (though it is equally easy to imagine Fuchs pointing out, as have other critics from the left, that the “sharing economy” is simply more advertising lingo used to hide the same old capitalist relations). This represents something of a peculiar challenge when it comes to Social Media, as the political commitment of the book is simultaneously what makes it so effective and what threatens its political efficacy.
Thus Social Media presents something of a conundrum: how effective is a critical introduction if its conclusion offers a heads-or-tails choice between the “barbarity of capitalism or…socialism”? Such a choice feels slightly as though Fuchs is begging the question. While it is curious that Fuchs does not draw upon critical theorists’ writings about the culture industry, the main issues with Social Media seem to be reflections of this black-and-white choice. It is thus something of a missed chance that Fuchs does not draw upon some of the more serious critics of technology (such as Ellul or Mumford), whose hard-edged skepticism would nevertheless likely not accept Fuchs’s Marxist orientation. Such thinkers might provide a very different perspective on the choice between “capitalism” and “socialism,” arguing that “technique” or “the megamachine” can function quite effectively in either. Though Fuchs draws heavily upon thinkers in the Marxist tradition, another set of insights and critiques might have been gained by bringing in other critics of technology (Hans Jonas, Peter Kropotkin, Albert Borgmann)—especially as some of these thinkers warned that Marxism may overvalue the technological as much as capitalism does. This is not to argue in favor of any of these particular theorists, but to suggest that Fuchs’s claims would have been strengthened by devoting more time to the views of those who were critical of technology, of capitalism, and of Marxism. Social Media does an excellent job of confronting the ideological forces on its right flank; it could have benefited from at least acknowledging the critics to its left.
Two other areas remain somewhat troubling: Fuchs’s treatment of Wiki platforms and his treatment of the materiality of technology. The optimism with which Fuchs approaches WikiLeaks and Wikipedia is understandable given the dourness with which he approaches the corporate platforms, and yet his hopes for them seem somewhat exaggerated. Fuchs claims that “Wikipedians are prototypical contemporary communists” (243), partially to suggest that many people are already engaged in commons-based online activities; yet it is an argument that he simultaneously undermines by admitting (importantly) that Wikipedia’s editor base is hardly representative of all of the platform’s users (it’s back to the “white boys with toys who love their toys”), and some have alleged that putatively structureless models of organization like Wikipedia’s actually encourage oligarchical forms of order. This is to say nothing of the role that editing “bots” play on the platform or the degree to which Wikipedia is reliant upon corporate platforms (like Google) for promotion. Similarly, without ignoring its value, the example of WikiLeaks seems odd at a moment when the organization appears primarily engaged in a rearguard self-defense, while the leaks that have generated the most interest of late have been made to journalists at traditional news sources (Edward Snowden’s leaks to Glenn Greenwald, who was writing for The Guardian when the leaks began).
The further challenge—and this is one that Fuchs is not alone in contending with—is the trouble posed by the materiality of technology. An important aspect of Social Media is that Fuchs considers the often-unseen exploitation and repression upon which the internet relies: miners, laborers who build devices, those who recycle or live among toxic e-waste. Yet these workers seem to disappear from the arguments in the later part of the book, which in turn raises the following question: even if every social media platform were to be transformed into a non-profit, commons-based platform that resists surveillance, manipulation, and the exploitation of its users, is such a platform genuinely just if using it requires devices whose minerals were mined in warzones, which were assembled in sweatshops, and which will eventually go to an early grave in a toxic dump? What good is a “classless (digital) society” without a “classless world”? Perhaps the question of a “capitalist internet” is itself a distraction from the fact that the “capitalist internet” is what one gets from capitalist technology. Granted, given Fuchs’s larger argument it may be fair to infer that he would portray “capitalist technology” as part of the problem. Yet, if the statement “a just society is a classless society” is to be genuinely meaningful, then it must extend not just to those who use a social media platform but to all of those involved, from the miner to the manufacturer to the programmer to the user to the recycler. To pose the matter as a question: can there be participatory (digital) democracy that relies on the serious exploitation of labor and resources?
Social Media: A Critical Introduction provides exactly what its title promises—a critical introduction. Fuchs has constructed an engaging and interesting text that shows the continuing validity of older theories and skillfully demonstrates the way in which the seeming newness of the internet is simply a new face on an old system. While Fuchs’s argument resolutely holds its position, it comes from a stance that one does not encounter often enough in debates around social media, and it will provide readers with a range of new questions with which to wrestle.
It remains unclear in what ways social media will develop in the future, but Christian Fuchs’s book will be an important tool for interpreting these changes—even if what is in store is more “barbarity.”
_____
Zachary Loeb is a writer, activist, librarian, and terrible accordion player. He earned his MSIS from the University of Texas at Austin, and is currently working towards an MA in the Media, Culture, and Communications department at NYU. His research areas include media refusal and resistance to technology, ethical implications of technology, alternative forms of technology, and libraries as models of resistance. Using the moniker “The Luddbrarian,” Loeb writes at the blog librarianshipwreck. He previously reviewed The People’s Platform by Astra Taylor for boundary2.org.
This article presents three models of emergency politics—deliberative (Elaine Scarry), promiscuous (Douglas Crimp), and legalist (Louis Freeland Post)—and assesses their promise and limits for democratic theory and practice. Emergency politics names not the friend/enemy decisionism of Carl Schmitt but rather the idea that emergency may be taken to promote a focus not just on survival but also on sur-vivance—a future-oriented practice of countersovereignty. One model of this alternative form of emergency politics can be found in the Slow Food movement, which incorporates elements from all three models and embraces the paradox of politics that underwrites democratic politics in general.
~
Alexander R. Galloway’s forthcoming Laruelle: Against the Digital is a welcome and original entry in the discussion of French theorist François Laruelle’s thought. The book is at once both pedagogical and creative: it succinctly summarizes important aspects of Laruelle’s substantial oeuvre by placing his thought within the more familiar terrain of popular philosophies of difference (most notably the work of Gilles Deleuze and Alain Badiou) and creatively extends Laruelle’s work through a series of fourteen axioms.
The book is a bridge between current Anglophone scholarship on Laruelle, which largely treats Laruelle’s non-standard philosophy through an extension of problematics common to contemporary continental philosophy (Mullarkey 2006, Mullarkey and Smith 2012, Smith 2013, Gangle 2013, Kolozova 2014), and that scholarship’s maturation, which breaks new ground because it takes thought to be “an exercise in perpetual innovation” (Brassier 2003, 25). As such, Laruelle: Against the Digital stands out from other scholarship in that it is not primarily a work of exposition or application of the axioms laid out by Laruelle. This approach is apparent from the beginning, where Galloway declares that he is not a foot soldier in Laruelle’s army and that he does not proceed by way of Laruelle’s “non-philosophical” method (a method so thoroughly abstract that Laruelle appears to be the inheritor of French rationalism, though in his terminology philosophy should remain only “raw material” to carry thinking beyond philosophy’s image of thought). The significance of Galloway’s Laruelle is that he instead produces his own axioms, which follow from non-philosophy but are of his own design, and takes aim at a different target: the digital.
The Laruellian Kernel
Are philosophers no better than creationists? Philosophers may claim to hate irrationalist leaps of faith, but Laruelle locates such leaps precisely in philosophers’ own narcissistic origin stories. This argument follows from Chapter One of Galloway’s Laruelle, which outlines how all philosophy begins with the world as ‘fact.’ For example: the atomists begin with change, Kant with empirical judgment, and Fichte with the principle of identity. And because facts do not speak for themselves, philosophy elects for itself a second task — after establishing what ‘is’ — inventing a form of thought to reflect on the world. Philosophy thus arises out of a brash entitlement: the world exists to be thought. Galloway reminds us of this through Gottfried Leibniz, who tells us that “everything in the world happens for a specific reason” (and it is the job of philosophers to identify it), and Alfred North Whitehead, who alternatively says, “no actual entity, then no reason” (so it is up to philosophers to find one).
For Laruelle, the various philosophies are but variations on a single approach that first posits how the world presents itself and second determines the mode of thought that is the appropriate response. Between the two halves, Laruelle finds a grand division: appearance/presence, essence/instance, Being/beings. Laruelle’s key claim is that philosophy cannot think this division itself. The consequence is that such a division amounts to cheating, as it wills thought into being through an original thoughtless act. This act of thoughtlessly splitting the world in half is what Laruelle calls “the philosophical decision.”
Philosophy need not wait for Laruelle to be demoted, as it has already done this for itself; no longer the queen of the sciences, philosophy seems superfluous to the most harrowing realities of contemporary life. The recent focus on Laruelle did indeed come from a reinvigoration of philosophy that goes under the name ‘speculative realism.’ Certainly there are affinities between Laruelle and these philosophers — the early case was built by Ray Brassier, who emphasizes that Laruelle earnestly adopts an anti-correlationist position similar to the one suggested by Quentin Meillassoux and distances himself from postmodern constructivism as much as the other realists do, all by positing the One as the Real. It is on the issue of philosophy, however, that Laruelle is most at odds with the irascible thinkers of speculative realism, for non-philosophy is not a revolt against philosophy, nor is it a patronizing correction of how others see reality. 1 Galloway argues that non-philosophy should be considered materialist. He attributes to Laruelle a mix of empiricism, realism, and materialism, but qualifies non-philosophy’s approach to the real as a matter not of the givenness of empirical reality but of lived experience (vécu) (Galloway, Laruelle, 24-25). The point of non-philosophy is to withdraw from philosophy by short-circuiting the attempt to reflect on what supposedly exists. To be clear: such withdrawal is not an anti-philosophy. Non-philosophy suspends philosophy, but also raids it for its own rigorous pursuit: an axiomatic investigation of the generic. 2
From Decision to Digital
A sharp focus on the concept of “the digital” is Galloway’s main contribution — a concept not in the forefront of Laruelle’s work, but of great interest to all of us today. Drawing on non-philosophy’s basic insight, Galloway’s goal in Laruelle is to demonstrate the “special connection” shared by philosophy and the digital (15). Galloway asks his readers to consider a withdrawal from digitality parallel to the non-philosophical withdrawal from philosophy.
Just as Laruelle discovered the original division about which philosophy must remain silent, Galloway finds that the digital is the “basic distinction that makes it possible to make any distinction at all” (Laruelle, 26). Certainly the digital-analog opposition survives this reworking, but not as one might assume. Gone are the usual notions of online-offline, new-old, stepwise-continuous variation, etc. To maintain these definitions presupposes the digital, or, as Galloway defines it, “the capacity to divide things and make distinctions between them” (26). Non-philosophy’s analogy for the digital thus becomes the processes of distinction and decision themselves.
The dialectic is where Galloway provocatively traces the history of digitality, for he argues that digitality is “not so much 0 and 1” but “1 and 2” (Galloway, Laruelle, 26). Drawing on Marxist definitions of the dialectical process, he defines the movement from one to two as analysis, and the movement from two to one as synthesis (26-27). In this way, Laruelle can say that “Hegel is dead, but he lives on inside the electric calculator” (Introduction aux sciences génériques, 28, qtd. in Galloway, Laruelle, 32). Playing Badiou and Deleuze off each other, as he does throughout the book, Galloway then outlines the political stakes between them — with Badiou establishing clear reference points through the argument that analysis is for leftists and synthesis for reactionaries, and Deleuze as a progenitor of non-philosophy still too tied to the world of difference but shrewd enough to have a Spinozist distaste for both movements of the dialectic (Laruelle, 27-30). Galloway looks to Laruelle to get beyond Badiou’s analytic leftism and Deleuze’s “Spinozist grand compromise” (30). His proposal is a withdrawal in the name of indecision, one that demands abstention from digitality’s attempt to “encode and simulate anything whatsoever in the universe” (31).
Insufficiency
Insufficiency is the idea into which Galloway sharpens the stakes of non-philosophy. In doing so, he does to Laruelle what Deleuze does to Spinoza. While Deleuze refashions philosophy into the pursuit of adequate knowledge, the eminently practical task of understanding the conditions of chance encounters enough to gain the capacity to influence them, Galloway makes non-philosophy into the labor of inadequacy, a mode of thought that embraces the event of creation through a withdrawal from decision. If Deleuze turns Spinoza into a pragmatist, then Galloway turns Laruelle into a nihilist.
There are echoes of Massimo Cacciari, Giorgio Agamben, and Afro-pessimism in Galloway’s Laruelle. This is because he uses nihilism’s marriage of withdrawal, opacity, and darkness as his orientation to politics, ethics, and aesthetics. From Cacciari, Galloway borrows a politics of non-compromise. But while the Italian Autonomist Marxist milieu of which Cacciari’s negative thought is characteristic emphasizes subjectivity, non-philosophy takes the subject to be one of philosophy’s dirty sins and makes no place for it. Yet Galloway is not shy about bringing up examples, such as Bartleby, Occupy, and other figures of non-action. Though as in Agamben, Galloway’s figures only gain significance in their insufficiency. “The more I am anonymous, the more I am present,” Galloway repeats from Tiqqun to argue axiomatically for the centrality of opacity (233-236). There is also a strange affinity between Galloway and the Afro-pessimists: both oppose the integrationist tendencies of representational systems ultimately premised on the exclusion, exploitation, and elimination of blackness. In spite of potential differences, both define blackness as absolute foreclosure to being, from which Galloway is determined to “channel that great saint of radical blackness, Toussaint Louverture,” in order to bring about a “cataclysm of human color” through the “blanket totality of black” that “renders color invalid” and brings about “a new uchromia, a new color utopia rooted in the generic black universe” (188-189). What remains an open question is: how does such a formulation of the generic depart from the philosophy of difference’s becoming-minor, whereby liberation must first pass through the figures of the woman, the fugitive, and the foreigner?
Actually Existing Digitality
One could read Laruelle not as urging thought to become more practical, but to become less so. Evidence for such a claim comes in his retreat into dense abstract writing and his strong insistence against providing examples. Each is an effect of non-philosophy’s approach, which is both rigorous and generic. Some readers object, perhaps justifiably, to the stylistic liberties Laruelle takes with his prose; most considerations tend to make up for such flights of fancy by putting non-philosophy in communication with more familiar philosophies of difference (Mullarkey 2006; Kolozova 2014). Yet the strangeness of the non-philosophical method is not a stylistic choice intended to encourage reflection. Non-philosophy is quite explicitly not a philosophy of difference — Laruelle’s landmark Philosophies of Difference is an indictment of Hegel, Heidegger, Nietzsche, Derrida, and Deleuze. To this end, non-philosophy does not seek to promote thought through marginality, Otherness, or any other form of alterity.
Readers who have heretofore been frustrated with non-philosophy’s impenetrability may be more attracted to the second part of Galloway’s Laruelle. In part two, Galloway addresses actually existing digitality, such as computers and capitalism. This part also includes a contribution to the ethical turn, which is premised on a geometrically neat set of axioms whereby ethics is the One and politics is the division of the One into two. He develops each chapter through numerous examples, many of them concrete, that help fold non-philosophical terms into discussions with long-established significance. For instance, Galloway makes his way through a chapter on art and utopia with the help of James Turrell’s light art, Laruelle’s Concept of Non-Photography, and August von Briesen’s automatic drawing (194-218). The book is over three hundred pages long, so most readers will probably appreciate the brevity of many of the chapters in part two. The chapters are short enough to be impressionistic, while implying that treatments as fully rigorous as non-philosophy often demands would be much longer.
Questions
While his diagrammatical thinking is very clear, I find it more difficult, during Galloway’s philosophical expositions, to determine whether he is embracing or criticizing a concept. The difficulty of such determinations is compounded by the ambivalence of the non-philosophical method, which adopts philosophy as its raw material while simultaneously declaring that philosophical concepts are insufficient. My second fear is that while Galloway is quite adept at wielding his reworked concept of ‘the digital,’ his trademark rigor may be lost when the concept is taken up by less judicious scholars. In particular, his attack on digitality could become the footnote for a disingenuous defense of everything analog.
There is also something deeper at stake: What if we are in the age of non-representation? From the modernists to Rancière and Occupy, we have copious examples of non-representational aesthetics and politics. But perhaps all previous philosophy has only gestured at non-representational thought, and non-philosophy is the first to realize this goal. If so, then a fundamental objection could be raised about both Galloway’s Laruelle and non-philosophy in general: is non-philosophy properly non-thinking, or is it just plain not thinking? Galloway’s axiomatic approach is a refreshing counterpoint to Laruelle’s routine circumlocution. Yet a number of the key concepts that non-philosophy provides are still frustratingly elusive. Unlike the targets of Laruelle’s criticism, Derrida and Deleuze, non-philosophy strives to avoid the obscuring effects of aporia and paradox — so is its own use of opacity simply playing coy, or is it to be understood purely as a statement that the emperor has no clothes? While I am intrigued by anexact concepts such as ‘the prevent,’ and I understand the basic critique of the standard model of philosophy, I am still not sure what non-philosophy does. Perhaps that is an unfair question given the sterility of the One. But as Hardt and Negri remind us in the epigraph to Empire, “every tool is a weapon if you hold it right.” We now know that non-philosophy cuts; what remains to be seen is where and how deeply.
_____
Andrew Culp is a Visiting Assistant Professor of Rhetoric Studies at Whitman College. He specializes in cultural-communicative theories of power, the politics of emerging media, and gendered responses to urbanization. In his current project, Escape, he explores the apathy, distraction, and cultural exhaustion born from the 24/7 demands of an ‘always-on’ media-driven society. His work has appeared in Radical Philosophy, Angelaki, Affinities, and other venues.
_____
Notes
1. There are two qualifications worth mentioning: first, Laruelle presents non-philosophy as a scientific enterprise. There is little proximity between non-philosophy’s scientific approach and other sciences, such as techno-science, big science, scientific modernity, modern rationality, or the scientific method. Perhaps it is closest to Althusser’s science, but some more detailed specification of this point would be welcome.
2. Galloway lays out the non-philosophy of generic immanence, the One, in Chapter Two of Laruelle. Though important, Galloway’s main contribution is not a summation of Laruelle’s version of immanence, and thus it is not the focus of this review. Substantial summaries of this sort are already available, including Mullarkey 2006 and Smith 2013.
Bibliography
Brassier, Ray (2003) “Axiomatic Heresy: The Non-Philosophy of François Laruelle,” Radical Philosophy 121.
Gangle, Rocco (2013) François Laruelle’s Philosophies of Difference (Edinburgh, UK: Edinburgh University Press).
Kolozova, Katerina (2014) Cut of the Real (New York, USA: Columbia University Press).
Hardt, Michael and Antonio Negri (2000) Empire (Cambridge, MA: Harvard University Press).
Laruelle, François (2010/1986) Philosophies of Difference (London, UK and New York, USA: Continuum).
Laruelle, François (2011) Concept of Non-Photography (Falmouth, UK: Urbanomic).
Mullarkey, John (2006) Post-Continental Philosophy (London, UK: Continuum).
Mullarkey, John and Anthony Paul Smith (eds) (2012) Laruelle and Non-Philosophy (Edinburgh, UK: Edinburgh University Press).
Smith, Anthony Paul (2013) A Non-Philosophical Theory of Nature (New York, USA: Palgrave Macmillan).
[Image: ‘Uighur academic Ilham Tohti sits during his trial on separatism charges in Urumqi, Xinjiang region, in this still image taken from video shot on September 17-18, 2014. REUTERS/CCTV via Reuters TV.’ Credit: Reuters]

A lecture presented at the University of Pittsburgh on September 10th, 2014
by Arif Dirlik
I will make a case in this discussion* for closer attention to demands on criticism thrown up by current global circumstances that are yet to be recognized in mainstream critical practice for their urgent significance. That we are living through a time of unprecedented crisis is widely acknowledged. What is less certain is whether this crisis is one of the crises endemic to the capitalist world system, an outcome of systemic transformations at work that suggest an impending hegemonic shift (with the People’s Republic of China [PRC] as the up-and-coming claimant), or a terminal crisis that signals the collapse of life as we know it as unbridled capitalist development in its various competing versions runs up against the ecological limitations of the earth.
At the same time, the social and geo-cultural issues that have dynamized criticism for the past half century seem presently to have reached a dead-end. The drift to social division, political authoritarianism and cultural fragmentation no doubt is responsible for the apparent sense of helplessness that has become the refrain of critical work, and needs to frame discussion of the crisis of criticism. But there is also an urgent need to attend to the part played in this crisis by the failure of critical practice to update its concerns in response to changing social and global circumstances. These circumstances call for reconsideration of the conceptual and political orientations that inspired criticism in its origins in the 1960s, but are most striking presently for their seeming helplessness if not irrelevance in the face of a new global situation.
Of special interest in my discussion are issues of culture and cultural difference at both national and global levels. The relationship between culture and criticism has been a staple for the last two decades both of postcolonial criticism and geopolitical thinking, provoked by questions pertaining to the past and present status of the hegemony of Euromodernity and Eurocentric ways of thinking. Ongoing reconfiguration of power relations globally, and emergent claims to alternative “centrisms” (and “alternative modernities”), suggest a need to recast the terms of this relationship: whether or not criticism, if it is to remain meaningful, needs to reconsider some of the intellectual and ideological impulses that have driven it since the upheavals of the 1960s. Any such consideration raises delicate political questions, which may be one fundamental reason for the reluctance to confront them. Criticism, if it is to be worthy of the name, needs to face up to these problems lest, in its silence over these questions, it degenerate into complicity with emergent configurations of political power, social oppression, and cultural obscurantism.
Central to the question of criticism are the problematic legacies of the Enlightenment as the cultural hallmark of Euromodernity, especially the issue of universalism. The Enlightenment has been credited with the achievements of Euromodernity. It also has been condemned for the latter’s destructive consequences. Its claims to universality have drawn much criticism in recent years along with the challenges to Euromodernity. As the Enlightenment also has been endowed with seminal significance as the fountainhead of critical practice, the appearance of alternative claims on modernity throws up significant questions for criticism. I take up some of these questions below.
It is not my intention here to engage in an abstract discussion of what may constitute “criticism,” not only because that discussion has already been taken up by a long line of thinkers, but also because too much preoccupation with abstraction often ends up in the theoretical autism that afflicts much critical writing, lost in the maze of its own theoretical elaborations. Suffice it to say that I understand criticism not in the routine professionalized and politically constrained sense that it appears in our educational system (as in the promise of cultivating “critical thinking”—often not very critical in what it excludes), but as radical critical work that seeks to go “to the root” of things, pursues inquiry into foundations and totalities, into the very categories of analysis we deploy to grasp and explain the conditions of our existence, and throws it all back in the face of power to demand a better world. Critical work in any meaningful sense needs to be transformative in its consequences, not just in exploring more efficient functioning of the existing system but in opening its social and political assumptions to questioning and change. It seems increasingly that there is no promise on the horizon of all the things criticism seeks to achieve (including “critical thinking”), which raises painful questions about the meaning of radical criticism and what is to be expected of its further pursuit. And yet, this makes criticism not less but all the more urgent against a status quo whose promise of a bright future secured by unencumbered markets and technological innovation is not sufficient to cover over the deepening marginalization if not the threat of actual extinction of ever greater numbers of people around the world–dangers widely recognized even by those who preside over the existing system, as well as those who are responsible for its ideological sustenance.1
* * *
I would like to enter my discussion through a scandalous incident that took place at the recent 20th biennial meeting of the European Association of Chinese Studies (EACS). The meeting this year, hosted by the venerable universities of Minho and Coimbra in Portugal, was devoted to the exploration of the development of China studies, entitled, “From the origins of Sinology to current interdisciplinary research approaches: Bridging the past and future of Chinese Studies.” When they received their conference programs, the participants discovered that two pages had been torn out of the programs by the organizers, apparently at the insistence of Mme. Xu Lin, Director-General of the Hanban, the People’s Republic of China (PRC) state organ in charge of the so-called Confucius Institutes, who in 2009 was appointed counselor to the State Council (the cabinet) with vice-ministerial rank, presumably in recognition of her contribution to the propaganda goals of the state. The pages torn out related to the Chiang Ching-kuo Foundation in Taiwan, which long had sponsored the EACS and, according to a report in a Taiwanese newspaper, donated 650,000 New Taiwan dollars (around US$22,000) to this year’s meeting.2 EACS investigation of the incident also found that, according to Mme. Xu, some of the abstracts in the program “were contrary to Chinese regulations,” and that she “issued a mandatory request that mention of the support of the CCSP [Confucius China Studies Program] be removed from the Conference Abstracts. She was also annoyed at what she considered to be the limited extent of the Confucius Institute publicity and disliked the CCKF [Chiang Ching-kuo Foundation] self-presentation.”3
This act of academic vandalism has been met with dismay, at least among those who are still capable of being shocked at the intrusion of PRC propaganda organs into the very institutional structures of academic work. If I may share with you responses from distinguished colleagues who must remain nameless since I do not have their permission to cite them by name:
A Danish historian who long has been involved with EACS:
Indeed, what did the organizers of the conference and the EACS have in mind when accepting such a move? It is a very hot summer in Europe, but surely no excuse for not fighting Hanban considering the very long relationship between the EACS and the CCK Foundation. As far as I have understood, the CCK Foundation did not even have any representatives present at the conference! Well, it is difficult in Europe in general fighting back Hanban’s Confucius Institutes…
A distinguished historian of religion in China from the University of Paris, presently teaching in Hong Kong:
Europeans are even more gutless than Americans, and clearly no less stupid. You are right: disgusting! Every book I put out in Shanghai I have to fight to get “CCK-financed” in the English acknowledgements. Impossible to put it in the Chinese version.
A US historian of religion commenting on a news item on the conference I had posted on FB:
Moments like these when the veil drops are precious, let’s hope it exposes some truths.
A distinguished anthropologist from Beijing University:
This kind of “original rudeness” has been practiced for decades as “civility.” A disgrace, urgently needing treatment.
And after I asked him to further explain these terms:
by “civility” I usually refer to civilization; “original rudeness” is what I invent in English to describe the rough manners encouraged in Mao’s time and continued to be performed until now. In old and new Chinese movies, we often see those boys or girls who look really straightforward and “foolish” are more attractive to their opposite sex. To some extent, this kind of rudeness has been seen as what expresses honesty…but the bad performance from the official of Hanban might just be another thing. I would see it as stupid; but other Chinese may see it differently – some may be even proud of him[sic] we can see from this that cosmopolitan civility is still needed in China.
I share these messages with you to convey a sense of the deep frustration among many scholars of China with their impotence against the insinuation of PRC state and propaganda organs in educational institutions in Europe and North America.4 In the case of the colleague from Beijing University, there is also embarrassment at the delinquent behavior of a government official, combined with a different kind of frustration: that the act is unlikely to make much impression on a PRC academic and popular culture that is inured to vandalism if it does not actually condone it, beginning with the Party-state itself.
The frustration is not restricted to scholars of China. The Canadian Association of Higher Education Teachers and the American Association of University Professors have both rebuked universities in the two countries for allowing Confucius Institutes into universities and/or for their compliance with the terms set by the PRC.5 University of Chicago professors have petitioned the university administration to reconsider its agreement with the Hanban. The most thorough and eloquent criticisms of the institutes have been penned not by a China specialist but by the distinguished anthropologist Marshall Sahlins.6 This broad involvement of university faculty indicates that the issues at hand go beyond Confucius Institutes or the PRC, and is revealing of accumulating frustration with significant trends that promise to end higher education as we have known it. The Institutes have been beneficiaries, but also possibly the most offensive instance to date, of the increasingly blatant administrative usurpation of faculty prerogatives in university governance, the progressive subjection of education to business interests, and the normalization of censorship in education. At the behest of the Hanban, for confidentiality, agreements over the institutes have in most cases been entered into without consultation with the faculty, or at best with select faculty who, whatever the specific motivations may be in individual cases, display few qualms about complying with trends to administrative opacity or the secrecy demanded by the propaganda arm of a foreign state. The promise of the institutes to serve as bridges to business opportunities with the PRC has served as a major enticement, giving business and even local communities a stake in their acceptance and promotion, but further compromising academic autonomy.
Despite all manner of self-serving protestations by those involved in the institutes, formally entered agreement to avoid issues that might conflict with so-called Chinese cultural and political norms—or whatever might “hurt the feelings of the Chinese people”—translates in practice to tacit self-censorship on questions the PRC would like to keep out of public hearing—the well-known issues of Taiwan, Tibet, June Fourth, jailed dissidents, etc., etc. It also legitimizes censorship.7
These issues concern, or should concern, everyone who has a stake in higher education. The questions facing scholars of China are narrower in focus and more specific to disciplinary concerns, but they may be even more fundamental and far-reaching in their implications than the institutional operations of the university. Beneath mundane issues of language teaching, teacher quality, and academic rigor lies a very important question: who controls the production of knowledge about China? Like other similar organizations, including the Chiang Ching-kuo Foundation, the Hanban has already entered the business of sponsoring research and conferences in research universities. But control is another matter. Interestingly, in its very vulgarity, Xu Lin’s attempt to suppress the mention of a Taiwan competitor at an academic conference brings up this question more insistently than the sugar-coated representations of Confucius Institutes as simple providers of knowledge of Chinese language and culture to school-children, or facilitators of business. The conjoining of teaching and business in Hanban activity itself should give us pause about easy acceptance of those representations. But the problem goes deeper.
It is a puzzle that a great many commentators in the US and Europe should be in self-denial about PRC aspirations to global hegemony when within the PRC it is a matter of ongoing conversation among Party leaders and influential opinion-makers, as well as the general public. To be sure, there is no end of speculation over elusive questions of whether or not and when the PRC might achieve global hegemony.8 But there is far less attention to the more immediate question of aspirations to hegemony—except among some on the right—possibly because it might fuel animosity and ill-feeling. It seems safer to go along with the more diplomatically innocuous official statements that all the PRC wants is equality and equal recognition, not world hegemony, even as it carves out spaces of “influence” around the globe.
In recent years, PRC leaders have made no secret that they wish to replace the existing world order over which the US presides. At the most modest level, President Xi Jinping’s suggestion to the US President that the Pacific was big enough for the two countries to share as part of a “new great power relationship” was remarkable for its erasure of everyone else who lives within or around the Pacific. It would take the utter blindness of servile partisanship to portray PRC activity in eastern Asia, based on spurious historical claims, as anything but moves to establish regional hegemony which, John Mearsheimer has argued, is the first step in the establishment of global hegemony—a Monroe Doctrine for Eastern Asia.9 At the popular level, an obscure philosopher at the Chinese Academy of Sciences, Zhao Tingyang, has achieved fame nationally and in international power circles for his design of an alternative to the current international system based on a modernized version of the hierarchical “Under-Heaven” (Tianxia) tributary system that informed imperial China until the early twentieth century.10
Zhao’s work is interesting because it has been acclaimed as a plausible example of the call for “IR theory with Chinese characteristics” that corresponds to the PRC’s rising status—a call that eloquently brings together knowledge-production and the search for hegemony. The prevalent obsession with tagging the phrase “Chinese characteristics” onto everything from the most mundane to the most abstractly theoretical is well-known. But it seems to have acquired some urgency with the Xi Jinping leadership’s apparent desire to regulate “Western” influence on scholarship and intellectual activity in general as part of his vaunted “China Dream,” which also includes the elimination of corruption along with rival centers of power, the enhancement of Party prestige and control over society, and the projection of PRC hard and soft power upon the global scene.
The policy blueprint laid down in the landmark third plenary session of the 18th Central Committee stressed “the strengthening of propaganda powers and the establishment of a Chinese system of discourse (Zhongguo huayu xitong) to propel Chinese culture into the world at large (tuidong Zhonghua wenhua zouxiang shijie).”11 The discourse is to be constructed upon the three pillars of “the fine tradition of making Marxism Chinese,” or “socialism with Chinese characteristics,” the creation of a contemporary Chinese culture by melding the Chinese and the foreign, and the old and the new. The Xi leadership’s stress on the “ninety-year” revolutionary tradition—perhaps the foundation of Party legitimacy—is not necessarily in conflict with the plans for greater integration with the global neoliberal economy, since in Party theorization of “Chinese Marxism” the content of “socialism with Chinese characteristics” is subject to change in response to changing circumstances—and in accordance with the policies of each new generation of leaders.12 While the “China dream” is the subject of ongoing discussion, Xi Jinping has made his own the long-standing “dream” of the rejuvenation and renaissance of the Chinese nation as the marker of “socialism with Chinese characteristics” under his leadership. Lest this be taken to be a return to a parochial conservatism, it is important to note that discussions of “Chinese discourse” note his emphasis on “making our own the good things from others” as well as “making the old serve the present” as a fundamental characteristic of “Chinese” cultural identity. It might be recalled that the latter slogan caused much distress among foreign observers during the Cultural Revolution amidst reports that peasants, taking the slogan at its word, had begun to dismantle the Great Wall to use its stones to build homes for themselves!
Presently, according to President Xi, the rich products of this 5,000-year-old tradition should be taken out to the world to foster awareness of the universal value of a living Chinese culture that transcends spatial and temporal boundaries in its rich intellectual and artistic achievements. He also called upon Chinese scholars around the world to “tell China’s story” (Zhongguode gushi).
A PRC expert on foreign relations and the US, active in global international relations circles, has provided a convenient summary of Party leaders’ and intellectuals’ close attention to “discursive struggles” over the last decade, beginning with the Hu Jintao leadership, and its institutional and intellectual issues.13 The motivation, as he puts it, was to carve out a political cultural space of its own corresponding to the PRC’s rising stature as a world power:
Although China has already joined the mainstream international community through this policy [Deng Xiaoping’s opening-up policy], one of the main findings of the paper is that China does not want to be a member of Western system. Instead, China is in the process of developing a unique type of nation-building to promote the Chinese model in the coming years.14
The formulation of a Chinese discourse was both defensive and promotional: to defend the PRC against its portrayals as a threat to world economy and politics, but at the same time to promote an image that would enhance its reputation in the world as a counterpart to a declining US hegemony caught up in constant warfare, economic problems, cultural disintegration and waning prestige.
It is interesting, however, that revamping the propaganda apparatus in public relations guise drew its inspiration mainly from the US example. The major inspiration was the idea of “soft power” formulated by the US scholar and one-time government official Joseph Nye. US public relations practices and institutions are visible in everything from sending intellectuals out to the world to present a picture of PRC realities as the “Chinese people” perceive them to hosting international events, from publication activity in foreign languages to TV programming, from students sent abroad to students attracted to the PRC, and in the wholesale transformation of the very appearance and style of those who presented the PRC to the world. The idea of discourse was of Foucauldian inspiration, subject to much interpretation and misinterpretation. But its basic sense was quite clear. Participants in the discussion of discursive power and in its institutional formulations “all emphasize discourse as a kind of power structure and analyze the power of discourse through the lens of dominant characteristics such as culture, ideology and other norms. They consist of the ways we think and talk about a subject matter, influencing and reflecting the ways we act in relation to it. This is the basic premise of discourse theory.”15 And they all share a common goal. In the author’s own words, without editing,
Obviously, China chooses to join the international society led by a western value held concept from thirty years ago, but it did not plan to accept completely the named “universal value concept” of the western countries, nor wish to be a member of those countries. Instead, China wishes to start from its national identity and form a world from China’s word, and insist in the development road with Chinese characteristics, so as to realize the great revival of the Chinese nation. In order to realize this century dream, China is busy drawing on its discursive power and achieving this strategy with great efforts in public diplomacy.16
Confucius Institutes (going back to 2004) were conceived as part of this discursive struggle, with “Confucius identified as a teaching brand to promote the[sic] Chinese culture.”17 Language teaching was crucial to this task. The number of foreigners learning Chinese (“40 million” at last count) is itself a matter of pride, but the ultimate goal is the assimilation of “Chinese culture” through introduction to the language and whatever cultural resources may be available locally (from art, opera, singing and dancing to cooking and wine-tasting). It would be good to know how so-called Chinese culture is actually represented in the classroom beyond these consumer routines. To my knowledge, no one has so far been able to do a thorough ethnography of the Institutes, partly because of the opaqueness (at the “mandatory request” of Hanban) of their operations.18 One of the most interesting and probably most far-reaching aspects of Hanban educational activities is the employment of higher education Confucius Institutes as platforms to reach out into the community and public school classrooms. While we may only guess at the intentions behind this outreach, I think it is plausible to assume that they are not there to train future China specialists, although that, too, may happen, but to create cultural conditions where “China” ceases to be foreign, and acquires the same kind of familiarity that most people around the world have with United States cultural activity and products; at its best, to feel at home in a Chinese world. Kids in kindergarten and elementary school are more likely to be amenable to this goal than the less reliable college students!19
Lest it seem that I am reading too much into this activity, let me recall a portrayal of an imaginary (“dreamlike?”) Chinese world by Tu Wei-ming, former Harvard professor, prominent promoter of Confucianism as a global idea, and presently founding Dean of the Institute for Advanced Humanistic Studies at Beijing University—a highly respected and influential senior intellectual. In an essay published in 1991, he offered the following as a description of what he called “cultural China”:
Cultural China can be examined in terms of a continuous interaction of three symbolic universes. The first consists of mainland China, Taiwan, Hong Kong, and Singapore—that is, the societies populated predominantly by cultural and ethnic Chinese. The second consists of Chinese communities throughout the world, including a politically significant minority in Malaysia…and a numerically negligible minority in the United States…The third symbolic universe consists of individual men and women, such as scholars, teachers, journalists, industrialists, traders, entrepreneurs, and writers, who try to understand China intellectually and bring their conceptions of China to their own linguistic communities. For the last four decades the international discourse on cultural China has been shaped more by the third symbolic universe than by the first two combined…Sinologists in North America, Japan, Europe, and increasingly Australia have similarly exercised a great deal of power in determining the scholarly agenda for cultural China as a whole.20
“China’s rise” over the last two decades has reconfigured the geography of “cultural China” and the dynamics of the interaction between these three “symbolic universes,” with the relocation of the “center” to mainland China, which now seeks to bring the other two spheres under its hegemony. We need not view Tu’s description as some kind of blueprint in order to appreciate the valuable insight it offers into reading the contemporary situation. The PRC seeks to bring under its direct rule the Chinese societies of Hong Kong and Taiwan, with Singapore somewhat more problematic given its distance from the mainland, and this despite the fact that it served as a model for PRC development beginning in the 1990s. Chinese overseas are obviously a major target of PRC cultural activity, especially now that their numbers are being swelled by new immigrants from the PRC with considerable financial and political clout. What I have discussed above—and the Xu Lin episode—provide sufficient evidence, I think, to indicate the significance placed upon expanding the third sphere, and shaping its activities. Hegemony over the production of knowledge on China is crucial to this end.
There is nothing particularly earth-shattering about this activity except that the PRC’s habitual conspiratorial behavior makes it seem so. We may observe that the PRC is doing what other hegemonic powers—especially the US—have done before it: recruit foreign constituencies in the expansion of cultural power. To put it in mundane terms, as the so-called “West” established its global hegemony by creating “westernized” foreigners, the PRC in search of hegemony seeks through various means to expand the sphere of “Chinized” foreigners, to use the term offered by the author of the article discussed above.21
There has been considerable success over the last decade in promoting a positive image for the PRC globally, although it is still unclear how much of this success is due not to cultural activity but to the economic lure of a fast-developing economy.22 PRC analysts are quite correct to feel that this may be the opportune moment, given that the existing hegemon is mired in social division, dysfunctional political conflict, continual warfare and a seeming addiction to a culture of violence. It is also the case that the craze for what is called “development” trumps, in the eyes of political leaders and large populations around the world, qualms about human rights and democracy, especially where these are not major concerns to begin with.
It is also the case that, like its predecessors going back to the Guomindang in the 1930s, the current PRC regime has been unable to overcome a nativist provincialism, intertwined with anxieties about the future of the Communist Party, that is a major obstacle to its hegemonic aspirations.23 Complaints about cultural victimization and national humiliation sit uneasily with assertions of cultural superiority and aspirations to global hegemony. Hankerings for a global “Tianxia” ignore that, despite the scramble to partake of the PRC’s economic development, other nation-states are just as keen about their political sovereignty and cultural legacies as the PRC itself—just as surely as they are aware of the spuriousness of claims to genetic peacefulness when PRC leaders, with enthusiastic support from public opinion, openly declare that “national rejuvenation” includes the recapture, if necessary by violence, of the domains of their imperial predecessors, and then some.24 Pursuit of the globalization of so-called “Chinese culture” is accompanied by a cultural defensiveness that tags “Chinese characteristics” to everything from the most mundane everyday practices to crucial realms of state ideology. Claims of universal value for Chinese cultural products are rendered questionable by the simultaneous denial of universality as a tool of “Western” hegemony. PRC leaders and their spokespeople officially deny any aspirations to global hegemony, needless to say, but then we might wonder what they have in mind when they accuse other powers of “obstructing China’s rise,” when those powers celebrate the PRC’s economic development on which they have become dependent, and allow its propaganda organs into their educational systems!
Similarly, if the goal is not hegemony over knowledge production about “China,” why would these same leaders and their functionaries be so concerned to show the world the universal value of Chinese civilization, when that is already very much part of the global perception that has made the PRC the beneficiary of a benign Orientalism—or tear out pages of a conference program about the Chiang Ching-kuo Foundation, which shares the same goal of promoting “Chinese” civilization?
While the new “public relations” approach has yielded impressive results, discursive struggle entails more than a competition in the global cultural or “discourse market”;25 it finds expression also in the suppression of competing discourses at home and abroad. The “good things” from the outside world do not include the seven deadly sins which have been expressly forbidden as “dangerous western influences”: universal values, freedom of speech, civil society, civil rights, the historical errors of the Chinese Communist Party, crony capitalism, and judicial independence.26 While the PRC boasts a constitution, talk of matters such as “constitutional democracy” is not to be permitted.27 A prohibition against the use of terms like “democracy,” “dictatorship,” “class,” etc., has been in effect for some time and, according to a colleague from Shanghai, authorities look askance at the use even of a seemingly innocuous word like “youth” (qingnian) in titles of scholarly works. Just recently, the Institute of Modern History of the Chinese Academy of Social Sciences was chosen by the Party Central Commission for Discipline Inspection as the location from which to warn the Academy that “it had been infiltrated by foreign forces.”28 The persecution and incarceration of both Han and minority scholars and activists who transgress against these prohibitions is a matter of daily record.
The same commentator who was cited above for the reference to a “global discourse market” writes that “basically speaking,” the prohibitions have not changed the widespread attitude of reverence in the intellectual world for things western, “the blind and superstitious following of western scholarship and theories, and entrapment in the western ‘discourse pitfall’ (xianjing).” People may contend all they want, she concludes, but the discourse we need is one with Chinese “airs” (fengge) that strengthens China’s “discursive power” (huayu quan).29 This translates in practice into the construction of theories (including Marxism) and historical narratives built around Chinese development (with the Party at its core) that may also serve as inspiration, if not an actual model, for others.
* * *
The case of the PRC is especially important for illustrating the challenge to knowledge production posed by the reconfigurations of global power, but it is by no means the only one. Arguably even more egregious than Xu Lin’s attempt at censorship at the EACS conference was the lawsuit brought against the University of Chicago scholar Wendy Doniger’s book, The Hindus: An Alternative History, for its alleged insults to Indian religion, which resulted in the publisher Penguin’s agreement to pulp the copies of the book in India. The lawsuit was brought by one Dina Nath Batra, whose own books, devoted to purging the study of the past of “Western cultur[al]” influences, have been compulsory reading in Gujarat under its then chief minister, Narendra Modi, now the prime minister of India. The Modi government recently appointed as the chairman of the Indian Council of Historical Research a little-known historian also devoted to what Indian scholars describe as “the saffronization of education.”30
If such incidents were just about censorship, we could ignore them as merely more vulgar and extreme instances of a practice that is not particularly novel at either the national or the global level, including in the USA. This is not to downplay their significance as threats to democracy and academic freedom globally, for they also set examples for others. To remain silent before such acts is to be complicit in oppressive practices.
Nevertheless, it would be a serious mistake to allow preoccupation with these oppressive practices to distract attention from even deeper problems with long-term consequences. What renders these acts truly significant are the alternative knowledge and value systems in whose name the censorship is exercised. The grievances that they express are hardly to be denied. Nor may we dismiss without due consideration the alternatives they offer at a time when the existing order presided over by Euro/American hegemony shows every sign of being unsustainable materially and spiritually.
It has been clear for some time now that “our ways of knowing” are in deep crisis. The social upheaval of the 1960s brought diverse new constituencies into educational institutions, who demanded representation both in the content of learning and its mode of delivery, which has expanded the scope of knowledge enormously but also made it more complicated than ever to determine what is and is not worth knowing. Similarly, on the global scene, postcolonial and postrevolutionary regimes that emerged from post-World War II national liberation struggles demand new kinds of knowledge that counter the erasure of their pasts and their cultural interests by colonial domination and imperialist hegemony.31 This has been a concern all along of Chinese revolutionaries of differing stripes. The Gandhian legacy in India is even better known. The list may easily be expanded to include diverse peoples around the world, from indigenous peoples to formerly imperial entities. The colonial hubris that “progress” or “modernization” would doom to forgetfulness the pasts of the colonized or the dominated overlooked the very part colonial domination and imperial hegemony played in provoking the construction of the pasts that would serve the cause of independence and development. Those pasts have surfaced with a vengeance, insisting on their own voices in modernity, and the inclusion of their pasts in its making. Their very presence exposes the fallibility of the knowledge claims of Euromodernity, and the damage it has inflicted on nature and human societies in the very course of forcing them onto the path of “progress.” Almost by tacit common consent, it seems, modern knowledge is on trial, facing claimants who demand recognition of their various versions of how things came to be, and where they would like to see them headed.
These claims, however, are beset by contradictions. The same processes that have opened up the intellectual space to “alternative modernities,” as they are described, are also inexorably forcing people into a common future that will allow no viable alternative—what is commonly called globalization and/or development. This is a condition that I have described as global modernity: the simultaneous integration of the world through the globalization of capital, and its fracturing along a variety of faultlines which finds expression not only in conflicts of interest but in the assertion of reified sovereign cultural identities.32 The contradiction is visible also in the realm of knowledge in the denial of universality to social, political and cultural practices while endowing with nearly universal status the logic of technology and the culture of consumption. The former appear not only as endowments of nation or civilization, but also as guarantors that identity will not be lost in its globalization. This is the significance of knowledge production in support of the cultivation of those values. On the other hand, it is difficult to keep apart the two realms of knowledge, the kind of knowledge for success in the capitalist economy and the kind of knowledge necessary to the cultivation of national or civilizational identity, as the dynamic interplay between the two realms produces new demands on identity and subjectivity.33 For over a century now, Chinese thinkers and leaders have not been able to find an answer to their search for a modernity that would preserve and strengthen a “Chinese” substance with “Western” instrumentality, the famous ti-yong distinction. Indeed, I hope it is clear from my discussion above of the search for a “Chinese discursive system” that even the effort to eliminate the influence of so-called “Western discourse” resorts to a conceptual vocabulary provided by the latter.
This does not mean that there are no real differences among peoples, but it does suggest that those differences be viewed at all times through the commonalities that are an equally pervasive presence.
It seems deeply ironic that economic and to some extent social and cultural globalization should signal the end of universalism, but it is not very surprising. Political universals follow the logic not of philosophy but of power and hegemony. Globalization may have been intended to complete the conquest of the globe for the capitalist modernity that for nearly half a millennium had empowered Euro/American domination. Capitalist modernity has emerged victorious, but contrary to expectations, rather than buttress the existing centers of hegemony, its benefits have gone mostly to challengers who now make their own claims on global power and hegemony, in the process denying the universality of value- and knowledge-claims that for two centuries have denied recognition to their intellectual and ethical inheritances. The denial of universality is at bottom little more than the denial of Euro/American hegemony in search of intellectual and ethical sovereignty, with the exception of the PRC, whose aspirations, I have suggested, point not just to a defensive nationalism but to alternative global designs.
It might be useful here to recall two competing metaphors that appeared in the 1990s, almost simultaneously, that have a direct bearing on this question: the “clash of civilizations,” put forward by the late Samuel Huntington, and “hybridization,” that has held a central place in postcolonial criticism.34 We can see both paradigms at work in the contemporary world, albeit in different mixes and subject to local inflections. It is interesting that both paradigms criticized Eurocentric universalism, if for different reasons. Huntington’s exclusivist culturalism led him to advocate hardened cultural boundaries for the reason that others did not or could not share the values the “West” considered universal. Postcolonial criticism, on the other hand, perceived in hybridity the possibility of rendering cultural boundaries porous as a first step in the recognition of cultures only unsuccessfully suppressed under Euromodernity, and offering the possibility of exchange and negotiation between different cultural entities once they had achieved some measure of equivalence. Radical critics have understandably been drawn to the latter alternative, and in the process ignored the appeals of the “clash” paradigm among patriotic groups, including “leftist” patriotic groups in countries like China where memories of revolutionary anti-imperialism survive the abandonment of revolution. The puzzling attraction to Carl Schmitt’s friend/enemy distinction among such groups also appears more easily comprehensible when taken in conjunction with the Huntingtonian anticipation of “clash” if and when these groups emerged from under the hegemony of “western civilization,” which they already seemed to be doing when he offered his paradigm in the early 1990s. The “clash” paradigm has insistently moved to the foreground over the last two decades. The “hybridity” paradigm is by no means dead, but its vulnerabilities have also become increasingly evident. 
Cultural hybrids are not “things,” as they may appear in their biological counterparts—like nectarines, as it were—but complexes of relationships and contradictions the resolution of which depends on concrete historical circumstances.35 Put bluntly, depending on context, “hybrids” may end up on the political right or the left—or anywhere on a broad spectrum of possibilities. The stress in much postcolonial criticism on hybridity along ethnic, national or “civilizational” boundaries, moreover, invites reification of these categories, distracting attention from the differences and hybridities in their very constitution. In a global environment of counterrevolutionary shift to the right—combined with nostalgia for lost imperial greatness—pressures toward exclusionary culturalism along these boundaries are quite powerful despite intensifying transnationalism propelled by a globalized capitalism. This may be seen, for example, in the growth of diasporic nationalism in closer identification with nations of origin, especially in the case of countries such as the PRC, India and Turkey which have registered impressive success in their ability to employ globalization to national ends.
What these changes imply for critical practice is worth pondering. Globalization insistently forces into one common intellectual space diverse conversations on knowledge and values. It creates commonalities but also differences that challenge assumptions of universality in hegemonic societies that long have been able to treat alternative voices as a minor nuisance. Comparisons between the present and Cold War conflicts are wide of the mark. Cold War confrontations between capitalism and socialism presupposed competing political economic spatialities, but shared common assumptions about universality. Socialism assumed national form, to be sure, but we may recall that differences between existing socialist societies were voiced in the language of “revisionism,” suggesting deviation from a political project informed by universal principles. To take the case of the Chinese revolution, when revolutionaries in the 1940s began to insist on “making Marxism Chinese” (Makesi zhuyi Zhongguohua), the project was conceived as the integration of “the universal principles of Marxism” with the concrete circumstances of Chinese society. The phrase is still commonplace in ideological discourse in postrevolutionary PRC, but more as a fading trace from the past than a meaningful guide to the future. The globalization of capitalism has abolished the competing spaces of political economy. Differences are expressed instead in claims to alternative cultural spaces. “Socialism with Chinese characteristics” is above all a cultural idea yoked to aspirations of national rejuvenation that are conspicuously suspicious of universality. To speak of “revisionism” in our day would no doubt seem farcically anachronistic. The global space capitalism claimed in the aftermath of the Cold War is already fragmenting under pressure from claims of cultural difference empowered by reconfigurations of the capitalist world economy.
If universalism persists as a goal, it can no longer be phrased in the same terms as it was under the hegemony of Euromodernity, but will have to be formulated out of contemporary conversations that now include voices silenced or marginalized under the regime of Euromodernity.
Rescuing alternative knowledge and value systems from the erasures of Euromodernity has been part and parcel of radical critical thinking since the 1960s, nourished by a very universalist belief in the possibilities of human diversity. This task is much more complicated than it may appear. What these alternative knowledge and value systems consist of has been open to question all along—whether we speak of the cultures of women, ethnicities, indigenous peoples or nations and civilizations. The “traditions” that identified nations and civilizations in Euro/American modernization discourses were reified misrepresentations of complex intellectual and cultural legacies, often with blurred boundaries between the inside and the outside. Diversity in these societies is erased by a multiculturalism that similarly identifies “authentic” cultural identity with reified traditions.
The relationship to Euromodernity has been equally complicated. After two centuries of global revolutionary transformation, it is hardly possible to speak of East/West, Asia/Europe, Chinese/Western, etc., as if they were mutually exclusive cultural entities. The cultural identities that are attributed to Chineseness, Hinduism, Islam, or even more crudely, continental entities like Asia and Europe, are ironically legacies of Euromodern reification of these cultural entities. Their defense, equally notably, draws upon the language of critical analysis that is rejected for being “Western.” Their sustenance requires not only warding off baneful “Western influences” by political fiat but also erasing or rewriting memories of their own revolutionary pasts in which those influences played crucial parts. After all, while the Communist Party of China may insist on the “Chineseness” of its Marxism, there is still a persistent reminder in the term “Marxism” of what it owes to the outside world, and the universalist vision that initially inspired its politics. Scholars of religion have argued that “religion” itself is a category that came with “the West,” along with all the other disciplinary appellations that have shaped the discourse on learning globally.
The point here is that how we respond to claims on alternative knowledges and values—or what appears in our discourse as national or global “multiculturalism”—is not simply a matter of respect for difference, or of cultural tolerance and cosmopolitanism, but is deeply political in its implications, calling for critical judgment and discrimination, not just on competing cultural claims but, more profoundly, on the notions of culture that inform them. Radical multiculturalism driven by universal human goals that temper difference with commonality is a different matter entirely than the multiculturalism of an identity politics obsessed with difference, with little regard for commonality, the managerial multiculturalism of transnational corporations, or the consumptive multiculturalism promoted by global capitalism. The appreciation of “cultural complexity,” the porosity of cultural boundaries, and the historicity of culture that emerged from the radical struggles of the 1960s challenged the reification of culture in modernization discourse but never quite overcame it. It has retreated in intervening years before the “polyculturalism” that multi-national corporations began to promote at about the same time, which replicated the reification of culture in modernization discourse, albeit with a recognition of the contemporary presence of “traditions” that hitherto had been viewed as relics of doomed historical legacies.36 “Difference,” likewise, has come to overshadow commonality as categories that inspired collective affinity and action, such as class or “third world solidarity,” have lost their plausibility, or have been systematically discredited, along with the universalist ethic in which they were grounded.
In her recent study, Moral Clarity, Susan Neiman writes that “the relativism that holds all moral values to be created equal is a short step from the nihilism that holds all talk of values to be superfluous.”37 We know that just as all cultural legacies and practices (including our own) are not bad, neither are they all good. We know that different cultural orientations have different motivations and consequences, so they are not all equal, without resorting to the language of good and evil. We know, or should know, that everyday life presents us with antinomies where choice seems impossible. We are all familiar with problems in the imposition of gender norms across ethnic and national boundaries. How do we respond when an elected member of the national assembly is prevented from taking her seat on account of wearing a head-dress, setting secular against democratic commitments? How do we respond when in the name of national order and security a state abuses its own citizens and intellectuals? What do we do when the identification of national culture with a set of religious precepts results in the oppression not only of its secular intellectuals but other sets of religious precepts upheld by its minority populations? Perhaps most relevant to the question at hand of critical practice, how do we respond to the bizarre proclamation of an American academic that academic freedom is a “Western” idea that should not be imposed upon others when a PRC academic loses his job for his promotion of “Western” freedoms? There are differences within differences, and dealing with them calls upon us to make choices, choices that are not just intellectual but deeply ethical and political.
Neiman’s study is devoted to an argument for the retrieval of Enlightenment values that have been under attack for the last half century from the left, for their alleged complicity in Euro/American imperialism and, from the right, for the secular humanism that allegedly has undermined national morality and purpose. The argument draws on the work of Jonathan Israel, who distinguishes between radical and moderate Enlightenment, with the former supplying most of the values that have come to be associated with Enlightenment as such. Israel identifies the “basic principles” of radical Enlightenment as:
democracy; racial and sexual equality; individual liberty of lifestyle; full freedom of thought, expression, and the press; eradication of religious authority from the legislative process and education; and full separation of church and state… Its universalism lies in its claim that all men have the right to pursue happiness in their own way, and think and say whatever they see fit, and no one, including those who convince others they are divinely chosen to be their master, rulers or spiritual guides, is justified in denying or hindering others in the enjoyment of rights that pertain to all men and women equally.38
These are the same values, we might add, that are condemned by spokespeople for the PRC regime, orthodox Muslims, or fundamentalist Hindus for their incompatibility with so-called native cultures which, in their claims to cultural purity, find an alibi in multiculturalist reification of cultural identity. Among the foremost casualties of the repudiation of the Enlightenment in cultural criticism is criticism itself. In the words of the British writer Kenan Malik,
The issue of free speech and the giving of offence have become central to the multiculturalism debate. Speech, many argue, must be less free in a plural society. For such societies to function and be fair, we need to show respect for all cultures and beliefs. And to do so requires us to police public discourse about those cultures and beliefs, both to minimize friction between antagonistic cultures and beliefs, and to protect the dignity of individuals embedded in them. As [Tariq] Modood puts it, “If people are to occupy the same political space without conflict, they mutually have to limit the extent to which they subject each others’ fundamental beliefs to criticism.” One of the ironies of living in a plural society, it seems, is that the preservation of diversity requires us to leave less room for a diversity of views.39
What we seem to be witnessing, I might add, is a slide to the logic of communal politics. The motivating impulse behind multi-culturalism may be the recognition of difference, but even more significant is the part it plays in producing and defining cultural identities.40
* * *
About a year ago, I had the pleasure of visiting a university in your neighboring state to the north at the invitation of the Department of Sociology. Over a casual dinner, some mention was made of the Enlightenment, possibly by myself, as a resource for countering the seemingly worldwide drift to intellectual and cultural obscurantism. The response from one of the colleagues was swift and decisive: “there is nothing good to be said for the Enlightenment!”
What impressed me most about this response was the categorical denial of ambiguity and historicity to the Enlightenment and its legacies that left no opening for critical engagement and dialogue. The Enlightenment presently invites criticism for endowing with universal status what were but the cultural assumptions of an emergent capitalist modernity infused with the values of its Euro/American origins. This meant by implication the denial of contemporary validity and relevance to alternative epistemologies and value-systems. In the unfolding of Euromodernity, universal reason would be captured for economic and technological rationality, and universal morality for the moral imperatives of the market economy and the nation-state. Euro/American capitalism was entangled from its origins in the colonization of known and unknown lands and peoples. Colonial modernity found ideological justification for rule over others in its claims to universal reason and morality, which made it “the white man’s burden” to rescue them from stagnant “traditions” they were mired in and usher them into modernity. Under the hegemony of Euromodernity, these assumptions have guided both politics and the production of knowledge of the world. Others—exterminated, colonized, deracinated, hegemonized—until recently have been silenced, by force if necessary but most effectively by being woven into an epistemological web designed by the hegemonic according to the dictates of Euromodernity. As a recent work puts it,
Euro-American social theory, as writers from the south have often observed, has tended to treat modernity as though it were inseparable from Aufklärung, the rise of Enlightenment reason. Not only is each taken to be a condition of the other’s possibility, but together they are assumed to have animated a distinctively European mission to emancipate humankind from its uncivil prehistory, from a life driven by bare necessity, from the thrall of miracle and wonder, enchantment and entropy.41
None of this should be in dispute for anyone with an unbiased eye. What may be done about it, however, is much more problematic. Critics of the Enlightenment range from those who object to its ethnocentrism and its entanglement in colonial modernity to Tea Party ethnocentrists critical of democracy, science and secular humanism. The choices we make in dealing with the legacies of two centuries of colonial erasure and imperial hegemony are not merely intellectual; they are also profoundly political. The anti-hegemonic impulse that informs criticisms of the Enlightenment from anti-colonial, anti-racist, or gendered perspectives is more than matched by the service such criticism renders to political and cultural reaction and repression globally.42
The fact that these attacks on Enlightenment culture and epistemology coincide with the globalization of capitalist modernity should give us pause about rendering the Enlightenment and Euromodernity into Siamese twins, or dissolving the one into the other.43 If Euromodernity was about the Enlightenment, it was also about the religious legacies the Enlightenment sought to counter but that nevertheless shaped European societies, and about the narratives of capitalism and the nation-state. There are different possibilities in the articulation of these various narratives that shape our understanding of the emergence and consequences of the Enlightenment. Where “social theory” is concerned, too much emphasis has been placed on its Eurocentrism, obscuring its origins in the need for new ways of organizing knowledge demanded by the rise of capitalism and the nation-state. This may explain why, despite criticism of its Eurocentrism, the globalization of capitalism seems inevitably to bring in its wake the disciplinary products of so-called “Western” theory.
These relationships in their complexity deserve a more dialectical analysis that accounts for the contradictory historical relationship between the two, exemplified by Horkheimer and Adorno’s critique of Enlightenment in response to the rise of Nazism and the “culture industries.”44 For all their political manipulation of human rights and democracy, capital and the capitalist state as in the US have repeatedly shown that they are no slaves to their professions of reason and the autonomous thinking individual, human rights and secularism—at home or abroad. If the Enlightenment could not resolve the tension between instrumental reason and a transcendent rationality, as generations of social philosophers attest, it is also the case that instrumental reason is what matters in the pursuit of economic and political power—including the instrumentalization of human beings as labor power and consumers.45 It is not to be forgotten that to the extent Enlightenment ideals have become social realities in Euro/American societies, it was a result not of some cultural disposition but of prolonged and arduous struggles against power by constituencies from workers to women and subaltern ethnic groups. These struggles continue—now with the additional burden of resisting efforts by states and capital to roll back these past gains.
The need to distinguish capitalist modernity and Enlightenment legacies is even more apparent presently in the case of non-EuroAmerican societies anxious to partake of the fruits of global capitalism but equally anxious to keep at arm’s length the values most commonly associated with Enlightenment legacies. The reconfiguration of global power relations with the globalization of capital has empowered challenges to the cultural hegemony of Euromodernity, opening up the ideological space for the reappraisal of Enlightenment legacies from locations where they appeared not as instruments of liberation and progress but as indispensable components of an oppressive apparatus of power. The rejection of these legacies is part of a broader effort to recover cultural and intellectual identities that had been consigned to the past as dead or stagnant traditions under the regime of Euromodernity. These traditions are now called upon as resources for “alternative modernities” that account for native values and systems of knowledge, be they Islam, Confucianism, Buddhism, Hinduism or the many indigenous legacies that demand recognition. The universalistic assumptions of Euromodernity are giving way, at least in the realm of thought, to alternative claims on both reason and morality.
In praise or in condemnation, the juxtaposition of the Enlightenment as the source of Euromodernity against alternative cultural modernities inevitably produces cultural reification and reductionism, which is itself a consequence of the many encounters of modernity. It is often overlooked (if not viewed as of marginal significance) that the same Enlightenment legacies that capitalist modernity claimed for itself have also provided legitimation for struggles against the new forms that labor, gender and racial oppression and exploitation took under the market economy. If Enlightenment legacies provided cultural justification for colonialism, moreover, they also offered a language of anti-colonialism that was readily assimilated by many in their struggles against European domination and capitalist modernity—not to speak of homegrown oppression and exploitation.46
Euromodernity may have claimed possession of universal reason and morality, but what these consisted of has all along been a subject of disagreement, contention and conflict—and of the considerable measure of openness that owed much to the institutionalization of dissent. Contrary to simplistic binarisms that set the vitality of modernity against the quietude of tradition, no world of thought and morality is free of dissent and disagreement, however strenuous the imposition of orthodoxy. Nevertheless, the institution of dissent as a normative principle over enforced loyalty to any ideological orthodoxy or lineage may be the distinguishing feature of Euromodernity as a cultural formation, embodied in the capitalist economy that empowered it. Neiman writes that “the Enlightenment is inherently self-critical, morally bound to examine its own assumptions with the same zeal it examines others.”47 Michel Foucault, whose influential writings have done much to reveal the complicity of Enlightenment ideals in shaping modern practices of power, wrote nevertheless that
between the high Kantian enterprise and the little polemical professional activities that are called critique, it seems to me that there has been in the modern Western world (dating, more or less, empirically from the 15th to the 16th centuries) a certain way of thinking, speaking and acting, a certain relationship to what exists, to what one knows, to what one does, a relationship to society, to culture and also a relationship to others that we could call, let’s say, the critical attitude… critique only exists in relation to something other than itself: it is an instrument, a means for a future or a truth that it will not know or happen to be, it oversees a domain it would want to police but is unable to regulate.48
In her commentary on Foucault’s text, “What is Critique?,” Judith Butler suggests, along lines similar to Neiman’s, that to Foucault this critical attitude, “this exposure of the limit of the epistemological field is linked with the practice of virtue, as if virtue is counter to regulation and order, as if virtue itself is to be found in the risking of established order. He is not shy about the relation here. He writes, ‘there is something in critique that is akin to virtue.’ And then he says something which might be considered even more surprising: ‘this critical attitude [is] virtue in general.’”49 Karl Marx, we may recall, felt equally virtuous in his commitment to “ruthless criticism of all that exists.”
It should be obvious why political regimes that demand loyalty to their legitimizing principles should find this “critical attitude” undesirable or even dangerous. Attempts to establish ideological orthodoxies have been unable to withstand this combined force of economy and culture that demanded constant flexibility, innovation and criticism—including in so-called democratic societies. The Enlightenment may be the fountainhead of Euromodernity, but conflicts over its meaning are as much a defining feature of Euromodernity as loyalty to the universalism it has claimed. Legacies of the Enlightenment are visible in the very criticisms of the Enlightenment. The question, “What is Enlightenment?,” Foucault writes,
marks the discreet entrance into the history of thought of a question that modern philosophy has not been capable of answering, but that it has never managed to get rid of, either… for two centuries now. From Hegel through Nietzsche or Max Weber to Horkheimer or Habermas, hardly any philosophy has failed to confront this same question, directly or indirectly. What, then, is this event that is called the Aufklärung and that has determined, at least in part, what we are, what we think, and what we do today?50
The same complexity attended the reception of Enlightenment ideas outside of Euro/America. Viewed in historical perspective, the contemporary attacks on the Enlightenment represent a reversal of the hopes Enlightenment ideals inspired for a century among intellectuals of the Global South struggling against despotism at home and imperialism abroad—and continue to inspire. To be sure, Euromodern ideas and values provoked opposition among elites and populations at large for their foreignness or subversion of native values, and because they were more often than not forced upon them.51 But they were also assimilated in one form or another by generations who were products of the encounter, as sources of new visions of change that ranged from the total repudiation of “tradition” in the name of the modern to indigenized modernities that sought to translate the new values into native idiom. Liberal and socialist visions that bore upon them the imprint of the Enlightenment would trigger revolutionary changes that have launched societies globally on new trajectories of change. Indigenization itself is a two-way street: indigenizing foreign ideas to accommodate native legacies transforms not only the imported ideas but also the traditions to which they are articulated. Even so-called “conservative” efforts to uphold native legacies have ended up endowing those legacies with new meanings and functions. Here, too, a distinction needs to be drawn between capitalist modernity and Enlightenment legacies, as acceptance of the one did not lead automatically to acceptance of the other. Revolutions against capitalism and imperialist domination drew upon imported socialist and anarchist ideas for their legitimation. Conversely, participation in the global capitalist economy offers no guarantee of respect for freedom, democracy or human rights.
It may be no coincidence that contemporary attacks on the Enlightenment have acquired a hearing amid a global counter-revolutionary drift. Ideas derivative of the Enlightenment have nourished revolutionary or more broadly progressive movements and aspirations for two centuries, not just in Europe and North America but globally. The relationship of Enlightenment legacies to modern revolutionary movements is as complex as their relationship to capitalist modernity, but the entanglement of Enlightenment visions in modern revolutionary movements is one important reason for the attacks directed against the Enlightenment at a time of wholesale repudiation of revolutionary pasts.52 As in the PRC beginning in the 1980s, revolutions have been consigned to a “conservative” past while the mantle of progress has been transferred to an alliance of economic neoliberalism and increasingly dictatorial states aligned with global capital that feed off cultural nationalism.53
What needs to be underlined is that the criticism of Euromodernity is not limited to the repudiation of the hegemony of Euro/America but also targets the revolutionary pasts, which appear now not as agents of progress and liberation but as deviations from the proper historical paths of development. In the process, the pasts that revolutions sought to cast aside as obstacles to modernity have been revived as the sources of alternative modernities. Especially noteworthy is the mutually reinforcing relationship between liberal multiculturalism and cultural nativism or ethnocentrism, which share common ground in the criticism of Eurocentrism that is also their raison d’être. It is not uncommon these days to encounter attacks in the name of alternative cultural traditions and multiculturalism on legacies of academic freedom and critical thinking for being “Western” peculiarities—even as millions around the world continue to engage in political struggles to achieve those ends. This supposed “Western” peculiarity, moreover, is under attack in the “West,” as institutions avail themselves of a rising tide of censorship and surveillance to restrict free speech in accordance with the dictates of political and economic pressures.54
Kant’s own understanding of Enlightenment is phrased in terms that are striking for their relevance in a global political environment that seems devoted to the infantilization of populations or, in the more colorful phrasing of imperial Chinese critics of despotism, “stupid people policy” (yumin zhengce).55 The terms have been echoed repeatedly in anarchist thinking in subsequent years:
Enlightenment is man’s emergence from his self-imposed immaturity. Immaturity is the inability to use one’s understanding without guidance from another…The guardians who have so benevolently taken over the supervision of men have carefully seen to it that the far greatest number of them (including the entire fair sex) regard taking the step to maturity as very dangerous, not to mention difficult. Having first made their domestic livestock dumb, and having carefully made sure that these docile creatures will not take a single step without the go-cart to which they have been harnessed, these guardians then show them the dangers that threaten them, should they attempt to walk alone…Thus, it is difficult for any individual man to work himself out of the immaturity that has all but become his nature…Thus a public can only attain enlightenment slowly…Nothing is required for this enlightenment, however, except freedom; and the freedom in question is the least harmful of all, namely, the freedom to use reason publicly in all matters.56
The “freedom” Kant has in mind here is not the freedom of consumer society, which juxtaposes freedom against democracy, but the freedom to deploy reason for public ends, which is the very condition of democracy. Referring to the anarchist Rudolf Rocker, Noam Chomsky notes in a recent talk that,
This brand of socialism, [Rocker] held, doesn’t depict “a fixed, self-enclosed social system” with a definite answer to all the multifarious questions and problems of human life, but rather a trend in human development that strives to attain Enlightenment ideals. So understood, anarchism is part of a broader range of libertarian socialist thought and action that includes the practical achievements of revolutionary Spain in 1936; reaches further to worker-owned enterprises spreading today in the American rust belt, in northern Mexico, in Egypt, and many other countries, most extensively in the Basque country in Spain; and encompasses the many cooperative movements around the world and a good part of feminist and civil and human rights initiatives. This broad tendency in human development seeks to identify structures of hierarchy, authority and domination that constrain human development, and then subject them to a very reasonable challenge: Justify yourself. 57
Critics of the Enlightenment bear the burden of explaining why Enlightenment aspirations for freedom and democracy should be inconsistent with respect for and accommodation of alternative cultural legacies, rather than being the very conditions that make possible recognition of those legacies in all their richness and diversity. Colonialism, denying the “maturity” of its subjects, also denied them the freedom necessary to come into their own as political and cultural subjects. Arguments based on “ontological differences” between native traditions and democracy or freedom share with the cultural colonialism they would resist assumptions that perpetuate popular dependence on the state, not merely as an organ of government but also as the arbiter of cultural identity. On the other hand, from Frantz Fanon to Edward Said, seminal critics of Eurocentrism and colonialism from what used to be called the “third world” did not see any inconsistency between asserting the rights of the colonized and Enlightenment universalism, arguably because their affirmations of anti-colonial rights and subjectivities were framed within the critique of oppression in general rather than the temptations of identity politics.58
Like it or not, we live in a post-Euromodern world. Repudiation of Euro/American cultural hegemony is not the same as repudiating the history of Euromodernity that has transformed societies globally, launching them on new historical trajectories. At a more substantial level, the legacies of the Enlightenment continue to offer legitimation for the embrace of difference that is missing from many of the ethnocentric culturalisms that would challenge it.
At the same time, it is equally the case that reaffirmation of Enlightenment values may no longer be phrased in the language of the historical Enlightenment but has to answer to problems thrown up in the intervening two centuries, especially the postcolonial challenge. In the words of the late Emmanuel Eze,
In contrast to traditional theories of colonialism, critical theory in the postcolonial age, in its many facets, carries forward the promise of emancipation embodied in aspects of the Enlightenment and modernist discourses. But it also seeks to hold the processes of modernity and the European-inspired Enlightenment accountable for the false conceptual frameworks within which they produced, for example, the idea of history as something in the name of which peoples outside of the narrow spheres of Europe appeared to many European states as legitimate objects of capitalist enslavement, political conquest and economic depredation. It is in these dual intentions that the critical element in postcolonial theory is to be understood. 59
As Chomsky’s statement suggests, Enlightenment universalism is not a given, it is a project that remains to be realized. The project is no longer just Euro/American but needs to be global—not just in scope but in inspiration, inspiration that draws not only upon different historical legacies but even more importantly on ongoing grassroots struggles for human liberation, dignity and welfare—and increasingly, it seems, for survival in the face of impending ecological catastrophe. Against contemporary reifications of culture, we may recall the eloquent words of a thinker who, ironically, has been a foremost resource for postcolonial criticism of Euromodernity:
A national culture is not a folklore, nor an abstract populism that believes it can discover the people’s true nature. … A national culture is the whole body of efforts made by a people in the sphere of thought to describe, justify and praise the action through which that people has created itself and keeps itself in existence. 60
National culture as Fanon conceived it was an ongoing project that drew its inspiration not from parochial yearnings for past glory, or chauvinistic fantasies of global hegemony, but from struggles for liberation driven by universally shared aspirations to justice and democracy. It was a conception that has been shared widely among those frustrated by Euromodernity’s denial of who they were, but who also found a new promise in the vision of universality it offered. The author of a recent study writes, with reference to the seminal Chinese intellectual Liang Qichao and his social democratic disciples, Zhang Dongsun and Zhang Junmai, that they
devised their cultural plan for constructing a new China along with their universal vision of a new world from a global perspective. …they rediscovered cultural differences (Chinese tradition) within the global system of culture and evaluated all differences by a universal standard of morality…their cultural vision can be understood in terms of “global universalism,” which denies “European universalism” but never abandons the universal itself…[they] envisioned a universal culture based on the universal human capacity for morality, and embraced Chinese culture as a local representation of this universal morality…they challenged Western universalism without falling into the traps of cultural relativism or nationalist cultural pride. 61
These sentiments may sound quaint in a neoliberal global environment in which Social Darwinian norms and conflicts over civilizational claims are in the ascendancy, and the fate of humanity hangs in the balance. Enlightenment is at its most elusive when we may need it the most. Enlightenment universalism is not to be dismissed as merely a handmaiden of capitalist modernity or colonialism, even though its entanglements with the latter have marred its image among those who encountered it upon the banners of Euro/American imperialism. We need to recall that it was also the inspiration for radical aspirations to freedom to live and breathe in dignity. Freedom is the condition of Enlightenment, as Kant maintained, but also its goal. It can hardly be discarded for its European origins, or for the foul deeds that have been perpetrated in its name, for it is an integral part of histories globally that continues to inspire struggles for human rights to existence—and democracy—against the betrayals of capital and its states. The answer to problems of public enlightenment is more enlightenment, not willing surrender to oppression and bigotry in the guise of cultural difference.
* I am grateful to Paul Bove, Christopher Connery, Leo Douw, Russell Leong, Liu Zixu, Martin Miller, Ravi Palat, David Palumbo-Liu, and Wang Mingming for their comments on this article. They are in no way responsible for my argument(s).
_____
Notes
1. See, for example, Erik Brynjolfsson and Andrew McAfee, The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies (New York: W.W. Norton & Co., 2014).
3. EACS, “Report: The Deletion of Pages from EACS Conference Materials in Braga (July 2014),” issued August 1, 2014. For the report and the letter of protest (“To whom it may concern”), see the association website, here, viewed 2 August 2014.
4. For a broader spectrum of China specialists, see, “The Debate Over Confucius Institutes,” in two parts, China File, 06.23.14 and 07.01.14, here (consulted 10 August 2014). It is interesting that most of the contributors to the debate are critical of the institutes. Indeed, in this sample at any rate, the defenders are those associated with the institutes or with business. Business organizations all along have been against criticism of the PRC for fear that it will interfere with business, and also supportive of the institutes for facilitating it.
5. Peter Schmidt, “AAUP Rebukes Colleges for Chinese Institutes and Censures Northeastern Illinois”, The Chronicle of Higher Education, June 15, 2014 (consulted 10 August 2014). It is possible, one hopes, that the arrogance of PRC functionaries is finally catching the public eye. See, “Beijing’s Propaganda Lessons”, The Wall Street Journal, August 7, 2014 (viewed 10 August 2014). Rather than accede to Hanban demands for greater control, the Lyons (France) Confucius Institute was shut down in Fall 2013.
7. Naïve and sometimes self-serving arguments that the Confucius Institutes are under the Hanban, which answers to the Ministry of Education, disguise the importance of the reach of the Central Propaganda Bureau into all state organs, including Party think-tanks, and especially education. For a discussion, see, David Shambaugh, “China’s Propaganda System: Institutions, Processes and Efficacy,” The China Journal, No. 57 (January 2007): 25-58. See also, Anne-Marie Brady, Marketing Dictatorship: Propaganda and Thought Work in Contemporary China (Lanham, MD: Rowman & Littlefield, 2009).
8. World system analysts such as Immanuel Wallerstein and the late Giovanni Arrighi have long been interested in the question of hegemonic transition. The most thorough discussion I am aware of is Giovanni Arrighi, Adam Smith in Beijing: Lineages of the 21st Century (London: Verso, 2009).
9. John J. Mearsheimer, “The Gathering Storm: China’s Challenge to US Power in Asia,” The Chinese Journal of International Politics, Vol. 3 (2010): 381-396, pp. 387-388. Mearsheimer is absolutely correct that the PRC search for hegemony has learned a great deal from the previous US experience. We might add that over the last three decades, the PRC has persistently mimicked the US in its pursuit of power and development.
10. For a brief English version, see, Tingyang Zhao, “Rethinking Empire from a Chinese Concept ‘All-under-Heaven’ (Tian-xia),” Social Identities, 12.1 (2006): 29-41. The idea has found favor among some US international relations experts such as David Kang at the University of Southern California. For critical discussions, see, William A. Callahan, “Chinese Visions of World Order: Post-Hegemonic or a New Hegemony?” International Studies Review, 10 (2008): 740-761; Xu Bijun, “Is Zhao’s Tianxia System Misunderstood?” Tsinghua China Law Review, Vol. 6 (January 29, 2014): 95-108; Christopher R. Hughes, “Reclassifying Chinese nationalism: the geopolitik turn,” Journal of Contemporary China, 20(71) (2011): 601-20; and, Zhang Feng, “The Tianxia System: World Order in a Chinese Utopia”, China Heritage Quarterly, No. 21 (March 2010) (consulted 31 July 2014). Works like Zhao’s are part of an ongoing effort to construct an “IR theory with Chinese characteristics,” corresponding to the PRC’s global stature. For a historically sensitive account of the concept, see, Wang Mingming, “All under heaven (tianxia): Cosmological perspectives and political ontologies in pre-modern China,” HAU: Journal of Ethnographic Theory 2(1) (2012): 337-383. For a reminder that the tributary system might not be welcome to modern nations with their claims on sovereignty, see, Amitav Acharya, “Will Asia’s Past Be Its Future,” International Security, 28.3 (Winter 2003/04): 149-164. Others, most notably pan-Islamists, have their own vision of a new world order that, similarly to tianxia, seeks to transcend the nation-based order overseen by “the West.” See, Behlul Ozkan, “Turkey, Davutoglu, and the Idea of Pan-Islamism,” Survival: Global Politics and Strategy, 56.4 (2014): 119-141, published online. I am grateful to Prof. Tugrul Keskin for bringing this article to my attention.
12. For a detailed discussion, see, Arif Dirlik, “The Discourse of ‘Chinese Marxism.’” In Modern Chinese Religion: 1850-Present, Value Systems in Transformation, ed. Vincent Goossaert, Jan Kiely, and John Lagerwey (Leiden and Boston: Brill, forthcoming).
19. One may surmise that Confucius Institutes (and PRC students) are recruited to serve as the “eyes and ears” of officials who seem also to watch closely what happens in communities. When a US citizen of Taiwanese descent decided to have a mural on Tibet painted on a building he owned in the small town of Corvallis that is home to Oregon State University, officials from the PRC Consulate in San Francisco were dispatched to warn the mayor of consequences if the “transgression” was not stopped. See, “China asks city in Oregon to scrub mural for Tibetan, Taiwanese independence”, NBC News, Wednesday, September 12, 2012, (consulted 14 February 2014). PRC leaders are quick to take offense at outsiders’ “interference” in “China’s internal affairs,” which does not stop them from interfering in the affairs of others. Most common is the retaliation for friendly gestures toward the Dalai Lama. The Xu Lin episode is only one more example, if an egregious one, of the export of censorship. See, Elizabeth Coates, “Chinese Communist Party-backed Tech Giants Bring Censorship to the Global Stage”, TechCrunch, August 2, 2014, (consulted 7 August 2014). In spite of all this, and for all the complaints by PRC officials, the US State Department backed off from terminating the visas of “academics at university-based institutes…teaching at the elementary- and secondary-school levels” in violation of “the terms of their visas.” See, Karin Fischer, “State Department Directive Could Disrupt Teaching Activities of Campus-Based Confucius Institutes”, The Chronicle of Higher Education, May 21, 2012, (consulted 10 August 2014). According to the Wall Street Journal (see above, note 5), Confucius classrooms continue to spread in US primary and secondary schools in collusion with the administrators of the SAT.
20. Tu Wei-ming, “Cultural China: The Periphery as the Center,” in Tu Wei-ming (ed.), The Living Tree: The Changing Meaning of Being Chinese Today (Stanford, CA: Stanford University Press, 1994), pp. 1-34, pp. 15-16.
21. The editors of an English-language theoretical journal recently invited members of the editorial board (all foreign) to submit discussions of the “China Dream” for a special issue. Getting well-known foreign Marxist or socialist intellectuals involved in such a discussion is of obvious symbolic significance in centering the PRC, and President Xi as a theorist. Upon inquiring about criticism of internal and external developments under President Xi, the editor honestly informed me that, yes, that might be a bit of a problem. This does not mean that there aren’t many socialists, among others, who think that the PRC’s is a socialist road, choosing to ignore the authoritarian capitalism that drives the system, the colonial policies toward minority populations, and an income gap more severe than most capitalist countries where, according to a recent report, one percent owns one-third of the national wealth. See, Xinhua Network, “1% of Chinese own one-third of national wealth: report”, 26 July 2014, (consulted 4 August 2014). It would appear that a world order dominated by corporate capitalism and oligarchy of wealth has become part of “Chinese Marxism,” and the “China Dream.” Tsinghua law professor and adviser to the government, Wang Zhenmin, recently explained that democracy had to be limited in Hong Kong in order to protect the wealthy and secure capitalist development. See, Michael Forsythe and Keith Bradsher, “On Hong Kong, Democracy and Protecting the Rich”, The New York Times, August 29, 2014, (viewed 2 September 2014).
22. For further discussion, see, Arif Dirlik and Roxann Prazniak, “Social Justice, Democracy and the Politics of Development: The People’s Republic of China in Global Perspective,” International Journal of China Studies, 3.3 (December 2012): 285-313.
24. The reference here is to President Xi’s assertion that Chinese are genetically indisposed to aggression against others. See, “Xi: there is no gene for invasion in our blood”, China Daily, 16 May 2014, (consulted 4 August 2014). Even if it is rhetorical, the racialization of the notion of “Chineseness” in this claim is noteworthy. Now that PRC historians once again have made Mongols into part of “Chinese” history, I wonder if this includes genes of the likes of Genghis Khan. What we call “China,” of course, is a product of colonization, mainly by the Han people from the north. William Callahan informs us that according to a study published by the Chinese Academy of Military Science, over three thousand years, imperial dynasties were engaged in 3756 wars, an average of 1.4 wars a year. William A. Callahan, China Dreams: Twenty Visions of the Future (Oxford, UK: Oxford University Press, 2013), p. 48. See also Callahan’s study of “national humiliation” discourse, The Pessoptimist Nation (Oxford, UK: Oxford University Press, 2010).
25. “Marketing” of culture has been part of these discussions on discursive power. As with the other market, the Party-state does not hesitate to step in and determine its limits. The reference here is to a recent article published in the official Party journal, Qiushi (Seeking Truth, formerly the Red Flag), Yin Xia, “Jianli Zhongguo tese huayu tixi ji xu sixiang jiede damianji juexing” (The establishment of discourse with Chinese characteristics urgently requires broad awakening of the intellectual world), Qiushi theory network, July 22, 2014.
26. Benjamin Carlson, “7 things you can’t talk about in China”, globalpost, June 3, 2013. For background on the new leadership’s ideological plans, see “Document 9: A China File Translation”, 11/08/13 (consulted 6 August 2014). The prohibition has been accompanied by criticism of the hypocrisy of the US government, which exports “freedom” while betraying it at home. See, “Experts: the so-called `press freedom’ is just a `beautifying tool’”, Guangming online, 30 October 2013 (consulted 7 August 2014). The experts included three academics, regulars on the IR scene and often cited in the press: Shi Yinhong of Renmin University, Shen Dingli of Fudan University, and Zhao Kejin, the author discussed above. Back to the essay
29. Yin Xia, “Jianli Zhongguo tese huoyu tixi ji xu sixiang jiede damianji juexing.” “Chinese airs” was the term Mao Zedong used in his seminal 1940 essay, “On New Democracy,” which inaugurated “making Marxism Chinese.” Back to the essay
32. Arif Dirlik, Global Modernity: Modernity in an Age of Global Capitalism (Boulder, CO: Paradigm Press, 2007). Back to the essay
33. This recalls an anecdote the author was told by the late distinguished Pacific writer Epeli Hau’ofa, who was then head of the business school at the University of the South Pacific in Suva, Fiji. We have to teach our students two kinds of English, he said, World Bank English and pidgin English, one for success in the world, the other for the conduct of everyday life. The question is global. It nevertheless has to be distinguished according to power relations. There is a big difference between the deployment of “native” knowledge for global hegemony, and its importance for the survival of a small fragile society. Ethical neutrality may only end up in complicity with power. See, Arif Dirlik, “The Past as Legacy and Project: Postcolonial Criticism in the Perspective of Indigenous Historicism,” American Indian Culture and Research Journal, 20.2 (1996): 1-31. Back to the essay
34. For “clash of civilizations,” see, Samuel P. Huntington, “The Clash of Civilizations?” Foreign Affairs 72.3 (Summer 1993): 22-49; “If not Civilizations, What? Paradigms of the Post-Cold War,” Foreign Affairs 72.5 (Nov/Dec 1993): 186-195; and “The West Unique, Not Universal,” Foreign Affairs 75.6 (Nov/Dec 1996): 28-46. These various essays were compiled and expanded in The Clash of Civilizations and the Remaking of World Order (New York: Simon and Schuster, 1996). The most thorough study of hybridity in historical perspective is Robert J.C. Young, Colonial Desire: Hybridity in Theory, Culture and Race (London, UK: Routledge, 1995). A prominent Chinese scholar who advocates similar ideas is He Chuanqi. See, “China Modernization Report 2009: The Study of Cultural Modernization”, China Development Gateway. See, also, an influential advocate of “Confucianism,” Kang Xiaoguang, “Confucianization: A Future in the Tradition,” Social Research, 73.1 (Spring 2006): 77-120. See, also, David Ownby, “Kang Xiaoguang: Social Science, Civil Society, and Confucian Religion,” China Perspectives, #4 (2009): 101-111. Kang views belief in democracy as a “superstition.” Back to the essay
35. For further discussion, see, Arif Dirlik, “Bringing History Back In: Of Diasporas, Hybridities, Places and Histories,” Review of Education/Pedagogy/Cultural Studies, 21.2 (1999):95-131. Back to the essay
36. For the origins of multiculturalism in corporate managerial needs, see, Arif Dirlik, “The Postmodernization of Production and Its Organization: Flexible Production, Work and Culture,” in A. Dirlik, The Postcolonial Aura: Third World Criticism in the Age of Global Capitalism (Boulder, CO: Westview Press, 1997), pp. 186-219. Back to the essay
37. Susan Neiman, Moral Clarity: A Guide for Grown-up Idealists, revised edition (Princeton, NJ: Princeton University Press, 2009), p. 12. Back to the essay
38. Jonathan Israel, A Revolution of the Mind: Radical Enlightenment and the Intellectual Origins of Modern Democracy (Princeton, NJ: Princeton University Press, 2010), pp. vii-viii. Back to the essay
39. Kenan Malik, Multiculturalism and Its Discontents (London: Seagull Books, 2013), pp. 71-72. Indeed, any such criticism is met almost in knee-jerk fashion with charges of racism. The mutual tolerance in most cases is also less than mutual—as the example of the PRC, among others, illustrates. Back to the essay
40. Vijay Prashad, Uncle Swami: South Asians in America Today (New York: The New Press, 2012), esp. pp. 12-19, 110-114. Back to the essay
41. Jean Comaroff and John L. Comaroff, Theory from the South, Or, How Euro-America Is Evolving Toward Africa (Boulder, CO: Paradigm Publishers, 2012), p. 2. For the deployment of universalism in the service of Euro/American power, see, Immanuel Wallerstein, European Universalism: The Rhetoric of Power (New York: The New Press, 2006). Back to the essay
42. There is, moreover, a fallacy to the kind of criticism offered by Comaroff and Comaroff in the work just cited. It seems as if they would like to eat their cake, and have it, too. Bringing other perspectives into theory should not present much of a problem, even if it has become a major concern only recently. The more fundamental issue is that of theory itself, and the disciplinary organization of learning, which casts a web over our ways of knowing. If the hegemony of Enlightenment knowledge is to be challenged, that means questioning the whole enterprise of theory and the disciplinary division of intellectual labor. This, of course, is the position of radical critics such as Ashis Nandy and Vine Deloria, Jr., as well as Islamic fundamentalists and radical advocates of national learning in Chinese societies. Back to the essay
43. Anthony Appiah has observed that “attacks on `Enlightenment humanism’ have been attacks not on the universality of Enlightenment pretensions but on the Eurocentrism of their real bases.” The confounding of Enlightenment and Eurocentrism is a pervasive problem. See, Kwame Anthony Appiah, The Ethics of Identity (Princeton, NJ: Princeton University Press, 2005), pp. 249-250. Back to the essay
44. Max Horkheimer and Theodor W. Adorno, The Dialectic of Enlightenment, tr. by John Cumming (New York: The Seabury Press, 1944). This devastating critique of the Enlightenment’s complicity in the rising tide of despotism in the 1930s nevertheless ends with the conclusion that “Enlightenment which is in possession of itself and coming to power can break the bounds of enlightenment” (p. 208). Back to the essay
45. See, Darrow Schecter, The Critique of Instrumental Reason: From Weber to Habermas (New York: Continuum Books, 2010), for a comprehensive critical discussion. Back to the essay
46. Sankar Muthu, Enlightenment Against Empire (Princeton, NJ: Princeton University Press, 2002). Back to the essay
48. Michel Foucault, “What is Critique?” in Foucault, The Politics of Truth, ed. by Sylvere Lotringer (Los Angeles: Semiotext(e), 2007), pp. 41-81, p. 42. Back to the essay
50. Michel Foucault, “What is Enlightenment?” in The Foucault Reader, ed. by Paul Rabinow (New York: Pantheon Books, 1984), pp. 32-50, p. 32. Back to the essay
51. The disillusionment with “Western civilization” has antecedents. It was especially pronounced in the aftermath of World War I, which to many represented the spiritual bankruptcy of the “West.” See, Cemil Aydin, The Politics of Anti-Westernism in Asia: Visions of World Order in Pan-Islamic and Pan-Asian Thought (New York: Columbia University Press, 2007), and Dominic Sachsenmaier, “Alternative Visions of World Order in the Aftermath of World War I: Global Perspectives on Chinese Approaches,” in Sebastian Conrad and Dominic Sachsenmaier, eds., Competing Visions of World Order: Global Moments and Movements, 1880s-1930s (New York: Palgrave Macmillan, 2007). Ironically, such disillusionment was also a reason for the attraction to socialist alternatives, suggesting a distinction between “Western” modernity and capitalism. Back to the essay
52. Enlightenment (qimeng) has been an ongoing concern of Chinese intellectuals since the New Culture Movement of the 1910s-1920s. See, Vera Schwarcz, The Chinese Enlightenment: Intellectuals and the Legacy of the May Fourth Movement of 1919 (Berkeley, CA: University of California Press, 1986); He Ganzhi, Jindai Zhongguo qimeng yundong shi (History of the Modern Chinese Enlightenment Movement) (Shanghai: no publisher, 1936); Gu Xin, Zhongguo qimengde lishi tujing (Historical Prospects of the Chinese Enlightenment) (Hong Kong: Oxford University Press, 1992); and Zhang Xudong, Chinese Modernism in the Era of Reforms: Cultural Fever, Avant-garde Fiction, and the New Chinese Cinema (Durham, NC: Duke University Press, 1997). Back to the essay
53. It is also important to note that this shift is anything but spontaneous. The surge in religion has been financed by states, and encouraged by Euro/American geopolitical interests, as in the case of Islam, with explicitly anti-revolutionary intentions. Organized activity also has played a major part, as in the case, for example, of the Gulen movement, whose impressive use of education in popularizing its goal of an Islamic capitalist modernity compares favorably with the censorial clumsiness of Confucius Institutes. For sympathetic studies, see, Helen Rose Ebaugh, The Gulen Movement: A Sociological Analysis of a Civic Movement Rooted in Moderate Islam (Dordrecht, the Netherlands: Springer, 2010), and Turkish Islam and the Secular State: The Gulen Movement, ed. by M. Hakan Yavuz and John L. Esposito (Syracuse, NY: Syracuse University Press, 2003). Back to the essay
54. The dismissal of the Amerindian scholar Ward Churchill from the University of Colorado for negative comments about 9/11 has been followed by ongoing efforts to restrict speech on a variety of issues, most egregiously, in the US, criticism of Israel. The most recent case is that of Steven Salaita, who has been “unhired” by the University of Illinois at Urbana-Champaign on the grounds of “uncivil” language in tweets that were critical of Israel. The chilling effect on criticism of a vague charge that potentially covers a broad range of speech and behavior is imaginable. See, David Palumbo-Liu, “Why the `Unhiring’ of Steven Salaita Is a Threat to Academic Freedom”, The Nation, August 27, 2014 (viewed 28 August 2014). Ironically, Salaita is also a scholar of Amerindian Studies, with an interest in settler colonialism. Settler colonialism as the experience both of Amerindians and Palestinians has received increased attention among Amerindian scholars in recent years. Back to the essay
55. Herbert Marcuse similarly referred to “the systematic moronization of children and adults alike.” See, Marcuse, “Repressive Tolerance,” in Robert Paul Wolff, Barrington Moore, Jr., Herbert Marcuse, A Critique of Pure Tolerance (Boston: Beacon Press, 1965), pp. 81-117, p. 83. “Repressive tolerance” also effectively captures the repression of diversity (as well as critical reason) by unthinking tolerance of multiculturalism! Back to the essay
56. Immanuel Kant, “An Answer to the Question: What is Enlightenment?” (1784), in Immanuel Kant, Perpetual Peace and Other Essays, tr. by Ted Humphrey (Indianapolis, IN: Hackett Publishing Co., 1983), pp. 41-48, pp. 41-42. Emphases in the original. Back to the essay
57. Noam Chomsky, “What is the Common Good?” Dewey Lecture at Columbia University, December 6, 2013, adapted for publication in Truthout, 07 January 2014, (consulted 27 April 2014). See, also, Jacques Ranciere for a view of anarchy as the condition for democracy: “Democracy first of all means this: anarchic `government’, one based on nothing other than the absence of every title to govern.” Ranciere, Hatred of Democracy (London: Verso, 2006), p. 41. In his many works, the Japanese social philosopher Kojin Karatani also has elaborated on the links between Kantian notions of Enlightenment and anarchism, especially the anarchism of Pierre-Joseph Proudhon. See, Kojin Karatani, The Structure of World History: From Modes of Production to Modes of Exchange, tr. by Michael K. Bourdaghs (Durham, NC: Duke University Press, 2014). Back to the essay
58. For a sustained philosophical argument that is as down to earth as it is analytically sharp, see, Emmanuel Chukwudi Eze, On Reason: Rationality in a World of Cultural Conflict and Racism (Durham, NC: Duke University Press, 2008). We may also recall here an observation by Ernesto Laclau on “a dimension of the relationship particularism/universalism which has generally been disregarded. The basic point is this: I cannot assert a differential identity without distinguishing it from a context, and, in the process of making the distinction, I am asserting the context at the same time. And the opposite is also true: I cannot destroy a context without destroying at the same time the identity of the particular subject who carries out the destruction.” Laclau, Emancipation(s) (London and New York: Verso, 1996), Chap. 2, “Universalism, Particularism and the Question of Identity,” p. 27. The “ontological differences” is with reference to the work of Ahmet Davutoglu, Alternative Paradigms: The Impact of Islamic and Western Weltanschauungs on Political Theory (Lanham, MD: University Press of America, 1994), p. 195, where the author describes the “Islamic paradigm” as “absolutely alternative to the Western.” Davutoglu is currently the foreign minister (and soon-to-be prime minister) of Turkey. He is an advocate of Pan-Islamic expansionism, with Turkey at the center, and for all his insistence on “ontological difference,” draws heavily on Euro/American geopolitical ideas, especially German notions of lebensraum from the early 20th century. See, Ozkan, “Turkey, Davutoglu, and the Idea of Pan-Islamism,” op.cit., fn. 10. Back to the essay
60. Frantz Fanon, The Wretched of the Earth, tr. by Constance Farrington (New York: Grove Press, 1963), p. 188. Back to the essay
61. Soonyi Lee, “Culture and Politics in Interwar China: The Two Zhangs and Chinese Socialism,” Ph.D. dissertation, Department of East Asian Studies, New York University (2014), p. 27. Back to the essay
Seventy years after the horror of Hiroshima, intellectuals negotiate a vastly changed cultural, political and moral geography. Pondering what Hiroshima means for American history and consciousness proves as fraught an intellectual exercise as taking up this critical issue in the years and the decades that followed this staggering inhumanity, albeit for vastly different reasons. Now that we are living in a 24/7 screen culture hawking incessant apocalypse, how we understand Foucault’s pregnant observation that history is always a history of the present takes on a greater significance, especially in light of the fact that historical memory is not simply being rewritten but is disappearing.1 Once an emancipatory pedagogical and political project predicated on the right to study and engage the past critically, history has receded into a depoliticizing culture of consumerism, a wholesale attack on science, the glorification of military ideals, an embrace of the punishing state, and a nostalgic invocation of the greatest generation. Inscribed in insipid patriotic platitudes and decontextualized, isolated facts, history under the reign of neoliberalism has been either cleansed of its most critical impulses and dangerous memories, or it has been reduced to a contrived narrative that sustains the fictions and ideologies of the rich and powerful. History has not only become a site of collective amnesia but has also been appropriated so as to transform “the past into a container full of colorful or colorless, appetizing or insipid bits, all floating with the same specific gravity.”2 Consequently, what intellectuals now have to say about Hiroshima and history in general is not of the slightest interest to nine tenths of the American population. While writers of fiction might find such generalized public indifference to their craft freeing, even “inebriating,” as Philip Roth has recently written, for the chroniclers of history it is a cry in the wilderness.3
At the same time, the legacy of Hiroshima is present but barely grasped, as the existential anxieties and dread of nuclear annihilation that racked the early 1950s have given way to a contemporary fundamentalist fatalism embodied in collective uncertainty, a predilection for apocalyptic violence, a political economy of disposability, and an expanding culture of cruelty that has fused with the entertainment industry. We have not produced a generation of war protestors or government agitators, to be sure, but rather a generation of youth who no longer believe they have a future that will be any different from the present.4 That such connections tying the past to the present are lost signals not merely the emergence of a disimagination machine that wages an assault on historical memory, civic literacy, and civic agency. It also points to a historical shift in which the perpetual disappearance of that atomic moment signals a further deepening of our own national psychosis.
If, as Edward Glover once observed, “Hiroshima and Nagasaki had rendered actual the most extreme fantasies of world destruction encountered in the insane or in the nightmares of ordinary people,” the neoliberal disimagination machine has rendered such horrific reality a collective fantasy driven by the spectacle of violence, nourished by sensationalism, and reinforced by a scourge of commodified and trivialized entertainment.5 The disimagination machine threatens democratic public life by devaluing social agency, historical memory, and critical consciousness, and in doing so it creates the conditions for people to be ethically compromised and politically infantilized. Returning to Hiroshima is necessary not only to break out of the moral cocoon that puts reason and memory to sleep but also to rediscover our imaginative capacities for civic literacy on behalf of the public good, especially if such action demands that we remember, as Robert Jay Lifton and Greg Mitchell remark, “Every small act of violence, then, has some connection with, if not sanction from, the violence of Hiroshima and Nagasaki.”6
On Monday, August 6, 1945, the United States unleashed an atomic bomb on Hiroshima, killing 70,000 people instantly and another 70,000 within five years—an opening volley in a nuclear campaign visited on Nagasaki in the days that followed.7 In the immediate aftermath, the incineration of mostly innocent civilians was buried in official government pronouncements celebrating the bombings of both Hiroshima and Nagasaki. The atomic bomb was celebrated by those who argued that its use was responsible for concluding the war with Japan. Also applauded was the power of the bomb and the wonder of science in creating it, especially “the atmosphere of technological fanaticism” in which scientists worked to create the most powerful weapon of destruction then known to the world.8 Conventional justification for dropping the atomic bombs held that “it was the most expedient measure to securing Japan’s surrender [and] that the bomb was used to shorten the agony of war and to save American lives.”9 Left out of that succinct legitimating narrative were the growing objections to the use of atomic weaponry put forth by a number of top military leaders and politicians, including General Dwight Eisenhower, who was then the Supreme Allied Commander in Europe, former President Herbert Hoover, and General Douglas MacArthur, all of whom argued it was not necessary to end the war.10 This position was later proven correct.
For a brief time, the Atom Bomb was celebrated as a kind of magic talisman entwining salvation and scientific inventiveness, and in doing so it functioned to “simultaneously domesticate the unimaginable while charging the mundane surroundings of our everyday lives with a weight and sense of importance unmatched in modern times.”11 In spite of this initial celebration and the orthodox defense that accompanied it, whatever positive value the bomb may have had among the American public, intellectuals, and popular media began to dissipate as more and more people became aware of the massive death, suffering, and misery it caused.12
Kenzaburo Oe, the Nobel Prize winner for Literature, noted that in spite of attempts to justify the bombing, “from the instant the atomic bomb exploded, it [soon] became the symbol of human evil, [embodying] the absolute evil of war.”13 What particularly troubled Oe was the scientific and intellectual complicity in the creation of the bomb and in the lobbying for its use, with acute awareness that it would turn Hiroshima into a “vast ugly death chamber.”14 More pointedly, it revealed a new stage in the merging of military actions and scientific methods, indeed a new era in which the technology of destruction could destroy the earth in roughly the time it takes to boil an egg. The bombing of Hiroshima extended a new industrially enabled kind of violence and warfare in which the distinction between soldiers and civilians disappeared and the indiscriminate bombing of civilians was normalized. But more than this, the American government exhibited a “total embrace of the atom bomb” that signaled support, for the first time, of a “notion of unbounded annihilation [and] the totality of destruction.”15
Hiroshima designated the beginning of the nuclear era in which, as Oh Jung points out, “Combatants were engaged on a path toward total war in which technological advances, coupled with the increasing effectiveness of an air strategy, began to undermine the ethical view that civilians should not be targeted… This pattern of wholesale destruction blurred the distinction between military and civilian casualties.”16 The destructive power of the bomb and its use on civilians also marked a turning point in American self-identity in which the United States began to think of itself as a superpower, which, as Robert Jay Lifton points out, refers to “a national mindset–put forward strongly by a tight-knit leadership group–that takes on a sense of omnipotence, of unique standing in the world that grants it the right to hold sway over all other nations.”17 The power of the scientific imagination and its murderous deployment gave birth simultaneously to the American disimagination machine, with its capacity to rewrite history in order to render it an irrelevant relic best forgotten.
What remains particularly ghastly about the rationale for dropping two atomic bombs was the attempt on the part of its defenders to construct a redemptive narrative through a perversion of humanistic commitment, of mass slaughter justified in the name of saving lives and winning the war.18 This was a humanism under siege, transformed into its terrifying opposite and placed on the side of what Edmund Wilson called the Faustian possibility of a grotesque “plague and annihilation.”19 In part, Hiroshima represented the achieved transcendence of military metaphysics, now a defining feature of national identity, its more poisonous and powerful investment in the cult of scientism, instrumental rationality, and technological fanaticism—and the simultaneous marginalization of scientific evidence and intellectual rigor, even reason itself. That Hiroshima was used to redefine America’s “national mission and its utopian possibilities”20 was nothing short of what the late historian Howard Zinn called a “devastating commentary on our moral culture.”21 More pointedly, it serves as a grim commentary on our national sanity. In most of these cases, matters of morality and justice were dissolved into technical questions and a reductive chauvinism relating to matters of governmentally massaged efficiency, scientific “expertise”, and American exceptionalism. As Robert Jay Lifton and Greg Mitchell stated, the atom bomb was treated as symbolic of the power of post-war America rather than as a “ruthless weapon of indiscriminate destruction,” a framing that conveniently put to rest painful questions concerning justice, morality, and ethical responsibility. They write:
Our official narrative precluded anything suggesting atonement. Rather the bomb itself had to be “redeemed”: As “a frightening manifestation of technological evil … it needed to be reformed, transformed, managed, or turned into the vehicle of a promising future,” [as historian M. Susan] Lindee argued. “It was necessary, somehow, to redeem the bomb.” In other words, to avoid historical and moral responsibility, we acted immorally and claimed virtue. We sank deeper, that is, into moral inversion.22
This narrative of redemption was soon challenged by a number of historians who argued that the dropping of the atom bomb had less to do with winning the war than with an attempt to put pressure on the Soviet Union not to expand its empire into territory deemed essential to American interests.23 Protecting America’s superiority in a potential Soviet-American conflict was a decisive factor in dropping the bomb. In addition, the Truman administration needed to provide legitimation to Congress for the staggering sums of money spent on the Manhattan Project in developing the atomic weapons program and for procuring the future funding necessary to continue military appropriations for ongoing research long after the war ended.24 Howard Zinn goes even further, asserting that the government’s weak defense of the bombing of Hiroshima was not only false but complicitous with an act of terrorism. Refusing to relinquish his role as a public intellectual willing to hold power accountable, he writes, “Can we … comprehend the killing of 200,000 people to make a point about American power?”25 A number of historians, including Gar Alperovitz and Tsuyoshi Hasegawa, also attempted to deflate this official defense of Hiroshima by providing counter-evidence that the Japanese were ready to surrender as a result of a number of factors, including the nonstop bombing of 26 cities before Hiroshima and Nagasaki, the success of the naval and military blockade of Japan, and the Soviets’ entrance into the war on August 9th.26
The narrative of redemption and the criticism it provoked are important for understanding the role that intellectuals assumed at this historical moment, the beginning of the nuclear weapons era, and how that role for critics of the nuclear arms race has faded somewhat at the beginning of the twenty-first century. Historical reflection on this tragic foray into the nuclear age reveals the decades-long dismantling of a culture’s infrastructure of ideas: its growing intolerance for critical thought in light of the pressures placed on the media, on universities, and on increasingly isolated intellectuals to support comforting mythologies and official narratives, and thus to cede the responsibility to give effective voice to unpopular realities.
Within a short time after the dropping of the atom bombs on Hiroshima and Nagasaki, John Hersey wrote a devastating description of the misery and suffering caused by the bomb. Removing the bomb from abstract arguments about technique, efficiency, and national honor, Hersey published, first in The New Yorker and later in a widely read book, an exhaustive and terrifying account of the bomb’s effects on the people of Hiroshima, portraying in detail the horror of the suffering it caused. There is one haunting passage that not only illustrates the horror of the pain and suffering, but also offers a powerful metaphor for the blindness that overtook both the victims and the perpetrators. He writes:
On his way back with the water, [Father Kleinsorge] got lost on a detour around a fallen tree, and as he looked for his way through the woods, he heard a voice ask from the underbrush, ‘Have you anything to drink?’ He saw a uniform. Thinking there was just one soldier, he approached with the water. When he had penetrated the bushes, he saw there were about twenty men, they were all in exactly the same nightmarish state: their faces were wholly burned, their eye sockets were hollow, the fluid from their melted eyes had run down their cheeks. Their mouths were mere swollen, pus-covered wounds, which they could not bear to stretch enough to admit the spout of the teapot.27
The nightmarish image of fallen soldiers staring from hollow sockets, eyes melted onto their cheeks, and mouths swollen and pus-filled stands as a warning to those who would blindly refuse the moral witnessing necessary to keep alive for future generations the memory of the horror of nuclear weapons and the need to eliminate them. Hersey’s literal depiction of mass violence against civilians serves as a kind of mirrored doubling, referring at one level to nations blindly driven by militarism and hyper-nationalism, and at another level to victims who come to mimic their perpetrators, seizing upon their own victimization as a rationale to become blind to their own injustices.
Pearl Harbor enabled Americans to view themselves as victims, but they then assumed the identity of perpetrators and became willfully blind to the United States’ own escalation of violence and injustice. Employing both a poisonous racism and a weapon of mad violence against the Japanese people, the US government imagined Japan as the ultimate enemy, and then pursued tactics that blinded the American public to its own humanity; in doing so, the nation became its own worst enemy by turning against its most cherished democratic principles. In a sense, this self-imposed sightlessness functioned as part of what Jacques Derrida once called a societal autoimmune response, one in which the body’s immune system attacks its own defenses.28 Fortunately, this state of political and moral blindness did not extend to a number of critics who, over the next fifty years, railed aggressively against the dropping of the atomic bombs and the beginning of the nuclear age.
Responding to Hersey’s article on the bombing of Hiroshima published in The New Yorker, Mary McCarthy argued that he had reduced the bombing to the same level of journalism used to report natural catastrophes such as “fires, floods, and earthquakes” and in doing so had reduced a grotesque act of barbarism to “a human interest story” that failed to grasp the bomb’s nihilism and the role that “bombers, the scientists, the government” and others played in producing this monstrous act.29 McCarthy was alarmed that Hersey had “failed to consider why it was used, who was responsible, and whether it had been necessary.”30 McCarthy was only partly right. While it is true that Hersey did not tackle the larger political, cultural, and social conditions of the event’s unfolding, his article provided one of the few detailed reports at the time of the horrors the bomb inflicted, stoking a sense of trepidation about nuclear weapons along with a modicum of moral outrage over the decision to drop the bomb—dispositions that most Americans had not considered at the time. Hersey was not alone. Wilfred Burchett, writing for the London Daily Express, was the first journalist to provide an independent account of the suffering, misery, and death that engulfed Hiroshima after the bomb was dropped on the city. For Burchett, the cataclysm and horror he witnessed first-hand resembled a vision of hell that he aptly termed “the Atomic Plague.” He writes:
Hiroshima does not look like a bombed city. It looks as if a monster steamroller had passed over it and squashed it out of existence. I write these facts as dispassionately as I can in the hope that they will act as a warning to the world. In this first testing ground of the atomic bomb I have seen the most terrible and frightening desolation in four years of war. It makes a blitzed Pacific island seem like an Eden. The damage is far greater than photographs can show.31
In the end, in spite of such accounts, fear and moral outrage did little to put an end to the nuclear arms race, but they did prompt a number of intellectuals to enter the public realm to denounce the bombing, the ongoing advance of a nuclear weapons program, and the ever-present threat of annihilation it posed.
A number of important questions emerge from the above analysis, but two issues in particular stand out for me in light of the role that academics and public intellectuals have played in addressing the bombing of Hiroshima, the emergence of nuclear weapons on a global scale, and the imminent threat of human annihilation posed by the continued existence and potential use of such weapons. The first question focuses on what has been learned from the bombing of Hiroshima; the second concerns the disturbing issue of how violence, and hence Hiroshima itself, have become normalized in the collective American psyche.
In the aftermath of the bombing of Hiroshima, there was a major debate not just about the emergence of the atomic age and the moral, economic, scientific, military, and political forces that gave rise to it. There was also a heated debate about the ways in which the embrace of the atomic age altered the emerging nature of state power, gave rise to new forms of militarism, put American lives at risk, created environmental hazards, produced an emergent surveillance state, furthered the politics of state secrecy, and put into play a series of deadly diplomatic crises, reinforced by the logic of brinkmanship and a belief in the totality of war.32
Hiroshima not only unleashed immense misery, unimaginable suffering, and wanton death on Japanese civilians, it also gave rise to anti-democratic tendencies in the United States government that put the health, safety, and liberty of the American people at risk. Shrouded in secrecy, the government machinery of death that produced the bomb did everything possible to cover up not only the most grotesque effects of the bomb on the people of Hiroshima and Nagasaki but also the dangerous hazards it posed to the American people. Lifton and Mitchell argue convincingly that the development of the bomb and its immediate effects were shrouded in concealment by the government, and that before long concealment developed into a cover-up marked by government lies and the falsification of information.33 With respect to the horrors visited upon Hiroshima and Nagasaki, films taken by Japanese and American photographers were hidden for years from the American public for fear that they would create both a moral panic and a backlash against the funding for nuclear weapons.34 For example, the Atomic Energy Commission lied about the extent and danger of radiation fallout, going so far as to mount a campaign claiming that “fallout does not constitute a serious hazard to any living thing outside the test site.”35 This act of falsification took place in spite of the fact that thousands of military personnel were exposed to high levels of radiation within and outside of the test sites.
In addition, the Atomic Energy Commission, in conjunction with the Department of Defense, the Department of Veterans’ Affairs, the Central Intelligence Agency, and other government departments, engaged in a series of medical experiments designed to test the effects of different levels of radiation exposure on military personnel, medical patients, prisoners, and others in various sites. According to Lifton and Mitchell, these experiments took the shape of exposing people intentionally to “radiation releases or by placing military personnel at or near ground zero of bomb tests.”36 It gets worse. They also note that “from 1945 through 1947, bomb-grade plutonium injections were given to thirty-one patients” [in a variety of hospitals and medical centers] and that all of these “experiments were shrouded in secrecy and, when deemed necessary, in lies…. the experiments were intended to show what type or amount of exposure would cause damage to normal, healthy people in a nuclear war.”37 Some of the long-lasting legacies of the birth of the atomic bomb also included the rise of plutonium dumps, environmental and health risks, the cult of expertise, and the subordination of the peaceful development of technology to a large-scale interest in using technology for the organized production of violence. Another notable development raised by many critics in the years following the launch of the atomic age was the rise of a government mired in secrecy, the repression of dissent, and the legitimation of a type of civic illiteracy in which Americans were told to leave “the gravest problems, military and social, completely in the hands of experts and political leaders who claimed to have them under control.”38
All of these anti-democratic tendencies unleashed by the atomic age came under scrutiny during the latter half of the twentieth century. The terror of a nuclear holocaust, an intense sense of alienation from the commanding institutions of power, and deep anxiety about the demise of the future spawned growing unrest, ideological dissent, and massive outbursts of resistance among students and intellectuals all over the globe from the sixties until the beginning of the twenty-first century, calling for the outlawing of militarism, nuclear production and stockpiling, and the nuclear propaganda machine. Literary writers extending from James Agee to Kurt Vonnegut, Jr. condemned the death-saturated machinery launched by the atomic age. Moreover, public intellectuals from Dwight Macdonald and Bertrand Russell to Helen Caldicott, Ronald Takaki, Noam Chomsky, and Howard Zinn fanned the flames of resistance to the nuclear arms race as well as to the development of nuclear technologies. Others, such as the environmental activist George Monbiot, have supported the nuclear industry but denounced the nuclear arms race. In doing so, he has argued that “The anti-nuclear movement … has misled the world about the impacts of radiation on human health [producing] claims … ungrounded in science, unsupportable when challenged and wildly wrong [and] have done other people, and ourselves, a terrible disservice.”39
In addition, in light of the nuclear crises that extend from the Three Mile Island accident in 1979 and the Chernobyl disaster in 1986 to the more recent Fukushima nuclear disaster in 2011, a myriad of social movements and mass demonstrations against nuclear power have developed all over the world.40 While deep moral and political concerns over the legacy of Hiroshima seemed to be fading in the United States, the tragedy of 9/11 and the endlessly replayed images of the two planes crashing into the twin towers of the World Trade Center resurrected once again the frightening image of what Colonel Paul Tibbets, Jr., the Enola Gay’s pilot, referred to as “that awful cloud… boiling up, mushrooming, terrible and incredibly tall” after “Little Boy,” a 9,700-pound uranium bomb, was released over Hiroshima. This time, however, collective anxieties were focused not on the atomic bombing of Hiroshima and its implications for a nuclear Armageddon but on the fear of terrorists using a nuclear weapon to wreak havoc on Americans. But a decade later even that fear, however parochially framed, seems to have diminished, if not entirely disappeared, even though it has produced an aggressive attack on civil liberties and given even more power to an egregious and dangerous surveillance state.
Atomic anxiety confronts a world in which nine states have nuclear weapons and a number of them, such as North Korea, Pakistan, and India, have threatened to use them. James McCluskey points out that “there are over 20,000 nuclear weapons in existence, sufficient destructive power to incinerate every human being on the planet three times over [and] there are more than 2000 held on hair trigger alert, already mounted on board their missiles and ready to be launched at a moment’s notice.”41 These weapons are far more powerful and deadly than the atomic bomb, and the possibility that they might be used, even inadvertently, is high. This threat becomes all the more real in light of the fact that the world has seen a history of miscommunications and technological malfunctions, suggesting both the fragility of such weapons and the dire stupidity of positions defending their safety and value as a nuclear deterrent.42 The 2014 report Too Close for Comfort: Cases of Near Nuclear Use and Options for Policy not only outlines a history of such near misses in great detail, it also makes terrifyingly clear that “the risk associated with nuclear weapons is high.”43 It is also worth noting that an enormous amount of money is wasted to maintain these weapons and missiles, develop more sophisticated nuclear weaponry, and invest in ever more weapons laboratories. McCluskey estimates world funding for such weapons at $1 trillion per decade, while Arms Control Today reported in 2012 that yearly funding for U.S. nuclear weapons activity was $31 billion.44
In the United States, the mushroom cloud connected to Hiroshima is now connected to much larger forces of destruction, including a turn to instrumental reason over moral considerations, the normalization of violence in America, the militarization of local police forces, an attack on civil liberties, the rise of the surveillance state, a dangerous turn towards state secrecy under President Obama, the rise of the carceral state, and the elevation of war as a central organizing principle of society. Rather than standing in opposition to nuclear mishaps or the expansion of the arms industry, the United States places high on the list of those nations that could trigger what Amy Goodman calls that “horrible moment when hubris, accident or inhumanity triggers the next nuclear attack.”45 Given the history of lies, deceptions, falsifications, and retreats into secrecy that characterizes an American government in the strangulating hold of the military-industrial-surveillance complex, it would be naïve to assume that the U.S. government can be trusted to act with good intentions when it comes to matters of domestic and foreign policy. State terrorism has increasingly become the DNA of American governance and politics, evident in government cover-ups, corruption, and numerous acts of bad faith. Secrecy, lies, and deception have a long history in the United States, and the issue is not merely to uncover such instances of state deception but to connect the dots over time and to map the connections, for instance, between the early attempts to cover up the inhumane destruction unleashed by the atomic bomb on Hiroshima and Nagasaki and the role the NSA and other intelligence agencies play today in distorting the truth about government policies while embracing an all-encompassing notion of surveillance and the squelching of civil liberties, privacy, and freedom.
Hiroshima symbolizes the fact that the United States commits unspeakable acts, making it easier to refuse to rely on politicians, academics, and alleged experts who refuse to support a politics of transparency and serve mostly to legitimate anti-democratic, if not totalitarian, policies. Questioning a monstrous war machine whose roots lie in Hiroshima is the first step in declaring nuclear weapons ethically and politically unacceptable. This suggests a further mode of inquiry that focuses on how the rise of the military-industrial complex contributes to the escalation of nuclear weapons and what can be learned by tracing its roots to the development and use of the atom bomb. Moreover, it raises questions about the role played by intellectuals both in and out of the academy in conspiring to build the bomb and hide its effects from the American people. These are only some of the questions that need to be made visible, interrogated, and pursued in a variety of sites and public forums.
One crucial issue today is what role might intellectuals and matters of civic courage, engaged citizenship, and the educative nature of politics play as part of a sustained effort to resurrect the memory of Hiroshima as both a warning and a signpost for rethinking the nature of collective struggle, reclaiming the radical imagination, and producing a sustained politics aimed at abolishing nuclear weapons forever? One issue would be to revisit the conditions that made Hiroshima and Nagasaki possible, to explore how militarism and a kind of technological fanaticism merged under the star of scientific rationality. Another step forward would be to make clear what the effects of such weapons are, to disclose the manufactured lie that such weapons make us safe. Indeed, this suggests the need for intellectuals, artists, and other cultural workers to use their skills, resources, and connections to develop massive educational campaigns.
Such campaigns would not only make education, consciousness, and collective struggle the center of politics, but also systematically work to inform the public about the history of such weapons, the misery and suffering they have caused, and the ways they benefit the financial, government, and corporate elites who make huge amounts of money off the arms race, the promotion of nuclear deterrence, and the need for a permanent warfare state. Intellectuals today appear numbed by ever-developing disasters, statistics of suffering and death, the Hollywood disimagination machine with its investment in the celluloid Apocalypse to which only superheroes can respond, and a consumer culture that thrives on self-interest and deplores collective political and ethical responsibility.
There are no rationales for, or escapes from, the responsibility of preventing mass destruction due to nuclear annihilation; the appeal to military necessity is no excuse for the indiscriminate bombing of civilians, whether in Hiroshima or Afghanistan. The sense of horror, fear, doubt, anxiety, and powerlessness that followed Hiroshima and Nagasaki up until the beginning of the 21st century seems to have faded in light of the Hollywood apocalypse machine, the mindlessness of celebrity and consumer cultures, the growing spectacles of violence, and a militarism that is now celebrated as one of the highest ideals of American life. In a society governed by militarism, consumerism, and neoliberal savagery, it has become more difficult to assume a position of moral, social, and political responsibility, to believe that politics matters, to imagine a future in which responding to the suffering of others is a central element of democratic life. When historical memory fades and people turn inward, remove themselves from politics, and embrace cynicism over educated hope, a culture of evil, suffering, and existential despair takes hold. Americans now live amid a culture of indifference sustained by an endless series of manufactured catastrophes that offer a source of entertainment, sensation, and instant pleasure.
We live in a neoliberal culture that subordinates human needs to the demand for unchecked profits, elevates exchange values over the public good, and embraces commerce as the only viable model of social relations to shape the entirety of social life. Under such circumstances, violence becomes a form of entertainment rather than a source of alarm, and individuals no longer question society, becoming incapable of translating private troubles into larger public considerations. In the age following the use of the atom bomb on civilians, talk about evil, militarism, and the end of the world once stirred public debate and diverse resistance movements; now it promotes a culture of fear, moral panics, and a retreat into the black hole of the disimagination machine. The good news is that neoliberalism now makes clear that it cannot provide a vision to sustain society and works largely to destroy it. It is a metaphor for the atom bomb, a social, political, and moral embodiment of global destruction that needs to be stopped before it is too late. The future will look much brighter without the glow of atomic energy, and the recognition that the legacy of death and destruction extending from Hiroshima to Fukushima makes clear that no one can be a bystander if democracy is to survive.
notes: 1. This reference refers to a collection of interviews with Michel Foucault originally published by Semiotext(e). Michel Foucault, “What our present is?” Foucault Live: Collected Interviews, 1961–1984, ed. Sylvere Lotringer, trans. Lysa Hochroth and John Johnston,(New York: Semiotext(e), 1989 and 1996), 407–415. Back to the essay
2. Zygmunt Bauman and Leonidas Donskis, Moral Blindness: The Loss of Sensitivity in Liquid Modernity (Cambridge, UK: Polity Press, 2013), p. 33. Back to the essay
4. Of course, the Occupy Movement in the United States and the Quebec student movement are exceptions to this trend. See, for instance, David Graeber, The Democracy Project: A History, A Crisis, A Movement, (New York, NY,: The Random House Publishing Group, 2013) and Henry A. Giroux, Neoliberalism’s War Against Higher Education (Chicago: Haymarket, 2014). Back to the essay
7. Jennifer Rosenberg, “Hiroshima and Nagasaki (Part 2),” About.com – 20th Century History (March 28, 201). Online: http://history1900s.about.com/od/worldwarii/a/hiroshima_2.htm. A more powerful atom bomb was dropped on Nagasaki on August 9, 1945, and by the end of the year an estimated 70,000 had been killed. For the history of the making of the bomb, see the monumental Richard Rhodes, The Making of the Atomic Bomb, Anv Rep edition (New York: Simon & Schuster, 2012). Back to the essay
8. The term “technological fanaticism” comes from Michael Sherry, who suggested that it produced an increased form of brutality. Cited in Howard Zinn, The Bomb (New York, NY: City Lights, 2010), pp. 54-55. Back to the essay
11. Peter Bacon Hales, Outside the Gates of Eden: The Dream of America from Hiroshima to Now (Chicago, IL: University of Chicago Press, 2014), p. 17. Back to the essay
12. Paul Ham, Hiroshima Nagasaki: The Real Story of the Atomic Bombings and Their Aftermath (New York: Doubleday, 2011). Back to the essay
13. Kensaburo Oe, Hiroshima Notes (New York: Grove Press, 1965), p. 114. Back to the essay
15. Robert Jay Lifton and Greg Mitchell, Hiroshima in America (New York, NY: Avon Books, 1995), pp. 314-315, 328. Back to the essay
16. Ibid., Oh Jung, “Hiroshima and Nagasaki: The Decision to Drop the Bomb.” Back to the essay
17. Robert Jay Lifton, “American Apocalypse,” The Nation (December 22, 2003), p. 12. Back to the essay
18. For an interesting analysis of how the bomb was defended by the New York Times and a number of high-ranking politicians, especially after John Hersey’s Hiroshima appeared in The New Yorker, see Steve Rothman, “The Publication of ‘Hiroshima’ in The New Yorker,” herseyhiroshima.com (January 8, 1997). Online: http://www.herseyhiroshima.com/hiro.php Back to the essay
19. Wilson cited in Lifton and Mitchell, Hiroshima In America, p. 309. Back to the essay
20. Ibid., Peter Bacon Hales, Outside The Gates of Eden: The Dream Of America From Hiroshima To Now, p. 8. Back to the essay
22. Ibid., Robert Jay Lifton and Greg Mitchell, Hiroshima In America. Back to the essay
23. For a more recent articulation of this argument, see Ward Wilson, Five Myths About Nuclear Weapons (New York: Mariner Books, 2013). Back to the essay
24. Ronald Takaki, Hiroshima: Why America Dropped the Atomic Bomb (Boston: Back Bay Books, 1996), p. 39. Back to the essay
26. See, for example, Ibid., Hasegawa; Gar Alperovitz, Atomic Diplomacy: Hiroshima and Potsdam: The Use of the Atomic Bomb and the American Confrontation with Soviet Power (London: Pluto Press, 1994), and also Gar Alperovitz, The Decision to Use the Atomic Bomb (New York: Vintage, 1996). Ibid., Ham. Back to the essay
27. John Hersey, Hiroshima (New York: Alfred A. Knopf, 1946), p. 68. Back to the essay
28. Giovanna Borradori, ed, “Autoimmunity: Real and Symbolic Suicides–a dialogue with Jacques Derrida,” in Philosophy in a Time of Terror: Dialogues with Jurgen Habermas and Jacques Derrida (Chicago: University of Chicago Press, 2004), pp. 85-136. Back to the essay
31. George Burchett & Nick Shimmin, eds. Memoirs of a Rebel Journalist: The Autobiography of Wilfred Burchett, (UNSW Press, Sydney, 2005), p.229. Back to the essay
40. Patrick Allitt, A Climate of Crisis: America in the Age of Environmentalism (New York: Penguin, 2015); Horace Herring, From Energy Dreams to Nuclear Nightmares: Lessons from the Anti-nuclear Power Movement in the 1970s (Chipping Norton, UK: Jon Carpenter Publishing, 2006); Alain Touraine, Anti-Nuclear Protest: The Opposition to Nuclear Energy in France (Cambridge, UK: Cambridge University Press, 1983); Stephen Croall, The Anti-Nuclear Handbook (New York: Random House, 1979). On the decade that enveloped the anti-nuclear movement in a series of crises, see Philip Jenkins, Decade of Nightmares: The End of the Sixties and the Making of Eighties America (New York: Oxford University Press, 2008). Back to the essay
Hans Ulrich Gumbrecht’s “The Future of Reading? Memories and Thoughts Toward a Genealogical Approach” asks a fundamental question: How does the younger generation of students and readers approach a text, and in which ways does their constant reading via one online device or another, as a ubiquitous electronic background activity, change their experience of interacting with the printed word? Gumbrecht bases his observations on two seminars that he taught in Santiago de Chile in 2013, in which Stanford undergraduates were reading fiction and nonfiction texts in both English and Spanish. While he notes that today’s younger readers possess breathtaking agility in identifying the key subject matters and problems of a text, he found their way of interacting with a text “endlessly puzzling.” Against a backdrop of the various methods and theories of reading from the past decades (e.g., deconstructive, cultural, hermeneutic), Gumbrecht asks what it means to have “everything always at hand” via various electronic forms of communication. If, for example, classic texts are those that have maintained their freshness and immediacy against the erosion of time, how is the electronic revolution, which makes the cultures and literatures of any time accessible to us, changing the “reading culture” of the younger generations? The questions and the argument developed in this essay go back to my contribution, on October 17, 2011, to the lecture series “How I Think about Literature,” in the Division of Literatures, Cultures, and Languages, at Stanford University.
on The Emergence of the Digital Humanities by Steven E. Jones
1
Steven E. Jones begins his Introduction to The Emergence of the Digital Humanities (Routledge, 2014) with an anecdote concerning a speaking engagement at the Illinois Institute of Technology in Chicago. “[M]y hosts from the Humanities department,” Jones tells us,
had also arranged for me to drop in to see the fabrication and rapid-prototyping lab, the Idea Shop at the University Technology Park. In one empty room we looked into, with schematic drawings on the walls, a large tabletop machine jumped to life and began whirring, as an arm with a router moved into position. A minute later, a student emerged from an adjacent room and adjusted something on the keyboard and monitor attached by an extension arm to the frame for the router, then examined an intricately milled block of wood on the table. Next door, someone was demonstrating finely machined parts in various materials, but mostly plastic, wheels within bearings, for example, hot off the 3D printer….
What exactly, again, was my interest as a humanist in taking this tour, one of my hosts politely asked?1
It is left almost entirely to more or less clear implication, here, that Jones’s humanities department hosts had arranged the expedition at his request, and mainly or even only to oblige a visitor’s unusual curiosity, which we are encouraged to believe his hosts (if “politely”) found mystifying. Any reader of this book must ask herself, first, if she believes this can really have occurred as reported: and if the answer to that question is yes, if such a genuinely unlikely and unusual scenario — the presumably full-time, salaried employees of an Institute of Technology left baffled by a visitor’s remarkable curiosity about their employer’s very raison d’être — warrants any generalization at all. For that is how Jones proceeds: by generalization, first of all from a strained and improbably dramatic attempt at defamiliarization, in the apparent confidence that this anecdote illuminating the spirit of the digital humanities will charm — whom, exactly?
It must be said that Jones’s history of “digital humanities” is refreshingly direct and initially, at least, free of obfuscation, linking the emergence of what it denotes to events in roughly the decade preceding the book’s publication, though his reading of those events is tendentious. It was the “chastened” retrenchment after the dot-com bubble in 2000, Jones suggests (rather, just for example, than the bubble’s continued inflation by other means) that produced the modesty of companies like our beloved Facebook and Twitter, along with their modest social networking platform-products, as well as the profound modesty of Google Inc. initiatives like Google Books (“a development of particular interest to humanists,” we are told2) and Google Maps. Jones is clearer-headed when it comes to the disciplinary history of “digital humanities” as a rebaptism of humanities computing and thus — though he doesn’t put it this way — a catachrestic asseveration of traditional (imperial-nationalist) philology like its predecessor:
It’s my premise that what sets DH apart from other forms of media studies, say, or other approaches to the cultural theory of computing, ultimately comes through its roots in (often text-based) humanities computing, which always had a kind of mixed-reality focus on physical artifacts and archives.3
Jones is also clear-headed on the usage history of “digital humanities” as a phrase in the English language, linking it to moments of consolidation marked by Blackwell’s Companion to Digital Humanities, the establishment of the National Endowment for the Humanities Office for the Digital Humanities, and higher-education journalism covering the annual Modern Language Association of America conventions. It is perhaps this sensitivity to “digital humanities” as a phrase whose roots lie not in original scholarship or cultural criticism itself (as was still the case with “deconstruction” or “postmodernism,” even at their most shopworn) but in the dependent, even parasitic domains of reference publishing, grant-making, and journalism that leads Jones to declare “digital humanities” a “fork” of humanities computing, rather than a Kuhnian paradigm shift marking otherwise insoluble structural conflict in an intellectual discipline.
At least at first. Having suggested it, Jones then discards the metaphor drawn from the tree structures of software version control, turning to “another set of metaphors” describing the digital humanities as having emerged not “out of the primordial soup” but “into the spotlight” (Jones, 5). We are left to guess at the provenance of this second metaphor, but its purpose is clear: to construe the digital humanities, both phenomenally and phenomenologically, as the product of a “shift in focus, driven […] by a new set of contexts, generating attention to a range of new activities” (5).
Change; shift; new, new, new. Not a branch or a fork, not even a trunk: we’re now in the ecoverse of history and historical time, in its collision with the present. The appearance and circulation of the English-language phrase “digital humanities” can be documented — that is one of the things that professors of English like Jones do especially well, when they care to. But “changes in the culture,” much more broadly, within only the last ten years or so? No scholar in any discipline is particularly well trained, well positioned, or even well suited to diagnosing those; and scholars in English studies won’t be at the top of anyone’s list. Indeed, Jones very quickly appeals to “author William Gibson” for help, settling on the emergence of the digital humanities as a response to what Gibson called “the eversion of cyberspace,” in its ostensibly post-panopticist colonization of the physical world.6 It makes for a rather inarticulate and self-deflating statement of argument, in which on its first appearance eversion, ambiguously, appears to denote the response as much as its condition or object:
My thesis is simple: I think that the cultural response to changes in technology, the eversion, provides an essential context for understanding the emergence of DH as a new field of study in the new millennium.7
Jones offers weak support for the grandiose claim that “we can roughly date the watershed moment when the preponderant collective perception changed to 2004–2008” (21). Second Life “peaked,” we are told, while World of Warcraft “was taking off”; Nintendo introduced the Wii; then Facebook “came into its own,” and was joined by Twitter and Foursquare, then Apple’s iPhone. Even then (and setting aside the question of whether such benchmarking is acceptable evidence), for the most part Jones’s argument, such as it is, is that something is happening because we are talking about something happening.
But who are we? Jones’s is the typical deference of the scholar to the creative artist, unwilling to challenge the latter’s utter dependence on meme engineering, at least where someone like Gibson is concerned; and Jones’s subsequent turn to the work of a scholar like N. Katherine Hayles on the history of cybernetics comes too late to amend the impression that the order of things here is marked first by gadgets, memes, and conversations about gadgets and memes, and only subsequently by ideas and arguments about ideas. The generally unflattering company among whom Hayles is placed (Clay Shirky, Nathan Jurgenson) does little to move us out of the shallows, and Jones’s profoundly limited range of literary reference, even within a profoundly narrowed frame — it’s Gibson, Gibson, Gibson all the time, with the usual cameos by Bruce Sterling and Neal Stephenson — doesn’t help either.
Jones does have one problem with the digital humanities: it ignores games. “My own interest in games met with resistance from some anonymous peer reviewers for the program for the DH 2013 conference, for example,” he tells us (33). “[T]he digital humanities, at least in some quarters, has been somewhat slow to embrace the study of games” (59). “The digital humanities could do worse than look to games” (36). And so on: there is genuine resentment here.
But nobody wants to give a hater a slice of the pie, and a Roman peace mandates that such resentment be sublated if it is to be, as we say, taken seriously. And so in a magical resolution of that tension, the digital humanities turns out to be constituted by what it accidentally ignores or actively rejects, in this case — a solution that sweeps antagonism under the rug as we do in any other proper family. “[C]omputer-based video games embody procedures and structures that speak to the fundamental concerns of the digital humanities” (33). “Contemporary video games offer vital examples of digital humanities in practice” (59). If gaming “sounds like what I’ve been describing as the agenda of the digital humanities, it’s no accident” (144).
Some will applaud Jones’s niceness on this count. It may strike others as desperately friendly, a lingering under a big tent as provisional as any other tent, someday to be replaced by a building, if not by nothing. Few of us will deny recognition to Second Life, World of Warcraft, Wii, Facebook, Twitter, etc. as cultural presences, at least for now. But Jones’s book is also marked by slighter and less sensibly chosen benchmarks, less sensibly chosen because Jones’s treatment of them, in a book whose ambition is to preach to the choir, simply imputes their cultural presence. Such brute force argument drives the pathos that Jones surely feels, as a scholar — in the recognition that among modern institutions, it is only scholarship and the law that preserve any memory at all — into a kind of melancholic unconscious, from whence his objects return to embarrass him. “[A]s I write this,” we read, “QR codes show no signs yet of fading away” (41). Quod erat demonstrandum.
And it is just there, in such a melancholic unconscious, that the triumphalism of the book’s title, and the “emergence of the digital humanities” that it purports to mark, claim, or force into recognition, straightforwardly gives itself away. For the digital humanities will pass away, and rather than being absorbed into the current order of things, as digital humanities enthusiasts like to believe happened to “high theory” (it didn’t happen), the digital humanities seems more likely, at this point, to end as a blank anachronism, overwritten by the next conjuncture in line with its own critical mass of prognostications.
2
To be sure, who could deny the fact of significant “changes in the culture” since 2000, in the United States at least, and at regular intervals: 2001, 2008, 2013…? Warfare — military in character, but when that won’t do, economic; of any interval, but especially when prolonged and deliberately open-ended; of any intensity, but especially when flagrantly extrajudicial and opportunistically, indeed sadistically asymmetrical — will do that to you. No one who sets out to historicize the historical present can afford to ignore the facts of present history, at the very least — but the fact is that Jones finds such facts unworthy of comment, and in that sense, for all its pretense to worldliness, The Emergence of the Digital Humanities is an entirely typical product of the so-called ivory tower, wherein arcane and plain speech alike are crafted to euphemize and thus redirect and defuse the conflicts of the university with other social institutions, especially those other institutions who command the university to do this or do that. To take the ambiguity of Jones’s thesis statement (as quoted above) at its word: what if the cultural response that Jones asks us to imagine, here, is indeed and itself the “eversion” of the digital humanities, in one of the metaphorical senses he doesn’t quite consider: an autotomy or self-amputation that, as McLuhan so enjoyed suggesting in so many different ways, serves to deflect the fact of the world as a whole?
There are few moments of outright ignorance in The Emergence of the Digital Humanities — how could there be, in the security of such a narrow channel?6 Still, pace Jones’s basic assumption here (it is not quite an argument), we might understand the emergence of the digital humanities as the emergence of a conversation that is not about something — cultural change, etc. — as much as it is an attempt to avoid conversing about something: to avoid discussing such cultural change in its most salient and obvious flesh-and-concrete manifestations. “DH is, of course, a socially constructed phenomenon,” Jones tells us (7) — yet “the social,” here, is limited to what Jones himself selects, and selectively indeed. “This is not a question of technological determinism,” he insists. “It’s a matter of recognizing that DH emerged, not in isolation, but as part of larger changes in the culture at large and that culture’s technological infrastructure” (8). Yet the largeness of those larger changes is smaller than any truly reasonable reader, reading any history of the past decade, might have reason to expect. How pleasant that such historical change was “intertwined with culture, creativity, and commerce” (8) — not brutality, bootlicking, and bank fraud. Not even the modest and rather opportunistic gloom of Gibson’s 2010 New York Times op-ed entitled “Google’s Earth” finds its way into Jones’s discourse, despite the extended treatment that Gibson’s “eversion” gets here.
From our most ostensibly traditional scholarly colleagues, toiling away in their genuine and genuinely book-dusty modesty, we don’t expect much respect for the present moment (which is why they often surprise us). But The Emergence of the Digital Humanities is, at least in ambition, a book about cultural change over the last decade. And such historiographic elision is substantive — enough so to warrant impatient response. While one might not want to say that nothing good can have emerged from the cultural change of the period in question, it would be infantile to deny that conditions have been unpropitious in the extreme, possibly as unpropitious as they have ever been in U.S. postwar history — and that claims for the value of what emerges into institutionality and institutionalization, under such conditions, deserve extra care and, indeed, defense in advance, if one wants not to invite a reasonably caustic skepticism.
When Jones does engage in such defense, it is weakly argued. To construe the emergence of the digital humanities as non-meaninglessly concurrent with the emergence of yet another wave of mass educational automation (in the MOOC hype that crested in 2013), for example, is wrong not because Jones can demonstrate that their concurrence is the concurrence of two entirely segregated genealogies — one rooted in Silicon Valley ideology and product marketing, say, and one utterly and completely uncaused and untouched by it — but because to observe their concurrence is “particularly galling” to many self-identified DH practitioners (11). Well, excuse me for galling you! “DH practitioners I know,” Jones informs us, “are well aware of [the] complications and complicities” of emergence in an age of precarious labor, “and they’re often busy answering, complicating, and resisting such opportunistic and simplistic views” (10). Argumentative non sequitur aside, that sounds like a lot of work undertaken in self-defense — more than anyone really ought to have to do, if they’re near to the right side of history. Finally, “those outside DH,” Jones opines in an attempt at counter-critique, “often underestimate the theoretical sophistication of many in computing,” who “know better than many of their humanist critics that their science is provisional and contingent” (10): a statement that will only earn Jones super-demerits from those of such humanist critics — they are more numerous than the likes of Jones ever seem to suspect — who came to the humanities with scientific and/or technical aptitudes, sometimes with extensive educational and/or professional training and experience, and whose “sometimes world-weary and condescending skepticism” (10) is sometimes very well-informed and well-justified indeed, and certain to outlive Jones’s winded jabs at it.
Jones is especially clumsy in confronting the charge that the digital humanities is marked by a forgetting or evasion of the commitment to cultural criticism foregrounded by other, older and now explicitly competing formations, like so-called new media studies. Citing the suggestion by “media scholar Nick Montfort” that “work in the digital humanities is usually considered to be the digitization and analysis of pre-digital cultural artifacts, not the investigation of contemporary computational media,” Jones remarks that “Montfort’s own work […] seems to me to belie the distinction,”7 as if Montfort — or anyone making such a statement — were simply deluded about his own work, or about his experience of a social economy of intellectual attention under identifiably specific social and historical conditions, or else merely expressing pain at being excluded from a social space to which he desired admission, rather than objecting on principle to a secessionist act of imagination.8
3
Jones tells us that he doesn’t “mean to gloss over the uneven distribution of [network] technologies around the world, or the serious social and political problems associated with manufacturing and discarding the devices and maintaining the server farms and cell towers on which the network depends” — but he goes ahead and does it anyway, and without apology or evident regret. “[I]t’s not my topic in this book,” we are told, “and I’ve deliberately restricted my focus to the already-networked world” (3). The message is clear: this is a book for readers who will accept such circumscription, in what they read and contemplate. Perhaps this is what marks the emergence of the digital humanities, in the re-emergence of license for restrictive intellectual ambition and a generally restrictive purview: a bracketing of the world that was increasingly discredited, and discredited with increasing ferocity, just by the way, in the academic humanities in the course of the three decades preceding the first Silicon Valley bubble. Jones suggests that “it can be too easy to assume a qualitative hierarchical difference in the impact of networked technology, too easy to extend the deeper biases of privilege into binary theories of the global ‘digital divide’” (4), and one wonders what authority to grant to such a pronouncement when articulated by someone who admits he is not interested, at least in this book, in thinking about how an — how any — other half lives. It’s the latter, not the former, that is the easy choice here. (Against a single, entirely inconsequential squib in Computer Business Review entitled “Report: Global Digital Divide Getting Worse,” an almost obnoxiously perfunctory footnote pits “a United Nations Telecoms Agency report” from 2012. This is not scholarship.)
Thus it is that, read closely, the demand for finitude in the one capacity in which we are non-mortal — in thought and intellectual ambition — and the more or less cheerful imagination of an implied reader satisfied by such finitude, become passive microaggressions aimed at another mode of the production of knowledge, whose expansive focus on a theoretical totality of social antagonism (what Jones calls “hierarchical difference”) and justice (what he calls “binary theories”) makes the author of The Emergence of the Digital Humanities uncomfortable, at least on its pages.
That’s fine, of course. No: no, it’s not. What I mean to say is that it’s unfair to write as if the author of The Emergence of the Digital Humanities alone bears responsibility for this particular, certainly overdetermined state of affairs. He doesn’t — how could he? But he’s getting no help, either, from most of those who will be more or less pleased by the title of his book, and by its argument, such as it is: because they want to believe they have “emerged” along with it, and with that tension resolved, its discomforts relieved. Jones’s book doesn’t seriously challenge that desire, its (few) hedges and provisos notwithstanding. If that desire is more anxious now than ever, as digital humanities enthusiasts find themselves scrutinized from all sides, it is with good reason.
_____
notes:

1. Jones, 1.

2. Jones, 4. “Interest” is presumed to be affirmative, here, marking one elision of the range of humanistic critical and scholarly attitudes toward Google generally and the Google Books project in particular. And of the unequivocally less affirmative “interest” of creative writers as represented by the Authors Guild, just for example, Jones has nothing to say: another elision.

3. Jones, 13.

4. See Gibson.

5. Jones, 5.

6. As eager as any other digital humanities enthusiast to accept Franco Moretti’s legitimation of DH, but apparently incurious about the intellectual formation, career and body of work that led such a big fish to such a small pond, Jones opines that Moretti’s “call for a distant reading” stands “opposed to the close reading that has been central to literary studies since the late nineteenth century” (Jones, 62). “Late nineteenth century” when exactly, and where (and how, and why)? one wonders. But to judge by what Jones sees fit to say by way of explanation — that is, nothing at all — this is mere hearsay.

7. Jones, 5. See also Montfort.

8. As further evidence that Montfort’s statement is a mischaracterization or expresses a misunderstanding, Jones suggests the fact that “[t]he Electronic Literature Organization itself, an important center of gravity for the study of computational media in which Montfort has been instrumental, was for a time housed at the Maryland Institute for Technology in the Humanities (MITH), a preeminent DH center where Matthew Kirschenbaum served as faculty advisor” (Jones, 5–6). The non sequiturs continue: “digital humanities” includes the study of computing and media because “self-identified practitioners doing DH” study computing and media (Jones, 6); the study of computing and media is also “digital humanities” because the study of computing and digital media might be performed at institutions like MITH or George Mason University’s Roy Rosenzweig Center for History and New Media, which are “digital humanities centers” (although the phrase “digital humanities” appears nowhere in their names); “digital humanities” also adequately describes work in “media archaeology” or “media history,” because such work has “continued to influence DH” (Jones, 6); new media studies is a component of the digital humanities because some scholars suggest it is so, and others cannot be heard to object, at least after one has placed one’s fingers in one’s ears; and so on.

(feature image: “Bandeau – Manifeste des Digital Humanities,” uncredited; originally posted on flickr.)
This essay criticizes Ahmet Davutoğlu’s proposal that Islamic civilization complete the “unfinished project of modernity” (Jürgen Habermas), by challenging the concept of civilization itself. As scholars in multiple disciplines have demonstrated, civilizations are hybrid constructions that cannot be contained within a uniform conceptual frame, such as Islamic “authenticity.” The past is shared, and the present is as well. The Arab Spring demonstrates that modernity confronts political actors with similar problems, whatever their background. The essay addresses successive paradoxes within the unfinished project of democracy: the contradiction between free markets (capitalist inequality) and free societies (political equality), the hierarchical relationship between the people and their leaders (Jacques Rancière’s Ignorant Schoolmaster is discussed), and the lack of democracy between nations within the present world order.
Feature Image: leaflet dropped on the MNLA during the Malayan Emergency, offering $1,000 in exchange for the individual surrender of targeted MCP insurgents and the turning in of their Bren gun. Labeled an “emergency” and not a “war” for insurance purposes, it is suggested.