Remembering Postmodernism, Part Three: The Conspiracy Theory of Postmodernism

HOUSEKEEPING NOTE: There has been an unwanted lag in this series of posts, mainly because I’ve been struggling with what was supposed to be part three. Struggling in a good way! The TL;DR on it is that it occurred to me that a consideration of the artistic and literary responses to the two world wars respectively offers a useful insight into key elements of modernism and postmodernism. What initially seemed a straightforward, even simple breakdown has proved (not unpredictably, I now see) a lot more complex but also a lot more interesting. What I ultimately post may end up being the more straightforward version or possibly a two-part, lengthier consideration. One way or another, I’m quite enjoying going down this particular rabbit hole.

So in the meantime, in the interests of not letting this series lag too much, I’m leapfrogging to what was to have been part four.

My doctoral dissertation was on conspiracy theory and paranoia in postmodern American literature and culture. Towards the end of my defense, one of my examiners—a modernism scholar with a Wildean talent for aphoristic wit—asked “Is modernism postmodernism’s conspiracy theory?”

I should pause to note that I thoroughly enjoyed my thesis defense. This was largely because my examiners really liked my thesis, and so instead of being the pressure cooker these academic rites of passage can sometimes be, it was three hours of animated and lively discussion that the defense chair had to bring to a halt with some exasperation when we ran long. For all that, however, it was still three hours of intense scholarly back-and-forth, and so when my examiner asked about modernism as postmodernism’s conspiracy theory, my tank was nearly empty. I found myself wishing—and indeed gave voice to the thought—that the question had been asked in the first hour rather than the third, when I could have done it justice. I still kick myself to this day for not prefacing my response with a Simpsons reference (something the questioner would have appreciated), Reverend Lovejoy’s line “Short answer, yes with an if, long answer, no with a but,” and then getting into the intricacies and implications of the question. As it was, I seem to remember saying something insightful like “Um … sure?”

I’ve thought about that moment many times in the seventeen years (yikes) since I defended, mostly out of fondness for the questioner, who was and remains a friend, as well as annoyance with myself for not giving the question an answer it deserves (now that I think of it, perhaps I should include that in this series). But over the past few years, I’ve thought of it more in the context of how postmodernism itself—and its related but widely misunderstood concept “cultural Marxism”—have come to be treated as essentially conspiratorial in nature.

As I’ve alluded to in previous posts, there is a (mis)understanding of postmodernism among anti-woke culture warriors as something specifically created by Leftists for the purpose of attacking, undermining, and destroying the edifices of Western civilization, as variously manifested in Enlightenment thought, the U.S. constitution, the traditional Western literary canon, manly men, the virtues of European imperialism, and so on. Postmodernity, rather than being the upshot of unchecked corporate capitalism and consumer culture, is seen instead as being the specific invention of resentful and closeted Marxist academics.

Of late, which is to say over the past five years, the most vocal purveyor of this conspiracy theory has been Jordan B. Peterson and his figuration of “postmodern neo-Marxism.” Anyone who knows even the slightest thing about either postmodernism or Marxism understands that, in this formulation, it is the word “neo” doing the heavy lifting—which perhaps betrays at least a slight understanding on Peterson’s part that there can be no such thing as “postmodern Marxism,” as the two terms are very nearly antithetical. Marxism is a modernist philosophy rooted in Enlightenment thought; what I’ve been loosely calling “postmodern thought,” which is to say the broad categorization of theories and philosophies arising largely out of the need to make sense of the postmodern condition, is generally antagonistic to the instrumental reason of the Enlightenment and such totalizing ideologies as Marxism, taking its philosophical leads instead from Friedrich Nietzsche, Edmund Husserl, and Martin Heidegger. So if you’re going to attempt to conflate Marxism with postmodernism, it’s going to have to be very neo- indeed.

Peterson’s basic premise is that the “two architects of the postmodernist movement”[1]—specifically, the French theorists Michel Foucault and Jacques Derrida[2]—were themselves dyed-in-the-wool Marxists; but that when they were making their academic bones in the 1960s, they were faced with the unavoidable failure of Communism as a political force.  In one of his ubiquitous YouTube lectures, he declares that “in the late 60s and early 70s, they were avowed Marxists, way, way after anyone with any shred of ethical decency had stopped being Marxist.”

The “postmodernists,” Peterson continues, “knew they were pretty much done with pushing their classic Marxism by the late 60s and the early 70s,” because the evidence of Stalin’s atrocities was by then so unavoidable that carrying on under the Marxist banner was untenable.

This assertion, as it happens, is laughably untrue: the Communist Party of France reliably garnered twenty percent of the legislative vote through the 1960s and 70s. There was no shortage of people, inside the university and out, who were “avowed” Marxists. There was really no reason Derrida and Foucault—who incidentally hated each other, so were hardly co-conspirators—would have been compelled to disguise their Marxist convictions. If indeed they had any: it is an irony that Peterson can make this argument in part because neither Derrida nor Foucault were avowed Marxists. They have, indeed, often been looked upon with suspicion by Marxist scholars, and frequently castigated (Derrida especially[3]) for precisely the reasons I cited above: namely, that they eschewed the principles of Marx’s teleological philosophy and an extrinsic historical order. They were, to coin an expression, a little too “postmodern” for Marxists’ tastes.

Though Peterson is by no means voicing an original idea—the charge that “cultural Marxists” comprise a shadowy cabal of professors seeking to destroy Western civilization was first articulated in the early 1990s—he does imbue his attack with his own uniquely greasy brand of ad hominem logic familiar to anyone who has taken issue with his many transphobic screeds. See if you can spot the code words:

Foucault in particular, who was an outcast and a bitter one, and a suicidal one, and through his entire life did everything he possibly could with his staggering I.Q. to figure out every treacherous way possible to undermine the structure that wouldn’t accept him in all his peculiarity—and it’s no wonder, because there’d be no way of making a structure that could possibly function if it was composed of people as peculiar, bitter, and resentful as Michel Foucault.

Michel Foucault, for those unfamiliar with him, was queer; much of his work was preoccupied with the ways in which people marginalized and stigmatized by their sexuality were policed by society, and the ways in which that policing—that exercise of power—was effected discursively through the designations of mental illness (Madness and Civilization, 1961), the disciplining of society via surveillance (Discipline and Punish, 1975), and the ways in which the categorization of sexual identities exemplifies the normative determination of the self (The History of Sexuality, four volumes, 1976, 1984, 1984, and 2018).

Peterson’s characterization of Foucault is, in this respect, frankly vile—as is his description of Foucault when he first introduces him into his discussion: “A more reprehensible individual you could hardly ever discover, or even dream up, no matter how twisted your imagination.” His repetition of the word “peculiar” is an obvious dog-whistle, and he damns Foucault for being an “outcast,” “bitter,” and “suicidal,” as if Foucault’s “outcast” status as a queer man with a galaxy-sized brain was somehow a character flaw rather than a function of the strictures of a society he understandably took umbrage with.[4] Peterson might be the psychologist here, but I do sense a certain amount of animosity and revulsion that is not entirely directed at Foucault’s philosophy.

One way or another, the charge here is that Foucault and Derrida effectively invented postmodernism as a means of sublimating their doctrinaire Marxism into something more insidious and invidious, which would burrow into university humanities departments like a virus; still speaking of Foucault, Peterson says, “In any case, he did put his brain to work trying to figure out (a) how to resurrect Marxism under a new guise, let’s say, and (b) how to justify the fact that it wasn’t his problem that he was an outsider, it was actually everyone else’s problem.” Some fifty-odd years later, goes the Peterson narrative—which, again, is not specific to him, but prevalent among his fellow-travelers on the so-called “intellectual Dark Web”— we’re dealing with the harvest of what Derrida and Foucault sowed in the form of Black Lives Matter and critical race theory, trans people asserting their right to their preferred pronouns, cancel culture, and the general upending of what cishet white men perceive as the natural order of things.

We can argue over how we got to this moment in history, but the idea that the postmodern condition—or whatever we want to call the present moment—was orchestrated by a handful of resentful French intellectuals should be relegated to the same place of shame as most conspiracy theories. What I’ve been attempting to argue in this series of posts is that while thinkers like Foucault and Derrida have indeed profoundly influenced postmodern thought, they are not—nor are any of their acolytes—responsible for the cultural conditions of postmodernity more broadly. They have, rather, attempted to develop vocabularies that can describe what we’ve come to call postmodernity.[5]

NOTES


[1] The substance of Peterson’s conspiracy theory of postmodernism that I quote here comes from an invited lecture he delivered at the University of Wisconsin in 2017, posted to YouTube (the relevant bit starts at 29:30).

[2] Though Peterson acknowledges that Foucault and Derrida aren’t the only two masterminds of postmodern thought, they are, to the best of my knowledge, the only two he ever really talks about. I find it odd that, as a professor of psychology and practicing psychologist, he never (again, to the best of my knowledge) deals with the work of psychoanalyst Jacques Lacan, whose poststructuralist adaptations of Freud are almost as influential as the work of Foucault and Derrida. Given how consistently he gets wrong the basic premises of Foucault and Derrida—but especially Foucault—the absence of Lacan from his diatribes strikes me as further evidence of the poverty of his understanding of the very issues he addresses.

[3] Derrida never tipped his ideological hand one way or another until his book Specters of Marx (1994), which he wrote in the aftermath of the collapse of the Soviet Union and the putative death of Communism. In this book he does his deconstructivist schtick, playing around with the trope of the “spectre,” talking a lot about the ghost of Hamlet’s father, and sort of admitting “Yeah, I was always a Marxist.” Some Marxist thinkers were overjoyed; many more were decidedly unimpressed, deriding him as a Johnny-come-lately only willing to assume the Marxist mantle as he sat among what he assumed were its ruins. In an essay bitterly titled “Marxism without Marxism,” Terry Eagleton wrote: “it is hard to resist asking, plaintively, where was Jacques Derrida when we needed him, in the long dark night of Reagan-Thatcher,” and goes on to say, “there is something rich … about this sudden dramatic somersault onto a stalled bandwagon. For Specters of Marx doesn’t just want to catch up with Marxism; it wants to outleft it by claiming that deconstruction was all along a radicalized version of the creed.”

[4] For those familiar with Peterson, this is consonant with his worldview, especially with respect to his anti-transgender animus. He is an unreconstructed Jungian, which is to say he believes fervently in a sense of the biological imperatives of mythology—that all of our stories and narratives, our societal customs and traditions, are dictated by our most elemental relationships to nature. Hence his weird grafting of pseudo-Darwinian evolutionism onto myths of all stripes, from the Bible to ancient Egypt to the Greeks, and how these innate understandings manifest themselves in the popularity of Disney princesses or the necessary disciplinary presence of the bully Nelson in The Simpsons (seriously). The bottom line is that Peterson’s worldview is predicated on a sense of the innate, immutable nature of gender, gender roles, hierarchies, and the individual as the hero of his own story. Feminism, in this perspective, is a basic betrayal of human nature; people identifying as a gender other than the one their genitalia dictate? Well, that’s just beyond the pale.

[5] The opacity of those vocabularies, especially with regards to Derrida and Lacan, is another example of just how complex the postmodern condition is. Old joke from grad school: What do you get when you cross the Godfather with a poststructuralist? Someone who’ll give you an offer you can’t understand.


Tolkien and the Culture Wars, Part Two: Thoughts on the Tolkien Society Seminar

[PROGRAMMING NOTE: I have not abandoned my “Remembering Postmodernism” series—I’m just having problems putting the next installment into a coherent form that doesn’t run into Tolstoy-length rambling. In the interim, please enjoy this follow-up to my Tolkien and the culture wars post.]

I do so love serendipity. I would go so far as to say it is what most frequently inspires me in my research and writing. Unfortunately, that tends to make my primary research methodology not at all unlike the dog from Up.

This fall, I’ll be teaching a fourth-year seminar on the American Weird. In the original version of the course, which I taught three years ago, we looked at H.P. Lovecraft’s influence on American horror and gothic, and we considered as well how contemporary texts engaged specifically with Lovecraft’s hella racist tendencies—particularly with regard to Victor LaValle’s novella The Ballad of Black Tom, a retelling of Lovecraft’s story “The Horror at Red Hook” (one of his most explicitly racist stories) from the perspective of a Black protagonist, and Matt Ruff’s then recently published Lovecraft Country.

Since then, Lovecraft Country was adapted to television by HBO under the imprimatur of Jordan Peele, and the magnificent N.K. Jemisin published an overtly Lovecraftian novel, The City We Became. So this time around, we’re going all in on the issue of race and genre in “the American Weird,” not least because the current furor over “critical race theory” in the U.S. makes such a course at least a little timely.

But of course you’ve read the title of my post and might be wondering where the serendipity comes in with the Tolkien Society’s recent seminar. What on earth does Tolkien have to do with Lovecraft, or questions of race and identity in American literature?

It has to do with the issue of genre and the ways in which genre has come to be both a metaphor and a delineation of community, belonging, and exclusion. Genre has always been about drawing lines and borders: as Neil Gaiman has noted, genre is about letting you know what aisles of the bookstore not to walk down. But in the present moment, we’ve become more alert to this exclusionary function of genre, something that Lovecraft Country—both the novel and the series—took as its principal conceit, taking H.P. Lovecraft’s racist legacy and redeploying it to interrogate the ways in which genre can be opened up to other voices, other identities.

The greatest joy of participating in this Tolkien Society seminar was seeing precisely this dynamic in action—in seeing how an international and diverse cadre of Tolkien scholars found themselves within his works. As I said in my previous post, the backlash against the very idea of a “Tolkien and Diversity” conference employed the rhetoric of invasion and destruction. A post to a Tolkien message board epitomizes quite nicely the zero-sum sensibility I discussed:

The very first paper of the Tolkien Society seminar this weekend was titled “Gondor in Transition: A Brief Introduction to Transgender Realities in The Lord of the Rings.” As you might imagine, the very idea that there could be transgender characters in LOTR has been anathema to the various people slagging the seminar over social media. Tolkien’s world view did not allow for trans identity! has been one of the squawks heard from complainants. Leaving aside the fact that gender fluidity in Norse myth is something Tolkien would have been quite familiar with, this sort of complaint misses the point—which is that Tolkien’s work is voluminous enough to admit a multitude of readings he almost certainly did not have in mind as he composed his mythology.

I wish I could offer a useful precis of the paper, which was quite well done, but I was distracted by Zoom’s chat function and the avalanche of commentary scrolling up the side of my screen as I listened to the presentation—many of the comments being from trans and non-binary Tolkien enthusiasts expressing gratitude for a paper that made them feel seen in the context of Tolkien fandom and scholarship. This was actually the general tone of the two-day seminar: not people who, as the conference’s detractors have charged, look to destroy or desecrate Tolkien, but people who love his mythology and want to see themselves reflected within it. And who in the process demonstrated the capaciousness of Tolkien’s vision—one not limited, as the fellow cited above suggests, to the rigid circumscription of conservative Catholicism.

Genre is an interesting thing—less, from my perspective, for its more esoteric delineations within literary theory and criticism than for its blunter and cruder demarcations in popular culture. This latter understanding is where the real action has been for the past ten or twenty years, as the barriers walling off genre from the realms of art and literature have crumbled—with the advent of prestige television taking on the mob film, the cop procedural, and the western in The Sopranos, The Wire, and Deadwood, respectively; then proceeding to make fantasy and zombie apocalypse respectable with Game of Thrones and The Walking Dead; all while such “literary” authors as Cormac McCarthy, Margaret Atwood, Kazuo Ishiguro, and Colson Whitehead made their own forays into genre with The Road, Oryx and Crake, The Buried Giant, and Zone One, just as “genre” authors like Neil Gaiman, Alan Moore, Alison Bechdel and others attained “literary” reputations.

But as those walls have come down, so too have those genre enthusiasts of years past grown resentful of all the new people moving into their neighbourhoods. As I mentioned in my previous post, both the Gamergaters and Sad Puppies articulated a sense of incursion and loss—women and people of colour and queer folk invading what had previously been predominantly white male spaces. Like those attacking the Tolkien and Diversity seminar, they spoke in apocalyptic terms of things being ruined and destroyed and desecrated—SF/F would never be the same, they bleated, if these people had their way. What will become of our beloved space operas and Joseph Campbell-esque fantasy?

Well, nothing. They will still be there—fantasy in a neo-African setting rather than a neo-European one doesn’t obviate J.R.R. Tolkien or C.S. Lewis or any of the myriad authors they influenced. They’re still there, on your bookshelf or at the library or your local bookstore. However much we might use the metaphor of territory, it’s flawed: in the physical world, territory is finite; in the intellectual world, it’s infinite. Tolkien’s world contains multitudes, and can admit those people who see queer themes and characters and want to interrogate the possible hermaphroditic qualities of dwarves and orcs without displacing all its staunchly conservative readers.


Tolkien and the Culture Wars

We interrupt your regularly scheduled deep dive into postmodernism for a brief diversion into the latest anti-woke tempest in a teacup.

CORRECTION: when I started writing this post, I meant it as a “brief diversion.” It has grown in the writing and become decidedly non-brief. One might even call it very long. My apologies.

Because there was no way on earth casting Sir Ian McKellen as Gandalf could possibly queer that character.

The Tolkien Society was founded in 1969 with J.R.R. Tolkien himself as its president; like most scholarly societies, it has hosted annual conferences at which people present papers exploring aspects of Tolkien’s work and related works, and generally mingle to share their passion and enthusiasm for the subject. And like most such conferences, each year tends to have a different theme—looking over the society’s web page listing past conferences, we see such previous themes as “21st Century Receptions of Tolkien,” “Adapting Tolkien,” “Tolkien the Pagan?”, “Life, Death, and Immortality,” “Tolkien’s Landscapes,” and so on … you get the idea.

This year’s conference, to be hosted online on July 3-4, is on “Tolkien and Diversity.”

If this blog post were a podcast, I would here insert a needle-scratch sound effect to indicate the sudden, sputtering outrage of all the anti-woke culture warriors who, fresh from their jeremiads about Dr. Seuss and perorations about critical race theory, are now charging that the woke Left has come for Tolkien because these SJWs will not rest until they have destroyed everything that is great and good about white Western civilization. Because, you know, a two-day conference in which people discuss issues of race, gender, and sexuality with regards to Tolkien’s work will destroy The Lord of the Rings.

If that all sounds snarky and hyperbolic, well, I’ll cheerfully concede to the snark, but I’m not overstating the reaction. Survey the Twitter threads on the subject, and you’ll see the phrase “Tolkien spinning in his grave”1 an awful lot, as well as the frequent charge—made with varying degrees of profanity and varying familiarity with grammar—that all of these people who want to talk about race or sexual identity in Tolkien are doing it because they hate Tolkien and seek to destroy this great author and his legacy. Well, you might protest, that’s just Twitter, which is an unmitigated cesspool; and while you wouldn’t be wrong about Twitter, these reactions exist on a continuum with more prominent conservative outrage. The default setting for these reactions is the conviction that people engaging with such “woke” discourse as critical race theory, or queer studies, or feminism for that matter, do so destructively—that their aim is to take something wholesome and good and tear it down, whether that be Western civilization or your favourite genre fiction.

I shouldn’t have to argue that this Tolkien conference, with its grand total of sixteen presenters, isn’t about to do irreparable damage to Tolkien’s standing or his legacy. Nor would reading some or all of the papers ruin LotR or The Hobbit for you. The idea that it might—that this is some form of desecration that indelibly sullies the Great Man’s work—is frankly bizarre. It is, however, a kind of thinking that has come to infect both fandom and politics. Six years ago I wrote a blog post about the Hugo awards and the conservative insurgency calling itself the “Sad Puppies,” who were attempting to hijack the nominating process to exclude authors and works they believed to be “message fiction” (“woke” not yet having entered the lexicon at that point), i.e. SF/F more preoccupied with social justice and identity politics than with good old-fashioned space opera and fur-clad barbarians. If I may be so gauche as to quote myself, I argued then what I’m arguing now:

… namely, that the introduction of new voices and new perspectives, some of which you find not to your taste, entails the wholesale destruction of what you love, whether it be gaming or SF/F. Anita Sarkeesian produced a handful of video essays critiquing the representation of women in video games. As such critiques go, they were pretty mild—mainly just taking images from a slew of games and letting them speak for themselves. Given the vitriol with which her videos were met, you’d be forgiven for thinking she’d advocated dictatorial censorship of the gaming industry, incarceration of the game creators, and fines to be levied on those who played them. But of course she didn’t—she just suggested that we be aware of the often unsubtle misogyny of many video games, that perhaps this was something that should be curtailed in the future, and further that the gaming industry would do well to produce more games that female gamers—an ever-growing demographic—would find amenable.

This is the same kind of thinking that had people wailing that the all-women reboot of Ghostbusters somehow retroactively ruined their childhoods, or that The Last Jedi was a thermonuclear bomb detonated at the heart of all of Star Wars.2 It’s the logic that leads people to sign a petition to remake the final season of Game of Thrones with different writers, as if such a thing were even remotely feasible.

By contrast, while I loved Peter Jackson’s LotR trilogy, I thought that his adaptation of The Hobbit was an unmitigated trio of shit sandwiches—not least because it displayed precisely none of the respect Jackson had previously shown for the source material. And yet, aside from inevitable bitching about how terrible the films were, when I teach The Hobbit in my first-year class this autumn, I do not expect to be hobbled by the trauma of having seen a work I love treated so terribly. The Hobbit will remain The Hobbit—and nothing Peter Jackson can do on screen or an overeager grad student can do at a conference will change that.

Conservatives love to harp on about the entitled Left and “snowflake” culture, but the current culture war pearl-clutching—when it isn’t merely performative posturing by Republicans looking to gin up their base (though also when it is that)—comprises the very kind of entitlement they ascribe to the people they’re vilifying. When the Sad Puppies try to crowd out authors of colour and queer voices at the Hugos; when Gamergaters attack female gamers; when a Twitter mob drives Leslie Jones off social media in retaliation for her role in the “female” Ghostbusters; when Anita Sarkeesian has to cancel an invited talk due to bomb threats; or when an innocuous conference on diversity in Tolkien—no different from the dozens that have been held on “queering Shakespeare” or masculine fragility in Moby Dick—excites the ire of those declaring that such an abomination somehow desecrates Tolkien’s legacy, what is being articulated is an innate sense of ownership. This is mine, the reactionaries are saying. How dare you trespass on my territory. The zero-sum sensibility is one in which any change, any new voice, any new perspective that is not consonant with the way things have always been, is perceived as a loss. Anything challenging the mythos built up around an idea—be it the acknowledgement of systemic racism in America’s history or misogynist subtext in the Narnia Chronicles—is a betrayal of that mythos.

***

But we should also talk more specifically about the conference itself, as well as the counter-conference quickly thrown together by “The Society of Tolkien,” a group formed specifically in defiance of the incursion of “wokeness” into Tolkien studies.3

Ideally, I want to put this in perspective, and make clear why those people losing their shit really need to do a deep knee bend and take a fucking breath.

To be completely honest, this entire kerfuffle over “woke” Tolkien might have never showed up on my radar had I not been tagged in a general invitation to join my friend Craig and his partner Dani’s weekly online “office hours.” Craig and Dani are both professors in Philadelphia; they host discussions about a variety of topics, mostly dealing with American conservatism and anti-racism, and they focused this particular discussion on the backlash against the Tolkien and Diversity conference. Among those joining were members of the Tolkien Society, including some people presenting at the upcoming conference.

It was a lovely discussion, and since then I’ve been inducted into the Alliance of Arda—which is to say, I requested to join the Facebook group, and was accepted. There has been, as you could well imagine, a lot of discussion about the “Tolkien and Diversity” conference, particularly in regards to the backlash against it. This blog post came together in part because of my participation in those discussions.

And before I go further, I just need to make something clear: these lovely people who are enthusiastically participating in this conference are not resentful SJWs looking to take Tolkien down a peg or five. They are not haters. They do not want to cancel Tolkien. These are people who love Tolkien and his works. They would be as distressed as their detractors if Tolkien went out of print or was eliminated from curricula.

But they are looking to expand the ways in which we read and understand Tolkien, and in a zero-sum perspective … well, see above.

The original call for papers (CFP) for the conference is prefaced as follows:

While interest in the topic of diversity has steadily grown within Tolkien research, it is now receiving more critical attention than ever before. Spurred by recent interpretations of Tolkien’s creations and the cast list of the upcoming Amazon show The Lord of the Rings, it is crucial we discuss the theme of diversity in relation to Tolkien. How do adaptations of Tolkien’s works (from film and art to music) open a discourse on diversity within Tolkien’s works and his place within modern society? Beyond his secondary-world, diversity further encompasses Tolkien’s readership and how his texts exist within the primary world. Who is reading Tolkien? How is he understood around the globe? How may these new readings enrich current perspectives on Tolkien?

For those unfamiliar with academic conferences, this is pretty standard stuff, and is followed by a non-exhaustive list of possible topics participants might pursue:

  • Representation in Tolkien’s works (race, gender, sexuality, disability, class, religion, age etc.)
  • Tolkien’s approach to colonialism and post-colonialism
  • Adaptations of Tolkien’s works
  • Diversity and representation in Tolkien academia and readership
  • Identity within Tolkien’s works
  • Alterity in Tolkien’s works

Again, this is pretty standard fare for a CFP. Nor is it at all outlandish in terms of the topics being suggested. Speaking as both a professional academic and someone who first read Tolkien thirty-seven years ago, I can think of a half-dozen ways into each of those suggestions without really breaking a sweat.

But of course, the CFP was only the amuse bouche for the backlash. The real meal was made of the paper titles. Here’s a sample:

  • “Gondor in Transition: A Brief Introduction to Transgender Realities in The Lord of the Rings”
  • “The Problem of Pain: Portraying Physical Disability in the Fantasy of J. R. R. Tolkien”
  • “The Problematic Perimeters of Elrond Half-elven and Ronald English-Catholic”
  • “Pardoning Saruman?: The Queer in Tolkien’s The Lord of the Rings”
  • “Queer Atheists, Agnostics, and Animists, Oh, My!”
  • “Stars Less Strange: An Analysis of Fanfiction and Representation within the Tolkien Fan Community”

None of these seem at all beyond the pale to me, but then again I’m a professor and have presented numerous conference papers at numerous conferences, and these kinds of titles are pretty standard. I can see where a Lord of the Rings enthusiast who never subjected themselves to grad school might think these titles absurd. True fact: academic paper titles are easily mocked, no matter the discipline, as they always employ a specialized language that tends to be opaque to those not versed in it.4

Case in point: I’m reasonably convinced that when former Globe and Mail contrarian and plagiarist Margaret Wente was at a loss for what aspect of liberal thought she would cherry-pick to vilify, she would comb the list of projects granted fellowships or scholarships by the Social Sciences and Humanities Research Council of Canada (SSHRC), and select the most arcane and convoluted project titles receiving taxpayer dollars and write one of her “And THIS is the state of academia today!” columns.

Emblematic of the Wente treatment was an article in the National Review by Bradley J. Birzer, a history professor and author of J. R. R. Tolkien’s Sanctifying Myth (2002). Writing about the upcoming Tolkien Society conference, he says,

While I have yet to read the papers and know only the titles for reference—some of which are so obscure and obtuse that I remain in a state of some confusion—let’s, for a moment, consider “Pardoning Saruman? The Queer in Tolkien’s The Lord of the Rings.” In what way is Saruman, an incarnate Maia angel, sent by the Valar to do good in Middle-earth (Saruman really fails at this), queer? Is he in love with himself? True, with his immense ego, he might very well have been. Is he in love with Orthanc? Perhaps, but there is nothing in the text to support this. Is he in love with Radagast the Brown? No, he considers him a fool. Is he in love with Gandalf the Grey? No, he’s jealous of Gandalf and had been from their first arrival in Middle-earth. Is he in love with his bred Orcs? Wow, this would be twisted. Is he in love with Wormtongue? If so, nothing will come of it, for the lecherous Wormtongue has a leering and creepy gaze only for Eowyn. And, so, I remain baffled by all of this. Nothing about a queer Saruman seems to make sense.

Let’s begin with his admission that he hasn’t read any of the conference papers—which is fair, considering that, as I write this post, the conference is several days away—which means that if any of the presenters are at all like me, they’re just starting to write the papers now. But … do you really want to dive in so deeply based solely on a title? There was a story frequently told of a professor I knew at Western University, my alma mater: in answer to a question he’d asked, a student raised his hand, and said, “Well, I haven’t read the book yet, but …” and then went on confidently for several minutes while the professor nodded along. When the student stopped talking, the professor paused, and said, “Yeah. Read the book.”

Birzer’s choose-your-own-adventure speculation on what the paper might be about is no doubt entertaining to his readers who think, like him, that the whole concept of a queer Saruman is absurd, full stop. But here’s the thing: you don’t know. You don’t know what the paper is going to argue. You don’t even know if, despite the fact that he’s the only named character in the title, the paper has anything to say about Saruman’s queerness. Leaving aside for a moment the fact that considering someone a fool or being jealous of them doesn’t obviate the possibility of sexual desire for them, or the idea that demi-gods are somehow asexual (the Greeks would have something to say about that), perhaps consider that the paper isn’t about specific characters being queer? The pre-colon title, after all, is “Pardoning Saruman.” If you handed me this title in some sort of conference-paper-writing reality TV show, in which I have a few hours to write an essay conforming to the title, my approach would actually be to talk about the mercy both Gandalf and Frodo show Saruman, as something subverting a masculinist approach to punitive practices. Queerness, after all, isn’t merely about sex, but about a constellation of world-views at odds with heteronormativity; that Birzer reduces it to a question of who Saruman wants to fuck reflects the impoverishment of his argument.

But who knows. Will the paper be good? Perhaps, though that actually doesn’t matter much. Academic conferences are not about presenting perfect, ironclad arguments; the best papers are imperfect and lacking, because those are the ones that excite the most discussion. Conference papers tend to be works in progress, embryonic ideas that the presenters throw out into the world to test their viability. At its best, academia is an ongoing argument, always trying out new and weird ideas. Some fall flat; some flourish.

At its worst, academia becomes staid and resistant to change. Which brings me to the “Society of Tolkien” and its counter-conference. Their CFP reads as follows: “When J.R.R. Tolkien created Middle-earth, he filled it with characters, themes, and dangers that leapt from the pages to intrigue, excite, and give hope to his readers. In these sessions, we’ll explore these concepts to celebrate all that makes his works stand the test of time and what we should take from them today.” Or, to translate: no theme at all. Suggested topics include:

  • Analysis of characters, situations, and linguistics in the books
  • Military doctrine and tactics portrayed in the books or movies
  • Themes, lessons, and allegories drawn from or used by Tolkien
  • Works influenced by Tolkien’s writing
  • Works which influenced Tolkien’s writing
  • Middle-earth history

So, in other words: stuff we’ve seen before numerous times. But what is slightly more interesting is the supplemental list of topics that will not be considered for the conference:

  • Concepts not included in Tolkien’s writing
  • The Black Speech of Mordor
  • General foolishness

Speaking as someone who has attended numerous academic conferences, I can attest to the fact that “general foolishness” is always welcome to break up the monotony. As to “concepts not included in Tolkien’s writing,” that’s a pretty difficult territory to police. As one of my new Tolkien friends points out, the word “queer” appears in both The Hobbit and LotR multiple times—not, perhaps, meant in the way we mean it today, but still—where precisely does one draw the line?

I’m particularly intrigued by the prohibition against “The Black Speech of Mordor.” Because … why? Are they concerned, as were the attendees at the Council of Elrond when Gandalf spoke the rhyme on the One Ring in the language of Mordor, that the very sound of the language is corrupting? Do the people of the “Society of Tolkien” want to prohibit any presentation that might end up being sympathetic to Sauron? Or are they, as I joked in Arda’s comment section, worried that a paper about “Black Speech” could be a Trojan horse sneaking in critical race theory to the conference?

I said it as a joke, but I’m by no means convinced that that was not the reasoning behind the prohibition.

***

CODA

When I was enrolling for my fourth year of undergraduate study, I saw a course in the calendar on Tolkien and C.S. Lewis. I signed up unhesitatingly: it was, as I joked at the time, the one course at the university where I could probably walk into the final exam without ever having been to class and still get an A. I did, however, have a little bit of anxiety: I had first read LotR when I was twelve, and reread it several times through my teenage years. I did not reread it for the first three and a half years of my undergrad, so inundated was I by all the new texts and authors and ideas that saturated my days. I was concerned, on returning to this novel that had taught me so profoundly how literature can have affect—how it can transform you on a seemingly molecular level—that I might now find LotR deficient or less than it was. Since starting my English degree, I’d had that transformative experience again numerous times: Annie Dillard and Toni Morrison, Gabriel García Márquez and George Eliot, Yeats and Eliot and Adrienne Rich, Salman Rushdie and Isabel Allende, Tom Stoppard and Bertolt Brecht and Ibsen, Woolf and Joyce and Faulkner, to say nothing of a raft of literary theory and criticism from Plato and Aristotle to Derrida and de Man. How would Tolkien measure up? And how, having taken classes on postcolonial literature and women’s literature, would I react to Tolkien’s more problematic depictions of race and gender?

Here’s the thing: I needn’t have worried. Yes, all the problematic stuff was there, but so was the deeper complexity of Tolkien’s world-building, the nuanced friendship of Sam and Frodo, the insoluble problem of Smeagol/Gollum, as well as Tolkien’s bravura descriptive prose that I appreciated in a way I could not when I was a callow teenager. There was also my greater appreciation of Tolkien the person. My professor posed a question to the class: why do you think Tolkien—scholar and translator of Norse and Teutonic sagas featuring such indomitable warriors as Beowulf—why did he, when he wrote his great epic, make diminutive halflings his heroes? Heroes who, she continued, proved doughtier than such Teutonic-adjacent heroes as Boromir?

Could it be, she suggested, that he was annoyed by the deployment of Norse and Teutonic myth by a certain 1930s dictator to bolster the fiction of a master race?

She then read to us Tolkien’s response to a Berlin publishing house that was interested in publishing a German translation of The Hobbit but wanted confirmation of Tolkien’s Aryan bona fides before proceeding. Tolkien’s response remains one of the most professorial go fuck yourselves I’ve ever encountered:

Thank you for your letter. I regret that I am not clear as to what you intend by arisch. I am not of Aryan extraction: that is Indo-Iranian; as far as I am aware none of my ancestors spoke Hindustani, Persian, Gypsy, or any related dialects. But if I am to understand that you are enquiring whether I am of Jewish origin, I can only reply that I regret that I appear to have no ancestors of that gifted people. My great-great-grandfather came to England in the eighteenth century from Germany: the main part of my descent is therefore purely English, and I am an English subject — which should be sufficient. I have been accustomed, nonetheless, to regard my German name with pride, and continued to do so throughout the period of the late regrettable war, in which I served in the English army. I cannot, however, forbear to comment that if impertinent and irrelevant inquiries of this sort are to become the rule in matters of literature, then the time is not far distant when a German name will no longer be a source of pride.

Your enquiry is doubtless made in order to comply with the laws of your own country, but that this should be held to apply to the subjects of another state would be improper, even if it had (as it has not) any bearing whatsoever on the merits of my work or its suitability for publication, of which you appear to have satisfied yourselves without reference to my Abstammung [ancestry].

All of which is by way of saying: Tolkien was a product of his nation and his era. He was a complex person, and would almost certainly blanch at the Tolkien Society’s current roster of conference papers.

But then, he’d probably blanch at many of the ones in years past too.

NOTES

1. Anyone even halfway familiar with J.R.R. Tolkien would tell you that he most likely started spinning in his grave the moment he was interred and hasn’t paused much since. Some authors are generally indifferent to people’s interpretations of their work and are happy to let fans, academics, and adaptors have their own ideas; others retain a proprietary sense of their work, and will respond to negative reviews or interpretations they think miss the mark of what had been intended, as well as being antagonistic to the idea of someone else adapting their work to stage or screen. Tolkien was quite firmly in the latter category, often engaging in lengthy correspondence with fans in which he corrected their understanding of The Lord of the Rings with his own, especially with regards to allegorical or symbolic readings (with few exceptions, Tolkien adamantly maintained that there was no symbolism or allegory in LotR, and no correspondence to contemporary world-historical events). He was also quite bearish on the idea of adapting The Lord of the Rings to film, even though there was an American project in the works in 1958; in his letters, Tolkien communicates his antipathy quite clearly. One can only imagine what he would have thought of the allusions to Middle-earth in the music of Rush and Led Zeppelin, or Ralph Bakshi’s vaguely hallucinogenic animated film (much beloved by the guardians of “classic” SF/F). Whether he would have appreciated Peter Jackson’s LotR trilogy is less certain, though the dilatory and overstuffed debacle of The Hobbit was surely responsible for seismic events in the vicinity of Tolkien’s grave. Compared to Radagast the Brown’s rabbit sled, King Thranduil’s war moose, Legolas constantly defying the laws of physics, and the dwarves’ battle with Smaug inside the Lonely Mountain (I could go on), a paper titled “’Something Mighty Queer’: Destabilizing Cishetero Amatonormativity in the Works of Tolkien” is small beer.

2. That there is so much hate levied against The Last Jedi, while Revenge of the Sith apparently gets a pass, baffles me.

3. As one commenter observed, the fact that some people split off to form “The Society of Tolkien” is such pure Monty Python that it truly puts the absurdity of all this into context.

4. And it is by no means limited to academia. On the day I wrote this post, I also drove my partner Stephanie, who is a talented guitarist, out to Reed Music in Mount Pearl so she could purchase a Fender Telecaster. Unlike Steph, I am utterly unmusical. As we drove, I idly asked her what the difference was between a Telecaster and a Stratocaster. She then proceeded, very animatedly, to descend into guitarist-speak for the next ten minutes. After a certain point, she looked at me and said, “You didn’t understand any of that, did you?” To which I replied, “No, about ten seconds in, you started to sound like the adults in Charlie Brown specials.” But as I write this, she’s in the next room playing “Sultans of Swing” on her new guitar, and, really, that’s a language I speak just fine.


Remembering Postmodernism Part Two: A Brief(ish) History of Postmodernity

And we’re back! To reiterate my caveat from my previous post, these discussions of postmodernism are filtered through my lens as an Americanist. Postmodernism is a global phenomenon, and while it is conditioned by American cultural and economic hegemony, its manifestations are various and culturally specific around the world. A significant element of the post-WWII world was the collapse of empires and the fraught process of decolonization, as well as the rise of the neo-imperialism of corporate capitalism. What we’ve come to call postcolonialism isn’t synonymous with postmodernism by any means, but on that Venn diagram there is more than a little overlap.

I reiterate this caveat in part because this post is probably the one in this series most specific to U.S. history. I have thus far offered a broad introduction to postmodernism and offered my specific understanding of what I consider its most basic unifying element; what I want to do here in this post is provide some historical context for the material circumstances that gave rise to the postmodern condition.

Are you sitting comfortably? Then we’ll begin.

Whenever I teach my second-year class on American Literature After 1945, I always begin the semester with a lecture in which I contextualize, historically, where the United States was at the end of the Second World War. And I begin with a question: where do you think the U.S. ranked globally in terms of the size and strength of its military in 1939? My students, to their credit, are always wary at this question—they know I wouldn’t ask it if the answer was obvious. So, obviously not #1 … but they’ve all grown up in a world (as indeed I have) in which American military might is vast, and the Pentagon’s budget is larger, at last accounting, than the next fourteen countries (twelve of which are allies) combined. So some brave soul will usually hazard a guess at … #5? No, I reply. Someone more audacious might suggest #10. But again, no. In 1939, I tell them, the U.S. ranked #19 globally in its military’s size and strength, with an army that was smaller than Portugal’s.

The point of this exercise, as you’ve probably gleaned, is to describe how the U.S. went from being a middling world power to a superpower in the course of six years—and how that fundamentally changed U.S. society and culture, and in the process reshaped the world.

There’s an episode of The West Wing from season three that centers on President Bartlet’s State of the Union Address; at one point speechwriter Sam Seaborn (Rob Lowe) talks about how a president’s words can inspire a nation to great achievements, citing FDR’s SOTU in 1940 in which he predicted that the country would build 50,000 aircraft by the end of the war—which proved incorrect, as the U.S. actually built over 100,000.

While Roosevelt’s shrewd leadership of the U.S. through WWII should not be discounted, Sam’s historical factoid has more to do with propping up the Aaron Sorkin Great Man Theory of History: Oratorical Edition™. But this historical actuality does offer a useful thumbnail sketch of the sheer size and scope of the United States’ industrial capacity when totally mobilized. Consider the haunting images of planes and ships mothballed after the war, all of which had been churned out by American factories and which were now the detritus of a massive war effort.

Airplane boneyard in Ontario, California
Mothballed Pacific fleet in San Diego, California

The U.S. emerged from the devastation of the war with its industrial infrastructure intact, unlike those of its allies and enemies: Britain’s had been badly damaged, France’s suffered from the fierce fighting after the Normandy landings, and both Germany and Japan had basically been pounded flat by Allied bombing; the Soviet Union had had to move its factories far enough east to be out of the range of the Luftwaffe, while also suffering 10 million military and 14 million civilian deaths, more than any other combatant nation by a wide margin (the U.S. by contrast lost 418,500, only 1700 of which were civilians). [EDIT: as an historian friend of mine pointed out, China actually suffered losses comparable to Russia’s between 1937-1945]

All of which meant that the United States had gone from being ranked nineteenth in military strength to number one with a bullet (pun intended), with the world’s only fully functioning industrial base of any note; meanwhile, the returning soldiers benefited from the G.I. Bill passed in 1944, which, among other things, provided for college tuition. The discharged veterans poured into colleges and vocational schools and emerged with degrees that positioned them to swell the ranks of white-collar workers and take advantage of other provisions of the G.I. Bill, such as one year of unemployment compensation, low-interest loans to start a business or a farm, and low-cost mortgages. This government investment led to the greatest expansion of middle-class prosperity in history.1

The industry that had been cranked up to eleven in the war did not sit idle; as Americans earned more, they spent more. The Big Three automakers switched from jeeps and tanks to cars for the newly suburban nuclear families. Factories changed over from military production2 to making refrigerators, washing machines, and a host of new appliances that, the rapidly expanding business of commercial advertising declared, all households needed. And somewhere in all of that, television made its presence known as sales of television sets grew exponentially: 7,000 in 1945, 172,000 in 1948, and 5 million in 1950; in 1950, twenty percent of households had a television set; by the end of the decade, ninety percent did.

Meanwhile, the wartime acceleration of technology facilitated the rapid growth of, among other things, air travel: the first commercially successful jetliner, the Boeing 707, took to the skies in 1954.

But, you might ask, what does all this have to do with postmodernity? Well, everything … what I’ve been describing is a series of paradigm shifts in travel and communication technology, as well as the United States’ transition from industrial capitalism to full-bore consumerism. These changes were not just material, but symbolic as well, as the U.S. fashioned from them a counter-narrative to Communism. Consumerism became understood as patriotic: the wealthier Americans grew, the more cars and appliances they bought, the more the American Dream could be held up as the obvious virtuous alternative to dour Soviet unfreedom.

This postwar period that comprises the 1950s and early 60s was the era, in the words of scholar Alan Nadel, of “containment”—a time that was marked, to oversimplify somewhat, by “the general acceptance … of a relatively small set of narratives by a relatively large portion of the population.”3 Or to put it even more simply, it was a time of pervasive public trust in American institutions, especially government and the military. Now, to be clear, “pervasive” doesn’t mean “total”—this was also, after all, the time of McCarthyism and the Red Hunts, as well as the first glimmers of cultural dissent that would influence the counterculture of the 1960s (most especially embodied in the Beat Generation), and also the snowballing insurgency of the Civil Rights Movement. But taken overall, the 1950s—for all its nascent and repressed anxieties—embodied a complacency facilitated by prosperity and the higher standard of living prosperity made possible. As Nadel argues in his book Containment Culture, this delimiting of narratives was about containment: containing women in their domestic spaces, containing men in the role of patriarchal breadwinner, containing fathers, mothers, and children in the bubble of the nuclear family, containing Black Americans within the confines of Jim Crow, containing ideological expression within apolitical art forms like abstract expressionism and critical methodologies like the “New Criticism” being practised in English departments across the country, and above all containing the Soviet cultural threat with the power of America’s example and its military threat with the example of America’s power.

It didn’t take long for the 1950s to be nostalgized and mythologized in popular culture: American Graffiti was released in 1973, and Happy Days first aired in 1974. Indeed, the 50s were mythologized in popular culture at the time, with shows like Leave it to Beaver and Father Knows Best embodying the utopian ideals of domesticity and the nuclear family. Even popular stories of youthful rebellion were stripped of any possible political content, perhaps most explicitly in the very title of Rebel Without a Cause (1955), or in Marlon Brando’s iconic line in The Wild One (1953): when asked what he’s rebelling against, he replies “Whaddya got?”

When Donald Trump made “Make America Great Again” his slogan, he was tacitly citing the 1950s as the apogee of America’s greatness.4 This was possibly the canniest of his various simplistic catch phrases, given that 1950s nostalgia has reliably proven attractive at such moments of cultural crisis or ennui as the early 1970s; but like all nostalgic figurations, it leaves a lot out. There were those during the 1950s who labelled the period the “age of anxiety,” though as I’ll delve into more deeply in my next post, it could also have been called the age of avoidance: the rhetoric and discourse of containment sought, among other things, to deflect from such new global realities as the threat of nuclear conflict and the haunting aftermath of the Holocaust.

One way of understanding the emergence of postmodernism—and this is indeed one of the primary arguments of Alan Nadel’s book—is as a product of the breakdown of containment, of the fracturing of a societal consensus in which a relatively large number of people accepted a relatively small number of narratives into a more chaotic and fragmented social order that was deeply suspicious of such narratives. Though this breakdown progressed over a series of shocks to the system—the assassination of JFK, the start of the Vietnam War and the concomitant rise of the anti-war Left, the growth and visibility of the Civil Rights Movement culminating in the traumatic murder of Martin Luther King, Jr., the assassination of Robert Kennedy, all of which unfolded on television screens across the nation—this fracture was seeded in the immediate aftermath of the Second World War. The war spawned a world that was more truly global than at any point in history. The world was simultaneously smaller and larger: smaller because of the instantaneous nature of televisual and telephonic communication, as well as the fact that one could travel around the world in a matter of hours rather than weeks or months; larger because of the ever-increasing volume of information available through new communications technology.

David Harvey, The Condition of Postmodernity (Blackwell, 1989)

The increasingly efficient and indeed instantaneous transmission of information facilitated the more efficient transmission of capital, which in turn facilitated the growth of transnational corporate capitalism, which by the time we reach the orgy of financial deregulation also known as the Reagan Administration becomes less about manufacturing and industry than the arcane calculus of stock markets. By this time, America’s postwar preeminence as the sole nation with functioning industrial infrastructure had been steadily eclipsed as the rest of the world rebuilt or, as in the case of China, Southeast Asia, and the Indian subcontinent, began to establish industrial bases of their own—facilitating the departure of blue-collar jobs as multinational corporations offshored their factories to whatever countries offered the lowest wages and best tax breaks.

This confluence of historical, economic, technological, and cultural circumstances created a profound sense of dislocation, a disruption of what Marxist theorist and literary scholar Fredric Jameson calls our “cognitive map.” Jameson is the author of Postmodernism, or The Cultural Logic of Late Capitalism (1991), arguably one of the definitive works theorizing postmodernism (as well as being exhibit A of why “postmodern neo-Marxism” is an incoherent concept, but more on that in a future post). He borrows the concept of cognitive mapping from urban theorist Kevin Lynch’s 1960 book The Image of the City; in Lynch’s telling, cities are comforting or alienating to the extent that one can form a “cognitive map” of the urban space, picturing oneself in relation to the city. Cities with a straightforward layout and visible landmarks (such as Toronto) are comforting; cities that are confusing and repetitive sprawls (such as Los Angeles) are alienating.5 Jameson adapts this concept to culture more broadly: how we see ourselves in relation to what he calls the “totality” of our culture and society is predicated on how much of it we can know. Someone living in a small town with no need to have intercourse with the larger society has a fairly straightforward cognitive map. Someone whose life and livelihood depends on circumstances well outside their control or understanding fares less well when they are negatively impacted by the offshoring of jobs or a financial meltdown that erases their retirement savings. By the same token, the small-town individual’s ability to maintain their cognitive map will be impacted by financial downturns, or even just by having cable TV and the internet.

We could well substitute “sense of reality” for “cognitive map.” Where we find ourselves—have indeed found ourselves for several decades now—is in a place where reality feels increasingly contingent and unstable. When I wrote my doctoral dissertation on conspiracy theory and paranoia in American postwar culture, my argument basically fell along these lines—that the dislocations of postmodernity make the certainties of conspiracism attractive. I defended that thesis seventeen years ago (egad); as evidenced by the 9/11 Trutherism, anti-Obama birtherism, and the current lunacy of QAnon, conspiracism has only metastasized since then.

I am, to be clear, leaving a lot of stuff out here. But my overarching point is that what we call “postmodernism” is no one thing. It is, rather, the product of all the stuff I’ve described, and more. When the Jordan Petersons of the world accuse postmodernism of positing that any and all interpretations of something are all equally valid, they’re leveling that charge against nerdy academics influenced by a handful of French theorists and philosophers who are not, in fact, saying any such thing; Peterson et al are, though they don’t seem to know it, railing against a cultural condition that has made certitude functionally impossible, and which promulgates an infinitude of interpretations through no fault of philosophers and theorists and critics doing their level best to find the language to describe this condition.

NOTES

1. To hearken back to my posts on systemic racism, Black Americans largely found themselves shut out of the G.I. Bill’s benefits, both because of explicit restrictions written in by Southern Democrats and because almost all banks simply refused to extend the low-interest loans and mortgages to Black veterans. While white America enjoyed the fruits of the postwar boom, accruing wealth that would be passed on to their children and grandchildren, the exclusion of Black America from the 1950s boom continues to contribute to the stark wealth inequality between whites and non-whites in America today.

2. Which is not to say, of course, that U.S. military production slackened much; even as the nation demobilized, it shifted focus to the new enemy, the U.S.S.R., and poured huge amounts of money into R&D for possible new conflicts. Indeed, it can be argued that the recessions of 1947 and 1950, resulting from the downturn of wartime production, were counteracted by the Truman Doctrine in 1947—the principle asserting America’s obligation to contain Soviet expansion globally—and the outbreak of the Korean War in 1950. Both events goosed the stock market with the promise of increased military production. By the end of Dwight D. Eisenhower’s presidency in 1961, he had grown so alarmed by the growth of military production and the power shared by the Pentagon and arms manufacturers that the architect of the D-Day landings said, in his valedictory address to the nation:

This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence—economic, political, even spiritual—is felt in every city, every State house, every office of the Federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society … we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.

3. Alan Nadel, Containment Culture (1995) p. 4.

4. The peculiar genius of the slogan “Make America Great Again” is precisely its ahistorical vagueness—playing upon the intuitive contradiction felt by people whose instinct is to think of America as the greatest nation in the world, but who cannot see evidence for that greatness in their day to day lives. Hence, it must be returned to that state of greatness; and though the slogan’s vagueness allows its adherents to imagine for themselves precisely when America was great, the most standard conservative default setting is the 1950s (allowing for the likelihood of the most nativist part of the MAGA crowd pining nostalgically for the 1850s). And indeed, the touchstones of Trump’s 2016 campaign promises—the return of well-paid factory jobs, higher wages for the working class, the tacit and sometimes explicit exclusion of people of colour, and the return to a masculine America in which women know their place—all hearkened back to the era of containment (eliding, of course, the 90% top marginal tax rate and the fact that working-class jobs could support a family because of the pervasiveness of unions). Trump’s promises were all about containment, the boxing-in and walling-off of Black, queer, and women’s voices, and containing America itself from the influx of immigrants, refugees, and Muslims, all of which envisions a very strictly contained understanding of what comprises “real” America.

5. Posthumous apologies to Kevin Lynch for so egregiously oversimplifying his nuanced discussions of urban space.


Remembering Postmodernism Part One: Contingent Realities

On rereading my introductory post in this series, I realized I forgot to include an important caveat about the particularity of my perspective. To wit: I am, academically speaking, an Americanist—which is to say my principal research and teaching focus is on contemporary American literature and culture. As a result, my understanding of everything I will be talking about will be filtered through that particular lens of expertise. Postmodernism is, by definition, a global phenomenon, and as I will discuss in my next post, globalism as we know it is, to a large extent, the product of a world war in which the United States emerged as the hegemon of the Western world.

But however much the U.S. put its stamp on what we call postmodernism and postmodernity, there are many iterations specific to other nations, other cultures, other histories—not least of which is the weird Venn diagram of postmodernism and postcolonialism. Or the equally weird 1980s feedback loop between American and Japanese postmodernism.

But for the sake of keeping things simple, though I’ll be gesturing to other such iterations of the postmodern, I’ll mostly be keeping to my wheelhouse. So please chime in with all the stuff I’m leaving out.

OK, so that being said, let’s recap: my introductory post was basically about the complexity of postmodernism and the fact that there is no one definition, and that there is no unifying thread linking all aspects of postmodernism.

So let me suggest my own theory of the unifying thread linking all aspects of postmodernism.

Very simply: the postmodern condition is one in which our tacit understanding is that language does not reflect reality, but that language creates reality.

Before I go further, let me stipulate that the phrase “tacit understanding” is doing the heavy lifting in that sentence: as I’ll elaborate below, there is a broad range of responses to my suggestion, not the least of which is outright hostility. An instinctive response from certain quarters to the idea that language “creates” reality would be to see it as confirmation that postmodern thought is all about absolute relativism and the denial of objective truth. As physicist and notorious anti-postmodernist Alan Sokal said in the 1990s, anyone suggesting that gravity was a social construct was welcome to walk out of his third-floor office window.

Except that neither I nor any “postmodern” thinker of any substance is suggesting that objective reality doesn’t exist, or that it only exists as something conjured from words, any more than the postmodern thinkers Sokal mocked believed we could all float up into the air if we just denied the existence of gravity or deleted the word from the dictionary. I find it’s useful in this discussion to make a distinction between “reality” and “actuality,” in which the latter is the world as it is, and the former is how we make it comprehensible to one another.1 No postmodernist thinker not holding court in a 3am dorm room blue with pot smoke seeks to deny the actuality of the world. We all inhabit bodies that feel pain and experience the sensoria of our immediate environments, and if we choose to step out of a third-floor window, it almost certainly isn’t to prove a point about social constructionism.

In fact, the word and concept of gravity is as good a starting point as any. The word “gravity” comes to us from the Latin gravitas, which means “seriousness” and was one of the ancient Roman virtues. It also could mean dignity or importance, or the moral rigor required to undertake a task of great importance. When you consult the Oxford English Dictionary, the first three entries are (1) “the quality of being grave”; (2) “Grave, weighty, or serious character or nature; importance, seriousness”; (3) “Weighty dignity; reverend seriousness; serious or solemn conduct or demeanour befitting a ceremony, an office, etc.” Only after wading through those three entries do you get to number four, which breaks down gravity’s meanings “in physical senses,” and only at number five do you arrive at “The attractive force by which all bodies tend to move towards the centre of the earth; the degree of intensity with which a body in any given position is affected by this force, measured by the amount of acceleration produced.”

The mathematical expression of Earth’s gravity is 9.8 m/s², which is your vertical acceleration should you jump out of a plane or Alan Sokal’s office window. A postmodernist consideration of gravity would not be skeptical of the fact that things fall, but rather would be preoccupied with how we understand it, and how the linguistic signifier “gravity” possesses meanings other than its mathematical rendering—pointing to its origins in seriousness and “weightiness” as a metaphor for matters of importance, and to the word’s shared root with “grave” and the semantic overlap there. One might also cite the history of how we’ve come to understand gravity, from Aristotle’s assertion that heavy objects fall because their element is of earth, and thus seek their natural state. Though the apple falling on Isaac Newton’s head is almost certainly apocryphal, it makes for a good story, even if his mathematics, for all that it described how gravity behaves, did not do much more than Aristotle to explain why it works. That had to wait for Einstein’s general theory of relativity and the understanding of gravity as the curvature of spacetime by objects with mass. And even now, physicists still scratch their heads over the relationship between gravity and quantum mechanics.
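(For the numerically inclined, here is a minimal back-of-the-envelope sketch of what that 9.8 m/s² actually buys you, assuming a roughly ten-metre drop from a third-floor window and ignoring air resistance. Both assumptions are mine, not Sokal’s.)

```python
import math

g = 9.8          # acceleration due to gravity, m/s^2
height = 10.0    # assumed height of a third-floor window, in metres

# Basic kinematics for a body falling from rest:
# distance fallen d = (1/2) * g * t^2, so t = sqrt(2d / g); impact speed v = g * t
fall_time = math.sqrt(2 * height / g)
impact_speed = g * fall_time

print(f"Fall time: {fall_time:.2f} s")            # roughly 1.4 seconds
print(f"Impact speed: {impact_speed:.1f} m/s")    # roughly 14 m/s, or about 50 km/h
```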

None of which questions the actuality that objects accelerate toward the earth at 9.8 m/s². What it does do is get us into language games that highlight the contingency of meaning, and a simple trio of facts that I always lead off with in my first-year classes: all language is descriptive; all language is metaphorical; all language is rhetorical. Which is to say, all language seeks to describe the world, it does so invariably through analogy, and it seeks to persuade. What language creates, as I said above, is a shared reality that at its best sharpens and clarifies our understanding of the actuality we all individually inhabit. Also, it’s fun: one of the great pleasures of reading a gifted poet or prose stylist is seeing the ways in which they can make you think of certain things anew by using language in challenging and novel ways. Heh, “novel” ways—see what I did there? By which I mean even the humble pun has the capacity to highlight the slipperiness of our shared vocabularies. “I don’t think you quite grasp the gravity of your situation” is a pun that has been used, among other places, in Star Trek and Doctor Who to refer to the fact that the seriousness of one’s circumstances relates specifically to the imminent danger of falling from a great height. My favourite line from Back to the Future is when Doc Brown, puzzled by Marty McFly’s constant use of the word “Heavy!”, finally demands whether something has gone wrong with gravity in the future. And of course, there’s the old chestnut that there is no gravity—the Earth just sucks.

But, I can hear some people protesting, maundering about the various meanings of gravity isn’t what we’re concerned with—what we’re concerned with is postmodernism’s denial of objective truth! Which is a big deal! And yes, it would be a big deal if that were indeed the case. The problem is that the word “truth” entails some significant gradations between straightforward facts in evidence and the capital-T Truths bound up in abstractions like justice, morality, and good and evil. Your average postmodernist has no quibble with facts in evidence, but takes issue with the notion of transcendent truths—such as a concept of absolute justice, or that evil exists outside of our capacity to characterize it semantically. Where people most commonly get postmodernism wrong is in characterizing it as a denial of actuality. One suggestion that has surfaced in a significant number of think-pieces over the past several years, that Donald Trump operates out of the “postmodern playbook,” insofar as he treats reality as fungible and truth as something subject to his own whim, is also a basic misapprehension.2 Postmodernism—or, more accurately, postmodern thought—isn’t about the denial of objective truth or actuality, but the interrogation of the premises and cultural assumptions on which the conceptions of capital-T objective Truths are based.

To return to my earlier assertion and its load-bearing words: when I say that “the postmodern condition is one in which our tacit understanding is that language does not reflect reality, but that language creates reality,” I’m not necessarily asserting that language actually creates reality. (As it happens, I find this understanding completely persuasive, but that’s just me). I grasp why this idea is anathema to many, many people, especially religiously devout people who are deeply invested in the assumption of transcendent Truths that exist beyond language. Relatedly, there is also the very long idealist tradition in Western thought and philosophy that is predicated on the basic idea that there is an objective, external Truth towards which we strive, with language as our principal vehicle in doing so.3 What I’m arguing is that the postmodern condition is one in which the prospect of language creating reality isn’t necessarily something that presents itself as such, but is rather a felt experience presenting in most cases intuitively as suspicion, fear, or just a general anxiety. It is usually not articulated specifically, except by nerdy academics like myself or in angry rejections of postmodern thought by other nerdy academics.

This is what I mean by “tacit understanding”: something bound up in a broader cultural condition in which the critical mass of information, the critical mass of media through which we access that information, a global economic system that is bewildering to literally everybody, technology that far outstrips the average individual’s capacity to understand it—and I could go on, but I’ll refer you back to my previous post’s bullet-points—have together rendered the language/reality relationship, shall we say, unmoored. This unmooring was not the specific creation of such anti-postmodernists’ bêtes noires as Jacques Derrida and Michel Foucault; it is, rather—as I will be delving into in future posts—the elemental lived experience of the cultural condition of postmodernity.

Or to put it more simply: you might reject with all your being the idea that language creates reality, but you live in a world so thoroughly suffused by consumerism, so fractured by the multiplicity of disparate media platforms, so atomized by digital culture, that your lived reality is one in which any access to what you consider objective truth is rendered at best deeply fraught and at worst impossible. Those who wonder at the lunacy of QAnon really need only understand this basic dimension of the postmodern condition: namely, that when people’s sense of reality becomes unmoored they will often latch onto epistemic systems that give them a sense of order (no matter how batshit that system might be). Conspiracy theorizing and conspiracy-based paranoia of course long predate the postmodern era, but they find particularly fertile ground in a situation where reality itself feels contingent and slippery.

By the same token, “the imagination of disaster”—as Susan Sontag called it in her classic 1965 essay—has become pervasive, either imagining extinction-level events (alien invasion, asteroid headed for Earth, etc.) that are ultimately averted after much destruction, and which re-establish a sense of order; or, increasingly, depicting post-apocalyptic scenarios in which civilization has collapsed and survivors navigate a new sparsely populated world. Both are fantasies of return: in the first case to a society in which we can have faith in our institutions, in the latter to a more elemental existence shorn of the distractions and trivialities of postmodern life. Indeed, I would argue (and have argued) that post-apocalyptic shows like The Walking Dead share DNA with fantasy like Game of Thrones, insofar as both zombie apocalypse and fantasy are inherently nostalgic, imagining as they do a return to premodern, pre-industrial worlds in which “objective reality” is boiled down to the immediacy of survival, whether in the face of attacking zombies, or the imperative of destroying the One Ring4 … in other words, something elemental and visceral and not subject to the seeming infinitude of mediations and contingencies of meaning manifest in the postmodern condition. As a number of thinkers have suggested, it is easier to imagine the destruction of contemporary civilization than how it might be fixed. Perhaps most notably, Fredric Jameson wryly observed that “It seems to be easier for us today to imagine the thoroughgoing deterioration of the earth and of nature than the breakdown of late capitalism,” and that “perhaps that is due to some weakness in our imaginations.”5

Perhaps it is a weakness of our imaginations, but one that, according to Jameson’s voluminous writings on postmodernism, is entirely understandable. In what is perhaps the definitive study of postmodernism, his 1991 book Postmodernism, or The Cultural Logic of Late Capitalism, Jameson argues that one of postmodernism’s key features is its sheer inescapability: the “prodigious new expansion of multinational capitalism”—which is more or less synonymous with the postmodern condition—“ends up penetrating and colonizing those very precapitalist enclaves (Nature and the Unconscious) which offered extraterritorial and Archimedean footholds for critical effectivity.”6 Or, to translate it into non-Jamesonian English, if you can’t get outside of something—physically, mentally, or otherwise—how can you effect a proper critique? How can you address something with which you are always already complicit?7

Keeping in mind that Jameson made that observation about “Archimedean footholds” in a book published in 1991, it’s safe to say, from our perspective thirty years later, that the situation he describes has only expanded by an order of magnitude with the advent of the internet and the current impossibility—short of decamping to live “off the grid” in the wilderness (itself a disappearing enclave)—of extricating oneself from the digital networks that now penetrate all aspects of life.

So how did we get here? Well, that’s my next post. Stay tuned.

NOTES

1. We find the same distinction in the philosophy of Immanuel Kant, who distinguished between the noumenon, or “the thing in itself” (das Ding an sich) and the phenomenon, or the thing as it appears to an observer. By the same token, psychoanalyst Jacques Lacan—who weirdly is entirely ignored by postmodernism’s detractors (who generally choose to train their fire on Derrida and Foucault)—distinguishes between the “Real” and the “Symbolic.” The Real aligns with actuality, our experience of the world; the Symbolic by contrast is the realm of language, and any translation of the Real into language takes it out of the realm of the Real and into the Symbolic. In other words, your personal experience is specific to you, but is ultimately incommunicable as such—to communicate it to others means translating it into the realm of the unreal, i.e. of language and our shared vocabularies.

2. The Trumpian world of “alternative facts” is not a product of the postmodern condition, but is more properly associated with authoritarianism—something Hannah Arendt, in passages much-quoted these past few years, asserted in The Origins of Totalitarianism: “The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist,” she writes, “but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.” She also said that “Before mass leaders seize the power to fit reality to their lies, their propaganda is marked by its extreme contempt for facts as such, for in their opinion fact depends entirely on the power of man who can fabricate it.”

3. As I’ll discuss in a future post, modernist art and literature were largely predicated on the premise that true art was about the accessing of reality—T.S. Eliot called this the “objective correlative,” the idea that the right combination of words and metaphors could conceivably access the fundamental truth of a given emotion. But for the modernists, “objective reality” was vanishingly difficult to touch; modernism’s principal revolt was against nineteenth-century and Victorian positivism, and its assumption that objective reality could be rendered unproblematically through the practices of realism.

4. There’s an article or possibly an entire book to be written about how the popularity of The Lord of the Rings in America was in part a function of antipathy to the early stirrings of postmodernity—particularly given that Tolkien was especially popular on college campuses in the 1960s, which, when you think about it, is more than a little counter-intuitive: a deeply conservative, essentially Catholic story took root amidst leftist radicalism. Even those people amenable to the rise of the New Left and the newly translated writings of Derrida and Foucault et al were inclined to wear buttons declaring “Frodo Lives!”, which suggests that postmodernist theory might have fascinated people, but the lived reality of postmodernity still inspired imaginative escape to worlds not ruled by moral ambiguity and contingent realities.

5. Fredric Jameson, The Cultural Turn: Selected Writings on the Postmodern, 1983-1998. p. 50.

6. Fredric Jameson, Postmodernism, or The Cultural Logic of Late Capitalism. p. 49.

7. Because I don’t want these blog posts to turn into novellas, I will just note quickly in passing that this very question sits at the heart of the arguments among those theorizing postmodernism. More doctrinaire Marxists like Jameson or Frankfurt School figures like Theodor Adorno (more on the Frankfurt School and “cultural Marxism” in a future post) assert that the postmodern condition obviates the possibility of substantive cultural critique. Other critics and theorists see fifth columnists: Linda Hutcheon argues at length that postmodernist art and literature weaponizes irony and parody in what she terms “complicitous critique,” while such thinkers of the Birmingham School as Dick Hebdige and Stuart Hall—who effectively invented what we now call “cultural studies”—argued that even within the worst excesses of a consumerist culture industry, artists carve out their own enclaves of resistance.


Remembering Postmodernism: Introduction

Before I get under way: why “remembering” postmodernism? Several reasons: back when I was an undergraduate, I came across a book about Canadian art titled Remembering Postmodernism. It struck me on two points: one, that the afterword was written by Linda Hutcheon, whose books and articles on postmodernist fiction had been my gateway drug into the topic; and two, that the title was pleasantly cheeky, considering that as far as I was concerned at the time, postmodernism was an ongoing thing.

Now, I never actually read the book, mainly because postmodernist art was—however interesting I found it—somewhat outside my wheelhouse. But the title stuck with me. Two years ago I taught a fourth-year course on American postmodernist fiction, which I titled “Remembering Postmodernism.” In that instance, the expression was a straightforward acknowledgement of postmodernism as an historical period: we started with Thomas Pynchon’s 1966 novel The Crying of Lot 49 and ended with Monique Truong’s The Book of Salt (2003), and a large part of our discussion of Truong’s novel was whether it was, in fact, postmodernist, or whether it represented whatever the next (as yet unnamed) historical phase was.

Now I’m doing these posts and contemplating whether it’s worthwhile to write a book called The Idiot’s Guide to Postmodernism, for the simple fact that there’s an awful lot of talk about postmodernism and postmodernists these days, and almost none of the people doing the talking seem to have the slightest clue what the term means. “Postmodernism” is, rather, a term of disapprobation that seems designed on one hand to vilify contemporary university humanities programs as irredeemably “woke,” and on the other to serve as a shorthand encompassing the varying iterations of “wokeness.”

Hence, as far as I’m concerned, we need to remember what postmodernism was and think about what it might still be. And remembering is a useful term insofar as it means both the calling to mind of things forgotten and the act of re-assemblage.

If I were to ask you what the opposite of “remember” is, you’d most likely say “forget,” for the simple reason that that is entirely correct. But in a more strictly linguistic and semantic way, the opposite of “remember” is “dismember.” Sometimes when we remember something, it’s a simple matter of that something springing into our mind—where we left our car keys, for instance, or the fact that today is someone’s birthday. The more laborious process of remembering, however, is one of re-membering, of finding those sundered scraps of the past and putting them together like Isis reassembling Osiris’ dismembered body.

And lest you think this post is just an excuse for me to nerd out over semantics (which, to be fair, it totally is—more of that to come), the fallibilities of memory and their relationship to how we conceive of history are quite germane to postmodern thought.

So, I want to do a deep dive on postmodernism over a series of posts. And to be clear, by “deep dive” I actually mean more of a SparkNotes-type run-through of its history, its basic premises, and what it means in the grander scheme of things. I’ve been studying and writing on postmodernism for the better part of my academic career, starting with my undergrad years, and there are libraries-worth of scholarship on the subject. So I’m hardly going to do much more in a handful of blog posts than offer the general contours.

Why, then, do I want to even bother? Two reasons: first, as I posted previously, because I want to use my blog this summer to work through my confused thoughts on a variety of issues. And second, because nobody honking off about it in the present moment—outside of those who have actually considered postmodernism within the confines of academe—seems to have any bloody clue what they’re talking about. As a case in point, I recently read an otherwise interesting article in The Bulwark, an online publication by anti-Trump conservative thinkers, on anti-democratic intellectuals of the new Right who opine that the tenets of “classical liberalism” in fact contain the seeds of tyranny. The article in question was a reasoned and persuasive defense of Enlightenment thought and its influence on the United States’ founding fathers. But then there was this paragraph:

Contemporary “political correctness” or “wokeness” comes from Marx and Nietzsche by way of the Postmodernists, not from John Locke or the Founding Fathers. A serious person would feel the need to at least attempt to trace some of that intellectual history and confront the ideological differences.

Yes, a serious person would feel the need to at least attempt to trace some of that intellectual history and confront the ideological differences—which the author of this article obviously did not do when throwing Marx, Nietzsche, and postmodernism into the same bucket and tacitly ascribing a causal line of influence. He is smart enough at least to cite Marx and Nietzsche—which is more than most people invoking the dreaded specter of postmodernism do—and obviously knows enough to understand that there is a relationship between those two thinkers and some aspects of postmodernism. In other words, there is a very general sense in which he isn’t wrong, but any number of specific senses in which everything about that statement betrays a profound ignorance of all three of its subjects—starting with the basic fact that Marxism and postmodernism are, if not strictly speaking antithetical, then extremely antagonistic.1

But what is postmodernism? Well, let’s start with the pervasive misapprehension that it constitutes a sort of absolute relativism—a contradiction in terms, yes, but one that ostensibly obviates the possibility of objective meaning. After all, if everything is relative to everything else, where is the capital-T Truth?

As with the paragraph I quoted above, this understanding gets some things right, but remains a basic misapprehension for the simple reason that there is no singular “postmodernism.” The rejection of absolutes, the fundamental skepticism about grand narratives, and the understanding of power not as something external to ourselves but bound up in the flux of discourse and language, are all key aspects of what I’ll be calling “postmodern thought.” Postmodern thought is what Jordan Peterson is referring to when he asserts that, for postmodernists, all interpretations of anything at all are equally valid. I will have occasion in future posts to explain why this is completely wrong (which doesn’t differentiate it from most of his assertions), but for my first few posts I want to distinguish between the various intellectual strands of thinking that emerged from postmodernism, and postmodernism as the cultural condition that gave rise to postmodern thought.

Because the one thing I want to clarify, even if I don’t manage to make anything else clear in these posts, is that postmodernism—or more specifically postmodernity—is above all other things a cultural condition emerging from a confluence of historical circumstances and contexts. The tacit understanding of postmodernism when deployed as a term of disapprobation is that it is a pernicious relativist mode of thinking that exerts a profound and deleterious influence on, well, everything; in the crudest and most conspiratorial version of this thinking, such conservative-enraging contemporary social movements as feminism, trans rights, Black Lives Matter (and the current bête noire, “critical race theory”), and a host of other “woke” causes can be traced back to a handful of French intellectuals in the late 1960s who invented postmodernism as a way of pursuing Marxism and the destruction of Western civilization by other means.2 And while the loose assemblage of schools of thought I’m calling “postmodern” has undoubtedly shaped contemporary attitudes—sometimes positively, sometimes not—my larger argument is that those schools of thought emerged in response to cultural conditions; what postmodernism’s detractors call “postmodernism” has usually been less a tool of cultural change than a series of attempts to find language to describe the dramatic cultural transformations of the post-WWII landscape. The cultural changes in question, from intersectional understandings of identity to the normalization of gay marriage, might have been facilitated in part by aspects of postmodern thought, but have been far more facilitated by the technological and economic transformations of postmodernity.

Which, once again, brings us back to the basic question of just what postmodernism is. As I say above, it is a lot of things, including but not limited to:

  • The cultural logic of late capitalism.
  • The breakdown of faith in such societal grand narratives as religion, governance, justice, science, etc.
  • A set of aesthetic practices in art, fiction, film, and architecture (among others) that reflect and articulate such breakdowns.
  • The rise of multinational corporate capital, and its transformation of the imperialist project from a national, colonial one to the subjugation of national interests to the global market.
  • The ascendancy of neoliberal free-market fundamentalism.
  • The snowballing of technology, especially communication technology—from television to the internet to social media—and the concomitant erosion of traditional informational gatekeepers (e.g. legacy media).
  • The inescapability of consumer culture and the culture industry.
  • A set of philosophical, theoretical, and critical attempts to adequately describe all of the above.

So … that clears things up, right? Just kidding—of course it doesn’t. But hopefully that starts to communicate the complexity of the subject.

I do want to make clear that this series of posts is not meant as an apologia for postmodernity, postmodernism, or postmodern thought. I have for many years studied postmodernism and written about it and taught it in university classrooms, but I would not call myself a postmodernist (though others, mostly the people who don’t really understand it, certainly would). My own thinking and philosophical inclinations bend toward pragmatism (the philosophical kind) and a sort of small-h humanism; my lodestars in this respect are Richard Rorty and Terry Pratchett. I passionately love some aspects of postmodern culture, mostly its aesthetic incarnations—the fiction of Toni Morrison and Don DeLillo, for example, or films like Blade Runner—and I find the theoretical armature defining postmodernism, whether it’s celebratory like that of Linda Hutcheon or Brian McHale, or damning like that of Fredric Jameson, fascinating; but huge swaths of what I’m calling postmodernity are pernicious and harmful (corporate capitalism and social media among them).

This series of posts is rather a quixotic attempt to re-inject some nuance into what has become an impoverished and overdetermined understanding of a bewilderingly complex concept. More importantly, it is my own way of working through my own thoughts on the matter. So bear with me.

NOTES

1. There is also the rather amusing thought of just how disgusted Nietzsche would be with “woke” sentiments.

2. More on this in my future post on “The Conspiracy Theory of Postmodernism.”


A Few Things

Summer Blogging Update     Several posts ago, I announced my summer blogging plans; I then dutifully followed up with a three-part series on memory, history, and forgetting in fairly quick succession. And then … well, nothing for two weeks. That’s largely due to the fact that my next series of posts, which will be a deep dive into postmodernism—what it is, what it was, what it isn’t, and why most of the people using the word in the media these days have no idea what they’re talking about—has been taking somewhat longer to write than I’d planned. Actually, that isn’t entirely true: the second and third installments are all but completed, and I’ve gotten a healthy start on the ones that follow, but the introductory post is taking longer (and is getting longer). I am optimistic that I will have it done by the end of the weekend, however, so hopefully Phase Two of My Ambitious Posts No One Will Read will soon commence.

What I’ve Watched     A few weeks ago I binged the series Rutherford Falls on Amazon Prime. I’d read some positive reviews of it and heard good things via NPR’s Pop Culture Happy Hour podcast (always a reliable guide); the show is also co-created by Michael Schur, a showrunner who, in my opinion, has been batting 1.000 for years: The Office, Parks and Recreation, Brooklyn Nine-Nine, The Good Place. But what’s particularly notable about Rutherford Falls is that its premise is rooted in the relationship between the titular town’s indigenous and non-indigenous townsfolk. The series was co-created by Sierra Teller Ornelas, a Navajo screenwriter and filmmaker; the writers’ room also includes a substantial number of Native American writers, and the cast is remarkable.

Ed Helms and Jana Schmieding.

Ed Helms plays Nathan Rutherford, a scion of the family that gives the town its name, and who is about the most Ed Helms character I’ve yet seen—a painfully earnest and well-meaning but frequently clueless fellow whose entire sense of self is bound up in his family’s history and that of the town. He runs a museum-ish space in his ancestral home. His best friend is Reagan (Jana Schmieding), a member of the (fictional) Minishonka Nation who is more or less persona non grata in her Native community because she left her fiancé at the altar a number of years before and instead went and did her master’s degree at Northwestern. She runs—or attempts to run—a “cultural center” inside the local casino, which is owned by Terry Thomas (Michael Greyeyes), the de facto leader of the Minishonka community.

The show is hilarious, but is also quite comfortable in its own skin (so to speak)—it engages with difficult issues of Native history, white entitlement and appropriation of Native culture, the memorialization of settler culture (the precipitating crisis is when the town’s African-American mayor seeks to remove a statue of the town’s founding Rutherford in the town square, not because it’s a symbol of colonialism, but because cars keep crashing into it), and the fraught navigation by Terry Thomas of American capitalism as a means of accruing power and influence for his people—all without ever losing its humour or coming across as pedantic. A key moment comes in the fourth episode, when an NPR podcaster, having sniffed out that there might be a story in this sleepy town, interviews Terry. He asks him how he reconciles the contradiction of running a casino—the epitome of capitalistic graft—with his own Native identity. Terry, who had been answering the podcaster’s questions with a politician’s practiced smoothness, reaches out and pauses the recorder—and then proceeds, in a stern but not-angry voice, to school the well-meaning NPR hipster on precisely how the casino is consonant with Native values because it is not about the accrual of wealth for wealth’s sake, but for the benefit of a community that has long been marginalized. He then hits play again, and resumes his breezy tone for the rest of the interview.

It is a bravura performance; indeed, Michael Greyeyes is the series’ great revelation. A Canadian-born actor of the Cree Nation, he has long been a staple in Canadian and American indigenous film, or film and television featuring indigenous characters, and has (at least in most of the stuff I’ve seen him in) tended to play stern, brooding, or dangerous characters. So it is a joy to see him show off his comedic talents. His best line? When arguing with Nathan Rutherford over the historical accuracy of a costume he wants him to wear in his planned historical theme park, he says in exasperation, “Our market research shows that the average American’s understanding of history can be boiled down to seven concepts: George Washington, the flag, Independence Day, Independence Day the movie, MLK, Forrest Gump, and butter churns.”

What I’m Currently Watching     Well, we just watched the season four finale of The Handmaid’s Tale last night, and I’m going to need a while to process that. And we watched the series premiere of Loki, which looks promising—largely on the basis that a Tom Hiddleston / Owen Wilson buddy comedy is unlikely to disappoint no matter what is done to it.

F. Murray Abraham, Danny Pudi, David Hornsby, Rob McElhenney, and Charlotte Nicdao

But the show we’re loving right now beyond what is strictly rational is the second season of Mythic Quest on AppleTV. For the unfamiliar: it’s the creation of Rob McElhenney, Charlie Day, and Megan Ganz, the folks who brought you It’s Always Sunny In Philadelphia; the series is a workplace comedy set in the offices of the studio behind a video game called Mythic Quest: Raven’s Banquet, an MMORPG along the lines of World of Warcraft or Elder Scrolls. McElhenney plays the game’s creator and mastermind Ian (pronounced “Ion”) Grimm, a self-styled visionary who is sort of a cross between Steve Jobs and a lifestyle guru. Charlotte Nicdao plays Poppy, the perennially anxious and high-strung lead engineer who is responsible for turning Ian’s (again, pronounced “Ion”) ideas into workable code. Danny Pudi, who most famously played Abed in Community, is Brad, the “head of monetization,” and is delightfully cast here as a kind of anti-Abed who knows little and cares less about pop culture and video gaming and is only concerned with how he can wring every last cent out of the game’s devoted players. And F. Murray Abraham—yes, Salieri himself—plays the game’s head writer, washed-up SF author C.W. Longbottom, a lascivious alcoholic whose sole claim to fame (to which he clings like a barnacle) is having won a Nebula award in the early 1970s.

That combination of characters alone is a selling point—but the show is also incredibly smart and, just when you least expect it, deeply emotional.

What I’m Reading     At this point in the early summer as I scramble to use my time to complete at least one article before September, as well as move my summer blogging project forward (to say nothing of starting on course prep for the Fall), the question feels a little more like what am I not reading.

But that’s all business. My pleasure reading of the moment is a trilogy by M.R. Carey. Carey wrote the brilliant zombie apocalypse novel The Girl With All the Gifts—which was made into a quite good film starring Glenn Close and Gemma Arterton—and its companion novel set in the same world, The Boy on the Bridge. One of my grad students this past semester alerted me to the fact that Carey had recently published a series set in another post-apocalyptic scenario. The “Ramparts Trilogy”—comprising the novels The Book of Koli, The Trials of Koli, and The Fall of Koli—isn’t really a trilogy per se. The novels were all released within several months of each other, suggesting that “Ramparts” was really more of a single 1,000-plus-page novel that the publisher chose to release in serial form rather than all at once.

Like the other Carey novels I’ve read, the Koli books are post-apocalyptic, set in an England some three to four centuries after humanity more or less obliterated itself in “the Unfinished War.” It is a world in which humanity has been reduced to a mere handful of its former numbers, and everything in nature now seems intent on killing the remaining survivors. Human manipulation of trees and vegetation to make them grow faster, part of an effort to reverse climate change, has resulted in hostile forests: only on cold or overcast days can anyone walk among the trees without being snared by their tendrils. Any wood cut has to be “cured” in saline vats to kill it before it can be used as lumber. And the animal life that has evolved to live in such an environment is equally dangerous.

Koli is, at the start of the novels, an earnest if slightly dim fifteen-year-old living in the village of Mythen Rood in the Calder Valley in the north of what was once England. His family are woodsmiths; Mythen Rood is protected by the Ramparts, villagers who can use the tech of the old world—a flamethrower, a guided bolt-gun, a “cutter” that emits an invisible cutting beam, and a database that offers up helpful but cryptic knowledge and information.

There is much tech left over from the old world, but most of it is useless. Koli’s story essentially begins when he becomes obsessed with “waking” a piece of tech and joining the ranks of the Ramparts.

And that’s all I’ll tell you of the story: suffice to say, with one-third of the last novel to go, I am captivated by both the story Carey tells and the potential future he envisions.


History, Memory, and Forgetting Part 3: The Backlash Against Remembering

“The struggle of man against power is the struggle of memory against forgetting.”

—Milan Kundera, The Book of Laughter and Forgetting

I had drafted the first two installments of this three-part series of posts and was working on this one when the news broke of the discovery of the remains of 215 indigenous children on the grounds of a former residential school in BC. I paused for a day or two, uncertain of whether I wanted to talk about it here, in the context of my broader theme of history, memory, and forgetting. Part of my hesitation is that I honestly lack the words for what is a shocking but utterly unsurprising revelation.

I must also confess that, to my shame, one of my first thoughts was the dread certainty that we’d soon be treated to some tone-deaf and historically ignorant column from the likes of Rex Murphy or Conrad Black or any one of the coterie of residential school apologists. So far, however, the usual suspects seem to be steering clear of this particular story; possibly the concrete evidence of so much death and suffering perpetrated by white turnkeys in the name of “civilizing” Native children is a bridge too far even for Murphy and Black et al’s paeans to Western civilization and white Canada’s munificence.

What I’m talking about in this third post of three is remembering as a political act: more specifically, of making others remember aspects of our history that they may not want to accept or believe. Scouting as I did for anything trying to ameliorate or excuse or explain away this evidence of the residential schools’ inhumanity,[1] I found my way to a 2013 Rex Murphy column that might just be the epitome of the genre, as one gleans from its title: “A Rude Dismissal of Canada’s Generosity.” In Murphy’s telling, in this column as in others of his rants and writings, conditions for Native Canadians in the present day are a vast improvement over historical circumstances, but the largesse of white Canadians and the government is something our indigenous populations perversely deny at every turn. He writes: “At what can be called the harder edges of native activism, there is a disturbing turn toward ugly language, a kind of razor rhetoric that seeks to cut a straight line between the attitudes of a century or a century and a half ago and the extraordinarily different attitudes that prevail today.”

“Attitudes” is the slippery word there: outside of unapologetically anti-indigenous and racist enclaves, I doubt you’d have difficulty finding white Canadians who would piously agree that the exploitation and abuse of our indigenous peoples was a terrible thing. You’d have a much harder time finding anyone willing to do anything concrete about it, such as restoring the land we cite in land acknowledgments to its ancestral people. Attitudes, on balance, have indeed improved, but that has had little effect on Native peoples’ material circumstances. And in his next paragraph, Murphy seems intent on demonstrating that not all attitudes have, in fact, improved:

From native protestors and spokespeople there is a vigorous resort to current radical jargon—referring to Canadians as colonialist, as settlers, as having a settler’s mentality. Though it is awkward to note, there is a play to race in this, a conscious effort to ground all issues in the allegedly unrepentant racism of the “settler community.” This is an effort to force-frame every dispute in the tendentious framework of the dubious “oppression studies” and “colonial theory” of latter-day universities.

And there it is—the “radical jargon” that seeks to remember. Referring to Canadians as colonialist settlers isn’t radical, nor is it jargon, but is a simple point of fact—and indeed, for decades history textbooks referred to settlers as brave individuals and the colonizing of Canada as a proud endeavour, necessarily eliding the genocidal impact on the peoples already inhabiting the “new world.” Murphy’s vitriol is, essentially, denialism: a denial that our racist and oppressive history lingers on in a racist present. He speaks for an unfortunately large demographic of white Canada that is deeply invested in a whitewashed history, and reacts belligerently when asked to remember things otherwise.

This is a phenomenon we see playing out on a larger scale to the south, most recently with a substantial number of Republican-controlled state legislatures introducing bills that would forbid schools from teaching any curricula suggesting that the U.S. is a racist country, that it has had a racist history, or really anything that suggests racism is systemic and institutional rather than an individual failing. The bogeyman in much of the proposed legislation is “critical race theory.” Like Rex Murphy’s sneering characterization of “latter-day universities” offering degrees in “oppression studies” (not actually a thing), critical race theory is stigmatized as emerging from the university faculty lounge as part and parcel of “cultural Marxism’s” sustained assault on the edifices of Western civilization.[2] While critical race theory did indeed emerge from the academy, it was (and is) a legal concept developed by legal scholars like Derrick Bell and Kimberlé Crenshaw[3] in the 1970s and 80s. As Christine Emba notes, “It suggests that our nation’s history of race and racism is embedded in law and public policy, still plays a role in shaping outcomes for Black Americans and other people of color, and should be taken into account when these issues are discussed.” As she further observes, it has a clear and quite simple definition, “one its critics have chosen not to rationally engage with.”

Instead, critical race theory is deployed by its critics to connote militant, illiberal wokeism in a manner, to again quote Emba, that is “a psychological defense, not a rational one”—which is to say, something meant to evoke suspicion and fear rather than thought. The first and third words of the phrase, after all, associate it with elite liberal professors maundering in obscurantist jargon, with which they indoctrinate their students into shrill social justice warriors. (The slightly more sophisticated attacks will invoke such bêtes noires of critical theory as Michel Foucault or Jacques Derrida[4]).

But again, the actual concept is quite simple and straightforward: racism is systemic, which should not be such a difficult concept to grasp when you consider, for example, how much of the wealth produced in the antebellum U.S. was predicated on slave labour, especially in the production of cotton—something that also hugely benefited the northern free states, whose textile mills profitably transformed the raw cotton into cloth. Such historical realities, indeed, were the basis for the 1619 Project, the New York Times’ ambitious attempt to reframe American history through the lens of race—arguing that the true starting point of America was not the Declaration of Independence in 1776, but the arrival of the first African slaves on American soil in 1619.

The premise is polemical by design, and while some historians took issue with some of the claims made, the point of the project was an exercise in remembering aspects of a national history that have too frequently been elided, glossed over, or euphemized. In my previous post, I suggested that the forgetting of the history of Nazism and the Holocaust—and its neutering through the overdeterminations of popular culture—has facilitated the return of authoritarian and fascistic attitudes. Simultaneously, however, it’s just as clear that this revanchist backsliding in the United States has as much to do with remembering. The tearing down of Confederate monuments that has inspired such a reactionary crouch isn’t about “erasing history,” but about remembering it properly: remembering that Robert E. Lee et al were traitors to the United States and were fighting first and foremost to maintain the institution of slavery. Apologists for the Confederacy aren’t wrong when they say that the Civil War was fought over “states’ rights”; they’re just eliding the fact that the principal “right” being fought for above all others was the right to enslave Black people. All one needs to do to clarify this particular point is to read the charters and constitutions of the secessionist states, all of which make the supremacy of the white race and the inferiority of Africans their central tenet. The Confederate Vice President Alexander Stephens declared that the “cornerstone” of the Confederacy was that “the negro is not equal to the white man; that slavery—subordination to the superior race—is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”

Statue of Nathan Bedford Forrest in Memphis, Tennessee

Bringing down Confederate monuments isn’t erasure—it’s not as if Robert E. Lee and Nathan Bedford Forrest disappear from the history books because people no longer see their bronze effigies in parks and town squares—but the active engagement with history. That was also the case with the erection of such monuments, albeit in a more pernicious manner: the vast majority were put up in the 1920s and 1950s, both of which were periods when white Americans felt compelled to remind Black Americans of their subordinate place by memorializing those who had fought so bloodily to retain chattel slavery. Like the Confederate battle flag, these monuments were always already signifiers of white supremacy, though that fact has been systematically euphemized with references to “southern tradition,” and of course the mantra of states’ rights.

Martin Sheen as Robert E. Lee in Gettysburg (1993)

Once when I was still in grad school and had gone home to visit my parents for a weekend, I was up late watching TV, thinking about going to bed, and while flipping channels happened across a film set during the Civil War. It was about halfway through, but I stayed up watching until it was done. The story was compelling, the acting was really good, and the battle scenes were extremely well executed. The film was Gettysburg, which had been released in 1993. It had a huge cast, including Jeff Daniels and Sam Elliott, and Martin Sheen as General Robert E. Lee. Because I’d only seen the second half, I made a point of renting it so I could watch the entire thing.

Gettysburg is a well-made film with some great performances, and it is very good at pushing emotional buttons. Colonel Joshua Chamberlain (Jeff Daniels) leading the bayonet charge down Little Round Top is a case in point.

I’m citing Gettysburg here because it is one of the most perfect examples of how deeply the narrative of the Lost Cause became rooted in the American imagination. The Lost Cause, for the uninitiated, is the ongoing project, begun just after the end of the Civil War, to recuperate and whitewash (pun intended) the Confederacy and the antebellum South. Its keynotes include: the aforementioned insistence that the Civil War wasn’t fought over slavery but states’ rights; the foregrounding of cultured and genteel Southern gentlemen and women; depictions of slaves (when depicted at all) as happy, spiritual-singing fieldhands under the benevolent supervision of the mastah; Northerners as rapacious carpetbaggers who despoiled everything good and noble about the South in the years following the war; and the war itself as a tragic but honourable dispute between sad but dutiful officer classes prosecuting their respective sides’ goals not in anger but in sorrow.

This last element is Gettysburg’s connective tissue. Let’s be clear: the film isn’t Southern propaganda like D.W. Griffith’s Birth of a Nation. It is, indeed, high-minded and even-handed. Martin Sheen—Martin fuckin’ Jed Bartlet Sheen!—plays Robert E. Lee, one of the prime targets of statue removers. But the film is propagandistic nonetheless: there is, to the best of my recollection, not a single Black character in the film, and slavery as the cause of the conflict is alluded to (again, to the best of my recollection) only once—and then only obliquely.

My point in using this example—I could just as easily have cited Gone With the Wind or Ken Burns’ docuseries The Civil War—is how insidiously the Lost Cause narrative has wormed its way into the American imaginary. It is of a piece with everything else I’ve been talking about in this three-part series of posts. What novels like The Underground Railroad and historical reckonings like the 1619 Project—as well as the campaign to tear down Confederate monuments—attempt is a kind of radical remembering. And as we see from the ongoing backlash, to remember things differently can be threatening to one’s sense of self and nation.

NOTES


[1] As I said, none of the usual suspects seems to have advanced an opinion, but there was—not unpredictably—an awful lot of such attempts in the comments sections on articles about the grisly discovery. They ranged from the usual vile racist sentiments one always finds in these digital cesspools, to one person who argued at length that the child mortality rates in residential schools were consonant with child mortality rates in the population at large, nineteenth-century hygiene and medicine being what they were. This individual was undeterred from their thesis in spite of a long-suffering interlocutor who provided stats and links showing (a) that what the person was referencing was infant mortality rates, which is not the same thing; (b) that the death rates in residential schools were actually more egregious in the 1930s and 40s; and (c) that mass burials in unmarked graves without proper records and death certificates spoke to the dehumanization of the Native children on one hand, and, on the other, to the likelihood that the “teachers” at these schools were reluctant to leave evidence of their abusive treatment.

[2] I will have a lot more to say on this particular misapprehension of “the university faculty lounge” in a future post on the more general misapprehensions of what comprises a humanities degree.

[3] Crenshaw also developed that other concept that triggers conservatives, “intersectionality.”

[4] Stay tuned for my forthcoming post, “The Conspiracy Theory of Postmodernism.”



History, Memory, and Forgetting Part 2: Forgetting and the Limits of Defamiliarization

“We cross our bridges when we come to them, and burn them behind us, with nothing to show for our progress except a memory of the smell of smoke, and a presumption that once our eyes watered.”

—Tom Stoppard, Rosencrantz and Guildenstern are Dead

In my first year at Memorial, I taught one of our first-year fiction courses. I ended the term with Martin Amis’ novel Time’s Arrow—a narrative told from the perspective of a parasitic consciousness that experiences time backwards. The person to whom the consciousness is attached turns out to be a Nazi war criminal hiding in America, who was a physician working with Dr. Mengele at Auschwitz. Just as we get back there, after seeing this man’s life played in reverse to the bafflement of our narrator, the novel promises that things will start to make sense … now. And indeed, the conceit of Amis’ novel is that the Holocaust can only make sense if played in reverse. Then, it is not the extermination of a people, but an act of benevolent creation—in which ashes and smoke are called down out of the sky into the chimneys of Auschwitz’s ovens and formed into naked, inert bodies. These bodies then have life breathed into them, are clothed, and sent out into the world. “We were creating a race of people,” the narrative consciousness says in wonder.

Time’s Arrow, I told my class, is an exercise in defamiliarization: it wants to keep us from becoming inured to the oft-repeated story of the Holocaust, and so requires us to view it from a different perspective. Roberto Benigni’s film Life is Beautiful, I added, worked to much the same end, by (mostly) leaving the explicit brutalities of the Holocaust offstage (as it were), as a father clowns his way through the horror in order to spare his son the reality of their circumstances. As I spoke, however, and looked around the classroom at my students’ uncomprehending expressions, I felt a dread settle in my stomach. Breaking off from my prepared lecture notes, I asked the class: OK, be honest here—what can you tell me about the Holocaust?

As it turned out, not much. They knew it happened in World War II? And the Germans were the bad guys? And the Jews didn’t come out of it well …? (I’m honestly not exaggerating here.) I glanced at my notes, and put them aside—my lecture had been predicated on the assumption that my students would have a substantive understanding of the Holocaust. This was not, to my mind, an unreasonable assumption—I had grown up learning about it in school by way of books like Elie Wiesel’s Night, but also by seeing movies depicting its horrors. But perhaps I misremembered: I was from a young age an avid reader of WWII history (and remain so to this day), so I may simply have assumed that your average high school education would have covered these bases more thoroughly.[1]

The upshot was that I abandoned my lecture notes and spent the remaining forty minutes of class delivering an off-the-cuff brief history of the Holocaust that left my students looking as if I’d strangled puppies in front of them, and me deeply desiring a hot shower and a stiff drink.

In pretty much every class I’ve taught since, I have reliably harangued my students about needing to read more history. To be fair, I’d probably do that even without having had this particular experience; but I remember thinking of Amis’ brilliant narrative conceit that defamiliarization only works if there has first been familiarization, and it depressed me to think that the passage of time brings with it unfamiliarization—i.e. the memory-holing of crucial history that, previously, was more or less fresh in the collective consciousness. The newfound tolerance for alt-right perspectives and the resurgence of authoritarian and fascist-curious attitudes (to which the Republican Party is currently in thrall) proceed from a number of causes, but one of them is the erosion of memory that comes with time’s passage. The injunctions against fascism that were so powerful in the decades following WWII, when the memory of the Holocaust was still fresh and both the survivors and the liberators were still ubiquitous, have eroded—those whose first-hand testimonials gave substance to that history have largely passed away. Soon none will remain.

What happens with a novel like Time’s Arrow or a film like Life is Beautiful when you have audiences who are effectively ignorant of the history informing their narrative gambits? Life is Beautiful, not unpredictably, evoked controversy because it was a funny movie about the Holocaust. While it was largely acclaimed by critics, there were a significant number who thought comedy was egregiously inappropriate in a depiction of the Holocaust,[2] as was using the Holocaust as a backdrop for a story focused on a father and his young son. As Italian film critic Paolo Mereghetti observes, “In keeping with Theodor Adorno’s idea that there can be no poetry after Auschwitz, critics argued that telling a story of love and hope against the backdrop of the biggest tragedy in modern history trivialized and ultimately denied the essence of the Holocaust.” I understand the spirit of such critiques, given that humour—especially Roberto Benigni’s particular brand of manic clowning—is jarring and dissonant in such a context, but then again, that’s the entire point. The film wants us to feel that dissonance, and to interrogate it. And not for nothing, but for all of the hilarity Benigni generates, the film is among the most heartbreaking I’ve seen, as it is about a father’s love and his desperate need to spare his son from the cruel reality of their circumstances. Because we’re so focused on the father’s clownish distractions, we do not witness—except for one haunting and devastating scene—the horrors that surround them.

In this respect, Life is Beautiful is predicated on its audience being aware of those unseen horrors, just as Time’s Arrow is predicated on its readers knowing the fateful trajectory of Jews rounded up and transported in boxcars to their torture and death in the camps, to say nothing of the grotesque medical “experiments” conducted by Mengele. The underlying assumption of such defamiliarization is that an oft-told history such as the Holocaust’s runs the risk of inuring people to its genuinely unthinkable proportions.[3] It is that very unthinkability, fresher in the collective memory several decades ago, that drove Holocaust denial among neo-Nazi and white supremacist groups—because even such blinkered, hateful, ignorant bigots understood that the genocide of six million people was a morally problematic onion in their racial purity ointment.[4]

“I know nothing, I see nothing …”

They say that tragedy plus time begets comedy. It did not take long for Nazis to become clownish figures on one hand—Hogan’s Heroes first aired in 1965, and The Producers was released in 1967—and one-dimensional distillations of evil on the other. It has become something of a self-perpetuating process: Nazis make the best villains because (like racist Southerners, viz. my last post) you don’t need to spend any time explaining why they’re villainous. How many Steven Spielberg films embody this principle? Think of Indiana Jones in The Last Crusade, looking through a window into a room swarming with people in a certain recognizable uniform: “Nazis. I hate these guys.” It’s an inadvertently meta moment, as well as a throwback to Indy’s other phobia in Raiders of the Lost Ark: “Snakes. Why’d it have to be snakes?” Snakes, Nazis, tomato, tomahto. Though I personally consider that a slander against snakes, the parallel is really about an overdetermined signifier of evil and revulsion, one that functions to erase nuance.

Unfortunately, if tragedy plus time begets comedy, it also begets a certain cultural amnesia when historically based signifiers become divorced from a substantive understanding of the history they’re referencing. Which is really just a professorial way of saying that the use of terms like “Nazi” or “fascist,” or comparing people to Hitler, has become ubiquitous in a problematic way, especially in the age of social media. Case in point: Godwin’s Law, formulated by Michael Godwin in the infancy of the internet (1990), which declares that “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.” This tendency has since been added to the catalogue of logical fallacies as the “Reductio ad Hitlerum,” which entails “an attempt to invalidate someone else’s position on the basis that the same view was held by Adolf Hitler or the Nazi Party.”

Perhaps the most egregious recent example of this glib historical signification was Congresswoman and QAnon enthusiast Marjorie Taylor Greene’s comparison of Nancy Pelosi’s decision to continue requiring masks to be worn in the House of Representatives (because so many Republicans have declared their intention not to get vaccinated) to the Nazi law requiring Jews to wear a yellow Star of David on their chests. She said, “You know, we can look back at a time in history where people were told to wear a gold star, and they were definitely treated like second class citizens, so much so that they were put in trains and taken to gas chambers in Nazi Germany. And this is exactly the type of abuse that Nancy Pelosi is talking about.”

To be certain, Greene was roundly condemned by almost everybody, including by many in her own party—even the craven and spineless Minority Leader Kevin McCarthy had some stern words for her—but what she said was different not in kind but in degree from the broader practice of alluding to Nazism and the Holocaust in glib and unreflective ways.[5]

Though this tendency is hardly new—Godwin’s Law is thirty-one years old—it has been amplified and exacerbated by social media, to the point where it became difficult to find terms that usefully describe and define Donald Trump’s authoritarian tendencies. The ubiquity of Nazi allusions has made them necessarily diffuse, and so any attempt to characterize Trumpism as fascist in character could be easily ridiculed as alarmist and hysterical; and to be fair, there were voices crying “fascist!” from the moment he made that initial descent on his golden escalator to announce his candidacy. That those voices proved prescient rather than alarmist doesn’t obviate the fact that they muddied the rhetorical waters.[6] As the contours of Trump’s authoritarianism came into focus, the fascistic qualities of the Trumpian Right became harder and harder to ignore; bizarrely, they’ve become even more clearly delineated since Trump left office, as Republicans still kowtow to the Mar-A-Lago strongman and move to consolidate minoritarian power.

Historians and political philosophers and a host of other thinkers of all stripes will be years in unravelling the historical and cultural strands of Trump’s rise and the previously unthinkable hold that Trumpism has over a stubborn rump of the electorate; but I do think that one of the most basic elements is our distressing tendency toward cultural amnesia. It makes me think we’re less in need of defamiliarizing history than of defamiliarizing all the clichés of history that have become an inchoate jumble of floating signifiers, which allow neo-Nazis and white supremacists to refashion themselves as clean-cut, khaki-clad young men bearing tiki torches, or to disingenuously euphemize their racism as “Western chauvinism” and meme their way out of accusations of ideological hatefulness—“It’s just about the lulz, dude.”

There is also, as I will discuss in the third of these three posts, the fact that remembering is itself a politically provocative act. On one hand, the diminution in the collective memory of Nazism and the Holocaust has facilitated the re-embracing of its key tropes; on the other, the active process of remembering the depredations of Western imperialism, and the myriad ways in which slavery in the U.S. wasn’t incidental to the American experiment but integral to it, gives rise to a backlash that takes refuge in such delusions as the pervasiveness of anti-white racism.

NOTES


[1] To be clear, this is not to castigate my students; as I’ll be expanding on as I go, historical amnesia is hardly limited to a handful of first-year university students in a class I taught fifteen years ago.

[2] One can only speculate on what such critics make of Mel Brooks’ career.

[3] Martin Amis attempts something similar in his 1987 short story collection Einstein’s Monsters, which is about nuclear war. His lengthy introductory essay “Thinkability” (to my mind, the best part of the book) addresses the way in which military jargon euphemizes the scope and scale of a nuclear exchange, precisely in order to render the unthinkable thinkable.

And speaking of humour used to defamiliarize horror: Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, and General Buck Turgidson’s (George C. Scott) own “thinkability” regarding the deaths in a nuclear war: “Mr. President, we are rapidly approaching a moment of truth, both for ourselves as human beings and for the life of our nation. Now, truth is not always a pleasant thing. But it is necessary now to make a choice, to choose between two admittedly regrettable, but nevertheless distinguishable, post-war environments: one where you got 20 million people killed, and the other where you got 150 million people killed! … Mr. President, I’m not saying we wouldn’t get our hair mussed, but I do say no more than 10 to 20 million killed, tops! Uh, depending on the breaks.”

[4] Which always rested on a tacit contradiction in their logic: it didn’t happen, but it should have.

[5] We see the same tendency, specifically among conservatives, to depict any attempt at raising taxes or expanding social programs as “socialism,” often raising the spectre of “Soviet Russia”—which is about as coherent as one of my favourite lines from Community, when Britta cries, “It’s just like Stalin back in Russia times!”

[6] I don’t say so to castigate any such voices, nor to suggest that they were premature—the contours of Trump’s authoritarian, nativist style were apparent from before he announced his candidacy to anyone who looked closely enough.



History, Memory, and Forgetting Part 1: Deconstructing History in The Underground Railroad

“Yesterday, when asked about reparations, Senate Majority Leader Mitch McConnell offered a familiar reply: America should not be held liable for something that happened 150 years ago, since none of us currently alive are responsible … This rebuttal proffers a strange theory of governance, that American accounts are somehow bound by the lifetime of its generations … But we are American citizens, and thus bound to a collective enterprise that extends beyond our individual and personal reach.”

—Ta-Nehisi Coates, Congressional Hearing on Reparations, 19 June 2019

Thuso Mbedu as Cora in The Underground Railroad

I’ve been slowly working my way through Barry Jenkins’ ten-episode adaptation of Colson Whitehead’s novel The Underground Railroad. I’ve been a huge fan of Whitehead’s fiction ever since I got pointed towards his debut novel The Intuitionist; I read The Underground Railroad when it was still in hardcover, and I’ve included it twice in classes I’ve taught. When I saw it was being adapted to television by the virtuoso director of Moonlight and If Beale Street Could Talk, I knew this was a series I wanted to watch.

I was also wary—not because I was concerned about the series keeping faith with the novel, but because I knew it would make for difficult viewing. Whatever liberties Whitehead takes with history (as I discuss below), he is unsparing about the brutal historical realities of slavery and the casual cruelty and violence visited on slaves. It is often difficult enough to read such scenes; having them depicted on screen—seeing cruelty and torture made explicit in sound and image—can be more difficult still to watch.

For this reason, for myself at least, the series is the opposite of bingeable. After each episode, I need to digest the story and the visuals and think.

The Underground Railroad focuses on the story of Cora (Thuso Mbedu), a teenage girl enslaved on a Georgia plantation whose mother escaped when Cora was young and was never caught, leaving Cora behind. Another slave named Caesar (Aaron Pierre) convinces her to flee with him. Though she is reluctant at first, circumstances—the kind of circumstances that make the show difficult viewing—convince her to go. She and Caesar and another slave named Lovey (who had seen them go and, to their dismay, tagged along) are waylaid by slave-catchers. Though Lovey is recaptured, Cora and Caesar escape, but in the process one of the slave-catchers is killed. They make it to a node of the Underground Railroad and are sheltered by a white abolitionist with whom Caesar had been in contact. He then takes them underground to the “station,” where they wait for the subterranean train that will take them out of Georgia.

Because that is the principal conceit of The Underground Railroad: that the rail system spiriting slaves away is not metaphorical, but literal, running through underground tunnels linking the states.

More than a few reviews of the series, and also of the novel on which it is based, have referred to the story as “magical realism.” This is an inaccurate characterization. Magical realism is a mode of narrative in which one group or culture’s reality collides with another’s, and what seems entirely quotidian to one is perceived as magical by the other. That’s not what Colson Whitehead is doing, and, however lyrical, ethereal, and occasionally dreamlike Barry Jenkins’ visual rendering is, it’s not what the series is doing either. In truth, the literalization of the underground railroad is the least of Whitehead’s historical tweaks: The Underground Railroad is not magical realism, but nor is it alternative history (a genre that usually relies on a bifurcation in history’s progression, such as having Nazi sympathizer Charles Lindbergh win the presidency in 1940 in Philip Roth’s The Plot Against America). The Georgia plantation on which The Underground Railroad begins is familiar enough territory, scarcely distinguishable from similar depictions in Uncle Tom’s Cabin or 12 Years a Slave. But then, as Cora journeys from state to state, each state embodies a peculiar distillation of racism. Cora and Caesar first make it to South Carolina, which appears at first glance to be an enlightened and indeed almost utopian place: there is no slavery, and the white population seems dedicated to uplifting freed Blacks, from providing education and employment to scrupulous health care to good food and lodging. Soon, however, the paternalistic dimension of this altruism becomes more glaring, and Cora and Caesar realize that the free Blacks of South Carolina are being sterilized and unwittingly used in medical experimentation.

In North Carolina, by contrast, Blacks have been banished, and any found within the state borders are summarily executed. The white people of North Carolina declared slavery a blight—because it undercut white workers. Since abolishing slavery and banishing or killing all of the Black people, the state has made the ostensible purity of whiteness a religious fetish. Cora spends her time in North Carolina huddled in the attic of a reluctant abolitionist and his even more reluctant wife, in an episode that cannot help but allude to Anne Frank.

Whitehead’s vision—stunningly rendered in Jenkins’ adaptation—is less an alternative history than a deconstructive one. As Scott Woods argues, The Underground Railroad is not a history lesson, but a mirror, with none of “the finger-wagging of previous attempts to teach the horrors of slavery to mainstream audiences.” I think Woods is being polite here, using “mainstream” as a euphemism for “white,” and he tactfully does not observe that effectively all such finger-wagging attempts (cinematically, at any rate) have tended to come from white directors and feature white saviour protagonists to make liberal white audiences feel better about themselves.

There are no white saviours in The Underground Railroad; there is, in fact, very little in the way of salvation of any sort, just moments of relative safety. As I still have a few episodes to go, I can’t say how the series ends; the novel, however, ends ambivalently, with Cora having defeated the dogged slave-catcher who has been pursuing her from the start, but still without a clear sense of where she is going—the liberatory trajectory of the underground railroad is unclear and fraught because of the weight of the experience Cora has accrued over her young life. As she says at one point, “Make you wonder if there ain’t no real place to escape to … only places to run from.” There is no terminus, just endless flight.

When I say The Underground Railroad is a “deconstructive” history, I don’t use the term in the sense developed by Jacques Derrida (or at least, not entirely). Rather, I mean it in the more colloquial sense employed by, for example, chefs when they put, say, a “deconstructed crème brûlée” on the menu, which might be a smear of custard on the plate speared by a shard of roasted sugar and garnished with granitas infused with cinnamon and nutmeg. If the dish is successful, it is because it defamiliarizes a familiar dessert by breaking down its constituent ingredients in such a way as to make the diner appreciate and understand them anew—and, ideally, develop a more nuanced appreciation for a classic crème brûlée.

So—yes, odd analogy. But I’d argue that Whitehead’s novel is deconstructive in much the same manner, by taking a pervasive understanding of racism and the legacy of slavery and breaking it down into some of its constituent parts. In South Carolina, Cora and Caesar experience the perniciousness of white paternalism of the “white man’s burden” variety—the self-important concern for Black “uplift” that is still invested in the conviction of Black culture’s barbarism and inferiority, and which takes from this conviction license to violate Black bodies in the name of “science.”

Thuso Mbedu as Cora and William Jackson Harper as Royal.

Then in North Carolina, we see rendered starkly the assertion that Blacks are by definition not American, but essentially the equivalent of an invasive species. This, indeed, was the basis of the notorious 1857 Supreme Court ruling in Dred Scott v. Sandford, which asserted that Black people could not be citizens of the United States. Though that precedent was effectively voided by the thirteenth and fourteenth amendments—which abolished slavery and established citizenship for people of African descent born in America, respectively—the franchise was not effectively extended to Black Americans until Lyndon Johnson signed the Voting Rights Act a century after the end of the Civil War. The delegitimization of Black voters continues: while the current Trumpian incarnation of the G.O.P. tacitly depicts anyone voting Democrat as illegitimate and not a “real American,” in practice, almost all of the legal challenges to the 2020 election result were directed at precincts with large numbers of Black voters.

Later in the novel, when Cora finds her way to Indiana to live on a thriving Black-run farm, we see the neighbouring white community’s inability to countenance Black prosperity in such close proximity, especially when Black people’s flourishing reflects badly on their own failures. The pogrom that follows very specifically evokes the Tulsa Massacre of 1921, when a huge white mob essentially burned down a thriving Black part of town.

What’s important to note here is that Whitehead’s deconstructive process is less about history proper than about our pervasive depictions of history in popular culture, especially by way of fiction, film, and television. Or to be more accurate, it is about the mythologization of certain historical tropes and narratives pertaining to how we understand racism. One of the big reasons why so many (white) people are able to guilelessly[1] suggest that America is not a racist nation, or claim that the election of a Black president proves that the U.S. is post-racial, is because racism has come to be understood as a character flaw rather than a systemic set of overlapping cultural and political practices. Think of the ways in which Hollywood has narrated the arc of American history from slavery to the Civil War to the fight for civil rights, and try to name films that don’t feature virtuous white protagonists versus racist white villains. Glory, Mississippi Burning, Ghosts of Mississippi, A Time to Kill, Green Book, The Help[2]—and this one will make some people bristle—To Kill a Mockingbird. I could go on.

To be clear, I’m not saying these weren’t excellent films, some of which featured nuanced and textured Black characters[3] with considerable agency; but the point, as with all systemic issues, lies not in the individual examples but in the overall patterns. These films and novels flatter white audiences—we’re going to identify with Willem Dafoe’s earnest FBI agent in Mississippi Burning against the Klan-associated sheriff. “That would be me,” we[4] think, without considering how the FBI—at the time the film was set, no less!—was actively working to subvert the civil rights movement and shore up the societal structures subjugating and marginalizing Black Americans. In this framing, racism is a personal choice, and therefore Dafoe’s character is not complicit in J. Edgar Hoover’s.

Gene Hackman and Willem Dafoe in Mississippi Burning (1988).

The white saviour tacitly absolves white audiences of complicity in racist systems in this way, by depicting racism as a failing of the individual. It allows us to indulge in the fantasy that we would ourselves be the white saviour: no matter at what point in history we found ourselves, we would be the exception to the rule, resisting societal norms and pressures in order to be non-racist. Possibly the best cinematic rebuke to this fantasy came in 12 Years a Slave,[5] in the form of a liberal-minded plantation owner played by Benedict Cumberbatch, who recognizes the talents and intelligence of Solomon Northup (Chiwetel Ejiofor), a formerly free Black man who had been kidnapped by thugs and illicitly sold into slavery. Cumberbatch’s character looks for a brief time to be Solomon’s saviour, as he enthusiastically takes Solomon’s advice on a construction project over that of his foreman. But when the foreman violently retaliates against Solomon, Cumberbatch’s character cannot stand up to him. In a more typical Hollywood offering, we might have expected the enlightened white man to intervene; instead, he lacks the intestinal fortitude to act in a way that would have brought social disapprobation, and as a “solution” sells Solomon to a man who proves to be cruelly sociopathic.

Arguing for the unlikeliness that most people could step out of the roles shaped for them by social and cultural norms and pressures might seem like an apologia for historical racism—how can we have expected people to behave differently?—but really it’s about the resistance to seeing ourselves enmeshed in contemporary systemic racism.

Saying that that is Whitehead’s key theme would be reductive; there is so much more to the novel that I’m not getting to. There is, to be certain, a place—a hugely important place—for straightforward historical accounts of the realities and finer details of slavery, even the more “finger-wagging” versions Scott Woods alludes to. But what both Whitehead’s novel and Barry Jenkins’ adaptation of it offer is a deconstruction of the simplistic binarism of decades of us vs. them, good vs. bad constructions of racism that give cover to all but the most unapologetically racist white people. The current backlash against “critical race theory”—which I’ll talk more about in the third of these three posts—proceeds to a great extent from its insistence that racism is not individual but systemic, something baked into the American system.

Which, when you think about it, is not the outrageous argument conservatives make it out to be. Not even close: Africans brought to American shores, starting in 1619, were dehumanized, brutalized, subjected to every imaginable violence, and treated as subhuman property for almost 250 years. Their descendants were not given the full franchise as American citizens until the Civil Rights Act and the Voting Rights Act of 1964 and 1965, respectively. Not quite sixty years on from that point, it’s frankly somewhat baffling that anyone, with a straight face, can claim that the U.S. isn’t a racist nation. One of the greatest stumbling blocks to arriving at that understanding is how we’ve personalized racism as an individual failing. It shouldn’t be so difficult to recognize, as a white person, one’s tacit complicity in a long history without having to feel the full weight of disapprobation that the label “racist” has come to connote through the pop cultural mythologization of racism as a simple binary.

NOTES


[1] I want to distinguish here between those who more cynically play the game of racial politics and those who genuinely do not see racism as systemic (granted that there is a grey area in between these groups). These are the people for whom having one or more Black friends is proof of their non-racist bona fides, and who honestly believe that racism was resolved by the signing of the Civil Rights Act and definitively abolished by Obama’s election.

[2] The Help, both the novel and the film, is perhaps one of the purest distillations of a white saviour story packaged in such a way as to flatter and comfort white audiences. Essentially, it is the story of an irrepressible young white woman (with red hair, of course) nicknamed Skeeter (played by Emma Stone) who chafes against the social conventions of 1963 Mississippi and dreams of being a writer and journalist. TL;DR: she ends up telling the stories of the “help,” the Black women working as domestic labour for wealthy families such as her own, publishes them—anonymously of course, though Skeeter is the credited author—and thus gets the traction she needs to leave Mississippi for a writing career.

I can’t get into all of the problems with this narrative in a footnote; hopefully I don’t need to enumerate them. But I will say that one of the key things that irked me about this story, both the novel and the movie, is how it constantly name-checks Harper Lee and To Kill a Mockingbird as Skeeter’s inspiration. Lee might have given us the archetype of the white saviour in the figure of Atticus Finch, but she did it at a moment in time (the novel was published in 1960) when the subject matter was dangerous (Mary Badham, who played Scout, found herself and her family shunned when they returned to Alabama after filming ended, for having participated in a film that espoused civil rights). By contrast, The Help, which was published in 2009 and released as a film in 2011, is about as safe a parable of racism as can be in the present day—safely set in the most racist of southern states during the civil rights era, with satisfyingly vile racist villains and an endearing, attractive white protagonist whose own story of breaking gender taboos jockeys for pole position with her mission to give voice to “the help.”

[3] Though to be fair, in some instances this had as much to do with virtuoso performances by extremely talented actors, such as Denzel Washington, Morgan Freeman, and Andre Braugher in Glory. And not to heap yet more scorn on The Help, but the best thing that can be said about that film is that it gave Viola Davis and Octavia Spencer the visibility—and thus the future casting opportunities—that their talent deserves.

[4] I.e. white people.

[5] Not coincidentally helmed by a Black director, Steve McQueen (no, not that Steve McQueen, this Steve McQueen).

