I’ve been thinking a lot about gremlins these past few days.
I’m teaching a graduate seminar on weird fiction this summer (full title: “Weird Fiction: Lovecraft, Race, and the Queer Uncanny”), and the first assignment is a piece of creative non-fiction describing your most terrifying fictional experience: whether in a book, film, or episode of television, what scared you so badly that it stayed with you for weeks or years? I’ve done this kind of assignment before in upper-level classes, and it has always worked well—especially since I post everyone’s pieces in the online course shell so they can be read by the entire class. That always leads to a good and interesting class discussion.
In the interests of fairness and by way of example, I’m also writing one. Which is where the gremlins come in.
No, not the 1984 movie. I couldn’t watch it, given that by the time it was released I’d already been traumatized by “Nightmare at 20,000 Feet.” And no, not the original Twilight Zone episode from 1963. If I’d watched that episode—which featured pre-Star Trek William Shatner gnawing the furniture as a nervous flyer who sees a gremlin that looks like a man in a monkey suit on the wing of the plane—it’s possible gremlins wouldn’t have come to haunt my imagination the way they did. The original episode is creepy, to be certain, but not particularly scary; the gremlin is too fluffy and Shatner too Shatner to really evoke terror.
The gremlin that made me terrified to sleep at night for several months was the one in the “Nightmare” segment of the 1983 film The Twilight Zone: The Movie.
The premise is very simple, and most likely familiar even to those who haven’t seen it (not least because it was parodied in a Simpsons Halloween episode): a man who is a nervous flyer to start with is on a plane during a storm. Looking out the window, he sees … something. At first, he thinks his eyes are playing tricks, but then he sees it again. And then again, and each time it becomes clearer that there is a person-shaped thing out there on the wing. Panicked, he calls for a flight attendant, shouting “There’s a man on the wing of this plane!” This, of course, is impossible, and it is obvious that the flight staff and his fellow passengers think him hysterical (it doesn’t help his case that the segment begins with a flight attendant talking to him through the bathroom door as he’s inside having a panic attack). After talking himself down, realizing it would be impossible for a man to be on the wing at this speed and altitude, he accepts a Valium from the senior flight attendant, closes the window shade, and attempts to sleep.
Of course, after a fitful attempt, he can’t help himself, and he opens the shade … and sees the thing, clearly demonic in appearance now, inches away from his face on the other side of the window.
This was the precise moment it broke my brain and gave me nightmares for months.
Anyway, TL;DR: either through turbulence or the creature’s sabotage, the plane lurches violently. The man grabs a gun from the sky marshal and shoots at the creature through the window. The cabin decompresses and he’s sucked out halfway. He shoots at the creature again, which wags a clawed finger at him and flies off into the night. The plane lands and the man is taken off in a straitjacket; meanwhile, the mechanics examining the plane’s engine find it torn to shreds, the metal bent and ripped and covered in deep rents that look like claw marks.
I don’t remember any of the rest of the movie, which had three other segments based on old Twilight Zone episodes. I just remember watching “Nightmare,” being terrified, and my father telling me, in reply to my shocked question, that the creature was a gremlin and that they sabotage airplanes.
Really, it’s amazing I ever willingly went back on a plane.
I’ve been thinking about that and remembering a lot of details about the summer of 1984, which is when this all happened, and trying to work through precisely why it scared me so profoundly. I’ll post that essay here when it’s written; but in the meantime, I’ve been going down the rabbit hole on gremlins and their origins as an element of modern folklore.
There’s surprisingly little written about gremlins, which is possibly a function of the twinned facts that, on one hand, they’re basically a sub-species of a vast array of pixies, fairies, goblins, imps, and other mischievous fey creatures from folklore and legend; and on the other hand, they have a recent and fairly specific point of origin. Gremlins emerge alongside aviation (something The Twilight Zone hews to and the movie Gremlins ignores). More specifically, gremlins are creatures of the RAF, and start appearing as an explanation for random malfunctions sometime in the 1920s, becoming a staple of flyers’ mythos by the outbreak of WWII.
Gremlins, indeed, almost became the subject of a Disney film: author Roald Dahl, who would go on to write Charlie and the Chocolate Factory and James and the Giant Peach among innumerable other beloved children’s books, was an RAF pilot. His first book was titled The Gremlins, about a British Hawker Hurricane pilot named Gus who is first tormented by gremlins, but ultimately befriends them and convinces them to use their technical savvy to help the British war effort. In 1942, Dahl was invalided out of active service and sent to Washington, D.C. as an RAF attaché. The Gremlins brought the RAF mythos of airborne imps to America, and was popular enough that Disney optioned it as an animated feature. Though Disney ultimately did not make the movie, Dahl convinced them to publish it with the animators’ illustrations in 1943. First Lady Eleanor Roosevelt reportedly delighted in reading it to her grandchildren.
There was also a Looney Tunes short in 1943 featuring Bugs Bunny being bedevilled by a gremlin on a U.S. airbase.
Though Dahl would later claim to have coined the word “gremlin,” that is demonstrably false, as the term was in use from the 1920s and was featured in Pauline Gower’s 1938 novel The ATA: Women With Wings. The word’s etymology is difficult to determine, with some suggesting it comes from the Old English word gremian, “to vex,” which is also possibly related to gremmies, British English slang for goblin or imp. Another theory holds that the word is a conflation of goblin and Fremlin, the latter being a popular brand of beer widely available on British airbases in the mid-century—one can imagine tales of mischievous airborne creatures characterized as goblins seen after too many Fremlins.
One of the more interesting aspects of the gremlins mythos is how many flyers seemed genuinely convinced of the creatures’ existence. So common were tales of malfunction attributed to gremlins that U.S. aircrews stationed in England picked up on the lore and many of them, like their British counterparts, swore up and down they’d actually seen the little bastards working their mischief. Indeed, one of the only academic pieces of writing I’ve been able to find on gremlins is not the work of a folklorist, but a sociologist: in a 1944 edition of The Journal of Educational Sociology, Charles Massinger writes gravely about the fact that “a phase of thinking that had become prevalent in the Royal Air Force”—which is to say, gremlins—“had subsequently infected the psychology of the American airmen in the present war.” Massinger’s article expresses concern that otherwise rational people, thoroughly trained in advanced aviation, who necessarily possess a “breadth of … scientific knowledge relative to cause and effect of stress on the fighting machine” would be so irrational as to actually believe in the existence of “fantastic imps.”
Massinger suggests that it is the stress of combat that gives rise to such fantasies, which is not an unreasonable hypothesis—war zones are notoriously given to all sorts of fabulation. But he says that it is the stress and fear in the moment, in which split-second decisions and reactions that don’t allow for measured and reasoned thought, that short-circuits the sense of reality: “If pilots had sufficient time to think rationally about machine deficiencies under actual flying conditions,” he says, “it is doubtful whether the pixy conception would have crept into their psychology.” Leaden prose aside, this argument strikes me as precisely wrong. The mythology surrounding gremlins may have had its start in panicked moments of crisis while aloft, but it developed and deepened in moments of leisure—airmen relaxing between missions in the officers’ club or mess, probably over numerous bottles of Fremlins. It is indeed with just such a scene that we first learn of gremlins in Dahl’s story.
I do, however, think Massinger’s instinct isn’t wrong here—i.e., the idea that airmen respond to the stresses of combat and the frustrations of frequent baffling breakdowns with fantasy rather than reason. What he’s missing is the way in which mess-hall fabulation humanizes the experience; the rationality of science and technology in such situations, I would hazard, is not a comfort, no matter how long the flyers have for reflection. The mechanical dimension of air combat is the alienating factor, especially at a point in time when flight was not just new but evolving by leaps and bounds. Roald Dahl’s experience in this respect is instructive: he started the war flying Gloster Gladiator biplanes, which were badly obsolete even when they were first introduced in 1934. By the time he was invalided, he had graduated to Hawker Hurricanes, which in the early days of the war were among the most advanced fighters. By the time he was in the U.S. and Eleanor Roosevelt was reading his first book to her grandchildren, the Allied bombing campaign had already lost more planes than flew in total during the First World War, with the new planes coming off assembly lines not just matching the losses but growing the massive air fleets.
Air travel has become so rote and banal today, and catastrophic airframe malfunctions so rare, that it is difficult to remember what must have been a vastly disorienting experience in WWII: ever-more sophisticated fighters and bombers that were nevertheless plagued by constant mechanical failures, machines of awesome destructive power that were also terribly vulnerable. Bomber crews suffered the highest rates of attrition in the war—about half of them were killed in action—while there was also the constant drumbeat of propaganda about the supposed indomitability of the Allied bombing wings.
When I teach my second-year course on American literature after 1945, I always start with the poetry of Randall Jarrell; specifically, we do a few of his war poems, as a means of emphasizing how the Second World War so profoundly transformed the world and the United States’ place in it, and the extent to which American popular culture became invested in mythologizing the war. Jarrell’s poetry is a disconcertingly ambivalent glimpse of the depersonalization and mechanization of the soldier by a war machine Hollywood has largely erased through such sentimental portrayals as The Longest Day and Saving Private Ryan. “The Death of the Ball Turret Gunner” is usually the first poem we do, and I can reliably spend an entire class on it despite its brevity. In its entirety:
From my mother’s sleep I fell into the State,
And I hunched in its belly till my wet fur froze.
Six miles from earth, loosed from its dream of life,
I woke to black flak and the nightmare fighters.
When I died they washed me out of the turret with a hose.
The final line is a gut-punch, but it’s the first two lines that establish one of Jarrell’s key themes with devastating economy. The speaker “falls” from the warmth and safety of the mother’s care, where he is loved as an individual, to the ownership of the State, where he is depersonalized and expendable—rendered inhuman even before the “black flak” (anti-aircraft fire) unincorporates his body. In the second line, the State is explicitly conflated with the weapon of war, the bomber, of which he has become a mechanism, and which functions as a monstrous womb: the parallel structure of the two lines aligns the “belly” of airplane with the “mother’s sleep.” The “wet fur,” freezing in the sub-zero temperatures of the high altitude, is literally the fur lining his bomber jacket, but also alludes to the lanugo, the coat of fur that fetuses develop and then shed while in the womb.
The bomber functions in Jarrell’s poetry as the exemplar of the Second World War’s inhuman scope and scale, built in vast numbers, visiting vast devastation on its targets—the last two of which were Hiroshima and Nagasaki—but which itself was terribly vulnerable and always in need of more bodies to fill out its crews. The machine itself was never scarce.
All of which might seem like a huge digression from a discussion of gremlins, but it’s really not: gremlins are identifiably kin to myth and folklore’s long history of mischievous “little people,” from pixies to the sidhe. That they emerge as a specific sub-species (sub-genre?) at the dawn of aviation—specifically, military aviation—is suggestive of a similar mythopoeic impulse when faced with the shock of the new. That some airmen become convinced of their existence as the war went on and the air war grew to unthinkable proportions is, I would suggest, pace Massinger, utterly unsurprising.
Donald, Graeme. Sticklers, Sideburns, and Bikinis: The Military Origins of Everyday Words and Phrases. 2008.
Leach, Maria, ed. The Dictionary of Folklore. 1985.
Massinger, Charles. “The Gremlin Myth.” The Journal of Educational Sociology, vol. 17, no. 6 (Feb. 1944), pp. 359–367.
Rose, Carol. Spirits, Fairies, Gnomes, and Goblins: An Encyclopedia of the Little People. 1996.
I was interviewed recently by a student of mine for Memorial’s student newspaper on the topic of the importance of the humanities.1 I’m now wishing I’d read this Washington Post column by Jason Willick, titled “Putin has a huge advantage in the kind of nuclear weapon he would be most likely to use,” beforehand. This paragraph in particular:
Russia has only a modest lead over the United States in long-range, strategic nuclear warheads regulated by the 2010 New Start treaty — 1,456 vs. 1,357 of the high-payload weapons. But when it comes to unregulated, shorter-range and lower-payload tactical nuclear weapons, according to a 2021 Congressional Research Service report, the United States has only 230, “with around 100 deployed with aircraft in Europe.” Russia has up to 2,000.
I’m not saying that having done a degree in English, philosophy, or history would automatically alert you to the absurdity of this framing;2 but there is a greater likelihood that one would, having studied such subjects, understand, respectively, its perversion of language, its moral and ethical failure, and its ignorance of historical context.
There have been a lot of commentators reaching for comparisons to the Cold War in the past week or so. Whatever the valence of such historical parallels, I think this is the first time I’ve read something that has resorted to Cold War logic. One of the benefits, rhetorically and imaginatively speaking, of the Soviet Union’s collapse was that we started again to think of nuclear weapons in singular terms—by which I mean, a reversion to the wise instinct that one nuclear warhead was one too many. I’m old enough to remember the nuclear anxiety that pervaded the 1980s, the relief when that briefly vanished in the period spanning glasnost and the U.S.S.R.’s implosion, and then the more diffuse but still nagging anxiety attached to the prospect of bad actors trafficking “loose nukes.” 9/11 ramped up the paranoia about “suitcase bombs” whose relatively small yields would not have registered on cold warriors’ thermonuclear radar, but which served as a reminder of the irreducible violence—different from conventional munitions not in degree but in kind—of weaponized fission.
This understanding is what makes a nation like Iran developing the Bomb unthinkable. It is why nobody in their right mind shrugs off North Korea’s nuclear arsenal because it is minuscule.
And yet here we are, mere days after Vladimir Putin re-introduced the spectre of nuclear warfare—and not merely by inference!—talking about the “advantage” of numbers of nuclear weapons. When I read the passage quoted above, I had to pause and talk to the empty room in lieu of having the article’s author present for a vigorous lapel-shake. I want to ask him: what advantage, precisely, does a 2,000:100 ratio of TACTICAL NUCLEAR WEAPONS grant you? Tactical nuclear weapons range from the tens of kilotons to the hundreds; to put that in perspective, the bomb that destroyed Hiroshima was thirteen kilotons. So the United States’ paltry tactical nuclear capability currently in Europe, considered conservatively, has the capacity of one hundred Hiroshimas. But once those are used up, presumably in a back-and-forth with Russia, Putin can deliver 1,900 more!
Of course, that there could ever be such an exchange—that the initial use of any nuclear weapon, no matter how relatively modest in yield, would not in itself be a world-changing event—is absurd on its face. The relative size of the arsenals would be instantly irrelevant. In the best case scenario, everything comes to a crashing halt as the world looks on in horror and heaps recriminations on the perpetrator. In the worst case scenario, sadly the more likely, the initial use of tactical nuclear weapons rapidly escalates to a large-scale exchange in weapons measured not in kilotons but megatons.
Any attempt to euphemize or elide the singular horror of nuclear weapons needs to be met, at the very least, with the mocking spectre of Buck Turgidson (George C. Scott), the trigger-happy Air Force general from Stanley Kubrick’s masterpiece of black comedy Dr. Strangelove, or How I Learned to Stop Worrying and Love the Bomb (1964): when the President (Peter Sellers) responds to Turgidson’s exhortation to follow through on an unplanned nuclear strike on the Soviet Union, “You’re talking about mass murder, General, not war!” Turgidson says, “Mr. President, I’m not saying we wouldn’t get our hair mussed! But I do say no more than 10 to 20 million killed, tops! Uh, depending on the breaks.”
It’s distressing, especially in the present moment, to come across in one of the United States’ major newspapers such an ostensibly reasonable and rational discussion of an invention that is everything but reasonable and rational. Martin Amis’ essay “Thinkability,” the introduction to his 1987 collection of short stories about nuclear weapons and nuclear war, Einstein’s Monsters, addresses precisely the fallacy of trying to make the unthinkable thinkable, and the ways in which the attempt invariably tortures the language used:
It is gratifying in a way that all military-industrial writing about nuclear “options” should be instantly denatured by the nature of the weapons it describes, as if language itself were refusing to cooperate with such notions. (In this sense language is a lot more fastidious than reality, which has doggedly accepted the antireality of the nuclear age.) In the can-do world of nuclear “conflict management,” we hear talk of retaliating first; in this world, deaths in the lower tens of millions are called acceptable; in this world, hostile, provocative, destabilizing nuclear weapons are aimed at nuclear weapons (counterforce), while peaceful, defensive, security-conscious nuclear weapons (there they languish, adorably pouting) are aimed at cities (countervalue). In this world, opponents of the current reality are known as cranks. “Deceptive basing modes,” “dense pack groupings,” “baseline terminal defense,” “the Football” (i.e., the Button), acronyms like BAMBI, SAINTS, PALS, and AWDREY (Atomic Weapons Detection, Recognition, and Estimation of Yield), “the Jedi concept” (near-lightspeed plasma weapons), “Star Wars” itself: these locutions take you out onto the sports field—or back to the nursery.
Reading Amis’ essay anew is a good reminder of the absurd rhetorical lengths the national security apparatus went to (and presumably still does in a more limited fashion) to make the use—and indeed, the very existence—of nuclear weapons seem reasonable.
They are not reasonable. Frighteningly, it doesn’t seem as though Vladimir Putin is reasonable at this moment in time either. But it’s not his numerical advantage in tactical nukes that makes me lose sleep—it’s that he might consider using even one, of any size.
1. When I started writing this post with precisely this sentence, I proceeded to digress into a discussion of the interview and the difficulty of abstracting from the forty-five-minute interview a sentence or two that best sums up the value of an education in the humanities. I went on for about three paragraphs, realized I was writing a different blog post, and opened a new Word document to start over. Look forward to a post in the near future in which I go on at length about the humanities.
2. Any argument for the humanities rooted in the idea that it invariably fosters empathy and morality needs to remember the Ivy League pedigrees of “the best and brightest” of John F. Kennedy’s and then Lyndon B. Johnson’s cabinets, whose intellectual arrogance—emerging from educations at Harvard, Yale, et al. that would have required them to read the Great Books—precipitated and then escalated the United States’ war in Vietnam.
Anybody who reads this blog knows I post sporadically at best. I go through some periods of great energy, and then this space can lie fallow for months at a time. Which isn’t to say I don’t frequently have ideas for posts: it’s more a question of whether the idea that pops into my head is something I can stick the landing on. I have a folder on my desktop full of half-written posts that I’ve either lost the thread on, couldn’t make work to my satisfaction, or simply abandoned for some shiny distraction; by the time I think about returning to the post in progress, it is no longer timely.
This year was interesting: I blogged 35 times, which is not a lot, and my posts were largely clustered in the first half of the year. I had a fair bit of momentum coming out of 2020, and was propelled through January and February by events (most notably the assault on the Capitol and Biden’s inauguration). I posted eight times in January, which is a lot for me; then my output dropped by two posts each month until April (with two posts). May and June saw five and six posts, respectively, in part because I was being ambitious and attempting to produce several posts on a handful of themes. That tapered off in July … and then nothing until November (once), and a single December post.
I’m never entirely sure why the well goes dry on a fairly regular basis, though as I said, often it’s not so much about the writing as about the finishing (I still have sitting on my desktop a post about galactic empires that I do want to finish). Sometimes it’s reflective of how productive I’m being otherwise, but not always; sometimes my blog is a useful procrastinatory device, something that makes me feel productive when I should be directing my energies elsewhere.
At any rate, I thought I might do something I’ve seen other blogs do, which is a year in review with a list of the best/most read/favourite posts. Given that my readership here is pretty tiny, it would be a bit silly to list my most popular posts. So I’m going with my personal favourites: which is to say, the posts I was proudest of, and which I felt managed to get closest to the thoughts that spawned them.
I’m going with my top five, though I’m not ranking them, just listing them in chronological order.
It’s a little odd that, of these top five, two of them deal with the topic of monarchy. This post is about how the American republican system of government—developed specifically as a revolt against the tyrannical British crown—has ironically ended up imbuing the American chief executive with more king-like qualities than the prime minister in a parliamentary system. This contradiction, I point out, had become all the more glaring in the Age of Trump, whose authoritarian tendencies exacerbated the monarchical elements of the Office of the President.
I will also note that I posted this entry the day before the January 6 insurgency.
As I have noted many times on this blog, I wrote my doctoral dissertation on conspiracy and paranoia in postwar American fiction, film, and popular culture. Through the successive waves of Trutherism, Birtherism, Glenn Beck’s chalkboard rants, and the various paranoid fantasies spun by Tea Partiers, I have thought about dusting off the thesis … and am overcome with the sensation of probing the nerve of a tooth.
QAnon, as it does with everything, ratchets that up to eleven. And yet I found myself writing this post, which I think does a pretty good job of breaking down the important elements. And I’ve watched the HBO docuseries Q: Into the Storm and read the book The Storm is Upon Us by Mike Rothschild … and now I find myself writing about it for another project.
It makes me feel like Al Pacino in Godfather III: “Every time I try to get OUT … they keep dragging me back IN.”
My second of two posts on monarchy was prompted, perhaps counter-intuitively, by my profound indifference to the various plights of the British royal family. The occasion of this particular bout of indifference was the fallout from Meghan Windsor née Markle’s interview with Oprah Winfrey.
But if you’re so indifferent, why go to the effort of writing a blog post, you may ask? Well, imaginary interlocutor, I started pondering precisely why hereditary monarchy has such a powerful hold on the contemporary imagination. And that led me down a rabbit hole of thought that proved quite interesting.
Also, it gave me the opportunity to write the sentence “Piers Morgan was the result of a lab experiment in which a group of scientists got together to create the most perfect distillation of an asshole.” So there’s that, too.
I made two attempts at sustained deep dives into large topics this summer. The first was “History, Memory, and Forgetting,” and the other was a series of posts revisiting the concept of postmodernism (“Remembering Postmodernism”). I have to say, I was very pleased with what I produced on both fronts, and annoyed with myself that the postmodernism series got bogged down and remains unfinished. (It’s for that reason that “Remembering Postmodernism” is not represented here).
To be honest, I think all three of my “History, Memory, and Forgetting” posts deserve a place here, but because they’re sort of a unit, I’ll settle for the one I’m proudest of, which is about how the erosion of memory about the Holocaust—through time, distraction, and the death of survivors—has denuded historical awareness and created a present situation in which such terms as “Nazi” and “fascist” have lost meaning in the popular imagination.
Early this summer, I was alerted to a backlash against the Tolkien Society’s Summer Seminar—an annual conference in which academics present papers on a theme chosen by the society. Past themes have included Tolkien and Landscape, the Art of Tolkien’s World, and so on. This year? Tolkien and Diversity. Which prompted a not-unpredictable backlash in conservative circles, especially after the paper titles were posted. Though it hardly reached “critical race theory” levels of vitriol, there was an awful lot of angry talk about the “woke mob” coming to tear down J.R.R. Tolkien. Though most of this happened on message boards and social media, it did reach the lofty heights of the National Review.
A few days after I wrote this post, I attended the (virtual) Tolkien Seminar conference and watched every single paper. Guess what? Tolkien’s legacy survived. And guess what else? Everyone presenting at that conference loves Tolkien. No one wanted to tear him down. The very thought would horrify them. The biggest fallacy of “anti-woke” thought—which, really, stretches back to the culture wars of the 90s and Harold Bloom’s castigation of what he called “the school of resentment”—is the idea that people who challenge the traditional canons of art and literature and offer feminist, queer, anti-colonialist, anti-racist, or other such readings are doing so because they hate and resent the genius of such canonical writers as Shakespeare, Milton, or Wordsworth (an irony here being that Tolkien has never been included in “the Western Canon”). While there may be genuinely antagonistic readings of classic authors, most of the time people—like those at the Tolkien seminar—are finding spaces in their work in which they see themselves reflected.
And in the end, it’s a testament to Tolkien’s genius that queer graduate students can find themselves in the work of an ultraconservative Catholic. To those who lambasted the “Tolkien and Diversity” seminar, I ask: how is that a bad thing?
Happy new year, everyone. Did you all have a good New Year’s Eve? Stephanie and I celebrated by ordering pizza and watching Don’t Look Up on Netflix—a film that is at once so hilarious and so depressing that I found myself wondering whether watching it on the last day of 2021 was a terrible idea or entirely appropriate … though I suppose it could be both.
I hate New Year’s Eve as a night of celebration, and I always have—even when I was younger and more disposed to party like it’s 1999. A good friend of mine had the perfect summary of why NYE is so terrible. There are two days a year, he said, when you feel the most societal pressure to enjoy yourself: your birthday and NYE. Provided you have friends and/or family who love you, birthdays are great fun because they’re all about you. On New Year’s Eve, by contrast, it’s everybody’s birthday. Which is possibly why the more excessive the celebrations, the more they smack of desperation.
Normally I would have prepared a nice meal as a reluctant nod to mark the day, but I just got back from my parents’ place in Ontario five days ago, which means I’ve been quarantining. Which also means I can’t leave the house until tomorrow to buy groceries, and while we don’t lack for the basics, there isn’t much with which to make anything more than, well, the basics.
So, pizza. Which, given my antipathy to this particular holiday, seemed even more appropriate than anything requiring effort on my part. It might have to become the new custom.
I’ve been seeing a lot of 2021-related memes on social media, most of which involved (1) WTF? and (2) warnings not to jinx the coming year with high expectations, which we did at the start of 2021. But when you think about it, we’ve been exiting the year with a snarl and a backward-facing middle finger since … well, 2016, haven’t we? Which makes many of us want to blame Donald Trump for this series of successively sucky years, but if we haven’t yet collectively understood that he’s not the architect of our societal woes, just the bellwether, then things are gonna keep sliding downhill.
I mean … they probably will anyway, but the big reason for 2021’s unreasonably high expectations was the tacit assumption that with Trump out of office, things would inevitably get better.
And for a time, it looked like they were! And then … well, I was about to say that reality reasserted itself, but that wasn’t the biggest problem with this past year, was it? It would be more accurate to say that unreality reasserted itself. The Big Lie, anti-vaxxers, the hysteria over “critical race theory” and other bogus culture war non-issues, January 6 trutherism, and of course the ongoing state of climate-change denial. Reality has never been the problem, except insofar as accessing it without having to run a gauntlet of disinformation is now more or less impossible.
This profound sense of frustration and disconnect is why Don’t Look Up has landed so hard on its viewers. I laughed throughout the film, because it’s hard not to—it is comedy, based in the wilful obliviousness and ignorance of people so self-interested, so elementally selfish, that they are congenitally incapable of recognizing the very idea of a collective good. Many times through the film, I found myself thinking, “Wow, this is unsubtle.” But then … so is Tucker Carlson. So is QAnon. So is Donald J. Trump, and so are his legions of enablers and imitators. We live, sad to say, in profoundly unsubtle times.
Or as Stephanie put it midway through the movie, “It’s really depressing when there’s no daylight between satire and the thing it’s satirizing.”
There’s a moment that happens on occasion in American football, when a running back, having been handed the ball, breaks through all the defenders and sprints into the open field, with nothing between him and the endzone.
And sometimes, rather than making a touchdown, he gets blindsided by a linebacker he didn’t see coming.
That’s what 2021 has been like, especially these past few months.
I was vaccinated in June, and then again in August. We resumed in-person classes at Memorial University in September. Like many people, I felt a massive sense of relief—the pandemic finally seemed to be heading into the rearview mirror, and things were returning to something resembling normalcy. Except … yeah, not so much. I’m writing this at my parents’ house on Christmas day, but we won’t be properly celebrating the holiday until tomorrow because my niece and nephew were possibly exposed to COVID on a school bus, and thus need to quarantine for another day. I will myself have to quarantine for five days on returning to St. John’s.
We should be in the clear, running to the endzone with open field, but we keep getting tackled.
This is what I told all my students earlier this term: I wanted them to know it wasn’t just them, it was a general malaise that I was also experiencing. I told them they weren’t alone in feeling anxious, discomfited, or just generally off. I told them that if it came down to it, if they were struggling to get their classwork done, there was no shame in dropping my course … or any other course they were taking. I told them that I would do whatever I could to help, provided they kept me in the loop. You don’t need to give me the gory details, I said—just tell me you’re having a rough go of it, and that’s enough. We’ll work something out.
I have to imagine there are those who will say I’m catering to the fragility of the snowflake generation in making such accommodations. Anyone having that thought can fuck right off. What saved me from breakdown these past few months was the fact that I had such extraordinary students: as I said to all my classes at the term’s end, they’re the reason why I have hope for the world. As a GenXer, irony is my default setting; for those inheriting the catastrophes of climate change and resurgent fascism, earnestness is their baseline. What would have been radically uncool in the 1990s is quite possibly what will save us all.
On Christmas day, I find myself thinking about all the best parts of the past year. My students are, as they often are, Exhibit A.
I taught a graduate seminar in the winter term that, in spite of the fact that it happened on Zoom, was hands down my favourite class ever. It was called “The Spectre of Catastrophe,” and looked at 21st-century post-apocalyptic narratives. Though the subject was perhaps a little too on the nose for our current situation, it was the most fun I’ve ever had as a teacher (and that’s saying a lot, as my following comments will make clear). The one advantage Zoom classes have over in-person teaching is the comments students can write as the lecture/discussion unfolds. In this particular class, there was always an ongoing conversation scrolling up the side. I used the ten-minute breaks each hour in our three-hour classes to read through the lively and often hilarious discussions happening in parallel to the actual class.
Getting back into the classroom this past September felt so very, very good. It didn’t hurt that I was teaching a roster of courses that were, for me, an embarrassment of riches: a first-year class called “Imagined Places,” which gave me the (gleeful) chance to teach Tolkien (The Hobbit), Ursula K. Le Guin (A Wizard of Earthsea), Sir Terry Pratchett (Small Gods), Neil Gaiman (The Ocean at the End of the Lane), and Jo Walton (The Just City) (for the record, everybody needs to read Jo Walton). I also taught, for the third time, “Reading The Lord of the Rings.” I had five, count ‘em FIVE, students in LOTR who were also in my fourth-year seminar “The American Weird: Lovecraft, Race, and Genre as Metaphor.” (I always measure my success as an educator by the number of students who take multiple classes with me. I assume it means I’m doing something right, though it’s also possible they’ve just got my number and know what it takes to get a decent grade). Given that the Tolkien and the Lovecraft courses were back-to-back, I had something like an entourage following me from the chemistry building in which I taught LOTR to the Arts building that was the home of the weird.
I called my Lovecraft students “weirdos,” because, well, obviously. It only offended them the first time I called them that.
Even given the fact that I was teaching a dream slate of classes, this fall semester was hard. I am congenitally incapable of asking for help, or for that matter recognizing that I need help until after the fact; these past few days of decompressing at my parents’ home have been invaluable in offering relaxation and reflection. I have also realized I need to find my way to a therapist in 2022.
But the moral of the story is that the students I worked with this past year kept me sane, and gave me hope. So on this day of prayer (even for those as atheistic as myself) I am grateful for you all. And given how many of you have signed up for yet more Lockett pontification next term, I can only assume I’m doing something right.
Or you’ve got my number. Either way, it’s all good.
Wow, it’s been a minute. Four months and no posts is … well, as my hiatuses from this blog go, it’s fairly standard.
I’ve been thinking a lot lately about science fiction, which I suppose is hardly unusual for me. To be more specific, I’ve found that my SF thoughts have been lining up of late with contemporary events and situations, enough so that I’ve been writing them down in an attempt to give my inchoate thoughts some coherent shape. This post, which is very long, is one of two I’ve been pecking away at for about two weeks.
And again—this is quite long, but I didn’t want to chop it up. I have however divided it into sections, so maybe read it in stages if you don’t feel like slogging through it all at once.
The Books that Shape Us
An internet sage once said: “There are two novels that can change a bookish fourteen-year old’s life: The Lord of the Rings and Atlas Shrugged. One is a childish fantasy that often engenders a lifelong obsession with its unbelievable heroes, leading to an emotionally stunted, socially crippled adulthood, unable to deal with the real world. The other, of course, involves orcs.”
I’ve wondered on occasion about the cult of Ayn Rand among those whom she most venerated, billionaires and multi-millionaires (and the politicians who have built careers catering to the needs of the über-wealthy); I wonder if it was reading Rand’s odes to selfishness as “bookish fourteen-year olds” that set them on the path of rapacious capitalist accumulation, or whether Atlas Shrugged came, post facto, to provide a useful pseudo-intellectual justification for why they should have vast wealth while 99%+ of people do not.
The question is rhetorical, of course, even perhaps somewhat disingenuous: while I have little doubt that some people did in fact find their life’s mission revealed to them in the pages of Rand’s writing at an early age (Ted Cruz springs to mind as an obvious candidate), the people at the extremes of that opposition are probably few and far between. I know a significant number of people who devoured The Fountainhead and Atlas Shrugged as teens, who later discovered that what they found compelling about the novels when they were young was completely out of step with their adult sensibilities. That of course is not an uncommon experience with certain works of literature more generally: I read The Catcher in the Rye at the age of sixteen; given that Salinger’s novel is scientifically calibrated to resonate with angsty teenage boys, I read it in a single feverish sitting, but doubt I could tolerate Holden Caulfield for more than a few pages today. By contrast, I was a bookish twelve-year-old when I read The Lord of the Rings, and it did indeed change my life—it remains a core text in my life, as evidenced by the fact that I am currently teaching my department’s Tolkien course for the third time.1
I recently shared an anecdote from Neil Gaiman with my first-year English students. In a lecture delivered at The Reading Agency in 2013, Gaiman said,
I was in China in 2007, at the first party-approved science fiction and fantasy convention in Chinese history. And at one point I took a top official aside and asked him Why? SF had been disapproved of for a long time. What had changed? It’s simple, he told me. The Chinese were brilliant at making things if other people brought them the plans. But they did not innovate and they did not invent. They did not imagine. So they sent a delegation to the US, to Apple, to Microsoft, to Google, and they asked the people there who were inventing the future about themselves. And they found that all of them had read science fiction when they were boys or girls.
To be clear, reading SF when you’re young doesn’t necessarily translate into becoming a tech genius or innovator (sometimes it means you become an English professor); what Gaiman is advocating in his lecture is reading promiscuously, and especially making available for children as wide a range of reading options as possible: “The simplest way to make sure that we raise literate children is to teach them to read, and to show them that reading is a pleasurable activity. And that means, at its simplest, finding books that they enjoy, giving them access to those books, and letting them read them.” He cites the story from his participation in the SF/F convention in China not to suggest that SF makes you innovative, but that the kind of creative imagination exemplified by SF and a multiplicity of other genres—which facilitates an imaginative engagement with the impossible and the unreal—is a precondition for inventiveness.
As with Rand’s objectivist fiction, however, there’s always a danger in reading stuff prescriptively. I’ve been thinking about this story of Gaiman’s a lot lately, though not so much in terms of the positive sense of it I communicated to my students. I have, rather, been watching as a trio of billionaires seem stuck in what I can only characterize as a sort of arrested development corresponding to certain eras of SF—two of them speaking confidently about colonizing Mars and the third apparently determined to build cyberspace—and none of them seem to have truly understood the SF paradigms they’re trying to make real.
Let’s take them in turn. Both Jeff Bezos and Elon Musk have been quite enthusiastic about the prospect of settling humans on other planets, starting with Mars. (I’m leaving Richard Branson out of this consideration because his spacefaring ambitions seem more in the line of having the coolest, fastest new vehicle, something made obvious by the sleek aesthetics of Virgin Galactic’s Unity spacecraft vs. Blue Origin’s phallic homage to Dr. Evil’s rocket).
Musk recently tweeted that his ultimate intention is “to get humanity to Mars and preserve the light of consciousness.”
Bezos is also ultimately determined to make it to Mars (though he’s asserted that returning to the moon is a necessary first step), and has reportedly been fascinated by space travel all his life—as valedictorian in high school, he ended his address with the line “Space: the final frontier. Meet me there.” His love of Star Trek, as evidenced in those words, is something he’s frequently expressed, most recently when he shot ninety-year-old William Shatner into space.
It’s hardly unusual for people to love works of art that articulate values at odds with their own—as much as I love LOTR, for example, I’m hardly down with Tolkien’s conservative Catholicism as it emerges in the novel’s pages—but sometimes the dissonance can be jarring. Consider, for example, when former Republican Speaker of the House and Ayn Rand enthusiast Paul Ryan named Rage Against the Machine as his favourite band (which always raises the question: against which machine did you think Zack de la Rocha and Tom Morello were raging, Paul? Spoiler alert, it’s you). It’s unsurprising that a tech guy like Bezos would love Star Trek, but it’s also hard not to see a comparable dissonance between Gene Roddenberry’s utopian, egalitarian vision of the future and the rapacious business ethos embodied by Amazon.
It makes me wonder what the Trek-loving Bezos made of the Star Trek: The Next Generation episode in which the Enterprise comes across a 21st century satellite (somehow dislodged from Earth’s orbit and adrift in deep space) containing cryonically frozen people with terminal illnesses. They are unfrozen and their illnesses cured with 24th century medical technology, which of course had been the point of being frozen, kept in stasis until the day they could be treated. One of the unfrozen people is a stereotypical Wall Street douchenozzle who initially gloats over the success of his plan to cheat death—only possible, he notes, because of his considerable wealth—but is horrified to learn that the long-term investments he’d expected to cash in on would now be nonexistent. Captain Picard tells him, “A lot has changed in the past three hundred years. People are no longer obsessed with the accumulation of things. We have eliminated hunger, want, the need for possessions. We’ve grown out of our infancy” (ep. 1.25, “The Neutral Zone”).
Even in its grittier iterations, Star Trek always retains the germ of Roddenberry’s utopianism, born in the American mid-century before postwar optimism would truly start to fracture under the repeated shocks of political assassinations, the Vietnam War, and the cynical politics of the Nixon administration. “The final frontier” was a specific echo of John F. Kennedy’s “New Frontier” rhetoric; and while there is no lack of cynicism or dystopian predictions in the SF of the 1950s and 60s, the broader trends were more hopeful: one reads the SF of that period and finds, if not always optimism necessarily, then certainly a prevalent sense of the possibilities created by advanced technology … and, as with Star Trek, a not-uncommon faith that such technological advances will be matched by humanity’s moral maturation.
Undoubtedly, apologists for the Musk/Bezos space race would tell me that just because the billionaires are the contemporary embodiment of being “obsessed with the accumulation of things,” it doesn’t obviate the altruism of their visionary space-faring ambitions; that the groundwork laid by these early, seemingly trivial joyrides to the edge of space will be crucial for future larger-scale space travel; that governments have failed in such endeavours, so it falls to the billionaire class to carry humanity into the next stage of our development; almost certainly the Musk/Bezos devotees (especially the Muskovites) would tell me I’m a small-minded person resentful and jealous of their genius as manifested in their vast wealth. Possibly Bezos himself would say something to the effect of how he loves the utopianism of Star Trek and hopes for such a future, but to bring it about he needs to play by the rules of the world as it is today.
It’s possible that some of these things might end up being true; I suppose it’s even possible that my conviction that the very fact of billionaires’ existence in societies where children go hungry is a moral obscenity simply masks my own true desire to accumulate wealth on a cosmic scale. Possible, but unlikely; it strikes me that Bezos and Musk and their cheering sections might have science-fictional enthusiasms about colonizing space, but don’t seem to have paid much attention to the transformations in SF’s preoccupations. While we don’t lack for contemporary SF depicting the spread of humanity through space and the colonization of far-flung planets, the utopian spirit and assumptions of the mid-century have diminished rather dramatically. SF has become darker in its outlook—dystopian figurations are more pervasive, especially those imagining post-apocalyptic futures; increasingly, visions of the future have become tempered by the current environmental realities of the climate crisis and the Anthropocene; and even those imagined futures featuring the colonization of other planets largely eschew the tacit utopian assumptions of Star Trek, in which humanity’s expansion into the solar system and beyond entails progressing beyond the factious politics of the present. More and more, space-focused SF grapples specifically with the stubborn realities not just of the vast distances involved in space travel, but of human nature as well (I’m also currently writing a longish post about how, in the present moment, the most unbelievable aspect of much SF is the idea that any given planet could have one-government rule, never mind a galaxy-spanning empire or federation). A good example of this is the Expanse novels by James S.A. Corey, which have been adapted into an excellent television series—ironically enough—by Amazon.
In the future envisioned by The Expanse, humanity has colonized the Moon, Mars, and a bunch of moons and asteroids collectively known as “the Belt.” One of the things both the novels and the series do well is convey the sense of scale, the enormous distances involved just in our local solar system. Space travel is punishing, inflicting potentially stroke-inducing Gs on travelers; people born and raised in the low gravity of Mars or the even lower gravity of the Belt cannot endure the gravity of Earth; resentments, hatreds, and regional prejudices mark the attitudes of Earthers, Martians, and Belters toward each other; and Earth’s imperial history plays out again in its exploitation of the Belt’s resources. Plus ça change, plus c’est la même chose.
The impossible-to-comprehend vastness of space is central to Kim Stanley Robinson’s Aurora, about a generation ship’s three-century journey to the nearest possibly, hopefully, hospitable planet (spoiler alert: it isn’t). More germane to Musk and Bezos’ Mars ambitions is Andy Weir’s novel The Martian, popularized by the film adaptation starring Matt Damon. When botanist Mark Watney is stranded on Mars, his only way to survive is—in the now-famous line—“to science the shit out of this.” And as he notes, his margin for error is nonexistent: “If the oxygenator breaks down, I’ll suffocate. If the water reclaimer breaks down, I’ll die of thirst. If the hab[itat] breaches, I’ll just kind of implode. If none of those things happen, I’ll eventually run out of food and starve to death.”
The novels of Robinson and Weir are exemplars of “extrapolative” SF, which is to say: SF that is scrupulous in its science, hewing closely to the possible and plausible. What is interesting in the context of what I’m talking about is a shift in focus—The Martian isn’t a trenchant argument against space exploration, but it also eschews an imaginative leap ahead to a Mars in the early stages of terraforming, with safe habitats for thousands of colonists to live in, instead focusing on the granular problems involved with even beginning to think about such a project. Again, the Musk/Bezos contingent would certainly say that of course we’d have to start small, that it would be harsh and difficult for a long time, but that the billionaire visionaries are the ones laying the groundwork.
To which I say: read the room. Or better yet, read the SF of the past twenty years. Really, just scan the spines in the SF section of a decent bookstore and count the number of novels from the past two decades that are dystopian, post-apocalyptic, or which, like John Scalzi’s recent “Interdependency” trilogy (The Collapsing Empire, The Consuming Fire, The Last Emperox) are essentially allegories about the current climate crisis. Then do a quick calculation and see how much of SF’s current market share comprises such narratives. Because it’s not as though dystopian stories and post-apocalyptic SF are at all new;2 they’re just more prevalent than ever,3 reflecting a zeitgeist more preoccupied with the depredations of modernity and technology than with their capacity for continued expansion.
All of which is by way of saying, if you want the TL;DR: this planet is our home and it would be great to fix the problems we have with it before seriously thinking of colonizing others—not least because the colonization of Mars, even if it proves possible, is the work not of years but centuries, centuries we simply don’t have. The resources Musk and Bezos are bringing to bear on their respective dreams of space travel would be much more profitably devoted to developing, among other things, green energy. Mars ain’t going anywhere.
But then, even billionaires are mortal. It’s possible to see their current competition less as mere dick-measuring (though it is that, to be certain) than as a race to satisfy their dreams of space travel while they still draw breath (in this respect, one wonders if sending ninety-year-old William Shatner into space was not just an homage to Bezos’ favourite SF, but a cynical calculation on the part of the fifty-seven-year-old billionaire to reassure himself he’s still got at least three decades of spacefaring years). Elon Musk has said he wants to die on Mars; as with many of his pronouncements, it’s not at all obvious if he’s serious or just taking the piss, but it does cast his ostensible determination, as quoted above, “to get humanity to Mars and preserve the light of consciousness,” in a somewhat more fatalistic light—it seems to suggest that the “light of consciousness” isn’t likely to survive on Earth, and must therefore be seeded elsewhere. It’s difficult not to interpret these words in the context of the broader trend of doomsday prepping among the super-rich that has seen multi-millionaires and billionaires buying remote property on which to build bunkers, and stockpiling food, supplies, guns, and ammunition against the anticipated collapse of civilization. At the time of writing, Elon Musk was accounted the wealthiest man in the world, with his fortune exceeding $300B; it makes a sort of perverse sense that, as status-bunkers go, having one on an entirely different planet would win the game.
I am still stumped, however, by what Musk means by “light of consciousness.” Is it him? Does he mean to be found, eons from now, cryonically frozen like the people on The Next Generation by an alien species, the last remaining human? Because it seems to me that the light of human consciousness is best carried into the future by teeming billions living on a planet saved by their collective effort. Sending the light of consciousness to Mars is more akin to the final words of Margaret Atwood’s short fiction “Time Capsule Found on a Dead Planet”: “You who have come here from some distant world, to this dry lakeshore and this cairn, and to this cylinder of brass, in which on the last day of all our recorded days I place our final words: Pray for us, who once, too, thought we could fly.”
… or, from outer space to inner space.
I tried to watch Mark Zuckerberg’s ninety-minute infomercial4 in which he announced the launch of the “metaverse” and Facebook’s rebranding as “Meta,” and lasted all of about five minutes. I had no expectation of getting through the entire thing—nor of seriously attempting to—but you’d have thought I could have endured more. But apparently not … I didn’t know whether to laugh or cringe, so I did a fair bit of both. It was just so damned earnest, and it baffled me how this putative tech genius could talk so enthusiastically and unironically about this new utopian virtuality, with no acknowledgment of (1) the dystopian shitshow that is the current state of social media generally and Facebook specifically, and (2) the fact that there has yet to be a depiction of virtual reality in fiction that is not even more dystopian than Facebook IRL.
The word “disingenuous” doesn’t even scratch the surface. “Sociopathically delusional” might be closer to the mark.
The next day I subjected my long-suffering first-year students to an impromptu rant about how SF authors in the 80s and 90s had anticipated all of this—specifically, a digital reality embedded in a broader condition in which massive transnational corporations have more power than most nation-states, wealth disparity is a vast and unbridgeable chasm, industry has irrevocably poisoned the environment, and the broader populace is distracted with the soporific of digital entertainment and misinformation—and this was why everybody needed to take more English courses (don’t ever tell me I don’t shill for my own department). I noted that the very term “metaverse” was used by SF writer Neal Stephenson to describe the online virtual environment in his 1992 novel Snow Crash; I also noted that depictions of virtual reality appeared long before computers were ubiquitous, such as in Philip K. Dick’s 1969 novel Ubik, and that William Gibson wrote Neuromancer (1984), which became the definitive cyberpunk novel, on a manual typewriter without ever having owned a personal computer.
And I emphasized the point I made above: that in no instance were any of these depictions anything other than dystopian.
But then, it’s sadly unsurprising that the man who not only invented Facebook, but who for all intents and purposes is Facebook, should forge ahead with the arrogant certainty that he knows better than all the people who envisioned the metaverse long before it was a technical possibility. If Musk and Bezos’ arrested development is stuck sometime in the SF of the 1960s, Zuckerberg’s is in the 80s—but with apparently even less understanding of the material than his billionaire bros.
I could go on with all the elements of cyberpunk Zuckerberg obviously doesn’t get, but I think you get the point … and besides which, my main concern here is with the way in which the choice to rebrand Facebook as “Meta” is perfectly emblematic of Zuckerberg’s apparently congenital inability to engage in self-reflection. It is also, not to put too fine a point on it, what irritates me so profoundly: I have what you might characterize as a professional beef with this rebranding, as it entails the flagrant misuse of a prefix that is crucial to contemporary, and especially postmodernist, literature and culture.
What is “meta-”? Notably, in his infomercial and the letter posted to Facebook announcing the rebranding, Zuckerberg says that “meta” is from the Greek and means “beyond.” This much is true: the second entry in the OED’s definition of meta- is “beyond, above, at a higher level.” Zuckerberg goes on to say that, for him, it “symbolizes that there is always more to build, and there is always a next chapter to the story.” Except that this is not at all the sense in which meta- is used, not even colloquially. It is not uncommon to hear people say things like “that’s so meta!” or “can we get meta about this?” In both cases, what is meant is not a sense of taking things further, but of reflection and reconsideration. “That’s so meta!” is precisely the kind of thing you might expect to hear from someone watching a movie like Kiss Kiss Bang Bang, a noirish murder mystery that constantly draws attention to the tropes, conventions, and clichés of noirish murder mystery, and which further features Robert Downey Jr.’s character frequently breaking the fourth wall to remind us that, yes, we’re watching a noirish murder mystery movie. The suggestion “can we get meta about this?” usually precedes a discussion not of the issue at hand, but rather of the things surrounding, informing, framing, or giving rise to the issue at hand.
In other words, as the next line in the OED definition tells us, meta- is typically “prefixed to the name of a subject or discipline to denote another which deals with ulterior issues in the same field, or which raises questions about the nature of the original discipline and its methods, procedures, and assumptions.” Hence, metafiction is fiction about fiction, metacriticism is criticism about criticism, and Kiss Kiss Bang Bang—whose title is taken from a disparaging comment by film critic Pauline Kael about what audiences want from movies—is metacinema (or metanoir, if you want to be more specific). And while the practice of being endlessly self-referential can grow tiresome,5 my point is more that meta- is, literally by definition, introspective.
Something Mark Zuckerberg manifestly is not. To be meta- about “Meta” would not involve plunging forward with a tone-deaf and bloody-minded ninety-minute infomercial promising to double down on Facebook’s putative mission of “bringing people together.” Rather, one might reflect upon the assertion that “We’re still the company that designs technology around people,” and wonder instead if all the internal studies establishing that Instagram is toxic to adolescent girls’ body-image and that Facebook facilitates atrocities and props up dictatorships are perhaps more indicative of technology shaping people’s behaviour. Perhaps all that SF from the 80s and 90s might help in this process?
1. I didn’t read anything by Ayn Rand until I was in grad school. When I finally did read her, it was in part out of curiosity, but more significantly because by then I’d had the basic tenets of objectivism explained to me in some detail; I found the entire philosophy abhorrent, as you might imagine, but also figured that were I ever to find myself in an argument with a libertarian about the relative merits of her fiction, I should at least know more precisely what I was talking about. So first I read The Fountainhead, which was a fairly quick read and the more engaging of her two most notable novels. I completely understood its appeal to adolescent readers, especially precocious teens convinced of their own rightness and unsparing in their hatred of anything reeking of compromise and “selling out.” (Thinking about the novel now, it strikes me as possessing a notably modernist sensibility, if not style: architect Howard Roark’s ultimate determination to demolish his building rather than allow “lesser” architectural minds to taint the putative genius of his design reminds me of nothing so much as the unfortunately common tendency among scholars of modernism to express categorical disdain for anything they consider unworthy of study).
Atlas Shrugged was more of a slog, and I often wondered as I read it (in fits and starts over several weeks) at how feverishly its devotees tore through its 1000+ pages, and then obsessively reread it over and over. It’s not that it’s a difficult read—it’s certainly not particularly complex, nor is the prose dense or opaque—I just had difficulty caring about the characters and their stories. I’ve nodded sympathetically many, many times when people have confessed to me that Tolkien’s digressions into the history of Middle-earth lost them, but the purplest of passages in LOTR ain’t got nothin’ on John Galt’s sixty-page-long paean to the moral virtues of selfishness and greed.
A number of years ago, as Breaking Bad was airing its final episodes, I hit on an idea for an article about how that series functioned in part as a sustained and trenchant critique of Ayn Rand. I was fairly deep into drafting it when I hit a wall—specifically, the resigned realization that, if I was going to make a serious run at writing it, I’d have to reread Atlas Shrugged. And, well, that was a bridge too far … though I did write a blog post sketching out the general argument.
2. The critical consensus is that Mary Shelley’s Frankenstein (1818) was the first SF novel, and Frankenstein’s nothing if not a dystopian warning about the unthinking and unethical use of technology. Eight years later she published The Last Man, a post-apocalyptic story set in the midst of a deadly pandemic, so it’s safe to say she was ahead of her time.
3. The corollary to this upswing in post-apocalyptic SF is the proliferation of post-apocalyptic narratives that don’t quite qualify as SF, the biggest and most obvious example being the critical mass of zombie apocalypse stories in fiction, film, television, and games. The corollary of this corollary is the concomitant prevalence of fantasy, with which zombie apocalypse shares DNA insofar as it posits a return to a premodern state of existence. If much SF is dystopian and articulates a bleak outlook, the zombies and the dragons are a kind of nostalgia for a world stripped of the modernity that went so very wrong.
4. Yes, yes, it was Facebook’s “virtual connect” conference … but really, it was an infomercial. A long, tedious, cringeworthy infomercial.
5. I recently watched the new Netflix film Red Notice, which was entirely forgettable, in part because of just how meta- it was. The biggest eye-roll for me was a scene in which the FBI agent and the master thief played, respectively, by Dwayne “The Rock” Johnson and Ryan Reynolds, are searching an arms dealer’s antiquities collection for a certain ancient Egyptian treasure that precipitates the action of the film. “Where is it?” mutters The Rock, to which Reynolds responds, “Check for a box labelled ‘MacGuffin’.”
HOUSEKEEPING NOTE: There has been an unwanted lag in this series of posts, mainly because I’ve been struggling with what was supposed to be part three. Struggling in a good way! The TL;DR on it is that it occurred to me that a consideration of the artistic and literary responses to the two world wars respectively offers a useful insight into key elements of modernism and postmodernism. What initially seemed a straightforward, even simple breakdown has proved (not unpredictably, I now see) a lot more complex but also a lot more interesting. What I ultimately post may end up being the more straightforward version or possibly a two-part, lengthier consideration. One way or another, I’m quite enjoying going down this particular rabbit hole.
So in the meantime, in the interests of not letting this series lag too much, I’m leapfrogging to what was to have been part four.
My doctoral dissertation was on conspiracy theory and paranoia in postmodern American literature and culture. Towards the end of my defense, one of my examiners—a modernism scholar with a Wildean talent for aphoristic wit—asked “Is modernism postmodernism’s conspiracy theory?”
I should pause to note that I thoroughly enjoyed my thesis defense. This was largely because my examiners really liked my thesis, and so instead of being the pressure cooker these academic rites of passage can sometimes be, it was three hours of animated and lively discussion that the defense chair had to bring to a halt with some exasperation when we ran long. For all that, however, it was still three hours of intense scholarly back-and-forth, and so when my examiner asked about modernism as postmodernism’s conspiracy theory, my tank was nearly empty. I found myself wishing—and indeed gave voice to the thought—that the question had been asked in the first hour rather than the third, when I could have done it justice. I still kick myself to this day for not prefacing my response with a Simpsons reference (something the questioner would have appreciated), Reverend Lovejoy’s line “Short answer, yes with an if, long answer, no with a but,” and then getting into the intricacies and implications of the question. As it was, I seem to remember saying something insightful like “Um … sure?”
I’ve thought about that moment many times in the seventeen years (yikes) since I defended, mostly out of fondness for the questioner, who was and remains a friend, as well as annoyance with myself for not giving the question the answer it deserved (now that I think of it, perhaps I should include that in this series). But over the past few years, I’ve thought of it more in the context of how postmodernism itself—and its related but widely misunderstood concept “cultural Marxism”—have come to be treated as essentially conspiratorial in nature.
As I’ve alluded to in previous posts, there is a (mis)understanding of postmodernism among anti-woke culture warriors as something specifically created by Leftists for the purpose of attacking, undermining, and destroying the edifices of Western civilization, as variously manifested in Enlightenment thought, the U.S. constitution, the traditional Western literary canon, manly men, the virtues of European imperialism, and so on. Postmodernity, rather than being the upshot of unchecked corporate capitalism and consumer culture, is seen instead as being the specific invention of resentful and closeted Marxist academics.
Of late, which is to say over the past five years, the most vocal purveyor of this conspiracy theory has been Jordan B. Peterson and his figuration of “postmodern neo-Marxism.” Anyone who knows even the slightest thing about either postmodernism or Marxism understands that, in this formulation, it is the word “neo” doing the heavy lifting—which perhaps betrays at least a slight understanding on Peterson’s part that there can be no such thing as “postmodern Marxism,” as the two terms are very nearly antithetical. Marxism is a modernist philosophy rooted in Enlightenment thought; what I’ve been loosely calling “postmodern thought,” which is to say the loose categorization of theories and philosophy arising largely out of the need to make sense of the postmodern condition, is generally antagonistic to the instrumental reason of the Enlightenment and such totalizing ideologies as Marxism, taking its philosophical leads instead from Friedrich Nietzsche, Edmund Husserl, and Martin Heidegger. So if you’re going to attempt to conflate Marxism with postmodernism, it’s going to have to be very neo- indeed.
Peterson’s basic premise is that the “two architects of the postmodernist movement”—specifically, the French theorists Michel Foucault and Jacques Derrida—were themselves dyed-in-the-wool Marxists; but that when they were making their academic bones in the 1960s, they were faced with the unavoidable failure of Communism as a political force. In one of his ubiquitous YouTube lectures, he declares that “in the late 60s and early 70s, they were avowed Marxists, way, way after anyone with any shred of ethical decency had stopped being Marxist.”
The “postmodernists,” Peterson continues, “knew they were pretty much done with pushing their classic Marxism by the late 60s and the early 70s,” because the evidence of Stalin’s atrocities was by then so unavoidable that carrying on under the Marxist banner was untenable.
This assertion, as it happens, is laughably untrue: the Communist Party of France reliably garnered twenty percent of the legislative vote through the 1960s and 70s. There was no shortage of people, inside the university and out, who were “avowed” Marxists. There was really no reason Derrida and Foucault—who incidentally hated each other, so were hardly co-conspirators—would have been compelled to disguise their Marxist convictions. If indeed they had any: it is an irony that Peterson can make this argument in part because neither Derrida nor Foucault were avowed Marxists. They have, indeed, often been looked upon with suspicion by Marxist scholars, and frequently castigated (Derrida especially) for precisely the reasons I cited above: namely, that they eschewed the principles of Marx’s teleological philosophy and an extrinsic historical order. They were, to coin an expression, a little too “postmodern” for Marxists’ tastes.
Though Peterson is by no means voicing an original idea—the charge that “cultural Marxists” comprise a shadowy cabal of professors seeking to destroy Western civilization was first articulated in the early 1990s—he does imbue his attack with his own uniquely greasy brand of ad hominem logic familiar to anyone who has taken issue with his many transphobic screeds. See if you can spot the code words:
Foucault in particular, who was an outcast and a bitter one, and a suicidal one, and through his entire life did everything he possibly could with his staggering I.Q. to figure out every treacherous way possible to undermine the structure that wouldn’t accept him in all his peculiarity—and it’s no wonder, because there’d be no way of making a structure that could possibly function if it was composed of people as peculiar, bitter, and resentful as Michel Foucault.
Michel Foucault, for those unfamiliar with him, was queer; much of his work was preoccupied with the ways in which people marginalized and stigmatized by their sexuality were policed by society, and the ways in which that policing—that exercise of power—was effected discursively through the designations of mental illness (Madness and Civilization, 1961), the disciplining of society via surveillance (Discipline and Punish, 1975), and the ways in which the categorization of sexual identities exemplifies the normative determination of the self (The History of Sexuality, four volumes, 1976, 1984, 1984, and 2018).
Peterson’s characterization of Foucault is, in this respect, frankly vile—as is his description of Foucault when he first introduces him into his discussion: “A more reprehensible individual you could hardly ever discover, or even dream up, no matter how twisted your imagination.” His repetition of the word “peculiar” is an obvious dog-whistle, and he damns Foucault for being an “outcast,” “bitter,” and “suicidal,” as if Foucault’s “outcast” status as a queer man with a galaxy-sized brain was somehow a character flaw rather than a function of the strictures of a society he understandably took umbrage at. Peterson might be the psychologist here, but I do sense a certain amount of animosity and revulsion that is not entirely directed at Foucault’s philosophy.
One way or another, the charge here is that Foucault and Derrida effectively invented postmodernism as a means of sublimating their doctrinaire Marxism into something more insidious and invidious, which would burrow into university humanities departments like a virus; still speaking of Foucault, Peterson says, “In any case, he did put his brain to work trying to figure out (a) how to resurrect Marxism under a new guise, let’s say, and (b) how to justify the fact that it wasn’t his problem that he was an outsider, it was actually everyone else’s problem.” Some fifty-odd years later, goes the Peterson narrative—which, again, is not specific to him, but prevalent among his fellow-travelers on the so-called “intellectual Dark Web”— we’re dealing with the harvest of what Derrida and Foucault sowed in the form of Black Lives Matter and critical race theory, trans people asserting their right to their preferred pronouns, cancel culture, and the general upending of what cishet white men perceive as the natural order of things.
We can argue over how we got to this moment in history, but the idea that the postmodern condition—or whatever we want to call the present moment—was orchestrated by a handful of resentful French intellectuals should be relegated to the same place of shame as most conspiracy theories. What I’ve been attempting to argue in this series of posts is that while thinkers like Foucault and Derrida have indeed profoundly influenced postmodern thought, they are not—nor are any of their acolytes—responsible for the cultural conditions of postmodernity more broadly. They have, rather, attempted to develop vocabularies that can describe what we’ve come to call postmodernity.
 The substance of Peterson’s conspiracy theory of postmodernism I quote here is from an invited lecture he delivered at the University of Wisconsin in 2017, posted to YouTube (the relevant bit starts at 29:30).
 Though Peterson acknowledges that Foucault and Derrida aren’t the only two masterminds of postmodern thought, they are, to the best of my knowledge, the only two he ever really talks about. I find it odd that, as a professor of psychology and practicing psychologist, he never (again, to the best of my knowledge) deals with the work of psychoanalyst Jacques Lacan, whose poststructuralist adaptations of Freud are almost as influential as the work of Foucault and Derrida. Given how consistently he gets wrong the basic premises of Foucault and Derrida—but especially Foucault—the absence of Lacan from his diatribes strikes me as further evidence of the poverty of his understanding of the very issues he addresses.
 Derrida never tipped his ideological hand one way or another until his book Specters of Marx (1994), which he wrote in the aftermath of the collapse of the Soviet Union and the putative death of Communism. In this book he does his deconstructivist schtick, playing around with the trope of the “spectre,” talking a lot about the ghost of Hamlet’s father, and sort of admitting “Yeah, I was always a Marxist.” Some Marxist thinkers were overjoyed; many more were decidedly unimpressed, deriding him as a Johnny-come-lately only willing to assume the Marxist mantle as he sat among what he assumed was its ruins. In an essay bitterly titled “Marxism without Marxism,” Terry Eagleton wrote: “it is hard to resist asking, plaintively, where was Jacques Derrida when we needed him, in the long dark night of Reagan-Thatcher,” and went on to say, “there is something rich … about this sudden dramatic somersault onto a stalled bandwagon. For Specters of Marx doesn’t just want to catch up with Marxism; it wants to outleft it by claiming that deconstruction was all along a radicalized version of the creed.”
 For those familiar with Peterson, this is consonant with his worldview, especially with respect to his anti-transgender animus. He is an unreconstructed Jungian, which is to say he believes fervently in the biological imperatives of mythology—that all of our stories and narratives, our societal customs and traditions, are dictated by our most elemental relationships to nature. Hence his weird grafting of pseudo-Darwinian evolutionism onto myths of all stripes, from the Bible to ancient Egypt to the Greeks, and how these innate understandings manifest themselves in the popularity of Disney princesses or the necessary disciplinary presence of the bully Nelson in The Simpsons (seriously). The bottom line is that Peterson’s worldview is predicated on a sense of the innate, immutable nature of gender, gender roles, hierarchies, and the individual as the hero of his own story. Feminism, in this perspective, is a basic betrayal of human nature; people identifying as a gender other than what their genitalia dictate? Well, that’s just beyond the pale.
 The opacity of those vocabularies, especially with regard to Derrida and Lacan, is another example of just how complex the postmodern condition is. Old joke from grad school: What do you get when you cross the Godfather with a poststructuralist? Someone who’ll make you an offer you can’t understand.
[PROGRAMMING NOTE: I have not abandoned my “Remembering Postmodernism” series—I’m just having problems putting the next installment into a coherent form that doesn’t run into Tolstoy-length rambling. In the interim, please enjoy this follow-up to my Tolkien and the culture wars post.]
I do so love serendipity. I would go so far as to say it is what most frequently inspires me in my research and writing. Unfortunately, that tends to make my primary research methodology not at all unlike the dog from Up.
This fall, I’ll be teaching a fourth-year seminar on the American Weird. In the original version of the course, which I taught three years ago, we looked at H.P. Lovecraft’s influence on American horror and gothic, and we considered as well how contemporary texts engaged specifically with Lovecraft’s hella racist tendencies—specifically with regard to Victor LaValle’s novella The Ballad of Black Tom, a retelling of Lovecraft’s story “The Horror at Red Hook” (one of his most explicitly racist stories) from the perspective of a Black protagonist, and Matt Ruff’s recently published Lovecraft Country.
Since then, Lovecraft Country was adapted to television by HBO under the imprimatur of Jordan Peele, and the magnificent N.K. Jemisin published an overtly Lovecraftian novel The City We Became. So this time around, we’re going all in on the issue of race and genre in “the American Weird,” not least because the current furor over “critical race theory” in the U.S. makes such a course at least a little timely.
But of course you’ve read the title of my post and might be wondering how this is serendipitous with the Tolkien Society’s recent seminar. What on earth does Tolkien have to do with Lovecraft, or questions of race and identity in American literature?
It has to do with the issue of genre and the ways in which genre has come to be both a metaphor and a delineation of community, belonging, and exclusion. Genre has always been about drawing lines and borders: as Neil Gaiman has noted, genre is about letting you know what aisles of the bookstore not to walk down. But in the present moment, we’ve become more alert to this exclusionary function of genre, something that Lovecraft Country—both the novel and the series—took as its principal conceit, taking H.P. Lovecraft’s racist legacy and redeploying it to interrogate the ways in which genre can be opened up to other voices, other identities.
The greatest joy of participating in this Tolkien Society seminar was seeing precisely this dynamic in action—in seeing how an international and diverse cadre of Tolkien scholars found themselves within his works. As I said in my previous post, the backlash against the very idea of a “Tolkien and Diversity” conference employed the rhetoric of invasion and destruction. A post to a Tolkien message board epitomizes quite nicely the zero-sum sensibility I discussed:
The very first paper of the Tolkien Society seminar this weekend was titled “Gondor in Transition: A Brief Introduction to Transgender Realities in The Lord of the Rings.” As you might imagine, the very idea that there could be transgender characters in LOTR has been anathema to the various people slagging the seminar over social media. Tolkien’s world view did not allow for trans identity! has been one of the squawks heard from complainants. Leaving aside the fact that gender fluidity in Norse myth is something Tolkien would have been quite familiar with, this sort of complaint misses the point—which is that Tolkien’s work is voluminous enough to admit a multitude of readings he almost certainly did not have in mind as he composed his mythology.
I wish I could offer a useful precis of the paper, which was quite well done, but I was distracted by Zoom’s chat function and the avalanche of commentary scrolling up the side of my screen as I listened to the presentation—many of the comments being from trans and non-binary Tolkien enthusiasts expressing gratitude for a paper that made them feel seen in the context of Tolkien fandom and scholarship. This was actually the general tone of the two-day seminar: not people who, as the conference’s detractors have charged, look to destroy or desecrate Tolkien, but people who love his mythology and want to see themselves reflected within it. And who in the process demonstrated the capaciousness of Tolkien’s vision—one not limited, as the fellow cited above suggests, to the rigid circumscription of conservative Catholicism.
Genre is an interesting thing—less, from my perspective, for its more esoteric delineations within literary theory and criticism than for its blunter and cruder demarcations in popular culture. This latter understanding is where the real action has been for the past ten or twenty years, as the barriers walling off genre from the realms of art and literature have crumbled—with the advent of prestige television taking on the mob film, the cop procedural, and the western in The Sopranos, The Wire, and Deadwood, respectively; then proceeding to make fantasy and zombie apocalypse respectable with Game of Thrones and The Walking Dead; all while such “literary” authors as Cormac McCarthy, Margaret Atwood, Kazuo Ishiguro, and Colson Whitehead made their own forays into genre with The Road, Oryx and Crake, The Buried Giant, and Zone One, just as “genre” authors like Neil Gaiman, Alan Moore, Alison Bechdel and others attained “literary” reputations.
But as those walls have come down, so too have those genre enthusiasts of years past grown resentful of all the new people moving into their neighbourhoods. As I mentioned in my previous post, both the Gamergaters and Sad Puppies articulated a sense of incursion and loss—women and people of colour and queer folk invading what had previously been predominantly white male spaces. Like those attacking the Tolkien and Diversity seminar, they spoke in apocalyptic terms of things being ruined and destroyed and desecrated—SF/F would never be the same, they bleated, if these people had their way. What will become of our beloved space operas and Joseph Campbell-esque fantasy?
Well, nothing. They will still be there—fantasy in a neo-African setting rather than a neo-European one doesn’t obviate J.R.R. Tolkien or C.S. Lewis or any of the myriad authors they influenced. They’re still there, on your bookshelf or at the library or your local bookstore. However much we might use the metaphor of territory, it’s flawed: in the physical world, territory is finite; in the intellectual world, it’s infinite. Tolkien’s world contains multitudes, and can admit those people who see queer themes and characters and want to interrogate the possible hermaphroditic qualities of dwarves and orcs without displacing all its staunchly conservative readers.
We interrupt your regularly scheduled deep dive into postmodernism for a brief diversion into the latest anti-woke tempest in a teacup.
CORRECTION: when I started writing this post, I meant it as a “brief diversion.” It has grown in the writing and become decidedly non-brief. One might even call it very long. My apologies.
The Tolkien Society was founded in 1969 with J.R.R. Tolkien himself as its president; like most scholarly societies, it has hosted annual conferences at which people present papers exploring aspects of Tolkien’s and related works, and generally mingle to share their passion and enthusiasm for the subject. And like most such conferences, each year tends to have a different theme—looking over the society’s web page listing past conferences, we see such previous themes as “21st Century Receptions of Tolkien,” “Adapting Tolkien,” “Tolkien the Pagan?”, “Life, Death, and Immortality,” “Tolkien’s Landscapes,” and so on … you get the idea.
This year’s conference, to be hosted online on July 3-4, is on “Tolkien and Diversity.”
If this blog post were a podcast, I would here insert a needle-scratch sound effect to indicate the sudden, sputtering outrage of all the anti-woke culture warriors who, fresh from their jeremiads about Dr. Seuss and perorations about critical race theory, are now charging that the woke Left has come for Tolkien because these SJWs will not rest until they have destroyed everything that is great and good about white Western civilization. Because, you know, a two-day conference in which people discuss issues of race, gender, and sexuality with regard to Tolkien’s work will destroy The Lord of the Rings.
If that all sounds snarky and hyperbolic, well, I’ll cheerfully concede to the snark, but I’m not overstating the reaction. Survey the Twitter threads on the subject, and you’ll see the phrase “Tolkien spinning in his grave”1 an awful lot, as well as the frequent charge—made with varying degrees of profanity and varying familiarity with grammar—that all of these people who want to talk about race or sexual identity in Tolkien are doing it because they hate Tolkien and seek to destroy this great author and his legacy. Well, you might protest, that’s just Twitter, which is an unmitigated cesspool; and while you wouldn’t be wrong about Twitter, these reactions exist on a continuum with more prominent conservative outrage. The default setting for these reactions is the conviction that people engaging with such “woke” discourse as critical race theory, or queer studies, or feminism for that matter, do so destructively—that their aim is to take something wholesome and good and tear it down, whether that be Western civilization or your favourite genre fiction.
I shouldn’t have to argue that this Tolkien conference, with its grand total of sixteen presenters, isn’t about to do irreparable damage to Tolkien’s standing or his legacy. Nor would reading some or all of the papers ruin LotR or The Hobbit for you. The idea that it might—that this is some form of desecration that indelibly sullies the Great Man’s work—is frankly bizarre. It is, however, a kind of thinking that has come to infect both fandom and politics. Six years ago I wrote a blog post about the Hugo awards and the conservative insurgency calling itself the “Sad Puppies,” who were attempting to hijack the nominating process to exclude authors and works they believed to be “message fiction” (“woke” not yet having entered the lexicon at that point), i.e. SF/F more preoccupied with social justice and identity politics than with good old-fashioned space opera and fur-clad barbarians. If I may be so gauche as to quote myself, I argued then what I’m arguing now:
… namely, that the introduction of new voices and new perspectives, some of which you find not to your taste, entails the wholesale destruction of what you love, whether it be gaming or SF/F. Anita Sarkeesian produced a handful of video essays critiquing the representation of women in video games. As such critiques go, they were pretty mild—mainly just taking images from a slew of games and letting them speak for themselves. Given the vitriol with which her videos were met, you’d be forgiven for thinking she’d advocated dictatorial censorship of the gaming industry, incarceration of the game creators, and fines to be levied on those who played them. But of course she didn’t—she just suggested that we be aware of the often unsubtle misogyny of many video games, that perhaps this was something that should be curtailed in the future, and further that the gaming industry would do well to produce more games that female gamers—an ever-growing demographic—would find amenable.
This is the same kind of thinking that had people wailing that the all-women reboot of Ghostbusters somehow retroactively ruined their childhoods, or that The Last Jedi was a thermonuclear bomb detonated at the heart of all of Star Wars.2 It’s the logic that leads people to sign a petition to remake the final season of Game of Thrones with different writers, as if such a thing were even remotely feasible.
By contrast, while I loved Peter Jackson’s LotR trilogy, I thought that his adaptation of The Hobbit was an unmitigated trio of shit sandwiches—not least because it displayed precisely none of the respect Jackson had previously shown for the source material. And yet, aside from inevitable bitching about how terrible the films were, when I teach The Hobbit in my first-year class this autumn, I do not expect to be hobbled by the trauma of having seen a work I love treated so terribly. The Hobbit will remain The Hobbit—and nothing Peter Jackson can do on screen or an overeager grad student can do at a conference will change that.
Conservatives love to harp on about the entitled Left and “snowflake” culture, but the current culture war pearl-clutching—when it isn’t merely performative posturing by Republicans looking to gin up their base (though also when it is that)—comprises the very kind of entitlement they ascribe to the people they’re vilifying. When the Sad Puppies try to crowd out authors of colour and queer voices at the Hugos; when Gamergaters attack female gamers; when a Twitter mob drives Leslie Jones off social media in retaliation for her role in the “female” Ghostbusters; when Anita Sarkeesian has to cancel an invited talk due to bomb threats; or when an innocuous conference on diversity in Tolkien—no different from the dozens that have been held on “queering Shakespeare” or masculine fragility in Moby Dick—excites the ire of those declaring that such an abomination somehow desecrates Tolkien’s legacy, what is being articulated is an innate sense of ownership. This is mine, the reactionaries are saying. How dare you trespass on my territory. The zero-sum sensibility is one in which any change, any new voice, any new perspective that is not consonant with the way things have always been, is perceived as a loss. Anything challenging the mythos built up around an idea—be it the acknowledgement of systemic racism in America’s history or misogynist subtext in the Narnia Chronicles—is a betrayal of that mythos.
But we should also talk more specifically about the conference itself, as well as the counter-conference quickly thrown together by “The Society of Tolkien,” a group formed specifically in defiance of the incursion of “wokeness” into Tolkien studies.3
Ideally, I want to put this in perspective, and make clear why those people losing their shit really need to do a deep knee bend and take a fucking breath.
To be completely honest, this entire kerfuffle over “woke” Tolkien might never have shown up on my radar had I not been tagged in a general invitation to join my friend Craig and his partner Dani’s weekly online “office hours.” Craig and Dani are both professors in Philadelphia; they host discussions about a variety of topics, mostly dealing with American conservatism and anti-racism, and they focused this particular discussion on the backlash against the Tolkien and Diversity conference. Among those joining were members of the Tolkien Society, including some people presenting at the upcoming conference.
It was a lovely discussion, and since then I’ve been inducted into the Alliance of Arda—which is to say, I requested to join the Facebook group, and was accepted. There has been, as you can well imagine, a lot of discussion about the “Tolkien and Diversity” conference, particularly in regard to the backlash against it. This blog post came together in part because of my participation in those discussions.
And before I go further, I just need to make something clear: these lovely people who are enthusiastically participating in this conference are not resentful SJWs looking to take Tolkien down a peg or five. They are not haters. They do not want to cancel Tolkien. These are people who love Tolkien and his works. They would be as distressed as their detractors if Tolkien went out of print or was eliminated from curricula.
But they are looking to expand the ways in which we read and understand Tolkien, and in a zero-sum perspective … well, see above.
The original call for papers (CFP) for the conference is prefaced as follows:
While interest in the topic of diversity has steadily grown within Tolkien research, it is now receiving more critical attention than ever before. Spurred by recent interpretations of Tolkien’s creations and the cast list of the upcoming Amazon show The Lord of the Rings, it is crucial we discuss the theme of diversity in relation to Tolkien. How do adaptations of Tolkien’s works (from film and art to music) open a discourse on diversity within Tolkien’s works and his place within modern society? Beyond his secondary-world, diversity further encompasses Tolkien’s readership and how his texts exist within the primary world. Who is reading Tolkien? How is he understood around the globe? How may these new readings enrich current perspectives on Tolkien?
For those unfamiliar with academic conferences, this is pretty standard stuff, and is followed by a non-exhaustive list of possible topics participants might pursue:
Representation in Tolkien’s works (race, gender, sexuality, disability, class, religion, age etc.)
Tolkien’s approach to colonialism and post-colonialism
Adaptations of Tolkien’s works
Diversity and representation in Tolkien academia and readership
Identity within Tolkien’s works
Alterity in Tolkien’s works
Again, this is pretty standard fare for a CFP. Nor is it at all outlandish in terms of the topics being suggested. Speaking as both a professional academic and someone who first read Tolkien thirty-seven years ago, I can think of a half-dozen ways into each of those suggestions without really breaking a sweat.
But of course, the CFP was only the amuse-bouche for the backlash. The real meal was made of the paper titles. Here’s a sample:
“Gondor in Transition: A Brief Introduction to Transgender Realities in The Lord of the Rings”
“The Problem of Pain: Portraying Physical Disability in the Fantasy of J. R. R. Tolkien”
“The Problematic Perimeters of Elrond Half-elven and Ronald English-Catholic”
“Pardoning Saruman?: The Queer in Tolkien’s The Lord of the Rings”
“Queer Atheists, Agnostics, and Animists, Oh, My!”
“Stars Less Strange: An Analysis of Fanfiction and Representation within the Tolkien Fan Community”
None of these seem at all beyond the pale to me, but then again I’m a professor and have presented numerous conference papers at numerous conferences, and these kinds of titles are pretty standard. I can see where a Lord of the Rings enthusiast who never subjected themselves to grad school might think these titles absurd. True fact: academic paper titles are easily mocked, no matter the discipline, as they always employ a specialized language that tends to be opaque to those not versed in it.4
Case in point: I’m reasonably convinced that when former Globe and Mail contrarian and plagiarist Margaret Wente was at a loss for what aspect of liberal thought she would cherry-pick to vilify, she would comb the list of projects granted fellowships or scholarships by the Social Sciences and Humanities Research Council of Canada (SSHRC), select the most arcane and convoluted project titles receiving taxpayer dollars, and write one of her “And THIS is the state of academia today!” columns.
Emblematic of the Wente treatment was an article in the National Review by Bradley J. Birzer, a history professor and author of J. R. R. Tolkien’s Sanctifying Myth (2002). Writing about the upcoming Tolkien Society conference, he says,
While I have yet to read the papers and know only the titles for reference—some of which are so obscure and obtuse that I remain in a state of some confusion—let’s, for a moment, consider “Pardoning Saruman? The Queer in Tolkien’s The Lord of the Rings.” In what way is Saruman, an incarnate Maia angel, sent by the Valar to do good in Middle-earth (Saruman really fails at this), queer? Is he in love with himself? True, with his immense ego, he might very well have been. Is he in love with Orthanc? Perhaps, but there is nothing in the text to support this. Is he in love with Radagast the Brown? No, he considers him a fool. Is he in love with Gandalf the Grey? No, he’s jealous of Gandalf and had been from their first arrival in Middle-earth. Is he in love with his bred Orcs? Wow, this would be twisted. Is he in love with Wormtongue? If so, nothing will come of it, for the lecherous Wormtongue has a leering and creepy gaze only for Eowyn. And, so, I remain baffled by all of this. Nothing about a queer Saruman seems to make sense.
Let’s begin with his admission that he hasn’t read any of the conference papers—which is fair, considering that, as I write this post, the conference is several days away—meaning that if any of the presenters are at all like me, they’re just starting to write their papers now. But … do you really want to dive in so deeply based solely on a title? There was a story frequently told of a professor I knew at Western University, my alma mater: in answer to a question he’d asked, a student raised his hand and said, “Well, I haven’t read the book yet, but …” and then went on confidently for several minutes while the professor nodded along. When the student stopped talking, the professor paused, and said, “Yeah. Read the book.”
Birzer’s choose-your-own-adventure speculation on what the paper might be about is no doubt entertaining to his readers who think, like him, that the whole concept of a queer Saruman is absurd, full stop. But here’s the thing: you don’t know. You don’t know what the paper is going to argue. You don’t even know if, despite the fact that he’s the only named character in the title, the paper has anything to say about Saruman’s queerness. Leaving aside for a moment the fact that considering someone a fool or being jealous of them doesn’t obviate the possibility of sexual desire for them, or the idea that demi-gods are somehow asexual (the Greeks would have something to say about that), perhaps consider that the paper isn’t about specific characters being queer? The pre-colon title, after all, is “Pardoning Saruman.” If you handed me this title in some sort of conference-paper-writing reality TV show, in which I have a few hours to write an essay conforming to the title, my approach would actually be to talk about the mercy both Gandalf and Frodo show Saruman, as something subverting a masculinist approach to punitive practices. Queerness, after all, isn’t merely about sex, but about a constellation of world-views at odds with heteronormativity; that Birzer reduces it to a question of who Saruman wants to fuck reflects the impoverishment of his argument.
But who knows? Will the paper be good? Perhaps, though that actually doesn’t matter much. Academic conferences are not about presenting perfect, ironclad arguments; the best papers are imperfect and lacking, because those are the ones that excite the most discussion. Conference papers tend to be works in progress, embryonic ideas that the presenters throw out into the world to test their viability. At its best, academia is an ongoing argument, always trying out new and weird ideas. Some fall flat; some flourish.
At its worst, academia becomes staid and resistant to change. Which brings me to the “Society of Tolkien” and its counter-conference. Their CFP reads as follows: “When J.R.R. Tolkien created Middle-earth, he filled it with characters, themes, and dangers that leapt from the pages to intrigue, excite, and give hope to his readers. In these sessions, we’ll explore these concepts to celebrate all that makes his works stand the test of time and what we should take from them today.” Or, to translate: no theme at all. Suggested topics include:
Analysis of characters, situations, and linguistics in the books
Military doctrine and tactics portrayed in the books or movies
Themes, lessons, and allegories drawn from or used by Tolkien
Works influenced by Tolkien’s writing
Works which influenced Tolkien’s writing
So, in other words: stuff we’ve seen before numerous times. But what is slightly more interesting is the supplemental list of topics that will not be considered for the conference:
Concepts not included in Tolkien’s writing
General foolishness
The Black Speech of Mordor
Speaking as someone who has attended numerous academic conferences, I can attest to the fact that “general foolishness” is always welcome to break up the monotony. As to “concepts not included in Tolkien’s writing,” that’s pretty difficult territory to police. As one of my new Tolkien friends points out, the word “queer” appears in both The Hobbit and LotR multiple times—not, perhaps, meant in the way we mean it today, but still—where precisely does one draw the line?
I’m particularly intrigued by the prohibition against “The Black Speech of Mordor.” Because … why? Are they concerned, as were the attendees at the Council of Elrond when Gandalf spoke the rhyme on the One Ring in the language of Mordor, that the very sound of the language is corrupting? Do the people of the “Society of Tolkien” want to prohibit any presentation that might end up being sympathetic to Sauron? Or are they, as I joked in Arda’s comment section, worried that a paper about “Black Speech” could be a Trojan horse sneaking in critical race theory to the conference?
I said it as a joke, but I’m by no means convinced that that was not the reasoning behind the prohibition.
When I was enrolling for my fourth year of undergraduate study, I saw a course in the calendar on Tolkien and C.S. Lewis. I signed up unhesitatingly: it was, as I joked at the time, the one course at the university where I could probably walk into the final exam without ever having been to class and still get an A. I did however have a little bit of anxiety: I had first read LotR when I was twelve, and reread it several times through my teenage years. I did not reread it for the first three and a half years of my undergrad, so inundated was I by all the new texts and authors and ideas that saturated my days. I was concerned, on returning to this novel that had taught me so profoundly how literature can have affect—how it can transform you on a seemingly molecular level—that I might now find LotR deficient or less than it was. Since starting my English degree, I’d had that transformative experience again numerous times: Annie Dillard and Toni Morrison, Gabriel García Márquez and George Eliot, Yeats and Eliot and Adrienne Rich, Salman Rushdie and Isabel Allende, Tom Stoppard and Bertolt Brecht and Ibsen, Woolf and Joyce and Faulkner, to say nothing of a raft of literary theory and criticism from Plato and Aristotle to Derrida and de Man. How would Tolkien measure up? And how, having taken classes on postcolonial literature and women’s literature, would I react to Tolkien’s more problematic depictions of race and gender?
Here’s the thing: I needn’t have worried. Yes, all the problematic stuff was there, but so was the deeper complexity of Tolkien’s world-building, the nuanced friendship of Sam and Frodo, the insoluble problem of Smeagol/Gollum, as well as Tolkien’s bravura descriptive prose that I appreciated in a way I could not when I was a callow teenager. There was also my greater appreciation of Tolkien the person. My professor posed a question to the class: why do you think Tolkien—scholar and translator of Norse and Teutonic sagas featuring such indomitable warriors as Beowulf—why did he, when he wrote his great epic, make diminutive halflings his heroes? Heroes who, she continued, proved doughtier than such Teutonic-adjacent heroes as Boromir?
Could it be, she suggested, that he was annoyed by the deployment of Norse and Teutonic myth by a certain 1930s dictator to bolster the fiction of a master race?
She then read to us Tolkien’s response to a Berlin publishing house that was interested in publishing a German translation of The Hobbit but wanted confirmation of Tolkien’s Aryan bona fides before proceeding. Tolkien’s response remains one of the most professorial go fuck yourselves I’ve ever encountered:
Thank you for your letter. I regret that I am not clear as to what you intend by arisch. I am not of Aryan extraction: that is Indo-Iranian; as far as I am aware none of my ancestors spoke Hindustani, Persian, Gypsy, or any related dialects. But if I am to understand that you are enquiring whether I am of Jewish origin, I can only reply that I regret that I appear to have no ancestors of that gifted people. My great-great-grandfather came to England in the eighteenth century from Germany: the main part of my descent is therefore purely English, and I am an English subject — which should be sufficient. I have been accustomed, nonetheless, to regard my German name with pride, and continued to do so throughout the period of the late regrettable war, in which I served in the English army. I cannot, however, forbear to comment that if impertinent and irrelevant inquiries of this sort are to become the rule in matters of literature, then the time is not far distant when a German name will no longer be a source of pride.
Your enquiry is doubtless made in order to comply with the laws of your own country, but that this should be held to apply to the subjects of another state would be improper, even if it had (as it has not) any bearing whatsoever on the merits of my work or its suitability for publication, of which you appear to have satisfied yourselves without reference to my Abstammung [ancestry].
All of which is by way of saying: Tolkien was a product of his nation and his era. He was a complex person, and would almost certainly blanch at the Tolkien Society’s current roster of conference papers.
But then, he’d probably blanch at many of the ones in years past too.
1. Anyone even halfway familiar with J.R.R. Tolkien would tell you that he most likely started spinning in his grave the moment he was interred and hasn’t paused much since. Some authors are generally indifferent to people’s interpretations of their work and are happy to let fans, academics, and adaptors have their own ideas; others retain a proprietary sense of their work, and will respond to negative reviews or interpretations they think miss the mark of what had been intended, as well as being antagonistic to the idea of someone else adapting their work to stage or screen. Tolkien was quite firmly in the latter category, often engaging in lengthy correspondence with fans in which he corrected their understanding of The Lord of the Rings with his own, especially with regard to allegorical or symbolic readings (with few exceptions, Tolkien adamantly maintained that there was no symbolism or allegory in LotR, and no correspondence to contemporary world-historical events). He was also quite bearish on the idea of adapting The Lord of the Rings to film, even though there was an American project in the works in 1958; in his letters, Tolkien communicates his antipathy quite clearly. One can only imagine what he would have thought of the allusions to Middle-earth in the music of Rush and Led Zeppelin, or Ralph Bakshi’s vaguely hallucinogenic animated film (much beloved by the guardians of “classic” SF/F). Whether he would have appreciated Peter Jackson’s LotR trilogy is less certain, though the dilatory and overstuffed debacle of The Hobbit was surely responsible for seismic events in the vicinity of Tolkien’s grave. Compared to Radagast the Brown’s rabbit sled, King Thranduil’s war moose, Legolas constantly defying the laws of physics, and the dwarves’ battle with Smaug inside the Lonely Mountain (I could go on), a paper titled “‘Something Mighty Queer’: Destabilizing Cishetero Amatonormativity in the Works of Tolkien” is small beer.
2. That there is so much hate levied against The Last Jedi, while Revenge of the Sith apparently gets a pass, baffles me.
3. As one commenter observed, the fact that some people split off to form “The Society of Tolkien” is such pure Monty Python that it truly puts the absurdity of all this into context.
4. And it is by no means limited to academia. On the day I wrote this post, I also drove my partner Stephanie, who is a talented guitarist, out to Reed Music in Mount Pearl so she could purchase a Fender Telecaster. Unlike Steph, I am utterly unmusical. As we drove, I idly asked her what the difference was between a Telecaster and a Stratocaster. She then proceeded, very animatedly, to descend into guitarist-speak for the next ten minutes. After a certain point, she looked at me and said, “You didn’t understand any of that, did you?” To which I replied, “No, about ten seconds in, you started to sound like the adults in Charlie Brown specials.” But as I write this, she’s in the next room playing “Sultans of Swing” on her new guitar, and, really, that’s a language I speak just fine.
And we’re back! To reiterate my caveat from my previous post, these discussions of postmodernism are filtered through my lens as an Americanist. Postmodernism is a global phenomenon, and while it is conditioned by American cultural and economic hegemony, its manifestations are various and culturally specific around the world. A significant element of the post-WWII world was the collapse of empires and the fraught process of decolonization, as well as the rise of the neo-imperialism of corporate capitalism. What we’ve come to call postcolonialism isn’t synonymous with postmodernism by any means, but on that Venn diagram there is more than a little overlap.
I reiterate this caveat in part because this post is probably the one in this series most specific to U.S. history. I have thus far offered a broad introduction to postmodernism and outlined my specific understanding of what I consider its most basic unifying element; what I want to do in this post is provide some historical context for the material circumstances that gave rise to the postmodern condition.
Are you sitting comfortably? Then we’ll begin.
Whenever I teach my second-year class on American Literature After 1945, I always begin the semester with a lecture in which I contextualize, historically, where the United States was at the end of the Second World War. And I begin with a question: where do you think the U.S. ranked globally in terms of the size and strength of its military in 1939? My students, to their credit, are always wary at this question—they know I wouldn’t ask it if the answer was obvious. So, obviously not #1 … but they’ve all grown up in a world (as indeed I have) in which American military might is vast, and the Pentagon’s budget is larger, at last accounting, than those of the next fourteen countries (twelve of which are allies) combined. So some brave soul will usually hazard a guess at … #5? No, I reply. Someone more audacious might suggest #10. But again, no. In 1939, I tell them, the U.S. ranked #19 globally in its military’s size and strength, with an army that was smaller than Portugal’s.
The point of this exercise, as you’ve probably gleaned, is to describe how the U.S. went from being a middling world power to a superpower in the course of six years—and how that fundamentally changed U.S. society and culture, and in the process reshaped the world.
There’s an episode of The West Wing from season three that centers on President Bartlet’s State of the Union Address; at one point speechwriter Sam Seaborn (Rob Lowe) talks about how a president’s words can inspire a nation to great achievements, citing FDR’s SOTU in 1940 in which he predicted that the country would build 50,000 aircraft by the end of the war—a prediction that proved, if anything, an underestimate, as the U.S. actually built over 100,000.
While Roosevelt’s shrewd leadership of the U.S. through WWII should not be discounted, Sam’s historical factoid has more to do with propping up the Aaron Sorkin Great Man Theory of History: Oratorical Edition™. But this historical actuality does offer a useful thumbnail sketch of the sheer size and scope of the United States’ industrial capacity when totally mobilized. Consider the haunting images of planes and ships mothballed after the war, all of which had been churned out by American factories and which were now the detritus of a massive war effort.
The U.S. emerged from the devastation of the war with its industrial infrastructure intact, unlike those of its allies and enemies: Britain’s had been badly damaged, France’s suffered from the fierce fighting after the Normandy landings, and both Germany and Japan had basically been pounded flat by Allied bombing; the Soviet Union had had to move its factories far enough east to be out of the range of the Luftwaffe, while also suffering 10 million military and 14 million civilian deaths, more than any other combatant nation by an order of magnitude (the U.S. by contrast lost 418,500, only 1700 of which were civilians). [EDIT: as an historian friend of mine pointed out, China actually suffered losses comparable to Russia’s between 1937-1945]
All of which meant that the United States had gone from being ranked nineteenth in military strength to number one with a bullet (pun intended), with the world’s only fully functioning industrial base of any note; meanwhile, the returning soldiers benefited from the G.I. Bill passed in 1944, which, among other things, provided for college tuition. The discharged veterans poured into colleges and vocational schools and emerged with degrees that stood them in good stead to swell the ranks of white-collar workers and take advantage of other provisions of the G.I. Bill, such as one year of unemployment compensation, low-interest loans to start a business or a farm, and low-cost mortgages. This government investment led to the greatest expansion of middle-class prosperity in history.1
The industry that had been cranked up to eleven in the war did not sit idle; as Americans earned more, they spent more. The Big Three automakers switched from jeeps and tanks to cars for the newly suburban nuclear families. Factories changed over from military production2 to making refrigerators, washing machines, and a host of new appliances that, the rapidly expanding business of commercial advertising declared, all households needed. And somewhere in all of that, television made its presence known as sales of television sets grew exponentially: 7,000 in 1945, 172,000 in 1948, and 5 million in 1950; in 1950, twenty percent of households had a television set; by the end of the decade, ninety percent did.
Meanwhile, the wartime acceleration of technology facilitated the rapid growth of, among other things, air travel: the prototype of the first commercially successful jetliner, the Boeing 707, took to the skies in 1954, and the plane entered commercial service in 1958.
But, you might ask, what does all this have to do with postmodernity? Well, everything … what I’ve been describing are a series of paradigm shifts in travel and communication technology, as well as the United States’ transition from industrial capitalism to full-bore consumerism. These changes were not just material, but symbolic as well, as the U.S. fashioned from them a counter-narrative to Communism. Consumerism became understood as patriotic: the wealthier Americans grew, the more cars and appliances they bought, the more the American Dream could be held up as the obvious virtuous alternative to dour Soviet unfreedom.
This postwar period that comprises the 1950s and early 60s was the era, in the words of scholar Alan Nadel, of “containment”—a time that was marked, to oversimplify somewhat, by “the general acceptance … of a relatively small set of narratives by a relatively large portion of the population.”3 Or to put it even more simply, it was a time of pervasive public trust in American institutions, especially government and the military. Now, to be clear, “pervasive” doesn’t mean “total”—this was also, after all, the time of McCarthyism and the Red Hunts, as well as the first glimmers of cultural dissent that would influence the counterculture of the 1960s (most especially embodied in the Beat Generation), and also the snowballing insurgency of the Civil Rights Movement. But taken overall, the 1950s—for all its nascent and repressed anxieties—embodied a complacency facilitated by prosperity and the higher standard of living prosperity made possible. As Nadel argues in his book Containment Culture, this delimiting of narratives was about containment: containing women in their domestic spaces, containing men in the role of patriarchal breadwinner, containing fathers, mothers, and children in the bubble of the nuclear family, containing Black Americans within the confines of Jim Crow, containing ideological expression within apolitical art forms like abstract expressionism and critical methodologies like the “New Criticism” being practised in English departments across the country, and above all containing the Soviet cultural threat with the power of America’s example and its military threat with the example of America’s power.
It didn’t take long for the 1950s to be nostalgized and mythologized in popular culture: American Graffiti was released in 1973, and Happy Days first aired in 1974. Indeed, the 50s were mythologized in popular culture at the time, with shows like Leave it to Beaver and Father Knows Best embodying the utopian ideals of domesticity and the nuclear family. Even popular stories of youthful rebellion were stripped of any possible political content, perhaps most explicitly in the very title of Rebel Without a Cause (1955), or in Marlon Brando’s iconic line in The Wild One (1953): when asked what he’s rebelling against, he replies “Whaddya got?”
When Donald Trump made “Make America Great Again” his slogan, he was tacitly citing the 1950s as the apogee of America’s greatness.4 This was possibly the canniest of his various simplistic catch phrases, given that 1950s nostalgia has reliably proven attractive at such moments of cultural crisis or ennui as the early 1970s; but like all nostalgic figurations, it leaves a lot out. There were those during the 1950s who labelled the period the “age of anxiety,” though as I’ll delve into more deeply in my next post, it could also have been called the age of avoidance: the rhetoric and discourse of containment sought, among other things, to deflect from such new global realities as the threat of nuclear conflict and the haunting aftermath of the Holocaust.
One way of understanding the emergence of postmodernism—and this is indeed one of the primary arguments of Alan Nadel’s book—is as a product of the breakdown of containment, of the fracturing of a societal consensus in which a relatively large number of people accepted a relatively small number of narratives into a more chaotic and fragmented social order that was deeply suspicious of such narratives. Though this breakdown progressed over a series of shocks to the system—the assassination of JFK, the start of the Vietnam War and the concomitant rise of the anti-war Left, the growth and visibility of the Civil Rights Movement culminating in the traumatic murder of Martin Luther King, Jr., the assassination of Robert Kennedy, all of which unfolded on television screens across the nation—this fracture was seeded in the immediate aftermath of the Second World War. The war spawned a world that was more truly global than at any point in history. The world was simultaneously smaller and larger: smaller because of the instantaneous nature of televisual and telephonic communication, as well as the fact that one could travel around the world in a matter of hours rather than weeks or months; larger because of the ever-increasing volume of information available through new communications technology.
The increasingly efficient and indeed instantaneous transmission of information facilitated the more efficient transmission of capital, which in turn facilitated the growth of transnational corporate capitalism, which by the time we reach the orgy of financial deregulation also known as the Reagan Administration becomes less about manufacturing and industry than the arcane calculus of stock markets. By this time, America’s postwar preeminence as the sole nation with functioning industrial infrastructure had been steadily eclipsed as the rest of the world rebuilt, or, as in the case of China, Southeast Asia, and the Indian subcontinent, had begun to establish their own industrial bases—facilitating the departure of blue-collar jobs as multinational corporations offshored their factories to whatever countries offered the lowest wages and best tax breaks.
This confluence of historical, economic, technological, and cultural circumstances created a profound sense of dislocation, a disruption of what Marxist theorist and literary scholar Fredric Jameson calls our “cognitive map.” Jameson is the author of Postmodernism, or The Cultural Logic of Late Capitalism (1991), arguably one of the definitive works theorizing postmodernism (as well as being exhibit A of why “postmodern neo-Marxism” is an incoherent concept, but more on that in a future post). He borrows the concept of cognitive mapping from urban theorist Kevin Lynch’s 1960 book The Image of the City; in Lynch’s telling, cities are comforting or alienating to the extent that one can form a “cognitive map” of the urban space, in which one can picture oneself in relation to the city. Cities with a straightforward layout and visible landmarks (such as Toronto) are comforting; cities that are confusing and repetitive sprawls (such as Los Angeles) are alienating.5 Jameson adapts this concept to culture more broadly: how we see ourselves in relation to what Jameson calls the “totality” of our culture and society is predicated on how much of it we can know. Someone living in a small town with little need to engage with the larger society has a fairly straightforward cognitive map. Someone whose life and livelihood depends on circumstances well outside their control or understanding fares less well when they are negatively impacted by the offshoring of jobs or a financial meltdown that erases their retirement savings. By the same token, the small-town individual’s ability to maintain their cognitive map will be impacted by financial downturns, or even just by having cable TV and the internet.
We could well substitute “sense of reality” for “cognitive map.” Where we find ourselves—have indeed found ourselves for several decades now—is in a place where reality feels increasingly contingent and unstable. When I wrote my doctoral dissertation on conspiracy theory and paranoia in American postwar culture, my argument basically fell along these lines—that the dislocations of postmodernity make the certainties of conspiracism attractive. I defended that thesis seventeen years ago (egad); as evidenced by 9/11 Trutherism, anti-Obama birtherism, and the current lunacy of QAnon, conspiracism has only metastasized since then.
I am, to be clear, leaving a lot of stuff out here. But my overarching point is that what we call “postmodernism” is no one thing. It is, rather, the product of all the stuff I’ve described, and more. When the Jordan Petersons of the world accuse postmodernism of positing that any and all interpretations of something are equally valid, they’re levelling that charge against nerdy academics influenced by a handful of French theorists and philosophers who are not, in fact, saying any such thing; Peterson et al. are, though they don’t seem to know it, railing against a cultural condition that has made certitude functionally impossible, and which promulgates an infinitude of interpretations through no fault of philosophers and theorists and critics doing their level best to find the language to describe this condition.
1. To hearken back to my posts on systemic racism, Black Americans largely found themselves shut out of the G.I. Bill’s benefits, both because of explicit restrictions written in by Southern Democrats and because almost all banks simply refused to extend the low-interest loans and mortgages to Black veterans. While white America enjoyed the fruits of the postwar boom, accruing wealth that would be passed on to their children and grandchildren, the exclusion of Black America from the 1950s boom continues to contribute to the stark wealth inequality between whites and non-whites in America today.
2. Which is not to say, of course, that U.S. military production slackened much; even as the nation demobilized, it shifted focus to the new enemy, the U.S.S.R., and poured huge amounts of money into R&D for possible new conflicts. Indeed, it can be argued that the recessions of 1947 and 1950, resulting from the downturn of wartime production, were counteracted by the Truman Doctrine in 1947—the principle asserting America’s obligation to contain Soviet expansion globally—and the outbreak of the Korean War in 1950. Both events goosed the stock market with the promise of increased military production. By the end of Dwight D. Eisenhower’s presidency in 1961, he had grown so alarmed by the growth of military production and the power shared by the Pentagon and arms manufacturers that the architect of the D-Day landings said, in his valedictory address to the nation:
This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence—economic, political, even spiritual—is felt in every city, every State house, every office of the Federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society … we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.
3. Alan Nadel, Containment Culture (1995) p. 4.
4. The peculiar genius of the slogan “Make America Great Again” is precisely its ahistorical vagueness—playing upon the intuitive contradiction felt by people whose instinct is to think of America as the greatest nation in the world, but who cannot see evidence for that greatness in their day-to-day lives. Hence, it must be returned to that state of greatness; and though the slogan’s vagueness allows its adherents to imagine for themselves precisely when America was great, the most standard conservative default setting is the 1950s (allowing for the likelihood of the most nativist part of the MAGA crowd pining nostalgically for the 1850s). And indeed, the touchstones of Trump’s 2016 campaign promises—the return of well-paid factory jobs, higher wages for the working class, the tacit and sometimes explicit exclusion of people of colour, and the return to a masculine America in which women know their place—all hearkened back to the era of containment (eliding, of course, the 90% top marginal tax rate and the fact that working-class jobs could support a family because of the pervasiveness of unions). Trump’s promises were all about containment, the boxing-in and walling-off of Black, queer, and women’s voices, and containing America itself from the influx of immigrants, refugees, and Muslims, all of which envisions a very strictly contained understanding of what comprises “real” America.
5. Posthumous apologies to Kevin Lynch for so egregiously oversimplifying his nuanced discussions of urban space.