There’s a moment that happens on occasion in American football, when a running back, having been handed the ball, breaks through all the defenders and sprints into the open field, with nothing between him and the endzone.
And sometimes, rather than making a touchdown, he gets blindsided by a linebacker he didn’t see coming.
That’s what 2021 has been like, especially these past few months.
I was vaccinated in June, and then again in August. We resumed in-person classes at Memorial University in September. Like many people, I felt a massive sense of relief—the pandemic finally seemed to be heading into the rearview mirror, and things were returning to something resembling normalcy. Except … yeah, not so much. I’m writing this at my parents’ house on Christmas day, but we won’t be properly celebrating the holiday until tomorrow because my niece and nephew were possibly exposed to COVID on a schoolbus, and thus need to quarantine for another day. I will myself have to quarantine for five days on returning to St. John’s.
We should be in the clear, running to the endzone with open field, but we keep getting tackled.
This is what I told all my students earlier this term: I wanted them to know it wasn’t just them, it was a general malaise that I was also experiencing. I told them they weren’t alone in feeling anxious, discomfited, or just generally off. I told them that if it came down to it, if they were struggling to get their classwork done, that there was no shame in dropping my course … or any other course they were taking. I told them that I would do whatever I could to help, provided they kept me in the loop. You don’t need to give me the gory details, I said—just tell me you’re having a rough go of it, and that’s enough. We’ll work something out.
I have to imagine there are those who will say I’m catering to the fragility of the snowflake generation in making such accommodations. Anyone having that thought can fuck right off. What saved me from breakdown these past few months was the fact that I had such extraordinary students: as I said to all my classes at the term’s end, they’re the reason why I have hope for the world. As a GenXer, irony is my default setting; for those inheriting the catastrophes of climate change and resurgent fascism, earnestness is their baseline. What would have been radically uncool in the 1990s is quite possibly what will save us all.
On Christmas day, I find myself thinking about all the best parts of the past year. My students are, as they often are, Exhibit A.
I taught a graduate seminar in the winter term that, despite happening on Zoom, was hands down my favourite class ever. It was called “The Spectre of Catastrophe,” and looked at 21st-century post-apocalyptic narratives. Though the subject was perhaps a little too on the nose for our current situation, it was the most fun I’ve ever had as a teacher (and that’s saying a lot, as my following comments will make clear). The one advantage Zoom classes have over in-person teaching is the running chat students can contribute to as the lecture/discussion unfolds. In this particular class, there was always an ongoing conversation scrolling up on the side. I used the ten-minute breaks each hour in our three-hour classes to read through the lively and often hilarious discussions happening in parallel to the actual class.
Getting back into the classroom this past September felt so very, very good. It didn’t hurt that I was teaching a roster of courses that were, for me, an embarrassment of riches: a first-year class called “Imagined Places,” which gave me the (gleeful) chance to teach Tolkien (The Hobbit), Ursula K. Le Guin (A Wizard of Earthsea), Sir Terry Pratchett (Small Gods), Neil Gaiman (The Ocean at the End of the Lane), and Jo Walton (The Just City) (for the record, everybody needs to read Jo Walton). I also taught, for the third time, “Reading The Lord of the Rings.” I had five, count ‘em FIVE, students in LOTR who were also in my fourth-year seminar “The American Weird: Lovecraft, Race, and Genre as Metaphor.” (I always measure my success as an educator by the number of students who take multiple classes with me. I assume it means I’m doing something right, though it’s also possible they’ve just got my number and know what it takes to get a decent grade). Given that the Tolkien and the Lovecraft courses were back-to-back, I had something like an entourage following me from the chemistry building in which I taught LOTR to the Arts building that was the home of the weird.
I called my Lovecraft students “weirdos,” because, well, obviously. It only offended them the first time I called them that.
Even given the fact that I was teaching a dream slate of classes, this fall semester was hard. I am congenitally incapable of asking for help, or for that matter recognizing that I need help until after the fact; these past few days of decompressing at my parents’ home have been invaluable in offering relaxation and reflection. I have also realized I need to find my way to a therapist in 2022.
But the moral of the story is that the students I worked with this past year kept me sane, and gave me hope. So on this day of prayer (even for those as atheistic as myself) I am grateful for you all. And given how many of you have signed up for yet more Lockett pontification next term, I can only assume I’m doing something right.
Or you’ve got my number. Either way, it’s all good.
“Yesterday, when asked about reparations, Senate Majority Leader Mitch McConnell offered a familiar reply: America should not be held liable for something that happened 150 years ago, since none of us currently alive are responsible … This rebuttal proffers a strange theory of governance, that American accounts are somehow bound by the lifetime of its generations … But we are American citizens, and thus bound to a collective enterprise that extends beyond our individual and personal reach.”
I’ve been slowly working my way through Barry Jenkins’ ten-episode adaptation of Colson Whitehead’s novel The Underground Railroad. I’ve been a huge fan of Whitehead’s fiction ever since I got pointed towards his debut novel The Intuitionist; I read The Underground Railroad when it was still in hardcover, and I’ve included it twice in classes I’ve taught. When I saw it was being adapted to television by the virtuoso director of Moonlight and If Beale Street Could Talk, I knew this was a series I wanted to watch.
I was also wary—not because I was concerned about the series keeping faith with the novel, but because I knew it would make for difficult viewing. Whatever liberties Whitehead takes with history (as I discuss below), he is unsparing with the brutal historical realities of slavery and the casual cruelty and violence visited on slaves. It is difficult enough to read such scenes; having them depicted on screen—seeing cruelty and torture made explicit in sound and image—is often harder still to watch.
For this reason, for myself at least, the series is the opposite of bingeable. After each episode, I need to digest the story and the visuals and think.
The Underground Railroad focuses on the story of Cora (Thuso Mbedu), a teenage girl enslaved on a Georgia plantation whose mother had escaped when she was young and was never caught, leaving Cora behind. Another slave named Caesar (Aaron Pierre) convinces her to flee with him. Though reluctant at first, circumstances—the kind of circumstances which make the show difficult viewing—convince her to go. She and Caesar and another slave named Lovey (who had seen them go and, to their dismay, tagged along) are waylaid by slave-catchers. Though Lovey is recaptured, Cora and Caesar escape, but in the process one of the slave-catchers is killed. They make it to a node of the Underground Railroad and are sheltered by a white abolitionist with whom Caesar had been in contact. He then takes them underground to the “station,” where they wait for the subterranean train that will take them out of Georgia.
Because that is the principal conceit of The Underground Railroad: that the rail system spiriting slaves away is not metaphorical, but literal, running through underground tunnels linking the states.
More than a few reviews of the series, and also of the novel on which it is based, have referred to the story as “magical realism.” This is an inaccurate characterization. Magical realism is a mode of narrative in which one group or culture’s reality collides with another’s, and what seems entirely quotidian to one is perceived as magical by the other. That’s not what Colson Whitehead is doing, and, however lyrical, ethereal, and occasionally dreamlike Barry Jenkins’ visual rendering is, it’s not what the series is doing either. In truth, the literalization of the underground railroad is the least of Whitehead’s historical tweaks: The Underground Railroad is not magical realism, but nor is it alternative history (a genre that usually relies on a bifurcation in history’s progression, such as having Nazi sympathizer Charles Lindbergh win the presidency in 1940 in Philip Roth’s The Plot Against America). The Georgia plantation on which The Underground Railroad begins is familiar enough territory, scarcely distinguishable from similar depictions in Uncle Tom’s Cabin or 12 Years a Slave. But then, as Cora journeys from state to state, each state embodies a peculiar distillation of racism. Cora and Caesar first make it to South Carolina, which appears at first glance to be an enlightened and indeed almost utopian place: there is no slavery, and the white population seems dedicated to uplifting freed Blacks, from providing education and employment to scrupulous health care to good food and lodging. Soon, however, the paternalistic dimension of this altruism becomes more glaring, and Cora and Caesar realize that the free Blacks of South Carolina are being sterilized and unwittingly used in medical experimentation.
In North Carolina, by contrast, Blacks have been banished, and any found within the state borders are summarily executed. The white people of North Carolina declared slavery a blight—because it disenfranchised white workers. Since abolishing slavery and banishing or killing all of the Black people, the state has made the ostensible purity of whiteness a religious fetish. Cora spends her time in North Carolina huddled in the attic of a reluctant abolitionist and his even more reluctant wife in an episode that cannot help but allude to Anne Frank.
Whitehead’s vision—stunningly rendered in Jenkins’ adaptation—is less an alternative history than a deconstructive one. As Scott Woods argues, The Underground Railroad is not a history lesson, but a mirror, with none of “the finger-wagging of previous attempts to teach the horrors of slavery to mainstream audiences.” I think Woods is being polite here, using “mainstream” as a euphemism for “white,” and he tactfully does not observe that effectively all of such finger-wagging attempts (cinematically, at any rate) have tended to come from white directors and feature white saviour protagonists to make liberal white audiences feel better about themselves.
There are no white saviours in The Underground Railroad; there is, in fact, very little in the way of salvation of any sort, just moments of relative safety. As I still have a few episodes to go, I can’t say how the series ends; the novel, however, ends ambivalently, with Cora having defeated the dogged slave-catcher who has been pursuing her from the start, but still without a clear sense of where she is going—the liberatory trajectory of the underground railroad is unclear and fraught because of the weight of the experience Cora has accrued over her young life. As she says at one point, “Make you wonder if there ain’t no real place to escape to … only places to run from.” There is no terminus, just endless flight.
When I say The Underground Railroad is a “deconstructive” history, I don’t use the term in the sense developed by Jacques Derrida (or at least, not entirely). Rather, I mean it in the more colloquial sense employed by, for example, chefs when they put, say, a “deconstructed crème brûlée” on the menu, which might be a smear of custard on the plate speared by a shard of roasted sugar and garnished with granitas infused with cinnamon and nutmeg. If the dish is successful, it is because it defamiliarizes a familiar dessert by breaking down its constituent ingredients in such a way as to make the diner appreciate and understand them anew—and, ideally, develop a more nuanced appreciation for a classic crème brûlée.
So—yes, odd analogy. But I’d argue that Whitehead’s novel is deconstructive in much the same manner, by taking a pervasive understanding of racism and the legacy of slavery and breaking it down into some of its constituent parts. In South Carolina, Cora and Caesar experience the perniciousness of white paternalism of the “white man’s burden” variety—the self-important concern for Black “uplift” that is still invested in the conviction of Black culture’s barbarism and inferiority, and which takes from this conviction license to violate Black bodies in the name of “science.”
Then in North Carolina, we see rendered starkly the assertion that Blacks are by definition not American, and are essentially treated as the equivalent of an invasive species. This, indeed, was the basis of the notorious 1857 Supreme Court ruling in Dred Scott v. Sandford, which asserted that Black people could not be citizens of the United States. Though that precedent was effectively voided by the Thirteenth and Fourteenth Amendments—which abolished slavery and established citizenship for people of African descent born in America, respectively—the franchise was not meaningfully extended to Black Americans until Lyndon Johnson signed the Voting Rights Act a century after the end of the Civil War. The delegitimization of Black voters continues: while the current Trumpian incarnation of the G.O.P. tacitly depicts anyone voting Democrat as illegitimate and not a “real American,” in practice, almost all of the legal challenges to the 2020 election result were directed at precincts with large numbers of Black voters.
Later in the novel when Cora finds her way to Indiana to live on a thriving Black-run farm, we see the neighbouring white community’s inability to countenance Black prosperity in such close proximity, especially when Black people flourishing reflects badly on their own failures. The pogrom that follows very specifically evokes the Tulsa Massacre of 1921, when a huge white mob essentially burned down a thriving Black part of town.
What’s important to note here is that Whitehead’s deconstructive process is less about history proper than about our pervasive depictions of history in popular culture, especially by way of fiction, film, and television. Or to be more accurate, it is about the mythologization of certain historical tropes and narratives pertaining to how we understand racism. One of the big reasons why so many (white) people are able to guilelessly suggest that America is not a racist nation, or claim that the election of a Black president proves that the U.S. is post-racial, is because racism has come to be understood as a character flaw rather than a systemic set of overlapping cultural and political practices. Think of the ways in which Hollywood has narrated the arc of American history from slavery to the Civil War to the fight for civil rights, and try to name films that don’t feature virtuous white protagonists versus racist white villains. Glory, Mississippi Burning, Ghosts of Mississippi, A Time to Kill, Green Book, The Help—and this one will make some people bristle—To Kill a Mockingbird. I could go on.
To be clear, I’m not saying these weren’t excellent films (many featured nuanced and textured Black characters with considerable agency); but the point, as with all systemic issues, is not the individual examples, but the overall patterns. These films and novels flatter white audiences: we’re going to identify with Willem Dafoe’s earnest FBI agent in Mississippi Burning against the Klan-associated sheriff. “That would be me,” we think, without considering how the FBI—at the time the film was set, no less!—was actively working to subvert the civil rights movement and shore up the societal structures subjugating and marginalizing Black Americans. In this framing, racism is a personal choice, and therefore Dafoe’s character is not complicit in J. Edgar Hoover’s.
The white saviour tacitly absolves white audiences of complicity in racist systems in this way, by depicting racism as a failing of the individual. It allows us to indulge in the fantasy that we would ourselves be the white saviour: no matter what point in history we find ourselves, we would be the exception to the rule, resisting societal norms and pressures in order to be non-racists. Possibly the best cinematic rebuke to this fantasy was in 12 Years a Slave, in the form of a liberal-minded plantation owner played by Benedict Cumberbatch, who recognizes the talents and intelligence of Solomon Northup (Chiwetel Ejiofor), a formerly free Black man who had been dragooned by thugs and illicitly sold into slavery. Cumberbatch’s character looks for a brief time to be Solomon’s saviour, as he enthusiastically takes Solomon’s advice on a construction project over that of his foreman. But when the foreman violently retaliates against Solomon, Cumberbatch’s character cannot stand up to him. In a more typical Hollywood offering, we might have expected the enlightened white man to intervene; instead, he lacks the intestinal fortitude to act in a way that would have brought social disapprobation, and as a “solution” sells Solomon to a man who proves to be cruelly sociopathic.
Arguing for the unlikeliness that most people could step out of the roles shaped for them by social and cultural norms and pressures might seem like an apologia for historical racism—how can we have expected people to behave differently?—but really it’s about the resistance to seeing ourselves enmeshed in contemporary systemic racism.
Saying that that is Whitehead’s key theme would be reductive; there is so much more to the novel that I’m not getting to. There is, to be certain, a place—a hugely important place—for straightforward historical accounts of the realities and finer details of slavery, even the more “finger wagging” versions Scott Woods alludes to. But what both Whitehead’s novel and Barry Jenkins’ adaptation of it offer is a deconstruction of the simplistic binarism of decades of us vs. them, good vs. bad constructions of racism that give cover to all but the most unapologetically racist white people. The current backlash against “critical race theory”—which I’ll talk more about in the third of these three posts—proceeds to a great extent from its insistence on racism not as individual but systemic, as something baked into the American system.
Which, when you think about it, is not the outrageous argument conservatives make it out to be. Not even close: Africans brought to American shores, starting in 1619, were dehumanized, brutalized, subjected to every imaginable violence, and treated as subhuman property for almost 250 years. Their descendants were not given the full franchise as American citizens until the Civil Rights Act and the Voting Rights Act of 1964 and 1965, respectively. Not quite sixty years on from that point, it’s frankly somewhat baffling that anyone, with a straight face, can claim that the U.S. isn’t a racist nation. One of the greatest stumbling blocks to arriving at that understanding is how we’ve personalized racism as an individual failing. It shouldn’t be so difficult to recognize, as a white person, one’s tacit complicity in a long history without having to feel the full weight of disapprobation that the label “racist” has come to connote through the pop cultural mythologization of racism as a simple binary.
 I want to distinguish here between those who more cynically play the game of racial politics and those who genuinely do not see racism as systemic (granted that there is a grey area between these groups). The latter are the people for whom having one or more Black friends is proof of their non-racist bona fides, and who honestly believe that racism was resolved by the signing of the Civil Rights Act and definitively abolished by Obama’s election.
The Help, both the novel and the film, is perhaps one of the purest distillations of a white saviour story packaged in such a way as to flatter and comfort white audiences. Essentially, it is the story of an irrepressible young white woman (with red hair, of course) nicknamed Skeeter (played by Emma Stone) who chafes against the social conventions of 1963 Mississippi and dreams of being a writer and journalist. TL;DR: she ends up telling the stories of the “help,” the Black women working as domestic labour for wealthy families such as her own, publishes them—anonymously of course, though Skeeter is the credited author—and thus gets the traction she needs to leave Mississippi for a writing career.
I can’t get into all of the problems with this narrative in a footnote; hopefully I don’t need to enumerate them. But I will say that one of the key things that irked me about this story, both the novel and the movie, is how it constantly name-checks Harper Lee and To Kill a Mockingbird as Skeeter’s inspiration. Lee might have given us the archetype of the white saviour in the figure of Atticus Finch, but she did it at a moment in time (it was published in 1960) when the subject matter was dangerous (Mary Badham, who played Scout, found herself and her family shunned when they returned to Alabama after filming ended for having participated in a film that espoused civil rights). By contrast, The Help, which was published in 2009 and the film released in 2011, is about as safe as a parable of racism can be in the present day—set in the most racist of southern states during the civil rights era, with satisfyingly vile racist villains and an endearing, attractive white protagonist whose own story of breaking gender taboos jockeys for pole position with her mission to give voice to “the help.”
 Though to be fair, in some instances this had as much to do with virtuoso performances by extremely talented actors, such as Denzel Washington, Morgan Freeman, and Andre Braugher in Glory. And not to heap yet more scorn on The Help, but the best thing that can be said about that film is that it gave Viola Davis and Octavia Spencer the visibility—and thus the future casting opportunities—that their talent deserves.
Why do I write this blog? Well, it certainly isn’t because I have a huge audience—most of my posts top out at 40-60 views, and many garner far fewer than that. Every so often I get signal boosted when one or more people share a post. The most I’ve ever had was when a friend posted a link to one I wrote about The Wire and police militarization to Reddit, and I got somewhere in the neighbourhood of 1500 views. Huge by my standards, minuscule by the internet’s.
Not that I’m complaining. I have no compulsion to chase clicks, or to do the kind of networking on Twitter that seems increasingly necessary to building an online audience, which also entails strategically flattering some audiences and pissing off others. The topics I write about are eclectic and occasional, usually the product of a thought that crosses my mind and turns into a conversation with myself. My posts are frequently long and sometimes rambling, which is also not the best way to attract readers.
Blogging for me has always been something akin to thinking out loud—like writing in a journal, except in a slightly more formal manner, with the knowledge that, however scant my audience is, I’m still theoretically writing for other people, and so my thoughts have to be at least somewhat coherent. And every so often I get a hit of dopamine when someone shares a post or makes a complimentary comment.
I started my first blog when I moved to Newfoundland as a means of giving friends and family a window into my new life here, without subjecting them to the annoyance of periodic mass emails. I posted in An Ontarian in Newfoundland for eight years, from 2005 to 2013, during which time it went from being a digest of my experiences in Newfoundland to something more nebulous, in which I basically posted about whatever was on my mind. I transitioned to this blog with the thought that I would focus it more on professional considerations—using it as a test-space for scholarship I was working on, discussions about academic life, and considerations of things I was reading or watching. I did do that … but then also inevitably fell into the habit of posting about whatever was on my mind, often with long stretches of inactivity that sometimes lasted months.
During the pandemic, this blog has become something akin to self-care. I’ve written more consistently in this past year than I have since starting my first blog (though not nearly as prolifically as I posted in that first year), and it has frequently been a help in organizing what have become increasingly inchoate thoughts while enduring the nadir of Trump’s tenure and the quasi-isolation enforced by the pandemic. I won’t lie: it has been a difficult year, and wearing on my mental health. Sometimes putting a series of sentences together in a logical sequence to share with the world brought some order to the welter that has frequently been my mind.
As we approach the sixth month of the Biden presidency and I look forward to my first vaccination in a week, you’d think there would be a calming of the mental waters. And there has been, helped along by more frequent good weather and more time spent outside. But even as we look to be emerging from the pandemic, there’s a lot still plaguing my peace of mind, from my dread certainty that we’re looking at the end of American democracy, to the fact that we’re facing huge budget cuts in health care and education here in Newfoundland.
The Venn diagram of the thoughts preoccupying my mind has a lot of overlaps, which contributes to the confusion. There are so many points of connection: the culture war, which irks me with all of its unnuanced (mis)understandings of postmodernism, Marxism, and critical race theory; the sustained attack on the humanities, which proceeds to a large degree from the misperception that it’s all about “woke” indoctrination; the ways in which cruelty has become the raison d’être of the new Right; the legacy of the “peace dividend” of the 1990s, the putative “end of history,” and the legacy of post-9/11 governance leading us to the present impasse; and on a more hopeful note, how a new humanism practiced with humility might be a means to redress some of our current problems.
For about three or four weeks I’ve been spending part of my days scribbling endless notes, trying to bring these inchoate preoccupations into some semblance of order. Reading this, you might think that my best route would be to unplug and refocus; except that this has actually been energizing. It helps in a way that there is significant overlap with a handful of articles I’m working on, about (variously) nostalgia and apocalypse, humanism and pragmatism, the transformations of fantasy as a genre, and the figuration of the “end of history” in Philip Roth’s The Plot Against America and contemporary Trumpist figurations of masculinity.
(Yes, that’s a lot. I’m hoping, realistically, to get one completed article out of all that, possibly two).
With all the writing I’ve been doing, it has been unclear—except for the scholarly stuff—how best to present it. I’ve been toying with the idea of writing a short book titled The Idiot’s Guide to Postmodernism, which wouldn’t be an academic text but more of a user manual to the current distortions of the culture wars, with the almost certainly vain idea of reintroducing nuance into the discussion. That would be fun, but in the meantime I think I’ll be breaking it down into a series of blog posts.
Some of the things you can expect to see over the next while:
A three-part set of posts (coming shortly) on history, memory, and forgetting.
A deep dive into postmodernism—what it was, what it is, and why almost everyone bloviating about it and blaming it for all our current ills has no idea what they’re talking about.
A handful of posts about cruelty.
“Jung America”—a series of posts drawing a line from the “crisis of masculinity” of the 1990s to the current state of affairs with Trumpism and the likes of Jordan Peterson and Ben Shapiro.
At least one discussion about the current state of the humanities in the academy, as well as an apologia arguing why the humanities are as important and relevant now as they have ever been.
Phew. Knowing me, I might get halfway through this list, but we’ll see. Meantime, stay tuned.
 David Simon also left a complimentary comment on that one. Without a doubt, the highlight of my blogging career.
 Specifically, Jordan Peterson, but there are others who could use a primer to get their facts straight.
Over the past year or so, it has seemed as though whatever shadowy Deep State agencies are responsible for covering up the existence of extraterrestrials have thrown up their hands and said “Yeah. Whatever.”
Perhaps the real-world equivalent of The X-Files Smoking Man finally succumbed to lung cancer, and all his subordinates just couldn’t be bothered to do their jobs any more.
Or perhaps the noise of the Trump presidency created the circumstances in which a tacit acknowledgement of numerous UFO sightings wouldn’t seem to be bizarre or world-changing.
One way or another, the rather remarkable number of declassified videos from fighter pilots’ heads-up-displays of unidentified flying objects of odd shapes and flying capabilities has evoked an equally remarkable blasé response. It’s as if the past four years of Trump, natural disasters, civil tragedies, and a once-in-a-century (touch wood) pandemic have so eroded our capacity for surprise that, collectively, we seem to be saying, “Aliens? Bring it.” Not even the QAnon hordes, for whom no event or detail is too unrelated not to be folded into the grand conspiracy, have seen fit to comment upon something that has so long been a favourite subject of conspiracists (“Aliens? But are they pedophile child sex-trafficking aliens?”).
Perhaps we’re all just a bit embarrassed at the prospect of alien contact, like having a posh and sophisticated acquaintance drop by when your place is an utter pigsty. I have to imagine that, even if the aliens are benevolent and peaceful, humanity would be subjected to a stern and humiliating talking-to about how we let our planet get to the state it’s in.
Not to mention that if they landed pretty much anywhere in the U.S., they’d almost certainly get shot at.
And imagine if they’d landed a year ago.
“Take us to your leader!” “Um … are you sure? Perhaps you should try another country.” “All right, how do I get to Great Britain?” “Ooh … no. You really don’t want that.” “Russia then? China? India? Hungary?” “Uh, no, no, no, and no.” “Brazil?” “A world of nope.” “Wait–what’s the one with the guy with good hair?” “Canada. But, yeah … probably don’t want to go there either. Maybe … try Germany?” “Wasn’t that the Hitler country?” “They got better.”
You’d think there would be greater demand for the U.S. government to say more about these UFO sightings. The thing is, I’m sure that in some corners of the internet there is a full-throated yawp for exactly that, but it hasn’t punctured the collective consciousness. And frankly, I don’t care enough to go looking for it.
It is weird, however, considering how we’ve always assumed that the existence of extraterrestrial life would fundamentally change humanity, throwing religious belief into crisis and dramatically transforming our existential outlook. The entire premise of Star Trek’s imagined future is that humanity’s first contact with the Vulcans forced a dramatic reset of our sense of self and others—a newly galactic perspective that rendered all our internecine tribal and cultural squabbles irrelevant, essentially at a stroke resolving Earth’s conflicts.
To be certain, there hasn’t been anything approaching definitive proof of alien life, so such epiphany or trauma lies only in a possible future. Those who speak with any authority on the matter are always careful to point out that “UFO” is not synonymous with “alien”—they’re not necessarily otherworldly, just unidentified.
I, for one, am deeply skeptical that these UFOs are of extraterrestrial origin—not because I think it’s impossible, but because the chances are infinitesimal. In answer to the question of whether I think there’s life on other planets, my answer is an emphatic yes, which is something I base on the law of large numbers. The Milky Way galaxy, by current estimates, contains somewhere in the neighbourhood of 100 billion planets. Even if one tenth of one percent of those can sustain life, that’s still 100 million planets, and that in just one of the hundreds of billions of galaxies in the universe.
But then there’s the question of intelligence, and what comprises intelligent life. We have an understandably chauvinistic understanding of intelligence, one largely rooted in the capacity for abstract thought, communication, and inventiveness. We grant that dolphins and whales are intelligent creatures, but have very little means of quantifying that; we learn more and more about the intelligence of cephalopods like octopi, but again: such intelligences are literally alien to our own. The history of imagining alien encounters in SF has framed alien intelligence as akin to our own, just more advanced—developing along the same trajectory until interplanetary travel becomes a possibility. Dolphins might well be, by some metric we haven’t yet envisioned, far more intelligent than us, but they’ll never build a rocket—in part because, well, why would they want to? As Douglas Adams put it in The Hitchhiker’s Guide to the Galaxy, “man had always assumed that he was more intelligent than dolphins because he had achieved so much—the wheel, New York, wars and so on—whilst all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man—for precisely the same reasons.”
To put it another way, to properly imagine space-faring aliens, we have to imagine not so much what circumstances would lead to the development of space travel as how an alien species would arrive at an understanding of the universe that would facilitate the very idea of space travel.
Consider the thought experiment offered by Hans Blumenberg in the introduction to his book The Genesis of the Copernican World. Blumenberg points out that our atmosphere has a perfect density, “just thick enough to enable us to breathe and to prevent us from being burned up by cosmic rays, while, on the other hand, it is not so opaque as to absorb entirely the light of the stars and block any view of the universe.” This happy medium, he observes, is “a fragile balance between the indispensable and the sublime.” The ability to see the stars in the night sky, he says, has shaped humanity’s understanding of itself in relation to the cosmos, from our earliest rudimentary myths and models, to the Ptolemaic system that put us at the center of Creation and gave rise to the medieval music of the spheres, to our present-day forays in astrophysics. We’ve made the stars our oracles, our gods, and our navigational guides, and it was in this last capacity that the failings of the Ptolemaic model inspired a reclusive Polish astronomer named Mikołaj Kopernik, whom we now know as Copernicus.
But what, Blumenberg asks, if our atmosphere were too thick to see the stars? How then would humanity have developed its understanding of its place in the cosmos? And indeed, of our own world—without celestial navigation, how does seafaring evolve? How much longer before we understood that there was a cosmos, or grasped the movement of the earth without the motion of the stars? There would always of course be the sun, but it was always the stars, first and foremost, that inspired the celestial imagination. It is not too difficult to imagine an intelligent alien species inhabiting a world such as ours, with similar capabilities, but without the inspiration of the night sky to propel them from the surface of their planet.
Now think of a planet of intelligent aquatic aliens, or of creatures swimming deep in the dense atmosphere of a gas giant.
Or consider the possibility that our vaunted intelligence is in fact an evolutionary death sentence, and that it would be for any species such as ourselves: that our development of technology, our proliferation across the globe, and our environmental depredations inevitably outstrip our primate brains’ capacity to reverse the worst effects of our evolution.
Perhaps what we’ve been seeing is evidence of aliens who have mastered faster-than-light or transdimensional travel, but they’re biding their time—having learned the dangers of intelligence themselves, they’re waiting to see whether we succeed in not eradicating ourselves with nuclear weapons or environmental catastrophe; perhaps their rule for First Contact is to make certain a species such as Homo sapiens can get its shit together and resolve all the disasters we’ve set in motion. Perhaps their Prime Directive is not to help us, because they’ve learned in the past that unless we can figure it out for our own damned selves, we’ll never learn.
In the words of the late great comedian Bill Hicks, “Please don’t shoot at the aliens. They might be here for me.”
EDIT: Stephanie read this post and complained that I hadn’t worked in Monty Python’s Galaxy Song, so here it is:
And indeed, Douglas Adams did imagine such a species in Life, the Universe, and Everything—an alien race on a planet surrounded by a dust cloud who live in utopian peace and harmony in the belief that they are the sum total of creation, until the day a spaceship crash-lands on their planet and shatters their illusion. At which point, on reverse engineering the spacecraft and flying beyond the dust cloud to behold the splendours of the universe, they decide there’s nothing else for it but to destroy it all.
… and I don’t particularly mean that as a compliment.
Literally minutes before he is stabbed to death by a posse of conspiring senators, Shakespeare’s Julius Caesar declares himself to be the lone unshakeable, unmoving, stalwart man among his flip-flopping compatriots. He makes this claim as he arrogantly dismisses the petition of Metellus Cimber, who pleads for the reversal of his brother’s banishment. Cimber’s fellow conspirators echo his plea, prostrating themselves before Caesar, who finally declares in disgust,
I could be well moved if I were as you.
If I could pray to move, prayers would move me.
But I am constant as the northern star,
Of whose true-fixed and resting quality
There is no fellow in the firmament.
The skies are painted with unnumbered sparks.
They are all fire and every one doth shine,
But there’s but one in all doth hold his place.
So in the world. ‘Tis furnished well with men,
And men are flesh and blood, and apprehensive,
Yet in the number I do know but one
That unassailable holds on his rank,
Unshaked of motion. And that I am he.
Caesar mistakes the senators’ begging for weakness, not grasping until it is too late that their importuning is a ploy to get close enough to stab him.
Fear not, I’m not comparing Liz Cheney to Julius Caesar. I suppose you could argue that Cheney’s current anti-Trump stance is akin to Caesar’s sanctimonious declaration if you wanted to suggest that it’s more performative than principled. To be clear, I’m not making that argument—not because I don’t see its possible merits, but because I really don’t care.
I come not to praise Liz Cheney, whose political beliefs I find vile; nor do I come to bury her. The latter I’ll leave to her erstwhile comrades, and I confess I will watch the proceedings with a big metaphorical bowl of popcorn in my lap, for I will be a gratified observer no matter what the outcome. If the Trumpists succeed in burying her, well, I’m not about to mourn a torture apologist whose politics have always perfectly aligned with those of her father. If she soldiers on and continues to embarrass Trump’s sycophants by telling the truth, that also works for me.
Either way, I’m not about to offer encomiums for Cheney’s courage. I do think it’s admirable that she’s sticking to her guns, but as Adam Serwer recently pointed out in The Atlantic, “the [GOP’s] rejection of the rule of law is also an extension of a political logic that Cheney herself has cultivated for years.” During Obama’s tenure, she frequently went on Fox News to accuse the president of being sympathetic to jihadists, and just as frequently opined that American Muslims were a national security threat. During her run for a Wyoming Senate seat in 2014, she threw her lesbian sister Mary under the bus with her loud opposition to same-sex marriage, a point on which she stands to the right of her father. And, not to repeat myself, but she remains an enthusiastic advocate of torture. To say nothing of the fact that, up until the January 6th assault on the Capitol, she was a reliable purveyor of the Trump agenda, celebrated then by such current critics as Steve Scalise and Matt Gaetz.
Serwer notes that Cheney’s “political logic”—the logic of the War on Terror—is consonant with that of Trumpism not so much in policy as in spirit: the premise that there’s them and us, and that “The Enemy has no rights, and anyone who imagines otherwise, let alone seeks to uphold them, is also The Enemy.” In the Bush years, this meant the Manichaean opposition between America and Terrorism, such that any ameliorating sentiment about, say, the inequities of American foreign policy, meant you were With the Terrorists. In the present moment, the Enemy of the Trumpists is everyone who isn’t wholly on board with Trump. The ongoing promulgation of the Big Lie—that Biden didn’t actually win the election—is a variation on the theme of “the Enemy has no rights,” which is to say, that anyone who does not vote for Trump or his people is an illegitimate voter. Serwer writes:
This is the logic of the War on Terror, and also the logic of the party of Trump. As George W. Bush famously put it, “You are either with us or with the terrorists.” You are Real Americans or The Enemy. And if you are The Enemy, you have no rights. As Spencer Ackerman writes in his forthcoming book, Reign of Terror, the politics of endless war inevitably gives way to this authoritarian logic. Cheney now finds herself on the wrong side of a line she spent much of her political career enforcing.
All of which is by way of saying: Liz Cheney has made her bed. The fact that she’s chosen the hill of democracy to die on is a good thing, but this brings us back to my Julius Caesar allusion. The frustration being expressed by her Republican detractors, especially House Minority Leader Kevin McCarthy, is at least partially rational: she’s supposed to be a party leader, and in so vocally rejecting the party line, she’s not doing her actual job. She is being as constant as the Northern Star here, and those of us addicted to following American politics are being treated to a slow-motion assassination on the Senate (well, actually the House) floor.
But it is that constancy that is most telling in this moment. Cheney is anchored in her father’s neoconservative convictions, and in that respect she’s something of a relic—an echo of the Bush years. As Serwer notes, however, while common wisdom says Trump effectively swept aside the Bush-Cheney legacy in his rise to the presidency, his candidacy and then presidency only deepened the bellicosity of Bush’s Us v. Them ethos, in which They are always already illegitimate. It’s just that now the Them is anyone opposed to Trump.
In the present moment, I think it’s useful to think of Liz Cheney as an unmoving point in the Republican firmament: to remember that her politics are as toxic and cruel as her father’s, and that there is little to no daylight between them. The fact that she is almost certainly going to lose both her leadership position and, in the next election, a primary to a Trump loyalist is not a sign that she has changed. No: she is as constant as the Northern Star, and the Trump-addled GOP has moved around her. She has not become more virtuous; her party has just become so very much more debased.
What I’m Reading

I first heard of Heather McGhee two or three years ago when she was interviewed on one of the political podcasts I listen to. She was then the president of Demos, a progressive think tank focused on race and economics and strategies to strengthen American democracy; I was immediately impressed by how clearly and articulately she broke down the inextricability of race and economic policy, and the ways in which Republicans have successfully sold white voters on the idea of government spending as a zero-sum game in which every dollar that goes to help Black people and minorities is a dollar taken from them—that government programs that help non-wealthy whites are somehow stealing from them to benefit inner-city Blacks. And hence, non-wealthy whites have become reliable Republican voters who vote against their own interests in election after election.
To be clear, this is not a new insight. President Lyndon B. Johnson himself, who signed the Civil Rights Act into law in 1964 and the Voting Rights Act the following year, knew that he was alienating a significant portion of his own party. “Well, we’ve lost the South,” he is reported to have said on signing the civil rights legislation; he also famously acknowledged the principle on which Richard Nixon would successfully court southern white Democrats: “If you can convince the lowest white man he’s better than the best colored man, he won’t notice you’re picking his pocket. Hell, give him somebody to look down on, and he’ll empty his pockets for you.”
What impressed me about McGhee was how clearly she laid out the historical narrative, as well as how convincingly she argued her central premise: that systemic racism hurts everyone, white people included. I don’t remember which podcast it was on which I originally heard her, but that’s become something of a moot point, as since then she’s been on all the podcasts—especially lately, since her book The Sum of Us: What Racism Costs Everyone and How We Can Prosper Together came out. Since that first podcast, she has resigned as Demos’ president and traveled the U.S., speaking to hundreds of experts, activists, historians, and ordinary people. The Sum of Us is the result, and it makes her original argument in an exhaustively detailed and forceful manner. It is an eminently readable book: personal without being subjective, wonky without losing itself in the weeds, and rigorously historical while still relating straightforward stories that persuasively bring home the societal costs of systemic racism. The one example she shares in her interviews functions as the book’s central metaphor: starting in the 1920s, the U.S. invested heavily in public projects and infrastructure, among them the construction of public pools. During the Depression, Roosevelt’s Works Progress Administration (WPA) continued this trend, using such community investment to generate jobs. By the 1950s, towns and cities across the country boasted ever-larger and more lavish public pools, which became a point of pride for communities—pools large enough, in some cases, to admit thousands of swimmers.
But such pools were, of course, whites-only. With the advent of desegregation in the mid-late 50s, courts decreed that these public pools were legally obliged to admit Blacks. Town and city councils responded swiftly, voting to drain the pools and fill them in with dirt and seed them over with grass (in Georgia, the Parks and Recreation Department was simply eliminated, and was not resurrected for ten years). Affluent whites did not suffer: there was a concomitant boom in the construction of backyard pools and the establishment of private swimming clubs, which could effect de facto segregation by leaving membership decisions to the discretion of a governing board. But non-wealthy whites were suddenly left without a key option for summer recreation, all because their communities could not countenance sharing a publicly-funded pool with their Black citizens. In what is one of the pernicious elements of systemic racism, McGhee observes, many of the non-wealthy whites who could no longer bring their children to swim in one of these magnificent pools for free probably thought that this was a fair deal—better to go without than be obliged to share with people you’d been brought up to consider beneath you.
I am at present about halfway through The Sum of Us; look for a longer blog post when I’ve finished it. Meanwhile, I would suggest that this book should be required reading for our present moment.
What I’m Watching

I wrote in my last post about how much I’m enjoying rewatching Battlestar Galactica, but just as Stephanie and I took a hiatus from that show to binge The Mandalorian, so again we’re taking a hiatus to watch The Stand—the recent mini-series adaptation of Stephen King’s mammoth 1978 novel in which 99.9% of the world is wiped out by a weaponized superflu nicknamed “Captain Trips,” and the remaining people of the U.S. gather in two opposing communities. On one side are the forces of good, who have been drawn to Boulder, Colorado by dreams of a 108-year-old Black woman named Mother Abigail. On the other are those drawn by promises of power, licentiousness, and revenge by the evil Randall Flagg, a denim-clad and cowboy-boot-shod demon in human form, who establishes his new society in (of course) Las Vegas.
As I’ve discussed a few times on this blog, last term I taught a fourth-year class on pandemic fiction; I did not include The Stand, in spite of the fact that it’s one of the few actual pandemic novels written prior to the 21st century, mainly because it is way too long (almost 1500 pages) to shoehorn into a semester-long course. Given its significance to the topic, however, I did record a short lecture in which I ran down the key themes and plot points (which you can watch here if you’re so inclined). But one of the things I found interesting in retrospect, after doing all the preparatory reading for my course, was how King transformed a story about a biological catastrophe into a Manichaean light v. dark, G v. E, cosmic battle royale with Mother Abigail as God’s surrogate and Randall Flagg as Satan’s proxy (I first read The Stand in high school, and read it again when King published the unexpurgated version in 1990). While the novel meditated at length on the nature of civilization and how one pragmatically goes about rebuilding after the apocalypse—with 1500 pages, how could it not?—it is obvious that it’s the metaphysical war that most interested King.
We’re slightly more than halfway through the new adaptation, and quite enjoying it. It was quite badly reviewed, and while I can agree with some of the complaints, it has been on the whole well adapted to the screen, and (mostly) well cast. Alexander Skarsgard is at his menacing best as Randall Flagg; James Marsden is all wry southern charm as Texan Stu Redman; Greg Kinnear plays the professorial Glen Bateman with the right balance of pomposity and insight; Whoopi Goldberg basically plays Mother Abigail as a devoutly Christian Guinan with a head of white dreadlocks; my favourite, however, is Brad William Henke, who plays the mentally disabled Tom Cullen with a guileless, earnest simplicity that avoids stereotypes (those who watch Justified will recognize Henke from season two as Coover Bennett, a similarly mentally delayed character whose disability manifests instead in sociopathic violence).
There is much that is left out, and much that could have been done better, but on the whole it is a pretty satisfying adaptation of an intriguing but flawed novel (“intriguing but flawed” is how I’d characterize most of King’s oeuvre, but I suppose that is to be expected when you churn out an average of two brick-sized novels a year). If you like The Stand, or are just amenable to Stephen King more generally, I’d recommend this series.
What I’m Writing

I recently dusted off an article-in-progress that had been mouldering for a year or two, on zombie apocalypse and celebrity; in a fit of energy I finished it and submitted it to a journal. I now have another that I’m looking to finish, on Emily St. John Mandel’s Station Eleven and nostalgia. Given that I’ll be teaching that novel in both of my classes over the next two weeks, it seems an ideal time to return to it. Given that it is also about apocalypse, though of the non-zombie variety—and indeed about a civilization-ending pandemic—I’ve been trying to rewrite my introduction to put it in the context of the past year. It’s been slow going, not least because finding the right balance between the personal and the objective can be tricky when your aim is to submit to a scholarly journal. But the overarching argument—that Mandel’s post-apocalyptic world, in which the main characters are actors and musicians travelling between settlements to perform Shakespeare and classical music, embodies a nostalgic desire to return to a pre-postmodern humanism—is, I think, a strong one. I just have to fill out a core section in which I discuss humanism in a more granular way.
(This process will also be useful, as it will give me a lot of lecture material).
On a related subject, I’ve also been working on a new essay on Terry Pratchett and Discworld. I have an article on Pratchett and his campaign for assisted dying coming out soon in a new collection; I’m trying to carry that momentum forward on a handful of Sir Terry essays, but the one I’ve been focusing on is a reading of the Discworld novels in the context of the philosophy of American Pragmatism and what I’ve been calling the “magical humanism” exhibited in a lot of contemporary fantasy.
As much as I love Sir Terry’s writing, I find it difficult to write about it in a scholarly manner, for the basic reason that I find it difficult to find a focus and not end up running off madly in all directions. The essay I wrote for the collection, which came in way past deadline, needed to be cut from nine thousand words to six thousand (one of the essay’s blind reviewers said something to the effect of “this is obviously a piece of work gesturing to a much larger theory of Pratchett’s fiction,” which was at once both gratifying and true). Part of the problem is the iterative nature of the Discworld’s world-building: each of the forty-one novels is a standalone narrative, and with each new installment, Sir Terry modified and refined aspects of that world, but also returned to the same themes and preoccupations in such a way that it is close to impossible to discuss the political and philosophical preoccupations of a given novel without being obliged to reference a dozen others.
This isn’t the most conducive thing for my intellectual temperament, which at the best of times is digressive and inclined to run down whatever rabbit holes I find, until I realize, several paragraphs on, that I’ve wandered into an entirely different topic. (Presumably, devoted readers of this blog will have noticed this.) That being said, however, it is a pleasure to lose myself in this topic … not least because I increasingly see Sir Terry’s humanism as a necessary antidote to our present toxic political moment.
I did not watch Oprah Winfrey’s much-hyped interview of Prince Harry and Meghan Markle for much the same reason I did not watch the two most recent royal weddings: I didn’t care. Especially at this point in time, between marking a year of pandemic and the ongoing reverberations of the Trump presidency, the travails of Harry and Meghan—even inasmuch as I sympathize with them against the Royal Family—don’t really do much to excite my imagination or interest.
On the other hand, the fallout from the interview, coupled with related issues and events, has piqued my interest indeed. That people will be instinctively partisan for one party or the other is about as unsurprising as learning that some people in “the Firm” fretted about whether or not Meghan’s first child would be dark-complexioned. Racism in the Royal Family? Get away! But of course, this particular charge was picked up by right-wing pundits as further evidence of “cancel culture” at work, and we’ve been treated to the bizarre spectacle of self-described red-blooded American patriots rushing to the defense of HM Queen Elizabeth II.1
Someone might want to remind them just what those boys at Lexington and Concord died for. Or perhaps tell them to watch Hamilton.
Notably, the one person emerging not just unscathed but burnished from the interview was the Queen herself—both Harry and Meghan were careful to say that none of the difficulties they’ve experienced emanated from her, and that she has indeed been the one person who is blameless (some reports have read between the lines and extrapolated that the Queen was prescient enough to have given Harry funds to see him through being cut off financially).
Leaving aside for the moment the possibility, or even the likelihood, that this is entirely true, this sympathy is reflective of a broader reluctance to be critical of Elizabeth II. Even the 2006 film The Queen, starring Helen Mirren in the title role, which was all about the Palace’s cold and inept response to the shocking death of Diana, ended up painting a vaguely sympathetic portrait (though to be fair, that has a lot to do with Mirren’s virtuosity). And The Crown (created and written by Peter Morgan, who wrote The Queen), which is largely unsparing of all the other royals and their courtiers, generally depicts Elizabeth as a victim of circumstance who spends her life doing her level best to fulfill her royal duty, constrained by this very sense of duty from being a more compassionate and loving human.
The Queen is a person whom, I would argue, people tend to see through a nostalgic lens: nostalgia, in this case, for a form of stiff-upper-lip, keep-calm-and-carry-on Britishness memorialized in every WWII film ever—something seen as lost in the present day, along with Britannia’s status in the world. As we have seen in much of the pro-Brexit rhetoric, these two losses are not perceived as unrelated; and Queen Elizabeth, as the cornerstone of an ever-more-fractured Royal Family, is a comforting anchor, but one that grows more tenuous as she ages.
There’s an episode in season four of The Crown that articulates this sensibility. In it, Elizabeth, having grown concerned that her children might not appreciate the scale and scope of the duties they’ve inherited, meets with each of them in turn and is perturbed by their feckless selfishness. Charles is in the process of destroying his marriage to Diana; Andrew is reckless in his passions; Anne is consumed by resentment and anger; and Edward is at once isolated by his royal status at school and indulgent in his royal privilege. Though her disappointment in her spawn is never put into words, it is obvious (Olivia Colman can convey more with her facial expressions than I can in ten thousand words), and The Crown effectively indicts the younger generation of royals as unworthy of their status, and definitely unworthy of the throne.
This, I think, is where we’re at right now with Harry and Meghan’s interview. I’ve joked on occasion that “shocked but not surprised” should be the title of the definitive history of the Trump presidency, but it might also function as a general sentiment for this particular epoch. It is difficult to put one’s finger precisely on the substance of the outrage over Meghan’s revelations, aside from an instinctive royalist animus directed at anyone with the temerity to criticize the monarchy. This is why, perhaps, some (<cough> <cough> PIERS MORGAN <cough>) have simply chosen to call bullshit on Meghan Markle’s story of mental health issues and suicidal ideation;2 but it was the charge of racism that seems to have become the most ubiquitous bee in a whole lot of bonnets. Shocking, yes; surprising, no. The entire British colonial enterprise was predicated on the premise of white English supremacy, and royal houses of all nationalities have always been assiduous in policing their bloodlines. Prior to the divorce of Charles and Diana amid revelations of his relationship with Camilla Parker-Bowles, the greatest scandal the British monarchy had weathered was the abdication of Edward VIII so he could marry his American divorcée paramour, Wallis Simpson. Meghan Markle, it has been noted by many, ticks two of those scandalous boxes insofar as she is American and a divorcée.
She is also, to use postcolonial theorist Homi Bhabha’s phrasing, “not white/not quite.” Which is to say, she is biracial, and as such will never be qualified to be a royal in a stubborn subsection of the British cultural imagination.
The fascination many people have with the British Royal Family—especially among those who aren’t British—has always baffled me more than a little. But on honest reflection, I suppose I shouldn’t be baffled. In spite of the fact that hereditary monarchy is an objectively terrible form of governance, it is also one of the most durable throughout history. Human beings, it seems, are suckers for dynastic power, in spite of the illogic of its premise; as the late great Christopher Hitchens wryly observed, being the eldest son of a dentist does not somehow confer upon you the capacity to be a dentist. And yet down through the centuries, people have accepted that the eldest son (and occasionally daughter) of the current monarch had the right to assume the most important job in the nation on that monarch’s passing.
Of course, “right” and “ability” don’t always intersect, and there have been good, bad, and indifferent kings and queens down through history (of course, being democratically elected is no guarantee of governing ability, but at least the people have the option of cancelling your contract every few years). For every Henry V there’s a Richard III, and we’re equally fascinated by both, while mediocre kings and queens who preside over periods of relative peace don’t tend to get the dramatic treatment.
Indeed, on even brief reflection, it’s kind of amazing just how pervasive the trope of monarchy is in literature and popular culture more broadly. It is unsurprising that Shakespeare, for example, would have made kings and queens the subject of many of his plays—that was, after all, the world in which he lived—but the persistence of hereditary monarchy in the 20th century cultural imagination is quite remarkable. It’s pretty much a staple of fantasy, as the very title of Game of Thrones attests; but where George R.R. Martin’s saga and its televisual adaptation are largely (but sadly not ultimately)3 rooted in a critique of the divine right of kings and the concept of the “chosen one,” the lion’s share of the genre rests in precisely the comfort bestowed by the idea that there is a true king out there perfectly suited to rule justly and peaceably.
More pervasive and pernicious than Shakespearean or Tolkienesque kings and queens, however, is the Disney princess-industrial complex. Granted, the fairy-tale story of the lowly and put-upon girl finding her liberatory prince pre-dates Walt Disney’s animated empire by centuries, but I think we can all agree that Disney has at once expanded, amplified, and sanded down the sharp edges of the princesses’ folkloric origins—all while inculcating in millions of children the twinned conceptions of royalistic destiny and the heteronormative gender roles associated with hereditary nobility (to be fair to Disney, it has done better with such recent excursions as Brave and Frozen—possibly the best endorsement of the latter’s progressiveness is the fact that Jordan Peterson loathes it). It’s telling that Disney’s most prominent branding image isn’t Mickey Mouse, but the Disney castle,4 a confection of airy spires and towers that any medievalist would tell you defeats the purpose of having a castle to start with. Even your more inept horde of barbarians would have little difficulty storming those paper-thin defenses, but then it’s not the bulwarks and baileys that are important, but the towers … the towers, built to entrap fair maidens until their rescuing princes can slip the lock or scale the wall.
I have to imagine that a large part of the obsession over royal weddings proceeds from precisely this happy-ending narrative on which the Mouse has built its house: the sumptuous spectacle of excess and adulation that evokes, time and again, Cinderella’s arrival at the ball. The disruption of this mythos is at once discomforting and titillating: Diana’s 1995 interview presaged Harry and Meghan’s with its revelations of constraint and isolation, and the active antagonism of both the Royal Family and its functionaries toward any sort of behaviour that might reflect badly upon it—even if that behaviour simply entailed seeking help for mental health issues. There have been many think-pieces breaking down which elements of The Crown are fact and which are fiction, but it is at this point fairly well established wisdom that being born a Windsor—or marrying into the family—is no picnic. And while Meghan’s claim that she never Googled Harry or his family strains credulity, I think it’s probably safe to say that no matter how much research one does, the realities of royal life almost certainly beggar the imagination.
Also, The Crown was only in its second season when Meghan married Harry.
I confess that, aside from the very first episode, I did not watch the first three seasons of The Crown, the principal reason being that I couldn’t get my girlfriend Stephanie into the show. While I may be more or less indifferent to the British monarchy, Stephanie is actively hostile5 to it. Born in South Africa, she and her family came to Canada when she was fourteen; having imbibed an antipathy to her birth nation’s colonizer that is far more diffuse in Canada, she gritted her teeth through the part of her citizenship oath in which she had to declare loyalty to the Queen. Her love of Gillian Anderson (Stephanie is, among her other endearing qualities, the biggest X-Files fan I’ve ever met) overcame her antipathy, however, for season four, and so we gleefully watched the erstwhile Agent Scully, transformed into the Iron Lady, spar with Olivia Colman’s Queen Elizabeth (we’re also pretty simpatico on our love of Olivia Colman). With each episode, we reliably said (a) Olivia, for the love of Marmite, don’t make us sympathetic with the Queen!; (b) Gillian, please don’t make us feel sympathy for/vague attraction to Margaret Thatcher!; and, (c) Holy crap, Emma Corrin looks so much like Lady Di!
It will be interesting to see The Crown catch up with the present moment. But I also have to wonder if some commentators are right when they say that the Harry and Meghan split from the Firm signals the end of the British monarchy. To my mind, by all rights it should: it’s long past time this vestige of colonial hubris went into that good night. We’ve got enough anti-democratic energy to deal with in the present moment without also concerning ourselves with a desiccated monarchy. When Queen Elizabeth dies, with her dies the WWII generation. The Second World War transformed the world in countless ways, one of them being that it spelled the end of the British Empire and the diminution of Great Britain’s influence in the world. Brexit is, among other things, a reactionary response to this uncomfortable reality, and a vain, desperate attempt to reassert Britannia’s greatness. Across the pond, fellow nativists in the U.S. have latched onto Meghan Markle’s accusations of racism to make common cause with the monarchy. Not, perhaps, because they’ve forgotten the lessons of 1776, but most likely because they never learned them to start with.
1. Perhaps the stupidest defense came from Fox and Friends’ co-host Brian Kilmeade, who opined that the fact that British Commonwealth countries are “75% Black and minority” demonstrated that the Royal Family could not possibly be racist. Leaving aside the pernicious history of colonialism and the kind of white paternalism epitomized by the Rudyard Kipling poem “White Man’s Burden,” can we perhaps agree that Kilmeade’s juxtaposition of “75%” and “minority” sort of gives the game away?
2. I’ve always felt that Piers Morgan was the result of a lab experiment in which a group of scientists got together to create the most perfect distillation of an asshole. Even if we grant his premise that Meghan Markle is, in fact, a status-seeking social climber who has basically Yoko Ono’ed Prince Harry out of the Royal Family, his constant ad hominem attacks on her say more about his own inadequacies than hers. And for the record, I do not grant his premise: to borrow his own turn of phrase, I wouldn’t believe Piers Morgan if he was reading a weather report.
3. We may never know how George R.R. Martin means to end his saga—at the rate he’s going, he’ll be writing into his 90s, and I don’t like his actuarial odds—but we do know how the series ended. The last-minute transformation of Daenerys into a tyrant who needed to be killed could conceivably have been handled better if the showrunners had allowed for two or three more episodes to bring us there; but the aftermath was also comparably rushed, and Sam Tarly’s democratic suggestion for egalitarian Westerosi governance was laughed off without any consideration. I will maintain to my dying day that GRRM effectively transformed fantasy, but also that he was too much in thrall to its core tropes to wander too far from their monarchical logic.
4. I recently bought a Disney+ streaming subscription in order to watch The Mandalorian. While writing this post, I remembered that Hamilton’s sole authorized video recording is Disney property. So of course I immediately clicked over to Disney+ to watch parts of it, and was treated to the irony of a play about the American revolutionary war to overthrow monarchical tyranny prefaced by Disney’s graphic of its castle adorned with celebratory fireworks.
5. When I read this paragraph to Stephanie, she liked all of it but objected to my use of the word “hostile.” “I don’t actually hate the Royal Family,” she said. “I don’t wish them harm. I just find the entire idea pointless and antiquated, and it embodies some of the worst aspects of British history.” So: she’s not hostile to the Royal Family, but I’m at a loss to find a better word, especially considering the invective she hurls at England during the World Cup.
I was reading an article about QAnon in Politico yesterday. This, as anyone who has read this blog over the past few months knows, is hardly unusual for me. What struck me about this article, however—“When QAnon Invades American Homes” by Anastasiia Carrier, which is all about people who have lost family members to the Q-cult—was a more profound sense of how this patently absurd conspiracy theory is genuinely infectious, indoctrinating people to the point where their closest loved ones have to decide whether to abandon them. The article tells the story of Emily and her husband Peter (not their real names); forced by the pandemic to work from home, Peter started going down the rabbit hole of QAnon message boards and YouTube videos. Emily was vaguely aware of QAnon, but it was only this past October that, slowly realizing the hold it had on her husband’s imagination, she sat down and watched a handful of videos he’d been talking about: “That was when she learned that her husband had been consumed by a complex and false conspiracy theory that accuses ‘deep state elites’ of running a secret pedophile ring. By then, it was too late to pull him out.”
Emily’s husband, whom she loved dearly and whom she described as having previously been a compassionate and attentive man, had become a stranger to her, treating her with anger and disdain—sometimes in front of their children—when she pushed back on his newfound bigotry and the assertion that such people as Tom Hanks were pedophiles. “I was told that I buried my head in the sand and couldn’t see the ‘real’ problems,” she says.
Eventually, Emily found her way to a Reddit forum called “QAnonCasualties,” in which people like her who have had loved ones become obsessed with the absurd conspiracy theory share their experiences and console one another. Her relief at finding a space to share her grief was mitigated by just how many others like her there were:
Emily is just one of thousands who have found their way to r/QAnonCasualties. Started in 2019 by a Reddit user whose mother was a part of the “Qult,” the subreddit has ballooned in popularity over the past year, growing from less than a thousand followers in February 2020 to more than 133,000 in February 2021. The group’s followers more than doubled in the weeks following the Capitol riot alone. And as QAnon continues to spread—about 30 percent of Republicans have favorable views about the conspiracy theory, according to a January poll by YouGov—so does the forum’s reach.
Such numbers are shocking, not least because the basic elements of the QAnon conspiracy are so objectively absurd. It is, indeed, all too easy to dismiss QAnon: while it has become increasingly baroque in all its moving parts, its most basic premise is that Donald Trump has been working surreptitiously to foil a monstrous cabal that includes the Deep State, prominent Democrats (especially the Obamas and the Clintons) and the Hollywood elite, all of whom are accused of being pedophiles who sex-traffic children and drink their blood for the purpose of prolonging their lives. Some day soon (March is now the new forecast, apparently, after many disappointments) Trump will emerge to declare martial law and bring such malefactors as Hillary Clinton and Tom Hanks to justice. This much-anticipated event is referred to as “The Storm.”
Conspiracy theory and conspiracism are nothing new, especially not in American culture, a point made quite thoroughly in Richard Hofstadter’s landmark 1965 essay “The Paranoid Style in American Politics.” Like so much else in the age of social media, QAnon is not different in kind but in degree—it is a massive amplification of tendencies that have been around for centuries. That amplification is not merely one of size and scope, but also of its adherents’ devotion. As detailed in the Politico article, QAnon is very much a cult, and like most cults it features a leader in whom the cultists invest all of their hopes and adoration—Donald Trump. Indeed, if there is one aspect in which QAnon differs from most conspiracy theories, it is in its figuration of a saviour figure leading the fight against the malevolent conspirators.
What is also remarkable about QAnon is how it functions as an all-encompassing sort of “key to all mythologies” for the conspiracism-inclined, welcoming any and all other extant conspiracies: 9/11 trutherism, anti-vax rhetoric, the old chestnut about lizard people, anti-Semitic and white supremacist fantasies about malevolent globalists, paranoia about world government, “the Great Replacement,” and of course the more recent assertion that Biden’s election was the result of election fraud on a massive scale. The alacrity with which QAnon incorporates such disparate threads keeps me coming back around to Umberto Eco’s 1988 novel Foucault’s Pendulum, which now comes to seem prophetic—not least because, like all good prophecies, it deals entirely with things that have already happened.
The novel is about a trio of young, overeducated and underemployed graduate students, who find themselves working at a scam publishing house. The publisher’s business model is to lavish praise on submitted manuscripts—which find their way there because they’ve been rejected by all respectable publishers for being ludicrous, awful, or clinically insane—and then charge the starry-eyed authors an exorbitant sum to publish their books (with the assurance that their inevitable massive success will soon earn their investment back). They then only print a fraction of the run promised while pocketing the extra cash.
As you might imagine, the manuscript submissions they receive are largely the work of execrable novelists and crackpots—many of whom in the latter category are conspiracy theorists determined to share with the public their earth-shattering exposés of the Templars, the Illuminati, the Freemasons, the Elders of Zion, or a host of other shadowy cabals responsible for anything and everything that happens in the world. Our trio of disaffected intellectuals—Belbo, Casaubon, and Diotallevi—are predictably disdainful of these authors, referring to them as the “Diabolicals.” For their own entertainment, they create a narrative-building computer program into which they input the plots outlined in these manuscripts, building them into a massive, overarching conspiracy theory they simply call The Plan.
TL;DR: the Diabolicals catch wind of The Plan, and conclude that these too-clever-by-half smartarses actually hold the key to the secrets they’ve been seeking all this time. Determined to know the “truth” of The Plan, they pursue our heroes, whose lives are now in danger.
Or to put it another way: our heroes create a conspiracy theory so compelling that all those who “want to believe” essentially give it substance through their belief.
In this respect, Foucault’s Pendulum tells a story in six hundred pages that Jorge Luis Borges told in less than fifteen. In “Tlön, Uqbar, Orbis Tertius,” the narrator stumbles across a secret project begun by an eccentric American millionaire to exhaustively imagine a planet—“Tlön”—over forty volumes of an encyclopedia, because “he wanted to demonstrate to this nonexistent God that mortal man was capable of conceiving a world.” When the encyclopedia makes it out into the world, people are so captivated by the planet of Tlön that they allow it to infect their minds and displace reality:
Almost immediately, reality yielded on more than one account. The truth is that it longed to yield. Ten years ago any symmetry with a semblance of order—dialectical materialism, anti-Semitism, Nazism—was sufficient to entrance the minds of men. How could one do other than submit to Tlön, to the minute and vast evidence of an orderly planet?
Borges’ allegory is no less troubling for being heavy-handed; neither is Eco’s (whose debt to Borges is writ large in all his fiction). QAnon might be a cult, but it is a cult that needs no suave and persuasive recruiters who target vulnerable new acolytes—that work is done by the algorithms of social media, and the ease with which reality yields in our current cultural and political environment. In Foucault’s Pendulum, the character of Casaubon outlines the basic rules for constructing a conspiracy theory:
Rule One: Concepts are connected by analogy. There is no way to decide at once whether an analogy is good or bad, because to some degree everything is connected to something else. For example, potato crosses with apple, because both are vegetable and round in shape … Rule Two says that if tout se tient in the end, then the connecting works … So it’s right. Rule Three: the connections must not be original. They must have been made before, and the more often the better, by others. Only then do the crossings seem true, because they are Obvious.
Tout se tient—“everything fits.” Or, as Thomas Pynchon phrased it in Gravity’s Rainbow (aka the Ulysses of conspiracy novels), “paranoia … is the leading edge of the awareness that everything is connected.” Paranoia lends itself, ironically, to inclusivity; almost anything can function as evidence for the truth of one’s paranoid projections. One of the most striking examples of this was detailed by Michael Kelly in a New Yorker article from 1995, titled “The Road to Paranoia,” in which he profiled the Militia of Montana (MOM), one of the many anti-government paramilitary groups that proliferated in the 1990s. The militia’s bible was what they called “the Blue Book,” which purported to contain the proof of the U.S. government’s ultimate plot to disenfranchise American citizens, take their guns, and accede to world government under the U.N. As Kelly observed, however, the Blue Book was in fact
an ordinary three-ring binder to which [MOM] is always busily adding what [they] regard as further evidence of conspiracy, so that it bulges like an eccentric lawyer’s briefcase with scraps of this and that, from here and there, which purport to show that the globalists’ scheme to subvert American sovereignty and American citizens to vassalage is in its final hours.
Exhibits in “The Blue Book” ranged from newspaper clippings to UN development reports (in which the conspirators openly discuss world government), photographs of the notorious black helicopters, and an illustrated map of the US taken from the back of a 1993 Kix cereal box. MOM’s leaders declared that the division of the states shown in this last item—eleven regions, such as the mountain region, the coasts, the Heartland, etc.—was “a representation of the New World Order plan for dividing the United States into regional departments after the invaders emerge to take over the country.”
The Militia of Montana’s Blue Book is as apt a metaphor for QAnon’s all-encompassing umbrella of conspiracism as any, though it’s probably safe to say that the sheer volume of connections it makes probably wouldn’t fit in a single binder—as some industrious chart-makers have shown us.
The most troubling aspect of the Borges/Eco allegory is the prospect of how easy it would be for QAnon to become reality. I don’t mean that somehow the power of its adherents’ belief could literally transform the Obamas and Clintons into pedophiles—hopefully that’s obvious—but how it could become the accepted reality under certain circumstances. The ubiquity of QAnon followers taking part in the Capitol assault should give us pause, almost as much as the assault itself should. The numbers cited in the Politico article most likely reflect a spectrum ranging from passionate believers to people who don’t necessarily buy into the Q myth, but who wouldn’t be surprised to find out it is true; one doesn’t have to imagine a violent coup to overthrow the Biden Administration, just a 2024 election in which Trump cruises back to power with a supine Justice Department infested with Q-cultists, who begin legal proceedings against all of Q’s villains. The unrest that would greet such a scenario would be met by armed Trumpists who spent the previous four years nursing their sense of grievance and hatreds, and martial law could be invoked … at which point the show trials of the Deep Staters, the pedophile Democrats, and Hollywood elite could proceed. Reality would yield.
I want to be clear that I don’t think this is a likely scenario. It is, indeed, a highly unlikely scenario. But in a nation where thirty percent of Republicans find amenable the idea that Hillary Clinton drinks the blood of children, it is not unimaginable.
In the virtuosic opening sequence of Francis Ford Coppola’s The Godfather, we are treated to a sumptuous and lavish wedding reception. Present are Michael Corleone (Al Pacino) and his girlfriend Kay (Diane Keaton). Michael wears a Marines dress uniform; the Second World War has recently ended. In his conversation with Kay, Michael notes that his father Vito Corleone (Marlon Brando), a powerful mafia Don, was none too pleased with Michael’s enlistment. At the end of Godfather II, we get a flashback to the moment when Michael reveals to his brothers that he has enlisted. His eldest brother Sonny (James Caan) is incensed—immediately prior to Michael’s revelation, he had been going on about how people volunteering to go fight in the war were “saps,” because they’re risking their lives “for strangers.” Michael counters, “They’re risking their lives for their country.” “Your country ain’t your blood,” Sonny snaps back. “Don’t you forget that!”
Within the limited economy of organized crime, especially crime organized around family connections, Sonny’s perspective makes a certain amount of sense. If there is no higher loyalty than family, and the family’s prosperity, signing up to fight for a country whose laws the family in question flouts and rejects is, at best, an act of stupidity; at worst, of betrayal.
Michael Corleone will of course become the new Don by the end of the first Godfather, and become one of the most compellingly villainous anti-heroes in American film. But at the outset, nothing sets him apart from his family more, and nothing invites audience sympathy more, than his uniform. One of the great tensions at work in the Godfather films is this contrast between the blood ties of family and the abstractions of nation—given that politics is depicted as being as much of a scam as the rackets of organized crime, with law enforcement and elected officials all having their own price, the only real, authentic connections one has are to family. But like much art that challenges prevailing cultural mythologies, The Godfather uses Michael’s enlistment and his family’s anger at it in the service of troubling audience perception—the contrast between the gorgeous wedding reception and the usually-universal approbation attached to a WWII uniform, here the mark of Michael’s shame.
This week in The Atlantic, Jeffrey Goldberg published an article citing a variety of anonymous sources who quote Donald Trump echoing Sonny Corleone on numerous occasions. Referencing the time Trump cancelled a visit to the Aisne-Marne American cemetery in 2018, during the centennial of the end of WWI in Paris, because of “rain,” Goldberg writes:
Trump rejected the idea of the visit because he feared his hair would become disheveled in the rain, and because he did not believe it important to honor American war dead, according to four people with firsthand knowledge of the discussion that day. In a conversation with senior staff members on the morning of the scheduled visit, Trump said, “Why should I go to that cemetery? It’s filled with losers.” In a separate conversation on the same trip, Trump referred to the more than 1,800 marines who lost their lives at Belleau Wood as “suckers” for getting killed.
I’m beginning to think that “Shocked but not Surprised” should be the title of the definitive history of the Trump presidency.
The White House has of course denied these allegations and condemned Goldberg’s article, so I suppose we must in good faith acknowledge the possibility that his sources were, for reasons passing understanding, all lying. And indeed, it would be difficult to credit that any sitting U.S. president would voice such thoughts aloud. To anybody. Ever. Even if they were genuine sentiments.
But of course we’ve heard such things from Trump before, such as when he denied that John McCain was a war hero, because “I like people who weren’t captured.” Or his attacks on the parents of Humayun Khan, an Army officer who was killed in Iraq, for having the temerity to speak for the Democrats at the 2016 convention. Or, as was detailed in Philip Rucker and Carol Leonnig’s book A Very Stable Genius, when he called the assembled Pentagon brass who were trying to give him a crash course in geopolitics, “a bunch of dopes and losers.” Or, when he saw that the White House flags had been lowered to half-staff when John McCain died, he exploded, “What the fuck are we doing that for? Guy was a fucking loser.”
The point here isn’t that we unequivocally owe military personnel our respect and reverence—I would, indeed, argue that the “thank you for your service” default setting, which allows people to be utterly unreflective about the nuances of uniformed life, is about as harmful as the reflexive hostility of anti-Vietnam protesters—but to observe, with an exhausted sigh, that these most impolitic of thoughts on Trump’s part only serve to reinforce what we know about the President’s narcissism and sociopathic self-regard. Perhaps the most appalling distillation of this is summed up by another tidbit from Goldberg’s article:
On Memorial Day 2017, Trump visited Arlington National Cemetery, a short drive from the White House. He was accompanied on this visit by John Kelly, who was then the secretary of homeland security, and who would, a short time later, be named the White House chief of staff. The two men were set to visit Section 60, the 14-acre area of the cemetery that is the burial ground for those killed in America’s most recent wars. Kelly’s son Robert is buried in Section 60. A first lieutenant in the Marine Corps, Robert Kelly was killed in 2010 in Afghanistan. He was 29. Trump was meant, on this visit, to join John Kelly in paying respects at his son’s grave, and to comfort the families of other fallen service members. But according to sources with knowledge of this visit, Trump, while standing by Robert Kelly’s grave, turned directly to his father and said, “I don’t get it. What was in it for them?”
Part of me doesn’t want to believe this. Not just the sentiment, but the fact that anyone would say as much to the father of a dead soldier while standing at the graveside.
Part of me doesn’t want to believe it, but at the same time, I have no difficulty believing it—not because, as some might charge, I’m invested in thinking the worst about Trump, but because he has, during the five years since announcing his campaign, given me no reason whatsoever to disbelieve it. Everything Trump does is transactional—as I said in my last post, he has made it obvious that his worldview is absolutely zero-sum. To paraphrase John Goodman in The Big Lebowski: say what you will about Sonny Corleone, but at least he had an ethos. Trump’s business and Trump’s presidency have both been compared more times than I can count to mafia-style operations; but to my mind, Trump et al. are the mob at the end of the movie, when all of the original bonds of family and loyalty have been frayed by corruption and graft and over-reach. There was a time when I thought Goodfellas was a superior film to The Godfather and its sequels, because it seemed to me that, where Coppola romanticized his mafiosi, Scorsese was far more clear-eyed in depicting their sociopathy and moral bankruptcy. I’ve since come around on that: the decline and fall of Michael Corleone is the subtler of the two tales, not least because it showed how, criminal though the Corleone enterprise might have been, it had its roots not just in family, but a community at once ignored and victimized by an indifferent nation.
If the history of Fred Trump, Sr. and his successor Donald has shown anything, it’s that there was never that originary matrix of family and community. As Mary Trump details, the Trump legacy was never not zero-sum.
What was in it for them? Trump isn’t just asking that of dead soldiers. He’s asking it of anybody who does anything not just for themselves. He’s asking it of anybody who takes on a job whose labour and effort exceeds the fiscal reward—teachers, nurses, EMTs, firefighters, social workers—but which is done in the name of helping others, of serving a community. In Trump’s worldview, they’re all suckers.
And I think what’s most frustrating in the present moment is that his most ardent supporters—those that aren’t rich, that is—don’t grasp that he thinks the same thing about them. He loves their adulation, but was never about to stand for four hours of selfies like Elizabeth Warren did, or hear their stories. His attitude is best summed up by Bono, of all people, who told Jimmy Kimmel, “He likes to see their faces in the crowd, but I don’t think he wants to know who they are when they go home.”
Of course he doesn’t. They’re not people to be served, they’re means to an end.
I’d been wracking my brain trying to figure out how to theme the fourth-year contemporary American literature seminar I’m teaching in the Fall, when I realized the obvious topic was right in front of me: Pandemic Fiction! Having taught a course a few years ago on post-apocalyptic narratives, I already had a handful of titles under my belt. A quick internet search yielded an embarrassment of riches, and I put in an online order for some that seemed likely candidates.
(Possibilities not pictured: Katherine Anne Porter’s 1939 novella about the Spanish Flu, Pale Horse, Pale Rider; Jack London’s weird post-plague dystopia The Scarlet Plague; and Philip Roth’s last novel, Nemesis, about a polio outbreak in 1945 New Jersey).
When I mentioned on Facebook that I’d decided on pandemic fiction for my course, the response was pretty uniformly enthusiastic. Some people asked me to post the reading list when I’d finalized it; a few others, some of them former students, wistfully said that would be a course they’d love to take. And more than one person said it would likely be a course that would draw in a lot of students.
I think it will, but I also think there will be a not-insignificant number of students who will, as I commented back, “avoid it like the plague.” (The bad joke was unintentional, but apt). For everyone who might welcome the perspective a course on pandemic fiction might offer on our current moment, I’m sure there are those who would much rather not revisit the coronavirus experience or deal with such fictionalizations during an ongoing crisis (fingers crossed pretty damn hard for the first eventuality).
It’s an odd quirk of human idiosyncrasies that some of us lean into fictional figurations of crisis in response to the experience of a real one, while others most emphatically do not. It makes me think of the way in which, after September 11th, Clear Channel distributed a memo to all its radio stations listing the songs they were to avoid playing because they might evoke thoughts of the attack (including some truly bizarre choices, like “We Gotta Get Out of this Place,” or risible ones, like “Walk Like An Egyptian”), and movie studios froze production or postponed release of films depicting terrorism or large-scale destruction; meanwhile, video stores (remember those?) reported that movies like Armageddon and Independence Day were constantly being rented. In the present moment, one of the highest-trending offerings on Netflix has consistently been Contagion, Steven Soderbergh’s 2011 film about a pandemic. I have seen a significant number of discussions of this sort of thing on social media, i.e. people soliciting pandemic/apocalypse/dystopian themed isolation viewing, as well as people voicing incredulity that anyone would want to watch or read such stuff in the present moment.
I suppose it doesn’t come as a great galloping shock that I fall into the former category, and not just because I need to read a bunch of titles and make final reading list decisions before the call comes from the English Department to submit our book orders for the Fall (usually, that happens early-mid May). Speaking personally, it’s not the fictional representations of pandemic that bother my soul, but the daily news that makes me afraid for my blood pressure. As I mentioned in my initial “isolated thoughts” post, what narrative tends to offer is catharsis; it is, to paraphrase Fredric Jameson (my copy of The Political Unconscious is currently in my campus office and thus inaccessible), the symbolic resolution of irreconcilable real-world contradictions … even when that resolution entails something putatively negative, like a pair of sclerotic old men in a Beckett play, or the wholesale destruction of society in a zombie apocalypse.
In the latter, at least you might get to use a crossbow.