Why do I write this blog? Well, it certainly isn’t because I have a huge audience—most of my posts top out at 40-60 views, and many garner a lot less than that. Every so often I get signal boosted when one or more people share a post. The most I’ve ever had was when a friend posted a link of one I wrote about The Wire and police militarization to Reddit, and I got somewhere in the neighbourhood of 1500 views. Huge by my standards, minuscule by the internet’s.
Not that I’m complaining. I have no compulsion to chase clicks, or to do the kind of networking on Twitter that seems increasingly necessary to building an online audience, which also entails strategically flattering some audiences and pissing off others. The topics I write about are eclectic and occasional, usually the product of a thought that crosses my mind and turns into a conversation with myself. My posts are frequently long and sometimes rambling, which is also not the best way to attract readers.
Blogging for me has always been something akin to thinking out loud—like writing in a journal, except in a slightly more formal manner, with the knowledge that, however scant my audience is, I’m still theoretically writing for other people, and so my thoughts have to be at least somewhat coherent. And every so often I get a hit of dopamine when someone shares a post or makes a complimentary comment.
I started my first blog when I moved to Newfoundland as a means of giving friends and family a window into my new life here, without subjecting them to the annoyance of periodic mass emails. I posted in An Ontarian in Newfoundland for eight years, from 2005 to 2013, during which time it went from being a digest of my experiences in Newfoundland to something more nebulous, in which I basically posted about whatever was on my mind. I transitioned to this blog with the thought that I would focus it more on professional considerations—using it as a test-space for scholarship I was working on, discussions about academic life, and considerations of things I was reading or watching. I did do that … but then also inevitably fell into the habit of posting about whatever was on my mind, often with long stretches of inactivity that sometimes lasted months.
During the pandemic, this blog has become something akin to self-care. I’ve written more consistently in this past year than I have since starting my first blog (though not nearly as prolifically as I posted in that first year), and it has frequently been a help in organizing what have become increasingly inchoate thoughts while enduring the nadir of Trump’s tenure and the quasi-isolation enforced by the pandemic. I won’t lie: it has been a difficult year, and wearing on my mental health. Sometimes putting a series of sentences together in a logical sequence to share with the world brought some order to the welter that has frequently been my mind.
As we approach the sixth month of the Biden presidency and I look forward to my first vaccination in a week, you’d think there would be a calming of the mental waters. And there has been, something helped by the more frequent good weather and more time spent outside. But even as we look to be emerging from the pandemic, there’s a lot still plaguing my peace of mind, from my dread certainty that we’re looking at the end of American democracy, to the fact that we’re facing huge budget cuts in health care and education here in Newfoundland.
The Venn diagram of the thoughts preoccupying my mind has a lot of overlaps, which contributes to the confusion. There are so many points of connection: the culture war, which irks me with all of its unnuanced (mis)understandings of postmodernism, Marxism, and critical race theory; the sustained attack on the humanities, which proceeds to a large degree from the misperception that it’s all about “woke” indoctrination; the ways in which cruelty has become the raison d’être of the new Right; the legacy of the “peace dividend” of the 1990s, the putative “end of history,” and the legacy of post-9/11 governance leading us to the present impasse; and on a more hopeful note, how a new humanism practiced with humility might be a means to redress some of our current problems.
For about three or four weeks I’ve been spending part of my days scribbling endless notes, trying to bring these inchoate preoccupations into some semblance of order. Reading this, you might think that my best route would be to unplug and refocus; except that this has actually been energizing. It helps in a way that there is significant overlap with a handful of articles I’m working on, about (variously) nostalgia and apocalypse, humanism and pragmatism, the transformations of fantasy as a genre, and the figuration of the “end of history” in Philip Roth’s The Plot Against America and contemporary Trumpist figurations of masculinity.
(Yes, that’s a lot. I’m hoping, realistically, to get one completed article out of all that, possibly two).
With all the writing I’ve been doing, it has been unclear—except for the scholarly stuff—how best to present it. I’ve been toying with the idea of writing a short book titled The Idiot’s Guide to Postmodernism, which wouldn’t be an academic text but more of a user manual to the current distortions of the culture wars, with the almost certainly vain idea of reintroducing nuance into the discussion. That would be fun, but in the meantime I think I’ll be breaking it down into a series of blog posts.
Some of the things you can expect to see over the next while:
A three-part set of posts (coming shortly) on history, memory, and forgetting.
A deep dive into postmodernism—what it was, what it is, and why almost everyone bloviating about it and blaming it for all our current ills has no idea what they’re talking about.
A handful of posts about cruelty.
“Jung America”—a series of posts drawing a line from the “crisis of masculinity” of the 1990s to the current state of affairs with Trumpism and the likes of Jordan Peterson and Ben Shapiro.
At least one discussion about the current state of the humanities in the academy, as well as an apologia arguing why the humanities are as important and relevant now as they have ever been.
Phew. Knowing me, I might get halfway through this list, but we’ll see. Meantime, stay tuned.
 David Simon also left a complimentary comment on that one. Without a doubt, the highlight of my blogging career.
 Specifically, Jordan Peterson, but there are others who could use a primer to get their facts straight.
Over the past year or so, it has seemed as though whatever shadowy Deep State agencies are responsible for covering up the existence of extraterrestrials have thrown up their hands and said “Yeah. Whatever.”
Perhaps the real-world equivalent of The X-Files Smoking Man finally succumbed to lung cancer, and all his subordinates just couldn’t be bothered to do their jobs any more.
Or perhaps the noise of the Trump presidency created the circumstances in which a tacit acknowledgement of numerous UFO sightings wouldn’t seem to be bizarre or world-changing.
One way or another, the rather remarkable number of declassified videos from fighter pilots’ heads-up displays of unidentified flying objects of odd shapes and flying capabilities has evoked an equally remarkable blasé response. It’s as if the past four years of Trump, natural disasters, civil tragedies, and a once-in-a-century (touch wood) pandemic have so eroded our capacity for surprise that, collectively, we seem to be saying, “Aliens? Bring it.” Not even the QAnon hordes, for whom no event or detail is too unrelated to be folded into the grand conspiracy, have seen fit to make comment upon something that has so long been a favourite subject of conspiracists (“Aliens? But are they pedophile child sex-trafficking aliens?”).
Perhaps we’re all just a bit embarrassed at the prospect of alien contact, like having a posh and sophisticated acquaintance drop by when your place is an utter pigsty. I have to imagine that, even if the aliens are benevolent and peaceful, humanity would be subjected to a stern and humiliating talking-to about how we let our planet get to the state it’s in.
Not to mention that if they landed pretty much anywhere in the U.S., they’d almost certainly get shot at.
And imagine if they’d landed a year ago.
“Take us to your leader!” “Um … are you sure? Perhaps you should try another country.” “All right, how do I get to Great Britain?” “Ooh … no. You really don’t want that.” “Russia then? China? India? Hungary?” “Uh, no, no, no, and no.” “Brazil?” “A world of nope.” “Wait–what’s the one with the guy with good hair?” “Canada. But, yeah … probably don’t want to go there either. Maybe … try Germany?” “Wasn’t that the Hitler country?” “They got better.”
You’d think there would be more demand for the U.S. government to say more about these UFO sightings. The thing is, I’m sure that in some sections of the internet, there is a full-throated ongoing yawp all the time for that, but it hasn’t punctured the collective consciousness. And frankly, I don’t care enough to go looking for it.
It is weird, however, considering how we’ve always assumed that the existence of extraterrestrial life would fundamentally change humanity, throwing religious belief into crisis and dramatically transforming our existential outlook. The entire premise of Star Trek’s imagined future is that humanity’s first contact with the Vulcans forced a dramatic reset of our sense of self and others—a newly galactic perspective that rendered all our internecine tribal and cultural squabbles irrelevant, essentially at a stroke resolving Earth’s conflicts.
To be certain, there hasn’t been anything approaching definitive proof of alien life, so such epiphany or trauma lies only in a possible future. Those who speak with any authority on the matter are always careful to point out that “UFO” is not synonymous with “alien”—they’re not necessarily otherworldly, just unidentified.
I, for one, am deeply skeptical that these UFOs are of extraterrestrial origin—not because I think it impossible, but because the chances are infinitesimal. In answer to the question of whether I think there’s life on other planets, my answer is an emphatic yes, which is something I base on the law of large numbers. The Milky Way galaxy, by current estimates, contains somewhere in the neighbourhood of 100 billion planets. Even if only one tenth of one percent of those can sustain life, that’s still a hundred million planets, and that in just one of the hundreds of billions of galaxies in the universe.
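For anyone who wants to check my napkin math, the multiplication can be sketched like so (all the figures are the rough estimates quoted above, not precise astronomy; the 200-billion-galaxy count is one commonly cited ballpark, not a claim of mine):

```python
# Back-of-envelope "law of large numbers" estimate.
# These are rough ballpark figures, not real astronomical data.
planets_in_milky_way = 100_000_000_000  # ~100 billion planets
life_fraction = 0.001                   # one tenth of one percent

habitable_per_galaxy = int(planets_in_milky_way * life_fraction)
print(f"{habitable_per_galaxy:,}")      # 100,000,000 — in our galaxy alone

galaxies = 200_000_000_000              # "hundreds of billions" of galaxies
print(f"{habitable_per_galaxy * galaxies:.1e}")  # 2.0e+19 across the universe
```

Even at that deliberately stingy fraction, the numbers get absurd fast, which is the whole point of the argument.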
But then there’s the question of intelligence, and what comprises intelligent life. We have an understandably chauvinistic understanding of intelligence, one largely rooted in the capacity for abstract thought, communication, and inventiveness. We grant that dolphins and whales are intelligent creatures, but have very little means of quantifying that; we learn more and more about the intelligence of cephalopods like octopi, but again: such intelligences are literally alien to our own. The history of imagining alien encounters in SF has framed alien intelligence as akin to our own, just more advanced—developing along the same trajectory until interplanetary travel becomes a possibility. Dolphins might well be, by some metric we haven’t yet envisioned, far more intelligent than us, but they’ll never build a rocket—in part because, well, why would they want to? As Douglas Adams put it in The Hitchhiker’s Guide to the Galaxy, “man had always assumed that he was more intelligent than dolphins because he had achieved so much—the wheel, New York, wars and so on—whilst all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man—for precisely the same reasons.”
To put it another way, to properly imagine space-faring aliens, we have to imagine not so much what circumstances would lead to the development of space travel as how an alien species would arrive at an understanding of the universe that would facilitate the very idea of space travel.
Consider the thought experiment offered by Hans Blumenberg in the introduction of his book The Genesis of the Copernican World. Blumenberg points out that our atmosphere has a perfect density, “just thick enough to enable us to breathe and to prevent us from being burned up by cosmic rays, while, on the other hand, it is not so opaque as to absorb entirely the light of the stars and block any view of the universe.” This happy medium, he observes, is “a fragile balance between the indispensable and the sublime.” The ability to see the stars in the night sky, he says, has shaped humanity’s understanding of themselves in relation to the cosmos, from our earliest rudimentary myths and models, to the Ptolemaic system that put us at the center of Creation and gave rise to the medieval music of the spheres, to our present-day forays in astrophysics. We’ve made the stars our oracles, our gods, and our navigational guides, and it was in this last capacity that the failings of the Ptolemaic model inspired a reclusive Polish astronomer named Mikołaj Kopernik, whom we now know as Copernicus.
But what, Blumenberg asks, if our atmosphere were too thick to see the stars? How then would humanity have developed its understanding of its place in the cosmos? And indeed, of our own world—without celestial navigation, how does seafaring evolve? How much longer before we understood that there was a cosmos, or grasped the movement of the earth without the motion of the stars? There would always of course be the sun, but it was always the stars, first and foremost, that inspired the celestial imagination. It is not too difficult to imagine an intelligent alien species inhabiting a world such as ours, with similar capabilities, but without the inspiration of the night sky to propel them from the surface of their planet.
Now think of a planet of intelligent aquatic aliens, or of creatures swimming deep in the dense atmosphere of a gas giant.
Or consider the possibility that our vaunted intelligence is in fact an evolutionary death sentence—and that this is the case for any species such as ourselves: that our development of technology, our proliferation across the globe, and our environmental depredations inevitably outstrip our primate brains’ capacity to reverse the worst effects of our evolution.
Perhaps what we’ve been seeing is evidence of aliens who have mastered faster-than-light or transdimensional travel, but they’re biding their time—having learned the dangers of intelligence themselves, they’re waiting to see whether we succeed in not eradicating ourselves with nuclear weapons or environmental catastrophe; perhaps their rule for First Contact is to make certain a species such as Homo sapiens can get its shit together and resolve all the disasters we’ve set in motion. Perhaps their Prime Directive is not to help us, because they’ve learned in the past that unless we can figure it out for our own damned selves, we’ll never learn.
In the words of the late great comedian Bill Hicks, “Please don’t shoot at the aliens. They might be here for me.”
EDIT: Stephanie read this post and complained that I hadn’t worked in Monty Python’s Galaxy Song, so here it is:
 And indeed, Douglas Adams did imagine such a species in Life, the Universe and Everything—an alien race on a planet surrounded by a dust cloud who live in utopian peace and harmony in the thought that they are the sum total of creation, until the day a spaceship crash-lands on their planet and shatters their illusion. At which point, on reverse engineering the spacecraft and flying beyond the dust cloud to behold the splendours of the universe, they decide there’s nothing else for it but to destroy it all.
… and I don’t particularly mean that as a compliment.
Literally minutes before he is stabbed to death by a posse of conspiring senators, Shakespeare’s Julius Caesar declares himself to be the lone unshakeable, unmoving, stalwart man among his flip-flopping compatriots. He makes this claim as he arrogantly dismisses the petition of Metellus Cimber, who pleads for the reversal of his brother’s banishment. Cimber’s fellow conspirators echo his plea, prostrating themselves before Caesar, who finally declares in disgust,
I could be well moved if I were as you.
If I could pray to move, prayers would move me.
But I am constant as the northern star,
Of whose true-fixed and resting quality
There is no fellow in the firmament.
The skies are painted with unnumbered sparks.
They are all fire and every one doth shine,
But there’s but one in all doth hold his place.
So in the world. ‘Tis furnished well with men,
And men are flesh and blood, and apprehensive,
Yet in the number I do know but one
That unassailable holds on his rank,
Unshaked of motion. And that I am he.
Caesar mistakes the senators’ begging for weakness, not grasping that they are importuning him as a ploy to get close enough to stab him until it is too late.
Fear not, I’m not comparing Liz Cheney to Julius Caesar. I suppose you could argue that Cheney’s current anti-Trump stance is akin to Caesar’s sanctimonious declaration if you wanted to suggest that it’s more performative than principled. To be clear, I’m not making that argument—not because I don’t see its possible merits, but because I really don’t care.
I come not to praise Liz Cheney, whose political beliefs I find vile; nor do I come to bury her. The latter I’ll leave to her erstwhile comrades, and I confess I will watch the proceedings with a big metaphorical bowl of popcorn in my lap, for I will be a gratified observer no matter what the outcome. If the Trumpists succeed in burying her, well, I’m not about to mourn a torture apologist whose politics have always perfectly aligned with those of her father. If she soldiers on and continues to embarrass Trump’s sycophants by telling the truth, that also works for me.
Either way, I’m not about to offer encomiums for Cheney’s courage. I do think it’s admirable that she’s sticking to her guns, but as Adam Serwer recently pointed out in The Atlantic, “the [GOP’s] rejection of the rule of law is also an extension of a political logic that Cheney herself has cultivated for years.” During Obama’s tenure, she frequently went on Fox News to accuse the president of being sympathetic to jihadists, and just as frequently opined that American Muslims were a national security threat. During her run for a Wyoming Senate seat in 2014, she threw her lesbian sister Mary under the bus with her loud opposition to same-sex marriage, a point on which she stands to the right of her father. And, not to repeat myself, but she remains an enthusiastic advocate of torture. To say nothing of the fact that, up until the January 6th assault on the Capitol, she was a reliable purveyor of the Trump agenda, celebrated then by such current critics as Steve Scalise and Matt Gaetz.
Serwer notes that Cheney’s “political logic”—the logic of the War on Terror—is consonant with that of Trumpism not so much in policy as in spirit: the premise that there’s them and us, and that “The Enemy has no rights, and anyone who imagines otherwise, let alone seeks to uphold them, is also The Enemy.” In the Bush years, this meant the Manichaean opposition between America and Terrorism, and that any ameliorating sentiment about, say, the inequities of American foreign policy, meant you were With the Terrorists. In the present moment, the Enemy of the Trumpists is everyone who isn’t wholly on board with Trump. The ongoing promulgation of the Big Lie—that Biden didn’t actually win the election—is a variation of the theme of “the Enemy has no rights,” which is to say, that anyone who does not vote for Trump or his people is an illegitimate voter. Serwer writes:
This is the logic of the War on Terror, and also the logic of the party of Trump. As George W. Bush famously put it, “You are either with us or with the terrorists.” You are Real Americans or The Enemy. And if you are The Enemy, you have no rights. As Spencer Ackerman writes in his forthcoming book, Reign of Terror, the politics of endless war inevitably gives way to this authoritarian logic. Cheney now finds herself on the wrong side of a line she spent much of her political career enforcing.
All of which is by way of saying: Liz Cheney has made her bed. The fact that she’s chosen the hill of democracy to die on is a good thing, but this brings us back to my Julius Caesar allusion. The frustration being expressed by her Republican detractors, especially House Minority Leader Kevin McCarthy, is at least partially rational: she’s supposed to be a party leader, and in so vocally rejecting the party line, she’s not doing her actual job. She is being as constant as the Northern Star here, and those of us addicted to following American politics are being treated to a slow-motion assassination on the Senate (well, actually the House) floor.
But it is that constancy that is most telling in this moment. Cheney is anchored in her father’s neoconservative convictions, and in that respect, she’s something of a relic—an echo of the Bush years. As Serwer notes, however, while common wisdom says Trump effectively swept aside the Bush-Cheney legacy in his rise to be the presidential candidate, his candidacy and then presidency only deepened the bellicosity of Bush’s Us v. Them ethos, in which They are always already illegitimate. It’s just now that the Them is anyone opposed to Trump.
In the present moment, I think it’s useful to think of Liz Cheney as an unmoving point in the Republican firmament: to remember that her politics are as toxic and cruel as her father’s, and that there is little to no daylight between them. The fact that she is almost certainly going to lose her leadership position, and then a primary in the next election to a Trump loyalist, is not a sign that she has changed. No: she is as constant as the Northern Star, and the Trump-addled GOP has moved around her. She has not become more virtuous; her party has just become so very much more debased.
There is a moment early in the film The Death of Stalin in which, as the titular dictator lies dying, the circle of Soviet officials just beneath Stalin (Khrushchev, Beria, Malenkov) panic at the prospect of finding a reputable doctor to treat him. Why? Because a few years earlier, Stalin, in a fit of characteristic paranoia, had become convinced that doctors were conspiring against him, and he had many of them arrested, tortured, and killed.
I thought of this cinematic moment—the very definition of gallows humour—while reading an article by Peter Wehner in The Atlantic observing that part of the appeal of QAnon (the number of whose adherents has, counter-intuitively perhaps, grown since Biden’s election) lies precisely in its many disparate components. “I’m not saying I believe everything about Q,” the article quotes one Q follower as saying. “I’m not saying that the JFK-Jr.-is-alive stuff is real, but the deep-state pedophile ring is real.”
As [Sarah Longwell, publisher of The Bulwark] explained it to me, Trump supporters already believed that a “deep state”—an alleged secret network of nonelected government officials, a kind of hidden government within the legitimately elected government—has been working against Trump since before he was elected. “That’s already baked into the narrative,” she said. So it’s relatively easy for them to make the jump from believing that the deep state was behind the “Russia hoax” to thinking that in 2016 Hillary Clinton was involved in a child-sex-trafficking ring operating out of a Washington, D.C., pizza restaurant.
If you’ll recall, the “Deep State” bogeyman was central to Steve Bannon’s rhetoric during his tenure early in the Trump Administration, alongside his antipathy to globalism. The two, indeed, were in his figuration allied to the point of being inextricable, which is also one of the key premises underlying the QAnon conspiracy. And throughout the Trump Administration, especially during his two impeachments and the Mueller investigation, the spectre of the Deep State was constantly blamed as the shadowy, malevolent force behind any and all attempts to bring down Donald Trump (and was, of course, behind the putative fraud that handed Joe Biden the election).
Now, precisely why this article made me think of this moment in The Death of Stalin is a product of my own weird stream of consciousness, so bear with me: while I’ve always found Bannon & co.’s conspiracist depiction of the Deep State more than a little absurd, so too I’ve had to shake my head whenever any of Trump’s detractors and critics declare that there’s no such thing as a Deep State.
Because of course there’s a deep state, just one that doesn’t merit ominous capitalization. It also doesn’t merit the name “deep state,” but let’s just stick with that for now for the sake of argument. All we’re really talking about here is the vast and complex bureaucracy that sustains any sizable human endeavour—from universities to corporations to governments. And when we’re talking about the government of a country as large as the United States, that bureaucracy is massive. The U.S. government employs over two million people, the vast majority of them civil servants working innocuous jobs that make the country run. Without them, nothing would ever get done.
Probably the best piece of advice I ever received as a university student was in my very first year of undergrad; a T.A. told me to never ask a professor about anything like degree requirements or course-drop deadlines, or, really, anything to do with the administrative dimension of being a student. Ask the departmental secretaries, he said. In fact, he added, do your best to cultivate their respect and affection. Never talk down to them or treat them as the help. They may not have a cluster of letters after their name or grade your papers, but they make the university run.
I’d like to think that I’m not the kind of person who would ever be the kind of asshole to berate secretaries or support staff, but I took my T.A.’s advice to heart, and went out of my way to be friendly and express gratitude, to be apologetic when I brought them a problem. It wasn’t long before I was greeted with smiles whenever I had paperwork that needed processing, and I never had any issues getting into courses (by contrast, in my thirty years in academia from undergrad to grad student to professor, I have seen many people—students and faculty—suffer indignities of mysterious provenance because they were condescending or disrespectful to support staff).
The point here is that, for all the negative connotations that attach to bureaucracy, it is an engine necessary for any institution or nation to run. Can it become bloated and sclerotic? Of course, though in my experience that tends to happen when one expands the ranks of upper management. But when Steve Bannon declared, in the early days of the Trump Administration, that his aim was “the deconstruction of the administrative state,” I felt a keen sense of cognitive dissonance in that statement—for the simple reason that there is no such thing as a non-administrative state.
Which brings us back, albeit circuitously, to The Death of Stalin. There is no greater example of a sclerotic and constipated bureaucracy than that of the former Soviet Union, a point not infrequently made in libertarian and anti-statist arguments for small government. But I think the question that rarely gets raised when addressing dysfunctional bureaucracy—at least in the abstract—is why is it dysfunctional? There are probably any number of reasons why that question doesn’t come up, but I have to imagine that a big one is because we’ve been conditioned to think of bureaucracy as inevitably dysfunctional—a sense reinforced by every negative encounter experienced when renewing a driver’s license, waiting on hold with your bank, filing taxes, dealing with governmental red tape, or figuring out what prescriptions are covered by your employee health plan. But a second question we should ask when having such negative experiences is: are they negative because of an excess of bureaucracy, or because of too little? The inability of Stalin’s minions to find a competent doctor is a profound metaphor for what happens when we strip out the redundancies in a given system—in this case, the state-sponsored murder of thousands of doctors because of a dictator’s paranoia, such that one is left with (at best) mediocre medical professionals too terrified of state retribution to be dispassionately clinical, which is of course what one needs from a doctor.
I’m not a student of the history of the U.S.S.R., so I have no idea if anyone has written about whether the ineptitude of the Soviet bureaucracy was a legacy of Stalinist terror and subsequent Party orthodoxy, in which actually competent people were marginalized, violently or otherwise; I have to assume there’s probably a lot of literature on the topic (certainly, Masha Gessen’s critical review of the HBO series Chernobyl has something to say on the subject). But there’s something of an irony in the fact that Republican administrations since that of Ronald Reagan have created their own versions of The Death of Stalin’s doctor problem through their evisceration of government. Reagan famously said that the nine most frightening words were “I’m from the government, and I’m here to help,” and since then conservative governments—in the U.S., Canada, and elsewhere—have worked hard to make that a self-fulfilling prophecy. Thomas Frank, author of What’s the Matter With Kansas? (2004), has chronicled this tendency, in which Republican distrust of government tends to translate into the rampant gutting of social services and governmental agencies, from the Post Office to the various cabinet departments, which then dramatically denudes the government’s ability to do anything. All of the failures that then inevitably occur are held up as proof of the basic premise of government’s inability to get anything right (and that therefore its basic services should be outsourced to the private sector).
In my brief moments of hope I wonder if perhaps the Trump Administration’s explicit practice of putting hacks and incompetent loyalists in key positions (such as Jared Kushner’s bizarrely massive portfolio) made this longstanding Republican exercise too glaring to ignore or excuse. Certainly, the contrast between Trump’s band of lickspittles and Biden’s army of sober professionals is about the most glaring difference we’ve seen between administrations, ever. What I hope we’re seeing, at any rate, is the reconstruction of the administrative state.
And it’s worth noting that Dr. Anthony Fauci has been resurrected from Trump’s symbolic purge of the doctors.
I haven’t blogged in a while, which is partially to do with the fact that I’d more or less forgotten the whole reason I started these “A Few Things” posts to start with—namely, that at any given time I have a handful of unfinished posts on a variety of topics, but which don’t see the light of day because I have difficulty finishing them to my satisfaction. So I’ve had to remind myself that not every one of my blogworthy thoughts needs two or three thousand words; a quick(ish) precis will often serve.
With that being said, here are a few things that were in the hopper …
Vaccine Envy and the Spectacular Vindication of Max Brooks Way back when, shortly after the earth cooled (or so it feels), Canadians could and did feel somewhat smug about our response to the pandemic in comparison to the U.S. … generally speaking, Canadians did not fight the quarantine restrictions, and we watched as the Trump Administration flailed about getting everything exactly wrong, and we took pride in our system of free health care and our much lower infection rates. This was a time when even the much-loathed Doug Ford seemed to step up to the challenge, garnering grudging nods of respect from people like me for his no-nonsense response to COVID (more on this below).
Well, as they say, that was then … it’s not as though the U.S. is any less politically polarized, but the Biden Administration seems determined to remind the world what America can do when competent leadership directing competent experts puts the system into overdrive and expends massive resources to solve a problem. Back in those early days of Canadian smugness, author Max Brooks was much in demand on podcasts; Brooks, who happens to be the son of Mel Brooks, is most famous for his novel World War Z: An Oral History of the Zombie War, which has become one of the classics of the zombie apocalypse genre. Though the novel is global in its scope—chronicling how a zombie plague would play out worldwide—the narrative spine deals with the American response. The TL;DR is that the U.S. is caught on its heels through a combination of cynical politics, a complacent and apathetic populace, and self-interested capitalists, but that once the people come to understand the enormity and gravity of the threat, they come together and rediscover the value of sacrifice, hard work, and community, and ultimately stage the most effective response in the world.
The novel is a quite explicit love letter to the Greatest Generation and the New Deal era. Max Brooks is himself quite clear on this point in interviews, talking about how his parents were both survivors of the Great Depression and were schooled by the Second World War (Mel Brooks was in the Army Engineer Corps, where he was responsible for disarming land mines). In those interviews he gave in the early days of the pandemic, he talked about how many of the nations that responded well, such as South Korea and Taiwan, did so because they live under fairly constant threat, and so were the most primed to respond quickly. The U.S., he said, takes time to (a) become cognizant of a threat and (b) get its shit together, but (c) it always makes up for early stumbles and becomes a world leader. Many of those interviewing him, in those early days of Trump’s pandemic fecklessness, voiced skepticism.
But, well, it looks as though Brooks’ perspective has been borne out, especially considering that acute vaccine envy I seem to experience daily when my American friends post their vax selfies.
The Limits of Bullying To return to the topic of Doug Ford …
About a year ago I started writing a blog post about how, though there is little more I loathe in this world than a bully, sometimes in the right context a bully is what you need. I was writing this, as might be obvious, apropos of the grudging respect being given Doug Ford in Ontario and the outsized adulation lavished on Governor Andrew Cuomo of New York. Sometimes in moments of crisis, I mused, having someone you might otherwise dislike for their braggadocio lay down the law offered a strange form of comfort.
It should go without saying that I am now extremely happy I did not write that post.
Though to be fair, it proved to be something of a non-starter. I had written maybe two paragraphs when the obvious counter-argument made me trail off—that is, though Ford and Cuomo seemed to be doing an effective job in those early days, they were hugely outnumbered by leaders who did not need bellicosity to get the job done (not coincidentally, many of these leaders, like Angela Merkel and Jacinda Ardern, are women).
It has now been over a year since the coronavirus upended the world, and there are few examples of early effective responses that have not met reversals—though few more spectacular than Cuomo’s and Ford’s. Cuomo’s example is a good object lesson in the fact that being a bully and an asshole is only effective if you can deliver the goods; as we have learned in recent weeks, he wasn’t delivering the goods so much as obscuring his failures, and once the double whammy of his COVID missteps and the critical mass of women he has harassed became clear, there weren’t many people left who had his back. Turns out, if you spend your career being an asshole, you accrue a lot of people who are more than willing to stick the knife in once your fortunes change.
Doug Ford, on the other hand, is a very different case. Whatever else you might say about Andrew Cuomo, he’s not an utter moron. Ford’s problem isn’t so much that he’s an asshole and a bully, it’s that he’s a monstrously stupid asshole and bully. He’s so obviously in over his head that I’d feel sorry for him if he weren’t so contemptible. His appeal has always been the same species as Trump’s, which is that a segment of the population who feel victimized by “elites,” by the mandarins of the Liberal Party and the CBC, and by increasingly diverse and vocal Ontarians, elected him specifically to be their bully … which is not a task that requires much in the way of tactical shrewdness or intellectual depth, just the ability to infuriate the Left and deliver arrogant verbal smackdowns in press conferences.
There’s an irony in the fact that Ford’s appeal lies at least in part in the truism that the most satisfying way to deal with a bully is to sic a bigger and badder bully on them—but COVID-19 is also a bully, and doesn’t discriminate.
Biden Departs Afghanistan I may return to this in a longer post at a later date, as it’s something I’ve been thinking a great deal about. After twenty years, U.S. troops in Afghanistan will finally be brought home. The announcement evoked the predictable storm in media and social media, with some celebrating Biden’s decision, some expressing ambivalence, and many calling it disastrous.
Biden responded with an eminently sensible question of his own: if not now, when? Unless the U.S. is going to commit to a permanent presence in Afghanistan as the necessary price of stabilizing the country, there’s no other withdrawal timeline that makes sense. What’s somewhat galling about the castigations of Biden’s announced withdrawal is that a good number of his critics almost certainly do tacitly endorse a permanent occupation … but of course won’t say as much, because such an admission would be politically toxic. The American presence in Afghanistan has always been a little like the weird existential state of being a smoker—with only one or two notable exceptions, every smoker I have ever known indulged in the habit on the assumption that they were going to quit, of course … someday. The American presence in Afghanistan was likewise always predicated on its eventually ending. There were, of course, end-conditions: destroying Al-Qaeda in the country, building a self-sufficient, competent Afghan defense force, and solidifying a non-corrupt democratic government, for example. Check that first box; the other two are as unlikely today as they have been for twenty years.
One thing the announcement of the withdrawal has done is make me mentally revisit those early years of the Bush Administration—the shock and trauma of 9/11, the quasi-hopeful aftermath when the world rallied behind the U.S., the prospect that the targeted, multilateral incursion into Afghanistan would eliminate Al-Qaeda and bin Laden, and that would be that.
It was a nice thought. But no: Bush’s neoconservative brain trust declared the War on Terror, rolled back civil rights with the Patriot Act, and instead of finishing off bin Laden at Tora Bora in December 2001, let him escape as they turned their focus to Saddam Hussein and Iraq.
There are two things that stick in my mind as I read the various excoriations of Biden for leaving Afghanistan. One is that a war that dragged on for a protracted twenty years originally had an extremely limited scope, and was meant to end upon achieving the specific goal of killing or capturing bin Laden. The other is that the precipitating event that started the war was the result of an avoidable intelligence failure, one that occurred in part because the Bush team were dismissive of the warnings the Clinton Administration left for them, and in part because of breakdowns between warring fiefdoms in the C.I.A. and F.B.I. (a breakdown meticulously chronicled in Lawrence Wright’s excellent book The Looming Tower).
Even with these issues within the intelligence community, numerous red flags were raised, to the point where C.I.A. director George Tenet, interviewed by Bob Woodward, recalled musing immediately after the attacks, “I wonder if it has anything to do with this guy taking pilot training.” And let us not forget the notorious memo George W. Bush was given, titled “Bin Laden Determined to Strike in U.S.”
The point here is that 9/11—the entire reason for the war in Afghanistan—had been preventable. An emboldened Taliban and reconstituted Al-Qaeda potentially pose the same threat as they did in the late 1990s, which puts the onus on the intelligence community to fix the problems it developed back then.
As I mentioned in my last post, we’re currently watching The Stand, a mini-series adaptation of Stephen King’s 1978 novel. Though both the novel and the series are quite graphic in their depictions of the weaponized superflu that wipes out most of the world—and quite clear on just how the pathogen escaped its experimental military facility—ultimately, the flu itself is ancillary to the substance of the story, which is about a showdown between the forces of good and those of evil.
As I enter the last two weeks of classes for the term and reflect on the texts we’ve done in my graduate seminar on 21st-century post-apocalyptic narratives, some of which have overlapped with my Utopias & Dystopias course and the course I taught last term on pandemic fiction, I’m struck by how often the catastrophe precipitating these stories is simply a device to clear the decks for what comes next. I really have no reason to be so struck by this fact, given that I titled my grad course “The Spectre of Catastrophe” specifically because it focuses on narratives preoccupied less with the catastrophe itself than with the aftermath. But it occurs to me that when the catastrophe—be it a viral outbreak, asteroid strike, alien invasion, or whatever—is the focus of the story, it’s usually because it will be resolved by the end. It is, in such instances, the focus of the action, not the setup for the action.
By contrast, many of the post-apocalyptic narratives I’ve been looking at this year go out of their way to be vague about the nature of the precipitating catastrophe. Your average zombie apocalypse has little to say about what caused the dead to rise—few offer even as much exposition as the prologue of 28 Days Later, in which eco-terrorists storm a lab and inadvertently free monkeys infected with the Rage virus, or the vague suggestion in Night of the Living Dead that zombies are the result of radiation from a satellite. Shaun of the Dead offers the most perfect parody of this tendency when, at the end, safely ensconced on his couch once more, Shaun flips through TV channels reporting on the aftermath of the zombie plague and changes the channel before anyone can offer an explanation for how it happened.
Even the arguably bleakest post-apocalyptic narrative of the past twenty years—Cormac McCarthy’s The Road—deliberately frustrates readers keen to know how the world of the novel ended up a blasted wasteland. So too is Emily St. John Mandel’s far more hopeful Station Eleven conspicuously uninterested in the details of the “Georgia Flu” that devastates the world. The comparable pandemic in Kevin Brockmeier’s The Brief History of the Dead is less significant for how it wipes out the world’s population than for the simple fact that it does, which allows for the rather intriguing vision of the afterlife on which the novel is premised. And even all the gross, granular detail with which Stephen King endows “Captain Trips,” the superflu that wipes out 99.9% of America (and presumably the world, but that’s a speculation for a future post), is ultimately a bit of misdirection, as the novel then settles into its aforementioned epic battle between Light and Dark.
In all of these examples, catastrophe plays the narrative role of what film nerds like me call a “MacGuffin,” a concept most associated with Alfred Hitchcock. A MacGuffin, Hitchcock said, is something the characters find important, but the audience doesn’t care about—something that precipitates the action, but is ultimately ancillary to it, like the Maltese falcon of The Maltese Falcon, a putatively priceless statuette that drives the plot, but which is never found; or, more specific to Hitchcock, the money Marion Crane (Janet Leigh) steals from her employer in Psycho, but which is less important than the fact that in fleeing her crime she ends up at the Bates Motel.
Catastrophe has become one of our more prevalent MacGuffins: how the world ends is more or less incidental to what comes afterward. There are of course exceptions to this rule; this post comes about in part from my notes for my grad class on Monday, in which we’re finishing Ling Ma’s novel Severance and starting Mandel’s Station Eleven. The latter, as I mention above, is exemplary of this tendency to use catastrophe as a plot device; Severance, by contrast, is far more interested in the particulars of the pandemic that collapses civilization, mainly because the particulars of the “Shen Fever” are tied closely to the novel’s themes of nostalgia and home. Susceptibility to the disease, which reduces people to mindless automatons repeating rituals from their former lives, is most prevalent among those given to nostalgia. Candace Chen, the novel’s protagonist, came to the U.S. at a young age when her parents immigrated from China; in the novel’s present, both of her parents have died, and she feels equally not at home in New York City and in her home region of China, which she visits on business trips. Feeling generally rootless and untethered makes Candace ironically immune to the disease.
This thematic connection between the catastrophic pandemic and Candace’s situation—which in the novel is also more broadly representative of the millennial experience of late capitalism—makes the cause and particulars of the catastrophe central to the novel, in that both the characters and the audience care about it; this also makes Severance an instructive outlier in a burgeoning sub-genre full of catastrophic MacGuffins.
A large part of the reason for this relates to one of my overarching arguments about contemporary post-apocalyptic narratives, which is also one of the premises of my course: namely, that the preoccupation with the aftermath of catastrophe is indicative of a breakdown of faith and trust in government—not a new phenomenon by any means, but something that became supercharged by the Bush Administration’s post-9/11 failures in Iraq, the debacle of Hurricane Katrina, and the increasing polarization of politics and culture, all culminating in the election of Donald Trump. Trump was and is a catastrophic figure, and while I mean that quite literally, it’s important to keep in mind that being symbolically catastrophic—i.e., being seen by his followers as a bomb that would demolish “the Establishment”—was a huge part of his appeal. To many people, Trump is an expression of what I’ve been terming “hopeful nihilism,” which is also an animating factor in many post-apocalyptic narratives. Hopeful nihilism is the flip side of what Lauren Berlant terms “cruel optimism,” a condition in which the object of your desire is actually an obstacle to your flourishing; hopeful nihilism is the belief that burning down the present system clears the decks for a freer and more authentic existence in the wreckage.
Hence the general lack of interest in the nature of the catastrophe in these stories: the important thing isn’t the why and how, but the simple fact of civilization’s end. The catastrophe is a MacGuffin; the important thing is what happens next, and how the characters negotiate the circumstances. It may be that in time we see Trump in similar terms, which would actually go a long way towards explaining how a fatuous, preening New York billionaire became the symbol of defiance for a rump of resolutely “anti-elitist” people; perhaps the particulars of the Trumpian catastrophe were less important than the fact of it. It is, I admit, a comforting thought, as it suggests a huge difficulty for those who want to step into Trump’s role going forward—especially if (and it’s a big if) the Biden Administration can take steps to re-establish people’s faith in government’s ability to help its people.
What I’m Reading I first heard of Heather McGhee two or three years ago when she was interviewed on one of the political podcasts I listen to. She was then the president of Demos, a progressive think-tank focused on race, economics, and strategies to strengthen American democracy; I was immediately impressed by how clearly and articulately she broke down the inextricability of race and economic policy, and the ways in which Republicans have successfully sold white voters the idea of government spending as a zero-sum game in which every dollar that goes to help Black people and minorities is a dollar taken from them—and that government programs are somehow stealing from non-wealthy whites to benefit inner-city Blacks. And hence, non-wealthy whites have become reliable Republican voters who vote against their own interests in election after election.
To be clear, this is not a new insight. President Lyndon B. Johnson himself, who signed the Civil Rights Act into law in 1964 and the Voting Rights Act the following year, knew that he was alienating a significant portion of his own party. “Well, we’ve lost the South,” he is reported to have said on signing the civil rights legislation; he also famously acknowledged the principle on which Richard Nixon would successfully court southern white Democrats: “If you can convince the lowest white man he’s better than the best colored man, he won’t notice you’re picking his pocket. Hell, give him somebody to look down on, and he’ll empty his pockets for you.”
What impressed me about McGhee was how clearly she laid out the historical narrative, and how convincingly she argued her central premise: that systemic racism hurts everyone, white people included. I don’t remember which podcast I originally heard her on, but that’s become something of a moot point, as since then she’s been on all the podcasts—especially lately, since her book The Sum of Us: What Racism Costs Everyone and How We Can Prosper Together came out. Since that first podcast, she has resigned as Demos’ president and traveled the U.S., speaking to hundreds of experts, activists, historians, and ordinary people. The Sum of Us is the result, and it makes her original argument in an exhaustively detailed and forceful manner. It is an eminently readable book: personal without being subjective, wonky without losing itself in the weeds, and rigorously historical while still relating straightforward stories that persuasively bring home the societal costs of systemic racism. The one example she shares in all her interviews functions as the book’s central metaphor: starting in the 1920s, the U.S. invested heavily in public projects and infrastructure, including the construction of public pools. During the Depression, Roosevelt’s Works Progress Administration (WPA) continued this trend, using such community investment to generate jobs. By the 1950s, towns and cities across the country boasted ever-larger and more lavish public pools, which became a point of pride for communities—pools large enough, in some cases, to admit thousands of swimmers.
But such pools were, of course, whites-only. With the advent of desegregation in the mid-to-late 50s, courts ruled that these public pools were legally obliged to admit Blacks. Town and city councils responded swiftly, voting to drain the pools, fill them in with dirt, and seed them over with grass (in Georgia, the Parks and Recreation Department was simply eliminated, and was not resurrected for ten years). Affluent whites did not suffer: there was a concomitant boom in the construction of backyard pools and the establishment of private swimming clubs, which could effect de facto segregation by leaving membership decisions to the discretion of a governing board. But non-wealthy whites were suddenly left without a key option for summer recreation, all because their communities could not countenance sharing a publicly funded pool with their Black citizens. In what is one of the more pernicious elements of systemic racism, McGhee observes, many of the non-wealthy whites who could no longer bring their children to swim in one of these magnificent pools for free probably thought that this was a fair deal—better to go without than be obliged to share with people you’d been brought up to consider beneath you.
I am at present about halfway through The Sum of Us; look for a longer blog post when I’ve finished it. Meanwhile, I would suggest that this book should be required reading for our present moment.
What I’m Watching I wrote in my last post about how much I’m enjoying rewatching Battlestar Galactica, but just as Stephanie and I took a hiatus from that show to binge The Mandalorian, so again we’re taking a hiatus to watch The Stand—the recent mini-series adaptation of Stephen King’s mammoth 1978 novel in which 99.9% of the world is wiped out by a weaponized superflu nicknamed “Captain Trips,” and the remaining people of the U.S. gather in two opposing communities. On one side are the forces of good, who have been drawn to Boulder, Colorado by dreams of a 108-year-old Black woman named Mother Abigail. On the other are those drawn by promises of power, licentiousness, and revenge by the evil Randall Flagg, a denim-clad and cowboy-boot-shod demon in human form, who establishes his new society in (of course) Las Vegas.
As I’ve discussed a few times on this blog, last term I taught a fourth-year class on pandemic fiction; I did not include The Stand, in spite of the fact that it’s one of the few actual pandemic novels written prior to the 21st century, mainly because it is way too long (almost 1500 pages) to shoehorn into a semester-long course. Given its significance to the topic, however, I did record a short lecture in which I ran down the key themes and plot points (which you can watch here if you’re so inclined). But one of the things I found interesting in retrospect—I first read The Stand in high school, and then read it again when King published the unexpurgated version in 1990—after doing all the preparatory reading for my course, was how King transformed a story about a biological catastrophe into a Manichaean cosmic battle royale of light v. dark and good v. evil, with Mother Abigail as God’s surrogate and Randall Flagg as Satan’s proxy. While the novel meditates at length on the nature of civilization and how one pragmatically goes about rebuilding after the apocalypse—with 1500 pages, how could it not?—it is obvious that it’s the metaphysical war that most interested King.
We’re slightly more than halfway through the new adaptation, and quite enjoying it. It was quite badly reviewed; and while I can agree with some of the complaints, it has been on the whole well adapted to the screen, and (mostly) well cast. Alexander Skarsgard is at his menacing best as Randall Flagg; James Marsden is all wry southern charm as Texan Stu Redman; Greg Kinnear plays the professorial Glen Bateman with the right balance of pomposity and insight; Whoopi Goldberg basically plays Mother Abigail as a devoutly Christian Guinan with a head of white dreadlocks. My favourite, however, is Brad William Henke, who plays the mentally disabled Tom Cullen with a guileless, earnest simplicity that avoids stereotype (those who watched Justified will recognize Henke from season two as Coover Bennett, a similarly disabled character whose condition manifests instead in sociopathic violence).
There is much that is left out, and much that could have been done better, but on the whole it is a pretty satisfying adaptation of an intriguing but flawed novel (“intriguing but flawed” is how I’d characterize most of King’s oeuvre, but I suppose that is to be expected when you churn out an average of two brick-sized novels a year). If you like The Stand, or are just amenable to Stephen King more generally, I’d recommend this series.
What I’m Writing I recently dusted off an article-in-progress that had been mouldering for a year or two, on zombie apocalypse and celebrity; in a fit of energy I finished it and submitted it to a journal. I now have another that I’m looking to finish, on Emily St. John Mandel’s Station Eleven and nostalgia. Given that I’ll be doing that novel in both of my classes over the next two weeks, it seems an ideal time to return to it. Since it is also about apocalypse, though of the non-zombie variety—and indeed about a civilization-ending pandemic—I’ve been trying to rewrite my introduction to put it in the context of the past year. It’s been slow going, not least because finding the right balance between the personal and the objective can be tricky when your aim is to submit to a scholarly journal. But the overarching argument—that Mandel’s post-apocalyptic world, in which the main characters are actors and musicians travelling between settlements to perform Shakespeare and classical music, expresses a nostalgic desire to return to a pre-postmodern humanism—is, I think, a strong one. I just have to fill out a core section in which I discuss humanism in a more granular way.
(This process will also be useful, as it will give me a lot of lecture material).
On a related subject, I’ve also been working on a new essay on Terry Pratchett and Discworld. I have an article on Pratchett and his campaign for assisted dying coming out soon in a new collection; I’m trying to carry that momentum forward on a handful of Sir Terry essays, but the one I’ve been focusing on is a reading of the Discworld novels in the context of the philosophy of American Pragmatism and what I’ve been calling the “magical humanism” exhibited in a lot of contemporary fantasy.
As much as I love Sir Terry’s writing, I find it difficult to write about it in a scholarly manner, for the basic reason that I have trouble finding a focus and not running off madly in all directions. The essay I wrote for the collection, which came in way past deadline, needed to be cut from nine thousand words to six thousand (one of the essay’s blind reviewers said something to the effect of “this is obviously a piece of work gesturing to a much larger theory of Pratchett’s fiction,” which was at once gratifying and true). Part of the problem is the iterative nature of the Discworld’s world-building: each of the forty-one novels is a standalone narrative, but with each new installment Sir Terry modified and refined aspects of that world, returning to the same themes and preoccupations in such a way that it is close to impossible to discuss the political and philosophical preoccupations of a given novel without being obliged to reference a dozen others.
This isn’t the most conducive material for my intellectual temperament, which at the best of times is digressive and inclined to run down whatever rabbit holes I find, until I realize, several paragraphs on, that I’m discussing an entirely different topic. (Presumably, devoted readers of this blog will have noticed this.) That being said, it is a pleasure to lose myself in this topic … not least because I increasingly see Sir Terry’s humanism as a necessary antidote to our present toxic political moment.
Revisiting Battlestar Galactica When I noticed that BSG was available to stream on Amazon, I mentioned the fact in passing to Stephanie, who said she’d never watched it. I was surprised, but also delighted, as it gave me an excuse to rewatch the series and introduce it to her.
It is such a good series, and it has far more in common with such contemporary SF television as Firefly or The Expanse than it does with the original, hokey 70s series, which was derivative enough of Star Wars (its original title was to be Saga of a Star World) that George Lucas attempted to sue. The 2004 reboot maintained the original’s premise of a weathered battleship leading a ragtag fleet of humans who had survived a genocidal attack by the robotic race of Cylons in a search for the mythic planet Earth. It also kept the aesthetics of the battlestar and the Viper fighters, and the names of the main characters—Commander William Adama; his son and chief Viper pilot, Lee “Apollo” Adama; his second-in-command, Colonel Tigh; hotshot pilot Starbuck; the treacherous Gaius Baltar; and so on.
But aside from maintaining such continuities, the new version is darker, grittier, and abjures the campy quality of the original (something that, to be fair, tended to mark a lot of 70s-era SF, Star Wars included). The new version is also more diverse with respect to race and gender, with the always-brilliant Edward James Olmos in the role of Commander Adama, a crew that seems more or less to have gender parity, and the crucial role of firebrand Starbuck played by Katee Sackhoff.
This last change did not sit well with the original Starbuck, played by Dirk Benedict, whom you may also remember as “Faceman” Peck from The A-Team (though you could be forgiven if you don’t remember him from anything else). Benedict seems to have gone the route of other 80s actors of limited fame who re-emerge as conservative culture warriors: some time around 2008 he penned a blog post titled “Lt. Starbuck … Lost in Castration,” in which he excoriates the new version for feminizing the cigar-smoking, roguish lothario he played, and for otherwise embodying a world in which “40 years of feminism have taken their toll” and the “war against masculinity has been won.”
If you think you can stomach it, you should really read the post in its entirety, as it reads like a parody of butthurt masculinity; I remember reading it about twelve years ago and wondering at its ludicrousness, but in re-reading it today, it appears as prescient anticipation of the squalid online worlds of “men’s rights advocates,” incel culture, and Jordan Peterson acolytes. To give just one of the more egregious examples from the piece:
Women are from Venus. Men are from Mars. Hamlet does not scan as Hamletta. Nor does Hans Solo as Hans Sally. Faceman is not the same as Facewoman. Nor does a Stardoe a Starbuck make. Men hand out cigars. Women hand out babies. And thus the world for thousands of years has gone round.
(For the record, I cut and pasted this from his post, and only just now realized that Harrison Ford’s iconic character was German).
Benedict’s little temper tantrum is exemplary both of the kind of white male fan-rage that enveloped The Last Jedi and inspired the “Sad Puppies” campaign against the Hugo Awards, and of the pathetic whine of a mediocre actor watching one of the television properties that gave him his brief bout of fame being done better—and seeing a version of Lt. Starbuck played by an actor with greater depth and talent than his own, though it’s fairly obvious that the fact that she’s a woman is what prompted his rage.
Watership Down Yesterday in my Utopias & Dystopias class we started Richard Adams’ Watership Down, which I’ve been looking forward to all semester. I first read the novel when I was in high school, and I loved it enough that it almost erased the traumatic memory of seeing the animated 1978 film adaptation in the theatre. For those who haven’t read Watership Down, it’s a story about a group of wild rabbits trying to find a new home. Lest that make you think it is a cutesy story about bunnies, remember that rabbits are prey animals for just about every conceivable predator, and so the odyssey to find a new and safe home is beset with terror at every turn. The 1978 film cranked this up to eleven, positively glorying in the blood and violence and death, and doing so in that creepy 70s-style animation that always leaves me feeling weirded out. A friend’s father took us to see it, probably under the same misapprehension many adults had: that this would be a cute story about bunnies.
Not so much.
The novel doesn’t lack for terror and fear, but at least it doesn’t have the graphic dimension of the film … and it is also quite impressive in its world-building, giving the rabbits their own mythology and folklore. In fact, their origin story is precisely about how their creator made them prey animals as punishment for out-breeding all the other animals, but also gave them cunning and powerful hind legs that let them outrun their predators. Adams walks an interesting line between straightforward anthropomorphizing à la Disney animals and emphasizing the limitations rabbits would have (even these versions of rabbits, with language and lore) in making mental connections or doing simple counting; there is, however, a necessary amount of anthropomorphizing, and the rabbits all have subtle and nuanced characters.
When I asked my students what they thought of the novel, the consensus was that they did not expect the story they encountered, but that they liked it and found it compelling. (Which was a relief—I’m always leery of teaching a text I love for the first time, as it is often very disheartening when a majority of students express dislike or, worse, indifference).
But why is a novel about rabbits on a Utopias & Dystopias course? Well, because it embodies both kinds of story—it is about the rabbits leaving a home that the oracular character Fiver says is in danger (it is ultimately, we learn later, destroyed in the process of humans building new houses), and seeking out a place where they can live in peace and utopian safety. But the journey is markedly dystopian, as they must venture out into a hostile world populated by the thousand animals that want to eat them, but also by antagonistic rabbits who end up being the bigger threat (in this way, as I have blogged before, and as I suggested to my class, Watership Down is sort of like a zombie apocalypse narrative).
Also, I just love the novel, and sometimes that’s sufficient excuse to put it on a course.
I did not watch Oprah Winfrey’s much-hyped interview of Prince Harry and Meghan Markle for much the same reason I did not watch the two most recent royal weddings: I didn’t care. Especially at this point in time, between marking a year of pandemic and the ongoing reverberations of the Trump presidency, the travails of Harry and Meghan—even inasmuch as I sympathize with them against the Royal Family—don’t really do much to excite my imagination or interest.
On the other hand, the fallout from the interview, coupled with related issues and events, has piqued my interest indeed. That people will be instinctively partisan for one party or the other is about as unsurprising as learning that some people in “the Firm” fretted about whether or not Meghan’s first child would be dark-complexioned. Racism in the Royal Family? Get away! But of course, this particular charge was picked up by right-wing pundits as further evidence of “cancel culture” at work, and we’ve been treated to the bizarre spectacle of self-described red-blooded American patriots rushing to the defense of HM Queen Elizabeth II.1
Someone might want to remind them just what those boys at Lexington and Concord died for. Or perhaps tell them to watch Hamilton.
Notably, the one person emerging not just unscathed but burnished from the interview was the Queen herself—both Harry and Meghan were careful to say that none of the difficulties they’ve experienced emanated from her, and that she has indeed been the one person who is blameless (some reports have read between the lines and extrapolated that the Queen was prescient enough to have given Harry funds to see him through being cut off financially).
Leaving aside for the moment the possibility, or even the likelihood, that this is entirely true, this sympathy is reflective of a broader reluctance to be critical of Elizabeth II. Even the 2006 film The Queen, starring Helen Mirren in the title role, which was all about the Palace’s cold and inept response to the shocking death of Diana, ended up painting a vaguely sympathetic portrait (though to be fair, that has a lot to do with the virtuosity of Helen Mirren). And The Crown (created and written by Peter Morgan, who also wrote The Queen), which is largely unsparing of all the other royals and their courtiers, generally depicts Elizabeth as a victim of circumstance who spends her life doing her level best to fulfill her royal duty, constrained by that very sense of duty from being a more compassionate and loving human.
The Queen is a person whom, I would argue, people tend to see through a nostalgic lens: nostalgia, in this case, for a form of stiff-upper-lip, keep-calm-and-carry-on Britishness memorialized in every WWII film ever—something seen as lost in the present day, along with Britannia’s status in the world. As we have seen in much of the pro-Brexit rhetoric, these two losses are not perceived as unrelated; and seeing Queen Elizabeth as the cornerstone of an ever-more-fractured Royal Family is a comforting anchor, but one that grows more tenuous as she ages.
There’s an episode in season four of The Crown that articulates this sensibility. In it, Elizabeth, having grown concerned that her children might not appreciate the scale and scope of the duties they’ve inherited, meets with each of them in turn and is perturbed by their feckless selfishness. Charles is in the process of destroying his marriage to Diana; Andrew is reckless in his passions; Anne is consumed by resentment and anger; and Edward is at once isolated by his royal status at school and indulgent in his royal privilege. Though her disappointment in her spawn is never put into words, it is obvious (Olivia Colman can convey more with her facial expressions than I can in ten thousand words), and The Crown effectively indicts the younger generation of royals as unworthy of their status, and definitely unworthy of the throne.
This, I think, is where we’re at right now with Harry and Meghan’s interview. I’ve joked on occasion that “shocked but not surprised” should be the title of the definitive history of the Trump presidency, but it might also function as a general sentiment for this particular epoch. It is difficult to put one’s finger precisely on the substance of the outrage over Meghan’s revelations, aside from an instinctive royalist animus directed at anyone with the temerity to criticize the monarchy. This is why, perhaps, some (<cough> <cough> PIERS MORGAN <cough>) have simply chosen to call bullshit on Meghan Markle’s story of mental health issues and suicidal ideation;2 but it was the charge of racism that seems to have become the most ubiquitous bee in a whole lot of bonnets. Shocking, yes; surprising, no. The entire British colonial enterprise was predicated on the premise of white English supremacy, and royal houses of all nationalities have always been assiduous in policing their bloodlines. Prior to the divorce of Charles and Diana amid revelations of his relationship with Camilla Parker-Bowles, the greatest scandal the British monarchy had weathered was the abdication of Edward VIII so he could marry his American divorcée paramour, Wallis Simpson. Meghan Markle, it has been noted by many, ticks two of those scandalous boxes insofar as she is American and a divorcée.
She is also, to use postcolonial theorist Homi Bhabha’s phrasing, “not white/not quite.” Which is to say, she is biracial, and as such will never be qualified to be a royal in a stubborn subsection of the British cultural imagination.
The fascination many people have with the British Royal Family—especially among those who aren’t British—has always baffled me more than a little. But on honest reflection, I suppose I shouldn’t be baffled. In spite of the fact that hereditary monarchy is an objectively terrible form of governance, it is also one of the most durable throughout history. Human beings, it seems, are suckers for dynastic power, in spite of the illogic of its premise; as the late great Christopher Hitchens wryly observed, being the eldest son of a dentist does not somehow confer upon you the capacity to be a dentist. And yet down through the centuries, people have accepted that the eldest son (and occasionally daughter) of the current monarch had the right to assume the most important job in the nation on that monarch’s passing.
Of course, “right” and “ability” don’t always intersect, and there have been good, bad, and indifferent kings and queens down through history (of course, being democratically elected is no guarantee of governing ability, but at least the people have the option of cancelling your contract every few years). For every Henry V there’s a Richard III, and we’re equally fascinated by both, while mediocre kings and queens who preside over periods of relative peace don’t tend to get the dramatic treatment.
Indeed, on even just a brief reflection, it’s kind of amazing just how pervasive the trope of monarchy is in literature and popular culture more broadly. It is unsurprising that Shakespeare, for example, would have made kings and queens the subject of many of his plays—that was, after all, the world in which he lived—but the persistence of hereditary monarchy in the 20th century cultural imagination is quite remarkable. It’s pretty much a staple of fantasy, as the very title of Game of Thrones attests; but where George R.R. Martin’s saga and its televisual adaptation are largely (but sadly not ultimately)3 rooted in a critique of the divine right of kings and the concept of the “chosen one,” the lion’s share of the genre rests in precisely the comfort bestowed by the idea that there is a true king out there perfectly suited to rule justly and peaceably.
More pervasive and pernicious than Shakespearean or Tolkienesque kings and queens, however, is the Disney princess-industrial complex. Granted, the fairy-tale story of the lowly and put-upon girl finding her liberatory prince pre-dates Walt Disney’s animated empire by centuries, but I think we can all agree that Disney has at once expanded, amplified, and sanded down the sharp edges of the princesses’ folkloric origins—all while inculcating in millions of children the twinned conceptions of royalistic destiny and the heteronormative gender roles associated with hereditary nobility (to be fair to Disney, it has done better with such recent excursions as Brave and Frozen—possibly the best endorsement of the latter’s progressiveness is the fact that Jordan Peterson loathes it). It’s telling that Disney’s most prominent branding image isn’t Mickey Mouse, but the Disney castle,4 a confection of airy spires and towers that any medievalist would tell you defeats the purpose of having a castle to start with. Even your more inept horde of barbarians would have little difficulty storming those paper-thin defenses, but then it’s not the bulwarks and baileys that are important, but the towers … the towers, built to entrap fair maidens until their rescuing princes can slip the lock or scale the wall.
I have to imagine that a large part of the obsession over royal weddings proceeds from precisely this happy-ending narrative on which the Mouse has built its house: the sumptuous spectacle of excess and adulation that evokes, time and again, Cinderella’s arrival at the ball. The disruption of this mythos is at once discomforting and titillating: Diana’s 1995 interview presaged Harry and Meghan’s with its revelations of constraint and isolation, and the active antagonism of both the Royal Family and its functionaries toward any sort of behaviour that might reflect badly upon it—even if that behaviour simply entailed seeking help for mental health issues. There have been many think-pieces breaking down which elements of The Crown are fact and which are fiction, but it is at this point fairly well established wisdom that being born a Windsor—or marrying into the family—is no picnic. And while Meghan’s claim that she never Googled Harry or his family strains credulity, I think it’s probably safe to say that no matter how much research one does, the realities of royal life almost certainly beggar the imagination.
Also, The Crown was only in its second season when Meghan married Harry.
I confess that, aside from the very first episode, I did not watch the first three seasons of The Crown, the principal reason being that I couldn’t get my girlfriend Stephanie into the show. While I may be more or less indifferent to the British monarchy, Stephanie is actively hostile5 to it. Born in South Africa, she and her family came to Canada when she was fourteen; having imbibed an antipathy to her birth nation’s colonizer that is far more diffuse in Canada, she gritted her teeth through the part of her citizenship oath in which she had to declare loyalty to the Queen. Her love of Gillian Anderson (Stephanie is, among her other endearing qualities, the biggest X-Files fan I’ve ever met) overcame her antipathy, however, for season four, and so we gleefully watched the erstwhile Agent Scully, transformed into the Iron Lady, spar with Olivia Colman’s Queen Elizabeth (we’re also pretty simpatico in our love of Olivia Colman). With each episode, we reliably said (a) Olivia, for the love of Marmite, don’t make us sympathetic with the Queen!; (b) Gillian, please don’t make us feel sympathy for/vague attraction to Margaret Thatcher!; and, (c) Holy crap, Emma Corrin looks so much like Lady Di!
It will be interesting to see The Crown catch up with the present moment. But I also have to wonder if some commentators are right when they say that the Harry and Meghan split from the Firm signals the end of the British monarchy? To my mind, by all rights it should: it’s long past time this vestige of colonial hubris went into that good night. We’ve got enough anti-democratic energy to deal with in the present moment without also concerning ourselves with a desiccated monarchy. When Queen Elizabeth dies, with her dies the WWII generation. The Second World War transformed the world in countless ways, one of them being that it spelled the end of the British Empire and the diminution of Great Britain’s influence in the world. Brexit is, among other things, a reactionary response to this uncomfortable reality, and a vain, desperate attempt to reassert Britannia’s greatness. Across the pond, fellow nativists in the U.S. have latched onto Meghan Markle’s accusations of racism to make common cause with the monarchy. Not, perhaps, because they’ve forgotten the lessons of 1776, but most likely because they never learned them to start with.
1. Perhaps the stupidest defense came from Fox and Friends’ co-host Brian Kilmeade, who opined that the fact that British Commonwealth countries are “75% Black and minority” demonstrated that the Royal Family could not possibly be racist. Leaving aside the pernicious history of colonialism and the kind of white paternalism epitomized by the Rudyard Kipling poem “The White Man’s Burden,” can we perhaps agree that Kilmeade’s juxtaposition of “75%” and “minority” sort of gives the game away?
2. I’ve always felt that Piers Morgan was the result of a lab experiment in which a group of scientists got together to create the most perfect distillation of an asshole. Even if we grant his premise that Meghan Markle is, in fact, a status-seeking social climber who has basically Yoko Ono’ed Prince Harry out of the Royal Family, his constant ad hominem attacks on her say more about his own inadequacies than hers. And for the record, I do not grant his premise: to borrow his own turn of phrase, I wouldn’t believe Piers Morgan if he was reading a weather report.
3. We may never know how George R.R. Martin means to end his saga—at the rate he’s going, he’ll be writing into his 90s, and I don’t like his actuarial odds—but we do know how the series ended. The last-minute transformation of Daenerys into a tyrant who needed to be killed could conceivably have been handled better if the showrunners had allowed for two or three more episodes to bring us there; but the aftermath was also comparably rushed, and Sam Tarly’s democratic suggestion for egalitarian Westerosi governance was laughed off without any consideration. I will maintain to my dying day that GRRM effectively transformed fantasy, but also that he was too much in thrall to its core tropes to wander too far from their monarchical logic.
4. I recently bought a Disney+ streaming subscription in order to watch The Mandalorian. While writing this post, I remembered that Hamilton’s sole authorized video recording is Disney property. So of course I immediately clicked over to Disney+ to watch parts of it, and was treated to the irony of a play about the American Revolutionary War to overthrow monarchical tyranny prefaced by Disney’s graphic of its castle adorned with celebratory fireworks.
5. When I read this paragraph to Stephanie, she liked all of it but objected to my use of the word “hostile.” “I don’t actually hate the Royal Family,” she said. “I don’t wish them harm. I just find the entire idea pointless and antiquated, and it embodies some of the worst aspects of British history.” So: she’s not hostile to the Royal Family, but I’m at a loss to find a better word, especially considering the invective she hurls at England during the World Cup.
Scrolling back through my posts since 2021 began, I’m struck by the general bleakness of a lot of what I’ve been writing about … which is not surprising, given that much of it has had to do with American politics, and I’ve been writing as we approach the one-year anniversary of the COVID pandemic, with the prospect of it persisting as late as the autumn.
I’m not one for such practices as daily affirmations, but sometimes it’s helpful to remind oneself of what is making you happy. One of my favourite podcasts is NPR’s Pop Culture Happy Hour; once a week they have a segment titled “what’s making us happy this week,” in which the hosts share, well, what’s making them happy that week—what books or films or TV or other things give them delight.
Well, I’m stealing the idea. I’ve already done so on occasion when I teach—I will ask my students every so often what’s making them happy—though I’m usually treated to silence punctuated now and then with an enthusiastic endorsement of something. But I’m now bringing it to my blog. Don’t expect it to be weekly, though.
So what’s making me happy right now?
Stephanie on the Guitar and Sir Terry Pratchett
Since posting my QAnon piece this morning, I’ve been working away on an article that I’ve had stewing on my brain’s back burner for a long time: I’m titling it “The Pragmatic Pratchett,” and it argues that the political and moral philosophy that Sir Terry developed over his forty-one Discworld novels (and his other fiction and non-fiction) is a form of “magical humanism” that squares up quite nicely with the American school of Pragmatism à la John Dewey, Richard Rorty, and Judith Shklar. We’re currently on midterm break here, so I’m trying to pound out one thousand words a day and have something approaching a rough draft by the time classes resume in a week. I hit 1400 words today, and will continue work on it after dinner.
Meanwhile, as I write in my office, Stephanie is in the next room doing something musical. One of her hobbies is to record songs and post videos of herself playing to YouTube. Lately, she’s been getting into the project of making backing tracks for people to play guitar over. But today she’s broken out the guitar again, and so as I write about Sir Terry, I can hear her playing. It’s quite lovely—it’s almost like being serenaded.
I love watching her get absorbed in whatever project she’s working on. It is a form of self-care, a mental respite from her job as a full-time nurse. She is a perfectionist when it comes to her musical projects, and will spend hours plugged into her laptop and various musical doodads (I am, it should go without saying, musically illiterate). I like to joke that she’s obsessive and I’m compulsive, and that together we make up a complete neurosis.
The Mandalorian
I finally caved and subscribed to Disney Plus. Stephanie and I binged the two extant seasons of The Mandalorian over the course of a week, watching the season two finale last Wednesday. Since then I have been rewatching some of my favourite scenes on my laptop, as well as watching the various fan reaction videos.
Jon Favreau just gets it—he gets the texture of the Star Wars universe, he gets the aspects of it that make for good stories (and eschews those that don’t), and somehow he made a legion of viewers fall in love with a goddamn muppet.
I loved the Ewoks when I first watched Return of the Jedi—but then again, when I first watched Jedi, I was eleven years old. The cloying cute little teddy bears of Endor have not aged well, so when I first heard the name “Baby Yoda” and saw the images, I was skeptical—another merchandising opportunity, I thought, at the expense of good storytelling. Well, I don’t mind admitting how wrong I was—Baby Yoda, aka Grogu, was impossibly cute, but somehow not cloyingly so. And the relationship between “the kid” and Din Djarin was quite beautifully done—a testament to Pedro Pascal’s acting chops, considering that we see his face all of three times over sixteen episodes.
(Fun fact: if you binge The Mandalorian not long after binging Schitt’s Creek, as Stephanie and I did, it is nearly impossible not to shout—in one’s best Moira Rose voice—“the Bébé!” every time Baby Yoda shows his face).
And the casting. Jeebus, the casting. I got the sense as I watched that Jon Favreau would just call up friends and say “Hey, I’m doing this Star Wars thing, you want in?” Such a great ensemble of actors. There is something exquisite about hearing Werner Herzog say, “I hear you are ze best in ze parzec.” There is something equally exquisite in seeing Giancarlo Esposito bring all of his equable Gus Fring menace to the role of Moff Gideon. Ming-Na Wen as a deadly assassin? Yes please. Timothy Olyphant playing a variation on Seth Bullock and Raylan Givens as the marshal of a mining town on Tatooine? Gods, yes. Bill Burr as an irascible mercenary thief? Natalia Tena, aka Nymphadora Tonks, as a hissing, blade-throwing alien? Richard Ayoade as the voice of a priggish but deadly droid? Taika Waititi, who also directed a few episodes, as a droid programmed to care for Baby Yoda? Jason Sudeikis and Adam Pally bantering as a pair of inept stormtroopers? Also: considering that we interrupted our viewing of Battlestar Galactica (Stephanie had never seen it, so I felt it my moral obligation to introduce her to it) to watch The Mandalorian, Katee Sackhoff’s appearance as fellow Mandalorian Bo-Katan felt particularly apropos.
But one of my favourite cameos also relates to the way Jon Favreau is building out the post-Return of the Jedi universe, in which the New Republic must now actually govern. It’s a new normal in which the X-wing pilots are no longer the heroic flyboys and -girls of the movies, but are essentially cops on a beat, who give Mando grief for his ship’s broken transponder in the same way an exasperated traffic cop might give you a pass on a broken taillight. In a later episode of season two, a dumpy, balding X-wing pilot suggests to Cara Dune that she should take on the role of Marshal in her town, now that the former Empire has been more or less expunged. “Wait,” I said as we watched, “isn’t that the father from Kim’s Convenience?” And indeed it was—Korean-Canadian actor Paul Sun-Hyung Lee (who, according to his IMDb page, “Is a member of the Star Wars costuming group The 501st Legion”).
It was the clipboard in the pilot’s hand that made it art.
Pie-Making … or, as I like to pronounce it, “PAH!”
Given that I’m not a desserts person, the vast majority of my pies are savoury. I’ve made it something of a custom, when I make a roast chicken, to make stock from the carcass the next day and use the meat coming off the bone to make stew, which then goes to making chicken pot pie. I have also done the same with leftover roast beef.
I recently decided I wanted to learn to make traditional British pork pies—specifically, the classic Melton Mowbray pie, which is basically just a whole lot of finely (or coarsely) chopped cuts of pork, cooked in a narrow but tall pie crust. I ordered a dedicated pie tin online, along with a book of savoury pie recipes. My first attempt was tasty, but I did not make it with the traditional aspic that is part of the recipe—a glaring omission, as a handful of people on Facebook observed. To be fair, I’d looked at a bunch of recipes, some of which called for the use of powdered gelatin, while the more involved recipes would have had me boiling pork bones. Given that my access to pork bones in St. John’s is limited, I’d bought the gelatin … and then decided to do a trial run without, just to see about getting the taste right.
I want to try again and do it properly—there’s a small butcher shop just around the corner, and I was planning on popping in there to ask about the whole pork bones thing. But then we had an outbreak of new COVID cases in Newfoundland that put us back into level five lockdown … so there’s no popping into the local butcher’s for a while, anyway.
Meanwhile, I ordered more pie tins—4” across like the original one, but half the height. Which makes for an ideal single-serving pie. Last night I made prime rib for dinner. As I write, four beef and mushroom pies are in the oven. When the lockdown lifts, I’ll return to my project of perfecting the Melton Mowbray pie; until then, I’ll work with the classics.