
History, Memory, and Forgetting Part 3: The Backlash Against Remembering

“The struggle of man against power is the struggle of memory against forgetting.”

—Milan Kundera The Book of Laughter and Forgetting

I had drafted the first two installments of this three-part series of posts and was working on this one when the news of the discovery of the remains of 215 indigenous children on the grounds of a former residential school in BC broke. I paused for a day or two, uncertain whether I wanted to talk about it here, in the context of my broader theme of history, memory, and forgetting. Part of my hesitation is that I honestly lack the words for what is a shocking but utterly unsurprising revelation.

I must also confess that, to my shame, one of my first thoughts was the dread certainty that we’d soon be treated to some tone-deaf and historically ignorant column from the likes of Rex Murphy or Conrad Black or any one of the coterie of residential school apologists. So far, however, the usual suspects seem to be steering clear of this particular story; possibly the concrete evidence of so much death and suffering perpetrated by white turnkeys in the name of “civilizing” Native children is a bridge too far even for Murphy and Black et al.’s paeans to Western civilization and white Canada’s munificence.

What I’m talking about in this third post of three is remembering as a political act: more specifically, the act of making others remember aspects of our history that they may not want to accept or believe. Scouting as I did for anything trying to ameliorate or excuse or explain away this evidence of the residential schools’ inhumanity,[1] I found my way to a 2013 Rex Murphy column that might just be the epitome of the genre, as one gleans from its title: “A Rude Dismissal of Canada’s Generosity.” In Murphy’s telling, in this column as in others of his rants and writings, conditions for Native Canadians in the present day are a vast improvement over historical circumstances, but the largesse of white Canadians and the government is something our indigenous populations perversely deny at every turn. He writes: “At what can be called the harder edges of native activism, there is a disturbing turn toward ugly language, a kind of razor rhetoric that seeks to cut a straight line between the attitudes of a century or a century and a half ago and the extraordinarily different attitudes that prevail today.”

“Attitudes” is the slippery word there: outside of unapologetically anti-indigenous and racist enclaves, I doubt you’d have difficulty finding white Canadians who would piously agree that the exploitation and abuse of our indigenous peoples was a terrible thing. You’d have a much harder time finding anyone willing to do anything concrete about it, such as restoring the land we cite in land acknowledgments to its ancestral people. Attitudes, on balance, have indeed improved, but that has had little effect on Native peoples’ material circumstances. And in his next paragraph, Murphy seems intent on demonstrating that not all attitudes have, in fact, improved:

From native protestors and spokespeople there is a vigorous resort to current radical jargon—referring to Canadians as colonialist, as settlers, as having a settler’s mentality. Though it is awkward to note, there is a play to race in this, a conscious effort to ground all issues in the allegedly unrepentant racism of the “settler community.” This is an effort to force-frame every dispute in the tendentious framework of the dubious “oppression studies” and “colonial theory” of latter-day universities.

And there it is—the “radical jargon” that seeks to remember. Referring to Canadians as colonialist settlers isn’t radical, nor is it jargon, but a simple point of fact—and indeed, for decades history textbooks referred to settlers as brave individuals and the colonizing of Canada as a proud endeavour, necessarily eliding the genocidal impact on the peoples already inhabiting the “new world.” Murphy’s vitriol is, essentially, denialism: denying that our racist and oppressive history lingers on in a racist present. He speaks for an unfortunately large demographic of white Canada that is deeply invested in a whitewashed history, and reacts belligerently when asked to remember things otherwise.

This is a phenomenon we see playing out on a larger scale to the south, most recently with a substantial number of Republican-controlled state legislatures introducing bills that would forbid schools from teaching any curricula suggesting that the U.S. is a racist country, that it has had a racist history, or really anything that suggests racism is systemic and institutional rather than an individual failing. The bogeyman in much of the proposed legislation is “critical race theory.” Like Rex Murphy’s sneering characterization of “latter-day universities” offering degrees in “oppression studies” (not actually a thing), critical race theory is stigmatized as emerging from the university faculty lounge as part and parcel of “cultural Marxism’s” sustained assault on the edifices of Western civilization.[2] While critical race theory did indeed emerge from the academy, it was (and is) a legal concept developed by legal scholars like Derrick Bell and Kimberlé Crenshaw[3] in the 1970s and 80s. As Christine Emba notes, “It suggests that our nation’s history of race and racism is embedded in law and public policy, still plays a role in shaping outcomes for Black Americans and other people of color, and should be taken into account when these issues are discussed.” As she further observes, it has a clear and quite simple definition, “one its critics have chosen not to rationally engage with.”

Instead, critical race theory is deployed by its critics to connote militant, illiberal wokeism in a manner, to again quote Emba, that is “a psychological defense, not a rational one”—which is to say, something meant to evoke suspicion and fear rather than thought. The first and third words of the phrase, after all, associate it with elite liberal professors maundering in obscurantist jargon, with which they indoctrinate their students, turning them into shrill social justice warriors. (The slightly more sophisticated attacks will invoke such bêtes noires of critical theory as Michel Foucault or Jacques Derrida[4]).

But again, the actual concept is quite simple and straightforward: racism is systemic, which should not be such a difficult concept to grasp when you consider, for example, how much of the wealth produced in the antebellum U.S. was predicated on slave labour, especially in the production of cotton—something that also hugely benefited the northern free states, whose textile mills profitably transformed the raw cotton into cloth. Such historical realities, indeed, were the basis for the 1619 Project, the New York Times’ ambitious attempt to reframe American history through the lens of race—arguing that the true starting point of America was not the Declaration of Independence in 1776, but the arrival of the first African slaves on American soil in 1619.

The premise is polemical by design, and while some historians took issue with some of the claims made, the project was an exercise in remembering aspects of a national history that have too frequently been elided, glossed over, or euphemized. In my previous post, I suggested that the forgetting of the history of Nazism and the Holocaust—and its neutering through the overdeterminations of popular culture—has facilitated the return of authoritarian and fascistic attitudes. Simultaneously, however, it’s just as clear that this revanchist backsliding in the United States has as much to do with remembering. The tearing down of Confederate monuments that inspires this reactionary crouch isn’t about “erasing history,” but about remembering it properly: remembering that Robert E. Lee et al. were traitors to the United States and were fighting first and foremost to maintain the institution of slavery. Apologists for the Confederacy aren’t wrong when they say that the Civil War was fought over “states’ rights”; they’re just eliding the fact that the principal “right” being fought for above all others was the right to enslave Black people. All one needs to do to clarify this particular point is to read the charters and constitutions of the secessionist states, all of which make the supremacy of the white race and the inferiority of Africans their central tenet. The Confederate Vice President Alexander Stephens declared that the “cornerstone” of the Confederacy was that “the negro is not equal to the white man; that slavery, subordination to the superior race, is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”

Statue of Nathan Bedford Forrest in Memphis, Tennessee

Bringing down Confederate monuments isn’t erasure—it’s not as if Robert E. Lee and Nathan Bedford Forrest disappear from the history books because people no longer see their bronze effigies in parks and town squares—but the active engagement with history. That was also the case with the erection of such monuments, albeit in a more pernicious manner: the vast majority were put up in the 1920s and 1950s, both of which were periods when white Americans felt compelled to remind Black Americans of their subordinate place by memorializing those who had fought so bloodily to retain chattel slavery. Like the Confederate battle flag, these monuments were always already signifiers of white supremacy, though that fact has been systematically euphemized with references to “southern tradition,” and of course the mantra of states’ rights.

Martin Sheen as Robert E. Lee in Gettysburg (1993)

Once when I was still in grad school and had gone home to visit my parents for a weekend, I was up late watching TV, thinking about going to bed, and while flipping channels happened across a film set during the Civil War. It was about halfway through, but I stayed up watching until it was done. The story was compelling, the acting was really good, and the battle scenes were extremely well executed. The film was Gettysburg, which had been released in 1993. It had a huge cast, including Jeff Daniels and Sam Elliott, and Martin Sheen as General Robert E. Lee. Because I’d only seen the second half, I made a point of renting it so I could watch the entire thing.

Gettysburg is a well-made film with some great performances, and is very good at pushing emotional buttons. The bayonet charge down Little Round Top led by Colonel Joshua Chamberlain (Jeff Daniels) is a case in point.

I’m citing Gettysburg here because it is one of the most perfect examples of how deeply the narrative of the Lost Cause became rooted in the American imagination. The Lost Cause, for the uninitiated, is the ongoing project, begun just after the end of the Civil War, to recuperate and whitewash (pun intended) the Confederacy and the antebellum South. Its keynotes include the aforementioned insistence that the Civil War wasn’t fought over slavery but states’ rights; its foregrounding of cultured and genteel Southern gentlemen and women; depictions of slaves (when depicted at all) as happy spiritual-singing fieldhands under the benevolent supervision of the mastah; Northerners as rapacious carpetbaggers who proceeded to despoil everything good and noble about the South in the years following the war; and the war itself as a tragic but honourable dispute between sad but dutiful officer classes prosecuting their respective sides’ goals not in anger but sorrow.

This last element is Gettysburg’s connective tissue. Let’s be clear: the film isn’t Southern propaganda like D.W. Griffith’s Birth of a Nation. It is, indeed, high-minded and even-handed. Martin Sheen—Martin fuckin’ Jed Bartlet Sheen!—plays Robert E. Lee, one of the prime targets of statue removers. But the film is propagandistic: there is, to the best of my recollection, not a single Black character in the film, and slavery as the cause of the conflict is alluded to (again, to the best of my recollection) only once—and then only obliquely.

My point in using this example—I could just as easily have cited Gone With the Wind or Ken Burns’ docuseries The Civil War—is how insidiously the Lost Cause narrative has wormed its way into the American imaginary. It is of a piece with everything else I’ve been talking about in this three-part series of posts. What novels like The Underground Railroad and historical reckonings like the 1619 Project—as well as the campaign to tear down Confederate monuments—attempt is a kind of radical remembering. And as we see from the ongoing backlash, to remember things differently can be threatening to one’s sense of self and nation.

NOTES


[1] As I said, none of the usual suspects seems to have advanced an opinion, but there was—not unpredictably—an awful lot of such attempts in the comments sections on articles about the grisly discovery. They ranged from the usual vile racist sentiments one always finds in these digital cesspools, to one person who argued at length that the child mortality rates in residential schools were consonant with child mortality rates in the population at large, nineteenth-century hygiene and medicine being what they were. This individual was undeterred from their thesis in spite of a long-suffering interlocutor who provided stats and links showing (a) that what the person was referencing was infant mortality rates, which is not the same thing; (b) that the death rates in residential schools were actually more egregious in the 1930s and 40s; and (c) that mass burials in unmarked graves without proper records and death certificates spoke to the dehumanization of the Native children on one hand, and, on the other, to the likelihood that the “teachers” at these schools were reluctant to leave evidence of their abusive treatment.

[2] I will have a lot more to say on this particular misapprehension of “the university faculty lounge” in a future post on the more general misapprehensions of what comprises a humanities degree.

[3] Crenshaw also developed that other concept that triggers conservatives, “intersectionality.”

[4] Stay tuned for my forthcoming post, “The Conspiracy Theory of Postmodernism.”


History, Memory, and Forgetting Part 2: Forgetting and the Limits of Defamiliarization

“We cross our bridges when we come to them, and burn them behind us, with nothing to show for our progress except a memory of the smell of smoke, and a presumption that once our eyes watered.”

—Tom Stoppard, Rosencrantz and Guildenstern are Dead

In my first year at Memorial, I taught one of our first-year fiction courses. I ended the term with Martin Amis’ novel Time’s Arrow—a narrative told from the perspective of a parasitic consciousness that experiences time backwards. The person to whom the consciousness is attached turns out to be a Nazi war criminal hiding in America, who was a physician working with Dr. Mengele at Auschwitz. Just as we get back there, after seeing this man’s life played in reverse to the bafflement of our narrator, the novel promises that things will start to make sense … now. And indeed, the conceit of Amis’ novel is that the Holocaust can only make sense if played in reverse. Then, it is not the extermination of a people, but an act of benevolent creation—in which ashes and smoke are called down out of the sky into the chimneys of Auschwitz’s ovens and formed into naked, inert bodies. These bodies then have life breathed into them, are clothed, and sent out into the world. “We were creating a race of people,” the narrative consciousness says in wonder.

Time’s Arrow, I told my class, is an exercise in defamiliarization: it wants to resist our becoming inured to the oft-repeated story of the Holocaust, and so requires us to view it from a different perspective. Roberto Benigni’s film Life is Beautiful, I added, worked to much the same end, by (mostly) leaving the explicit brutalities of the Holocaust offstage (as it were), as a father clowns his way through the horror in order to spare his son the reality of their circumstances. As I spoke, however, and looked around the classroom at my students’ uncomprehending expressions, I felt a dread settle in my stomach. Breaking off from my prepared lecture notes, I asked the class: OK, be honest here—what can you tell me about the Holocaust?

As it turned out, not much. They knew it happened in World War II? And the Germans were the bad guys? And the Jews didn’t come out of it well …? (I’m honestly not exaggerating here). I glanced at my notes, and put them aside—my lecture had been predicated on the assumption that my students would have a substantive understanding of the Holocaust. This was not, to my mind, an unreasonable assumption—I had grown up learning about it in school by way of books like Elie Wiesel’s Night, but also seeing movies depicting its horrors. But perhaps I misremembered: I was from a young age an avid reader of WWII history (and remain so to this day), so I might have assumed your average high school education would have covered these bases in a more thorough manner.[1]

The upshot was that I abandoned my lecture notes and spent the remaining forty minutes of class delivering an off-the-cuff brief history of the Holocaust that left my students looking as if I’d strangled puppies in front of them, and me deeply desiring a hot shower and a stiff drink.

In pretty much every class I’ve taught since, I have reliably harangued my students about the need to read more history. To be fair, I’d probably do that even without having had this particular experience; but I remember thinking, apropos of Amis’ brilliant narrative conceit, that defamiliarization only works if there has first been familiarization, and it depressed me to think that the passage of time brings with it unfamiliarization—i.e. the memory-holing of crucial history that, previously, was more or less fresh in the collective consciousness. The newfound tolerance for alt-right views and the resurgence of authoritarian and fascist-curious politics (to which the Republican Party is currently in thrall) proceed from a number of causes, but one of them is the erosion of memory that comes with time’s passage. The injunctions against fascism that were so powerful in the decades following WWII, when the memory of the Holocaust was still fresh and both the survivors and the liberators were still ubiquitous, have eroded—those whose first-hand testimonials gave substance to that history have largely passed away. Soon none will remain.

What happens with a novel like Time’s Arrow or a film like Life is Beautiful when you have audiences who are effectively ignorant of the history informing their narrative gambits? Life is Beautiful, not unpredictably, evoked controversy because it was a funny movie about the Holocaust. While it was largely acclaimed by critics, there were a significant number who thought comedy was egregiously inappropriate in a depiction of the Holocaust,[2] as was using the Holocaust as a backdrop for a story focused on a father and his young son. As Italian film critic Paolo Mereghetti observes, “In keeping with Theodor Adorno’s idea that there can be no poetry after Auschwitz, critics argued that telling a story of love and hope against the backdrop of the biggest tragedy in modern history trivialized and ultimately denied the essence of the Holocaust.” I understand the spirit of such critiques, given that humour—especially Roberto Benigni’s particular brand of manic clowning—is jarring and dissonant in such a context, but then again, that’s the entire point. The film wants us to feel that dissonance, and to interrogate it. And not for nothing, but for all of the hilarity Benigni generates, the film is among the most heartbreaking I’ve seen, as it is about a father’s love and his desperate need to spare his son from the cruel reality of their circumstances. Because we’re so focused on the father’s clownish distractions, we do not witness—except for one haunting and devastating scene—the horrors that surround them.

In this respect, Life is Beautiful is predicated on its audience being aware of those unseen horrors, just as Time’s Arrow is predicated on its readers knowing the fateful trajectory of Jews rounded up and transported in boxcars to their torture and death in the camps, to say nothing of the grotesque medical “experiments” conducted by Mengele. The underlying assumption of such defamiliarization is that an oft-told history such as the Holocaust’s runs the risk of inuring people to its genuinely unthinkable proportions.[3] It is that very unthinkability, fresher in the collective memory several decades ago, that drove Holocaust denial among neo-Nazi and white supremacist groups—because even such blinkered, hateful, ignorant bigots understood that the genocide of six million people was a morally problematic onion in their racial purity ointment.[4]

“I know nothing, I see nothing …”

They say that tragedy plus time begets comedy. It did not take long for Nazis to become clownish figures on one hand—Hogan’s Heroes first aired in 1965, and The Producers was released in 1967—and one-dimensional distillations of evil on the other. It has become something of a self-perpetuating process: Nazis make the best villains because (like racist Southerners, viz. my last post) you don’t need to spend any time explaining why they’re villainous. How many Steven Spielberg films embody this principle? Think of Indiana Jones in The Last Crusade, looking through a window into a room swarming with people in a certain recognizable uniform: “Nazis. I hate these guys.” It’s an inadvertently meta moment, as well as a throwback to Indy’s other phobia in Raiders of the Lost Ark: “Snakes. Why’d it have to be snakes?” Snakes, Nazis, tomato, tomahto. Though I personally consider that a slander against snakes, the parallel is really about an overdetermined signifier of evil and revulsion, one that functions to erase nuance.

Unfortunately, if tragedy plus time begets comedy, it also begets a certain cultural amnesia when historically-based signifiers become divorced from a substantive understanding of the history they’re referencing. Which is really just a professorial way of saying that the use of such terms as “Nazi” or “fascist,” or comparing people to Hitler has become ubiquitous in a problematic way, especially in the age of social media. Case in point, Godwin’s Law, which was formulated by Michael Godwin in the infancy of the internet (1990). Godwin’s Law declares that “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.” This tendency has been added to the catalogue of logical fallacies as the “Reductio ad Hitlerum,” which entails “an attempt to invalidate someone else’s position on the basis that the same view was held by Adolf Hitler or the Nazi Party.”

Perhaps the most egregious recent example of this historical signification was Congresswoman and QAnon enthusiast Marjorie Taylor Greene’s comparison of Nancy Pelosi’s decision to continue requiring masks to be worn in the House of Representatives (because so many Republicans have declared their intention to not get vaccinated) to the Nazi law requiring Jews to wear a gold Star of David on their chests. She said, “You know, we can look back at a time in history where people were told to wear a gold star, and they were definitely treated like second class citizens, so much so that they were put in trains and taken to gas chambers in Nazi Germany. And this is exactly the type of abuse that Nancy Pelosi is talking about.”

To be certain, Greene was roundly condemned by almost everybody, including by many in her own party—even the craven and spineless Minority Leader Kevin McCarthy had some stern words for her—but what she said was different not in kind but in degree from the broader practice of alluding to Nazism and the Holocaust in glib and unreflective ways.[5]

Though this tendency is hardly new—Godwin’s Law is thirty-one years old—it has been amplified and exacerbated by social media, to the point that it became difficult to find terms to usefully describe and define Donald Trump’s authoritarian tendencies. The ubiquity of Nazi allusions has made them necessarily diffuse, and so any attempt to characterize Trumpism as fascist in character could be easily ridiculed as alarmist and hysterical; and to be fair, there were voices crying “fascist!” from the moment he made that initial descent on his golden escalator to announce his candidacy. That those voices proved prescient rather than alarmist doesn’t obviate the fact that they muddied the rhetorical waters.[6] As the contours of Trump’s authoritarian tendencies came into focus, the fascistic qualities of the Trumpian Right became harder and harder to ignore; bizarrely, they’ve become even more clearly delineated since Trump left office, as the Republicans still kowtow to the Mar-a-Lago strongman and move to consolidate minoritarian power.

Historians and political philosophers and a host of other thinkers of all stripes will be years in unravelling the historical and cultural strands of Trump’s rise and the previously unthinkable hold that Trumpism has over a stubborn rump of the electorate; but I do think that one of the most basic elements is our distressing tendency toward cultural amnesia. It makes me think we’re less in need of defamiliarizing history than defamiliarizing all the clichés of history that have become this inchoate jumble of floating signifiers, which allow neo-Nazis and white supremacists to refashion themselves as clean-cut, khaki-clad young men bearing tiki-torches, or to disingenuously euphemize their racism as “Western chauvinism” and meme their way out of accusations of ideological hatefulness—“It’s just about the lulz, dude.”

There is also, as I will discuss in the third of these three posts, the fact that remembering is itself a politically provocative act. On one hand, the diminution in the collective memory of Nazism and the Holocaust has facilitated the re-embracing of its key tropes; on the other, the active process of remembering the depredations of Western imperialism and the myriad ways in which slavery in the U.S. wasn’t incidental to the American experiment but integral gives rise to this backlash that takes refuge in such delusions as the pervasiveness of anti-white racism.

NOTES


[1] To be clear, this is not to castigate my students; as I’ll be expanding on as I go, historical amnesia is hardly limited to a handful of first-year university students in a class I taught fifteen years ago.

[2] One can only speculate on what such critics make of Mel Brooks’ career.

[3] Martin Amis attempts something similar in his 1987 short story collection Einstein’s Monsters, which is about nuclear war. His lengthy introductory essay “Thinkability” (to my mind, the best part of the book) addresses the way in which military jargon euphemizes the scope and scale of a nuclear exchange, precisely to render the unthinkable thinkable.

And speaking of humour used to defamiliarize horror: Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, and General Buck Turgidson’s (George C. Scott) own “thinkability” regarding the deaths in a nuclear war: “Mr. President, we are rapidly approaching a moment of truth, both for ourselves as human beings and for the life of our nation. Now, truth is not always a pleasant thing. But it is necessary now to make a choice, to choose between two admittedly regrettable, but nevertheless distinguishable, post-war environments: one where you got 20 million people killed, and the other where you got 150 million people killed! … Mr. President, I’m not saying we wouldn’t get our hair mussed, but I do say no more than 10 to 20 million killed, tops! Uh, depending on the breaks.”

[4] Which always rested on a tacit contradiction in their logic: it didn’t happen, but it should have.

[5] We see the same tendency, specifically among conservatives, to depict any attempt at raising taxes or expanding social programs as “socialism,” often raising the spectre of “Soviet Russia”—which is about as coherent as one of my favourite lines from Community, when Britta cries, “It’s just like Stalin back in Russia times!”

[6] I don’t say so to castigate any such voices, nor to suggest that they were premature—the contours of Trump’s authoritarian, nativist style were apparent from before he announced his candidacy to anyone who looked closely enough.


My Mostly Unscientific Take on UFOs

Over the past year or so, it has seemed as though whatever shadowy Deep State agencies are responsible for covering up the existence of extraterrestrials have thrown up their hands and said “Yeah. Whatever.”

Perhaps the real-world equivalent of The X-Files Smoking Man finally succumbed to lung cancer, and all his subordinates just couldn’t be bothered to do their jobs any more.

Or perhaps the noise of the Trump presidency created the circumstances in which a tacit acknowledgement of numerous UFO sightings wouldn’t seem to be bizarre or world-changing.

One way or another, the rather remarkable number of declassified videos from fighter pilots’ heads-up displays of unidentified flying objects of odd shapes and flying capabilities has evoked an equally remarkable blasé response. It’s as if the past four years of Trump, natural disasters, civil tragedies, and a once-in-a-century (touch wood) pandemic have so eroded our capacity for surprise that, collectively, we seem to be saying, “Aliens? Bring it.” Not even the QAnon hordes, for whom no event or detail is too unrelated to be folded into the grand conspiracy, have seen fit to comment upon something that has so long been a favourite subject of conspiracists (“Aliens? But are they pedophile child sex-trafficking aliens?”).

Perhaps we’re all just a bit embarrassed at the prospect of alien contact, like having a posh and sophisticated acquaintance drop by when your place is an utter pigsty. I have to imagine that, even if the aliens are benevolent and peaceful, humanity would be subjected to a stern and humiliating talking-to about how we let our planet get to the state it’s in.

“I’m sorry to have to tell you, sir, that your polar icecaps are below regulation size for a planet of this category, sir.” (Good Omens)

Not to mention that if they landed pretty much anywhere in the U.S., they’d almost certainly get shot at.

And imagine if they’d landed a year ago.

“Take us to your leader!”
“Um … are you sure? Perhaps you should try another country.”
“All right, how do I get to Great Britain?”
“Ooh … no. You really don’t want that.”
“Russia then? China? India? Hungary?”
“Uh, no, no, no, and no.”
“Brazil?”
“A world of nope.”
“Wait–what’s the one with the guy with good hair?”
“Canada. But, yeah … probably don’t want to go there either. Maybe … try Germany?”
“Wasn’t that the Hitler country?”
“They got better.”

You’d think there would be more demand for the U.S. government to say more about these UFO sightings. The thing is, I’m sure that in some sections of the internet, there is a full-throated ongoing yawp all the time for that, but it hasn’t punctured the collective consciousness. And frankly, I don’t care enough to go looking for it.

It is weird, however, considering how we’ve always assumed that the existence of extraterrestrial life would fundamentally change humanity, throwing religious belief into crisis and dramatically transforming our existential outlook. The entire premise of Star Trek’s imagined future is that humanity’s first contact with the Vulcans forced a dramatic reset of our sense of self and others—a newly galactic perspective that rendered all our internecine tribal and cultural squabbles irrelevant, essentially at a stroke resolving Earth’s conflicts.

To be certain, there hasn’t been anything approaching definitive proof of alien life, so such epiphany or trauma lies only in a possible future. Those who speak with any authority on the matter are always careful to point out that “UFO” is not synonymous with “alien”—they’re not necessarily otherworldly, just unidentified.

I, for one, am deeply skeptical that these UFOs are of extraterrestrial origin—not because I think it’s impossible, just that the chances are in fact infinitesimal. In answer to the question of whether I think there’s life on other planets, my answer is an emphatic yes, which is something I base on the law of large numbers. The Milky Way galaxy, by current estimates, contains somewhere in the neighbourhood of 100 billion planets. Even if one tenth of one percent of those can sustain life, that’s still 100 million planets, and that in just one of the hundreds of billions of galaxies in the universe.
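(For the numerically inclined, here is a minimal back-of-envelope sketch of that arithmetic in Python. The planet count, habitable fraction, and galaxy count are simply the rough figures assumed in the paragraph above, not real astronomical data.)

```python
# Back-of-envelope arithmetic using the rough figures assumed above (illustrative only).
planets_in_milky_way = 100_000_000_000   # ~100 billion planets, the post's working estimate
habitable_fraction = 0.001               # one tenth of one percent, purely for the sake of argument
galaxies_in_universe = 200_000_000_000   # "hundreds of billions" of galaxies, taken here as 200 billion

habitable_in_milky_way = planets_in_milky_way * habitable_fraction
habitable_everywhere = habitable_in_milky_way * galaxies_in_universe

print(f"Potentially life-sustaining planets in the Milky Way: {habitable_in_milky_way:,.0f}")  # 100,000,000
print(f"Across the observable universe: {habitable_everywhere:.1e}")  # 2.0e+19
```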

But then there’s the question of intelligence, and what comprises intelligent life. We have an understandably chauvinistic understanding of intelligence, one largely rooted in the capacity for abstract thought, communication, and inventiveness. We grant that dolphins and whales are intelligent creatures, but have very little means of quantifying that; we learn more and more about the intelligence of cephalopods like octopi, but again: such intelligences are literally alien to our own. The history of imagining alien encounters in SF has framed alien intelligence as akin to our own, just more advanced—developing along the same trajectory until interplanetary travel becomes a possibility. Dolphins might well be, by some metric we haven’t yet envisioned, far more intelligent than us, but they’ll never build a rocket—in part because, well, why would they want to? As Douglas Adams put it in The Hitchhiker’s Guide to the Galaxy, “man had always assumed that he was more intelligent than dolphins because he had achieved so much—the wheel, New York, wars and so on—whilst all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man—for precisely the same reasons.”

To put it another way, to properly imagine space-faring aliens, we have to imagine not so much what circumstances would lead to the development of space travel as how an alien species would arrive at an understanding of the universe that would facilitate the very idea of space travel.

Consider the thought experiment offered by Hans Blumenberg in the introduction of his book The Genesis of the Copernican World. Blumenberg points out that our atmosphere has a perfect density, “just thick enough to enable us to breathe and to prevent us from being burned up by cosmic rays, while, on the other hand, it is not so opaque as to absorb entirely the light of the stars and block any view of the universe.” This happy medium, he observes, is “a fragile balance between the indispensable and the sublime.” The ability to see the stars in the night sky, he says, has shaped humanity’s understanding of itself in relation to the cosmos, from our earliest rudimentary myths and models, to the Ptolemaic system that put us at the center of Creation and gave rise to the medieval music of the spheres, to our present-day forays in astrophysics. We’ve made the stars our oracles, our gods, and our navigational guides, and it was in this last capacity that the failings of the Ptolemaic model inspired a reclusive Polish astronomer named Mikołaj Kopernik, whom we now know as Copernicus.

But what, Blumenberg asks, if our atmosphere were too thick to see the stars? How, then, would humanity have developed its understanding of its place in the cosmos? And indeed, of our own world—without celestial navigation, how does seafaring evolve? How much longer before we understood that there was a cosmos, or grasped the movement of the earth without the motion of the stars? There would always of course be the sun, but it was always the stars, first and foremost, that inspired the celestial imagination. It is not too difficult to imagine an intelligent alien species inhabiting a world such as ours, with similar capabilities, but without the inspiration of the night sky to propel them from the surface of their planet.[1]

Now think of a planet of intelligent aquatic aliens, or of creatures living on a gas giant, swimming deep in its dense atmosphere.

Or consider the possibility that our vaunted intelligence is in fact an evolutionary death sentence, and that this is the case for any species such as ourselves—that our development of technology, our proliferation across the globe, and our environmental depredations inevitably outstrip our primate brains’ capacity to reverse the worst effects of our evolution.

Perhaps what we’ve been seeing is evidence of aliens who have mastered faster-than-light or transdimensional travel, but they’re biding their time—having learned the dangers of intelligence themselves, they’re waiting to see whether we succeed in not eradicating ourselves with nuclear weapons or environmental catastrophe; perhaps their rule for First Contact is to make certain a species such as Homo sapiens can get its shit together and resolve all the disasters we’ve set in motion. Perhaps their Prime Directive is not to help us, because they’ve learned in the past that unless we can figure it out for our own damned selves, we’ll never learn.

In the words of the late great comedian Bill Hicks, “Please don’t shoot at the aliens. They might be here for me.”

EDIT: Stephanie read this post and complained that I hadn’t worked in Monty Python’s Galaxy Song, so here it is:

NOTES


[1] And indeed, Douglas Adams did imagine such a species in Life, the Universe, and Everything—an alien race on a planet surrounded by a dust cloud who live in utopian peace and harmony in the thought that they are the sum total of creation, until the day a spaceship crash-lands on their planet and shatters their illusion. At which point, on reverse-engineering the spacecraft and flying beyond the dust cloud to behold the splendours of the universe, they decide there’s nothing else for it but to destroy it all.


Liz Cheney is as Constant as the Northern Star …

… and I don’t particularly mean that as a compliment.

Literally minutes before he is stabbed to death by a posse of conspiring senators, Shakespeare’s Julius Caesar declares himself to be the lone unshakeable, unmoving, stalwart man among his flip-flopping compatriots. He makes this claim as he arrogantly dismisses the petition of Metellus Cimber, who pleads for the reversal of his brother’s banishment. Cimber’s fellow conspirators echo his plea, prostrating themselves before Caesar, who finally declares in disgust,

I could be well moved if I were as you.
If I could pray to move, prayers would move me.
But I am constant as the northern star,
Of whose true-fixed and resting quality
There is no fellow in the firmament.
The skies are painted with unnumbered sparks.
They are all fire and every one doth shine,
But there’s but one in all doth hold his place.
So in the world. ‘Tis furnished well with men,
And men are flesh and blood, and apprehensive,
Yet in the number I do know but one
That unassailable holds on his rank,
Unshaked of motion. And that I am he.

Caesar mistakes the senators’ begging for weakness, not grasping until it is too late that they are importuning him as a ploy to get close enough to stab him.

Fear not, I’m not comparing Liz Cheney to Julius Caesar. I suppose you could argue that Cheney’s current anti-Trump stance is akin to Caesar’s sanctimonious declaration if you wanted to suggest that it’s more performative than principled. To be clear, I’m not making that argument—not because I don’t see its possible merits, but because I really don’t care.

I come not to praise Liz Cheney, whose political beliefs I find vile; nor do I come to bury her. The latter I’ll leave to her erstwhile comrades, and I confess I will watch the proceedings with a big metaphorical bowl of popcorn in my lap, for I will be a gratified observer no matter what the outcome. If the Trumpists succeed in burying her, well, I’m not about to mourn a torture apologist whose politics have always perfectly aligned with those of her father. If she soldiers on and continues to embarrass Trump’s sycophants by telling the truth, that also works for me.

Either way, I’m not about to offer encomiums for Cheney’s courage. I do think it’s admirable that she’s sticking to her guns, but as Adam Serwer recently pointed out in The Atlantic, “the [GOP’s] rejection of the rule of law is also an extension of a political logic that Cheney herself has cultivated for years.” During Obama’s tenure, she frequently went on Fox News to accuse the president of being sympathetic to jihadists, and just as frequently opined that American Muslims were a national security threat. During her run for a Wyoming Senate seat in 2014, she threw her lesbian sister Mary under the bus with her loud opposition to same-sex marriage, a point on which she stands to the right of her father. And, not to repeat myself, but she remains an enthusiastic advocate of torture. To say nothing of the fact that, up until the January 6th assault on the Capitol, she was a reliable purveyor of the Trump agenda, celebrated then by such current critics as Steve Scalise and Matt Gaetz.

Serwer notes that the Cheneys’ “political logic”—the logic of the War on Terror—is consonant with that of Trumpism not so much in policy as in spirit: the premise that there’s them and us, and that “The Enemy has no rights, and anyone who imagines otherwise, let alone seeks to uphold them, is also The Enemy.” In the Bush years, this meant the Manichaean opposition between America and Terrorism, and that any ameliorating sentiment about, say, the inequities of American foreign policy, meant you were With the Terrorists. In the present moment, the Enemy of the Trumpists is everyone who isn’t wholly on board with Trump. The ongoing promulgation of the Big Lie—that Biden didn’t actually win the election—is a variation on the theme of “the Enemy has no rights,” which is to say, that anyone who does not vote for Trump or his people is an illegitimate voter. Serwer writes:

This is the logic of the War on Terror, and also the logic of the party of Trump. As George W. Bush famously put it, “You are either with us or with the terrorists.” You are Real Americans or The Enemy. And if you are The Enemy, you have no rights. As Spencer Ackerman writes in his forthcoming book, Reign of Terror, the politics of endless war inevitably gives way to this authoritarian logic. Cheney now finds herself on the wrong side of a line she spent much of her political career enforcing.

All of which is by way of saying: Liz Cheney has made her bed. The fact that she’s chosen the hill of democracy to die on is a good thing, but this brings us back to my Julius Caesar allusion. The frustration being expressed by her Republican detractors, especially House Minority Leader Kevin McCarthy, is at least partially rational: she’s supposed to be a party leader, and in so vocally rejecting the party line, she’s not doing her actual job. She is being as constant as the Northern Star here, and those of us addicted to following American politics are being treated to a slow-motion assassination on the Senate (well, actually the House) floor.

But it is that constancy that is most telling in this moment. Cheney is anchored in her father’s neoconservative convictions, and in that respect, she’s something of a relic—an echo of the Bush years. As Serwer notes, however, while common wisdom says Trump effectively swept aside the Bush-Cheney legacy in his rise to be the presidential candidate, his candidacy and then presidency only deepened the bellicosity of Bush’s Us v. Them ethos, in which They are always already illegitimate. It’s just that now the Them is anyone opposed to Trump.

In the present moment, I think it’s useful to think of Liz Cheney as an unmoving point in the Republican firmament: to remember that her politics are as toxic and cruel as her father’s, and that there is little to no daylight between them. The fact that she is almost certainly going to lose both her leadership position and a primary in the next election to a Trump loyalist is not a sign that she has changed. No: she is as constant as the Northern Star, and the Trump-addled GOP has moved around her. She has not become more virtuous; her party has just become so very much more debased.


Of Course There’s a Deep State. It’s Just Not What the Wingnuts Think it is.

There is a moment early in the film The Death of Stalin in which, as the titular dictator lies dying, the circle of Soviet officials just beneath Stalin (Khrushchev, Beria, Malenkov) panics at the prospect of having to find a reputable doctor to treat him. Why? Because a few years earlier, Stalin, in a fit of characteristic paranoia, had become convinced that doctors were conspiring against him, and he had many of them arrested, tortured, and killed.

I thought of this cinematic moment—the very definition of gallows humour—while reading an article by Peter Wehner in The Atlantic observing that part of the appeal of QAnon (the number of whose adherents has, counter-intuitively perhaps, inflated since Biden’s election) lies precisely in its many disparate components. “I’m not saying I believe everything about Q,” the article quotes one Q follower as saying. “I’m not saying that the JFK-Jr.-is-alive stuff is real, but the deep-state pedophile ring is real.”

As [Sarah Longwell, publisher of The Bulwark] explained it to me, Trump supporters already believed that a “deep state”—an alleged secret network of nonelected government officials, a kind of hidden government within the legitimately elected government—has been working against Trump since before he was elected. “That’s already baked into the narrative,” she said. So it’s relatively easy for them to make the jump from believing that the deep state was behind the “Russia hoax” to thinking that in 2016 Hillary Clinton was involved in a child-sex-trafficking ring operating out of a Washington, D.C., pizza restaurant.

If you’ll recall, the “Deep State” bogeyman was central to Steve Bannon’s rhetoric during his tenure early in the Trump Administration, alongside his antipathy to globalism. The two, indeed, were in his figuration allied to the point of being inextricable, which is also one of the key premises underlying the QAnon conspiracy. And throughout the Trump Administration, especially during his two impeachments and the Mueller investigation, the spectre of the Deep State was constantly blamed as the shadowy, malevolent force behind any and all attempts to bring down Donald Trump (and was, of course, behind the putative fraud that handed Joe Biden the election).

Now, precisely why this article made me think of this moment in The Death of Stalin is a product of my own weird stream of consciousness, so bear with me: while I’ve always found Bannon & co.’s conspiracist depiction of the Deep State more than a little absurd, so too have I had to shake my head whenever any of Trump’s detractors and critics declares that there’s no such thing as a Deep State.

Because of course there’s a deep state, just one that doesn’t merit ominous capitalization. It also doesn’t merit the name “deep state,” but let’s just stick with that for now for the sake of argument. All we’re really talking about here is the vast and complex bureaucracy that sustains any sizable human endeavour—from universities to corporations to government. And when we’re talking about the government of a country as large as the United States, that bureaucracy is massive. The U.S. government employs over two million people, the vast majority of them civil servants working innocuous jobs that make the country run. Without them, nothing would ever get done.

Probably the best piece of advice I ever received as a university student was in my very first year of undergrad; a T.A. told me to never ask a professor about anything like degree requirements or course-drop deadlines, or, really, anything to do with the administrative dimension of being a student. Ask the departmental secretaries, he said. In fact, he added, do your best to cultivate their respect and affection. Never talk down to them or treat them as the help. They may not have a cluster of letters after their name or grade your papers, but they make the university run.

I’d like to think that I’m not the kind of asshole who would berate secretaries or support staff, but I took my T.A.’s advice to heart, and went out of my way to be friendly, to express gratitude, and to be apologetic when I brought them a problem. It wasn’t long before I was greeted with smiles whenever I had paperwork that needed processing, and I never had any issues getting into courses (by contrast, in my thirty years in academia from undergrad to grad student to professor, I have seen many people—students and faculty—suffer indignities of mysterious provenance because they were condescending or disrespectful to support staff).

The point here is that, for all the negative connotations that attach to bureaucracy, it is an engine necessary for any institution or nation to run. Can it become bloated and sclerotic? Of course, though in my experience that tends to happen when one expands the ranks of upper management. But when Steve Bannon declared, in the early days of the Trump Administration, that his aim was “the deconstruction of the administrative state,” I felt a keen sense of cognitive dissonance—for the simple reason that there is no such thing as a non-administrative state.

Which brings us back, albeit circuitously, to The Death of Stalin. There is no greater example of a sclerotic and constipated bureaucracy than that of the former Soviet Union, a point not infrequently made in libertarian and anti-statist arguments for small government. But I think the question that rarely gets raised when addressing dysfunctional bureaucracy—at least in the abstract—is why is it dysfunctional? There are probably any number of reasons why that question doesn’t come up, but I have to imagine that a big one is because we’ve been conditioned to think of bureaucracy as inevitably dysfunctional—a sense reinforced by every negative encounter experienced when renewing a driver’s license, waiting on hold with your bank, filing taxes, dealing with governmental red tape, or figuring out what prescriptions are covered by your employee health plan. But a second question we should ask when having such negative experiences is: are they negative because of an excess of bureaucracy, or too little? The inability of Stalin’s minions to find a competent doctor is a profound metaphor for what happens when we strip out the redundancies in a given system—in this case, the state-sponsored murder of thousands of doctors because of a dictator’s paranoia, such that one is left with (at best) mediocre medical professionals too terrified of state retribution to be dispassionately clinical, which is of course what one needs from a doctor.

I’m not a student of the history of the U.S.S.R., so I have no idea if anyone has written about whether the ineptitude of the Soviet bureaucracy was a legacy of Stalinist terror and subsequent Party orthodoxy, in which actually competent people were marginalized, violently or otherwise; I have to assume there’s probably a lot of literature on the topic (certainly, Masha Gessen’s critical review of the HBO series Chernobyl has something to say on the subject). But there’s something of an irony in the fact that Republican administrations since that of Ronald Reagan have created their own versions of The Death of Stalin’s doctor problem through their evisceration of government. Reagan famously said that the nine most frightening words were “I’m from the government, and I’m here to help,” and since then conservative governments—in the U.S., Canada, and elsewhere—have worked hard to make that a self-fulfilling prophecy. Thomas Frank, author of What’s the Matter With Kansas? (2004), has chronicled this tendency, in which Republican distrust of government tends to translate into the rampant gutting of social services and governmental agencies, from the Post Office to the various cabinet departments, which then dramatically denudes the government’s ability to do anything. All of the failures that then inevitably occur are held up as proof of the basic premise that government can’t get anything right (and that therefore its basic services should be outsourced to the private sector).

In my brief moments of hope I wonder if perhaps the Trump Administration’s explicit practice of putting hacks and incompetent loyalists in key positions (such as Jared Kushner’s bizarrely massive portfolio) made this longstanding Republican exercise too glaring to ignore or excuse. Certainly, the contrast between Trump’s band of lickspittles and Biden’s army of sober professionals is about the most glaring difference we’ve seen between administrations, ever. What I hope we’re seeing, at any rate, is the reconstruction of the administrative state.

And it’s worth noting that Dr. Anthony Fauci has been resurrected from Trump’s symbolic purge of the doctors.


In Which I Mark the One-Year Anniversary of the Pandemic With Some Thoughts on Monarchy

I did not watch Oprah Winfrey’s much-hyped interview of Prince Harry and Meghan Markle for much the same reason I did not watch the two most recent royal weddings: I didn’t care. Especially at this point in time, between marking a year of pandemic and the ongoing reverberations of the Trump presidency, the travails of Harry and Meghan—even inasmuch as I sympathize with them against the Royal Family—don’t really do much to excite my imagination or interest.

On the other hand, the fallout from the interview, coupled with related issues and events, has piqued my interest indeed. That people will be instinctively partisan for one party or the other is about as unsurprising as learning that some people in “the Firm” fretted about whether or not Meghan’s first child would be dark-complexioned. Racism in the Royal Family? Get away! But of course, this particular charge was picked up by right-wing pundits as further evidence of “cancel culture” at work, and we’ve been treated to the bizarre spectacle of self-described red-blooded American patriots rushing to the defense of HRM Queen Elizabeth II.1

Someone might want to remind them just what those boys at Lexington and Concord died for. Or perhaps tell them to watch Hamilton.

Notably, the one person emerging not just unscathed but burnished from the interview was the Queen herself—both Harry and Meghan were careful to say that none of the difficulties they’ve experienced emanated from her, and that she has indeed been the one person who is blameless (some reports have read between the lines and extrapolated that the Queen was prescient enough to have given Harry funds to see him through being cut off financially).

Leaving aside for the moment the possibility, or possibly even the likelihood, that this is entirely true, this sympathy is reflective of a broader reluctance to be critical of Elizabeth II. Even the 2006 film The Queen, starring Helen Mirren in the title role, which was all about the Palace’s cold and inept response to the shocking death of Diana, ended up painting a vaguely sympathetic portrait (though to be fair, that has a lot to do with the virtuosity of Helen Mirren). And The Crown (created and written by Peter Morgan, who wrote The Queen), which is largely unsparing of all the other royals and their courtiers, generally depicts Elizabeth as a victim of circumstance who spends her life doing her level best to do her royal duty, constrained by that very sense of duty from being a more compassionate and loving human.

The Queen is a person whom, I would argue, people tend to see through a nostalgic lens: nostalgia, in this case, for a form of stiff-upper-lip, keep-calm-and-carry-on Britishness memorialized in every WWII film ever—something seen as lost in the present day, along with Britannia’s status in the world. As we have seen in much of the pro-Brexit rhetoric, these two losses are not perceived as unrelated; and seeing Queen Elizabeth as the cornerstone of an ever-more-fractured Royal Family is a comfort, but one that grows more tenuous as she ages.

There’s an episode in season four of The Crown that articulates this sensibility. In it, Elizabeth, having grown concerned that her children might not appreciate the scale and scope of the duties they’ve inherited, meets with each of them in turn and is perturbed by their feckless selfishness. Charles is in the process of destroying his marriage to Diana; Andrew is reckless in his passions; Anne is consumed by resentment and anger; and Edward is at once isolated by his royal status at school and indulgent in his royal privilege. Though her disappointment in her spawn is never put into words, it is obvious (Olivia Colman can convey more with her facial expressions than I can in ten thousand words), and The Crown effectively indicts the younger generation of royals as unworthy of their status, and definitely unworthy of the throne.

This, I think, is where we’re at right now with Harry and Meghan’s interview. I’ve joked on occasion that “shocked but not surprised” should be the title of the definitive history of the Trump presidency, but it might also function as a general sentiment for this particular epoch. It is difficult to put one’s finger precisely on the substance of the outrage over Meghan’s revelations, aside from an instinctive royalist animus directed at anyone with the temerity to criticize the monarchy. This is why, perhaps, some (<cough> <cough> PIERS MORGAN <cough>) have simply chosen to call bullshit on Meghan Markle’s story of mental health issues and suicidal ideation;2 but it was the charge of racism that seems to have become the most ubiquitous bee in a whole lot of bonnets. Shocking, yes; surprising, no. The entire British colonial enterprise was predicated on the premise of white English supremacy, and royal houses of all nationalities have always been assiduous in policing their bloodlines. Prior to the divorce of Charles and Diana amid revelations of his relationship with Camilla Parker-Bowles, the greatest scandal the British monarchy had weathered was the abdication of Edward VIII so he could marry his American divorcée paramour, Wallis Simpson. Meghan Markle, it has been noted by many, ticks two of those scandalous boxes insofar as she is American and a divorcée.

She is also, to use postcolonial theorist Homi Bhabha’s phrasing, “not quite/not white.” Which is to say, she is biracial, and as such will never be qualified to be a royal in a stubborn subsection of the British cultural imagination.

Wallis Simpson and the man who might have been king.

The fascination many people have with the British Royal Family—especially among those who aren’t British—has always baffled me more than a little. But on honest reflection, I suppose I shouldn’t be baffled. In spite of the fact that hereditary monarchy is an objectively terrible form of governance, it is also one of the most durable throughout history. Human beings, it seems, are suckers for dynastic power, in spite of the illogic of its premise; as the late great Christopher Hitchens wryly observed, being the eldest son of a dentist does not somehow confer upon you the capacity to be a dentist. And yet down through the centuries, people have accepted that the eldest son (and occasionally daughter) of the current monarch had the right to assume the most important job in the nation on that monarch’s passing.

Of course, “right” and “ability” don’t always intersect, and there have been good, bad, and indifferent kings and queens down through history (granted, being democratically elected is no guarantee of governing ability, but at least the people have the option of cancelling your contract every few years). For every Henry V there’s a Richard III, and we’re equally fascinated by both, while mediocre kings and queens who preside over periods of relative peace don’t tend to get the dramatic treatment.

Indeed, even on brief reflection, it’s kind of amazing just how pervasive the trope of monarchy is in literature and popular culture more broadly. It is unsurprising that Shakespeare, for example, would have made kings and queens the subject of many of his plays—that was, after all, the world in which he lived—but the persistence of hereditary monarchy in the 20th-century cultural imagination is quite remarkable. It’s pretty much a staple of fantasy, as the very title of Game of Thrones attests; but where George R.R. Martin’s saga and its televisual adaptation are largely (but sadly not ultimately)3 rooted in a critique of the divine right of kings and the concept of the “chosen one,” the lion’s share of the genre rests in precisely the comfort bestowed by the idea that there is a true king out there perfectly suited to rule justly and peaceably.

More pervasive and pernicious than Shakespearean or Tolkienesque kings and queens, however, is the Disney princess-industrial complex. Granted, the fairy-tale story of the lowly and put-upon girl finding her liberatory prince pre-dates Walt Disney’s animated empire by centuries, but I think we can all agree that Disney has at once expanded, amplified, and sanded down the sharp edges of the princesses’ folkloric origins—all while inculcating in millions of children the twinned conceptions of royalistic destiny and the heteronormative gender roles associated with hereditary nobility (to be fair to Disney, it has done better with such recent excursions as Brave and Frozen—possibly the best endorsement of the latter’s progressiveness is the fact that Jordan Peterson loathes it). It’s telling that Disney’s most prominent branding image isn’t Mickey Mouse, but the Disney castle,4 a confection of airy spires and towers that any medievalist would tell you defeats the purpose of having a castle to start with. Even your more inept horde of barbarians would have little difficulty storming those paper-thin defenses, but then it’s not the bulwarks and baileys that are important, but the towers … the towers, built to entrap fair maidens until their rescuing princes can slip the lock or scale the wall.

I have to imagine that a large part of the obsession over royal weddings proceeds from precisely this happy-ending narrative on which the Mouse has built its house: the sumptuous spectacle of excess and adulation that evokes, time and again, Cinderella’s arrival at the ball. The disruption of this mythos is at once discomforting and titillating: Diana’s 1995 interview presaged Harry and Meghan’s with its revelations of constraint and isolation, and the active antagonism of both the Royal Family and its functionaries toward any sort of behaviour that might reflect badly upon it—even if that behaviour simply entailed seeking help for mental health issues. There have been many think-pieces breaking down which elements of The Crown are fact and which are fiction, but it is at this point fairly well established wisdom that being born a Windsor—or marrying into the family—is no picnic. And while Meghan’s claim that she never Googled Harry or his family strains credulity, I think it’s probably safe to say that no matter how much research one does, the realities of royal life almost certainly beggar the imagination.

Also, The Crown was only in its second season when Meghan married Harry.

I confess that, aside from the very first episode, I did not watch the first three seasons of The Crown, the principal reason being that I couldn’t get my girlfriend Stephanie into the show. While I may be more or less indifferent to the British monarchy, Stephanie is actively hostile5 to it. Born in South Africa, she and her family came to Canada when she was fourteen; having imbibed an antipathy to her birth nation’s colonizer that is far more diffuse in Canada, she gritted her teeth through the part of her citizenship oath in which she had to declare loyalty to the Queen. Her love of Gillian Anderson (Stephanie is, among her other endearing qualities, the biggest X-Files fan I’ve ever met) overcame her antipathy, however, for season four, and so we gleefully watched the erstwhile Agent Scully, transformed into the Iron Lady, spar with Olivia Colman’s Queen Elizabeth (we’re also pretty simpatico on our love of Olivia Colman). With each episode, we reliably said (a) Olivia, for the love of Marmite, don’t make us sympathetic to the Queen!; (b) Gillian, please don’t make us feel sympathy for/vague attraction to Margaret Thatcher!; and, (c) Holy crap, Emma Corrin looks so much like Lady Di!

It will be interesting to see The Crown catch up with the present moment. But I also have to wonder if some commentators are right when they say that Harry and Meghan’s split from the Firm signals the end of the British monarchy. To my mind, by all rights it should: it’s long past time this vestige of colonial hubris went into that good night. We’ve got enough anti-democratic energy to deal with in the present moment without also concerning ourselves with a desiccated monarchy. When Queen Elizabeth dies, with her dies the WWII generation. The Second World War transformed the world in countless ways, one of them being that it spelled the end of the British Empire and the diminution of Great Britain’s influence in the world. Brexit is, among other things, a reactionary response to this uncomfortable reality, and a vain, desperate attempt to reassert Britannia’s greatness. Across the pond, fellow nativists in the U.S. have latched onto Meghan Markle’s accusations of racism to make common cause with the monarchy. Not, perhaps, because they’ve forgotten the lessons of 1776, but most likely because they never learned them to start with.

NOTES

1. Perhaps the stupidest defense came from Fox & Friends’ co-host Brian Kilmeade, who opined that the fact that British Commonwealth countries are “75% Black and minority” demonstrated that the Royal Family could not possibly be racist. Leaving aside the pernicious history of colonialism and the kind of white paternalism epitomized by the Rudyard Kipling poem “The White Man’s Burden,” can we perhaps agree that Kilmeade’s juxtaposition of “75%” and “minority” sort of gives the game away?

2. I’ve always felt that Piers Morgan was the result of a lab experiment in which a group of scientists got together to create the most perfect distillation of an asshole. Even if we grant his premise that Meghan Markle is, in fact, a status-seeking social climber who has basically Yoko Ono’ed Prince Harry out of the Royal Family, his constant ad hominem attacks on her say more about his own inadequacies than hers. And for the record, I do not grant his premise: to borrow his own turn of phrase, I wouldn’t believe Piers Morgan if he was reading a weather report.

3. We may never know how George R.R. Martin means to end his saga—at the rate he’s going, he’ll be writing into his 90s, and I don’t like his actuarial odds—but we do know how the series ended. The last-minute transformation of Daenerys into a tyrant who needed to be killed could conceivably have been handled better if the showrunners had allowed for two or three more episodes to bring us there; but the aftermath was also comparably rushed, and Sam Tarly’s democratic suggestion for egalitarian Westerosi governance was laughed off without any consideration. I will maintain to my dying day that GRRM effectively transformed fantasy, but also that he was too much in thrall to its core tropes to wander too far from their monarchical logic.

4. I recently bought a Disney+ streaming subscription in order to watch The Mandalorian. While writing this post, I remembered that Hamilton’s sole authorized video recording is Disney property. So of course I immediately clicked over to Disney+ to watch parts of it, and was treated to the irony of a play about the American revolutionary war to overthrow monarchical tyranny prefaced by Disney’s graphic of its castle adorned with celebratory fireworks.

5. When I read this paragraph to Stephanie, she liked all of it but objected to my use of the word “hostile.” “I don’t actually hate the Royal Family,” she said. “I don’t wish them harm. I just find the entire idea pointless and antiquated, and it embodies some of the worst aspects of British history.” So: she’s not hostile to the Royal Family, but I’m at a loss to find a better word, especially considering the invective she hurls at England during the World Cup.


Filed under history, maunderings, Uncategorized

The Sound of Mitch’s Hypocrisy

It has now long seemed that the idea of hypocrisy as something for which politicians should feel shame is a quaint and charming relic of an imagined past. Certainly, the crass and vulgar mendacity of Trump and Trumpism has been a wall of overwhelming sound, drowning out hypocrisy’s reedy voice. It has been an environment in which Mitch McConnell has thrived—having perfected the art of po-faced hypocrisy in the Obama years, he has matched the amplifications of the Trump presidency with ever-more overt displays, and with seeming impunity.

And yet, he might finally have gone a bridge too far with his handling of the Senate impeachment trial. It’s been as interesting as it has been infuriating to watch McConnell try to navigate the post-election waters, especially after January 6th. As has frequently been said of him, Mitch McConnell’s only ideological allegiance is to himself, his own power, and maintaining Republican control of the Senate. With this last element gone with the election of Raphael Warnock and Jon Ossoff in Georgia—largely because of Trump’s compulsive self-dealing—and with donors fleeing the G.O.P. after the Capitol assault, Mitch’s political calculus became more delicate. How to woo back the big money without infuriating Trump’s voters? How to placate the MAGA base without seeming to endorse the insurrection? He’s done so by being as cagey as possible—letting his aides leak to the press that he was open to the idea of impeaching Trump; harrumphing very occasionally about the unseemliness of the Capitol violence; then, after the House’s impeachment vote, refusing to start the Senate’s trial until after Biden’s inauguration; letting it be known he was encouraging his caucus to vote their conscience; then voting against the constitutionality of the trial (twice); and finally, voting to exonerate Trump on the tenuous excuse that you can’t impeach an ex-president, even though it was specifically his own actions that prevented the trial from taking place while Trump was still in office.

But what might make things more difficult for Mitch going forward is that, after voting not guilty, he then denounced Trump in no uncertain terms, calling the former president’s actions a “disgraceful, disgraceful dereliction of duty” and saying further that Trump was “practically and morally responsible for provoking the events of the day.” The attack on the Capitol “was a foreseeable consequence of the growing crescendo of false statements, conspiracy theories, and reckless hyperbole which the defeated president kept shouting into the largest megaphone on planet Earth.” There’s no equivocation here: Mitch denounced the President’s actions, and then his inaction on the day, as criminal and criminally negligent … after voting against conviction, because ¯\_(ツ)_/¯ constitution, whaddaya gonna do?

Perhaps Mitch has become so inured to his own habitual hypocrisy that he did not account for the relative silence of Donald Trump since his Twitter ban. We’ve spent five years being deafened by Trump’s bellows; Mitch’s latest, monumental hypocrisy was like someone carrying on speaking at the top of their voice when the room suddenly falls silent. Perhaps he’s counting on Americans’ short memories, but if the Democrats don’t hang this around his neck and the necks of the Republican Party from now until November 2022, they’re feckless idiots (sadly, never discount the Democrats’ capacity for fecklessness). Midterm elections traditionally go badly for the party in power, but the 2022 Senate map isn’t a good one for the G.O.P. If a handful of senators lose primaries to MAGA extremists, and if Joe Biden is successful in containing the virus and jump-starting the economy, the usual electoral math might not matter so much.

I have to imagine that Mitch has seen himself between a rock and a hard place these past few weeks: acquit Trump on a party-line vote and suffer at the polls in 2022; let more senators vote to convict, and suffer primary challenges. But those were not his only options. What if he had actively lobbied behind the scenes to convict Trump? What if he had brought in enough of his people to make the conviction not just a bare two-thirds vote, but an overwhelming one? Yes, that would have incited Trump’s ire and led to a lot of primary challenges, but at the same time there’s safety in numbers. A large-scale rebuke to Trump would have sucked up a lot of his oxygen, and it would have had the effect of isolating the Trumpiest of the Senate: Ted Cruz, Josh Hawley, Rand Paul, Ron Johnson, Lindsey Graham, all of whose political capital becomes tenuous in the absence of a Republican Party that continues to be supine to Trump. And the threat of primary losses diminishes along with Trump’s own status.

What’s more, such a bold shift would almost certainly have brought the Senate back to the Republicans in 2022. While I don’t by any means discount a resurgence in Trumpism in the near and medium-term future, we are at present seeing a slow but steady erosion of his support … a general disenchantment as Americans re-acclimate to boring but competent governance, while the impeachment managers laid out in damning and irrefutable terms Trump’s incitement to violence and subsequent dereliction of duty. Trump’s aides have suggested that he has been lying low during the impeachment trial and will start barnstorming the country any day now, seeking revenge for Republican disloyalty. But so too have all the ongoing and potential lawsuits and indictments been in a holding pattern, in some cases waiting to gauge political fallout. Not only do Mitch McConnell’s own damning words give the green light for many such cases, but he has also probably encouraged those who suffered injury or lost loved ones on January 6th to launch their own lawsuits against the ex-president.

One can only imagine what that possible feeding frenzy would look like if he had been convicted.


Filed under maunderings, The Trump Era

History’s Discordant Rhymes

Trump’s ongoing crusade to overturn the election has had a weird split-screen quality that would be hilarious if it weren’t so dystopian. On one hand, you have all the overblown rhetoric and accusations of fraud and election-rigging, elaborate conspiracy theories about voting machines being manipulated by China and Venezuela, dead people voting by the hundreds of thousands, and the active suppression of Republican poll-watchers. On the other hand, you have the fact that Trump et al have had, at the time of this writing, thirty-two of their legal challenges often literally laughed out of court, while they’ve only succeeded twice, on minor procedural questions. Notably, once in the courtroom, the allegations of fraud, never mind fraud on a massive conspiratorial scale, evaporate—because unlike one of Rudy Giuliani’s hysterically inchoate press conferences, the courts demand that evidence be presented.

You might think that this contradiction between what the Trump people allege and their inability to produce evidence in court, coupled with the glaring fact of their 2-32 win/loss record so far, would start to sink in and make Trump’s followers understand that there was no fraud and that Biden won what Trump’s own Department of Homeland Security called “the most secure election in American history.” But then, in order to think that, you’d probably have had to be in a coma these past four years. About a week ago I broke a personal rule and got into an argument with someone on social media who was convinced that election fraud had been perpetrated. When I pointed to the fact that the Trump people had not been able to produce any evidence of systematic wrongdoing, he repeatedly and sarcastically demanded, “Oh, are you a lawyer? Are you there in the courtroom? You don’t know what evidence they have!” I have since seen this line of argument repeated, most prominently by Trump lawyer Jenna Ellis, as if these court cases are black boxes and not publicly available … or as if, had Trump and Giuliani any actual evidence, they wouldn’t be putting it on public display 24/7. (My argument with the fraud-advocate ended when he told me he was “terrified” for my students, as it was “obvious” that I couldn’t be trusted to let them offer opposing perspectives in class).

Meanwhile, as his legal team racks up losses like the New York Jets on Dramamine, Trump keeps tweeting his confidence that his re-election is all but a done deal, and his supporters continue to close ranks. Even Trump’s most voluble advocates aren’t safe from their wrath should they voice even the slightest doubt, as Tucker Carlson found when he made the rather glaringly obvious observation that such subtly orchestrated fraud on a vast scale—which leaves no trace—strains credulity: “What [Trump lawyer Sidney] Powell was describing would amount to the single greatest crime in American history,” Carlson said on his show this past Thursday. “Millions of votes stolen in a day. Democracy destroyed. The end of our centuries-old system of government.” The backlash from Trump supporters and other Trump-friendly media figures was immediate, with Rush Limbaugh’s producer asking (and betraying an ignorance of how evidence and the law works), “Where is the ‘evidence’ the election was fair?” With trenchant understatement, the NY Times’ Jeremy W. Peters observes that “The backlash against Mr. Carlson and Fox for daring to exert even a moment of independence underscores how little willingness exists among Republicans to challenge the president and his false narrative about the election he insists was stolen.”

It goes without saying that this state of affairs is deeply dangerous, and serves to obviate any kind of amusement or schadenfreude at the spectacle of Trump’s presidency figuratively—and Giuliani literally—melting down.

I think the adhesive for Rudy’s human mask is dissolving.

As I wrote in a recent post, the incoherence of the aggregate accusations being thrown around is a feature, not a bug, of conspiracism. All it needs to do is cement in the minds of Trump voters—not all Trump voters, but a critical mass of them, to be certain—the illegitimacy of the Democrats and the impossibility that Biden could have won without cheating. It was always a given that Trump would not concede, but there was always the milder possibility that he’d resign in high dudgeon and Nixonian resentment (“You won’t have Donald Trump to kick around any more!”), claiming that he’d been cheated, accept a federal pardon from President Mike Pence, and retire to Mar-a-Lago to sulk and tweet and plan his comeback.

But no. It seems he’s determined to go all-in. Whether he actually believes he has a chance to steal the election with his scheme to have state electors overturn the results is something we’ll likely never know; what seems more probable is that he wants to be forced from office. He wants to be seen going down fighting, a victim of Democratic malfeasance, the Deep State, interference from China, Venezuela, and Cuba, and whatever other fecal matter they want to fling at the wall. And while the prospect of seeing Trump literally frog-marched out of the White House by the Secret Service one minute after noon on January 20th is almost too delicious to contemplate, that is probably one of the worst scenarios. Why? Because all of those people who have gone all-in on Trump and the narrative of the election being stolen will look up from their phones on January 20th to see Biden taking the oath of office, and see the culmination of their present fears and convictions. And what happens then is anyone’s guess, though the one absolute certainty is that a not-insignificant proportion of the U.S. populace will believe it has been stabbed in the back by the rest of the country.

More than a few times I’ve seen the fantasy being built by Trump et al called the “stab in the back” narrative, and it never fails to chill. When Germany surrendered at the end of the First World War, it came as an utter shock to the soldiers and much of the civilian population. They had thought they were winning, due to a series of gains they had made in the spring of 1918, but in truth, there was nothing left with which to continue the war. The gains they had made were the result of the Kaiserschlacht, or “Kaiser’s Battle”—more commonly known as the “Spring Offensive,” that began in March 1918 and carried on for several months. The offensive was a gamble, and a risky one: the German High Command knew their resources were running low. The recent entry of the United States on the side of the Allies made the situation even more dire. So they went all-in on a massive series of attacks in the hopes of breaking the enemy lines and forcing them into a peace negotiation that would be favourable to Germany.

They failed, but they failed while looking as if they were winning. But the tank was empty. They could keep fighting, of course, with vastly denuded stocks of weapons and ammunition, with an ever-more demoralized army, and with starvation at home. They chose instead to surrender rather than put the military through the inevitable meat grinder.

But as is the nature of quasi-dictatorial monarchies, the German government wasn’t adept at messaging … the end came as a shock to the army and the civilian population because they had no idea how bad the situation actually was. And after the humiliation of the Treaty of Versailles, unsurprisingly, people looked for whom to blame. One of the most persistent theories was the “stab in the back” narrative, which held that powerful business interests of an internationalist character, and therefore disloyal to Germany—i.e., the Jews—were responsible for bringing about Germany’s cowardly capitulation. For stabbing Germany in the back.

Yes, yes, insert Godwin’s Law disclaimer here. But it is more than a little uncanny to consider that these events occurred almost precisely one century ago—and doubly uncanny to further consider that 1918-1919 was the last occurrence of a truly global pandemic.

As the saying goes, history doesn’t repeat itself, but it does rhyme. Unlike most rhymes, however, these can be discordant and jarring to the soul.


Filed under maunderings, The Trump Era, Trump, wingnuttery

The Mad King in his Labyrinth

For reasons I can’t quite put my finger on, I’ve been thinking these past several days about mad kings, both fictional and historical.

It started with a Facebook post, alluding to George R.R. Martin’s series A Song of Ice and Fire—which some will know better by the HBO adaptation Game of Thrones—in which I said “We could really use a young Jaime Lannister in the White House right about now.” The allusion, as anyone who has read the novels and/or watched the series will know, is to a key backstory plot point in which the Mad King, Aerys II, was murdered by Jaime Lannister, a member of his sworn Kingsguard—clearing the way for the usurpation of the Iron Throne by Robert Baratheon.

Martin, a keen student of history, loosely based the conflict animating the first few novels on the Wars of the Roses, the English civil wars that convulsed the nation for the better part of the fifteenth century; indeed, the two principal warring families of his series, the Starks and the Lannisters, bear more than a passing resemblance (phonetically, at any rate) to the Yorks and the Lancasters. But the Mad King himself—glimpsed only secondhand in various characters’ accounts of Robert Baratheon’s rebellion—bears a closer resemblance to the handful of lunatic Roman emperors who punctuate the empire’s history: Caligula with his murderous licentiousness, Nero with his narcissistic self-regard, and so forth. Nero was declared a public enemy by the Senate and killed himself in exile; Caligula was murdered by the Praetorian Guard. Martin borrows from a raft of such histories, which also include the killings of England’s Edward II and Richard II.

The other figure Martin’s Mad King resembles is the more contemporary dictator, reduced to paranoid, delusional ranting, surrounded by toadies and sycophants because he has banished or killed everybody who dares voice the slightest dissent. It was only a matter of time (probably minutes) before somebody did a Trump version of the much-memed bunker scene from Downfall.

The mad king—or tyrant, or dictator—is a compelling character for much the same reason that car crashes are fascinating: whether it’s Hitler in his bunker or Lear on the heath, we’re witness to the unspooling of a formerly powerful, formerly charismatic person’s mind. What has been remarkable about the Trump presidency these past few weeks is how public the unspooling has been. Historically, infirmity in the highest of offices has been hidden, as much as possible, from the public view (the examples of certain Roman emperors notwithstanding). Only a handful of royal handlers were witness to the madness of George III. When Woodrow Wilson suffered a stroke late into his second term, his wife and aides kept it quiet; ditto for Ronald Reagan’s latter-day dementia. We only found out about Richard Nixon’s drunken conversations with the portraits of former presidents in the final days before his resignation years after the fact.

But then again, Trump has arguably always been unhinged—that quality of mercurial unpredictability and volcanic temper is central to reality television, after all, and it was through The Apprentice that Trump was able to reforge his public persona in such a way as to delude a critical mass of Americans into believing that he was a brilliant and canny businessman and dealmaker. I’ve lost count of how many op-eds and think pieces have made the observation that his presidency has essentially unfolded like an exhausting four years (five, counting the campaign) of reality television conventions and tropes. He is himself not unaware of this fact; it is an open question whether his tendency to do or say something outrageous when news unflattering to him breaks is a deliberate distraction strategy, or simply Trump being jealous of the spotlight.

But now we’re in the endgame. True to form, he’s playing a character, however inadvertently: sequestered in the White House, his general avoidance of the public eye speaks about as loudly as his all-caps tweets. Structurally, it is a bizarre situation, by which I mean the mad king in his labyrinth would normally be invisible to all but his closest advisors, some of whom would trot out to podiums every so often to offer anodyne updates. But of course this White House, as my mother would say, leaks like a chimney (as opposed to smoking like a sieve), and so we have frequent reports of Trump brooding, and details of the argument within his inner circle about whether to convince him to concede or keep fighting. But even without such leaks, we still have the logorrhea of Trump’s Twitter feed to keep us abreast of his downward spiral into increasingly deranged conspiracy theories about George Soros and Dominion Voting Systems machines. And of course we also have his devoted sycophants, like Rudy Giuliani and Lindsey Graham, taking every possible opportunity to go on television and propagate his paranoid maunderings.

The one bright spot in all of this is that at least there’s an expiration date: January 20, 2021, obviates the need for a Jaime Lannister or a Praetorian Guard. Which is fortunate for Trump.

Though that would make for good TV.


Filed under maunderings, politics, The Trump Era, Trump, wingnuttery

Four Years Later

Slightly less than four years ago, a countdown started. Perhaps it was obscured at points by speculations about the twenty-fifth amendment, or the spectre of impeachment and removal—or, well, really, any number of possible eventualities—but ultimately for those of us horrified by the election of Donald J. Trump, Election Day 2020 was a cognitive terminus, that point at which America’s national nightmare would either end or be validated anew. And given that the latter was, and is, more or less unthinkable—an existential crisis of both political and spiritual dimensions—this coming Tuesday is a day of reckoning. To repurpose a line from Good Omens, November 3, 2020 has been throbbing in the collective brain like a migraine.

With less than a week to go, I’ve been oscillating between zen-like calm and apocalyptic agitation. On one hand, I’ve been watching Trump unspooling in real time with all of the schadenfreude you would expect; every time he pleads, whines, or drops yet another increasingly absurd lie (did you know that California is forcing its citizens to wear a “special” mask that you cannot take off, and have to eat through? Or that Trump recently ended a 400-year war between Serbia and Kosovo?), I get calmer, seeing in his behaviour his realization that he’s going to lose. But then I read news pieces about the gun-wielding militias declaring their intentions to take to the streets if Trump loses; about the ongoing efforts by Republicans to suppress the vote; about the near-certainty that, if Republican legal challenges to a Biden victory make it to the Supreme Court, the facts of the case won’t matter all that much to Brett Kavanaugh and Amy Coney Barrett.

This all makes me feel, as Bilbo Baggins would say, somewhat thin and stretched, like too little butter spread over too much toast.

And I don’t even live in the U.S.

In the autumn of 2016, I was teaching a second-year course titled “Critical Approaches to Popular Culture.” The Department of English had recently absorbed the Communications Studies program; pop culture was (is) one of the required courses for the major. I’d taught pop culture years before, twice, when I was in the latter stages of my PhD at Western, so it was lovely to return to it. For one of the course’s units, I focused on recent sitcoms that articulated diverse, feminist sensibilities: we looked at episodes of Archer, Master of None, Brooklyn Nine-Nine, and Parks and Recreation. A former student of mine from about six or seven years earlier, who had gone on to do a master’s in English and another in gender studies, and was at the time a journalist and feminist activist, was a massive fan of Parks and Recreation, and identified strongly with the series’ main character Leslie Knope (played by Amy Poehler). The connection my former student has with Leslie was and is an obvious one: she is blonde, passionate, unremittingly (sometimes exhaustingly) enthusiastic, and devoted to feminism and the possibilities of local government to do good. Early in the semester, I emailed her and asked if she’d like to do a guest lecture on feminism and Parks and Recreation. She emailed me back almost immediately, asking (possibly jokingly) if she should do it in character as Leslie Knope.

The way things had fallen out in my scheduling, her guest lecture would take place the Thursday after the 2016 election. This was not by design, but I was delighted by the serendipity of it: the idea that I would have this extraordinary former student coming in to deliver a lecture on the character of Leslie Knope—who, in the show, idolized Hillary Clinton—two days after what most of us assumed would have been the election of the first woman president of the United States.

Well. We know how that worked out.

It was a huge boon to me that I did not teach on Wednesdays that term. Normally I would have gone up to the office anyway, but that day—which was, as I recall, appropriately grey and rainy—I instead stayed home and sat at my desk in my pyjamas, trying to work through my thoughts. I read dozens of news articles online. I wrote a blog post. And I tried to come to terms with the fact that the United States had actually elected Donald Trump.

The next day, I introduced my former student in my pop culture class, and, not unpredictably, she knocked her guest lecture out of the park. She was amazing, her lecture was amazing, and I can still congratulate myself on my decision to invite her. But there was also the uneasy sense of whistling past the graveyard, as it were: Parks and Recreation, which had by that point ended its run, was wreathed in the spirit of the Obama years. Real political figures made not-infrequent cameos as the series went on, both Democrat and Republican, conveying a sense of comity consonant with Obama’s (frequently frustrated) inclination to reach across the aisle (such as in the episode where Cory Booker and Orrin Hatch tell Leslie that they share a passion for Polynesian folk music, and perform together in a band named “Across the Isle”). And of course there was the running joke of Leslie’s conviction that Joe Biden is the sexiest man alive:

(Somehow, Leslie’s admonition to the Secret Service that Biden is “precious cargo” is a little more poignant in the present moment).

In the days leading up to the 2020 election, I’ve been rewatching Parks and Recreation. On one hand, the show is not unlike The West Wing as an imaginative salve for the present moment; but where Aaron Sorkin’s drama offers a liberal fantasia in which people work and argue in good faith (mostly), Parks is somewhat more on point insofar as it hews to a more realistic premise: people are terrible. All Leslie Knope wants to do is improve people’s lives, and she not infrequently suffers the unforeseen effects of liberal statist interventionism.

One such case occurs when the town’s sole video rental store, which hosts weekly screenings of film classics, is about to go out of business, as every other video rental store in the world has. The store’s proprietor, a pretentious film snob played by Jason Schwartzman who refuses to stock popular movies, does not help himself by being, well, a pretentious film snob who refuses to stock popular movies. Leslie secures him a grant from the city council, making him promise to overhaul his stock with more popular selections—which he does in fact do, but goes to the other extreme and turns his store into a porn emporium. Business then booms, but not even remotely in the way Leslie Knope had intended.

Such mishaps aside, the series chronicles the ways in which Leslie’s earnest and idealistic faith in government batters against the apathy, indifference, and hostility of the citizens she so wants to help. But for all of their awfulness, the people of Leslie Knope’s Pawnee aren’t hateful; they do not actively wish harm on others. Rewatching the series right now, I can’t ignore the simple fact that a stubborn forty percent of Americans support a defiantly hateful man, whom they love not for his principles but for his enemies—for how much pain he can inflict and how much cruelty he can practice.

That same semester in 2016, I taught a fourth-year seminar I titled “Revenge of the Genres.” The premise of the course was to look at texts and authors that transcended popular genres, or else used them in metaphorical or critical ways: we did Neil Gaiman’s novel American Gods, Colson Whitehead’s zombie apocalypse novel Zone One, Alison Bechdel’s graphic novel Fun Home, among others … and we ended the class with Hamilton. Even more than Parks and Recreation, Hamilton feels like a relic of the Obama years—not least because its conception, staging, and extraordinary success unfolded over Obama’s presidency, and indeed its title song received its first audience at the White House in 2009.

One of our points of discussion in class was how to read Hamilton not just post-Obama, but in the new age of Trump. By the time we started on it in class, we were a few weeks past the election; there was a sort of cosmic irony in examining a musical about a man whose co-creation of the Electoral College was undertaken to prevent the rise of a populist demagogue to the presidency.

There was also the leaden feeling that the play’s optimism and faith in the American experiment had been definitively belied by Trump’s election. As much as I love Hamilton, my one misgiving about it has always been the conviction that part of its popularity—aside from simply being an astonishingly good play—proceeded from the fact that it gave white liberals permission to celebrate the origin story of the United States without caveats. I mean, let’s be honest—the story of the United States’ founding is a pretty compelling one to start with. As a Canadian kid who grew up watching Schoolhouse Rock, I was frequently envious, because my own country lacked a revolutionary beginning and such colourful characters as Hamilton, Jefferson, and Franklin. But of course, as one ages and learns more, the broader contours of that history become tainted by the ugly facts of slavery and the genocide of indigenous peoples. The story remains compelling, but the more honest we are about the actual history, the more those caveats are going to inflect it.

Hence, a hip-hop musical written by a man of Puerto Rican heritage—whose first musical, In The Heights, was about life in an Hispanic neighbourhood in New York—which not only unapologetically celebrates “the ten dollar founding father” and the “American experiment,” but also casts predominantly Black and brown performers to play the roles of the white Founders, gives permission to white milquetoast liberals like myself to set aside the caveats for two and a half hours and enjoy America’s origin story set to virtuosic music.

And lest that sound cynical, I should hasten to add that it’s not merely about the permission structure—it’s also the hope it inspires.

Hope also feels a little like a relic of the Obama years, but I can’t let myself think that. I’m wound up pretty tightly at the moment, but I do think that there are reasons to hope. I always tell my students in my American Lit classes that America is an idea. As one obscure Irish poet put it, it’s possibly the best idea the world ever had, but it has never been properly realized. The power of that idea, however, is what fuels Hamilton, what made The West Wing a hit TV show, and why Leslie Knope is an endearing character. The problem at the heart of Trumpism, as with all nativist populism, is that it has divorced the idea of America from its mythos. The America of MAGA is inert: an unchanging bundle of resentment stuck somewhere in an imaginary past. The idea of America, by contrast, is dynamic and hopeful; it is generous and open. It is also what Joe Biden has been articulating throughout his campaign. Precious cargo, indeed.

See everybody on the other side.


Filed under maunderings, The Trump Era, what I'm watching