A Few Things

Summer Blogging Update     Several posts ago, I announced my summer blogging plans; I then dutifully followed up with a three-part series on memory, history, and forgetting in fairly quick succession. And then … well, nothing for two weeks. That’s largely because my next series of posts, which will be a deep dive into postmodernism—what it is, what it was, what it isn’t, and why most of the people using the word in the media these days have no idea what they’re talking about—has been taking somewhat longer to write than I’d planned. Actually, that isn’t entirely true: the second installment is all but completed, and I’ve gotten a healthy start on the third and fourth, but the introductory post is taking longer (and is getting longer). I am optimistic that I will have it done by the end of the weekend, however, so hopefully Phase Two of My Ambitious Posts No One Will Read will soon commence.

What I’ve Watched     A few weeks ago I binged the series Rutherford Falls on Amazon Prime. I’d read some positive reviews of it and heard good things via NPR’s Pop Culture Happy Hour podcast (always a reliable guide); the show is also co-created by Michael Schur, a showrunner who, in my opinion, has been batting 1.000 for years: The Office, Parks and Recreation, Brooklyn Nine-Nine, The Good Place. But what’s particularly notable about Rutherford Falls is that its premise is rooted in the relationship between the titular town’s indigenous and non-indigenous townsfolk. The other co-creator is Sierra Teller Ornelas, a Navajo screenwriter and filmmaker. The writers’ room is also well stocked with Native American writers, and the cast is remarkable.

Ed Helms and Jana Schmieding.

Ed Helms plays Nathan Rutherford, a scion of the family that gives the town its name, and about the most Ed Helms character I’ve yet seen—a painfully earnest and well-meaning but frequently clueless fellow whose entire sense of self is bound up in his family’s history and that of the town. He runs a museum-ish space in his ancestral home. His best friend is Reagan (Jana Schmieding), a member of the (fictional) Minishonka Nation who is more or less persona non grata in her Native community because, a number of years before, she left her fiancé at the altar and went off to do her master’s degree at Northwestern. She runs—or attempts to run—a “cultural center” inside the local casino, which is owned by Terry Thomas (Michael Greyeyes), the de facto leader of the Minishonka community.

The show is hilarious, but is also quite comfortable in its own skin (so to speak)—it engages with difficult issues of Native history, white entitlement and appropriation of Native culture, the memorialization of settler culture (the precipitating crisis comes when the town’s African-American mayor seeks to remove a statue of the town’s founding Rutherford from the town square, not because it’s a symbol of colonialism, but because cars keep crashing into it), and Terry Thomas’s fraught navigation of American capitalism as a means of accruing power and influence for his people—all without ever losing its humour or coming across as pedantic. A key moment comes in the fourth episode, when an NPR podcaster, having sniffed out that there might be a story in this sleepy town, interviews Terry. He asks him how he reconciles the contradiction of running a casino—the epitome of capitalistic graft—with his own Native identity. Terry, who had been answering the podcaster’s questions with a politician’s practiced smoothness, reaches out and pauses the recorder—and then proceeds, in a stern but not angry voice, to school the well-meaning NPR hipster on precisely how the casino is consonant with Native values: it is not about the accrual of wealth for wealth’s sake, but for the benefit of a community that has long been marginalized. He then hits play again, and resumes his breezy tone for the rest of the interview.

It is a bravura performance; indeed, Michael Greyeyes is the series’ great revelation. A Canadian-born actor of the Cree Nation, he has long been a staple in Canadian and American indigenous film, or film and television featuring indigenous characters, and has (at least in most of the stuff I’ve seen him in) tended to play stern, brooding, or dangerous characters. So it is a joy to see him show off his comedic talents. His best line? When arguing with Nathan Rutherford over the historical accuracy of a costume he wants him to wear in his planned historical theme park, he says in exasperation, “Our market research shows that the average American’s understanding of history can be boiled down to seven concepts: George Washington, the flag, Independence Day, Independence Day the movie, MLK, Forrest Gump, and butter churns.”

What I’m Currently Watching     Well, we just watched the season four finale of The Handmaid’s Tale last night, and I’m going to need a while to process that. And we watched the series premiere of Loki, which looks promising—largely because a Tom Hiddleston / Owen Wilson buddy comedy is unlikely to disappoint no matter what is done to it.

F. Murray Abraham, Danny Pudi, David Hornsby, Rob McElhenney, and Charlotte Nicdao

But the show we’re loving right now beyond what is strictly rational is the second season of Mythic Quest on AppleTV. For the unfamiliar: it’s the creation of Rob McElhenney, Charlie Day, and Megan Ganz, the folks who brought you It’s Always Sunny In Philadelphia; the series is a workplace comedy set in the offices of the studio behind a video game called Mythic Quest: Raven’s Banquet, an MMORPG along the lines of World of Warcraft or Elder Scrolls. McElhenney plays the game’s creator and mastermind Ian (pronounced “Ion”) Grimm, a self-styled visionary who is sort of a cross between Steve Jobs and a lifestyle guru. Charlotte Nicdao plays Poppy, the perennially anxious and high-strung lead engineer who is responsible for turning Ian’s (again, pronounced “Ion”) ideas into workable code. Danny Pudi, who most famously played Abed in Community, is Brad, the “head of monetization,” delightfully cast here as a kind of anti-Abed who knows little and cares less about pop culture and video gaming and is only concerned with how he can wring every last cent out of the game’s devoted players. And F. Murray Abraham—yes, Salieri himself—plays the game’s head writer, washed-up SF author C.W. Longbottom, a lascivious alcoholic whose sole claim to fame (to which he clings like a barnacle) is having won a Nebula Award in the early 1970s.

That combination of characters alone is a selling point—but the show is also incredibly smart and, just when you least expect it, deeply emotional.

What I’m Reading     At this point in the early summer as I scramble to use my time to complete at least one article before September, as well as move my summer blogging project forward (to say nothing of starting on course prep for the Fall), the question feels a little more like what am I not reading.

But that’s all business. My pleasure reading of the moment is a trilogy by M.R. Carey. Carey wrote the brilliant zombie apocalypse novel The Girl With All the Gifts—which was made into a quite good film starring Glenn Close and Gemma Arterton—and its companion novel set in the same world, The Boy on the Bridge. One of my grad students this past semester alerted me to the fact that Carey had recently published a series set in another post-apocalyptic scenario. The “Ramparts Trilogy”—comprising the novels The Book of Koli, The Trials of Koli, and The Fall of Koli—isn’t really a trilogy per se. The novels were all released within several months of each other, suggesting that “Ramparts” was really more of a 1,000+ page novel that the publisher chose to release in serial form rather than all at once.

Like the other Carey novels I’ve read, the Koli books are post-apocalyptic, set in an England some three to four centuries after humanity more or less obliterated itself in “the Unfinished War.” It is a world in which humanity has been reduced to a mere handful of its former numbers, and everything in nature now seems intent on killing the remaining survivors. Human manipulation of trees and vegetation to make them grow faster, part of an effort to reverse climate change, has resulted in hostile forests: only on cold or overcast days can anyone walk among the trees without being snared by their tendrils. Any wood cut has to be “cured” in saline vats to kill it before it can be used as lumber. And the animal life that has evolved to live in such an environment is equally dangerous.

Koli is, at the start of the novels, an earnest if slightly dim fifteen-year-old living in the village of Mythen Rood in the Calder Valley in the northwest of what was once England. His family are woodsmiths; Mythen Rood is protected by the Ramparts, villagers who can use the tech of the old world—a flamethrower, a guided bolt-gun, a “cutter” that emits an invisible cutting beam, and a database that offers up helpful but cryptic knowledge and information.

There is much tech left over from the old world, but most of it is useless. Koli’s story essentially begins when he becomes obsessed with “waking” a piece of tech and joining the ranks of the Ramparts.

And that’s all I’ll tell you of the story: suffice to say, with one-third of the last novel to go, I am captivated by both the story Carey tells and the potential future he envisions.


History, Memory, and Forgetting Part 3: The Backlash Against Remembering

“The struggle of man against power is the struggle of memory against forgetting.”

—Milan Kundera, The Book of Laughter and Forgetting

I had drafted the first two installments of this three-part series of posts and was working on this one when the news broke of the discovery of the remains of 215 indigenous children on the grounds of a former residential school in BC. I paused for a day or two, uncertain of whether I wanted to talk about it here, in the context of my broader theme of history, memory, and forgetting. Part of my hesitation was that I honestly lacked the words for what is a shocking but utterly unsurprising revelation.

I must also confess that, to my shame, one of my first thoughts was the dread certainty that we’d soon be treated to some tone-deaf and historically ignorant column from the likes of Rex Murphy or Conrad Black or any one of the coterie of residential school apologists. So far, however, the usual suspects seem to be steering clear of this particular story; possibly the concrete evidence of so much death and suffering perpetrated by white turnkeys in the name of “civilizing” Native children is a bridge too far even for Murphy and Black et al.’s paeans to Western civilization and white Canada’s munificence.

What I’m talking about in this third post of three is remembering as a political act: more specifically, making others remember aspects of our history that they may not want to accept or believe. Scouting as I did for anything trying to ameliorate or excuse or explain away this evidence of the residential schools’ inhumanity,[1] I found my way to a 2013 Rex Murphy column that might just be the epitome of the genre, as one gleans from its title: “A Rude Dismissal of Canada’s Generosity.” In Murphy’s telling, in this column as in others of his rants and writings, conditions for Native Canadians in the present day are a vast improvement over historical circumstances, but the largesse of white Canadians and the government is something our indigenous populations perversely deny at every turn. He writes: “At what can be called the harder edges of native activism, there is a disturbing turn toward ugly language, a kind of razor rhetoric that seeks to cut a straight line between the attitudes of a century or a century and a half ago and the extraordinarily different attitudes that prevail today.”

“Attitudes” is the slippery word there: outside of unapologetically anti-indigenous and racist enclaves, I doubt you’d have difficulty finding white Canadians who would piously agree that the exploitation and abuse of our indigenous peoples was a terrible thing. You’d have a much harder time finding anyone willing to do anything concrete about it, such as restoring the land we cite in land acknowledgments to its ancestral people. Attitudes, on balance, have indeed improved, but that has had little effect on Native peoples’ material circumstances. And in his next paragraph, Murphy seems intent on demonstrating that not all attitudes have, in fact, improved:

From native protestors and spokespeople there is a vigorous resort to current radical jargon—referring to Canadians as colonialist, as settlers, as having a settler’s mentality. Though it is awkward to note, there is a play to race in this, a conscious effort to ground all issues in the allegedly unrepentant racism of the “settler community.” This is an effort to force-frame every dispute in the tendentious framework of the dubious “oppression studies” and “colonial theory” of latter-day universities.

And there it is—the “radical jargon” that seeks to remember. Referring to Canadians as colonialist settlers isn’t radical, nor is it jargon, but a simple point of fact—and indeed, for decades history textbooks referred to settlers as brave individuals and the colonizing of Canada as a proud endeavour, necessarily eliding the genocidal impact on the peoples already inhabiting the “new world.” Murphy’s vitriol is, essentially, denialism: denying that our racist and oppressive history lingers on in a racist present. He speaks for an unfortunately large demographic of white Canada that is deeply invested in a whitewashed history, and reacts belligerently when asked to remember things otherwise.

This is a phenomenon we see playing out on a larger scale to the south, most recently with a substantial number of Republican-controlled state legislatures introducing bills that would forbid schools from teaching any curricula suggesting that the U.S. is a racist country, that it has had a racist history, or really anything that suggests racism is systemic and institutional rather than an individual failing. The bogeyman in much of the proposed legislation is “critical race theory.” Like Rex Murphy’s sneering characterization of “latter-day universities” offering degrees in “oppression studies” (not actually a thing), critical race theory is stigmatized as emerging from the university faculty lounge as part and parcel of “cultural Marxism’s” sustained assault on the edifices of Western civilization.[2] While critical race theory did indeed emerge from the academy, it was (and is) a legal concept developed by legal scholars like Derrick Bell and Kimberlé Crenshaw[3] in the 1970s and 80s. As Christine Emba notes, “It suggests that our nation’s history of race and racism is embedded in law and public policy, still plays a role in shaping outcomes for Black Americans and other people of color, and should be taken into account when these issues are discussed.” As she further observes, it has a clear and quite simple definition, “one its critics have chosen not to rationally engage with.”

Instead, critical race theory is deployed by its critics to connote militant, illiberal wokeism in a manner, to again quote Emba, that is “a psychological defense, not a rational one”—which is to say, something meant to evoke suspicion and fear rather than thought. The first and third words of the phrase, after all, associate it with elite liberal professors maundering in obscurantist jargon, with which they indoctrinate their students, turning them into shrill social justice warriors. (The slightly more sophisticated attacks will invoke such bêtes noires of critical theory as Michel Foucault or Jacques Derrida[4]).

But again, the actual concept is quite simple and straightforward: racism is systemic, which should not be such a difficult concept to grasp when you consider, for example, how much of the wealth produced in the antebellum U.S. was predicated on slave labour, especially in the production of cotton—something that also hugely benefited the northern free states, whose textile mills profitably transformed the raw cotton into cloth. Such historical realities, indeed, were the basis for the 1619 Project, the New York Times’ ambitious attempt to reframe American history through the lens of race—arguing that the true starting point of America was not the Declaration of Independence in 1776, but the arrival of the first African slaves on American soil in 1619.

The premise is polemical by design, and while some historians took issue with some of the claims made, the point of the project was an exercise in remembering aspects of a national history that have too frequently been elided, glossed over, or euphemized. In my previous post, I suggested that the forgetting of the history of Nazism and the Holocaust—and its neutering through the overdeterminations of popular culture—has facilitated the return of authoritarian and fascistic attitudes. Simultaneously, however, it’s just as clear that this revanchist backsliding in the United States has as much to do with remembering. The reactionary crouch inspired by the tearing down of Confederate monuments isn’t about “erasing history,” but remembering it properly: remembering that Robert E. Lee et al. were traitors to the United States and were fighting first and foremost to maintain the institution of slavery. Apologists for the Confederacy aren’t wrong when they say that the Civil War was fought over “states’ rights”; they’re just eliding the fact that the principal “right” being fought for above all others was the right to enslave Black people. All one needs to do to clarify this particular point is to read the charters and constitutions of the secessionist states, all of which make the supremacy of the white race and the inferiority of Africans their central tenet. The Confederate Vice President Alexander Stephens declared that the “cornerstone” of the Confederacy was that “the negro is not equal to the white man; that slavery, subordination to the superior race, is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”

Statue of Nathan Bedford Forrest in Memphis, Tennessee

Bringing down Confederate monuments isn’t erasure—it’s not as if Robert E. Lee and Nathan Bedford Forrest disappear from the history books because people no longer see their bronze effigies in parks and town squares—but the active engagement with history. That was also the case with the erection of such monuments, albeit in a more pernicious manner: the vast majority were put up in the 1920s and 1950s, both of which were periods when white Americans felt compelled to remind Black Americans of their subordinate place by memorializing those who had fought so bloodily to retain chattel slavery. Like the Confederate battle flag, these monuments were always already signifiers of white supremacy, though that fact has been systematically euphemized with references to “southern tradition,” and of course the mantra of states’ rights.

Martin Sheen as Robert E. Lee in Gettysburg (1993)

Once when I was still in grad school and had gone home to visit my parents for a weekend, I was up late watching TV, thinking about going to bed, and while flipping channels happened across a film set during the Civil War. It was about halfway through, but I stayed up watching until it was done. The story was compelling, the acting was really good, and the battle scenes were extremely well executed. The film was Gettysburg, which had been released in 1993. It had a huge cast, including Jeff Daniels and Sam Elliott, and Martin Sheen as General Robert E. Lee. Because I’d only seen the second half, I made a point of renting it so I could watch the entire thing.

Gettysburg is a well-made film with some great performances, and is very good at pushing emotional buttons. Colonel Joshua Chamberlain’s (Jeff Daniels) bayonet charge down Little Round Top is a case in point.

I’m citing Gettysburg here because it is one of the most perfect examples of how deeply the narrative of the Lost Cause became rooted in the American imagination. The Lost Cause, for the uninitiated, is the ongoing project, begun just after the end of the Civil War, to recuperate and whitewash (pun intended) the Confederacy and the antebellum South. Its keynotes include the aforementioned insistence that the Civil War wasn’t fought over slavery but states’ rights; its foregrounding of cultured and genteel Southern gentlemen and women; depictions of slaves (when depicted at all) as happy spiritual-singing fieldhands under the benevolent supervision of the mastah; Northerners as rapacious carpetbaggers who proceeded to despoil everything good and noble about the South in the years following the war; and the war itself as a tragic but honourable dispute between sad but dutiful officer classes prosecuting their respective sides’ goals not in anger but sorrow.

This last element is Gettysburg’s connective tissue. Let’s be clear: the film isn’t Southern propaganda like D.W. Griffith’s Birth of a Nation. It is, indeed, high-minded and even-handed. Martin Sheen—Martin fuckin’ Jed Bartlet Sheen!—plays Robert E. Lee, one of the prime targets of statue removers. But the film is propagandistic: there is, to the best of my recollection, not a single Black character in the film, and slavery as the cause of the conflict is alluded to (again, to the best of my recollection) only once—and then only obliquely.

My point in using this example—I could just as easily have cited Gone With the Wind or Ken Burns’ docuseries The Civil War—is how insidiously the Lost Cause narrative has wormed its way into the American imaginary. It is of a piece with everything else I’ve been talking about in this three-part series of posts. What novels like The Underground Railroad and historical reckonings like the 1619 Project—as well as the campaign to tear down Confederate monuments—attempt is a kind of radical remembering. And as we see from the ongoing backlash, to remember things differently can be threatening to one’s sense of self and nation.

NOTES


[1] As I said, none of the usual suspects seems to have advanced an opinion, but there was—not unpredictably—an awful lot of such attempts in the comments sections on articles about the grisly discovery. They ranged from the usual vile racist sentiments one always finds in these digital cesspools, to one person who argued at length that the child mortality rates in residential schools were consonant with child mortality rates in the population at large, nineteenth-century hygiene and medicine being what they were. This individual was undeterred from their thesis in spite of a long-suffering interlocutor who provided stats and links showing that (a) what the person was referencing was infant mortality rates, which is not the same thing; (b) the death rates in residential schools were actually more egregious in the 1930s and 40s; and (c) mass burials in unmarked graves without proper records and death certificates spoke to the dehumanization of the Native children on one hand, and on the other to the likelihood that the “teachers” at these schools were reluctant to leave evidence of their abusive treatment.

[2] I will have a lot more to say on this particular misapprehension of “the university faculty lounge” in a future post on the more general misapprehensions of what comprises a humanities degree.

[3] Crenshaw also developed that other concept that triggers conservatives, “intersectionality.”

[4] Stay tuned for my forthcoming post, “The Conspiracy Theory of Postmodernism.”


History, Memory, and Forgetting Part 2: Forgetting and the Limits of Defamiliarization

“We cross our bridges when we come to them, and burn them behind us, with nothing to show for our progress except a memory of the smell of smoke, and a presumption that once our eyes watered.”

—Tom Stoppard, Rosencrantz and Guildenstern Are Dead

In my first year at Memorial, I taught one of our first-year fiction courses. I ended the term with Martin Amis’ novel Time’s Arrow—a narrative told from the perspective of a parasitic consciousness that experiences time backwards. The person to whom the consciousness is attached turns out to be a Nazi war criminal hiding in America, who was a physician working with Dr. Mengele at Auschwitz. Just as we get back there, after seeing this man’s life played in reverse to the bafflement of our narrator, the novel promises that things will start to make sense … now. And indeed, the conceit of Amis’ novel is that the Holocaust can only make sense if played in reverse. Then, it is not the extermination of a people, but an act of benevolent creation—in which ashes and smoke are called down out of the sky into the chimneys of Auschwitz’s ovens and formed into naked, inert bodies. These bodies then have life breathed into them, are clothed, and sent out into the world. “We were creating a race of people,” the narrative consciousness says in wonder.

Time’s Arrow, I told my class, is an exercise in defamiliarization: it wants to resist our becoming inured to the oft-repeated story of the Holocaust, and so requires us to view it from a different perspective. Roberto Benigni’s film Life is Beautiful, I added, worked to much the same end, by (mostly) leaving the explicit brutalities of the Holocaust offstage (as it were), as a father clowns his way through the horror in order to spare his son the reality of their circumstances. As I spoke, however, and looked around the classroom at my students’ uncomprehending expressions, I felt a dread settle in my stomach. Breaking off from my prepared lecture notes, I asked the class: OK, be honest here—what can you tell me about the Holocaust?

As it turned out, not much. They knew it happened in World War II? And the Germans were the bad guys? And the Jews didn’t come out of it well …? (I’m honestly not exaggerating here). I glanced at my notes, and put them aside—my lecture had been predicated on the assumption that my students would have a substantive understanding of the Holocaust. This was not, to my mind, an unreasonable assumption—I had grown up learning about it in school by way of books like Elie Wiesel’s Night, but also seeing movies depicting its horrors. But perhaps I misremembered: I was from a young age an avid reader of WWII history (and remain so to this day), so I might have assumed your average high school education would have covered these bases in a more thorough manner.[1]

The upshot was that I abandoned my lecture notes and spent the remaining forty minutes of class delivering an off-the-cuff brief history of the Holocaust that left my students looking as if I’d strangled puppies in front of them, and me deeply desiring a hot shower and a stiff drink.

In pretty much every single class I’ve taught since, I will reliably harangue my students that they need to read more history. To be fair, I’d probably do that even without having had this particular experience; but I remember thinking, apropos of Amis’ brilliant narrative conceit, that defamiliarization only works if there has first been familiarization, and it depressed me to think that the passage of time brings with it unfamiliarization—i.e. the memory-holing of crucial history that, previously, was more or less fresh in the collective consciousness. The newfound tolerance for alt-right views and the resurgence of authoritarian and fascist-curious perspectives (to which the Republican Party is currently in thrall) proceed from a number of causes, but one of them is the erosion of memory that comes with time’s passage. The injunctions against fascism that were so powerful in the decades following WWII, when the memory of the Holocaust was still fresh and both the survivors and the liberators were still ubiquitous, have eroded—those whose first-hand testimonials gave substance to that history have largely passed away. Soon none will remain.

What happens with a novel like Time’s Arrow or a film like Life is Beautiful when you have audiences who are effectively ignorant of the history informing their narrative gambits? Life is Beautiful, not unpredictably, evoked controversy because it was a funny movie about the Holocaust. While it was largely acclaimed by critics, there were a significant number who thought comedy was egregiously inappropriate in a depiction of the Holocaust,[2] as was using the Holocaust as a backdrop for a story focused on a father and his young son. As Italian film critic Paolo Mereghetti observes, “In keeping with Theodor Adorno’s idea that there can be no poetry after Auschwitz, critics argued that telling a story of love and hope against the backdrop of the biggest tragedy in modern history trivialized and ultimately denied the essence of the Holocaust.” I understand the spirit of such critiques, given that humour—especially Roberto Benigni’s particular brand of manic clowning—is jarring and dissonant in such a context, but then again, that’s the entire point. The film wants us to feel that dissonance, and to interrogate it. And not for nothing, but for all of the hilarity Benigni generates, the film is one of the most heartbreaking I’ve seen, as it is about a father’s love and his desperate need to spare his son from the cruel reality of their circumstances. Because we’re so focused on the father’s clownish distractions, we do not witness—except for one haunting and devastating scene—the horrors that surround them.

In this respect, Life is Beautiful is predicated on its audience being aware of those unseen horrors, just as Time’s Arrow is predicated on its readers knowing the fateful trajectory of Jews rounded up and transported in boxcars to their torture and death in the camps, to say nothing of the grotesque medical “experiments” conducted by Mengele. The underlying assumption of such defamiliarization is that an oft-told history such as the Holocaust’s runs the risk of inuring people to its genuinely unthinkable proportions.[3] It is that very unthinkability, fresher in the collective memory several decades ago, that drove Holocaust denial among neo-Nazi and white supremacist groups—because even such blinkered, hateful, ignorant bigots understood that the genocide of six million people was a morally problematic onion in their racial purity ointment.[4]

“I know nothing, I see nothing …”

They say that tragedy plus time begets comedy. It did not take long for Nazis to become clownish figures on one hand—Hogan’s Heroes first aired in 1965, and The Producers was released in 1967—and one-dimensional distillations of evil on the other. It has become something of a self-perpetuating process: Nazis make the best villains because (like racist Southerners, viz. my last post) you don’t need to spend any time explaining why they’re villainous. How many Steven Spielberg films embody this principle? Think of Indiana Jones in The Last Crusade, looking through a window into a room swarming with people in a certain recognizable uniform: “Nazis. I hate these guys.” It’s an inadvertently meta moment, as well as a throwback to Indy’s other phobia in Raiders of the Lost Ark: “Snakes. Why’d it have to be snakes?” Snakes, Nazis, tomato, tomahto. Though I personally consider that a slander against snakes, the parallel is really about an overdetermined signifier of evil and revulsion, one that functions to erase nuance.

Unfortunately, if tragedy plus time begets comedy, it also begets a certain cultural amnesia when historically-based signifiers become divorced from a substantive understanding of the history they’re referencing. Which is really just a professorial way of saying that the use of such terms as “Nazi” or “fascist,” or comparing people to Hitler, has become ubiquitous in a problematic way, especially in the age of social media. Case in point, Godwin’s Law, which was formulated by Michael Godwin in the infancy of the internet (1990). Godwin’s Law declares that “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.” This tendency has been added to the catalogue of logical fallacies as the “Reductio ad Hitlerum,” which entails “an attempt to invalidate someone else’s position on the basis that the same view was held by Adolf Hitler or the Nazi Party.”

Perhaps the most egregious recent example of this historical signification was Congresswoman and QAnon enthusiast Marjorie Taylor Greene’s comparison of Nancy Pelosi’s decision to continue requiring masks to be worn in the House of Representatives (because so many Republicans have declared their intention to not get vaccinated) to the Nazi law requiring Jews to wear a gold Star of David on their chests. She said, “You know, we can look back at a time in history where people were told to wear a gold star, and they were definitely treated like second class citizens, so much so that they were put in trains and taken to gas chambers in Nazi Germany. And this is exactly the type of abuse that Nancy Pelosi is talking about.”

To be certain, Greene was roundly condemned by almost everybody, including many in her own party—even the craven and spineless Minority Leader Kevin McCarthy had some stern words for her—but what she said was different not in kind but in degree from the broader practice of alluding to Nazism and the Holocaust in glib and unreflective ways.[5]

Though this tendency is hardly new—Godwin’s Law is thirty-one years old—it has been amplified and exacerbated by social media, to the point where it became difficult to find terms to usefully describe and define Donald Trump’s authoritarian tendencies. The ubiquity of Nazi allusions has made them necessarily diffuse, and so any attempt to characterize Trumpism as fascist in character could be easily ridiculed as alarmist and hysterical; and to be fair, there were voices crying “fascist!” from the moment he made that initial descent on his golden escalator to announce his candidacy. That those voices proved prescient rather than alarmist doesn’t obviate the fact that they muddied the rhetorical waters.[6] As the contours of Trump’s authoritarian tendencies came into focus, the fascistic qualities of the Trumpian Right became harder and harder to ignore; bizarrely, they’ve become even more clearly delineated since Trump left office, as Republicans still kowtow to the Mar-a-Lago strongman and move to consolidate minoritarian power.

Historians and political philosophers and a host of other thinkers of all stripes will be years in unravelling the historical and cultural strands of Trump’s rise and the previously unthinkable hold that Trumpism has over a stubborn rump of the electorate; but I do think that one of the most basic elements is our distressing tendency toward cultural amnesia. It makes me think we’re less in need of defamiliarizing history than defamiliarizing all the clichés of history that have become this inchoate jumble of floating signifiers, which allow neo-Nazis and white supremacists to refashion themselves as clean cut khaki-clad young men bearing tiki-torches, or to disingenuously euphemize their racism as “Western chauvinism” and meme their way out of accusations of ideological hatefulness—“It’s just about the lulz, dude.”

There is also, as I will discuss in the third of these three posts, the fact that remembering is itself a politically provocative act. On one hand, the diminution in the collective memory of Nazism and the Holocaust has facilitated the re-embracing of its key tropes; on the other, the active process of remembering the depredations of Western imperialism and the myriad ways in which slavery in the U.S. wasn’t incidental to the American experiment but integral gives rise to this backlash that takes refuge in such delusions as the pervasiveness of anti-white racism.

NOTES


[1] To be clear, this is not to castigate my students; as I’ll be expanding on as I go, historical amnesia is hardly limited to a handful of first-year university students in a class I taught fifteen years ago.

[2] One can only speculate on what such critics make of Mel Brooks’ career.

[3] Martin Amis attempts something similar in his 1987 short story collection Einstein’s Monsters, which is about nuclear war. His lengthy introductory essay “Thinkability” (to my mind, the best part of the book) addresses the way in which military jargon euphemizes the scope and scale of a nuclear exchange, precisely to render the unthinkable thinkable. 

And speaking of humour used to defamiliarize horror: Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, and General Buck Turgidson’s (George C. Scott) own “thinkability” regarding the deaths in a nuclear war: “Mr. President, we are rapidly approaching a moment of truth, both for ourselves as human beings and for the life of our nation. Now, truth is not always a pleasant thing. But it is necessary now to make a choice, to choose between two admittedly regrettable, but nevertheless distinguishable, post-war environments: one where you got 20 million people killed, and the other where you got 150 million people killed! … Mr. President, I’m not saying we wouldn’t get our hair mussed, but I do say no more than 10 to 20 million killed, tops! Uh, depending on the breaks.”

[4] Which always rested on a tacit contradiction in their logic: it didn’t happen, but it should have.

[5] We see the same tendency, specifically among conservatives, to depict any attempt at raising taxes or expanding social programs as “socialism,” often raising the spectre of “Soviet Russia”—which is about as coherent as one of my favourite lines from Community, when Britta cries, “It’s just like Stalin back in Russia times!”

[6] I don’t say so to castigate any such voices, nor to suggest that they were premature—the contours of Trump’s authoritarian, nativist style were apparent from before he announced his candidacy to anyone who looked closely enough.


History, Memory, and Forgetting Part 1: Deconstructing History in The Underground Railroad

“Yesterday, when asked about reparations, Senate Majority Leader Mitch McConnell offered a familiar reply: America should not be held liable for something that happened 150 years ago, since none of us currently alive are responsible … This rebuttal proffers a strange theory of governance, that American accounts are somehow bound by the lifetime of its generations … But we are American citizens, and thus bound to a collective enterprise that extends beyond our individual and personal reach.”

—Ta-Nehisi Coates, Congressional Hearing on Reparations, 20 June 2019
Thuso Mbedu as Cora in The Underground Railroad

I’ve been slowly working my way through Barry Jenkins’ ten-episode adaptation of Colson Whitehead’s novel The Underground Railroad. I’ve been a huge fan of Whitehead’s fiction ever since I got pointed towards his debut novel The Intuitionist; I read The Underground Railroad when it was still in hardcover, and I’ve included it twice in classes I’ve taught. When I saw it was being adapted to television by the virtuoso director of Moonlight and If Beale Street Could Talk, I knew this was a series I wanted to watch.

I was also wary—not because I was concerned about the series keeping faith with the novel, but because I knew it would make for difficult viewing. Whatever liberties Whitehead takes with history (as I discuss below), he is unsparing with the brutal historical realities of slavery and the casual cruelty and violence visited on slaves. It is often difficult enough to read such scenes, but having them depicted on screen—seeing cruelty and torture made explicit in audio and visual—can be more difficult still to watch.

For this reason, for myself at least, the series is the opposite of bingeable. After each episode, I need to digest the story and the visuals and think.

The Underground Railroad focuses on the story of Cora (Thuso Mbedu), a teenage girl enslaved on a Georgia plantation whose mother had escaped when she was young and was never caught, leaving Cora behind. Another slave named Caesar (Aaron Pierre) convinces her to flee with him. Though she is reluctant at first, circumstances—the kind of circumstances that make the show difficult viewing—convince her to go. She and Caesar and another slave named Lovey (who had seen them go and, to their dismay, tagged along) are waylaid by slave-catchers. Though Lovey is recaptured, Cora and Caesar escape, but in the process one of the slave-catchers is killed. They make it to a node of the Underground Railroad and are sheltered by a white abolitionist with whom Caesar had been in contact. He then takes them underground to the “station,” where they wait for the subterranean train that will take them out of Georgia.

Because that is the principal conceit of The Underground Railroad: that the rail system spiriting slaves away is not metaphorical, but literal, running through underground tunnels linking the states.

More than a few reviews of the series, and also of the novel on which it is based, have referred to the story as “magical realism.” This is an inaccurate characterization. Magical realism is a mode of narrative in which one group or culture’s reality collides with another’s, and what seems entirely quotidian to one is perceived as magical by the other. That’s not what Colson Whitehead is doing, and, however lyrical, ethereal, and occasionally dreamlike Barry Jenkins’ visual rendering is, it’s not what the series is doing either. In truth, the literalization of the underground railroad is the least of Whitehead’s historical tweaks: The Underground Railroad is not magical realism, but nor is it alternative history (a genre that usually relies on a bifurcation in history’s progression, such as having Nazi sympathizer Charles Lindbergh win the presidency in 1940 in Philip Roth’s The Plot Against America). The Georgia plantation on which The Underground Railroad begins is familiar enough territory, scarcely distinguishable from similar depictions in Uncle Tom’s Cabin or 12 Years a Slave. But then, as Cora journeys from state to state, each state embodies a peculiar distillation of racism. Cora and Caesar first make it to South Carolina, which appears at first glance to be an enlightened and indeed almost utopian place: there is no slavery, and the white population seems dedicated to uplifting freed Blacks, from providing education and employment to scrupulous health care to good food and lodging. Soon, however, the paternalistic dimension of this altruism becomes more glaring, and Cora and Caesar realize that the free Blacks of South Carolina are being sterilized and unwittingly used in medical experimentation.

In North Carolina, by contrast, Blacks have been banished, and any found within the state borders are summarily executed. The white people of North Carolina declared slavery a blight—because it disenfranchised white workers. Since abolishing slavery and banishing or killing all of the Black people, the state has made the ostensible purity of whiteness a religious fetish. Cora spends her time in North Carolina huddled in the attic of a reluctant abolitionist and his even more reluctant wife in an episode that cannot help but allude to Anne Frank.

Whitehead’s vision—stunningly rendered in Jenkins’ adaptation—is less an alternative history than a deconstructive one. As Scott Woods argues, The Underground Railroad is not a history lesson, but a mirror, with none of “the finger-wagging of previous attempts to teach the horrors of slavery to mainstream audiences.” I think Woods is being polite here, using “mainstream” as a euphemism for “white,” and he tactfully does not observe that effectively all such finger-wagging attempts (cinematically, at any rate) have tended to come from white directors and feature white saviour protagonists to make liberal white audiences feel better about themselves.

There are no white saviours in The Underground Railroad; there is, in fact, very little in the way of salvation of any sort, just moments of relative safety. As I still have a few episodes to go, I can’t say how the series ends; the novel, however, ends ambivalently, with Cora having defeated the dogged slave-catcher who has been pursuing her from the start, but still without a clear sense of where she is going—the liberatory trajectory of the underground railroad is unclear and fraught because of the weight of the experience Cora has accrued over her young life. As she says at one point, “Make you wonder if there ain’t no real place to escape to … only places to run from.” There is no terminus, just endless flight.

When I say The Underground Railroad is a “deconstructive” history, I don’t use the term in the sense developed by Jacques Derrida (or at least, not entirely). Rather, I mean it in the more colloquial sense employed by, for example, chefs when they put, say, a “deconstructed crème brûlée” on the menu, which might be a smear of custard on the plate speared by a shard of roasted sugar and garnished with granitas infused with cinnamon and nutmeg. If the dish is successful, it is because it defamiliarizes a familiar dessert by breaking down its constituent ingredients in such a way as to make the diner appreciate and understand them anew—and, ideally, develop a more nuanced appreciation for a classic crème brûlée.

So—yes, odd analogy. But I’d argue that Whitehead’s novel is deconstructive in much the same manner, by taking a pervasive understanding of racism and the legacy of slavery and breaking it down into some of its constituent parts. In South Carolina, Cora and Caesar experience the perniciousness of white paternalism of the “white man’s burden” variety—the self-important concern for Black “uplift” that is still invested in the conviction of Black culture’s barbarism and inferiority, and which takes from this conviction license to violate Black bodies in the name of “science.”

Thuso Mbedu as Cora and William Jackson Harper as Royal.

Then in North Carolina, we see rendered starkly the assertion that Blacks are by definition not American, and are essentially seen as the equivalent of an invasive species. This, indeed, was the basis of the notorious 1857 Supreme Court ruling in Dred Scott v. Sandford, which asserted that Black people could not be citizens of the United States. Though that precedent was effectively voided by the thirteenth and fourteenth amendments—which abolished slavery and established citizenship for people of African descent born in America, respectively—the franchise was not fully extended to Black Americans until Lyndon Johnson signed the Voting Rights Act a century after the end of the Civil War. The delegitimization of Black voters continues: while the current Trumpian incarnation of the G.O.P. tacitly depicts anyone voting Democrat as illegitimate and not a “real American,” in practice, almost all of the legal challenges to the 2020 election result were directed at precincts with large numbers of Black voters.

Later in the novel when Cora finds her way to Indiana to live on a thriving Black-run farm, we see the neighbouring white community’s inability to countenance Black prosperity in such close proximity, especially when Black people flourishing reflects badly on their own failures. The pogrom that follows very specifically evokes the Tulsa Massacre of 1921, when a huge white mob essentially burned down a thriving Black part of town.

What’s important to note here is that Whitehead’s deconstructive process is less about history proper than about our pervasive depictions of history in popular culture, especially by way of fiction, film, and television. Or to be more accurate, it is about the mythologization of certain historical tropes and narratives pertaining to how we understand racism. One of the big reasons why so many (white) people are able to guilelessly[1] suggest that America is not a racist nation, or claim that the election of a Black president proves that the U.S. is post-racial, is because racism has come to be understood as a character flaw rather than a systemic set of overlapping cultural and political practices. Think of the ways in which Hollywood has narrated the arc of American history from slavery to the Civil War to the fight for civil rights, and try to name films that don’t feature virtuous white protagonists versus racist white villains. Glory, Mississippi Burning, Ghosts of Mississippi, A Time to Kill, Green Book, The Help[2]—and this one will make some people bristle—To Kill a Mockingbird. I could go on.

To be clear, I’m not saying some of these weren’t excellent films, several of which featured nuanced and textured Black characters[3] with considerable agency; but the point, as with all systemic issues, is not the individual examples, but the overall patterns. These films and novels flatter white audiences—we’re going to identify with Willem Dafoe’s earnest FBI agent in Mississippi Burning against the Klan-associated sheriff. “That would be me,” we[4] think, without considering how the FBI—at the time the film was set, no less!—was actively working to subvert the civil rights movement and shore up the societal structures subjugating and marginalizing Black Americans, because in this framing, racism is a personal choice, and therefore Dafoe’s character is not complicit in J. Edgar Hoover’s.

Gene Hackman and Willem Dafoe in Mississippi Burning (1988).

The white saviour tacitly absolves white audiences of complicity in racist systems in this way, by depicting racism as a failing of the individual. It allows us to indulge in the fantasy that we would ourselves be the white saviour: no matter what point in history we find ourselves, we would be the exception to the rule, resisting societal norms and pressures in order to be non-racists. Possibly the best cinematic rebuke to this fantasy came in 12 Years a Slave,[5] in the form of a liberal-minded plantation owner played by Benedict Cumberbatch, who recognizes the talents and intelligence of Solomon Northup (Chiwetel Ejiofor), a formerly free Black man who had been drugged and kidnapped by thugs and illicitly sold into slavery. Cumberbatch’s character looks for a brief time to be Solomon’s saviour, as he enthusiastically takes Solomon’s advice on a construction project over that of his foreman. But when the foreman violently retaliates against Solomon, Cumberbatch’s character cannot stand up to him. In a more typical Hollywood offering, we might have expected the enlightened white man to intervene; instead, he lacks the intestinal fortitude to act in a way that would have brought social disapprobation, and as a “solution” sells Solomon to a man who proves to be cruelly sociopathic.

Arguing for the unlikeliness that most people could step out of the roles shaped for them by social and cultural norms and pressures might seem like an apologia for historical racism—how can we have expected people to behave differently?—but really it’s about the resistance to seeing ourselves enmeshed in contemporary systemic racism.

Saying that that is Whitehead’s key theme would be reductive; there is so much more to the novel that I’m not getting to. There is, to be certain, a place—a hugely important place—for straightforward historical accounts of the realities and finer details of slavery, even the more “finger wagging” versions Scott Woods alludes to. But what both Whitehead’s novel and Barry Jenkins’ adaptation of it offer is a deconstruction of the simplistic binarism of decades of us vs. them, good vs. bad constructions of racism that give cover to all but the most unapologetically racist white people. The current backlash against “critical race theory”—which I’ll talk more about in the third of these three posts—proceeds to a great extent from its insistence on racism not as individual but systemic, as something baked into the American system.

Which, when you think about it, is not the outrageous argument conservatives make it out to be. Not even close: Africans brought to American shores, starting in 1619, were dehumanized, brutalized, subjected to every imaginable violence, and treated as subhuman property for almost 250 years. Their descendants were not given the full franchise as American citizens until the Civil Rights Act and the Voting Rights Act of 1964 and 1965, respectively. Not quite sixty years on from that point, it’s frankly somewhat baffling that anyone, with a straight face, can claim that the U.S. isn’t a racist nation. One of the greatest stumbling blocks to arriving at that understanding is how we’ve personalized racism as an individual failing. It shouldn’t be so difficult to recognize, as a white person, one’s tacit complicity in a long history without having to feel the full weight of disapprobation that the label “racist” has come to connote through the pop cultural mythologization of racism as a simple binary.

NOTES


[1] I want to distinguish here between those who more cynically play the game of racial politics, and those who genuinely do not see racism as systemic (granted that there is a grey area in between these groups). These are the people for whom having one or more Black friends is proof of their non-racist bona fides, and who honestly believe that racism was resolved by the signing of the Civil Rights Act and definitively abolished by Obama’s election.

[2] The Help, both the novel and the film, is perhaps one of the purest distillations of a white saviour story packaged in such a way as to flatter and comfort white audiences. Essentially, it is the story of an irrepressible young white woman (with red hair, of course) nicknamed Skeeter (played by Emma Stone) who chafes against the social conventions of 1963 Mississippi and dreams of being a writer and journalist. TL;DR: she ends up telling the stories of the “help,” the Black women working as domestic labour for wealthy families such as her own, publishes them—anonymously of course, though Skeeter is the credited author—and thus gets the traction she needs to leave Mississippi for a writing career.

I can’t get into all of the problems with this narrative in a footnote; hopefully I don’t need to enumerate them. But I will say that one of the key things that irked me about this story, both the novel and the movie, is how it constantly name-checks Harper Lee and To Kill a Mockingbird as Skeeter’s inspiration. Lee might have given us the archetype of the white saviour in the figure of Atticus Finch, but she did it at a moment in time (it was published in 1960) when the subject matter was dangerous (Mary Badham, who played Scout, found herself and her family shunned when they returned to Alabama after filming ended, for having participated in a film that espoused civil rights). By contrast, The Help, which was published in 2009 and released as a film in 2011, is about as safe a parable of racism as can be told in the present day—safely set in the most racist of southern states during the civil rights era, with satisfyingly vile racist villains and an endearing, attractive white protagonist whose own story of breaking gender taboos jockeys for pole position with her mission to give voice to “the help.”

[3] Though to be fair, in some instances this had as much to do with virtuoso performances by extremely talented actors, such as Denzel Washington, Morgan Freeman, and Andre Braugher in Glory. And not to heap yet more scorn on The Help, but the best thing that can be said about that film is that it gave Viola Davis and Octavia Spencer the visibility—and thus the future casting opportunities—that their talent deserves.

[4] I.e. white people.

[5] Not coincidentally helmed by a Black director, Steve McQueen (no, not that Steve McQueen, this Steve McQueen).


Summer Blogging and Finding Focus

Why do I write this blog? Well, it certainly isn’t because I have a huge audience—most of my posts top out at 40-60 views, and many garner far fewer than that. Every so often I get signal boosted when one or more people share a post. The most I’ve ever had was when a friend posted a link to one I wrote about The Wire and police militarization on Reddit, and I got somewhere in the neighbourhood of 1500 views.[1] Huge by my standards, minuscule by the internet’s.

Not that I’m complaining. I feel no compulsion to chase clicks, or to do the kind of networking on Twitter that seems increasingly necessary to building an online audience, which also entails strategically flattering some audiences and pissing off others. The topics I write about are eclectic and occasional, usually the product of a thought that crosses my mind and turns into a conversation with myself. My posts are frequently long and sometimes rambling, which is also not the best way to attract readers.

Blogging for me has always been something akin to thinking out loud—like writing in a journal, except in a slightly more formal manner, with the knowledge that, however scant my audience is, I’m still theoretically writing for other people, and so my thoughts have to be at least somewhat coherent. And every so often I get a hit of dopamine when someone shares a post or makes a complimentary comment.

I started my first blog when I moved to Newfoundland as a means of giving friends and family a window into my new life here, without subjecting them to the annoyance of periodic mass emails. I posted in An Ontarian in Newfoundland for eight years, from 2005 to 2013, during which time it went from being a digest of my experiences in Newfoundland to something more nebulous, in which I basically posted about whatever was on my mind. I transitioned to this blog with the thought that I would focus it more on professional considerations—using it as a test-space for scholarship I was working on, discussions about academic life, and considerations of things I was reading or watching. I did do that … but then also inevitably fell into the habit of posting about whatever was on my mind, often with long stretches of inactivity that sometimes lasted months.

During the pandemic, this blog has become something akin to self-care. I’ve written more consistently in this past year than I have since starting my first blog (though not nearly as prolifically as I posted in that first year), and it has frequently been a help in organizing what have become increasingly inchoate thoughts while enduring the nadir of Trump’s tenure and the quasi-isolation enforced by the pandemic. I won’t lie: it has been a difficult year, and wearing on my mental health. Sometimes putting a series of sentences together in a logical sequence to share with the world brought some order to the welter that has frequently been my mind.

As we approach the sixth month of the Biden presidency and I look forward to my first vaccination in a week, you’d think there would be a calming of the mental waters. And there has been, something helped by the more frequent good weather and more time spent outside. But even as we look to be emerging from the pandemic, there’s a lot still plaguing my peace of mind, from my dread certainty that we’re looking at the end of American democracy, to the fact that we’re facing huge budget cuts in health care and education here in Newfoundland.

The Venn diagram of the thoughts preoccupying my mind has a lot of overlaps, which contributes to the confusion. There are so many points of connection: the culture war, which irks me with all of its unnuanced (mis)understandings of postmodernism, Marxism, and critical race theory; the sustained attack on the humanities, which proceeds to a large degree from the misperception that it’s all about “woke” indoctrination; the ways in which cruelty has become the raison d’être of the new Right; the legacy of the 1990s “peace dividend,” the putative “end of history,” and post-9/11 governance leading us to the present impasse; and, on a more hopeful note, how a new humanism practiced with humility might be a means to redress some of our current problems.

For about three or four weeks I’ve been spending part of my days scribbling endless notes, trying to bring these inchoate preoccupations into some semblance of order. Reading this, you might think that my best route would be to unplug and refocus; except that this has actually been energizing. It helps in a way that there is significant overlap with a handful of articles I’m working on, about (variously) nostalgia and apocalypse, humanism and pragmatism, the transformations of fantasy as a genre, and the figuration of the “end of history” in Philip Roth’s The Plot Against America and contemporary Trumpist figurations of masculinity.

(Yes, that’s a lot. I’m hoping, realistically, to get one completed article out of all that, possibly two.)

With all the writing I’ve been doing, it has been unclear—except for the scholarly stuff—how best to present it. I’ve been toying with the idea of writing a short book titled The Idiot’s[2] Guide to Postmodernism, which wouldn’t be an academic text but more of a user manual to the current distortions of the culture wars, with the almost certainly vain idea of reintroducing nuance into the discussion. That would be fun, but in the meantime I think I’ll be breaking it down into a series of blog posts.

Some of the things you can expect to see over the next while:

  • A three-part set of posts (coming shortly) on history, memory, and forgetting.
  • A deep dive into postmodernism—what it was, what it is, and why almost everyone bloviating about it and blaming it for all our current ills has no idea what they’re talking about.
  • A handful of posts about cruelty.
  • “Jung America”—a series of posts drawing a line from the “crisis of masculinity” of the 1990s to the current state of affairs with Trumpism and the likes of Jordan Peterson and Ben Shapiro.
  • At least one discussion about the current state of the humanities in the academy, as well as an apologia arguing why the humanities are as important and relevant now as they have ever been.

Phew. Knowing me, I might get halfway through this list, but we’ll see. Meantime, stay tuned.

NOTES


[1] David Simon also left a complimentary comment on that one. Without a doubt, the highlight of my blogging career.

[2] Specifically, Jordan Peterson, but there are others who could use a primer to get their facts straight.

Filed under blog business, Uncategorized

My Mostly Unscientific Take on UFOs

Over the past year or so, it has seemed as though whatever shadowy Deep State agencies are responsible for covering up the existence of extraterrestrials have thrown up their hands and said “Yeah. Whatever.”

Perhaps the real-world equivalent of The X-Files’ Smoking Man finally succumbed to lung cancer, and all his subordinates just couldn’t be bothered to do their jobs any more.

Or perhaps the noise of the Trump presidency created the circumstances in which a tacit acknowledgement of numerous UFO sightings wouldn’t seem to be bizarre or world-changing.

One way or another, the rather remarkable number of declassified videos from fighter pilots’ heads-up displays showing unidentified flying objects of odd shapes and flight capabilities has evoked an equally remarkable blasé response. It’s as if the past four years of Trump, natural disasters, civil tragedies, and a once-in-a-century (touch wood) pandemic have so eroded our capacity for surprise that, collectively, we seem to be saying, “Aliens? Bring it.” Not even the QAnon hordes, for whom no event or detail is too unrelated to be folded into the grand conspiracy, have seen fit to comment on something that has so long been a favourite subject of conspiracists (“Aliens? But are they pedophile child sex-trafficking aliens?”).

Perhaps we’re all just a bit embarrassed at the prospect of alien contact, like having a posh and sophisticated acquaintance drop by when your place is an utter pigsty. I have to imagine that, even if the aliens are benevolent and peaceful, humanity would be subjected to a stern and humiliating talking-to about how we let our planet get to the state it’s in.

“I’m sorry to have to tell you, sir, that your polar icecaps are below regulation size for a planet of this category, sir.” (Good Omens)

Not to mention that if they landed pretty much anywhere in the U.S., they’d almost certainly get shot at.

And imagine if they’d landed a year ago.

“Take us to your leader!”
“Um … are you sure? Perhaps you should try another country.”
“All right, how do I get to Great Britain?”
“Ooh … no. You really don’t want that.”
“Russia then? China? India? Hungary?”
“Uh, no, no, no, and no.”
“Brazil?”
“A world of nope.”
“Wait–what’s the one with the guy with good hair?”
“Canada. But, yeah … probably don’t want to go there either. Maybe … try Germany?”
“Wasn’t that the Hitler country?”
“They got better.”

You’d think there would be more demand for the U.S. government to say more about these UFO sightings. The thing is, I’m sure that in some sections of the internet there is an ongoing, full-throated yawp for exactly that, but it hasn’t punctured the collective consciousness. And frankly, I don’t care enough to go looking for it.

It is weird, however, considering how we’ve always assumed that the existence of extraterrestrial life would fundamentally change humanity, throwing religious belief into crisis and dramatically transforming our existential outlook. The entire premise of Star Trek’s imagined future is that humanity’s first contact with the Vulcans forced a dramatic reset of our sense of self and others—a newly galactic perspective that rendered all our internecine tribal and cultural squabbles irrelevant, essentially at a stroke resolving Earth’s conflicts.

To be certain, there hasn’t been anything approaching definitive proof of alien life, so such epiphany or trauma lies only in a possible future. Those who speak with any authority on the matter are always careful to point out that “UFO” is not synonymous with “alien”—they’re not necessarily otherworldly, just unidentified.

I, for one, am deeply skeptical that these UFOs are of extraterrestrial origin—not because I think it’s impossible, just that the chances are infinitesimal. In answer to the question of whether I think there’s life on other planets, my answer is an emphatic yes, which is something I base on the law of large numbers. The Milky Way galaxy, by current estimates, contains somewhere in the neighbourhood of 100 billion planets. Even if only one tenth of one percent of those can sustain life, that’s still 100 million planets, and that in just one of the hundreds of billions of galaxies in the universe.

But then there’s the question of intelligence, and what comprises intelligent life. We have an understandably chauvinistic understanding of intelligence, one largely rooted in the capacity for abstract thought, communication, and inventiveness. We grant that dolphins and whales are intelligent creatures, but have very little means of quantifying that; we learn more and more about the intelligence of cephalopods like octopi, but again: such intelligences are literally alien to our own. The history of imagining alien encounters in SF has framed alien intelligence as akin to our own, just more advanced—developing along the same trajectory until interplanetary travel becomes a possibility. Dolphins might well be, by some metric we haven’t yet envisioned, far more intelligent than us, but they’ll never build a rocket—in part because, well, why would they want to? As Douglas Adams put it in The Hitchhiker’s Guide to the Galaxy, “man had always assumed that he was more intelligent than dolphins because he had achieved so much—the wheel, New York, wars and so on—whilst all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man—for precisely the same reasons.”

To put it another way, to properly imagine space-faring aliens, we have to imagine not so much what circumstances would lead to the development of space travel as how an alien species would arrive at an understanding of the universe that would facilitate the very idea of space travel.

Consider the thought experiment offered by Hans Blumenberg in the introduction of his book The Genesis of the Copernican World. Blumenberg points out that our atmosphere has a perfect density, “just thick enough to enable us to breathe and to prevent us from being burned up by cosmic rays, while, on the other hand, it is not so opaque as to absorb entirely the light of the stars and block any view of the universe.” This happy medium, he observes, is “a fragile balance between the indispensable and the sublime.” The ability to see the stars in the night sky, he says, has shaped humanity’s understanding of itself in relation to the cosmos, from our earliest rudimentary myths and models, to the Ptolemaic system that put us at the center of Creation and gave rise to the medieval music of the spheres, to our present-day forays in astrophysics. We’ve made the stars our oracles, our gods, and our navigational guides, and it was in this last capacity that the failings of the Ptolemaic model inspired a reclusive Polish astronomer named Mikołaj Kopernik, whom we now know as Copernicus.

But what, Blumenberg asks, if our atmosphere were too thick to see the stars? How then would humanity have developed its understanding of its place in the cosmos? And indeed, of our own world—without celestial navigation, how does seafaring evolve? How much longer before we understood that there was a cosmos, or grasped the movement of the earth without the motion of the stars? There would always, of course, be the sun, but it was always the stars, first and foremost, that inspired the celestial imagination. It is not too difficult to imagine an intelligent alien species inhabiting a world such as ours, with similar capabilities, but without the inspiration of the night sky to propel them from the surface of their planet.[1]

Now think of a planet of intelligent aquatic aliens, or of creatures swimming deep in the dense atmosphere of a gas giant.

Or consider the possibility that our vaunted intelligence is in fact an evolutionary death sentence, and that this is the case for any species such as ourselves—that our development of technology, our proliferation across the globe, and our environmental depredations inevitably outstrip our primate brains’ capacity to reverse the worst effects of our evolution.

Perhaps what we’ve been seeing is evidence of aliens who have mastered faster-than-light or transdimensional travel, but they’re biding their time—having learned the dangers of intelligence themselves, they’re waiting to see whether we succeed in not eradicating ourselves with nuclear weapons or environmental catastrophe; perhaps their rule for First Contact is to make certain a species such as homo sapiens can get its shit together and resolve all the disasters we’ve set in motion. Perhaps their Prime Directive is not to help us, because they’ve learned in the past that unless we can figure it out for our own damned selves, we’ll never learn.

In the words of the late great comedian Bill Hicks, “Please don’t shoot at the aliens. They might be here for me.”

EDIT: Stephanie read this post and complained that I hadn’t worked in Monty Python’s Galaxy Song, so here it is:

NOTES


[1] And indeed, Douglas Adams did imagine such a species in Life, the Universe and Everything—an alien race on a planet surrounded by a dust cloud who live in utopian peace and harmony in the belief that they are the sum total of creation, until the day a spaceship crash-lands on their planet and shatters their illusion. At which point, on reverse-engineering the spacecraft and flying beyond the dust cloud to behold the splendours of the universe, they decide there’s nothing else for it but to destroy it all.

Filed under maunderings, Uncategorized

Liz Cheney is as Constant as the Northern Star …

… and I don’t particularly mean that as a compliment.

Literally minutes before he is stabbed to death by a posse of conspiring senators, Shakespeare’s Julius Caesar declares himself to be the lone unshakeable, unmoving, stalwart man among his flip-flopping compatriots. He makes this claim as he arrogantly dismisses the petition of Metellus Cimber, who pleads for the reversal of his brother’s banishment. Cimber’s fellow conspirators echo his plea, prostrating themselves before Caesar, who finally declares in disgust,

I could be well moved if I were as you.
If I could pray to move, prayers would move me.
But I am constant as the northern star,
Of whose true-fixed and resting quality
There is no fellow in the firmament.
The skies are painted with unnumbered sparks.
They are all fire and every one doth shine,
But there’s but one in all doth hold his place.
So in the world. ‘Tis furnished well with men,
And men are flesh and blood, and apprehensive,
Yet in the number I do know but one
That unassailable holds on his rank,
Unshaked of motion. And that I am he.

Caesar mistakes the senators’ begging for weakness, not grasping that they are importuning him as a ploy to get close enough to stab him until it is too late.

Fear not, I’m not comparing Liz Cheney to Julius Caesar. I suppose you could argue that Cheney’s current anti-Trump stance is akin to Caesar’s sanctimonious declaration if you wanted to suggest that it’s more performative than principled. To be clear, I’m not making that argument—not because I don’t see its possible merits, but because I really don’t care.

I come not to praise Liz Cheney, whose political beliefs I find vile; nor do I come to bury her. The latter I’ll leave to her erstwhile comrades, and I confess I will watch the proceedings with a big metaphorical bowl of popcorn in my lap, for I will be a gratified observer no matter what the outcome. If the Trumpists succeed in burying her, well, I’m not about to mourn a torture apologist whose politics have always perfectly aligned with those of her father. If she soldiers on and continues to embarrass Trump’s sycophants by telling the truth, that also works for me.

Either way, I’m not about to offer encomiums for Cheney’s courage. I do think it’s admirable that she’s sticking to her guns, but as Adam Serwer recently pointed out in The Atlantic, “the [GOP’s] rejection of the rule of law is also an extension of a political logic that Cheney herself has cultivated for years.” During Obama’s tenure, she frequently went on Fox News to accuse the president of being sympathetic to jihadists, and just as frequently opined that American Muslims were a national security threat. During her run for a Wyoming Senate seat in 2014, she threw her lesbian sister Mary under the bus with her loud opposition to same-sex marriage, a point on which she stands to the right of her father. And, not to repeat myself, but she remains an enthusiastic advocate of torture. To say nothing of the fact that, up until the January 6th assault on the Capitol, she was a reliable purveyor of the Trump agenda, celebrated then by such current critics as Steve Scalise and Matt Gaetz.

Serwer notes that the Cheneys’ “political logic”—the logic of the War on Terror—is consonant with that of Trumpism not so much in policy as in spirit: the premise that there’s them and us, and that “The Enemy has no rights, and anyone who imagines otherwise, let alone seeks to uphold them, is also The Enemy.” In the Bush years, this meant the Manichaean opposition between America and Terrorism, and that any ameliorating sentiment about, say, the inequities of American foreign policy, meant you were With the Terrorists. In the present moment, the Enemy of the Trumpists is everyone who isn’t wholly on board with Trump. The ongoing promulgation of the Big Lie—that Biden didn’t actually win the election—is a variation on the theme of “the Enemy has no rights,” which is to say, that anyone who does not vote for Trump or his people is an illegitimate voter. Serwer writes:

This is the logic of the War on Terror, and also the logic of the party of Trump. As George W. Bush famously put it, “You are either with us or with the terrorists.” You are Real Americans or The Enemy. And if you are The Enemy, you have no rights. As Spencer Ackerman writes in his forthcoming book, Reign of Terror, the politics of endless war inevitably gives way to this authoritarian logic. Cheney now finds herself on the wrong side of a line she spent much of her political career enforcing.

All of which is by way of saying: Liz Cheney has made her bed. The fact that she’s chosen the hill of democracy to die on is a good thing, but this brings us back to my Julius Caesar allusion. The frustration being expressed by her Republican detractors, especially House Minority Leader Kevin McCarthy, is at least partially rational: she’s supposed to be a party leader, and in so vocally rejecting the party line, she’s not doing her actual job. She is being as constant as the Northern Star here, and those of us addicted to following American politics are being treated to a slow-motion assassination on the Senate (well, actually the House) floor.

But it is that constancy that is most telling in this moment. Cheney is anchored in her father’s neoconservative convictions, and in that respect, she’s something of a relic—an echo of the Bush years. As Serwer notes, however, while the conventional wisdom says Trump effectively swept aside the Bush-Cheney legacy in his rise to become the Republican presidential candidate, his candidacy and then presidency only deepened the bellicosity of Bush’s Us v. Them ethos, in which They are always already illegitimate. It’s just that now the Them is anyone opposed to Trump.

In the present moment, I think it’s useful to think of Liz Cheney as an unmoving point in the Republican firmament: to remember that her politics are as toxic and cruel as her father’s, and that there is little to no daylight between them. The fact that she is almost certainly going to lose both her leadership position and a primary in the next election to a Trump loyalist is not a sign that she has changed. No: she is as constant as the Northern Star, and the Trump-addled GOP has moved around her. She has not become more virtuous; her party has just become so very much more debased.

Filed under maunderings, The Trump Era, Uncategorized, wingnuttery

Of Course There’s a Deep State. It’s Just Not What the Wingnuts Think it is.

There is a moment early in the film The Death of Stalin in which, as the titular dictator lies dying, the circle of Soviet officials just beneath Stalin (Khrushchev, Beria, Malenkov) panics at the prospect of finding a reputable doctor to treat him. Why? Because a few years earlier, Stalin, in a fit of characteristic paranoia, had become convinced that doctors were conspiring against him, and he had many of them arrested, tortured, and killed.

I thought of this cinematic moment—the very definition of gallows humour—while reading an article by Peter Wehner in The Atlantic observing that part of the appeal of QAnon (the number of whose adherents has, counter-intuitively perhaps, swelled since Biden’s election) lies precisely in its many disparate components. “I’m not saying I believe everything about Q,” the article quotes one Q follower as saying. “I’m not saying that the JFK-Jr.-is-alive stuff is real, but the deep-state pedophile ring is real.”

As [Sarah Longwell, publisher of The Bulwark] explained it to me, Trump supporters already believed that a “deep state”—an alleged secret network of nonelected government officials, a kind of hidden government within the legitimately elected government—has been working against Trump since before he was elected. “That’s already baked into the narrative,” she said. So it’s relatively easy for them to make the jump from believing that the deep state was behind the “Russia hoax” to thinking that in 2016 Hillary Clinton was involved in a child-sex-trafficking ring operating out of a Washington, D.C., pizza restaurant.

If you’ll recall, the “Deep State” bogeyman was central to Steve Bannon’s rhetoric during his tenure early in the Trump Administration, alongside his antipathy to globalism. The two, indeed, were in his figuration allied to the point of being inextricable, which is also one of the key premises underlying the QAnon conspiracy. And throughout the Trump Administration, especially during his two impeachments and the Mueller investigation, the spectre of the Deep State was constantly blamed as the shadowy, malevolent force behind any and all attempts to bring down Donald Trump (and was, of course, behind the putative fraud that handed Joe Biden the election).

Now, precisely why this article made me think of this moment in The Death of Stalin is a product of my own weird stream of consciousness, so bear with me: while I’ve always found Bannon & co.’s conspiracist depiction of the Deep State more than a little absurd, so too I’ve had to shake my head whenever any of Trump’s detractors and critics declare that there’s no such thing as a Deep State.

Because of course there’s a deep state, just one that doesn’t merit ominous capitalization. It also doesn’t merit the name “deep state,” but let’s just stick with that now for the sake of argument. All we’re really talking about here is the vast and complex bureaucracy that sustains any sizable human endeavour—universities to corporations to government. And when we’re talking about the government of a country as large as the United States, that bureaucracy is massive. The U.S. government employs over two million people, the vast majority of them civil servants working innocuous jobs that make the country run. Without them, nothing would ever get done.

Probably the best piece of advice I ever received as a university student was in my very first year of undergrad; a T.A. told me to never ask a professor about anything like degree requirements or course-drop deadlines, or, really, anything to do with the administrative dimension of being a student. Ask the departmental secretaries, he said. In fact, he added, do your best to cultivate their respect and affection. Never talk down to them or treat them as the help. They may not have a cluster of letters after their name or grade your papers, but they make the university run.

I’d like to think that I’m not the sort of person who would ever be the kind of asshole to berate secretaries or support staff, but I took my T.A.’s advice to heart, and went out of my way to be friendly, to express gratitude, and to be apologetic when I brought them a problem. It wasn’t long before I was greeted with smiles whenever I had paperwork that needed processing, and I never had any issues getting into courses (by contrast, in my thirty years in academia—from undergrad to grad student to professor—I have seen many people, students and faculty alike, suffer indignities of mysterious provenance because they were condescending or disrespectful to support staff).

The point here is that, for all the negative connotations that attach to bureaucracy, it is an engine necessary for any institution or nation to run. Can it become bloated and sclerotic? Of course, though in my experience that tends to happen when one expands the ranks of upper management. But when Steve Bannon declared, in the early days of the Trump Administration, that his aim was “the deconstruction of the administrative state,” I felt a keen sense of cognitive dissonance in that statement—for the simple reason that there is no such thing as a non-administrative state.

Which brings us back, albeit circuitously, to The Death of Stalin. There is no greater example of a sclerotic and constipated bureaucracy than that of the former Soviet Union, a point not infrequently made in libertarian and anti-statist arguments for small government. But I think the question that rarely gets raised when addressing dysfunctional bureaucracy—at least in the abstract—is why is it dysfunctional? There are probably any number of reasons why that question doesn’t come up, but I have to imagine that a big one is because we’ve been conditioned to think of bureaucracy as inevitably dysfunctional—a sense reinforced by every negative encounter experienced when renewing a driver’s license, waiting on hold with your bank, filing taxes, dealing with governmental red tape, or figuring out what prescriptions are covered by your employee health plan. But a second question we should ask when having such negative experiences is: are they negative because of an excess of bureaucracy, or too little? The inability of Stalin’s minions to find a competent doctor is a profound metaphor for what happens when we strip out the redundancies in a given system—in this case, the state-sponsored murder of thousands of doctors because of a dictator’s paranoia, such that one is left with (at best) mediocre medical professionals too terrified of state retribution to be dispassionately clinical, which is of course what one needs from a doctor.

I’m not a student of the history of the U.S.S.R., so I have no idea if anyone has written about whether the ineptitude of the Soviet bureaucracy was a legacy of Stalinist terror and subsequent Party orthodoxy, in which actually competent people were marginalized, violently or otherwise; I have to assume there’s probably a lot of literature on the topic (certainly, Masha Gessen’s critical review of the HBO series Chernobyl has something to say on the subject). But there’s something of an irony in the fact that Republican administrations since Ronald Reagan’s have created their own versions of The Death of Stalin’s doctor problem through their evisceration of government. Reagan famously said that the nine most frightening words were “I’m from the government, and I’m here to help,” and since then conservative governments—in the U.S., Canada, and elsewhere—have worked hard to make that a self-fulfilling prophecy. Thomas Frank, author of What’s the Matter With Kansas? (2004), has chronicled this tendency, in which Republican distrust of government tends to translate into the rampant gutting of social services and governmental agencies, from the Post Office to the various cabinet departments, which then dramatically denudes the government’s ability to do anything. All of the failures that then inevitably occur are held up as proof of the basic premise that government cannot get anything right (and that therefore its basic services should be outsourced to the private sector).

In my brief moments of hope I wonder if perhaps the Trump Administration’s explicit practice of putting hacks and incompetent loyalists in key positions (such as Jared Kushner’s bizarrely massive portfolio) made this longstanding Republican exercise too glaring to ignore or excuse. Certainly, the contrast between Trump’s band of lickspittles and Biden’s army of sober professionals is about the most glaring difference we’ve seen between administrations, ever. What I hope we’re seeing, at any rate, is the reconstruction of the administrative state.

And it’s worth noting that Dr. Anthony Fauci has been resurrected from Trump’s symbolic purge of the doctors.

Filed under maunderings, The Trump Era, wingnuttery

A Few Things

I haven’t blogged in a while, which is partially to do with the fact that I’d more or less forgotten the whole reason I started these “A Few Things” posts in the first place—namely, that at any given time I have a handful of unfinished posts on a variety of topics, which don’t see the light of day because I have difficulty finishing them to my satisfaction. So I’ve had to remind myself that not every one of my blogworthy thoughts needs two or three thousand words; a quick(ish) precis will often serve.

With that being said, here are a few things that were in the hopper …

Vaccine Envy and the Spectacular Vindication of Max Brooks     Way back when, shortly after the earth cooled (or so it feels), Canadians could and did feel somewhat smug about our response to the pandemic in comparison to the U.S. … generally speaking, Canadians did not fight the quarantine restrictions, and we watched as the Trump Administration flailed about getting everything exactly wrong, and we took pride in our system of free health care and our much lower infection rates. This was a time when even the much-loathed Doug Ford seemed to step up to the challenge, garnering grudging nods of respect from people like me for his no-nonsense response to COVID (more on this below).

Well, as they say, that was then … it’s not as though the U.S. is any less politically polarized, but the Biden Administration seems determined to remind the world what America can do when competent leadership directing competent experts puts the system into overdrive and expends massive resources to solve a problem. Back in those early days of Canadian smugness, author Max Brooks was much in demand on podcasts; Brooks, who happens to be the son of Mel Brooks, is most famous for his novel World War Z: An Oral History of the Zombie War, which has become one of the classics of the zombie apocalypse genre. Though the novel is global in its scope—chronicling how a zombie plague would play out worldwide—the narrative spine deals with the American response. The TL;DR is that the U.S. is caught on its heels through a combination of cynical politics, a complacent and apathetic populace, and self-interested capitalists, but that once the people come to understand the enormity and gravity of the threat, they come together and rediscover the value of sacrifice, hard work, and community, and ultimately stage the most effective response in the world.

The novel is a quite explicit love letter to the Greatest Generation and the New Deal era. Max Brooks is himself quite clear on this point in interviews, talking about how his parents were both survivors of the Great Depression, and were schooled by the Second World War (Mel Brooks was in the Army Engineer Corps and was responsible for disarming land mines). In those interviews he gave in the early days of the pandemic, he talked about how many of the nations that had responded well, such as South Korea and Taiwan, did so because they lived under fairly constant threat, and so were the most primed to respond quickly. The U.S., he said, takes time to (a) become cognizant of a threat and (b) get its shit together, but (c) it always makes up for early stumbles and becomes a world leader. Many of those interviewing him, in those early days of the Trump pandemic fecklessness, voiced skepticism.

But, well, it looks as though Brooks’ perspective has been borne out, especially considering that acute vaccine envy I seem to experience daily when my American friends post their vax selfies.

The Limits of Bullying     To return to the topic of Doug Ford …

About a year ago I started writing a blog post about how, though there is little more I loathe in this world than a bully, sometimes in the right context a bully is what you need. I was writing this, as might be obvious, apropos of the grudging respect being given Doug Ford in Ontario and the outsized adulation lavished on Governor Andrew Cuomo of New York. Sometimes in moments of crisis, I mused, having someone you might otherwise dislike for their braggadocio lay down the law offered a strange form of comfort.

It should go without saying that I am now extremely happy I did not write that post.

Though to be fair, it proved to be something of a non-starter. I had written maybe two paragraphs when the obvious counter-argument made me trail off—that is, that though Ford and Cuomo seemed to be doing an effective job in those early days, they were hugely outnumbered by leaders who did not need bellicosity to get the job done (not coincidentally, many of these leaders, like Angela Merkel and Jacinda Ardern, are women).

It has now been over a year since the coronavirus upended the world, and there are few examples of early effective responses that have not met reversals—though few more spectacular than Cuomo and Ford. Cuomo’s example is a good object lesson in the fact that being a bully and an asshole is only effective if you can deliver the goods; as we have learned in the past weeks, he wasn’t delivering the goods so much as obscuring his failures, and once the double-whammy of his COVID missteps and the critical mass of women he has harassed became clear, there weren’t many people left who had his back. Turns out, if you spend your career being an asshole, you accrue a lot of people who are more than willing to stick the knife in once your fortunes change.

Doug Ford, on the other hand, is a very different case. Whatever else you might say about Andrew Cuomo, he’s not an utter moron. Ford’s problem isn’t so much that he’s an asshole and a bully, it’s that he’s a monstrously stupid asshole and bully. He’s so obviously in over his head that I’d feel sorry for him if he weren’t so contemptible. His appeal has always been the same species as Trump’s, which is that a segment of the population who feel victimized by “elites,” by the mandarins of the Liberal Party and the CBC, and by increasingly diverse and vocal Ontarians, elected him specifically to be their bully … which is not a task that requires much in the way of tactical shrewdness or intellectual depth, just the ability to infuriate the Left and deliver arrogant verbal smackdowns in press conferences.

There’s an irony in the fact that Ford’s appeal lies at least in part in the truism that the most satisfying way to deal with a bully is to sic a bigger and badder bully on them—but COVID-19 is also a bully, and doesn’t discriminate.

Biden Departs Afghanistan     I may return to this in a longer post at a later date, as it’s something I’ve been thinking a great deal about. After twenty years, American troops in Afghanistan will finally be brought home. The announcement evoked the predictable storm in media and social media, with some celebrating Biden’s decision, some expressing ambivalence, and many calling it disastrous.

Biden responded with an eminently sensible question of his own: if not now, when? Unless the U.S. is going to commit to a permanent presence in Afghanistan as the necessary price of stabilizing the country, there’s no other withdrawal timeline that makes sense. What’s somewhat galling about the castigations of Biden’s announced withdrawal is that a good number of his critics almost certainly do tacitly endorse a permanent occupation … but of course won’t say as much because such an admission would be politically toxic. The American presence in Afghanistan has always been a little like the weird existential state of being a smoker—with only one or two notable exceptions, every single smoker I have ever known in my life indulged in the habit on the assumption that they were going to quit, of course … someday. The American presence in Afghanistan was always predicated on it eventually ending. There were, of course, end-conditions: destroying Al-Qaeda in the country, building a self-sufficient, competent Afghan defense force, and solidifying a non-corrupt democratic government, for example. Check that first box, but the other two are as unlikely today as they have been for twenty years.

One thing the announcement of the withdrawal has done is make me mentally revisit those early years of the Bush Administration—the shock and trauma of 9/11, the quasi-hopeful aftermath when the world rallied behind the U.S., the prospect that the targeted, multilateral incursion into Afghanistan would eliminate Al-Qaeda and bin Laden, and that would be that.

It was a nice thought. But no: Bush’s neoconservative brain trust declared the War on Terror, rolled back civil rights with the Patriot Act, and instead of finishing off bin Laden at Tora Bora in December 2001, let him escape as they turned their focus to Saddam Hussein and Iraq.

There are two things that stick in my mind as I read the various excoriations of Biden for leaving Afghanistan. One is that a war that dragged on for a protracted twenty years originally had an extremely limited scope, and was meant to end upon achieving the specific goal of killing or capturing bin Laden. The other is that the precipitating event that started the war was the result of an avoidable intelligence failure, one that occurred in part because the Bush team were dismissive of the warnings the Clinton Administration left for them, and in part because of breakdowns between warring fiefdoms in the C.I.A. and F.B.I. (a breakdown meticulously chronicled in Lawrence Wright’s excellent book The Looming Tower).

Even with these issues within the intelligence community, there were numerous red flags raised, to the point where CIA director George Tenet, interviewed by Bob Woodward, recalled musing immediately after the attacks, “I wonder if it has anything to do with this guy taking pilot training.” And let us not forget the notorious memo George W. Bush was given, titled “Bin Laden Determined to Strike in U.S.”

The point here is that 9/11—the entire reason for the war in Afghanistan—had been preventable. An emboldened Taliban and reconstituted Al-Qaeda potentially pose the same threat as they did in the late 1990s, which puts the onus on the intelligence community to fix the problems it developed back then.

Filed under A Few Things, The Biden Presidency, The Trump Era

Catastrophe as MacGuffin, or, Were the Zombies Trying to Warn Us About Trump This Whole Time?

As I mentioned in my last post, we’re currently watching The Stand, a mini-series adaptation of Stephen King’s 1978 novel. Though both the novel and the series are quite graphic in their depictions of the weaponized superflu that wipes out most of the world—and quite clear on just how the pathogen escaped its experimental military facility—ultimately, the flu itself is ancillary to the substance of the story, which is about a showdown between the forces of good and those of evil.

As I enter the last two weeks of classes for the term and reflect back on the texts we’ve done in my graduate seminar on 21st-century post-apocalyptic narratives, some of which have overlapped with my Utopias & Dystopias course and the course I taught last term on pandemic fiction, I’m struck by how often the catastrophe precipitating these stories is simply a device to clear the decks for what comes next. I really have no reason to be so struck by this fact, given that I titled my grad course “The Spectre of Catastrophe” specifically because it focuses on narratives preoccupied less with the catastrophe itself than with the aftermath. But it occurs to me that when the catastrophe—be it a viral outbreak, asteroid strike, alien invasion, or whatever—is the focus of the story, it’s usually because it will be resolved by the end. It is, in such instances, the focus of the action, not the setup for the action.

By contrast, many of the post-apocalyptic narratives I’ve been looking at this year often go out of their way to be vague about the nature of the precipitating catastrophe. Your average zombie apocalypse has little to say about what caused the dead to rise—few offer even as much exposition as the prologue of 28 Days Later, in which animal-rights activists storm a lab and inadvertently free chimps infected with the Rage virus, or the vague suggestion in Night of the Living Dead that zombies are the result of radiation from a satellite. Shaun of the Dead offers the most perfect parody of this tendency when, at the end, again safely ensconced on his couch, Shaun flips through TV channels reporting on the aftermath of the zombie plague and changes the channel before anyone can offer an explanation for how it happened.

Even arguably the bleakest post-apocalyptic narrative of the past twenty years—Cormac McCarthy’s The Road—deliberately frustrates readers keen to know how the world of the novel ended up a blasted wasteland. So too Emily St. John Mandel’s far more hopeful Station Eleven is conspicuously uninterested in the details of the “Georgia Flu” that devastates the world. The comparable pandemic in Kevin Brockmeier’s The Brief History of the Dead is less significant for how it wipes out the world’s population than for the simple fact that it does, which allows for the rather intriguing vision of the afterlife on which the novel is premised. And even all of the gross, granular detail with which Stephen King endows “Captain Trips,” the superflu that wipes out 99.9% of America (and presumably the world, but that’s a speculation for a future post), is ultimately a bit of misdirection as the novel then settles into its aforementioned epic battle between Light and Dark.

In all of these examples, catastrophe plays the narrative role of what film nerds like me call a “MacGuffin,” a concept most associated with Alfred Hitchcock. A MacGuffin, Hitchcock said, is something the characters find important, but the audience doesn’t care about—something that precipitates the action, but is ultimately ancillary to it, like the Maltese falcon of The Maltese Falcon, a putatively priceless statuette that drives the plot, but which is never found; or, more specific to Hitchcock, the money Marion Crane (Janet Leigh) steals from her employer in Psycho, but which is less important than the fact that in fleeing her crime she ends up at the Bates Motel.

Catastrophe has become one of our more prevalent MacGuffins: how the world ends is more or less incidental to what comes afterward. There are of course exceptions to this rule; this post comes about in part from my notes for my grad class on Monday, in which we’re finishing Ling Ma’s novel Severance and starting Mandel’s Station Eleven. The latter, as I mention above, is exemplary of this tendency to use catastrophe as a plot device; Severance, by contrast, is far more interested in the particulars of the pandemic that collapses civilization, mainly because the particulars of “Shen Fever” are tied closely to the novel’s themes of nostalgia and home. The disease, which reduces people to mindless automatons repeating rituals from their former lives, takes hold most readily among those given to nostalgia. Candace Chen, the novel’s protagonist, came to the U.S. at a young age when her parents immigrated from China; in the novel’s present, both of her parents have died, and she feels equally not at home in New York City and in her home region of China, which she visits on business trips. Feeling generally rootless and untethered makes Candace ironically immune to the disease.

This thematic connection between the catastrophic pandemic and Candace’s situation—which in the novel is also more broadly representative of the millennial experience of late capitalism—makes the cause and particulars of the catastrophe central to the novel, in that both the characters and the audience care about it; this also makes Severance an instructive outlier in a burgeoning sub-genre full of catastrophic MacGuffins.

A large part of the reason for this relates to one of my overarching arguments about contemporary post-apocalyptic narratives, which is also one of the premises of my course: namely, that the preoccupation with the aftermath of catastrophe is indicative of a breakdown of faith and trust in government—not a new phenomenon by any means, but something that became supercharged by the Bush Administration’s post-9/11 failures in Iraq, the debacle of Hurricane Katrina, and the increasing polarization of politics and culture, all culminating in the election of Donald Trump. Trump was and is a catastrophic figure, and while I mean that quite literally, it’s important to keep in mind that being symbolically catastrophic—i.e. being seen by his followers as a bomb that would demolish “the Establishment”—was a huge part of his appeal. To many people, Trump is an expression of what I’ve been terming “hopeful nihilism,” which is also an animating factor in many post-apocalyptic narratives. Hopeful nihilism is the flip side of what Lauren Berlant terms “cruel optimism,” a condition in which the object of your desire is actually an obstacle to your flourishing; hopeful nihilism is the belief that burning down and destroying the present system clears the decks for a freer and more authentic existence in the wreckage.

Hence the general lack of interest in the nature of the catastrophe in these stories: the important thing isn’t the why and how, but the simple fact of civilization’s end. The catastrophe is a MacGuffin; the important thing is what happens next, and how the characters negotiate the circumstances. It may be that in time we come to see Trump in similar terms, which would actually go a long way towards explaining how a fatuous, preening New York billionaire became the symbol of defiance for a rump of resolutely “anti-elitist” people; perhaps the particulars of the Trumpian catastrophe were less important than the fact of it. It is, I admit, a comforting thought, as it suggests a huge difficulty for those who want to step into Trump’s role going forward—especially if (and it’s a big if) the Biden Administration can take steps to re-establish people’s faith in government to help its people.

Filed under teaching, The Trump Era