Research Notes: “Bomber” Harris and the Banality of Ego (Part One)

Trying something new: In the process of researching for articles in progress, I often come across cool and interesting stuff that, while cool and interesting, I can’t really use. Conversely, I find myself writing myself into tangents of thought that, again, while cool and interesting, are at best ancillary to the project. Sometimes these things make it into footnotes, but more often than not they remain as jots in a notebook, or are lost to my memory when I delete them from a draft.

Hence, “Research Notes,” an outlet for such ancillary thinking and tangents. At this rate, the essay I’m currently working on will produce a few of these posts.

I’m currently working on an article about the war poetry of Randall Jarrell. Jarrell was a mid-century American poet, most famous for his “bomber poems.” Jarrell did not himself see combat: he enlisted to be a pilot in the U.S. Army Air Force, but washed out of flight school. He did, however, prove to be quite adept at celestial navigation, and was kept stateside as an instructor. His bomber poems are in part about the depersonalization of soldiers; the heavy bomber as it appears in his poems is the distillation of the individual’s mechanization and assimilation into the war machine.

Or that’s what I’m arguing, at any rate.

In the process of researching this paper I’ve been immersing myself in the history of the bombing war, which makes for fascinating, if often harrowing, reading. Several weeks ago, I picked up Malcolm Gladwell’s most recent book The Bomber Mafia. It was with a bit of surprise that I realized, as I read the list at the front of all the other books Gladwell has written, that I had never read a book by Malcolm Gladwell. It felt as though I’d read several, which is because I’ve read a lot of his essays in The New Yorker and other publications, I’ve read or listened to a number of interviews with him, and I listened to at least one season of his podcast Revisionist History. And for a while it seemed as if every time I turned around he’d written a new book.

But I’d never read any, until now. The fact that it felt as though I’d read his books but hadn’t is either ironic or appropriate—or ironically appropriate—given that Gladwell’s usual practice is to take a piece of conventional wisdom, something people feel is true, and then show the various ways in which it isn’t. He’s the sage of counter-intuition, and is very deft at building narratives that, though they start out counter-intuitively, come themselves to feel true, becoming (potentially) new nuggets of conventional wisdom.

So it was interesting to read Gladwell’s take on a subject in which I’d been recently immersed, given that it makes his schtick pretty obvious; I can see why he irks a lot of historians, who often take issue with his tendency to oversimplify. Gladwell writes for a broad audience, and his books are basically pop-history (and pop-other academic disciplines). He covers much of the same territory as the chunkier histories I’ve read on the subject, in two hundred breezy pages,[1] and in the process crafts a narrative that, while not historically incorrect per se, prunes the history in ways convenient to his story.

To be fair, all history engages in such pruning to some extent; but Gladwell’s has the effect of eliding much of the nuance of the history he recounts. It was one such oversimplification that my mind caught on and sent me down a rabbit hole of thought that led to this post.

This one detail leapt out at me not least because it was in both the main text of the book and in the summary on the back. The back of the book promises that “In The Bomber Mafia, Malcolm Gladwell weaves together the stories of a Dutch genius and his homemade computer, a band of brothers in central Alabama, a British psychopath, and pyromaniacal chemists at Harvard to examine one of the greatest moral challenges in modern American history.” The research I’d done so far rendered this sentence, designed to be mysterious and tantalizing, satisfyingly transparent: the Dutch genius is Carl Norden, who designed the famous Norden Bombsight (the “homemade computer” in question), which, it was believed, would prove so accurate as to revolutionize warfare; the “band of brothers” in Alabama are the aviators who essentially invented the U.S. Air Force, and who became the generals responsible for prosecuting the air war in Europe and the Pacific; and the “pyromaniacal chemists” were the people who improved and refined the incendiary bombs—eventually inventing napalm—that wrought so much destruction.

Sir Arthur “Bomber” Harris

That leaves the “British psychopath,” who could only be Air Marshal Sir Arthur “Bomber” Harris, the man who led the RAF’s Bomber Command like a medieval fiefdom. Sure enough, when Gladwell introduces Harris into his narrative, he says unequivocally, “Arthur Harris was a psychopath.”

I found myself somewhat irked by this characterization, and I wondered why on earth it bothered me. By all rights it shouldn’t have: while I’m unqualified to clinically diagnose anyone with psychopathy one way or another, I also wouldn’t want to oblige myself to argue that Bomber Harris wasn’t a psychopath … for the simple reason that he most probably was one, and certainly was in the more colloquial way we mean the term when we refer to someone who is at once cruel and malicious and takes pleasure in inflicting pain and death, and/or is callously indifferent to the suffering he inflicts. Harris was absolutely determined to use his fleet of heavy bombers to reduce every German city of note to rubble and kill as many German civilians as possible; and in pursuing this bloody goal he was also quite cavalier with the lives of his air crews, to the point—as Gladwell notes—that while he was known to the public as “Bomber” Harris, his men called him “Butcher.”

But a psychopath? Well, probably. But it was psychopathy enabled by an unholy combination of military bureaucracy and the inertia it produces, bloody-minded obstinacy, and national and personal ego. And while it’s more narratively and morally satisfying to ascribe the kind of indiscriminate destruction wrought by RAF Bomber Command to an unhinged mind, it obscures the ways in which the sheer scale of violence in WWII was only possible by way of bureaucratic banality.

Let me back up a bit. The bomber war was divided between two distinct philosophies: “precision” bombing and area bombing. I put the one descriptor in quotes but not the other for the simple reason that precision in high-altitude bombing was a comforting fiction; area bombing, by contrast, is an accurate name for an inherently inaccurate practice. The “bomber mafia” of Gladwell’s title were a small group of American military aviators who came to believe—believe passionately, in fact—that wars of the future could be won by a relatively small number of bombers destroying specific targets crucial to the enemy’s military and civilian infrastructure. Take out power stations and relays, factories manufacturing key items like ball bearings, transportation hubs, and so forth, and you would cut the strings that made the enemy war machine dance. So simple! All you need is a consistent way to deliver munitions with reliable accuracy from a safe altitude.

It should be noted that this blue-sky thinking proceeded from two laudable premises: first, anything that could prevent or circumvent the kind of horrific attrition recently experienced in the trenches of WWI—indeed, anything that could make any war regrettable but short—was a moral good; and second, that precision bombing theoretically allowed you to solely strike military targets and avoid civilian casualties.

The Norden Bombsight seemed to promise such accuracy. It was heralded as a marvel of engineering that would take into account a huge range of variables—among other things, windspeed, forward velocity, air temperature and density, bomb size, even the earth’s rotation—which the bombardier would feed into the machine’s analog computer and, from twenty thousand feet up or more, plant a bomb neatly into a pickle barrel.[2]

Now is when the narrator cuts in to say, “They did not, in fact, plant bombs in pickle barrels.”

Far from it: bombing was never anything close to the exact science imagined by the bomber mafia and touted by wartime propaganda. Leaving aside the fact that the Norden Bombsight was not nearly as accurate as promised, bomber fleets had to deal with complications ranging from the harrowing (enemy fighters, flak) to the quotidian (cloud cover). Sometimes bomber crews lost their nerve and turned tail before reaching the target, often dropping their payloads over farmland or forest. Sometimes their navigation was so off they bombed the wrong city entirely. As Paul Fussell notes in his book Wartime, an exhaustive evisceration of WWII myths, “The fact was that bombing proved so grossly inaccurate that the planes had to fly well within anti-aircraft range to hit anywhere near the target, and even then they very often missed it entirely.” As the war progressed, “’precision bombing’ became a comical oxymoron relished by bomber crews with a sense of black humor.”

There was a constant push-and-pull between the British and the Americans throughout the bombing war in Europe. The putative psychopath Arthur Harris considered the American insistence on bombing specific targets absurd; he constantly harangued his American counterparts—and harangued Churchill to harangue Roosevelt—to give up their daytime attacks and join the RAF in night-time area bombing to more quickly effect his goal of reducing every German city to rubble.[3] Harris dismissed the American insistence on hitting crucial targets, whether they were transportation hubs, ball bearing factories, or oil production, as “panaceas.” That became his favourite word every time he was urged to have his bombers try to hit targets more specific than city centers: he imbued it with a haughty disdain for what he saw as naïve and even childish American thinking. He embodied the antithesis of the Bomber Mafia’s idealism. War for Bomber Harris was a necessarily brutal affair and every single German was, to his mind, a valid target. American queasiness at the prospect of civilian casualties was, he believed, a weakness[4] that wilfully ignored reality and would prolong the war and, paradoxically, cause greater suffering. Only by breaking the German spirit by pitilessly raining destruction on their cities would the war be won—and ultimate victory, he insisted all the way through, could be achieved by bombing alone.

Again, cue the narrator: “The war could not, in fact, be won by bombing alone.”

However, as the war wore on through 1943 and into 1944 and the behemoth of the U.S. military-industrial complex started producing bombers faster than they could recruit and train aircrews, the American strategy ultimately proved more effective. To be clear: “precision bombing” was never a reality, never became anything more than an oxymoron to be savoured as gallows humour. But as the USAAF was able to put more and more planes in the air, focused raids on high-value targets increasingly paid dividends that Harris’ night-time area bombing did not, even as Harris’ air fleet similarly grew larger and larger. As Canadian historian Randall Hansen details in Fire and Fury, what kept Albert Speer (who had quickly risen from his role as Hitler’s pet architect to being in charge of all war production) awake at night wasn’t the devastation wrought on civilian populations by the RAF, but the massive blows to industry and, especially, to oil production by the American attacks. Yet in the face of mounting evidence that daytime bombing focused on military and industrial targets was having greater effect than his night-time area bombing—as well as increasing pressure from his superiors to follow suit—Arthur Harris remained obstinate in his rejection of “panaceas.”

Hence Gladwell’s characterization of him as a psychopath: Bomber Harris had a list of German cities, and he was determined to reduce every last one of them to rubble on the premise that it would destroy German morale and result in the collapse of the Nazi regime (“area bombing” was also often euphemized as “morale bombing”). There was a method to the madness, albeit a phantasmic one: the best targets for area bombing, so the argument went, were the densely populated city centers, where a carefully calibrated combination of high explosives and incendiaries would have the greatest effect, creating firestorms that would overwhelm fire prevention efforts; those who weren’t killed were rendered homeless. The people living in these areas were predominantly working-class; the belief was that killing them and/or destroying their homes would create worker shortages and enervate the German war effort.

Except that this did not happen. In fact, just the opposite happened: the German people held firm in defiance of the bombing. Area bombing, rather than breaking their spirit, stiffened their spines … much as the Luftwaffe’s bombing of London in 1940-41 had done for the British.

All of the histories of the bombing war I’ve read, Gladwell’s included, make this point: Harris’ theory that the bombing of civilians would break a nation’s spirit had already been definitively disproved by the British citizenry. Indeed, this much was pointed out to Harris by Ira Eaker, the commander of the U.S. bomber force. One evening after the two men had eaten dinner together,[5] Harris expounded at length on why the Americans should join the British in night-time area bombing. Eaker pointed out that the Blitz had not broken the British spirit—why would Harris think it would break the Germans’? According to Eaker’s account, Harris dismissed the comparison, asserting that the Brits were made of sterner stuff; the Germans, by contrast, would surely crack.

It’s important to note that at the outset of the Luftwaffe bombing campaign—itself launched on the premise that it would break the British and soften them up for Operation Sea Lion, Hitler’s planned invasion of Britain—the British high command’s greatest fear was precisely that their people’s spirit would break, and civil society would collapse. The “keep calm and carry on” ethos that has become so identified with the British response to the Blitz was post-facto: the citizenry’s implacability and the RAF fighter corps’ success in repelling the Germans were two rare bright spots in Britain’s darkest hour. The British have always been justified in feeling proud of that stoicism; Harris’ assumption, which he was not alone in making—that this was somehow exceptional to the British—is exemplary of cultural chauvinism, national ego replicated in Harris’ personal arrogance and obstinacy.

Harris rose through the ranks of the nascent RAF in some of Britain’s many colonial frontiers, cutting his teeth for his later career by putting down local uprisings in Iraq in the early 1920s through intimidation bombing—an early model for area bombing, as it was unconcerned with accuracy and designed to terrify the unruly locals into submission. After the start of WWII, he took advantage of confusion and incompetence in Britain’s then-tiny bombing fleet to advance himself, eventually taking over Bomber Command. He was notorious for his arrogant, bullying behaviour, but developed—and cultivated—a popular public profile as someone who would take the fight to the Germans. This popularity, along with his ability to browbeat both his underlings and superiors, made him a favourite of Churchill’s—and thus effectively impossible to dislodge from his fiefdom, even when it became painfully obvious to everyone concerned that area bombing simply didn’t work.

His monomaniacal determination to destroy every German city coupled with his refusal to see the obvious evidence of his strategy’s failure does suggest a measure of mental derangement, to put it mildly. But it is also an example of ego on a catastrophic scale, enabled by bureaucratic inertia.

For reasons I’ll get into in my next post, I went back and forth on using the expression “the banality of ego” as my title. Suffice to say here, my quibble with Gladwell’s characterization of Harris as a psychopath isn’t that he’s necessarily wrong, or somehow doing Harris a disservice; rather, he’s doing our understanding of WWII a disservice, as reducing Harris’ aims and agency to the vagaries of a single, unhinged man elides the more complex and disturbing reality of the war’s massive scope and scale. Harris was a bad actor, something tacitly recognized by the fact that after the war, much to his annoyance, he was almost entirely ignored in the conferring of honours and the recognitions bestowed upon the U.K.’s national heroes. There wasn’t much appetite among the British elites for celebrating Harris, whose abrasive and bullying manner had made him so many enemies. Like his patron Churchill, he became something of an embarrassment to the ruling class once the indomitable Boche were safely domitable again. But there is also, I would argue, a hint of collective guilt at work in the official ignoring of Harris: his single-minded quest to pound every German city of note into rubble was as much a product of the structures of power facilitating him as it was Harris’ own putative psychopathy. Bomber Harris, far from being stymied by his superiors, was enabled by them.

The nature of this enabling is the subject of part two of this post, so stay tuned.



[1] I’d been slogging my way through Richard Overy’s excellent The Bombers and the Bombed for about two weeks before picking up Gladwell’s book. Overy’s clocks in at over four hundred dense pages, plus another hundred pages of endnotes. In contrast, I read The Bomber Mafia in slightly more than a day. And I enjoyed it! This post isn’t an exercise in trashing Gladwell by any means. But reading him on a topic you know well is a bit like seeing your profession depicted on TV—you can’t help but get worked up over the details that get missed.

[2] The ability to put a “bomb in a pickle barrel” was the common boast made in American propaganda vaunting the technological marvel of the Army Air Force’s bomber fleet. Why pickle barrels and not some other receptacle is a question that remains unanswered.

[3] He came close to getting the Americans to capitulate—Churchill was on his side, and for a time it looked as though Churchill had convinced Roosevelt. General Ira Eaker, commander of the Eighth Air Force, kept the Americans’ daytime bombing alive by convincing Churchill that, with the Americans bombing by day and the Brits bombing by night, German cities would be pounded relentlessly—“round the clock,” as Eaker said. Churchill liked the sound of that.

[4] Or in the RAF’s parlance, “lack of moral fibre.” This was the verdict leveled against many airmen who cracked up because of the strain of repeated missions. Sometimes, sympathetic psychiatrists found more acceptable diagnoses that spared traumatized flyers’ reputations, but many more suffered the ignominy of a simple “LMF” in their files, a badge of dishonour they wore for years.

[5] Eaker and Harris, counter-intuitively perhaps, became great friends.


A Pride Month Post

Pride Month is almost over. I might be a basic cishetero dude, but I love Pride—I love the celebration, the camp, the colour, the music (dancing in the London, Ontario Pride Parade to “I Am What I Am” blasting from the speakers remains one of my fondest memories). I love the love. Some of this has to do with me being a liberal lefty bleeding heart social justice type, but then one of the reasons I’m a liberal lefty bleeding heart social justice type is because many of the people in my life I have loved, who have been dearest to me, who have shaped my worldview, have been queer. I want to take this space to say to all of my queer friends: I love you. My world is a better place for you being in it.

Happy Pride to all my favourite monsters.

This past Monday morning I was at the Starbucks closest to campus, writing my daily entry in my journal. These past several months, I’ve been making a point of writing something every day—even if it’s just a paragraph noting the weather.[i] On this particular morning, I had a lot on my mind, as I had been reading a bunch of articles about the Texas GOP convention that took place in Houston last weekend. It was all through my newsfeeds, especially in regard to the resolution that Texas will hold a referendum no later than 2023 on whether or not to secede from the union; and the addition to their platform asserting that homosexuality is “an abnormal lifestyle choice.”[ii]

I’d read a long thread on Twitter discussing the secession referendum and whether (a) it was legal; (b) it was even feasible; and (c) if refugees from the newly founded Republic of Texas would be supported by a Democratic administration in their relocation. Twitter being Twitter, there was also a lot of sentiment advocating letting Texas go, anticipatory schadenfreude predicting the crash of their economy, and just a lot of general dunking on the stupidity of the whole thing. I had my own thoughts about how Texas Republicans would be making a big bet on being correct in their climate change denial in a state that already has areas very nearly unlivable at the height of summer, as well as a long gulf coast uniquely vulnerable to extreme weather events (to say nothing of a decrepit electrical grid).

Anyway, because I generally avoid tweeting except for the rare occasions when I think of a witty Wildean aphorism, I was sorting my thoughts out in my journal when, looking idly around the Starbucks, I had an odd moment of defamiliarization.

The Starbucks in question is not my favourite coffee shop to work in;[iii] evil corporation aside, I’m generally amenable to Starbucks so long as they at least have the feel of a comfortable café—soft lighting, warmly coloured décor, that sort of thing. This one is new, having opened just a few months ago, and is unfortunately very bright and austere, bordering on institutional. But it is also the coffee shop most proximate to campus,[iv] so I find myself there sometimes.

I had paused in my writing for a moment and was staring into the middle distance, lost in thought. In that moment, Gloria Gaynor came on the sound system—“I Will Survive,” a song that is impossible not to bob along with. And a staple of the Gay Pride soundtrack. Looking around, I suddenly became aware of all the Pride Month paraphernalia decorating the place: there was a big Pride Progress flag behind the counter, another over the chalkboard at the front, crossed with a Trans flag; a row of Starbucks cups was arrayed on the counter in front of the espresso machines, each with a piece of tissue paper corresponding to one of the colours from the Pride flag. There was also the staff themselves, many of whom were, if not actually queer, then certainly rocking the aesthetic.[v]

To be clear: I hadn’t not noticed any of this before. I’ve been there at least once or twice since June began when the profusion of Pride décor went up, but I seem to think they’ve always had a Pride flag at the front of the store, and I had previously taken note that the staff would not be out of place behind the bar or on the dance floor of a gay club that would be way too fashionable for my basic self. But that was the nub of my moment of defamiliarization: I had noticed all these things without noticing them. As I sat there listening to Gloria dunking on her ex, I was taken out of the moment and thought about how when I was the baristas’ age this kind of décor would be limited—even during Pride Month—to specific spots on campus and the “gay ghetto” (as it was then called) of the Church & Wellesley neighbourhood in Toronto. That it was unremarkable that it should be on display in a corporate franchise coffee shop[vi] in St. John’s was, it suddenly occurred to me, remarkable … or remarkable to someone who grew up in the 1980s while attending a Catholic high school, some of whose teachers were quite vocal in their opinion that HIV/AIDS was divine punishment for homosexuality.[vii]

It was an odd moment: what should have been a warm glow of vicarious pride at suddenly seeing cultural progress that had snuck up on me sat in stark contrast to the recent full-bore attacks on queer people epitomized in the Texas GOP’s delineation of homosexuality as “an abnormal lifestyle choice.” This specific language is telling, as it hearkens back to an earlier era of anti-gay rhetoric, when it was a point of homophobic common wisdom that being queer was purely a matter of choice. The Texas GOP’s phrasing effectively elides at least thirty years of hard-won progress whose signal event, the recognition of same-sex marriage in Obergefell v. Hodges, felt less like revolutionary upheaval than simple confirmation of prevailing attitudes.[viii]

One of the frequent rhetorical recourses made by people opposed to social justice progress—or those who profess to be all about social justice but worry that it is progressing too quickly—is to point to all the obvious advances that have been made in terms of feminism, gay rights, cultural diversity, civil rights, and so forth, as evidence that the current crop of “social justice warriors”[ix] and activists are overreacting, making mountains out of molehills, or otherwise being disingenuous in pretending society hasn’t improved over the past decades and centuries. “So what you’re saying is there’s no difference between today and the Jim Crow era?” is how a typical argument attacking Black Lives Matter might go.

One hears such a tone, for example, in critiques of The Handmaid’s Tale when its misogynist dystopia is dismissed as alarmist fantasy; but then a SCOTUS leak reveals the un-edited version of Samuel Alito’s Gileadean thinking about abortion … and then, somewhere halfway through the writing of this post, the Court strikes down Roe v. Wade with nary a word of the opinion edited. For the better part of my adult life, but especially since Obergefell, when the rainbow flags come out (so to speak) in June there have been predictable harrumphs wondering why Pride is even necessary anymore. Haven’t those people got all the rights now? Why do they need this display?

But then the reactionary Right pivots with the agility of a very agile thing from anti-CRT campaigns in schools to Florida’s “Don’t Say Gay” bill and the broader attempt to associate any mention of LGBTQ people, history, lifestyles, or art and literature with the “grooming” of children. Actually, I shouldn’t say this was a pivot “from” anti-CRT campaigns, given that it’s not as if these people have given up on excising mention of slavery and systemic racism from curricula; it’s more a matter of a broadening of the battlefront, with cynical operators like Christopher Rufo setting the plays. What started as the targeting of trans people has, not unpredictably, exploited the momentum of the Right’s broader culture war to open fronts against what had been assumed to be settled issues—gay rights in particular.[xi] It is telling that none other than Donald Trump Jr. rebuked the Texas GOP for, presumably in accordance with their plank about homosexuality being “an abnormal lifestyle choice,” not allowing the Log Cabin Republicans to set up a booth at the convention. “The Texas GOP should focus its energy on fighting back against the radical democrats and weak RINOs currently trying to legislate our 2nd Amendment rights away,” he said, “instead of canceling a group of gay conservatives who are standing in the breach with us.” That this weak tea was almost certainly the noblest sentiment Don Jr. has ever voiced says as much about the Frankenstein’s monster the MAGA movement has become as it does about Trump’s most toxic spawn.

All of which is by way of saying that, however much progress we as a society have made, there is never cause for complacency. Progress is never inevitable; though there is a tendency to think that it is,[xii] I’d be interested to see how that assumption breaks down on cultural and national lines, and if those who have benefited most from prosperity and the attention to cultural issues that often affords are the most complacent about the moral arc of history bending in their favour. I started writing this post earlier in the week; I picked it up again on Friday and worked on it at one of my favourite downtown spots.[xiii] Whereas Monday morning I’d looked up, lost in thought, and noticed the Pride paraphernalia, on Friday I looked up at one of the televisions over the bar to see the news that Roe v. Wade had been struck down. It was an odd set of bookends to the work week.

As dire as things seem right now—and they are dire, from the worsening climate to the revanchist culture war to the not-zero possibility that Pierre Poilievre could be our next Prime Minister—I do still have hope. I have hope because my local Starbucks, in spite of its unfortunate austere design choices, was decked out more queerly than most gay bars I went to in my 20s. I have hope because my students have hope: this generation is earnest and determined, and they have no patience for our prevarications. I laugh, often out loud, when I read shrill screeds about how youth are being indoctrinated with woke ideology by postmodern neo-Marxist (to coin a term!) professors like me. Dude, I want to say, they’re not the ones being indoctrinated. I am.

With hope.

Happy Pride.



[i] This on the premise that it takes my attention away from the computer and my phone and their endless doomscrolling, and also that it will help center me (especially if I get to it early in the day) and make me organize my thoughts. All this has been moderately successful, if for no other reason than that when I start missing days, it functions as a bit of a canary in the coal mine for my mental state.

[ii] In addition to secession and their homophobic throwback to the heyday of Jerry Falwell, the Texas Republicans also declared that the 2020 election was corrupt and Joe Biden is therefore an illegitimate president; called for the repeal of the 16th Amendment, which established a federal income tax; declared that students should be required to “learn about the humanity of the preborn child”; and called for the abolition of any and all abortion, the repeal of the 1965 Voting Rights Act, and the reinstatement of school prayer. John Cornyn, the senior Senator for Texas, was booed during his speech because he had indicated he was open to voting for the minimal gun control bill currently before Congress; and perhaps most remarkably, Congressman Dan Crenshaw and his entourage were attacked in the corridors for … being Trumpy but not Trumpy enough, I guess? Crenshaw, who was a Navy SEAL and famously wears an eyepatch because of a wound he received in Iraq, not long ago released a genuinely bonkers political ad featuring himself in full military regalia parachuting out of a plane to attack “antifa and leftists.” He has also tied himself in logical knots trying to be a “reasonable” conservative while never saying a bad word about Trump. Apparently this wasn’t enough for his Texas detractors, who harangued him as an “eyepatch McCain,” an epithet they took from Tucker Carlson.

[iii] Coffee shops and pubs will always rival both my home and campus offices as a preferable work space. One might think that, now that I do in fact have actual offices—and very nice ones, too!—in contrast to undergrad and grad school, such public spaces as a local café or pub wouldn’t hold quite the same allure. But they do, for me at least; I like the white noise of these places, and the fact that people-watching can be very zen when you’re paused in thought. I have long been that person you see ensconced in a comfortable seat or booth with a latte or a pint, scribbling in a Moleskine or tapping away at my laptop (though usually the former—working longhand without internet distraction is another point in the coffee shop’s favour).

[iv] For all that Memorial University has to recommend it, it weirdly lacks for cafés and pubs on campus. This, I feel, is contrary to the academic educational process. My undergraduate degree was massively enhanced by the endless discussions and arguments I had with friends at the Absinthe Pub of Winters College at York University; and I’m reasonably certain I wrote at least a chapter’s worth of my dissertation at UWO’s Grad Club.

[v] When I shared this story with my students in my grad seminar later that morning, one of them helpfully said, “Oh, don’t you know? Everybody who works at Starbucks is queer.” I’ll assume this is at least a slight exaggeration and there are a handful of closeted heterosexuals making lattes, but I’ll take his authoritative word on the subject. 

[vi] Not that corporate coffee franchises should be antithetical to queer culture: one of the rabbit holes my mind went down in the immediate aftermath of this reverie was the memory of the infamous steps in front of the Second Cup on the corner of Church & Wellesley in Toronto. On pleasant days, the steps would be full of queer folk (with a few straights like myself occasionally thrown in), drinking coffee, chatting, watching the street, as much spectacle as spectators. To anyone who’d ever been on Church Street, those steps were instantly recognizable in the series of Kids in the Hall sketches called, well, “Steps.”

Curious if the Second Cup was still there, I did a quick Google search and discovered that the steps had been removed (how, I don’t know) by the owners to “discourage loitering,” and the café itself had departed in 2005—but that a new Second Cup took up residence just down the block six years later.

[vii] No, seriously.

[viii] Which is not, I hasten to add, to downplay its importance or the enormity of the victory it represented; but the fact that it was met more with a shrug from the balance of straight people than with outrage is pretty extraordinary in itself.

[ix] This is by no means an original observation, but you’ve gotta hand it to the forces of reaction that “social justice warrior,” or SJW for short, was turned so quickly into a pejorative, and that anything and everything related to social justice became so resolutely understood as the precinct of the shrill and the sanctimonious, as well as synonymous with the suppression of free expression. Ditto for critical race theory (or CRT—is there power in reducing something to a three-letter acronym?), and the more recent retread of Anita Bryant’s association of homosexuality with pedophilia, this time with the handy epithet “groomer” attached to anyone committing the sin of admitting to queer identity in the presence of children.

[xi] It would be entertaining, if the circumstances weren’t so distressing, to watch such gay conservatives as Andrew Sullivan—who have taken “gender critical” positions on trans rights—suddenly gobsmacked to find their own subject-positions under attack. Sullivan is a particularly notable example: he has become increasingly strident on what he characterizes as the tyranny of pro-trans discourse, and recently had Christopher Rufo on his podcast, where he agreed with many of Rufo’s attacks on critical race theory. More recently, however, he has thrown up his hands and said (figuratively) “Whoa, whoa!” in response to Rufo’s latest rounds of anti-gay rhetoric.

[xii] Personally, I blame Hegel.

[xiii] Blue on Water. During the day, the bar side is quiet and comfortable, and they have a pretty good menu. And, as I recently discovered, they serve excellent coffee, with good breakfast choices.

Leave a comment

Filed under maunderings

The Loneliest Billionaire

A billboard in Hungary that was part of Viktor Orban’s anti-George Soros campaign, which reads “Let’s not let Soros have the last laugh!” The billboards were taken down in advance of a state visit from Israeli prime minister Benjamin Netanyahu, as the campaign had been accused of anti-Semitism.

I’m thinking of writing a one-act play à la Samuel Beckett’s Krapp’s Last Tape that would feature progressive billionaire George Soros alone on stage in a small pool of light with only a stool and an iPhone for company. The play would mostly be him scrolling and muttering to himself in increasingly unhinged non-sequiturs, the gist of which we eventually glean is his existential angst at being the only progressive billionaire, which means he thus must shoulder all of the instinctive hatred for billionaires directed at him from right-wing media and politicians. “Mercer, Murdoch, Musk,” he mutters in what becomes a refrain, “Koch. Bezos. They share. They share. (he scrolls for a long moment, then looks out at the darkness in the direction of the audience) Soros. Alone.”

The play ends midway through the pandemic. Soros grows more and more excited as he reads conspiracy theorists’ attacks on Bill Gates accusing him of putting mind-control chips in the covid vaccine. Soros looks up from his phone with a look of fragile hope on his face as he whispers, “Is there another?”

All of which is by way of saying that I’m endlessly fascinated by the way in which George Soros has become the singular bogeyman of the alt-right and Steve Bannon’s cohort, of QAnon conspiracy theorists, and of Hungary’s Viktor Orban, current darling of the Tucker Carlson wing of the GOP (lots of overlaps in that Venn diagram, to be sure). When it comes to the question of billionaires, I’m generally in agreement with Elizabeth Warren: that is, the existence of billionaires qua billionaires isn’t a problem; a proliferation of multi-billionaires existing concurrently with systemic poverty and hunger, widespread lack of access to health care, and the ongoing climate crisis is a moral obscenity. And while apologists might point to the Gates Foundation’s work to address some of these problems or the fact that Elon Musk has done more to move us toward electric vehicles than any group of people, well, kudos to them … but they remain a vanishingly small minority of the class.

More common is the Atlas Shrugged brand of libertarianism embraced by the likes of Charles Koch and his late brother David, which frames the making of obscene amounts of money as a form of virtue—and which tirelessly spends a huge amount of that money to ensure that it stays with the ultra-rich and furthers their ability to accumulate even more wealth. New Yorker staff writer Jane Mayer wrote an excellent, exhaustive book titled Dark Money in 2016, a deep and detailed dive into the vast sums spent by right-wing billionaires on shoring up conservative politicians at all levels of government—from local school boards to Congress and the White House—as well as on funding climate disinformation campaigns, conservative think tanks like the Claremont Institute, anti-tax organizations like Americans for Prosperity, and a huge constellation of other right-wing causes.

If there was something even approaching numerical parity between progressive and conservative billionaires, each advancing their political interests like Olympian gods choosing sides in the Trojan War, that would be one thing (it wouldn’t resolve or really even ameliorate the structural problems of billionaires in an inequitable society, but it would definitely be a thing). But the fact of the matter is that the conservative vilification of Soros’ progressive agenda is profoundly disingenuous, for the simple reason that he’s all they’ve got to attack, while on their side they’ve got Rupert Murdoch, Peter Thiel, Charles Koch, Robert and Rebekah Mercer—who are collectively worth over $100B compared to Soros’ $8.6B—as well as a legion of others who actively spend their money on conservative political causes.

Of course, there’s also Warren Buffett, one of the world’s richest men, who tends to voice liberal political opinions and is nominally in favour of higher taxes on the rich; but he’s largely left alone—for reasons I won’t speculate on for at least a few paragraphs—by the right-wing mediasphere.

There’s also the fact that, for all his mouthing of liberal platitudes, Buffett doesn’t do much to put his money where his mouth is, and has frequently been accused of hypocrisy by right and left alike. Indeed, even the most “liberal” of billionaires tend more to gesture at social progressivism while accumulating wealth through the most ruthless means available, espousing a libertarianism that puts free speech and legal weed in the same philosophical framework as industry deregulation and low taxes for the über-rich. Even Bill Gates, arguably the most socially conscious of the billionaire class, dismisses the policies of Elizabeth Warren and Bernie Sanders out of hand on the principle that individuals are better judges of how to spend their money than the government.

And if all billionaires—or really, just some of them—established their own versions of the Gates Foundation, he might have something approaching a point. But of course he and George Soros are the outliers, with most billionaires who engage in politics doing so with an eye to entrenching their wealth and facilitating the means to make more.

(If I’m being conspicuous in not mentioning Elon Musk, it’s because Musk is pretty much sui generis, falling into a category of his own devising that is somewhere between chaos muppet and Bond villain. If it weren’t for the fact that the man can tank the stock market with a tweet, it would be amusing to watch his new alt-right fanboys reconcile their love of Musk’s shitposting with the fact that he’s the godfather of the EV revolution).

So pity poor George Soros, the loneliest billionaire. As the sole progressive plutocrat who actually funds progressive causes, he gets the brunt of the paranoid right’s vitriol. Though if you find the frequency and intensity of the attacks puzzling, given that he commands one sixteenth of Jeff Bezos’ wealth (which is still more money than any one person should be able to possess), you might want to take note of how often the name “Soros” is spoken in the same breath as “globalist.” Or to put it more plainly: it’s the anti-Semitism, stupid.

Leave a comment

Filed under wingnuttery

Gremlins redux

Two blog posts ago I went on at length about gremlins—both in general, and specific to The Twilight Zone episode “Nightmare at 20,000 Feet.” That episode comprised my most terrifying fictional experience, something that stuck with me for years. My students’ first assignment in my weird fiction class this summer is to write a piece of creative non-fiction describing their most terrifying fictional experience. As I said in that post, I was planning to write my own, by way of example and in the name of fairness—given that we’ll be sharing everyone’s pieces. And I said I would post it here.

So here we are. As might have been obvious from my last gremlins post, this has become something of a very interesting and serendipitous rabbit hole for me, at once touching on a handful of my current research interests as well as jogging a lot of memories that I want to explore. Possibly this turns into a larger project, possibly it becomes an avenue of self-exploration, possibly both.

So what I’m saying is, don’t be surprised to see more posts here vectoring off from this line of inquiry.

One caveat: I made a point of not consulting with my parents as I wrote this. What I’ve related in this essay is purely based on my memory of the summer of 1984, and as such might be wildly off base. I’ll be interested to see what my Mom and Dad have to say and whether their own memories are at all consonant with mine. If not, I will write a follow-up.

Meanwhile, without further ado …

A man sees something on the wing of the airplane, a vague shape in the rain and lightning. It’s impossible that anything alive could be out there, at this speed and altitude. But he sees it again. He’s a nervous flyer; perhaps his mind is playing tricks. But then he sees it again: person-shaped but inhuman, its intent obviously malevolent as it tears into the wing.

He is the only one who sees it. He cries out to the flight attendants, to his fellow passengers in panic, but they think he’s crazy. He wonders himself if he’s losing his mind. He lets himself be calmed down, he closes the window shade, he tries to sleep. But soon enough, he can’t help himself. He opens the shade, and there is the thing, a creature that looks like a demonic goblin, inches from his face on the other side of the window, staring back at him with something like sadistic glee.

It’s a gremlin, of course, a folkloric imp that emerged in the age of flight, invented by RAF aviators in the years before WWII. Heir to a long lineage of mischievous pixies and fey folk, the gremlin is nevertheless a modern creation, blamed for the frequent and seemingly random malfunctions that bedeviled airplanes during the frantic steeplechase of flight technology between the wars. Gremlins weren’t just the comic antagonists of tall tales told by pilots and crew on airbases between missions—enough airmen were genuinely convinced that gremlins were real, swearing up and down that they’d seen the little bastards on their wings, that concerned psychological papers were written.

Roald Dahl’s first novel was about gremlins. Bugs Bunny tangled with one in the Looney Tunes short “Falling Hare.” Like their folkloric predecessors, gremlins were given to mischief and occasional cruelty, but were mostly depicted as annoyances and not threats.

For a time in my childhood, gremlins were a source of abject terror for me.


When I think of gremlins, I think of the summer of 1984. The movie Gremlins was released that June, but I never saw it. I still haven’t. By the time it hit theatres, I’d already been terrified beyond what was strictly reasonable by the gremlin in The Twilight Zone: The Movie, which my father rented for us to watch some time after its release in 1983 and before Gremlins came out. The fourth of the anthology film’s four segments was “Nightmare at 20,000 Feet,” in which a nervous flyer sees a gremlin on the wing of his plane. The moment when he opens the shade to see the demonic creature staring back at him haunted me for years. When I lay in bed at night and the scene came to mind, I hid my head under the covers—trapping myself, for suddenly I couldn’t shake the idea that the gremlin would be perched there, staring at me, if I lowered them.

Lest you assume these were the infantile fears of a young boy, let me clarify: these were the infantile fears of a twelve-year-old.


A vicious heat wave hit our Toronto suburb in 1984. It coincided with the Olympics, which ran from the last week of July into August. It was the kind of heat that pervades my childhood memories of summer: a baking sun in a clear sky, air that was somehow stifling and humid while also drying the grass to brittle blades that abraded bare feet. Even basements were no refuge.

Our house had no air conditioning, so my father brought the television set outside onto the side deck where there was at least some shade and, occasionally, a breeze. This arrangement appealed to my mother: puritanical about not spending summer days indoors watching the tube, she also hated missing even a moment of Olympic coverage. Because the 1984 Games were held in Los Angeles, the time difference meant our Olympics viewing stretched into the darkening evening. We ate dinner on the deck and watched athletes run, swim, hurl, and paddle. Neighbours came over, bringing beer and wine and snacks. An ongoing PG-13 bacchanal took up residence on our deck and spilled out onto the yellowing grass of our corner lot as the neighbourhood kids staged our own Games.


The 1984 Games were notable for the absence of the Soviet Union and most of the Eastern Bloc. They boycotted Los Angeles in retaliation for the United States’ boycott of the 1980 Moscow Games, which had been in protest over the Soviet invasion of Afghanistan.

It was petulance, said one neighbour. Hypocritical, said another. Someone made an off-colour joke I didn’t understand about women’s weightlifting being fair this time. I didn’t grasp the nuances of the politics, but I knew that the Soviet absence tinged everything with vague unease. The 1984 summer Olympics marked the precise midpoint of Ronald Reagan’s presidency and the renewed belligerence of the Cold War. Fears of nuclear conflict that had smouldered like banked coals during the détente years of the 1970s leapt again into open flame. Pious sages of geopolitics kept inching the Doomsday Clock closer to midnight. I was in some ways a literal-minded child and did not quite understand that the clock was metaphorical. Every time its minute hand crept forward, I could not sleep for days afterward.

The Day After, a terrifying depiction of a nuclear exchange, aired in late 1983. It showed the effects of multiple warheads striking in the American heartland, and the immediate aftermath as people suffering from severe radiation poisoning struggled and fought over food and water. The images of the mushroom clouds and their devastation were the most graphic ever portrayed at the time. A disclaimer at the end told the audience that, however ghastly the film’s depiction had been, it was mild in comparison to what the reality would be. With over one hundred million viewers, it was the most-watched television film in history.

I did not see it. I didn’t even know it existed until I heard about it at school from classmates who had watched it. It had been recommended that parents watch it with their children; guides were made available to help with the discussions afterward. But it was not mentioned in my house and I was somehow smart enough not to ask why.

Whatever sleep I lost worrying about the Bomb, my mother’s nuclear anxieties contained multitudes.


Because serendipity is like gravity, that summer one of the television channels aired old episodes of The Twilight Zone. Every night when Olympics coverage ended, when most of the neighbours had gone home, while the lawn and the hedges and the asphalt of the street sighed the stored heat of the day into the darkness, we switched over to the slow cascading fall of the theme music and the studied portent in Rod Serling’s voice.

With one exception, I don’t remember which episodes we watched. I do remember my parents waxing on about episodes we didn’t see. “The Monsters Are Due on Maple Street” was a favourite of theirs. To this day I haven’t seen it, but the plot as they related it stuck in my mind: a quiet suburban neighbourhood like ours suffers an inexplicable blackout on a summer night; the neighbours congregate in the street, anxious but not concerned until the lights go on at one house. Suspicion starts to set in: what’s special about that person’s house? Other houses get power, then lose it, until the previously friendly neighbours descend into paranoid warring factions. The episode ends with aliens in a ship overhead, who have been manipulating the power grid, saying, Look, we don’t have to attack them, these humans will turn on each other.

But because serendipity is the gravity of my life, we did see “Nightmare at 20,000 Feet.” I can’t help but think that if I’d seen the original episode first, the spectre of a gremlin on a wing wouldn’t have stuck so deep in my brain’s fear centres. It featured a pre-Star Trek William Shatner as the nervous flyer, demonstrating that scenery-chewing was always his first and best talent. The gremlin itself looked like a man in a monkey suit, less a demon than an ugly teddy bear. Creepy but hardly terrifying.

But in the context of that uncanny summer fortnight, with the memory of the movie gremlin colouring its black and white predecessor with shades of fear out of space, what was otherwise risible had the effect of driving my original horror deeper into my mind. It became existential. Each night when we changed the channel over and the theme music played I felt ill, and yet could not look away. Then one night, the show began with Serling’s narration: “Portrait of a frightened man: Mr. Robert Wilson, thirty-seven, husband, father and salesman,” showing William Shatner (whom I did not then recognize as Captain Kirk) slouched in his airplane seat. We learned he had recently spent time in a sanitarium—that he had been there because he had a nervous breakdown on a flight “on an evening not dissimilar to this one.” On this night, however, Serling tells us, “his appointed destination … happens to be in the darkest corner of the Twilight Zone.”

I could not look away, even as part of me knew just how much this campy earlier iteration was about to make the lingering effects of the later one indelible. I have little memory of the actual episode. What I have is sense memory: the night’s heavy, suffocating humidity, the creak of crickets in the hedge, the smell of the grass, the bilious weight in my belly, and the dread of knowing I would soon have to try and sleep in my dark and stuffy room.


In German, “uncanny” is unheimlich, literally unhomely, that which makes you feel not at home. Those two weeks that summer were dislocating: I was not at home in my home, and my home itself felt adrift, untethered. Or perhaps what I felt was the dread certainty that it was always untethered on the world’s currents, and that the feeling of safety remote from the larger world was the illusion—that there was always a gremlin on the wing, marking time in missile silos and in the minds of world leaders. RAF airmen invented gremlins in part to resolve a contradiction: flight technology was advancing by leaps and bounds but left them uniquely vulnerable while aloft. What more unthinkable technology has existed than nuclear weapons? Perhaps for me the gremlin was not a narrative comfort, as it was for the aviators, but an embodiment of the certainty of the technology’s malevolence.


The Twilight Zone was in many ways the quintessential Cold War TV show, as it embodied the nagging, unhomely sense of something being not quite right, which was the constant undercurrent of the bland suburban order that America was so desperate to convey to itself. It is no surprise that so many of the show’s episodes are set in such innocuous suburbs as Maple Street.

My father, who grew up in just such a suburb, loved The Twilight Zone when he was my age; he told me that he watched the episodes eagerly when they first aired. He was twelve when the show premiered in autumn 1959. He was a different twelve-year-old than me, apparently—I tried to imagine actually enjoying something that unsettling, actually looking forward to seeing what each new episode would bring, but that sensibility was still alien to me. In a few short years I would learn to love horror when I discovered Stephen King and tore through his novels at breakneck speed. But at twelve I had not yet grown out of the nausea the uncanny inspired in me. That two-week stretch of an otherwise idyllic summer was a perfect storm of subtle dislocations: the heat wave, the outdoor television viewing, the constant low-grade party atmosphere, the hours and hours of Olympic coverage, all with the Soviet absence drawing attention to the Cold War’s constant menacing background hum.

1 Comment

Filed under maunderings

Dystopian Thought for the Day

It occurs to me that the current state of the U.S. Supreme Court is like climate change … which is to say, it has been ongoing for several decades and visible to anyone willing to see it developing, but it has not prompted anything but the most tepid of responses. And now that we’re experiencing the judicial equivalent of massive flooding, it’s already too late.

(I can’t decide whether this analogy is ironic or appropriate, considering this court is likely to do everything in its power to curtail efforts to reverse climate change).

I remember reading Angels in America for the first time over twenty-five years ago, and coming on the scene in which the notorious lawyer and fixer Roy Cohn—now most famous for having been Donald Trump’s mentor in the 1970s—takes the closeted law clerk Joe Pitt out to dinner and introduces him to a Reagan Justice Department apparatchik who waxes poetic about how they’re seeding the federal bench with conservatives judges. “The Supreme Court will be block-solid Republican appointees,” he enthuses, “and the federal bench—Republican judges like land mines, everywhere, everywhere they turn … We’ll get our way on just about everything: abortion, defense, Central America, family values.”

I remember reading that and thinking, wow, diabolical. And then every time I read a news item about the Federalist Society or the GOP’s SCOTUS-oriented machinations, I thought of that scene. When Mitch McConnell held the late Antonin Scalia’s seat hostage from Merrick Garland, I thought of that scene, and thought of it again through Neil Gorsuch’s hearings and the debacle of Brett Kavanaugh, and of course once again when McConnell rushed Amy Coney Barrett’s nomination through in what ended up being the last days of the Trump Administration. By then, the full crisis of the American judiciary (my first inkling of which was from a play that first ran off-Broadway in 1992) was plain to see. The U.S. has been experiencing extreme judicial weather events for over a decade now; the leak of the Samuel Alito-authored decision overturning Roe v. Wade is like knowing not just that there’s a category 5 hurricane just below the horizon, but that such storms and worse are the new normal for the foreseeable future.

Union/Confederacy left, 2012 election map right.

Recently it has not been uncommon, especially at moments of more acute racial discord, for people to post images on social media juxtaposing recent electoral maps with maps circa 1860. The red states east of the Mississippi River match almost precisely with the Confederacy; and though Biden’s win in Georgia in 2020 is a welcome disruption of that consonance, otherwise the geography of red v. blue has been increasingly entrenched since Nixon first embarked on the Southern Strategy and accelerated a shift that, sadly, was probably inevitable the moment Lyndon Johnson signed the Civil Rights and Voting Rights Acts.

There has also been, especially since Trump’s election—and even more so since the January 6 insurrection—the prospect of a “new civil war” bandied about, from think pieces to more than a few books. Most such speculations are careful to point out that any such conflict would necessarily be dramatically different from the actual U.S. Civil War—that the seemingly solid blocks of red and blue that replicate the territory of the Confederacy and the Union are deceptive; that however polarized U.S. politics have become, geographically speaking conservative and liberal factions are far more integrated than the maps allow. The divide is more urban/rural than north/south, with substantial blue enclaves in deep red states, like Austin in Texas, or big red swaths in rural California.

The pandemic shook the etch-a-sketch up somewhat, too, as urban professionals, forced to distance socially and work remotely, found the cheaper rents and real estate outside of their cities more amenable (whether the end of the pandemic reverses that out-migration remains to be seen). And when businesses decamp from states like California to states like Texas, they bring with them work forces that tend to be younger and more socially and politically progressive, muddying things further. (Let’s not forget that Florida governor Ron DeSantis’ current feud with Disney over the “Don’t Say Gay” bill was precipitated not by the company’s management, but by its workers, whose hue and cry over what they saw as an unconscionably tepid response prompted the CEO to, one assumes reluctantly, condemn the bill). 

What I’m wondering today is: does the imminent overturning of Roe v. Wade herald a 21st century Great Migration? Except this time, instead of Black Americans fleeing the Jim Crow south, will it be liberals and progressives fleeing Republican states for Democratic ones? Possibly that seems like I’m overstating the case, but I think it will depend on just how far this SCOTUS will take the logic of Alito’s rationale, which is essentially predicated on the assertion that there is no right to privacy enshrined in the U.S. Constitution. Numerous legal experts have weighed in on this speculation, running down a list of landmark Supreme Court cases that hinged at least in part on the premise of the right to privacy: legal contraception, the abolition of anti-sodomy laws, interracial marriage, the prohibition of forced sterilization, and same-sex marriage. Even a year or two ago I would not have worried overmuch about such cases being overturned, thinking it unlikely that any high court, however conservative its composition, would be so retrograde. But this court’s conservative majority has demonstrated a shocking unconcern for even the appearance of being measured and apolitical. They’ve pretty much made it obvious that anything and everything is on the table. That goes also for the current spate of legislating being done by Republican-dominated states: injunctions against teaching the history of slavery, the banning of books, the abolition of sex education, and of course the aforementioned “Don’t Say Gay” bill in Florida, which looks ready to be imitated in other red states. Should any challenges to these pieces of legislation make it to a SCOTUS hearing, how likely do we think it is that the current bench would quash them?

Which makes me wonder: at what point does being a liberal or progressive living in a blue city in a red state become untenable? What would that do to the U.S. polity? There would be a significant brain drain from red states; businesses would be obliged to follow when their pool of qualified workers dried up; urban centers in red states would wither; the current political polarization would in fact become geographical, as the states lost their last vestiges of philosophical diversity and became more and more autonomous, no longer subject to any federal law or statute they felt like challenging before a sympathetic Supreme Court.

That might indeed be a recipe for a “traditional” civil war.

Leave a comment

Filed under maunderings, politics, wingnuttery

On Gremlins

I’ve been thinking a lot about gremlins these past few days.

Bugs Bunny and friend in “Falling Hare” (1943)

I’m teaching a graduate seminar on weird fiction this summer (full title: “Weird Fiction: Lovecraft, Race, and the Queer Uncanny”), and the first assignment is a piece of creative non-fiction describing your most terrifying fictional experience; whether in a book, film, or episode of television, what scared you so badly that it stayed with you for weeks or years? I’ve done this kind of assignment before in upper-level classes, and it has always worked well—especially considering that I post everyone’s pieces in the online course shell so they can be read by the entire class. That always leads to a good and interesting class discussion.

In the interests of fairness and by way of example, I’m also writing one. Which is where the gremlins come in.

No, not the 1984 movie. I couldn’t watch it, given that by the time it was released I’d already been traumatized by “Nightmare at 20,000 Feet.” And no, not the original Twilight Zone episode from 1963. If I’d watched that episode—which featured pre-Star Trek William Shatner gnawing the furniture as a nervous flyer who sees a gremlin that looks like a man in a monkey suit on the wing of the plane—it’s possible gremlins wouldn’t have come to haunt my imagination the way they did. The original episode is creepy, to be certain, but not particularly scary; the gremlin is too fluffy and Shatner too Shatner to really evoke terror.

The gremlin that made me terrified to sleep at night for several months was the one in the “Nightmare” segment of the 1983 film The Twilight Zone: The Movie.

The premise is very simple, and most likely familiar even to those who haven’t seen it (not least because it was parodied in a Simpsons Halloween episode): a man who is a nervous flyer to start with is on a plane during a storm. Looking out the window, he sees … something. At first he thinks his eyes are playing tricks, but then he sees it again. And then again, and each time it becomes clearer that there is a person-shaped thing out there on the wing. Panicked, he calls for a flight attendant, shouting “There’s a man on the wing of this plane!” This, of course, is impossible, and it is obvious that the flight staff and his fellow passengers think him hysterical (it doesn’t help his case that the segment begins with a flight attendant talking to him through the bathroom door as he’s inside having a panic attack). After talking himself down, realizing it would be impossible for a man to be on the wing at that speed and altitude, he accepts a Valium from the senior flight attendant, closes the window shade, and attempts to sleep.

Of course, after a fitful attempt, he can’t help himself, and he opens the shade … and sees the thing, clearly demonic in appearance now, inches away from his face on the other side of the window.

Yup. This bastard.

This was the precise moment it broke my brain and gave me nightmares for months.

Anyway, TL;DR: either through turbulence or the creature’s sabotage, the plane lurches violently. The man grabs a gun from the sky marshal and shoots at the creature through the window. The cabin decompresses and he’s sucked out halfway. He shoots at the creature again, which wags a clawed finger at him, and flies off into the night. The plane lands and the man is taken off in a straitjacket; meanwhile, the mechanics examining the plane’s engine find it torn to shreds, the metal bent and ripped and covered in deep rents that look like claw marks.

I don’t remember any of the rest of the movie, which had three other segments based on old Twilight Zone episodes. I just remember watching “Nightmare,” being terrified, and my father telling me, in reply to my shocked question, that the creature was a gremlin and that they sabotage airplanes.

Really, it’s amazing I ever willingly went back on a plane.

I’ve been thinking about that and remembering a lot of details about the summer of 1984, which is when this all happened, and trying to work through precisely why it scared me so profoundly. I’ll post that essay here when it’s written; but in the meantime, I’ve been going down the rabbit hole on gremlins and their origins as an element of modern folklore.


There’s surprisingly little written about gremlins, which is possibly a function of the twinned facts that, on one hand, they’re basically a sub-species of a vast array of pixies, fairies, goblins, imps, and other mischievous fey creatures from folklore and legend; and on the other hand, they have a recent and fairly specific point of origin. Gremlins emerge alongside aviation (something The Twilight Zone hews to and the movie Gremlins ignores). More specifically, gremlins are creatures of the RAF, and start appearing as an explanation for random malfunctions sometime in the 1920s, becoming a staple of flyers’ mythos by the outbreak of WWII.

Gremlins, indeed, almost became the subject of a Disney film: author Roald Dahl, who would go on to write Charlie and the Chocolate Factory and James and the Giant Peach among innumerable other beloved children’s books, was an RAF pilot. His first book was titled The Gremlins, about a British Hawker Hurricane pilot named Gus who is first tormented by gremlins, but ultimately befriends them and convinces them to use their technical savvy to help the British war effort. In 1942, Dahl was invalided out of active service and sent to Washington, D.C. as an RAF attaché. The Gremlins brought the RAF mythos of airborne imps to America, and was popular enough that Disney optioned it as an animated feature. Though Disney ultimately did not make the movie, Dahl convinced them to publish it with the animators’ illustrations in 1943. First Lady Eleanor Roosevelt reportedly delighted in reading it to her grandchildren.

There was also a Looney Tunes short in 1943 featuring Bugs Bunny being bedevilled by a gremlin on a U.S. airbase.

Though Dahl would later claim to have coined the word “gremlin,” that is demonstrably false, as the term was in use from the 1920s and was featured in Pauline Gower’s 1938 novel The ATA: Women With Wings. The word’s etymology is difficult to determine, with some suggesting it comes from the Old English word gremian, “to vex,” which is also possibly related to gremmies, British English slang for goblin or imp. Another theory holds that the word is a conflation of goblin and Fremlin, the latter being a popular brand of beer widely available on British airbases in the mid-century—one can imagine tales of mischievous airborne creatures characterized as goblins seen after too many Fremlins.

One of the more interesting aspects of the gremlins mythos is how many flyers seemed genuinely convinced of the creatures’ existence. So common were tales of malfunction attributed to gremlins that U.S. aircrews stationed in England picked up on the lore and many of them, like their British counterparts, swore up and down they’d actually seen the little bastards working their mischief. Indeed, one of the only academic pieces of writing I’ve been able to find on gremlins is not the work of a folklorist, but a sociologist: in a 1944 edition of The Journal of Educational Sociology, Charles Massinger writes gravely about the fact that “a phase of thinking that had become prevalent in the Royal Air Force”—which is to say, gremlins—“had subsequently infected the psychology of the American airmen in the present war.” Massinger’s article expresses concern that otherwise rational people, thoroughly trained in advanced aviation, who necessarily possess a “breadth of … scientific knowledge relative to cause and effect of stress on the fighting machine” would be so irrational as to actually believe in the existence of “fantastic imps.”

Massinger suggests that it is the stress of combat that gives rise to such fantasies, which is not an unreasonable hypothesis—war zones are notoriously given to all sorts of fabulation. But he locates the problem in the stress and fear of the moment, when split-second decisions and reactions leave no room for measured and reasoned thought, and when the sense of reality is thus short-circuited: “If pilots had sufficient time to think rationally about machine deficiencies under actual flying conditions,” he says, “it is doubtful whether the pixy conception would have crept into their psychology.” Leaden prose aside, this argument strikes me as precisely wrong. The mythology surrounding gremlins may have had its start in panicked moments of crisis while aloft, but it developed and deepened in moments of leisure—airmen relaxing between missions in the officers’ club or mess, probably over numerous bottles of Fremlins. It is indeed with just such a scene that we first learn of gremlins in Dahl’s story.

I do however think Massinger’s instinct isn’t wrong here, i.e. the idea that airmen respond to the stresses of combat and the frustrations of frequent baffling breakdowns with fantasy rather than reason. What he’s missing is the way in which mess-hall fabulation humanizes the experience; the rationality of science and technology in such situations, I would hazard, is not a comfort, no matter how long the flyers have for reflection. The mechanical dimension of air combat is the alienating factor, especially at a point in time when flight was not just new but evolving by leaps and bounds. Roald Dahl’s experience in this respect is instructive: he started the war flying Gloster Gladiator biplanes, which were badly obsolete even when they were first introduced in 1934. By the time he was invalided, he had graduated to Hawker Hurricanes, which in the early days of the war were among the most advanced fighters. By the time he was in the U.S. and Eleanor Roosevelt was reading his first book to her grandchildren, the Allied bombing campaign had already lost more planes than flew in total during the First World War, with the new planes coming off assembly lines not just matching the losses but growing the massive air fleets.

Air travel has become so rote and banal today, and catastrophic airframe malfunctions so rare, that it is difficult to remember what must have been a vastly disorienting experience in WWII: ever-more sophisticated fighters and bombers that were nevertheless plagued by constant mechanical failures, machines of awesome destructive power that were also terribly vulnerable. Bomber crews suffered the highest rates of attrition in the war—about half of them were killed in action—while there was also the constant drumbeat of propaganda about the supposed indomitability of the Allied bombing wings.

When I teach my second-year course on American literature after 1945, I always start with the poetry of Randall Jarrell; specifically, we do a few of his war poems, as a means of emphasizing how the Second World War so profoundly transformed the world and the United States’ place in it, and the extent to which American popular culture became invested in mythologizing the war. Jarrell’s poetry is a disconcertingly ambivalent glimpse of the depersonalization and mechanization of the soldier by a war machine that Hollywood has largely erased through such sentimental portrayals as The Longest Day and Saving Private Ryan. “The Death of the Ball Turret Gunner” is usually the first poem we do, and I can reliably spend an entire class on it despite its brevity. In its entirety:

From my mother’s sleep I fell into the State,
And I hunched in its belly till my wet fur froze.
Six miles from earth, loosed from its dream of life,
I woke to black flak and the nightmare fighters.
When I died they washed me out of the turret with a hose.

The final line is a gut-punch, but it’s the first two lines that establish one of Jarrell’s key themes with devastating economy. The speaker “falls” from the warmth and safety of the mother’s care, where he is loved as an individual, to the ownership of the State, where he is depersonalized and expendable—rendered inhuman even before the “black flak” (anti-aircraft fire) unincorporates his body. In the second line, the State is explicitly conflated with the weapon of war, the bomber, of which he has become a mechanism, and which functions as a monstrous womb: the parallel structure of the two lines aligns the “belly” of the airplane with the “mother’s sleep.” The “wet fur,” freezing in the sub-zero temperatures of high altitude, is literally the fur lining his bomber jacket, but also alludes to the lanugo, the coat of fur that fetuses develop and then shed while in the womb.

The bomber functions in Jarrell’s poetry as the exemplar of the Second World War’s inhuman scope and scale, built in vast numbers, visiting vast devastation on its targets—the last two of which were Hiroshima and Nagasaki—but which itself was terribly vulnerable and always in need of more bodies to fill out its crews. The machine itself was never scarce.

All of which might seem like a huge digression from a discussion of gremlins, but it’s really not: gremlins are identifiably kin to myth and folklore’s long history of mischievous “little people,” from pixies to the sidhe. That they emerge as a specific sub-species (sub-genre?) at the dawn of aviation—specifically, military aviation—is suggestive of a similar mythopoeic impulse when faced with the shock of the new. That some airmen become convinced of their existence as the war went on and the air war grew to unthinkable proportions is, I would suggest, pace Massinger, utterly unsurprising.

A Disneyfied gremlin.


Donald, Graeme. Sticklers, Sideburns, and Bikinis: The Military Origins of Everyday Words and Phrases. 2008.

Leach, Maria, ed. The Dictionary of Folklore. 1985.

Massinger, Charles. “The Gremlin Myth.” The Journal of Educational Sociology, vol. 17, no. 6 (Feb. 1944), pp. 359-367.

Rose, Carol. Spirits, Fairies, Gnomes, and Goblins: An Encyclopedia of the Little People. 1996.



The Nuclear (not an) Option

I was interviewed recently by a student of mine for Memorial’s student newspaper on the topic of the importance of the humanities.1  I’m now wishing I’d read this Washington Post column by Jason Willick, titled “Putin has a huge advantage in the kind of nuclear weapon he would be most likely to use,” beforehand. This paragraph in particular:

Russia has only a modest lead over the United States in long-range, strategic nuclear warheads regulated by the 2010 New Start treaty — 1,456 vs. 1,357 of the high-payload weapons. But when it comes to unregulated, shorter-range and lower-payload tactical nuclear weapons, according to a 2021 Congressional Research Service report, the United States has only 230, “with around 100 deployed with aircraft in Europe.” Russia has up to 2,000.

I’m not saying that having done a degree in English, philosophy, or history would automatically alert you to the absurdity of this framing;2 but there is a greater likelihood that one would, having studied such subjects, understand, respectively, its perversion of language, its moral and ethical failure, and its ignorance of historical context.

There have been a lot of commentators reaching for comparisons to the Cold War in the past week or so. Whatever the valence of such historical parallels, I think this is the first time I’ve read something that has resorted to Cold War logic. One of the benefits, rhetorically and imaginatively speaking, of the Soviet Union’s collapse was that we started again to think of nuclear weapons in singular terms—by which I mean, a reversion to the wise instinct that one nuclear warhead was one too many. I’m old enough to remember the nuclear anxiety that pervaded in the 1980s, the relief when that briefly vanished in the period spanning glasnost and the U.S.S.R.’s implosion, and then the more diffuse but still nagging anxiety attached to the prospect of bad actors trafficking “loose nukes.” 9/11 ramped up the paranoia about “suitcase bombs” whose relatively small yields would not have registered on cold warriors’ thermonuclear radar, but which served as a reminder of the irreducible violence—different from conventional munitions not in degree but in kind—of weaponized fission.

This understanding is what makes a nation like Iran developing the Bomb unthinkable. It is why nobody in their right mind shrugs off North Korea’s nuclear arsenal because it is minuscule.

And yet here we are, mere days after Vladimir Putin re-introduced the spectre of nuclear warfare—and not merely by inference!—talking about the “advantage” of numbers of nuclear weapons. When I read the passage quoted above, I had to pause and talk to the empty room in lieu of having the article’s author present for a vigorous lapel-shake. I want to ask him: what advantage, precisely, does a 2,000:100 ratio of TACTICAL NUCLEAR WEAPONS grant you? Tactical nuclear weapons range from the tens of kilotons to the hundreds; to put that in perspective, the bomb that destroyed Hiroshima was thirteen kilotons. So the United States’ paltry tactical nuclear capability currently in Europe, considered conservatively, has the capacity of one hundred Hiroshimas. But once those are used up, presumably in a back-and-forth with Russia, Putin can deliver 1,900 more!

Of course, that there could ever be such an exchange—that the initial use of any nuclear weapon, no matter how relatively modest in yield, would not in itself be a world-changing event—is absurd on its face. The relative size of the arsenals would be instantly irrelevant. In the best case scenario, everything comes to a crashing halt as the world looks on in horror and heaps recriminations on the perpetrator. In the worst case scenario, sadly the more likely, the initial use of tactical nuclear weapons rapidly escalates to a large-scale exchange in weapons measured not in kilotons but megatons.

Any attempt to euphemize or elide the singular horror of nuclear weapons needs to be met, at the very least, with the mocking spectre of Buck Turgidson (George C. Scott), the trigger-happy Air Force general from Stanley Kubrick’s masterpiece of black comedy Dr. Strangelove, or How I Learned to Stop Worrying and Love the Bomb (1964): when the President (Peter Sellers) responds to Turgidson’s exhortation to follow through on an unplanned nuclear strike on the Soviet Union, “You’re talking about mass murder, General, not war!” Turgidson says, “Mr. President, I’m not saying we wouldn’t get our hair mussed! But I do say no more than 10 to 20 million killed, tops! Uh, depending on the breaks.”

George C. Scott as General Buck Turgidson, doing his enthusiastic impression of a B-52.

It’s distressing, especially in the present moment, to come across in one of the United States’ major newspapers such an ostensibly reasonable and rational discussion of an invention that is everything but reasonable and rational. Martin Amis’ essay “Thinkability,” the introduction to his 1987 collection of short stories about nuclear weapons and nuclear war, Einstein’s Monsters, addresses precisely the fallacy of trying to make the unthinkable thinkable, and the ways in which the attempt invariably tortures the language used:

It is gratifying in a way that all military-industrial writing about nuclear “options” should be instantly denatured by the nature of the weapons it describes, as if language itself were refusing to cooperate with such notions. (In this sense language is a lot more fastidious than reality, which has doggedly accepted the antireality of the nuclear age.) In the can-do world of nuclear “conflict management,” we hear talk of retaliating first; in this world, deaths in the lower tens of millions are called acceptable; in this world, hostile, provocative, destabilizing nuclear weapons are aimed at nuclear weapons (counterforce), while peaceful, defensive, security-conscious nuclear weapons (there they languish, adorably pouting) are aimed at cities (countervalue). In this world, opponents of the current reality are known as cranks. “Deceptive basing modes,” “dense pack groupings,” “baseline terminal defense,” “the Football” (i.e., the Button), acronyms like BAMBI, SAINTS, PALS, and AWDREY (Atomic Weapons Detection, Recognition, and Estimation of Yield), “the Jedi concept” (near-lightspeed plasma weapons), “Star Wars” itself: these locutions take you out onto the sports field—or back to the nursery.

Reading Amis’ essay anew is a good reminder of the absurd rhetorical lengths the national security apparatus went to (and presumably still does in a more limited fashion) to make the use—and indeed, the very existence—of nuclear weapons seem reasonable.

They are not reasonable. Frighteningly, it doesn’t seem as though Vladimir Putin is reasonable at this moment in time either. But it’s not his numerical advantage in tactical nukes that makes me lose sleep—it’s that he might consider using even one, of any size.  


1. When I started writing this post with precisely this sentence, I then proceeded to digress into a discussion of the interview and the difficulty of abstracting from forty-five minutes of conversation a sentence or two that best sums up the value of an education in the humanities. I went on for about three paragraphs, realized I was writing a different blog post, and opened a new Word document to start over. Look forward to a post in the near future in which I go on at length about the humanities.

2. Any argument for the humanities rooted in the idea that it invariably fosters empathy and morality needs to remember the Ivy League pedigrees of “the best and brightest” of John F. Kennedy’s and then Lyndon B. Johnson’s cabinets, whose intellectual arrogance—emerging from educations at Harvard, Yale, et al. that would have required them to read the Great Books—precipitated and then escalated the United States’ war in Vietnam.



2021 in Review: My Favourite Blog Posts

Anybody who reads this blog knows I post sporadically at best. I go through some periods of great energy, and then this space can lie fallow for months at a time. Which isn’t to say I don’t frequently have ideas for posts: it’s more a question of whether the idea that pops into my head is something I can stick the landing on. I have a folder on my desktop full of half-written posts that I’ve either lost the thread on, couldn’t make work to my satisfaction, or was simply distracted from by a shiny thing, and by the time I think about returning to the post in progress, it is no longer timely.

This year was interesting: I blogged 35 times, which is not a lot, but then my posts were largely clustered in the first half of the year. I had a fair bit of momentum coming out of 2020, and was propelled through January and February by events (most notably the assault on the Capitol and Biden’s inauguration). I posted eight times in January, which is a lot for me; then my output dropped by two posts each month until April (with two posts). May and June saw five and six posts, respectively, in part because I was being ambitious and attempting to produce several posts on a handful of themes. That tapered off in July … and then nothing until November (one post), and a single December post.

I’m never entirely sure why the well goes dry on a fairly regular basis, though as I said, often it’s not so much about the writing as about the finishing (I still have sitting on my desktop a post about galactic empires that I do want to finish). Sometimes it’s reflective of how productive I’m being otherwise, but not always; sometimes my blog is a useful procrastinatory device, something that makes me feel productive when I should be directing my energies elsewhere.

At any rate, I thought I might do something I’ve seen other blogs do, which is a year in review with a list of the best/most read/favourite posts. Given that my readership here is pretty tiny, it would be a bit silly to list my most popular posts. So I’m going with my personal favourites: which is to say, the posts I was proudest of, and which I felt managed to get closest to the thoughts that spawned them.

I’m going with my top five, though I’m not ranking them, just listing them in chronological order.

January 5: The (Ironically) Monarchical Presidency

It’s a little odd that, of these top five, two of them deal with the topic of monarchy. This post is about how the American republican system of government—developed specifically as a revolt against the tyrannical British crown—has ironically ended up imbuing the American chief executive with more king-like qualities than the prime minister in a parliamentary system. This contradiction, I point out, had become all the more glaring in the Age of Trump, whose authoritarian tendencies exacerbated the monarchical elements of the Office of the President.

I will also note that I posted this entry the day before the January 6 insurgency.

January 22: The Reality of QAnon  

As I have noted many times on this blog, I wrote my doctoral dissertation on conspiracy and paranoia in postwar American fiction, film, and popular culture. Through the successive waves of Trutherism, Birtherism, Glenn Beck’s chalkboard rants, and the various paranoid fantasies spun by Tea Partiers, I have thought about dusting off the thesis … only to be overcome with the sensation of probing the nerve of a tooth.

QAnon ratchets all of that up to eleven. And yet I found myself writing this post, which I think does a pretty good job of breaking down its important elements. And I’ve watched the HBO docuseries Q: Into the Storm and read the book The Storm is Upon Us by Mike Rothschild … and now I find myself writing about it for another project.

It makes me feel like Al Pacino in Godfather III: “Every time I try to get OUT … they keep dragging me back IN.”

March 12: In Which I Mark the One Year Anniversary of the Pandemic With Some Thoughts on Monarchy

My second of two posts on monarchy was prompted, perhaps counter-intuitively, by my profound indifference to the various plights of the British royal family. The occasion of this particular bout of indifference was the fallout from Meghan Windsor née Markle’s interview with Oprah Winfrey.

But if you’re so indifferent, why go to the effort of writing a blog post, you may ask? Well, imaginary interlocutor, I started pondering precisely why hereditary monarchy has such a powerful hold on the contemporary imagination. And that led me down a rabbit hole of thought that proved quite interesting.

Also, it gave me the opportunity to write the sentence “Piers Morgan was the result of a lab experiment in which a group of scientists got together to create the most perfect distillation of an asshole.” So there’s that, too.

May 31: History, Memory, and Forgetting Part 2: Forgetting and the Limits of Defamiliarization

I made two attempts at sustained deep dives into large topics this summer. The first was “History, Memory, and Forgetting,” and the other was a series of posts revisiting the concept of postmodernism (“Remembering Postmodernism”). I have to say, I was very pleased with what I produced on both fronts, and annoyed with myself that the postmodernism series got bogged down and remains unfinished. (It’s for that reason that “Remembering Postmodernism” is not represented here).

To be honest, I think all three of my “History, Memory, and Forgetting” posts deserve a place here, but because they’re sort of a unit, I’ll settle for the one I’m proudest of, which is about how the erosion of memory about the Holocaust—through time, distraction, and the death of survivors—has denuded historical awareness and created a present situation in which such terms as “Nazi” and “fascist” have lost meaning in the popular imagination.

June 29: Tolkien and the Culture Wars

Early this summer, I was alerted to a backlash against the Tolkien Society’s Summer Seminar—an annual conference in which academics present papers on a theme chosen by the society. Past themes have included Tolkien and Landscape, the Art of Tolkien’s World, and so on. This year? Tolkien and Diversity. Which prompted a not-unpredictable backlash in conservative circles, especially after the paper titles were posted. Though it hardly reached “critical race theory” levels of vitriol, there was an awful lot of angry talk about the “woke mob” coming to tear down J.R.R. Tolkien. Though most of this happened on message boards and social media, it did reach the lofty heights of the National Review.

A few days after I wrote this post, I attended the (virtual) Tolkien Society Summer Seminar and watched every single paper. Guess what? Tolkien’s legacy survived. And guess what else? Everyone presenting at that conference loves Tolkien. No one wanted to tear him down. The very thought would horrify them. The biggest fallacy of “anti-woke” thought—which, really, stretches back to the culture wars of the 90s and Harold Bloom’s castigation of what he called “the school of resentment”—is the idea that people who challenge the traditional canons of art and literature and offer feminist, queer, anti-colonialist, anti-racist, or other such readings do so because they hate and resent the genius of such canonical writers as Shakespeare, Milton, or Wordsworth (an irony here being that Tolkien has never been included in “the Western Canon”). While there may be genuinely antagonistic readings of classic authors, most of the time people—like those at the Tolkien seminar—are finding spaces in their work in which they see themselves reflected.

And in the end, it’s a testament to Tolkien’s genius that queer graduate students can find themselves in the work of an ultraconservative Catholic. To those who lambasted the “Tolkien and Diversity” seminar, I ask: how is that a bad thing?

On second thought, don’t answer.



2021 in Review: Unreal Nation

Happy new year, everyone. Did you all have a good New Year’s Eve? Stephanie and I celebrated by ordering pizza and watching Don’t Look Up on Netflix—a film that is at once simultaneously so hilarious and so depressing that I found myself wondering whether watching it on the last day of 2021 was a terrible idea or entirely appropriate … though I suppose it could be both.

Jonah Hill, Leonardo DiCaprio, Meryl Streep, and Jennifer Lawrence in Don’t Look Up

I hate New Year’s Eve as a night of celebration, and I always have—even when I was younger and more disposed to party like it’s 1999. A good friend of mine had the perfect summary of why NYE is so terrible. There are two days a year, he said, when you feel the most societal pressure to enjoy yourself: your birthday and NYE. Provided you have friends and/or family who love you, birthdays are great fun because they’re all about you. On New Year’s Eve, by contrast, it’s everybody’s birthday. Which is possibly why the more excessive the celebrations, the more they smack of desperation.

Normally I would have prepared a nice meal as a reluctant nod to the day, but I got back from my parents’ place in Ontario five days ago, which means I’ve been quarantining. Which also means I can’t leave the house until tomorrow to buy groceries, and while we don’t lack for the basics, there isn’t much with which to make anything more than, well, the basics.

So, pizza. Which, given my antipathy to this particular holiday, seemed even more appropriate than anything requiring effort on my part. It might have to become the new custom.

I’ve been seeing a lot of 2021-related memes on social media, most of which involved (1) WTF? and (2) warnings not to jinx the coming year with high expectations, as we did at the start of 2021. But when you think about it, we’ve been exiting each year with a snarl and a backward-facing middle finger since … well, 2016, haven’t we? Which makes many of us want to blame Donald Trump for this series of successively sucky years, but if we haven’t yet collectively understood that he’s not the architect of our societal woes, just the bellwether, then things are gonna keep sliding downhill.

I mean … they probably will anyway, but the big reason for 2021’s unreasonably high expectations was the tacit assumption that with Trump out of office, things would inevitably get better.

And for a time, it looked like they were! And then … well, I was about to say that reality reasserted itself, but that wasn’t the biggest problem with this past year, was it? It would be more accurate to say that unreality reasserted itself. The Big Lie, anti-vaxxers, the hysteria over “critical race theory” and other bogus culture war non-issues, January 6 trutherism, and of course the ongoing state of climate-change denial. Reality has never been the problem, except insofar as accessing it without having to run a gauntlet of disinformation is now more or less impossible.

This profound sense of frustration and disconnect is why Don’t Look Up has landed so hard on its viewers. I laughed throughout the film, because it’s hard not to—it is comedy, based in the wilful obliviousness and ignorance of people so self-interested, so elementally selfish, that they are congenitally incapable of recognizing the very idea of a collective good. Many times through the film, I found myself thinking, “Wow, this is unsubtle.” But then … so is Tucker Carlson. So is QAnon. So is Donald J. Trump, and so are his legions of enablers and imitators. We live, sad to say, in profoundly unsubtle times.

Or as Stephanie put it midway through the movie, “It’s really depressing when there’s no daylight between satire and the thing it’s satirizing.”

So … happy 2022? Let’s … be careful out there.


Some Christmas thoughts about 2021

There’s a moment that happens on occasion in American football, when a running back, having been handed the ball, breaks through all the defenders and sprints into the open field, with nothing between him and the end zone.

And sometimes, rather than making a touchdown, he gets blindsided by a linebacker he didn’t see coming.

That’s what 2021 has been like, especially these past few months.

I was vaccinated in June, and then again in August. We resumed in-person classes at Memorial University in September. Like many people, I felt a massive sense of relief—the pandemic finally seemed to be heading into the rearview mirror, and things were returning to something resembling normalcy. Except … yeah, not so much. I’m writing this at my parents’ house on Christmas day, but we won’t be properly celebrating the holiday until tomorrow because my niece and nephew were possibly exposed to COVID on a school bus, and thus need to quarantine for another day. I will myself have to quarantine for five days on returning to St. John’s.

We should be in the clear, running for the end zone with open field, but we keep getting tackled.

This is what I told all my students earlier this term: I wanted them to know it wasn’t just them, it was a general malaise that I was also experiencing. I told them they weren’t alone in feeling anxious, discomfited, or just generally off. I told them that if it came down to it, if they were struggling to get their classwork done, that there was no shame in dropping my course … or any other course they were taking. I told them that I would do whatever I could to help, provided they kept me in the loop. You don’t need to give me the gory details, I said—just tell me you’re having a rough go of it, and that’s enough. We’ll work something out.

I have to imagine there are those who will say I’m catering to the fragility of the snowflake generation in making such accommodations. Anyone having that thought can fuck right off. What saved me from breakdown these past few months was the fact that I had such extraordinary students: as I said to all my classes at the term’s end, they’re the reason why I have hope for the world. As a GenXer, irony is my default setting; for those inheriting the catastrophes of climate change and resurgent fascism, earnestness is their baseline. What would have been radically uncool in the 1990s is quite possibly what will save us all.

On Christmas day, I find myself thinking about all the best parts of the past year. My students are, as they often are, Exhibit A.

I taught a graduate seminar in the winter term that, despite happening on Zoom, was hands down my favourite class ever. It was called “The Spectre of Catastrophe,” and looked at 21st-century post-apocalyptic narratives. Even though the subject was perhaps a little too on the nose for our current situation, it was the most fun I’ve had as a teacher, ever (and that’s saying a lot, as my following comments will make clear). The one advantage classes taught over Zoom have over in-person teaching is the comments students can write as the lecture/discussion unfolds. In this particular class, there was always an ongoing conversation scrolling up the side of the screen. I used the ten-minute breaks each hour in our three-hour classes to read through the lively and often hilarious discussions happening in parallel to the actual class.

Getting back into the classroom this past September felt so very, very good. It didn’t hurt that I was teaching a roster of courses that were, for me, an embarrassment of riches: a first-year class called “Imagined Places,” which gave me the (gleeful) chance to teach Tolkien (The Hobbit), Ursula K. Le Guin (A Wizard of Earthsea), Sir Terry Pratchett (Small Gods), Neil Gaiman (The Ocean at the End of the Lane), and Jo Walton (The Just City) (for the record, everybody needs to read Jo Walton). I also taught, for the third time, “Reading The Lord of the Rings.” I had five, count ‘em FIVE, students in LOTR who were also in my fourth-year seminar “The American Weird: Lovecraft, Race, and Genre as Metaphor.” (I always measure my success as an educator by the number of students who take multiple classes with me. I assume it means I’m doing something right, though it’s also possible they’ve just got my number and know what it takes to get a decent grade.) Given that the Tolkien and the Lovecraft courses were back-to-back, I had something like an entourage following me from the chemistry building in which I taught LOTR to the Arts building that was the home of the weird.

I called my Lovecraft students “weirdos,” because, well, obviously. It only offended them the first time I called them that.

Even with a dream slate of classes, this fall semester was hard. I am congenitally incapable of asking for help, or for that matter of recognizing that I need help until after the fact; these past few days of decompressing at my parents’ home have been invaluable for relaxation and reflection. I have also realized I need to find my way to a therapist in 2022.

But the moral of the story is that the students I worked with this past year kept me sane, and gave me hope. So on this day of prayer (even for those as atheistic as myself) I am grateful for you all. And given how many of you have signed up for yet more Lockett pontification next term, I can only assume I’m doing something right.

Or you’ve got my number. Either way, it’s all good.
