
A Pride Month Post

Pride Month is almost over. I might be a basic cishetero dude, but I love Pride—I love the celebration, the camp, the colour, the music (dancing in the London, Ontario Pride Parade to “I Am What I Am” blasting from the speakers remains one of my fondest memories). I love the love. Some of this has to do with me being a liberal lefty bleeding heart social justice type, but then one of the reasons I’m a liberal lefty bleeding heart social justice type is because many of the people in my life I have loved, who have been dearest to me, who have shaped my worldview, have been queer. I want to take this space to say to all of my queer friends: I love you. My world is a better place for you being in it.

Happy Pride to all my favourite monsters.

This past Monday morning I was at the Starbucks closest to campus, writing my daily entry in my journal. These past several months, I’ve been making a point of writing something every day—even if it’s just a paragraph noting the weather.[i] On this particular morning, I had a lot on my mind, as I had been reading a bunch of articles about the Texas GOP convention that took place in Houston last weekend. It was all through my newsfeeds, especially in regard to the resolution that Texas will hold a referendum no later than 2023 on whether or not to secede from the union; and the addition to their platform asserting that homosexuality is “an abnormal lifestyle choice.”[ii]

I’d read a long thread on Twitter discussing the secession referendum and whether (a) it was legal; (b) it was even feasible; and (c) if refugees from the newly founded Republic of Texas would be supported by a Democratic administration in their relocation. Twitter being Twitter, there was also a lot of sentiment advocating letting Texas go, anticipatory schadenfreude predicting the crash of their economy, and just a lot of general dunking on the stupidity of the whole thing. I had my own thoughts about how Texas Republicans would be making a big bet on being correct in their climate change denial in a state that already has areas very nearly unlivable at the height of summer, as well as a long gulf coast uniquely vulnerable to extreme weather events (to say nothing of a decrepit electrical grid).

Anyway, because I generally avoid tweeting except for the rare occasions when I think of a witty Wildean aphorism, I was sorting my thoughts out in my journal when, looking idly around the Starbucks, I had an odd moment of defamiliarization.

The Starbucks in question is not my favourite coffee shop to work in;[iii] evil corporation aside, I’m generally amenable to Starbucks so long as they at least have the feel of a comfortable café—soft lighting, warmly coloured décor, that sort of thing. This one is new, having opened just a few months ago, and is unfortunately very bright and austere, bordering on institutional. But it is also the coffee closest to campus,[iv] so I find myself there sometimes.

I had paused in my writing for a moment and was staring into the middle distance, lost in thought. In that moment, Gloria Gaynor came on the sound system—“I Will Survive,” a song that is impossible not to bob along with. And a staple of the Gay Pride soundtrack. Looking around, I suddenly became aware of all the Pride Month paraphernalia decorating the place: there was a big Pride Progress flag behind the counter, another over the chalkboard at the front, crossed with a Trans flag; a row of Starbucks cups was arrayed on the counter in front of the espresso machines, each with a piece of tissue paper corresponding to one of the colours from the Pride flag. There was also the staff themselves, many of whom were, if not actually queer, then certainly rocking the aesthetic.[v]

To be clear: I hadn’t not noticed any of this before. I’ve been there at least once or twice since June began and the profusion of Pride décor went up, and I seem to recall they’ve always had a Pride flag at the front of the store; I had also previously taken note that the staff would not be out of place behind the bar or on the dance floor of a gay club that would be way too fashionable for my basic self. But that was the nub of my moment of defamiliarization: I had noticed all these things without noticing them. As I sat there listening to Gloria dunking on her ex, I was taken out of the moment and thought about how, when I was the baristas’ age, this kind of décor would have been limited—even during Pride Month—to specific spots on campus and the “gay ghetto” (as it was then called) of the Church & Wellesley neighbourhood in Toronto. That it was unremarkable that it should be on display in a corporate franchise coffee shop[vi] in St. John’s was, it suddenly occurred to me, remarkable … or remarkable to someone who grew up in the 1980s while attending a Catholic high school, some of whose teachers were quite vocal in their opinion that HIV/AIDS was divine punishment for homosexuality.[vii]

It was an odd moment: what should have been a warm glow of vicarious pride in suddenly seeing cultural progress that had snuck up on me sat in stark contrast to the recent full-bore attacks on queer people epitomized in the Texas GOP’s delineation of homosexuality as “an abnormal lifestyle choice.” This specific language is telling, as it hearkens back to an earlier era of anti-gay rhetoric, when it was a point of homophobic common wisdom that being queer was purely a matter of choice. The Texas GOP’s phrasing effectively elides at least thirty years of hard-won progress, whose signal event, Obergefell v. Hodges establishing federal recognition of same-sex marriage, felt less like revolutionary upheaval than simple confirmation of prevailing attitudes.[viii]

One of the frequent rhetorical recourses made by people opposed to social justice progress—or those who profess to be all about social justice but worry that it is progressing too quickly—is to point to all the obvious advances that have been made in terms of feminism, gay rights, cultural diversity, civil rights, and so forth, as evidence that the current crop of “social justice warriors”[ix] and activists are overreacting, making mountains out of molehills, or otherwise being disingenuous in pretending society hasn’t improved over the past decades and centuries. “So what you’re saying is there’s no difference between today and the Jim Crow era?” is how a typical argument attacking Black Lives Matter might go.

One hears such a tone, for example, in critiques of The Handmaid’s Tale when its misogynist dystopia is dismissed as alarmist fantasy; but then a SCOTUS leak reveals the unedited version of Samuel Alito’s Gileadean thinking about abortion … and then, somewhere halfway through the writing of this post, the Court strikes down Roe v. Wade with nary a word of the opinion edited. For the better part of my adult life, but especially since Obergefell, when the rainbow flags come out (so to speak) in June there have been predictable harrumphs wondering why Pride is even necessary anymore. Haven’t those people got all the rights now? Why do they need this display?

But then the reactionary Right pivots with the agility of a very agile thing from anti-CRT campaigns in schools to Florida’s “Don’t Say Gay” bill and the broader attempt to associate any mention of LGBTQ people, history, lifestyles, or art and literature with the “grooming” of children. Actually, I shouldn’t say this was a pivot “from” anti-CRT campaigns, given that it’s not as if these people have given up on excising mention of slavery and systemic racism from curricula; it’s more a matter of a broadening of the battlefront, with cynical operators like Christopher Rufo setting the plays. What started as the targeting of trans people has, not unpredictably, exploited the momentum of the Right’s broader culture war to open fronts against what had been assumed to be settled issues—gay rights in particular.[xi] It is telling that none other than Donald Trump Jr. rebuked the Texas GOP for refusing, presumably in accordance with its plank about homosexuality being “an abnormal lifestyle choice,” to allow the Log Cabin Republicans to set up a booth at the convention. “The Texas GOP should focus its energy on fighting back against the radical democrats and weak RINOs currently trying to legislate our 2nd Amendment rights away,” he said, “instead of canceling a group of gay conservatives who are standing in the breach with us.” That this weak tea was almost certainly the noblest sentiment Don Jr. has ever voiced says as much about the Frankenstein’s monster the MAGA movement has become as it does about Trump’s most toxic spawn.

All of which is by way of saying that, however much progress we as a society have made, there is never cause for complacency. Progress is never inevitable; though there is a tendency to think that it is,[xii] I’d be interested to see how that assumption breaks down along cultural and national lines, and whether those who have benefited most from prosperity, and from the attention to cultural issues that prosperity often affords, are the most complacent about the moral arc of history bending in their favour. I started writing this post earlier in the week; I picked it up again on Friday and worked on it at one of my favourite downtown spots.[xiii] Whereas Monday morning I’d looked up, lost in thought, and noticed the Pride paraphernalia, on Friday I looked up at one of the televisions over the bar to see the news that Roe v. Wade had been struck down. It was an odd set of bookends to the work week.

As dire as things seem right now—and they are dire, from the worsening climate to the revanchist culture war to the not-zero possibility that Pierre Poilievre could be our next Prime Minister—I do still have hope. I have hope because my local Starbucks, in spite of its unfortunate austere design choices, was decked out more queerly than most gay bars I went to in my 20s. I have hope because my students have hope: this generation is earnest and determined, and they have no patience for our prevarications. I laugh, often out loud, when I read shrill screeds about how youth are being indoctrinated with woke ideology by postmodern neo-Marxist (to coin a term!) professors like me. Dude, I want to say, they’re not the ones being indoctrinated. I am.

With hope.

Happy Pride.


 

NOTES

[i] This on the premise that it takes my attention away from the computer and my phone and their endless doomscrolling, and also that it will help center me (especially if I get to it early in the day) and make me organize my thoughts. All this has been moderately successful, if for no other reason than that when I start missing days, it functions as a bit of a canary in the coal mine for my mental state.

[ii] In addition to secession and their homophobic throwback to the heyday of Jerry Falwell, the Texas Republicans also declared that the 2020 election was corrupt and Joe Biden therefore an illegitimate president; called for the repeal of the 16th Amendment, which established a federal income tax; declared that students should be required to “learn about the humanity of the preborn child”; and called for the abolition of any and all abortion, the repeal of the 1965 Voting Rights Act, and the reinstatement of school prayer. John Cornyn, the senior Senator for Texas, was booed during his speech because he had indicated he was open to voting for the minimal gun control bill currently before Congress; and perhaps most remarkably, Congressman Dan Crenshaw and his entourage were attacked in the corridors for … being Trumpy but not Trumpy enough, I guess? Crenshaw, who was a Navy SEAL and famously wears an eyepatch because of a wound he received in Afghanistan, not long ago released a genuinely bonkers political ad featuring himself in full military regalia parachuting out of a plane to attack “antifa and leftists.” He has also tied himself in logical knots trying to be a “reasonable” conservative while never saying a bad word about Trump. Apparently this wasn’t enough for his Texas detractors, who harangued him as an “eyepatch McCain,” an epithet they took from Tucker Carlson.

[iii] Coffee shops and pubs will always rival both my home and campus offices as preferable work spaces. One might think that now, in contrast to undergrad and grad school, when I do in fact have actual offices (and very nice ones, too!), such public spaces as a local café or pub wouldn’t hold quite the same allure. But they do, for me at least; I like the white noise of these places, and the fact that people-watching can be very zen when you’re paused in thought. I have long been that person you see ensconced in a comfortable seat or booth with a latte or a pint, scribbling in a Moleskine or tapping away at my laptop (though usually the former—working longhand without internet distraction is another point in the coffee shop’s favour).

[iv] For all that Memorial University has to recommend it, it weirdly lacks for cafés and pubs on campus. This, I feel, is detrimental to the educational process. My undergraduate degree was massively enhanced by the endless discussions and arguments I had with friends at the Absinthe Pub of Winters College at York University; and I’m reasonably certain I wrote at least a chapter’s worth of my dissertation at UWO’s Grad Club.

[v] When I shared this story with my students in my grad seminar later that morning, one of them helpfully said, “Oh, don’t you know? Everybody who works at Starbucks is queer.” I’ll assume this is at least a slight exaggeration and there are a handful of closeted heterosexuals making lattes, but I’ll take his authoritative word on the subject. 

[vi] Not that corporate coffee franchises should be antithetical to queer culture: one of the rabbit holes my mind went down in the immediate aftermath of this reverie was the memory of the infamous steps in front of the Second Cup on the corner of Church & Wellesley in Toronto. On pleasant days, the steps would be full of queer folk (with a few straights like myself occasionally thrown in), drinking coffee, chatting, watching the street, as much spectacle as spectators. To anyone who’d ever been on Church Street, those steps were instantly recognizable in the series of Kids in the Hall sketches called, well, “Steps.”

Curious if the Second Cup was still there, I did a quick Google search and discovered that the steps had been removed (how, I don’t know) by the owners to “discourage loitering,” and the café itself had departed in 2005—but that a new Second Cup took up residence just down the block six years later.

[vii] No, seriously.

[viii] Which is not, I hasten to add, to downplay its importance or the magnitude of the victory it represented; but the fact that it was met more with a shrug from the balance of straight people than with outrage is pretty extraordinary in itself.

[ix] This is by no means an original observation, but you’ve gotta hand it to the forces of reaction that “social justice warrior,” or SJW for short, was turned so quickly into a pejorative, and that anything and everything related to social justice became so resolutely understood as the precinct of the shrill and the sanctimonious, as well as synonymous with the suppression of free expression. Ditto for critical race theory (or CRT—is there power in reducing something to a three-letter acronym?), and the more recent retread of Anita Bryant’s association of homosexuality with pedophilia, this time with the handy epithet “groomer” attached to anyone committing the sin of admitting to queer identity in the presence of children.

[xi] It would be entertaining, if the circumstances weren’t so distressing, to watch such gay conservatives as Andrew Sullivan—who have taken “gender critical” positions on trans rights—suddenly gobsmacked to find their own subject-positions under attack. Sullivan is a particularly notable example: he has become increasingly strident on what he characterizes as the tyranny of pro-trans discourse, and recently had Christopher Rufo on his podcast, where he agreed with many of Rufo’s attacks on critical race theory. More recently, however, he has thrown up his hands and said (figuratively) “Whoa, whoa!” in response to Rufo’s latest rounds of anti-gay rhetoric.

[xii] Personally, I blame Hegel.

[xiii] Blue on Water. During the day, the bar side is quiet and comfortable, and they have a pretty good menu. And, as I recently discovered, they serve excellent coffee, with good breakfast choices.


Filed under maunderings

Gremlins redux

Two blog posts ago I went on at length about gremlins—both in general, and specific to The Twilight Zone episode “Nightmare at 20,000 Feet.” That episode comprised my most terrifying fictional experience, something that stuck with me for years. My students’ first assignment in my weird fiction class this summer is to write a piece of creative non-fiction describing their most terrifying fictional experience. As I said in that post, I was planning to write my own, by way of example and in the name of fairness—given that we’ll be sharing everyone’s pieces. And I said I would post it here.

So here we are. As might have been obvious from my last gremlins post, this has become something of a very interesting and serendipitous rabbit hole for me, at once touching on a handful of my current research interests and jogging a lot of memories that I want to explore. Possibly this turns into a larger project, possibly it becomes an avenue of self-exploration, possibly both.

So what I’m saying is, don’t be surprised to see more posts here vectoring off from this line of inquiry.

One caveat: I made a point of not consulting with my parents as I wrote this. What I’ve related in this essay is purely based on my memory of the summer of 1984, and as such might be wildly off base. I’ll be interested to see what my Mom and Dad have to say and whether their own memories are at all consonant with mine. If not, I will write a follow-up.

Meanwhile, without further ado …

A man sees something on the wing of the airplane, a vague shape in the rain and lightning. It’s impossible that anything alive could be out there, at this speed and altitude. But he sees it again. He’s a nervous flyer; perhaps his mind is playing tricks. But then he sees it again: person-shaped but inhuman, its intent obviously malevolent as it tears into the wing.

He is the only one who sees it. He cries out to the flight attendants, to his fellow passengers in panic, but they think he’s crazy. He wonders himself if he’s losing his mind. He lets himself be calmed down, he closes the window shade, he tries to sleep. But soon enough, he can’t help himself. He opens the shade, and there is the thing, a creature that looks like a demonic goblin, inches from his face on the other side of the window, staring back at him with something like sadistic glee.

It’s a gremlin, of course, a folkloric imp that emerged in the age of flight, invented by RAF aviators in the years before WWII. Heir to a long lineage of mischievous pixies and fey folk, the gremlin is nevertheless a modern creation, blamed for the frequent and seemingly random malfunctions that bedeviled airplanes during the frantic steeplechase of flight technology between the wars. Gremlins weren’t just the comic antagonists of tall tales told by pilots and crew on airbases between missions—enough airmen were genuinely convinced that gremlins were real, swearing up and down that they’d seen the little bastards on their wings, that concerned psychological papers were written.

Roald Dahl’s first novel was about gremlins. Bugs Bunny tangled with one in the Looney Tunes short “Falling Hare.” Like their folkloric predecessors, gremlins were given to mischief and occasional cruelty, but were mostly depicted as annoyances and not threats.

For a time in my childhood, gremlins were a source of abject terror for me.

***

When I think of gremlins, I think of the summer of 1984. The movie Gremlins was released that June, but I never saw it. I still haven’t. By the time it hit theatres, I’d already been terrified beyond what was strictly reasonable by the gremlin in The Twilight Zone: The Movie, which my father rented for us to watch some time after its release in 1983 and before Gremlins came out. The fourth of the anthology film’s four segments was “Nightmare at 20,000 Feet,” in which a nervous flyer sees a gremlin on the wing of his plane. The moment when he opens the shade to see the demonic creature staring back at him haunted me for years. When I lay in bed at night and the scene came to mind, I hid my head under the covers—trapping myself, for suddenly I couldn’t shake the idea that the gremlin would be perched there, staring at me, if I lowered them.

Lest you assume these were the infantile fears of a young boy, let me clarify: these were the infantile fears of a twelve-year-old.

***

A vicious heat wave hit our Toronto suburb in 1984. It coincided with the Olympics, which ran from the last week of July into August. It was the kind of heat that pervades my childhood memories of summer: a baking sun in a clear sky, air that was somehow stifling and humid while also drying the grass to brittle blades that abraded bare feet. Even basements were no refuge.

Our house had no air conditioning, so my father brought the television set outside onto the side deck where there was at least some shade and, occasionally, a breeze. This arrangement appealed to my mother: puritanical about not spending summer days indoors watching the tube, she also hated missing even a moment of Olympic coverage. Because the 1984 Games were held in Los Angeles, the time difference meant our Olympics viewing stretched into the darkening evening. We ate dinner on the deck and watched athletes run, swim, hurl, and paddle. Neighbours came over, bringing beer and wine and snacks. An ongoing PG-13 bacchanal took up residence on our deck and spilled out onto the yellowing grass of our corner lot as the neighbourhood kids staged our own Games.

***

The 1984 Games were notable for the absence of the Soviet Union and most of the Eastern Bloc. They boycotted Los Angeles in retaliation for the United States’ boycott of the 1980 Moscow Games, which had been in protest over the Soviet invasion of Afghanistan.

It was petulance, said one neighbour. Hypocritical, said another. Someone made an off-colour joke I didn’t understand about women’s weightlifting being fair this time. I didn’t grasp the nuances of the politics, but I knew that the Soviet absence tinged everything with vague unease. The 1984 summer Olympics marked the precise midpoint of Ronald Reagan’s presidency and the renewed belligerence of the Cold War. Fears of nuclear conflict that had smouldered like banked coals during the détente years of the 1970s leapt again into open flame. Pious sages of geopolitics kept inching the Doomsday Clock closer to midnight. I was in some ways a literal-minded child and did not quite understand that the clock was metaphorical. Every time its minute hand crept forward, I could not sleep for days afterward.

The Day After, a terrifying depiction of a nuclear exchange, aired in late 1983. It showed the effects of multiple warheads striking in the American heartland, and the immediate aftermath as people suffering from severe radiation poisoning struggle and fight over food and water. The images of the mushroom clouds and their devastation were the most graphic ever portrayed at the time. A disclaimer at the end told the audience that, however ghastly the film’s depiction had been, it was mild in comparison to what the reality would be. With over one hundred million viewers, it was the most-watched television film in history up to that point.

I did not see it. I didn’t even know it existed until I heard about it at school from classmates who had watched it. It had been recommended that parents watch it with their children; guides were made available to help with the discussions afterward. But it was not mentioned in my house and I was somehow smart enough not to ask why.

Whatever sleep I lost worrying about the Bomb, my mother’s nuclear anxieties contained multitudes.

***

Because serendipity is like gravity, that summer one of the television channels aired old episodes of The Twilight Zone. Every night when Olympics coverage ended, when most of the neighbours had gone home, while the lawn and the hedges and the asphalt of the street sighed the stored heat of the day into the darkness, we switched over to the slow cascading fall of the theme music and the studied portent in Rod Serling’s voice.

With one exception, I don’t remember which episodes we watched. I do remember my parents waxing on about episodes we didn’t see. “The Monsters Are Due on Maple Street” was a favourite of theirs. To this day I haven’t seen it, but the plot as they related it stuck in my mind: a quiet suburban neighbourhood like ours suffers an inexplicable blackout on a summer night; the neighbours congregate in the street, uneasy but not alarmed until the lights go on at one house. Suspicion starts to set in: what’s special about that person’s house? Other houses get power, then lose it, until the previously friendly neighbours descend into paranoid warring factions. The episode ends with aliens in a ship overhead, who have been manipulating the power grid, saying, Look, we don’t have to attack them, these humans will turn on each other.

But because serendipity is the gravity of my life, we did see “Nightmare at 20,000 Feet.” I can’t help but think that if I’d seen the original episode first, the spectre of a gremlin on a wing wouldn’t have stuck so deep in my brain’s fear centers. It featured a pre-Star Trek William Shatner as the nervous flyer, demonstrating that scenery-chewing was always his first and best talent. The gremlin itself looked like a man in a monkey suit, less a demon than an ugly teddy bear. Creepy but hardly terrifying.

But in the context of that uncanny summer fortnight, with the memory of the movie gremlin colouring its black and white predecessor with shades of fear out of space, what was otherwise risible had the effect of driving my original horror deeper into my mind. It became existential. Each night when we changed the channel over and the theme music played I felt ill, and yet could not look away. Then one night, the show began with Serling’s narration: “Portrait of a frightened man: Mr. Robert Wilson, thirty-seven, husband, father and salesman,” showing William Shatner (whom I did not then recognize as Captain Kirk) slouched in his airplane seat. We learned he had recently spent time in a sanitarium—that he had been there because he had a nervous breakdown on a flight “on an evening not dissimilar to this one.” On this night, however, Serling tells us, “his appointed destination … happens to be in the darkest corner of the Twilight Zone.”

I could not look away, even as part of me knew just how much this campy earlier iteration was about to make the lingering effects of the later one indelible. I have little memory of the actual episode. What I have is sense memory: the night’s heavy, suffocating humidity, the creak of crickets in the hedge, the smell of the grass, the bilious weight in my belly, and the dread of knowing I would soon have to try and sleep in my dark and stuffy room.

***

In German, “uncanny” is unheimlich, literally unhomely, that which makes you feel not at home. Those two weeks that summer were dislocating: I was not at home in my home, and my home itself felt adrift, untethered. Or perhaps what I felt was the dread certainty that it was always untethered on the world’s currents, and that the feeling of safety remote from the larger world was the illusion—that there was always a gremlin on the wing, marking time in missile silos and in the minds of world leaders. RAF airmen invented gremlins in part to resolve a contradiction: flight technology was advancing by leaps and bounds but left them uniquely vulnerable while aloft. What more unthinkable technology has existed than nuclear weapons? Perhaps for me the gremlin was not a narrative comfort as it was for the aviators, but the certainty of the technology’s malevolence.

***

The Twilight Zone was in many ways the quintessential Cold War TV show, as it embodied the nagging, unhomely sense of something being not quite right, which was the constant undercurrent of the bland suburban order that America was so desperate to convey to itself. It is no surprise that so many of the show’s episodes are set in such innocuous suburbs as Maple Street.

My father, who grew up in just such a suburb, loved The Twilight Zone when he was my age; he told me that he watched the episodes eagerly when they first aired. He was twelve when the show premiered in autumn 1959. He was a different twelve-year-old than me, apparently—I tried to imagine actually enjoying something that unsettling, actually looking forward to seeing what each new episode would bring, but that sensibility was still alien to me. In a few short years I would learn to love horror when I discovered Stephen King and tore through his novels at breakneck speed. But at twelve I had not yet grown out of the nausea the uncanny inspired in me. That two-week stretch of an otherwise idyllic summer was a perfect storm of subtle dislocations: the heat wave, the outdoor television viewing, the constant low-grade party atmosphere, the hours and hours of Olympic coverage, all with the Soviet absence drawing attention to the Cold War’s constant menacing background hum.


Filed under maunderings

Dystopian Thought for the Day

It occurs to me that the current state of the U.S. Supreme Court is like climate change … which is to say, it has been ongoing for several decades and visible to anyone willing to see it developing, but it has not prompted anything but the most tepid of responses. And now that we’re experiencing the judicial equivalent of massive flooding, it’s already too late.

(I can’t decide whether this analogy is ironic or appropriate, considering this court is likely to do everything in its power to curtail efforts to reverse climate change.)

I remember reading Angels in America for the first time over twenty-five years ago, and coming on the scene in which the notorious lawyer and fixer Roy Cohn—now most famous for having been Donald Trump’s mentor in the 1970s—takes the closeted law clerk Joe Pitt out to dinner and introduces him to a Reagan Justice Department apparatchik who waxes poetic about how they’re seeding the federal bench with conservative judges. “The Supreme Court will be block-solid Republican appointees,” he enthuses, “and the federal bench—Republican judges like land mines, everywhere, everywhere they turn … We’ll get our way on just about everything: abortion, defense, Central America, family values.”

I remember reading that and thinking, wow, diabolical. And then every time I read a news item about the Federalist Society or the GOP’s SCOTUS-oriented machinations, I thought of that scene. When Mitch McConnell held the late Antonin Scalia’s seat hostage from Merrick Garland, I thought of that scene, and thought of it again through Neil Gorsuch’s hearings and the debacle of Brett Kavanaugh’s, and of course once again when McConnell rushed Amy Coney Barrett’s nomination through in what ended up being the last days of the Trump Administration. By then, the full crisis of the American judiciary (my first inkling of which came from a play that first ran off-Broadway in 1992) was plain to see. The U.S. has been experiencing extreme judicial weather events for over a decade now; the leak of the Samuel Alito-authored decision repealing Roe v. Wade is like knowing not just that there’s a category 5 hurricane just below the horizon, but that such storms and worse are the new normal for the foreseeable future.

Union/Confederacy left, 2012 election map right.

Recently it has not been uncommon, especially at moments of more acute racial discord, for people to post images on social media juxtaposing recent electoral maps with maps circa 1860. The red states east of the Mississippi River match almost precisely with the Confederacy; and though Biden’s win in Georgia in 2020 is a welcome disruption of that consonance, otherwise the geography of red v. blue has been increasingly entrenched since Nixon first embarked on the Southern Strategy and accelerated a shift that, sadly, was probably inevitable the moment Lyndon Johnson signed the Civil Rights and Voting Rights Acts.

There has also been, especially since Trump’s election—and even more so since the January 6 insurrection—the prospect of a “new civil war” bandied about, from think pieces to more than a few books. Most such speculations are careful to point out that any such conflict would necessarily be dramatically different from the actual U.S. Civil War—that the seemingly solid blocks of red and blue that replicate the territory of the Confederacy and the Union are deceptive; that however polarized U.S. politics have become, geographically speaking conservative and liberal factions are far more integrated than the maps allow. The divide is more urban/rural than north/south, with substantial blue enclaves in deep red states, like Austin in Texas, or big red swaths in rural California.

The pandemic shook the etch-a-sketch up somewhat, too, as urban professionals, forced to distance socially and work remotely, found the cheaper rents and real estate outside of their cities more amenable (whether the end of the pandemic reverses that out-migration remains to be seen). And when businesses decamp from states like California to states like Texas, they bring with them work forces that tend to be younger and more socially and politically progressive, muddying things further. (Let’s not forget that Florida governor Ron DeSantis’ current feud with Disney over the “Don’t Say Gay” bill was precipitated not by the company’s management, but by its workers, whose hue and cry over what they saw as an unconscionably tepid response prompted the CEO to, one assumes reluctantly, condemn the bill). 

What I’m wondering today is: does the imminent repeal of Roe v. Wade herald a 21st century Great Migration? Except this time, instead of Black Americans fleeing the Jim Crow south, will it be liberals and progressives fleeing Republican states for Democratic ones? Possibly that seems like I’m overstating the case, but I think it will depend on just how far this SCOTUS will take the logic of Alito’s rationale, which is essentially predicated on the assertion that there is no right to privacy enshrined in the U.S. Constitution. Numerous legal experts have weighed in on this speculation, running down a list of landmark Supreme Court cases that hinged at least in part on the premise of the right to privacy: legal contraception, the abolition of anti-sodomy laws, interracial marriage, the prohibition of forced sterilization, and same-sex marriage. Even a year or two ago I would not have worried overmuch about such cases being overturned, thinking it unlikely that any high court, however conservative its composition, would be so retrograde. But this court’s conservative majority has demonstrated a shocking unconcern for even the appearance of being measured and apolitical. They’ve pretty much made it obvious that anything and everything is on the table. That goes also for the current spate of legislation being passed by Republican-dominated states: injunctions against teaching the history of slavery, the banning of books, the abolition of sex education, and of course the aforementioned “Don’t Say Gay” bill in Florida, which looks ready to be imitated in other red states. Should any challenges to these pieces of legislation make it to a SCOTUS hearing, how likely do we think it is that the current bench would quash them?

Which makes me wonder: at what point does being a liberal or progressive living in a blue city in a red state become untenable? What would that do to the U.S. polity? There would be a significant brain drain from red states; businesses would be obliged to follow when their pool of qualified workers dried up; urban centers in red states would wither; the current political polarization would in fact become geographical, as the states lost their last vestiges of philosophical diversity and became more and more autonomous, no longer subject to any federal law or statute they felt like challenging before a sympathetic Supreme Court.

That might indeed be a recipe for a “traditional” civil war.


Filed under maunderings, politics, wingnuttery

On Gremlins

I’ve been thinking a lot about gremlins these past few days.

Bugs Bunny and friend in “Falling Hare” (1943)

I’m teaching a graduate seminar on weird fiction this summer (full title: “Weird Fiction: Lovecraft, Race, and the Queer Uncanny”), and the first assignment is a piece of creative non-fiction describing your most terrifying fictional experience; whether in a book, film, or episode of television, what scared you so badly that it stayed with you for weeks or years? I’ve done this kind of assignment before in upper-level classes, and it has always worked well—especially considering that I post everyone’s pieces in the online course shell so they can be read by the entire class. That always leads to a good and interesting class discussion.

In the interests of fairness and by way of example, I’m also writing one. Which is where the gremlins come in.

No, not the 1984 movie. I couldn’t watch it, given that by the time it was released I’d already been traumatized by “Nightmare at 20,000 Feet.” And no, not the original Twilight Zone episode from 1963. If I’d watched that episode—which featured pre-Star Trek William Shatner gnawing the furniture as a nervous flyer who sees a gremlin that looks like a man in a monkey suit on the wing of the plane—it’s possible gremlins wouldn’t have come to haunt my imagination the way they did. The original episode is creepy, to be certain, but not particularly scary; the gremlin is too fluffy and Shatner too Shatner to really evoke terror.

The gremlin that made me terrified to sleep at night for several months was the one in the “Nightmare” segment of the 1983 film The Twilight Zone: The Movie.

The premise is very simple, and most likely familiar even to those who haven’t seen it (not least because it was parodied in a Simpsons Halloween episode): a man who is a nervous flyer to start with is on a plane during a storm. Looking out the window, he sees … something. At first, he thinks his eyes are playing tricks, but then he sees it again. And then again, and each time it becomes clearer that there is a person-shaped thing out there on the wing. Panicked, he calls for a flight attendant, shouting “There’s a man on the wing of this plane!” This, of course, is impossible, and it is obvious that the flight staff and his fellow passengers think him hysterical (it doesn’t help his case that the segment begins with a flight attendant talking to him through the bathroom door as he’s inside having a panic attack). After talking himself down, realizing it would be impossible for a man to be on the wing at this speed and altitude, he accepts a Valium from the senior flight attendant, closes the window shade, and attempts to sleep.

Of course, after a fitful attempt, he can’t help himself, and he opens the shade … and sees the thing, clearly demonic in appearance now, inches away from his face on the other side of the window.

Yup. This bastard.

This was the precise moment it broke my brain and gave me nightmares for months.

Anyway, TL;DR: either through turbulence or the creature’s sabotage, the plane lurches violently. The man grabs a gun from the sky marshal and shoots at the creature through the window. The cabin decompresses and he’s sucked out halfway. He shoots at the creature again, which wags a clawed finger at him, and flies off into the night. The plane lands and the man is taken off in a straitjacket; meanwhile, the mechanics examining the plane’s engine find it torn to shreds, the metal bent and ripped and covered in deep rents that look like claw marks.

I don’t remember any of the rest of the movie, which had three other segments based on old Twilight Zone episodes. I just remember watching “Nightmare,” being terrified, and my father telling me, in reply to my shocked question, that the creature was a gremlin and that they sabotage airplanes.

Really, it’s amazing I ever willingly went back on a plane.

I’ve been thinking about that and remembering a lot of details about the summer of 1984, which is when this all happened, and trying to work through precisely why it scared me so profoundly. I’ll post that essay here when it’s written; but in the meantime, I’ve been going down the rabbit hole on gremlins and their origins as an element of modern folklore.

***

There’s surprisingly little written about gremlins, which is possibly a function of the twinned facts that, on one hand, they’re basically a sub-species of a vast array of pixies, fairies, goblins, imps, and other mischievous fey creatures from folklore and legend; and on the other hand, they have a recent and fairly specific point of origin. Gremlins emerge alongside aviation (something The Twilight Zone hews to and the movie Gremlins ignores). More specifically, gremlins are creatures of the RAF, and start appearing as an explanation for random malfunctions sometime in the 1920s, becoming a staple of flyers’ mythos by the outbreak of WWII.

Gremlins, indeed, almost became the subject of a Disney film: author Roald Dahl, who would go on to write Charlie and the Chocolate Factory and James and the Giant Peach among innumerable other beloved children’s books, was an RAF pilot. His first book was titled The Gremlins, about a British Hawker Hurricane pilot named Gus who is first tormented by gremlins, but ultimately befriends them and convinces them to use their technical savvy to help the British war effort. In 1942, Dahl was invalided out of active service and sent to Washington, D.C. as an RAF attaché. The Gremlins brought the RAF mythos of airborne imps to America, and was popular enough that Disney optioned it as an animated feature. Though Disney ultimately did not make the movie, Dahl convinced them to publish it with the animators’ illustrations in 1943. First Lady Eleanor Roosevelt reportedly delighted in reading it to her grandchildren.

There was also a Looney Tunes short in 1943 featuring Bugs Bunny being bedevilled by a gremlin on a U.S. airbase.

Though Dahl would later claim to have coined the word “gremlin,” that is demonstrably false, as the term was in use from the 1920s and was featured in Pauline Gower’s 1938 novel The ATA: Women With Wings. The word’s etymology is difficult to determine, with some suggesting it comes from the Old English word gremian, “to vex,” which is also possibly related to gremmies, British English slang for goblin or imp. Another theory holds that the word is a conflation of goblin and Fremlin, the latter being a popular brand of beer widely available on British airbases in the mid-century—one can imagine tales of mischievous airborne creatures characterized as goblins seen after too many Fremlins.

One of the more interesting aspects of the gremlins mythos is how many flyers seemed genuinely convinced of the creatures’ existence. So common were tales of malfunction attributed to gremlins that U.S. aircrews stationed in England picked up on the lore and many of them, like their British counterparts, swore up and down they’d actually seen the little bastards working their mischief. Indeed, one of the only academic pieces of writing I’ve been able to find on gremlins is not the work of a folklorist, but a sociologist: in a 1944 edition of The Journal of Educational Sociology, Charles Massinger writes gravely about the fact that “a phase of thinking that had become prevalent in the Royal Air Force”—which is to say, gremlins—“had subsequently infected the psychology of the American airmen in the present war.” Massinger’s article expresses concern that otherwise rational people, thoroughly trained in advanced aviation, who necessarily possess a “breadth of … scientific knowledge relative to cause and effect of stress on the fighting machine” would be so irrational as to actually believe in the existence of “fantastic imps.”

Massinger suggests that it is the stress of combat that gives rise to such fantasies, which is not an unreasonable hypothesis—war zones are notoriously given to all sorts of fabulation. But he says that it is the stress and fear of the moment, in which split-second decisions and reactions don’t allow for measured and reasoned thought, that short-circuit the sense of reality: “If pilots had sufficient time to think rationally about machine deficiencies under actual flying conditions,” he says, “it is doubtful whether the pixy conception would have crept into their psychology.” Leaden prose aside, this argument strikes me as precisely wrong. The mythology surrounding gremlins may have had its start in panicked moments of crisis while aloft, but it developed and deepened in moments of leisure—airmen relaxing between missions in the officers’ club or mess, probably over numerous bottles of Fremlins. It is indeed with just such a scene that we first learn of gremlins in Dahl’s story.

I do however think Massinger’s instinct isn’t wrong here, i.e. the idea that airmen respond to the stresses of combat and the frustrations of frequent baffling breakdowns with fantasy rather than reason. What he’s missing is the way in which mess-hall fabulation humanizes the experience; the rationality of science and technology in such situations, I would hazard, is not a comfort, no matter how long the flyers have for reflection. The mechanical dimension of air combat is the alienating factor, especially at a point in time when flight was not just new but evolving by leaps and bounds. Roald Dahl’s experience in this respect is instructive: he started the war flying Gloster Gladiator biplanes, which were badly obsolete even when they were first introduced in 1934. By the time he was invalided, he had graduated to Hawker Hurricanes, which in the early days of the war were among the most advanced fighters. By the time he was in the U.S. and Eleanor Roosevelt was reading his first book to her grandchildren, the Allied bombing campaign had already lost more planes than flew in total during the First World War, with the new planes coming off assembly lines not just matching the losses but growing the massive air fleets.

Air travel has become so rote and banal today, and catastrophic airframe malfunctions so rare, that it is difficult to remember what must have been a vastly disorienting experience in WWII: ever-more sophisticated fighters and bombers that were nevertheless plagued by constant mechanical failures, machines of awesome destructive power that were also terribly vulnerable. Bomber crews suffered the highest rates of attrition in the war—about half of them were killed in action—while there was also the constant drumbeat of propaganda about the supposed indomitability of the Allied bombing wings.

When I teach my second-year course on American literature after 1945, I always start with the poetry of Randall Jarrell; specifically, we do a few of his war poems, as a means of emphasizing how the Second World War so profoundly transformed the world and the United States’ place in it, and the extent to which American popular culture became invested in mythologizing the war. Jarrell’s poetry is a disconcertingly ambivalent glimpse of the depersonalization and mechanization of the soldier by a war machine Hollywood has largely erased through such sentimental portrayals as The Longest Day and Saving Private Ryan. “The Death of the Ball Turret Gunner” is usually the first poem we do, and I can reliably spend an entire class on it despite its brevity. In its entirety:

From my mother’s sleep I fell into the State,
And I hunched in its belly till my wet fur froze.
Six miles from earth, loosed from its dream of life,
I woke to black flak and the nightmare fighters.
When I died they washed me out of the turret with a hose.

The final line is a gut-punch, but it’s the first two lines that establish one of Jarrell’s key themes with devastating economy. The speaker “falls” from the warmth and safety of the mother’s care, where he is loved as an individual, to the ownership of the State, where he is depersonalized and expendable—rendered inhuman even before the “black flak” (anti-aircraft fire) unincorporates his body. In the second line, the State is explicitly conflated with the weapon of war, the bomber, of which he has become a mechanism, and which functions as a monstrous womb: the parallel structure of the two lines aligns the “belly” of airplane with the “mother’s sleep.” The “wet fur,” freezing in the sub-zero temperatures of the high altitude, is literally the fur lining his bomber jacket, but also alludes to the lanugo, the coat of fur that fetuses develop and then shed while in the womb.

The bomber functions in Jarrell’s poetry as the exemplar of the Second World War’s inhuman scope and scale, built in vast numbers, visiting vast devastation on its targets—the last two of which were Hiroshima and Nagasaki—but which itself was terribly vulnerable and always in need of more bodies to fill out its crews. The machine itself was never scarce.

All of which might seem like a huge digression from a discussion of gremlins, but it’s really not: gremlins are identifiably kin to myth and folklore’s long history of mischievous “little people,” from pixies to the sidhe. That they emerge as a specific sub-species (sub-genre?) at the dawn of aviation—specifically, military aviation—is suggestive of a similar mythopoeic impulse when faced with the shock of the new. That some airmen become convinced of their existence as the war went on and the air war grew to unthinkable proportions is, I would suggest, pace Massinger, utterly unsurprising.

A Disneyfied gremlin.

SOURCES

Donald, Graeme. Sticklers, Sideburns, and Bikinis: The Military Origins of Everyday Words and Phrases. 2008.

Leach, Maria (ed). The Dictionary of Folklore. 1985.

Massinger, Charles. “The Gremlin Myth.” The Journal of Educational Sociology, vol. 17, no. 6 (Feb. 1944), pp. 359-367.

Rose, Carol. Spirits, Fairies, Gnomes, and Goblins: An Encyclopedia of the Little People. 1996.


Filed under history, maunderings, what I'm working on

History, Memory, and Forgetting Part 3: The Backlash Against Remembering

“The struggle of man against power is the struggle of memory against forgetting.”

—Milan Kundera, The Book of Laughter and Forgetting

I had drafted the first two installments of this three-part series of posts and was working on this one when the news of the discovery of the remains of 215 indigenous children on the grounds of a former residential school in BC broke. I paused for a day or two, uncertain of whether I wanted to talk about it here, in the context of my broader theme of history, memory, and forgetting. Part of my hesitation is I honestly lack the words for what is a shocking but utterly unsurprising revelation.

I must also confess that, to my shame, one of my first thoughts was the dread certainty that we’d soon be treated to some tone-deaf and historically ignorant column from the likes of Rex Murphy or Conrad Black or any one of the coterie of residential school apologists. So far however the usual suspects seem to be steering clear of this particular story; possibly the concrete evidence of so much death and suffering perpetrated by white turnkeys in the name of “civilizing” Native children is a bridge too far even for Murphy and Black et al’s paeans to Western civilization and white Canada’s munificence.

What I’m talking about in this third post of three is remembering as a political act: more specifically, of making others remember aspects of our history that they may not want to accept or believe. Scouting as I did for anything trying to ameliorate or excuse or explain away this evidence of the residential schools’ inhumanity,[1] I found my way to a 2013 Rex Murphy column that might just be the epitome of the genre, as one gleans from its title: “A Rude Dismissal of Canada’s Generosity.” In Murphy’s telling, in this column as in other of his rants and writings, conditions for Native Canadians in the present day are a vast improvement over historical circumstances, but the largesse of white Canadians and the government is something our indigenous populations perversely deny at every turn. He writes: “At what can be called the harder edges of native activism, there is a disturbing turn toward ugly language, a kind of razor rhetoric that seeks to cut a straight line between the attitudes of a century or a century and a half ago and the extraordinarily different attitudes that prevail today.”

“Attitudes” is the slippery word there: outside of unapologetically anti-indigenous and racist enclaves, I doubt you’d have difficulty finding white Canadians who would piously agree that the exploitation and abuse of our indigenous peoples was a terrible thing. You’d have a much harder time finding anyone willing to do anything concrete about it, such as restoring the land we cite in land acknowledgments to its ancestral people. Attitudes, on the balance, have indeed improved, but that has had little effect on Native peoples’ material circumstances. And in his next paragraph, Murphy seems intent on demonstrating that not all attitudes have, in fact, improved:

From native protestors and spokespeople there is a vigorous resort to current radical jargon—referring to Canadians as colonialist, as settlers, as having a settler’s mentality. Though it is awkward to note, there is a play to race in this, a conscious effort to ground all issues in the allegedly unrepentant racism of the “settler community.” This is an effort to force-frame every dispute in the tendentious framework of the dubious “oppression studies” and “colonial theory” of latter-day universities.

And there it is—the “radical jargon” that seeks to remember. Referring to Canadians as colonialist settlers isn’t radical, nor is it jargon, but is a simple point of fact—and indeed, for decades history textbooks referred to settlers as brave individuals and the colonizing of Canada as a proud endeavour, necessarily eliding the genocidal impact on the peoples already inhabiting the “new world.” Murphy’s vitriol is, essentially, denialism: a denial that our racist and oppressive history lingers on in a racist present. He speaks for an unfortunately large demographic of white Canada that is deeply invested in a whitewashed history, and reacts belligerently when asked to remember things otherwise.

This is a phenomenon we see playing out on a larger scale to the south, most recently with a substantial number of Republican-controlled state legislatures introducing bills that would forbid schools from teaching any curricula suggesting that the U.S. is a racist country, that it has had a racist history, or really anything that suggests racism is systemic and institutional rather than an individual failing. The bogeyman in much of the proposed legislation is “critical race theory.” Like Rex Murphy’s sneering characterization of “latter-day universities” offering degrees in “oppression studies” (not actually a thing), critical race theory is stigmatized as emerging from the university faculty lounge as part and parcel of “cultural Marxism’s” sustained assault on the edifices of Western civilization.[2] While critical race theory did indeed emerge from the academy, it was (and is) a legal concept developed by legal scholars like Derrick Bell and Kimberlé Crenshaw[3] in the 1970s and 80s. As Christine Emba notes, “It suggests that our nation’s history of race and racism is embedded in law and public policy, still plays a role in shaping outcomes for Black Americans and other people of color, and should be taken into account when these issues are discussed.” As she further observes, it has a clear and quite simple definition, “one its critics have chosen not to rationally engage with.”

Instead, critical race theory is deployed by its critics to connote militant, illiberal wokeism in a manner, to again quote Emba, that is “a psychological defense, not a rational one”—which is to say, something meant to evoke suspicion and fear rather than thought. The first and third words of the phrase, after all, associate it with elite liberal professors maundering in obscurantist jargon, with which they indoctrinate their students into shrill social justice warriors. (The slightly more sophisticated attacks will invoke such bêtes noires of critical theory as Michel Foucault or Jacques Derrida.[4])

But again, the actual concept is quite simple and straightforward: racism is systemic, which should not be such a difficult concept to grasp when you consider, for example, how much of the wealth produced in the antebellum U.S. was predicated on slave labour, especially in the production of cotton—something that also hugely benefited the northern free states, whose textile mills profitably transformed the raw cotton into cloth. Such historical realities, indeed, were the basis for the 1619 Project, the New York Times’ ambitious attempt to reframe American history through the lens of race—arguing that the true starting point of America was not the Declaration of Independence in 1776, but the arrival of the first African slaves on American soil in 1619.

The premise is polemical by design, and while some historians took issue with some of the claims made, the point of the project was precisely to remember aspects of a national history that have too frequently been elided, glossed over, or euphemized. In my previous post, I suggested that the forgetting of the history of Nazism and the Holocaust—and its neutering through the overdeterminations of popular culture—has facilitated the return of authoritarian and fascistic attitudes. Simultaneously, however, it’s just as clear that this revanchist backsliding in the United States has as much to do with remembering. The tearing down of Confederate monuments that inspires the current reactionary crouch isn’t about “erasing history,” but about remembering it properly: remembering that Robert E. Lee et al. were traitors to the United States and were fighting first and foremost to maintain the institution of slavery. Apologists for the Confederacy aren’t wrong when they say that the Civil War was fought over “states’ rights”; they’re just eliding the fact that the principal “right” being fought for above all others was the right to enslave Black people. All one needs to do to clarify this particular point is to read the secession declarations and constitutions of the secessionist states, all of which make the supremacy of the white race and the inferiority of Africans a central tenet. The Confederate Vice President Alexander Stephens declared that the “cornerstone” of the Confederacy was that “the negro is not equal to the white man; that slavery, subordination to the superior race, is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”

Statue of Nathan Bedford Forrest in Memphis, Tennessee

Bringing down Confederate monuments isn’t erasure—it’s not as if Robert E. Lee and Nathan Bedford Forrest disappear from the history books because people no longer see their bronze effigies in parks and town squares—but the active engagement with history. That was also the case with the erection of such monuments, albeit in a more pernicious manner: the vast majority were put up in the 1920s and 1950s, both of which were periods when white Americans felt compelled to remind Black Americans of their subordinate place by memorializing those who had fought so bloodily to retain chattel slavery. Like the Confederate battle flag, these monuments were always already signifiers of white supremacy, though that fact has been systematically euphemized with references to “southern tradition,” and of course the mantra of states’ rights.

Martin Sheen as Robert E. Lee in Gettysburg (1993)

Once when I was still in grad school and had gone home to visit my parents for a weekend, I was up late watching TV, thinking about going to bed, and while flipping channels happened across a film set during the Civil War. It was about halfway through, but I stayed up watching until it was done. The story was compelling, the acting was really good, and the battle scenes were extremely well executed. The film was Gettysburg, which had been released in 1993. It had a huge cast, including Jeff Daniels and Sam Elliott, and Martin Sheen as General Robert E. Lee. Because I’d only seen the second half, I made a point of renting it so I could watch the entire thing.

Gettysburg is a well-made film with some great performances, and is very good at pushing emotional buttons. Colonel Joshua Chamberlain (Jeff Daniels) leading the bayonet charge down Little Round Top is a case in point:

I’m citing Gettysburg here because it is one of the most perfect examples of how deeply the narrative of the Lost Cause became rooted in the American imagination. The Lost Cause, for the uninitiated, is the ongoing project, begun just after the end of the Civil War, to recuperate and whitewash (pun intended) the Confederacy and the antebellum South. Its keynotes include the aforementioned insistence that the Civil War wasn’t fought over slavery but states’ rights; the foregrounding of cultured and genteel Southern gentlemen and women; depictions of slaves (when depicted at all) as happy, spiritual-singing fieldhands under the benevolent supervision of the mastah; Northerners as rapacious carpetbaggers who proceeded to despoil everything good and noble about the South in the years following the war; and the war itself as a tragic but honourable dispute between sad but dutiful officer classes prosecuting their respective sides’ goals not in anger but in sorrow.

This last element is Gettysburg’s connective tissue. Let’s be clear: the film isn’t Southern propaganda like D.W. Griffith’s Birth of a Nation. It is, indeed, high-minded and even-handed. Martin Sheen—Martin fuckin’ Jed Bartlet Sheen!—plays Robert E. Lee, one of the prime targets of statue removers. But the film is propagandistic: there is, to the best of my recollection, not a single Black character in the film, and slavery as the cause of the conflict is alluded to (again, to the best of my recollection) only once—and then only obliquely.

My point in using this example—I could just as easily have cited Gone With the Wind or Ken Burns’ docuseries The Civil War—is how insidiously the Lost Cause narrative has wormed its way into the American imaginary. It is of a piece with everything else I’ve been talking about in this three-part series of posts. What novels like The Underground Railroad and historical reckonings like the 1619 Project—as well as the campaign to tear down Confederate monuments—attempt is a kind of radical remembering. And as we see from the ongoing backlash, to remember things differently can be threatening to one’s sense of self and nation.

NOTES


[1] As I said, none of the usual suspects seems to have advanced an opinion, but there was—not unpredictably—an awful lot of such attempts in the comments sections on articles about the grisly discovery. They ranged from the usual vile racist sentiments one always finds in these digital cesspools, to one person who argued at length that the child mortality rates in residential schools were consonant with child mortality rates in the population at large, nineteenth-century hygiene and medicine being what they were. This individual was undeterred from their thesis in spite of a long-suffering interlocutor who provided stats and links showing (a) that what the person was referencing was infant mortality rates, which is not the same thing; (b) that the death rates in residential schools were actually more egregious in the 1930s and 40s; and (c) that mass burials in unmarked graves without proper records and death certificates spoke to the dehumanization of the Native children on one hand, and, on the other, to the likelihood that the “teachers” at these schools were reluctant to leave evidence of their abusive treatment.

[2] I will have a lot more to say on this particular misapprehension of “the university faculty lounge” in a future post on the more general misapprehensions of what comprises a humanities degree.

[3] Crenshaw also developed that other concept that triggers conservatives, “intersectionality.”

[4] Stay tuned for my forthcoming post, “The Conspiracy Theory of Postmodernism.”

Filed under maunderings

History, Memory, and Forgetting Part 2: Forgetting and the Limits of Defamiliarization

“We cross our bridges when we come to them, and burn them behind us, with nothing to show for our progress except a memory of the smell of smoke, and a presumption that once our eyes watered.”

—Tom Stoppard, Rosencrantz and Guildenstern are Dead

In my first year at Memorial, I taught one of our first-year fiction courses. I ended the term with Martin Amis’ novel Time’s Arrow—a narrative told from the perspective of a parasitic consciousness that experiences time backwards. The person to whom the consciousness is attached turns out to be a Nazi war criminal hiding in America, who was a physician working with Dr. Mengele at Auschwitz. Just as we get back there, after seeing this man’s life played in reverse to the bafflement of our narrator, the novel promises that things will start to make sense … now. And indeed, the conceit of Amis’ novel is that the Holocaust can only make sense if played in reverse. Then, it is not the extermination of a people, but an act of benevolent creation—in which ashes and smoke are called down out of the sky into the chimneys of Auschwitz’s ovens and formed into naked, inert bodies. These bodies then have life breathed into them, are clothed, and sent out into the world. “We were creating a race of people,” the narrative consciousness says in wonder.

Time’s Arrow, I told my class, is an exercise in defamiliarization: it wants to resist us becoming inured to the oft-repeated story of the Holocaust, and so requires us to view it from a different perspective. Roberto Benigni’s film Life is Beautiful, I added, worked to much the same end, by (mostly) leaving the explicit brutalities of the Holocaust offstage (as it were), as a father clowns his way through the horror in order to spare his son the reality of their circumstances. As I spoke, however, and looked around the classroom at my students’ uncomprehending expressions, I felt a dread settle in my stomach. Breaking off from my prepared lecture notes, I asked the class: OK, be honest here—what can you tell me about the Holocaust?

As it turned out, not much. They knew it happened in World War II? And the Germans were the bad guys? And the Jews didn’t come out of it well …? (I’m honestly not exaggerating here). I glanced at my notes, and put them aside—my lecture had been predicated on the assumption that my students would have a substantive understanding of the Holocaust. This was not, to my mind, an unreasonable assumption—I had grown up learning about it in school by way of books like Elie Wiesel’s Night, but also seeing movies depicting its horrors. But perhaps I misremembered: I was from a young age an avid reader of WWII history (and remain so to this day), so I might have assumed your average high school education would have covered these bases in a more thorough manner.[1]

The upshot was that I abandoned my lecture notes and spent the remaining forty minutes of class delivering an off-the-cuff brief history of the Holocaust that left my students looking as if I’d strangled puppies in front of them, and me deeply desiring a hot shower and a stiff drink.

In pretty much every class I’ve taught since, I reliably harangue my students about the need to read more history. To be fair, I’d probably do that even without having had this particular experience; but I remember thinking of Amis’ brilliant narrative conceit—that defamiliarization only works if there has first been familiarization—and it depressed me to think that the passage of time brings with it unfamiliarization, i.e. the memory-holing of crucial history that was previously more or less fresh in the collective consciousness. The newfound tolerance for alt-right perspectives and the resurgence of authoritarian and fascist-curious politics (to which the Republican Party is currently in thrall) proceed from a number of causes, but one of them is the erosion of memory that comes with time’s passage. The injunctions against fascism that were so powerful in the decades following WWII, when the memory of the Holocaust was still fresh and both the survivors and the liberators were still ubiquitous, have eroded—those whose first-hand testimonials gave substance to that history have largely passed away. Soon none will remain.

What happens with a novel like Time’s Arrow or a film like Life is Beautiful when you have audiences who are effectively ignorant of the history informing their narrative gambits? Life is Beautiful, not unpredictably, evoked controversy because it was a funny movie about the Holocaust. While it was largely acclaimed by critics, there were a significant number who thought comedy was egregiously inappropriate in a depiction of the Holocaust,[2] as was using the Holocaust as a backdrop for a story focused on a father and his young son. As Italian film critic Paolo Mereghetti observes, “In keeping with Theodor Adorno’s idea that there can be no poetry after Auschwitz, critics argued that telling a story of love and hope against the backdrop of the biggest tragedy in modern history trivialized and ultimately denied the essence of the Holocaust.” I understand the spirit of such critiques, given that humour—especially Roberto Benigni’s particular brand of manic clowning—is jarring and dissonant in such a context, but then again, that’s the entire point. The film wants us to feel that dissonance, and to interrogate it. And not for nothing, but for all of the hilarity Benigni generates, the film is among the most heartbreaking I’ve seen, as it is about a father’s love and his desperate need to spare his son from the cruel reality of their circumstances. Because we’re so focused on the father’s clownish distractions, we do not witness—except for one haunting and devastating scene—the horrors that surround them.

In this respect, Life is Beautiful is predicated on its audience being aware of those unseen horrors, just as Time’s Arrow is predicated on its readers knowing the fateful trajectory of Jews rounded up and transported in boxcars to their torture and death in the camps, to say nothing of the grotesque medical “experiments” conducted by Mengele. The underlying assumption of such defamiliarization is that an oft-told history such as the Holocaust’s runs the risk of inuring people to its genuinely unthinkable proportions.[3] It is that very unthinkability, fresher in the collective memory several decades ago, that drove Holocaust denial among neo-Nazi and white supremacist groups—because even such blinkered, hateful, ignorant bigots understood that the genocide of six million people was a morally problematic onion in their racial purity ointment.[4]

“I know nothing, I see nothing …”

They say that tragedy plus time begets comedy. It did not take long for Nazis to become clownish figures on one hand—Hogan’s Heroes first aired in 1965, and The Producers was released in 1967—and one-dimensional distillations of evil on the other. It has become something of a self-perpetuating process: Nazis make the best villains because (like racist Southerners, viz. my last post) you don’t need to spend any time explaining why they’re villainous. How many Steven Spielberg films embody this principle? Think of Indiana Jones in The Last Crusade, looking through a window into a room swarming with people in a certain recognizable uniform: “Nazis. I hate these guys.” It’s an inadvertently meta moment, as well as a throwback to Indy’s other phobia in Raiders of the Lost Ark: “Snakes. Why’d it have to be snakes?” Snakes, Nazis, tomato, tomahto. Though I personally consider that a slander against snakes, the parallel is really about an overdetermined signifier of evil and revulsion, one that functions to erase nuance.

Unfortunately, if tragedy plus time begets comedy, it also begets a certain cultural amnesia when historically-based signifiers become divorced from a substantive understanding of the history they’re referencing. Which is really just a professorial way of saying that the use of such terms as “Nazi” or “fascist,” or comparing people to Hitler, has become ubiquitous in a problematic way, especially in the age of social media. Case in point, Godwin’s Law, which was formulated by Michael Godwin in the infancy of the internet (1990). Godwin’s Law declares that “As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.” This tendency has been added to the catalogue of logical fallacies as the “Reductio ad Hitlerum,” which entails “an attempt to invalidate someone else’s position on the basis that the same view was held by Adolf Hitler or the Nazi Party.”

Perhaps the most egregious recent example of this historical signification was Congresswoman and QAnon enthusiast Marjorie Taylor Greene’s comparison of Nancy Pelosi’s decision to continue requiring masks to be worn in the House of Representatives (because so many Republicans have declared their intention not to get vaccinated) to the Nazi law requiring Jews to wear a yellow Star of David on their chests. She said, “You know, we can look back at a time in history where people were told to wear a gold star, and they were definitely treated like second class citizens, so much so that they were put in trains and taken to gas chambers in Nazi Germany. And this is exactly the type of abuse that Nancy Pelosi is talking about.”

To be certain, Greene was roundly condemned by almost everybody, including by many in her own party—even the craven and spineless Minority Leader Kevin McCarthy had some stern words for her—but what she said was different not in kind but in degree from the broader practice of alluding to Nazism and the Holocaust in glib and unreflective ways.[5]

Though this tendency is hardly new—Godwin’s Law is thirty-one years old—it has been amplified and exacerbated by social media, to the point where it has become difficult to find terms to usefully describe and define Donald Trump’s authoritarian tendencies. The ubiquity of Nazi allusions has made them necessarily diffuse, and so any attempt to characterize Trumpism as fascist in character could be easily ridiculed as alarmist and hysterical; and to be fair, there were voices crying “fascist!” from the moment he made that initial descent on his golden escalator to announce his candidacy. That those voices proved prescient rather than alarmist doesn’t obviate the fact that they muddied the rhetorical waters.[6] As the contours of Trump’s authoritarian tendencies came into focus, the fascistic qualities of the Trumpian Right became harder and harder to ignore; bizarrely, they’ve become even more clearly delineated since Trump left office, as Republicans still kowtow to the Mar-a-Lago strongman and move to consolidate minoritarian power.

Historians and political philosophers and a host of other thinkers of all stripes will be years in unravelling the historical and cultural strands of Trump’s rise and the previously unthinkable hold that Trumpism has over a stubborn rump of the electorate; but I do think that one of the most basic elements is our distressing tendency toward cultural amnesia. It makes me think we’re less in need of defamiliarizing history than defamiliarizing all the clichés of history that have become this inchoate jumble of floating signifiers, which allow neo-Nazis and white supremacists to refashion themselves as clean cut khaki-clad young men bearing tiki-torches, or to disingenuously euphemize their racism as “Western chauvinism” and meme their way out of accusations of ideological hatefulness—“It’s just about the lulz, dude.”

There is also, as I will discuss in the third of these three posts, the fact that remembering is itself a politically provocative act. On one hand, the diminution in the collective memory of Nazism and the Holocaust has facilitated the re-embracing of its key tropes; on the other, the active process of remembering the depredations of Western imperialism and the myriad ways in which slavery in the U.S. wasn’t incidental to the American experiment but integral gives rise to this backlash that takes refuge in such delusions as the pervasiveness of anti-white racism.

NOTES


[1] To be clear, this is not to castigate my students; as I’ll be expanding on as I go, historical amnesia is hardly limited to a handful of first-year university students in a class I taught fifteen years ago.

[2] One can only speculate on what such critics make of Mel Brooks’ career.

[3] Martin Amis attempts something similar in his 1985 short story collection Einstein’s Monsters, which is about nuclear war. His lengthy introductory essay “Thinkability” (to my mind, the best part of the book) addresses precisely the way in which military jargon euphemizes the scope and scale of a nuclear exchange in order to render the unthinkable thinkable.

And speaking of humour used to defamiliarize horror: Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb, and General Buck Turgidson’s (George C. Scott) own “thinkability” regarding the deaths in a nuclear war: “Mr. President, we are rapidly approaching a moment of truth, both for ourselves as human beings and for the life of our nation. Now, truth is not always a pleasant thing. But it is necessary now to make a choice, to choose between two admittedly regrettable, but nevertheless distinguishable, post-war environments: one where you got 20 million people killed, and the other where you got 150 million people killed! … Mr. President, I’m not saying we wouldn’t get our hair mussed, but I do say no more than 10 to 20 million killed, tops! Uh, depending on the breaks.”

[4] Which always rested on a tacit contradiction in their logic: it didn’t happen, but it should have.

[5] We see the same tendency, specifically among conservatives, to depict any attempt at raising taxes or expanding social programs as “socialism,” often raising the spectre of “Soviet Russia”—which is about as coherent as one of my favourite lines from Community, when Britta cries, “It’s just like Stalin back in Russia times!”

[6] I don’t say so to castigate any such voices, nor to suggest that they were premature—the contours of Trump’s authoritarian, nativist style were apparent from before he announced his candidacy to anyone who looked closely enough.

Filed under maunderings

My Mostly Unscientific Take on UFOs

Over the past year or so, it has seemed as though whatever shadowy Deep State agencies are responsible for covering up the existence of extraterrestrials have thrown up their hands and said “Yeah. Whatever.”

Perhaps the real-world equivalent of The X-Files Smoking Man finally succumbed to lung cancer, and all his subordinates just couldn’t be bothered to do their jobs any more.

Or perhaps the noise of the Trump presidency created the circumstances in which a tacit acknowledgement of numerous UFO sightings wouldn’t seem to be bizarre or world-changing.

One way or another, the rather remarkable number of declassified videos from fighter pilots’ heads-up displays showing unidentified flying objects of odd shapes and flying capabilities has evoked an equally remarkable blasé response. It’s as if the past four years of Trump, natural disasters, civil tragedies, and a once-in-a-century (touch wood) pandemic have so eroded our capacity for surprise that, collectively, we seem to be saying, “Aliens? Bring it.” Not even the QAnon hordes, for whom no event or detail is too unrelated to be folded into the grand conspiracy, have seen fit to comment on something that has so long been a favourite subject of conspiracists (“Aliens? But are they pedophile child sex-trafficking aliens?”).

Perhaps we’re all just a bit embarrassed at the prospect of alien contact, like having a posh and sophisticated acquaintance drop by when your place is an utter pigsty. I have to imagine that, even if the aliens are benevolent and peaceful, humanity would be subjected to a stern and humiliating talking-to about how we let our planet get to the state it’s in.

“I’m sorry to have to tell you, sir, that your polar icecaps are below regulation size for a planet of this category, sir.” (Good Omens)

Not to mention that if they landed pretty much anywhere in the U.S., they’d almost certainly get shot at.

And imagine if they’d landed a year ago.

“Take us to your leader!”
“Um … are you sure? Perhaps you should try another country.”
“All right, how do I get to Great Britain?”
“Ooh … no. You really don’t want that.”
“Russia then? China? India? Hungary?”
“Uh, no, no, no, and no.”
“Brazil?”
“A world of nope.”
“Wait–what’s the one with the guy with good hair?”
“Canada. But, yeah … probably don’t want to go there either. Maybe … try Germany?”
“Wasn’t that the Hitler country?”
“They got better.”

You’d think there would be more demand for the U.S. government to say more about these UFO sightings. The thing is, I’m sure that in some corners of the internet there is a full-throated, ongoing yawp for exactly that, but it hasn’t penetrated the collective consciousness. And frankly, I don’t care enough to go looking for it.

It is weird, however, considering how we’ve always assumed that the existence of extraterrestrial life would fundamentally change humanity, throwing religious belief into crisis and dramatically transforming our existential outlook. The entire premise of Star Trek’s imagined future is that humanity’s first contact with the Vulcans forced a dramatic reset of our sense of self and others—a newly galactic perspective that rendered all our internecine tribal and cultural squabbles irrelevant, essentially at a stroke resolving Earth’s conflicts.

To be certain, there hasn’t been anything approaching definitive proof of alien life, so such epiphany or trauma lies only in a possible future. Those who speak with any authority on the matter are always careful to point out that “UFO” is not synonymous with “alien”—they’re not necessarily otherworldly, just unidentified.

I, for one, am deeply skeptical that these UFOs are of extraterrestrial origin—not because I think it’s impossible, just that the chances are infinitesimal. In answer to the question of whether I think there’s life on other planets, my answer is an emphatic yes, which is something I base on the sheer size of the numbers involved. The Milky Way galaxy, by current estimates, contains somewhere in the neighbourhood of 100 billion planets. Even if only one tenth of one percent of those can sustain life, that’s still 100 million planets, and that in just one of the hundreds of billions of galaxies in the universe.
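Here, for what it’s worth, is that back-of-the-envelope arithmetic as a few lines of illustrative Python—a toy sketch only, in which the 100-billion-planet figure and the 0.1 percent fraction are just the rough guesses above, not measured values:

# Toy back-of-the-envelope estimate; both inputs are rough assumptions, not data.
planets_in_milky_way = 100_000_000_000   # ~100 billion planets, per current estimates
habitable_fraction = 0.001               # one tenth of one percent: an arbitrary, lowball guess

habitable_planets = planets_in_milky_way * habitable_fraction
print(f"{habitable_planets:,.0f} potentially life-sustaining planets")
# Prints: 100,000,000 -- and that's just one galaxy among hundreds of billions.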

But then there’s the question of intelligence, and what comprises intelligent life. We have an understandably chauvinistic understanding of intelligence, one largely rooted in the capacity for abstract thought, communication, and inventiveness. We grant that dolphins and whales are intelligent creatures, but have very little means of quantifying that; we learn more and more about the intelligence of cephalopods like octopi, but again: such intelligences are literally alien to our own. The history of imagining alien encounters in SF has framed alien intelligence as akin to our own, just more advanced—developing along the same trajectory until interplanetary travel becomes a possibility. Dolphins might well be, by some metric we haven’t yet envisioned, far more intelligent than us, but they’ll never build a rocket—in part because, well, why would they want to? As Douglas Adams put it in The Hitchhiker’s Guide to the Galaxy, “man had always assumed that he was more intelligent than dolphins because he had achieved so much—the wheel, New York, wars and so on—whilst all the dolphins had ever done was muck about in the water having a good time. But conversely, the dolphins had always believed that they were far more intelligent than man—for precisely the same reasons.”

To put it another way, to properly imagine space-faring aliens, we have to imagine not so much what circumstances would lead to the development of space travel as how an alien species would arrive at an understanding of the universe that would facilitate the very idea of space travel.

Consider the thought experiment offered by Hans Blumenberg in the introduction of his book The Genesis of the Copernican World. Blumenberg points out that our atmosphere has a perfect density, “just thick enough to enable us to breathe and to prevent us from being burned up by cosmic rays, while, on the other hand, it is not so opaque as to absorb entirely the light of the stars and block any view of the universe.” This happy medium, he observes, is “a fragile balance between the indispensable and the sublime.” The ability to see the stars in the night sky, he says, has shaped humanity’s understanding of itself in relation to the cosmos, from our earliest rudimentary myths and models, to the Ptolemaic system that put us at the center of Creation and gave rise to the medieval music of the spheres, to our present-day forays in astrophysics. We’ve made the stars our oracles, our gods, and our navigational guides, and it was in this last capacity that the failings of the Ptolemaic model inspired a reclusive Polish astronomer named Mikołaj Kopernik, whom we now know as Copernicus.

But what, Blumenberg asks, if our atmosphere were too thick to see the stars? How then would humanity have developed its understanding of its place in the cosmos? And indeed, of our own world—without celestial navigation, how does seafaring evolve? How much longer before we understood that there was a cosmos, or grasped the movement of the earth without the motion of the stars? There would always of course be the sun, but it was always the stars, first and foremost, that inspired the celestial imagination. It is not too difficult to imagine an intelligent alien species inhabiting a world such as ours, with similar capabilities, but without the inspiration of the night sky to propel them from the surface of their planet.[1]

Now think of a planet of intelligent aquatic aliens, or of creatures that swim deep in the dense atmosphere of a gas giant.

Or consider the possibility that our vaunted intelligence is in fact an evolutionary death sentence, and that that is in fact the case for any species such as ourselves—that our development of technology, our proliferation across the globe, and our environmental depredations inevitably outstrip our primate brains’ capacity to reverse the worst effects of our evolution.

Perhaps what we’ve been seeing is evidence of aliens who have mastered faster-than-light or transdimensional travel, but they’re biding their time—having learned the dangers of intelligence themselves, they’re waiting to see whether we succeed in not eradicating ourselves with nuclear weapons or environmental catastrophe; perhaps their rule for First Contact is to make certain a species such as Homo sapiens can get its shit together and resolve all the disasters we’ve set in motion. Perhaps their Prime Directive is not to help us, because they’ve learned in the past that unless we can figure it out for our own damned selves, we’ll never learn.

In the words of the late great comedian Bill Hicks, “Please don’t shoot at the aliens. They might be here for me.”

EDIT: Stephanie read this post and complained that I hadn’t worked in Monty Python’s Galaxy Song, so here it is:

NOTES


[1] And indeed, Douglas Adams did imagine such a species in Life, the Universe, and Everything—an alien race on a planet surrounded by a dust cloud, who live in utopian peace and harmony in the belief that they are the sum total of creation, until the day a spaceship crash-lands on their planet and shatters their illusion. At which point, having reverse-engineered the spacecraft and flown beyond the dust cloud to behold the splendours of the universe, they decide there’s nothing else for it but to destroy it all.

Filed under maunderings, Uncategorized

Liz Cheney is as Constant as the Northern Star …

… and I don’t particularly mean that as a compliment.

Literally minutes before he is stabbed to death by a posse of conspiring senators, Shakespeare’s Julius Caesar declares himself to be the lone unshakeable, unmoving, stalwart man among his flip-flopping compatriots. He makes this claim as he arrogantly dismisses the petition of Metellus Cimber, who pleads for the reversal of his brother’s banishment. Cimber’s fellow conspirators echo his plea, prostrating themselves before Caesar, who finally declares in disgust,

I could be well moved if I were as you.
If I could pray to move, prayers would move me.
But I am constant as the northern star,
Of whose true-fixed and resting quality
There is no fellow in the firmament.
The skies are painted with unnumbered sparks.
They are all fire and every one doth shine,
But there’s but one in all doth hold his place.
So in the world. ‘Tis furnished well with men,
And men are flesh and blood, and apprehensive,
Yet in the number I do know but one
That unassailable holds on his rank,
Unshaked of motion. And that I am he.

Caesar mistakes the senators’ begging for weakness, not grasping until it is too late that their importuning is a ploy to get close enough to stab him.

Fear not, I’m not comparing Liz Cheney to Julius Caesar. I suppose you could argue that Cheney’s current anti-Trump stance is akin to Caesar’s sanctimonious declaration if you wanted to suggest that it’s more performative than principled. To be clear, I’m not making that argument—not because I don’t see its possible merits, but because I really don’t care.

I come not to praise Liz Cheney, whose political beliefs I find vile; nor do I come to bury her. The latter I’ll leave to her erstwhile comrades, and I confess I will watch the proceedings with a big metaphorical bowl of popcorn in my lap, for I will be a gratified observer no matter what the outcome. If the Trumpists succeed in burying her, well, I’m not about to mourn a torture apologist whose politics have always perfectly aligned with those of her father. If she soldiers on and continues to embarrass Trump’s sycophants by telling the truth, that also works for me.

Either way, I’m not about to offer encomiums for Cheney’s courage. I do think it’s admirable that she’s sticking to her guns, but as Adam Serwer recently pointed out in The Atlantic, “the [GOP’s] rejection of the rule of law is also an extension of a political logic that Cheney herself has cultivated for years.” During Obama’s tenure, she frequently went on Fox News to accuse the president of being sympathetic to jihadists, and just as frequently opined that American Muslims were a national security threat. During her run for a Wyoming Senate seat in 2014, she threw her lesbian sister Mary under the bus with her loud opposition to same-sex marriage, a point on which she stands to the right of her father. And, not to repeat myself, but she remains an enthusiastic advocate of torture. To say nothing of the fact that, up until the January 6th assault on the Capitol, she was a reliable purveyor of the Trump agenda, celebrated then by such current critics as Steve Scalise and Matt Gaetz.

Serwer notes that Cheney’s “political logic”—the logic of the War on Terror—is consonant with that of Trumpism not so much in policy as in spirit: the premise that there’s them and us, and that “The Enemy has no rights, and anyone who imagines otherwise, let alone seeks to uphold them, is also The Enemy.” In the Bush years, this meant the Manichaean opposition between America and Terrorism, and that any ameliorating sentiment about, say, the inequities of American foreign policy, meant you were With the Terrorists. In the present moment, the Enemy of the Trumpists is everyone who isn’t wholly on board with Trump. The ongoing promulgation of the Big Lie—that Biden didn’t actually win the election—is a variation on the theme of “the Enemy has no rights,” which is to say, that anyone who does not vote for Trump or his people is an illegitimate voter. Serwer writes:

This is the logic of the War on Terror, and also the logic of the party of Trump. As George W. Bush famously put it, “You are either with us or with the terrorists.” You are Real Americans or The Enemy. And if you are The Enemy, you have no rights. As Spencer Ackerman writes in his forthcoming book, Reign of Terror, the politics of endless war inevitably gives way to this authoritarian logic. Cheney now finds herself on the wrong side of a line she spent much of her political career enforcing.

All of which is by way of saying: Liz Cheney has made her bed. The fact that she’s chosen the hill of democracy to die on is a good thing, but this brings us back to my Julius Caesar allusion. The frustration being expressed by her Republican detractors, especially House Minority Leader Kevin McCarthy, is at least partially rational: she’s supposed to be a party leader, and in so vocally rejecting the party line, she’s not doing her actual job. She is being as constant as the Northern Star here, and those of us addicted to following American politics are being treated to a slow-motion assassination on the Senate (well, actually the House) floor.

But it is that constancy that is most telling in this moment. Cheney is anchored in her father’s neoconservative convictions, and in that respect, she’s something of a relic—an echo of the Bush years. As Serwer notes, however, while common wisdom says Trump effectively swept aside the Bush-Cheney legacy in his rise to the Republican nomination, his candidacy and then presidency only deepened the bellicosity of Bush’s Us v. Them ethos, in which They are always already illegitimate. It’s just that now the Them is anyone opposed to Trump.

In the present moment, I think it’s useful to think of Liz Cheney as an unmoving point in the Republican firmament: to remember that her politics are as toxic and cruel as her father’s, and that there is little to no daylight between them. The fact that she is almost certainly going to lose both her leadership position and a primary in the next election to a Trump loyalist is not a sign that she has changed. No: she is as constant as the Northern Star, and the Trump-addled GOP has moved around her. She has not become more virtuous; her party has just become so very much more debased.

Filed under maunderings, The Trump Era, Uncategorized, wingnuttery

Of Course There’s a Deep State. It’s Just Not What the Wingnuts Think It Is.

There is a moment early in the film The Death of Stalin in which, as the titular dictator lies dying, the circle of Soviet officials just beneath Stalin (Khrushchev, Beria, Malenkov) panic at the prospect of having to find a reputable doctor to treat him. Why? Because a few years earlier, Stalin, in a fit of characteristic paranoia, had become convinced that doctors were conspiring against him, and he had many of them arrested, tortured, and killed.

I thought of this cinematic moment—the very definition of gallows humour—while reading an article by Peter Wehner in The Atlantic observing that part of the appeal of QAnon (whose adherents have, counter-intuitively perhaps, grown in number since Biden’s election) lies precisely in its many disparate components. “I’m not saying I believe everything about Q,” the article quotes one Q follower as saying. “I’m not saying that the JFK-Jr.-is-alive stuff is real, but the deep-state pedophile ring is real.”

As [Sarah Longwell, publisher of The Bulwark] explained it to me, Trump supporters already believed that a “deep state”—an alleged secret network of nonelected government officials, a kind of hidden government within the legitimately elected government—has been working against Trump since before he was elected. “That’s already baked into the narrative,” she said. So it’s relatively easy for them to make the jump from believing that the deep state was behind the “Russia hoax” to thinking that in 2016 Hillary Clinton was involved in a child-sex-trafficking ring operating out of a Washington, D.C., pizza restaurant.

If you’ll recall, the “Deep State” bogeyman was central to Steve Bannon’s rhetoric during his tenure early in the Trump Administration, alongside his antipathy to globalism. The two, indeed, were in his figuration allied to the point of being inextricable, which is also one of the key premises underlying the QAnon conspiracy. And throughout the Trump Administration, especially during his two impeachments and the Mueller investigation, the spectre of the Deep State was constantly invoked as the shadowy, malevolent force behind any and all attempts to bring down Donald Trump (and was, of course, behind the putative fraud that handed Joe Biden the election).

Now, precisely why this article made me think of this moment in The Death of Stalin is a product of my own weird stream of consciousness, so bear with me: while I’ve always found Bannon & co.’s conspiracist depiction of the Deep State more than a little absurd, so too I’ve had to shake my head whenever any of Trump’s detractors and critics declare that there’s no such thing as a Deep State.

Because of course there’s a deep state, just one that doesn’t merit ominous capitalization. It also doesn’t merit the name “deep state,” but let’s stick with that for now for the sake of argument. All we’re really talking about here is the vast and complex bureaucracy that sustains any sizable human endeavour—from universities to corporations to governments. And when we’re talking about the government of a country as large as the United States, that bureaucracy is massive. The U.S. government employs over two million people, the vast majority of them civil servants working innocuous jobs that make the country run. Without them, nothing would ever get done.

Probably the best piece of advice I ever received as a university student was in my very first year of undergrad; a T.A. told me to never ask a professor about anything like degree requirements or course-drop deadlines, or, really, anything to do with the administrative dimension of being a student. Ask the departmental secretaries, he said. In fact, he added, do your best to cultivate their respect and affection. Never talk down to them or treat them as the help. They may not have a cluster of letters after their name or grade your papers, but they make the university run.

I’d like to think that I’m not the kind of person who would ever be the kind of asshole to berate secretaries or support staff, but I took my T.A.’s advice to heart, and went out of my way to be friendly and express gratitude, to be apologetic when I brought them a problem. It wasn’t long before I was greeted with smiles whenever I had paperwork that needed processing, and I never had any issues getting into courses (by contrast, in my thirty years in academia from undergrad to grad student to professor, I have seen many people—students and faculty—suffer indignities of mysterious provenance because they were condescending or disrespectful to support staff).

The point here is that, for all the negative connotations that attach to bureaucracy, it is an engine necessary for any institution or nation to run. Can it become bloated and sclerotic? Of course, though in my experience that tends to happen when one expands the ranks of upper management. But when Steve Bannon declared, in the early days of the Trump Administration, that his aim was “the deconstruction of the administrative state,” I felt a keen sense of cognitive dissonance in that statement—for the simple reason that there is no such thing as a non-administrative state.

Which brings us back, albeit circuitously, to The Death of Stalin. There is no greater example of a sclerotic and constipated bureaucracy than that of the former Soviet Union, a point not infrequently made in libertarian and anti-statist arguments for small government. But I think the question that rarely gets raised when addressing dysfunctional bureaucracy—at least in the abstract—is why is it dysfunctional? There are probably any number of reasons why that question doesn’t come up, but I have to imagine that a big one is because we’ve been conditioned to think of bureaucracy as inevitably dysfunctional—a sense reinforced by every negative encounter experienced when renewing a driver’s license, waiting on hold with your bank, filing taxes, dealing with governmental red tape, or figuring out what prescriptions are covered by your employee health plan. But a second question we should ask when having such negative experiences is: are they negative because of an excess of bureaucracy, or too little? The inability of Stalin’s minions to find a competent doctor is a profound metaphor for what happens when we strip out the redundancies in a given system—in this case, the state-sponsored murder of thousands of doctors because of a dictator’s paranoia, such that one is left with (at best) mediocre medical professionals too terrified of state retribution to be dispassionately clinical, which is of course what one needs from a doctor.

I’m not a student of the history of the U.S.S.R., so I have no idea if anyone has written about whether the ineptitude of the Soviet bureaucracy was a legacy of Stalinist terror and subsequent Party orthodoxy, in which actually competent people were marginalized, violently or otherwise; I have to assume there’s probably a lot of literature on the topic (certainly, Masha Gessen’s critical review of the HBO series Chernobyl has something to say on the subject). But there’s something of an irony in the fact that Republican administrations since that of Ronald Reagan have created their own versions of The Death of Stalin’s doctor problem through their evisceration of government. Reagan famously said that the nine most frightening words were “I’m from the government, and I’m here to help,” and since then conservative governments—in the U.S., Canada, and elsewhere—have worked hard to make that a self-fulfilling prophecy. Thomas Frank, author of What’s the Matter With Kansas? (2004), has chronicled this tendency, in which Republican distrust of government tends to translate into the rampant gutting of social services and governmental agencies, from the Post Office to the various cabinet departments, which then dramatically denudes the government’s ability to do anything. All of the failures that then inevitably occur are held up as proof of the basic premise of government’s inability to get anything right (and that therefore its basic services should be outsourced to the private sector).

In my brief moments of hope I wonder if perhaps the Trump Administration’s explicit practice of putting hacks and incompetent loyalists in key positions (such as Jared Kushner’s bizarrely massive portfolio) made this longstanding Republican exercise too glaring to ignore or excuse. Certainly, the contrast between Trump’s band of lickspittles and Biden’s army of sober professionals is about the most glaring difference we’ve seen between administrations, ever. What I hope we’re seeing, at any rate, is the reconstruction of the administrative state.

And it’s worth noting that Dr. Anthony Fauci has been resurrected from Trump’s symbolic purge of the doctors.

Filed under maunderings, The Trump Era, wingnuttery

In Which I Mark the One-Year Anniversary of the Pandemic With Some Thoughts on Monarchy

I did not watch Oprah Winfrey’s much-hyped interview of Prince Harry and Meghan Markle for much the same reason I did not watch the two most recent royal weddings: I didn’t care. Especially at this point in time, between marking a year of pandemic and the ongoing reverberations of the Trump presidency, the travails of Harry and Meghan—even inasmuch as I sympathize with them against the Royal Family—don’t really do much to excite my imagination or interest.

On the other hand, the fallout from the interview, coupled with related issues and events, has piqued my interest indeed. That people will be instinctively partisan for one party or the other is about as unsurprising as learning that some people in “the Firm” fretted about whether or not Meghan’s first child would be dark-complexioned. Racism in the Royal Family? Get away! But of course, this particular charge was picked up by right-wing pundits as further evidence of “cancel culture” at work, and we’ve been treated to the bizarre spectacle of self-described red-blooded American patriots rushing to the defense of HM Queen Elizabeth II.1

Someone might want to remind them just what those boys at Lexington and Concord died for. Or perhaps tell them to watch Hamilton.

Notably, the one person emerging not just unscathed but burnished from the interview was the Queen herself—both Harry and Meghan were careful to say that none of the difficulties they’ve experienced emanated from her, and that she has indeed been the one person who is blameless (some reports have read between the lines and extrapolated that the Queen was prescient enough to have given Harry funds to see him through being cut off financially).

Leaving aside for the moment the possibility, or possibly even the likelihood, that this is entirely true, this sympathy is reflective of a broader reluctance to be critical of Elizabeth II. Even the 2006 film The Queen, starring Helen Mirren in the title role, which was all about the Palace’s cold and inept response to the shocking death of Diana, ended up painting a vaguely sympathetic portrait (though to be fair, that has a lot to do with the virtuosity of Helen Mirren). And The Crown (created and written by Peter Morgan, who wrote The Queen), which is largely unsparing of all the other royals and their courtiers, generally depicts Elizabeth as a victim of circumstance who spends her life doing her level best to do her royal duty and constrained by this very sense of duty from being a more compassionate and loving human.

The Queen is a person whom, I would argue, people tend to see through a nostalgic lens: nostalgia, in this case, for a form of stiff-upper-lip, keep-calm-and-carry-on Britishness memorialized in every WWII film ever—something seen as lost in the present day, along with Britannia’s status in the world. As we have seen in much of the pro-Brexit rhetoric, these two losses are not perceived as unrelated; and seeing Queen Elizabeth as the cornerstone of an ever-more-fractured Royal Family is a comforting anchor, but one that grows more tenuous as she ages.

There’s an episode in season four of The Crown that articulates this sensibility. In it, Elizabeth, having grown concerned that her children might not appreciate the scale and scope of the duties they’ve inherited, meets with each of them in turn and is perturbed by their feckless selfishness. Charles is in the process of destroying his marriage to Diana; Andrew is reckless in his passions; Anne is consumed by resentment and anger; and Edward is at once isolated by his royal status at school and indulgent in his royal privilege. Though her disappointment in her spawn is never put into words, it is obvious (Olivia Colman can convey more with her facial expressions than I can in ten thousand words), and The Crown effectively indicts the younger generation of royals as unworthy of their status, and definitely unworthy of the throne.

This, I think, is where we’re at right now with Harry and Meghan’s interview. I’ve joked on occasion that “shocked but not surprised” should be the title of the definitive history of the Trump presidency, but it might also function as a general sentiment for this particular epoch. It is difficult, precisely, to put one’s finger on the substance of the outrage over Meghan’s revelations, aside from an instinctive royalist animus directed at anyone with the temerity to criticize the monarchy. This is why, perhaps, some (<cough> <cough> PIERS MORGAN <cough>) have simply chosen to call bullshit on Meghan Markle’s story of mental health issues and suicidal ideation;2 but it was the charge of racism that seems to have become the most ubiquitous bee in a whole lot of bonnets. Shocking, yes; surprising, no. The entire British colonial enterprise was predicated on the premise of white English supremacy, and royals of all nationalities have always been assiduous in policing their bloodlines. Prior to the divorce of Charles and Diana amid revelations of his relationship with Camilla Parker-Bowles, the greatest scandal the British monarchy had weathered was the abdication of Edward VIII so he could marry his American divorcée paramour, Wallis Simpson. Meghan Markle, it has been noted by many, ticks two of those scandalous boxes insofar as she is American and a divorcée.

She is also, to use postcolonial theorist Homi Bhabha’s phrasing, “not quite/not white.” Which is to say, she is biracial, and as such will never be qualified to be a royal in a stubborn subsection of the British cultural imagination.

Wallis Simpson and the man who might have been king.

The fascination many people have with the British Royal Family—especially among those who aren’t British—has always baffled me more than a little. But on honest reflection, I suppose I shouldn’t be baffled. In spite of the fact that hereditary monarchy is an objectively terrible form of governance, it is also one of the most durable throughout history. Human beings, it seems, are suckers for dynastic power, in spite of the illogic of its premise; as the late great Christopher Hitchens wryly observed, being the eldest son of a dentist does not somehow confer upon you the capacity to be a dentist. And yet down through the centuries, people have accepted that the eldest son (and occasionally daughter) of the current monarch had the right to assume the most important job in the nation on that monarch’s passing.

Of course, “right” and “ability” don’t always intersect, and there have been good, bad, and indifferent kings and queens down through history (of course, being democratically elected is no guarantee of governing ability, but at least the people have the option of cancelling your contract every few years). For every Henry V there’s a Richard III, and we’re equally fascinated by both, while mediocre kings and queens who preside over periods of relative peace don’t tend to get the dramatic treatment.

Indeed, on even just a brief reflection, it’s kind of amazing at just how pervasive the trope of monarchy is in literature and popular culture more broadly. It is unsurprising that Shakespeare, for example, would have made kings and queens the subject of many of his plays—that was, after all, the world in which he lived—but the persistence of hereditary monarchy in the 20th century cultural imagination is quite remarkable. It’s pretty much a staple of fantasy, as the very title of Game of Thrones attests; but where George R.R. Martin’s saga and its televisual adaptation are largely (but sadly not ultimately)3 rooted in a critique of the divine right of kings and the concept of the “chosen one,” the lion’s share of the genre rests in precisely the comfort bestowed by the idea that there is a true king out there perfectly suited to rule justly and peaceably.

More pervasive and pernicious than Shakespearean or Tolkienesque kings and queens, however, is the Disney princess-industrial complex. Granted, the fairy-tale story of the lowly and put-upon girl finding her liberatory prince pre-dates Walt Disney’s animated empire by centuries, but I think we can all agree that Disney has at once expanded, amplified, and sanded down the sharp edges of the princesses’ folkloric origins—all while inculcating in millions of children the twinned conceptions of royalistic destiny and the heteronormative gender roles associated with hereditary nobility (to be fair to Disney, it has done better with such recent excursions as Brave and Frozen—possibly the best endorsement of the latter’s progressiveness is the fact that Jordan Peterson loathes it). It’s telling that Disney’s most prominent branding image isn’t Mickey Mouse, but the Disney castle,4 a confection of airy spires and towers that any medievalist would tell you defeats the purpose of having a castle to start with. Even your more inept horde of barbarians would have little difficulty storming those paper-thin defenses, but then it’s not the bulwarks and baileys that are important, but the towers … the towers, built to entrap fair maidens until their rescuing princes can slip the lock or scale the wall.

I have to imagine that a large part of the obsession over royal weddings proceeds from precisely this happy-ending narrative on which the Mouse has built its house: the sumptuous spectacle of excess and adulation that evokes, time and again, Cinderella’s arrival at the ball. The disruption of this mythos is at once discomforting and titillating: Diana’s 1995 interview presaged Harry and Meghan’s with its revelations of constraint and isolation, and the active antagonism of both the Royal Family and its functionaries toward any sort of behaviour that might reflect badly upon it—even if that behaviour simply entailed seeking help for mental health issues. There have been many think-pieces breaking down which elements of The Crown are fact and which are fiction, but it is at this point fairly well established wisdom that being born a Windsor—or marrying into the family—is no picnic. And while Meghan’s claim that she never Googled Harry or his family strains credulity, I think it’s probably safe to say that no matter how much research one does, the realities of royal life almost certainly beggar the imagination.

Also, The Crown was only in its second season when Meghan married Harry.

I confess that, aside from the very first episode, I did not watch the first three seasons of The Crown, the principal reason being that I couldn’t get my girlfriend Stephanie into the show. While I may be more or less indifferent to the British monarchy, Stephanie is actively hostile5 to it. Born in South Africa, she and her family came to Canada when she was fourteen; having imbibed an antipathy to her birth nation’s colonizer that is far more diffuse in Canada, she gritted her teeth through the part of her citizenship oath in which she had to declare loyalty to the Queen. Her love of Gillian Anderson (Stephanie is, among her other endearing qualities, the biggest X-Files fan I’ve ever met) overcame her antipathy, however, for season four, and so we gleefully watched the erstwhile Agent Scully, transformed into the Iron Lady, spar with Olivia Colman’s Queen Elizabeth (we’re also pretty simpatico in our love of Olivia Colman). With each episode, we reliably said (a) Olivia, for the love of Marmite, don’t make us sympathetic to the Queen!; (b) Gillian, please don’t make us feel sympathy for/vague attraction to Margaret Thatcher!; and (c) Holy crap, Emma Corrin looks so much like Lady Di!

It will be interesting to see The Crown catch up with the present moment. But I also have to wonder whether some commentators are right when they say that the Harry and Meghan split from the Firm signals the end of the British monarchy. To my mind, by all rights it should: it’s long past time this vestige of colonial hubris went into that good night. We’ve got enough anti-democratic energy to deal with in the present moment without also concerning ourselves with a desiccated monarchy. When Queen Elizabeth dies, with her dies the WWII generation. The Second World War transformed the world in countless ways, one of them being that it spelled the end of the British Empire and the diminution of Great Britain’s influence in the world. Brexit is, among other things, a reactionary response to this uncomfortable reality, and a vain, desperate attempt to reassert Britannia’s greatness. Across the pond, fellow nativists in the U.S. have latched onto Meghan Markle’s accusations of racism to make common cause with the monarchy. Not, perhaps, because they’ve forgotten the lessons of 1776, but most likely because they never learned them to start with.

NOTES

1. Perhaps the stupidest defense came from Fox & Friends co-host Brian Kilmeade, who opined that the fact that British Commonwealth countries are “75% Black and minority” demonstrated that the Royal Family could not possibly be racist. Leaving aside the pernicious history of colonialism and the kind of white paternalism epitomized by Rudyard Kipling’s poem “The White Man’s Burden,” can we perhaps agree that Kilmeade’s juxtaposition of “75%” and “minority” sort of gives the game away?

2. I’ve always felt that Piers Morgan was the result of a lab experiment in which a group of scientists got together to create the most perfect distillation of an asshole. Even if we grant his premise that Meghan Markle is, in fact, a status-seeking social climber who has basically Yoko Ono’ed Prince Harry out of the Royal Family, his constant ad hominem attacks on her say more about his own inadequacies than hers. And for the record, I do not grant his premise: to borrow his own turn of phrase, I wouldn’t believe Piers Morgan if he was reading a weather report.

3. We may never know how George R.R. Martin means to end his saga—at the rate he’s going, he’ll be writing into his 90s, and I don’t like his actuarial odds—but we do know how the series ended. The last-minute transformation of Daenerys into a tyrant who needed to be killed could conceivably have been handled better if the showrunners had allowed for two or three more episodes to bring us there; but the aftermath was also comparably rushed, and Sam Tarly’s democratic suggestion for egalitarian Westerosi governance was laughed off without any consideration. I will maintain to my dying day that GRRM effectively transformed fantasy, but also that he was too much in thrall to its core tropes to wander too far from their monarchical logic.

4. I recently bought a Disney+ streaming subscription in order to watch The Mandalorian. While writing this post, I remembered that Hamilton’s sole authorized video recording is Disney property. So of course I immediately clicked over to Disney+ to watch parts of it, and was treated to the irony of a play about the American revolutionary war to overthrow monarchical tyranny prefaced by Disney’s graphic of its castle adorned with celebratory fireworks.

5. When I read this paragraph to Stephanie, she liked all of it but objected to my use of the word “hostile.” “I don’t actually hate the Royal Family,” she said. “I don’t wish them harm. I just find the entire idea pointless and antiquated, and it embodies some of the worst aspects of British history.” So: she’s not hostile to the Royal Family, but I’m at a loss to find a better word, especially considering the invective she hurls at England during the World Cup.
