Haven’t done one of these in a while and was thinking I need to get back into the habit of posting shorter things on a more regular basis, just to keep the pump primed for the handful of people who actually read my blog (you few, you happy few).
Each of the items I’m mentioning here is worthy of a longer, deeper dive, and indeed I jotted notes in my journal to that end. But discretion being the better part of something or other, I acknowledged to myself that were I to attempt three ambitious posts I would almost certainly end up writing none, and the other more ambitious posts I have in the hopper would also probably fall by the wayside. So without further ado …
My dissertation is now old enough to vote, but it’s not gonna ‘cause it’s all RIGGED, man! I suddenly realized that this past Saturday was the eighteenth anniversary of my doctoral dissertation defense. That day remains one of my fondest memories of which I have little memory—the three hours of the defense itself are muddled in my mind because of the intellectual exhaustion that sets in after a certain point, which was then almost certainly compounded by the drinking that ensued.1
It is however a serendipitous bit of timing. My dissertation was titled “The Conspiratorial Imagination,” and was a study of conspiracy theory and paranoia as expressions of American postmodernity. This week we get into my fourth-year seminar on 21st century apocalypticism in earnest, and we’re diving into the novel Fight Club, the cultural earthquakes caused by 9/11, and the conspiracism endemic to both.
This class is a new iteration on a graduate class I taught in winter 2021; that class began mere days after the January 6 assault on the U.S. Capitol, a concurrence that seemed uncannily appropriate. In one of my more inspired off-the-cuff classroom riffs, I drew a line between Jacob Chansley—aka the QAnon Shaman, aka the most bonkers and thus most indelible image to emerge from a day replete with bonkers and indelible images—and the 1990s-era “crisis of masculinity.” This putative crisis inspired some men, influenced by Robert Bly’s book Iron John (1990), to go on men’s retreats into the woods where they donned loincloths, painted their faces, banged drums, and otherwise attempted to get in touch with some sort of elemental, primal, “authentic” manhood that had been erased by a combination of apathy, an excess of civilization, feminism, and a culture in general that repressed or denigrated “traditional” masculinity.
Then as now, the first novel we did was Fight Club—then it was fortuitous, now it is deliberate. As much as I dislike the novel, it’s a little too apt to my topic not to cover.
That class I titled “The Specter of Catastrophe,” which referred to the post-9/11 shift in disaster narratives from the spectacle of apocalyptic destruction (e.g. the alien ships blowing up New York and L.A. in Independence Day) to a specter haunting the devastated and/or depopulated landscapes of The Road or The Walking Dead. The current version of the course is mostly unchanged, but my focus has shifted slightly. It is now titled “Hopeful Nihilism,” which is partly an inversion of Lauren Berlant’s conception of “cruel optimism,” and a way of characterizing the burn-it-all-down ethos of Trumpism: the inchoate rage at a system perceived as broken coupled with the vague sense that its destruction will bring about better circumstances.2
I was at a loss for what image to use for a course poster until it occurred to me to run an image of the QAnon Shaman through the Obama “Hope and Change” poster filter:
It seems appropriate to lean into the conspiracy theory dimension of these topics, especially considering …
Trump unequivocally embraces QAnon. Yup. At a rally in Ohio last week for Hillbilly Elegy author, Republican Senate candidate, and shameless political panderer J.D. Vance, Trump delivered a somewhat more scripted speech than is his usual wont while portentous music played underneath his words and a clot of rallygoers saluted him with a gesture that looked as though a bunch of people sieg heiling were also saying “Over there!” That is, a bunch of people raised their arms at an angle with their index fingers pointed, which is apparently a salute that has been adopted by QAnon enthusiasts. The single raised finger is a reference to the Q slogan “Where we go one, we go all” (itself adopted, weirdly, from White Squall, Ridley Scott’s most forgettable film). The music playing beneath Trump’s speech is also, apparently, music specific to the QAnon movement, titled “WWG1WGA,” an acronym for the above slogan.
Trump has been flirting with QAnon for at least as long as journalists have been asking him about it. As with most things that fall into Trump’s orbit, it’s difficult to know whether he actually knows much about it, or whether he just automatically gloms onto it because the adoration of deranged people is impossible for him to resist (the true Q believers refer to Trump as their “God Emperor,” and you can be 100% certain he loves that). But he’s been coy about it—or as coy as someone as unsubtle as Trump can be, anyway—teasing it with his showman’s instincts, but never explicitly going all in with it.
Until now, that is. On Truth Social, his MAGA Twitter knockoff, he retweeted—sorry, “re-truthed”—a post featuring a stern and heroic-looking image of him overlaid with the words “the storm is coming.” For those less familiar with QAnon parlance, the “storm” is key to their mythology—referring to the long-deferred comeuppance for the Deep State and liberal elites (who are all, let’s not forget, human-trafficking pedophile cannibals) in which Trump will reveal his master plan by arresting all the malefactors and re-assuming his rightful place as President of the United States.3
To be clear, Trump isn’t openly accusing Biden and Hillary Clinton of being sex-trafficking pedophiles (yet), but it is notable that he’s now gone as far as he has. He is not a subtle thinker, to put it mildly, but he has a very canny instinct for running a con: he knows how to string his marks along, whether they be MAGA enthusiasts, Republican pols desperate to tap into his base’s enthusiasm, or deranged conspiracists like the Q people (there’s a Venn diagram for you!). Much of that entails not giving more away than you have to, so it’s interesting that he seems to be leaning into it now.
Part of me finds this new development vaguely hopeful, even as it presages worse to come in terms of possible violence and mayhem. The true believers will almost certainly be roused to action of one sort or another, but this also makes it more difficult for his apologists to ignore his worst excesses. It may all indicate that Trump is losing relevance: out of office, off Twitter, ensconced in his Florida retirement home, and subject to ever-escalating investigations over which he has no control, with ambitious rivals like Florida governor Ron DeSantis (who stars in the third section of this post) smelling weakness, he may be trying to regain traction. But part of the problem with being outrageous is that you need ever-larger displays of outrageousness to keep the focus on you, especially when others have learned your lessons and deploy them more deftly. And that brings us to …
The politics of cruelty. Though Terry Pratchett’s voluminous Discworld series of over forty novels articulates a sophisticated moral philosophy, one of its principal axioms is a simple premise. Though it is present throughout the novels, it is articulated most specifically by the curmudgeonly witch Granny Weatherwax: evil, she says, comes from treating people—oneself included—like objects. When the earnest young missionary with whom she’s speaking4 protests that surely there are worse crimes than that, she agrees, but points out that most such crimes start with treating people as objects.
Every so often I circle back around to wanting to write an article on the consonance between Pratchett’s philosophy and American pragmatism as defined by thinkers like John Dewey, Judith Shklar, and Richard Rorty. Granny Weatherwax’s assertion is of a piece with Shklar and Rorty’s premise that liberal thought proceeds from the understanding that cruelty is the worst thing we do. This premise informs much of Pratchett’s fiction, so it is perhaps unsurprising that I have taken enormous comfort in rereading his work at the various nadirs of the past several years.
The most dispiriting episodes of what I assume we must needs call the Trump Era have all been those instances that have demonstrated the appetite for cruelty not as an occasional gambit or the collateral damage of a given policy shift, but as a new normal in which cruelty is business as usual. One of the most astute insights on this front was Adam Serwer’s Atlantic essay observing that the cruelty of Trumpism isn’t incidental: the cruelty is the point.

The recent idiocy in which Ron DeSantis put asylum seekers fleeing Venezuela on a chartered flight to Martha’s Vineyard is just the latest iteration of a political ethos more concerned with disruption and “owning the libs” than with any coherent ideological project. The whole exercise was an effort to expose the hypocrisy of affluent liberals, who were supposed to react with horror at having migrants suddenly in their midst. The people of the Vineyard did react with horror, but at DeSantis’ callous use of the refugees (who, as refugees, were in fact in the U.S. legally) as pawns in his political stunt. (The refugees themselves were welcomed and treated well, given food and shelter.)

Whatever one’s position on the current fraught state of immigration in the U.S., the basic, irreducible humanity of migrants—be they legal or illegal immigrants, refugees or asylum seekers—should always be the central factor in any consideration. Instead, we’re treated once again to the spectacle of a cruel man treating people as objects in the name of scoring points with a rump of the populace mobilized by Trump’s performative cruelty, all in the hope that he might one day take Trump’s place as their venerated leader. All of which tells us that the Trump Era is far from over.
1. There was, conveniently, a departmental function that afternoon at the Grad Club, a meet & greet for new graduate students, current ones, and faculty. One of the few vivid memories I have of that day is the newest hire—a young Shakespearean I had become good friends with—scurrying over with a pen to edit my nametag, putting “Dr.” in front of my name.
2. I proposed an article on this subject for The Conversation apropos of the first anniversary of January 6th. It was published, but the editorial process was laborious and I was profoundly unsatisfied with the finished product … but still, might be worth a read.
3. Depending on who you talk to, the malefactors will either be subjected to humiliating show trials and imprisoned or summarily executed. Like all mythologies, there are variant versions, many of which have been necessitated by the fact that “the storm” has now been promised for six or seven years and has yet to occur.
4. The novel in which this exchange occurs is Carpe Jugulum (1998), the twenty-third Discworld novel; and the missionary in question is named Mightily Oats—which is the short version of his full name, “The Quite Reverend Mightily-Praiseworthy-Are-Ye-Who-Exalteth-Om Oats.” There’s a book to be written just about Sir Terry’s character names.
Having watched the first three episodes of The Rings of Power, I can, with relief and delight, say that I love it so far. It has its problems, which I may or may not talk about in future posts, but the end of episode three left me wishing I could binge the whole season in a day. Always a good sign. Whether it rises to the level of Peter Jackson’s Lord of the Rings films remains to be seen, but it has that potential.1
So, I won’t be talking about the episodes in detail. Nor do I particularly want to talk about the entirely predictable furor that has become commonplace when something like The Rings of Power is released—namely, the howls of protest that arise from the revanchist precincts of a given fan community when the new product is perceived to have been tailored for “woke” political sensitivities. That particular drumbeat has been pounding since the release of the first Rings trailer revealed that the series would feature elves, dwarves, hobbits, and, yes, humans who were not white—something perceived by the usual suspects as a rank betrayal of the source material in the name of political correctness.2
I don’t want to talk about that backlash … but given what I am talking about, I can’t ignore it entirely. Not that it should be ignored, mind you—but there have been enough good pieces dealing with it that I would just be repeating other people’s principal points.
For the moment what I want to try and do is articulate some thoughts on fantasy and mythology, and the perennial compulsion to adapt, refine, revisit, revise, and expand upon stories that are profoundly meaningful to us one way or another. These thoughts are at the front of my mind for two reasons, the first being the recent embarrassment of riches on television of stories meaningful to me: The Rings of Power’s illumination of earlier histories of Middle-earth; House of the Dragon’s return to Westeros; and just a few weeks ago, Netflix’s release of Neil Gaiman’s The Sandman, the only comic book I’ve ever read obsessively.
The second reason is that this fall I’m teaching a first-year English course called “Imagined Places.” My variation on its theme is modern and contemporary rewritings of Greek myths.3 As part of my preparation I’ve been rereading The Iliad and The Odyssey, which started as an exercise in refreshing my memory of their finer points and has become a joyful reunion with texts I first read in high school, and then again in an intro to classics course I took in my first year of university.
Few works in the history of western literature have inspired quite as much imitation and revision. Keep that in mind as we go, because it’s something I’m going to circle back around to.
My entry point to this discussion is a recent New York Times column by Tolkien scholar Michael Drout, titled “Please Don’t Make a Tolkien Cinematic Universe.” I’ll get into the particulars as I go, but the gist of his argument is that the subtle complexities of Tolkien’s moral universe are beyond the scope of contemporary film and television; unfortunately, the franchise model of cinema that has come to dominate the current entertainment market likely sees Tolkien as a wealth of source material to be exploited. Should Amazon (or, presumably, any other corporate media entity) turn Tolkien’s legendarium into something akin to the Marvel Cinematic Universe, it will inevitably miss, mangle, or pervert the moral vision at the heart of Tolkien’s work.
To be clear, that last sentence is my extrapolation from Drout’s argument—he does not himself say as much in so many words, but it is an accurate summary of his concern. “Is it fair to the legacies of writers like Tolkien,” Drout asks, “to build franchises from their works without their knowledge or permission?” There were attempts to adapt The Lord of the Rings to film while Tolkien was still alive; initial, wary interest on Tolkien’s part turned fairly quickly into antipathy to the whole idea as he rejected one spec script after another, finally declaring that his work was unsuited for film. His biographer recounts a time Tolkien attended a staged version of The Hobbit, at which he frowned every time the play departed from the novel. “So it is hard to believe,” Drout concludes, “that he would have approved of a team of writers building almost entirely new stories with little direct basis in his works.”
Here of course Drout means The Rings of Power, which is drawn from the appendices of The Lord of the Rings.4 It is true that the series is taking a big swing, building out a multi-season story arc from some short summaries and chronologies of the Second Age. Such a sketchy basis, Drout suggests, is a poor foundation on which to build—but then, from what he says, there’s a reasonably good chance Tolkien would similarly have disapproved of Peter Jackson’s films (whose absence from his discussion is an odd omission to which I’ll return). Tolkien’s imagined approval or disapproval is at least somewhat beside the point, however; it is strange to suggest that adaptations and retellings of a given author’s work are somehow “unfair” to their legacy, considering that entire cottage industries thrive on adapting authors like Shakespeare and Jane Austen. Possibly Austen might have liked Clueless, but she would probably have looked askance at Pride and Prejudice and Zombies. Does a spectral Shakespeare burn with anger at all the distortions of Hamlet, from Rosencrantz and Guildenstern Are Dead to The Lion King?
Or is fifty years not long enough? (Tolkien died in 1973). Must we wait for copyright to expire?
I pose these questions with my tongue only partly in my cheek: I’m trying to read Drout’s concerns generously but can’t help but find them a little disingenuous. Especially considering—as I mentioned above—that he elides any mention of Peter Jackson’s films, except to acknowledge that they exist. This omission is passing strange if for no other reason than that we already have a Tolkien cinematic universe, helpfully spanning the extremes of good and bad—with The Lord of the Rings films displaying both profound respect for their source material and moments of brilliant filmmaking, while The Hobbit films are a hot mess of poor directorial impulse control.
There have also been a handful of animated adaptations, most notably Ralph Bakshi’s somewhat trippy 1978 Lord of the Rings that blended traditional animation with rotoscoping.5
There is also a Lord of the Rings stage musical—of which I was somehow unaware until one of my students in my Tolkien class last year sent me a YouTube link, destroying my blissful ignorance.
And while we’re at it, there have also been a significant number of video games set in Middle-earth, perhaps most notably Lord of the Rings Online, an MMORPG in the mold of World of Warcraft, whose fidelity to the geography, story, and lore of Tolkien’s work would probably impress even Michael Drout.
And though Drout’s quibble is with the prospect of a Tolkien cinematic universe, there is also the inescapable fact that there has always been the Tolkien expanded universe. Created initially of course by the man himself with his “mythology”—into which The Hobbit and The Lord of the Rings were folded—it was then enlarged with The Silmarillion and the further eighteen volumes edited by his son Christopher that provided readers with world-building akin to that of the mad encyclopedists of Jorge Luis Borges’ story “Tlön, Uqbar, Orbis Tertius.”
Given this already-existing critical mass of material, I have to wonder why Drout draws the line at The Rings of Power; why choose this moment to stand athwart the expanding Tolkien universe to yell stop?
The uncharitable reading would be to suggest that this is of a piece with the more overtly revanchist backlash, just more delicately articulated; the more generous reading is to take him at his word and presume that he’s concerned that Amazon will take Tolkien down the same road Marvel Comics has gone, or—and this was a pretty widely shared worry, though now mostly allayed—that The Rings of Power showrunners would feel compelled to create a Game of Thrones clone. Drout, indeed, alludes to both these apprehensions while asserting that “the groupthink produced by the contemporary ecosystems of writers’ rooms, Twitter threads and focus groups” of the “Hollywood of 2022” is innately out of step with Tolkien’s unique sensibility:
The writing that this dynamic is particularly good at producing—witty banter, arch references to contemporary issues, graphic and often sexualized violence, self-righteousness—is poorly suited to Middle-earth, a world with a multilayered history that eschews tidy morality plays and blockbuster gore.6
Citing the fact that the quest of The Lord of the Rings is not the attainment of power but its destruction, and that in bearing the Ring Frodo is too wounded, physically and psychically, to return to his life as it was before, Drout asks, “Can a company as intent upon domination as Amazon really understand this perspective—and adapt that morality to the screen?” While I’m not unsympathetic to anyone taking issue with Amazon as a cultural and societal blight, I find this line of argument bewilderingly obtuse—as if Jeff Bezos were the one writing, casting, costuming, and doing art direction.7
Ultimately, however, my interest in Drout’s argument is that its inconsistencies speak to a broader, more significant issue with Tolkien’s mythology—or rather, with Tolkien as mythology. In the end, whether or not Amazon does a good job with The Rings of Power, or whether they do attempt to build out an expanded cinematic universe on par with Marvel, really matters very little for a fairly simple reason: that Tolkien in fact succeeded in realizing his lifelong goal. He created an enduring mythology.
I will come to that in a moment. But first, for the sake of argument, let’s take some of the protests over The Rings of Power casting of nonwhite actors on their merits. One strain of the “I’m not racist, but—” arguments is that Tolkien’s world is explicitly modeled on Northern European geography, cultures, and ethnicities, and so introducing obviously non-European-looking characters disrupts Tolkien’s design and intent. A refinement on this line of thinking says “OK, sure, introduce Black elves, but make it clear how it serves the story!” In other words, offer an explanation of how there came to be Black elves (and hobbits, and dwarves, and humans).
To be fair, I’ve seen such concerns offered with all appearance of sincerity and earnestness,8 and there are, if one has the patience for it, substantive counter-arguments, some of which are based in Tolkien’s own lore. One example that has gone viral is from none other than Neil Gaiman:
There is also the fact that medieval Europe wasn’t quite as monolithically white as is often assumed or depicted, given the vestiges of the multi-ethnic Roman legions left behind after the western empire imploded; there was also, more significantly, brisk trade that brought people from Moorish Spain, North Africa, and the Middle East into regular contact with Europeans of all shades.9
But again, this is all at least partly beside the point I want to make. To be generous, I’ll say that the idea that the casting choices in The Rings of Power are somehow transgressive resides in a vague understanding of the original story as sacrosanct, or that there is an “authentic” Tolkienian vision to which it behooves all adaptors to hew. Leaving aside the simple fact that “authenticity” in this context is always going to be illusory and chimerical, is it really even desirable? Are we obliged to honour every boundary, real and imagined, that an author places on interpretations of their work?
Fortunately such a myopic approach is not practical or feasible. Neither the creative arts nor literary criticism have anything like the legal doctrine of originalism, something of which Tolkien was well aware. He might have gotten stroppy over prospective filmic adaptations or a staged version of The Hobbit, but his lifelong project was to shape a mythology—something he was at his most candid and open about in a long letter to the publisher Milton Waldman. Tolkien was at the time (the letter is undated but most likely written late 1951) shopping around The Lord of the Rings; he was having some difficulty finding a publisher because he was still insisting that LotR should be published in a single volume with the prehistory that would eventually become The Silmarillion. Given the already-epic length of LotR, publishers were understandably reluctant to include the dense mythology. But Tolkien had not yet reconciled himself to jettisoning the narratives of Middle-earth’s First and Second Ages, and he attempted to persuade Waldman by explaining the import of the project to him. “I do not remember a time when I was not building it,” he says. “I have been at it since I could write.” His aim, he says, was not merely to make up stories, but to build a mythology that could supplement what he saw as a lack in his native country:
I was from early days grieved by the poverty of my own beloved country: it had no stories of its own (bound up with its tongue and soil), not of the quality that I sought, and found … in legends of other lands. There was Greek, and Celtic, and Romance, and Germanic, Scandinavian, and Finnish (which greatly affected me); but nothing English, save impoverished chap-book stuff. Of course there was and is all the Arthurian world, but powerful as it is, it is imperfectly naturalized, associated with the soil of Britain but not with English; and does not replace what I felt to be missing.
He goes on to explain that his own mythopoeic project was precisely an attempt to replace what he felt to be missing:
I had a mind to make a body of more or less connected legend, ranging from the large and cosmogonic, to the level of romantic fairy-story—the larger founded on the lesser in contact with the earth, the lesser drawing splendour from the vast backcloths—which I could dedicate simply to: to England: to my country.
Not exactly a humble ambition, something he self-deprecatingly acknowledges as he pleads “Do not laugh!” and adds at the end “Absurd.” But one is put in mind of John Milton’s determination to write an epic poem in English that would be the equal—or the superior—of Homer and Virgil,10 or James Joyce, whose own grandiose ambition was voiced by his alter ego Stephen Dedalus at the end of A Portrait of the Artist as a Young Man when he says, absurdly, “I go to encounter for the millionth time the reality of experience and to forge in the smithy of my soul the uncreated conscience of my race.”
But like Milton and Joyce11, Tolkien succeeded in his seemingly hubristic ambition—he did create a mythology that has taken on its own life as it captured people’s imaginations, albeit not perhaps as he thought it might. His vision, it turns out, was too modest: Middle-earth has transcended England and become the world’s mythology. And like all such mythologies, it is futile to try and contain it.
This is where we come back around to my first-year English class: as I said, we’re looking at modern and contemporary poetry and fiction that reimagines and revisits classic Greek myth, often in ways that give voice to those figures who are ancillary to the main action, or disposable in the name of advancing the plot. The modernist poet Hilda Doolittle (H.D.) writes “Eurydice,” in which the title character excoriates her husband Orpheus for consigning her to Hades—again!—when he couldn’t help but look back at the last moment (you had one job!). Or Canadian poet Don McKay’s reimagining of Icarus as an unapologetic adrenaline junky aiming to get that fall from on high just right.
And then there’s the story of Briseis: The Iliad, long considered the origin point of Western literature, begins when the great warrior Achilles and High King Agamemnon quarrel over Briseis, a Trojan woman who is Achilles’ spoil of war, whom Agamemnon takes from him. In a fit of pique, Achilles retires to his tent to sulk and withholds the forces he commands, allowing the Trojans to turn the tide and bring the fight all the way back to the Greek ships, forcing Agamemnon to swallow his pride and send Briseis back to Achilles. In her novel The Silence of the Girls, Pat Barker retells the events of The Iliad from Briseis’ point of view. This shift of perspective makes explicit the brutality of the war and the way it envelops non-combatants in its implacable, bloody logic: the trauma visited on ordinary people, the slaughter of innocents, the enslavement and rape of the women taken captive. Barker’s novel is harrowing, brilliant, and faithful to The Iliad; it would also, I am certain, be considered a “woke” attempt to tear down the edifice of classic Greek literature if read by those who view a Black elf as a profound betrayal of Tolkien’s legacy.
A thought I’ve had a lot lately is what low esteem Tolkien purists have for Tolkien’s work, given that they seem to believe the slightest deviation from what they perceive as the original or authentic vision is somehow mortally damaging to Tolkien’s legacy. The opposite is the case: Tolkien’s peculiar genius lies in the very capaciousness of his vision. It contains multitudes. Not even Amazon can dent it. And that, of course, is the nature of mythology: if the ancient Romans had had Twitter, there would have been endless arguments in the Greek myth fan community over the Latinizing of the gods’ names; Ovid would have been excoriated for the liberties he took, Virgil for being an Augustan propagandist; and legions of toga-clad hipsters would sniff into their wine (locally sourced, of course) that Hesiod was the only truly authentic mythographer.
But none of the Roman emendations and transformations of the Greek myths had a deleterious effect on Homer, any more than do contemporary novels by Pat Barker or Madeline Miller. And no more than The Rings of Power, Ralph Bakshi’s The Lord of the Rings, or even Peter Jackson’s The Hobbit have had on Tolkien’s mythos.
Indeed, in his letter to Milton Waldman, Tolkien is quite clear about how he wants his mythology to be picked up, adapted, and transformed by others: “I would draw some of the great tales in fullness, and leave many only placed in the scheme, and sketched. The cycles should be linked to a majestic whole, and yet leave scope for other minds and hands, wielding paint and music and drama.” And so it has transpired. Whether Tolkien would have liked any of the scholarship, cinema and television, fan fiction, video games, and indeed the whole raft of imitative works that came to comprise the genre of fantasy as we now know it is irrelevant. Such is the risk you take when you create something transcendent.
1. I suppose we should appreciate that Jackson established a spectrum of quality against which all future Tolkien adaptations can be measured, with the Lord of the Rings films establishing the high-water mark and The Hobbit films as the nadir.
2. One thing I will say on this point is that two of the things giving the anti-woke crowd the nativist vapours are my favourite parts of the show: namely, the young, impetuous, warrior version of Galadriel played by Morfydd Clark, and the EOC (elf of colour) Arondir, played by Ismael Cruz Cordova. Cordova is veritably mesmerizing as Arondir, playing the character with a tightly controlled mien that just barely hints at a tumult beneath; when in the third episode he breaks out the Legolas-like balletic combat, I don’t think his expression changes. Not that I would want to typecast him, but he’s all set to play a Vulcan should he want to stick with the SF/F thing.
Meanwhile, the complaints about Galadriel are as predictable as they are exhausting, starting with whinges about making a female character the show’s main focus and filter. Beyond that, (some) people can’t seem to reconcile Clark’s smouldering rage and badass combat skills with the cool reserve with which Cate Blanchett endowed the character … never considering, presumably, that that degree of imperturbable self-control is something one earns.
Also, it’s probably mere coincidence that the first thing Galadriel does in the first episode is kill a troll. I’m sure that wasn’t something the writers did in anticipation of the backlash the character would inevitably face.
3. We’re starting with Rick Riordan’s YA novel Percy Jackson and the Olympians: The Lightning Thief; then Pat Barker’s The Silence of the Girls, which is a retelling of The Iliad from the perspective of Briseis, the Trojan woman over whom Achilles and Agamemnon quarrel; Circe by Madeline Miller, which tells the story of the titular witch who plays a small but significant role in The Odyssey; and The Just City by Jo Walton, a novel in which the goddess Athene creates Plato’s Republic out of curiosity, to see if it would actually work. We’ll also be doing a bunch of poetry (Yeats’ “Leda and the Swan,” for example, and William Carlos Williams’ “Landscape with the Fall of Icarus”), as well as reading the relevant myths as recounted in Edith Hamilton’s Mythology (I would have actually preferred to use Stephen Fry’s book Mythos, which is an arch and elegant compendium, but it is too expensive and too long to justify including it, especially when my students would only be reading a fraction of it. I’ve been listening to it on audiobook off and on all summer—read, of course, by Fry himself—and it is a delight).
4. For legal reasons I haven’t entirely parsed, Amazon has the rights to The Lord of the Rings—and its appendices—but not The Silmarillion.
5. The Bakshi LotR was just the first half—that is, all of Fellowship and parts of The Two Towers. He’d planned to tell the story in two films, but the second was never made … for reasons that become clear when you watch the first.
6. If current film and television were in fact limited to these possibilities—which seem to be delineated here as Marvel’s archness, Game of Thrones’ sex and violence, and the self-righteousness of a vaguely imagined wokeness—he might just have a point. But I don’t have to reach back fifteen years to The Wire and The Sopranos and the rest of prestige television’s first glut of great work to refute Drout’s characterization. In just the past two or three years, Severance, Better Call Saul, Our Flag Means Death, The Mandalorian, The Plot Against America, Succession, The Good Place, Sex Education, Slow Horses, Yellowjackets, and countless other series have shown the depth and breadth of the subtlety and sophistication that is out there.
7. I’ve written previously on this blog about the contradictions inherent in Bezos’ avowed love of Star Trek, given that Gene Roddenberry’s utopian vision of the future was utopian in part because it imagined a future in which a billionaire class was unthinkable … but this isn’t really what Drout’s on about.
8. Though, really, any time you feel compelled to preface a statement with “I’m not racist, but …” there’s about a 100% chance you’re about to say something racist.
9. Also, not for nothing, but it’s not as if The Rings of Power is doing for the Tolkien expanded universe what Black Panther did for the MCU, in which Martin Freeman and Andy Serkis were the sole Tolkien white guys (get it?). The “diversity” of The Rings of Power, in the end, is a handful of dark faces in a large cast that is otherwise still overwhelmingly white. The fact that so many people are reacting as if the showrunners had put a sign saying “Caucasians need not apply” on the audition room door is perhaps the most telling point.
10. Interestingly, Milton in the end had about as much regard for Arthurian legend as Tolkien did—which is to say a qualified regard: like Tolkien, he found it wanting. In the early days of considering what the subject of his epic would be, Milton entertained King Arthur, but discarded the idea as too parochial.
11. The rumbling you hear is Harold Bloom spinning in his grave at me mentioning Tolkien in the same sentence as Milton and Joyce.
I’ve received a number of questions from people asking whether I plan to post on a weekly basis about House of the Dragon. Thanks to those who’ve reached out: it is gratifying to know the Game of Thrones posts I did with Nikki over nine (egad!) years were enjoyed. I’m afraid my answer has to be an unsatisfying maybe. I’ll certainly have stuff to say, which may or may not make it to my blog, but what I definitely won’t be doing is posting long, detailed blow-by-blow recaps/reviews—not with Nikki, and not by myself. We did that for all eight seasons of Game of Thrones, and I think it’s safe to say we both hit our saturation point.
And there’s also the fact that one of the reasons we were able to sustain that output—by the end, our posts were averaging over ten thousand words, which even between two people is a lot of text to produce in a couple of days—is that GoT was a genuine cultural phenomenon. It’s not that I wouldn’t be amenable to the idea of doing that sort of thing again, but not with a property like HotD, whose Westerosi territory we’ve mapped pretty thoroughly.
But having written as much as I did—one half of seventy-three 8-12K word posts is a hefty chunk of text—there’s a certain amount of muscle memory there that would be hard to ignore, even if I wanted to. Two episodes in, my mind is flexing in some familiar ways, though less with the granular details of the show than with the broader issues it raises that reflect back on GoT, having to do with fantasy and genre and world-building. Also, given that Rings of Power powers up this week, there are some interesting possibilities arising in the serendipitous juxtaposition of HBO’s ongoing televisual adaptation of George R.R. Martin’s latter-day fantasy and Amazon’s attempt to build upon Peter Jackson’s success in bringing Middle-earth to the screen.
I also watched Wheel of Time, likewise on Amazon, this past winter. And while I didn’t have anything to say about it at the time—honestly, I was underwhelmed—revisiting it might make for some useful points of comparison.
I commented in my previous post that I was more worried about Rings of Power, because that show has set itself a much more difficult task. The two shows are superficially similar insofar as they’re both prequel-ish narratives adapted from the histories preceding the stories that made people interested in those histories to start with. But beyond that point of comparison, there really is, well, no comparison.
For the moment I’m going to keep my powder dry on The Silmarillion and Tolkien’s legendarium more broadly until I’ve seen the first episode of Rings. Suffice to say that adapting Tolkien’s mythology is not the same kind of task as adapting The Lord of the Rings.
By contrast, HotD presents entirely the same kind of task as adapting A Song of Ice and Fire, and one made easier by the fact that all of the heavy lifting—which is to say, its principal world-building, its aesthetic, its general narrative sensibility—was well established by the eight seasons of GoT. Perhaps more significantly, there is much less at stake in HotD: whatever dragon-borne sexytime shenanigans these Targaryens get up to, we know that they’ll still be around for Robert Baratheon to usurp in two centuries or so. We even know, should we wish to consult Fire and Blood, GRRM’s “history” of the Targaryen dynasty, who wins and who loses, and how these conflicts either resolve or fail to resolve themselves.
To be clear, when I say there isn’t much at stake, I mean artistically and narratively. HBO has the pots of money it’s investing at stake, so if people’s appetite for GRRM’s particular brand of venality and violence, of incest and intrigue, proves to have been exhausted by GoT’s ignominious exit, well, that’s going to result in a fresh batch of firings. But for those of us tuning in, we’re going to be propelled less by where we’re going than by how we get there—by which I mean how the episode-to-episode drama unfolds, and how much we invest emotionally in the characters.
The good news, based on episode two, is that they’re off to a good start. The awkwardness of the writing in the first episode was largely smoothed out, and we’re starting to get a better sense of the main players and their motivations. The casting is proving its quality: there are no discordant notes. The MVPs for this episode are Milly Alcock as the young and precocious Rhaenyra and Eve Best as the jaded but shrewd Rhaenys, the Queen That Never Was.
(Something that comes with adapting fantasy is the realization that names as written on the page don’t always do well when spoken out loud. GRRM is usually pretty user-friendly with his Jons and Roberts and Neds, but occasionally we stumble over the jaw-crackers—as Samwise Gamgee might call them—more typical of the genre).
The conversation between the elder and younger Rhaenladies is a good indication of how this series will play out, and a reminder that GoT, for all its spectacle, was at its best when its antagonists fenced with words as well as swords. Rhaenyra surprising the meeting on the bridge at Dragonstone and calling her uncle’s bluff was also a close contender for the episode’s high point.
But where GoT was about a multi-front civil war taking place against the looming threat of the Night King’s malevolent return, HotD is about the internecine conflict of a single dynastic family, whose end—as observed above—is foreordained by history. This much is made clear by the opening credits, which retain the theme music of the original and the general aesthetic sensibility, but which unfold within the contained and claustrophobic walls of a castle we assume is the Red Keep.
The unifying symbol of the GoT credits was the armillary sphere containing the sun, soaring above the vastness of Westeros and Essos. From this celestial perspective we were shown the key points of geography relevant to the given episode. By contrast, the armillary sphere is replaced at the start of the HotD credits with the House Targaryen sigil, which connects to all the other nodes within the castle walls via criss-crossing streams of blood—which itself has a triple meaning, referencing familial bloodlines, one half of the Targaryen motto (which makes me wonder if future credits will feature fire?), and presumably the rivers of blood that will flow once the fighting begins in earnest.
[CORRECTION: The first image we see in the sequence is not the Targaryen sigil, but a sort of bas-relief of dragons flying around a stronghold that, based on King Viserys’ hobby-model (what kings do instead of model trains, presumably), is meant to be old Valyria. This almost certainly means that the stone walls and corridors through which the blood flows are probably those of the model and not, as I’d blithely assumed, the Red Keep. Not that this really changes my interpretation: it’s still contained within the confines of the Targaryen dynasty, except more explicitly, and with the added sense of being yoked to a mythologized history. Still, more food for thought there.]
By way of conclusion, a few quick thoughts in no particular order:
The CGI ain’t great. It’s generally OK, but there are those telltale moments when it looks more like a middling video game than a high-budget series. I have to imagine HBO is hedging a bit, which is fair enough: it took GoT a few seasons to establish itself. Before that, the show either avoided expensive, large-scale sequences by, say, knocking out Tyrion just before a battle, or limited itself to one or two big spectacles a season. It’s fortunate the show proved its worth by the time the dragons got big.
The scene in which King Viserys walks and talks with his possible grade-school bride is one of the squickier sequences I’ve seen in any show, but all credit to Paddy Considine for exuding profound discomfort throughout.
I’m in equal measure intrigued by and concerned about the character Mysaria, Daemon Targaryen’s prostitute-turned-paramour. Her scene following the bridge confrontation, in which she takes Daemon to task for using her as a provocation, was powerful, and it makes me hope the writers paid attention to the criticism leveled at GoT’s cavalier use of sexual violence and disposable female characters. And I am concerned because, well, perhaps this will end up being just more of the same.
If Daemon and Rhaenyra are any indication, the Targaryens ruled by divine right of cheekbones; two hundred years hence, Cersei Lannister attempts to follow suit.
I was asked recently which TV series I was most excited about: House of the Dragon, HBO’s prequel to Game of Thrones; or Rings of Power, Amazon’s prequel-ish adaptation of elements of Tolkien’s The Silmarillion (which I hesitate to call a “prequel” to The Lord of the Rings for reasons that aren’t entirely germane here but which I’ll likely articulate once the series starts).
I wasn’t sure how to answer that question. I suppose that, all other considerations aside, I’m more looking forward to Rings of Power … but that anticipation is tempered by the awareness that the line between amazing and terrible is a trickier one to negotiate with that source material. It was broadly thought that The Lord of the Rings was unfilmable until Peter Jackson proved everyone wrong on that front, but that presumption had more to do with the prior limitations of special effects technology than with storytelling. The principal reason LotR was so good—and why The Hobbit shanked so badly—is that Jackson treated the source material of the former with profound respect. The story of LotR needs little tinkering, as evidenced by how often the films take dialogue verbatim from the novels. The Silmarillion, by contrast, is written as, and unfolds like, mythology, which will necessitate some significant tinkering. Threading the needle between rendering it naturalistic and hewing to the spirit of Tolkien’s story will be difficult.
Which is why I’m more confident that House of the Dragon will hit its marks, given that it was always meant to be entirely consonant with its predecessor. I have not, however, been particularly looking forward to it, for what I assume are obvious reasons. Like most people who loved GoT, I found that the final season left a sour taste in my mouth and that the ending felt like a betrayal—not so much a betrayal of the characters, as many people felt, but a betrayal by the showrunners of the show itself. After seven seasons of often superb, unhurried, nuanced storytelling and world-building, showrunners D.B. Weiss and David Benioff—now without the scaffolding of the great bearded glacier George R.R. Martin to shore up their own writing faults—raced to a slapdash finish in a truncated final season that effectively upended everything that had come before and slapped the goodwill of fans in the face.
Which isn’t to say I won’t watch HotD, but I don’t feel inclined to write lengthy recaps/commentaries as Nikki and I did for GoT.
That being said, I watched the first episode last night, so here are my thoughts in no particular order. Mild spoilers ahead.
The first episode was … OK. I was more or less on board by the end, which is a good sign—it means, possibly, that the awkwardness of the writing had more to do with this being a pilot setting up the characters and contexts, and that the show will find its rhythm as we go. Fingers crossed.
I can tell it’s going to take a few episodes to adjust to Matt Smith playing the villain. He’s still so indelibly the Eleventh Doctor for me, though I suspect it will be a lot like watching David Tennant play Kilgrave in Jessica Jones. Smith as Daemon Targaryen has similar energy: the whole manic alien-among-humans thing repurposed as gleeful psychopathy.
As trepidatious as I was going in, that theme music … man, it’s good to hear it again.
Not to repeat myself, but I do hope the writing finds its groove. Too many awkward moments of dialogue to really overlook … however much Benioff and Weiss floundered with the plotting after they overshot GRRM’s runway, even by the end the moment-to-moment of GoT never felt inauthentic.
The juxtaposition of the bloody birthing scene and the jousting was a little … obvious. C’mon, man. We get it.
Something Stephanie pointed out right at the outset: do clothing styles not change in Westeros? This is almost two centuries before GoT; you’d think there’d be some differences.
I can’t quibble with the casting. Once I’ve worked through my Doctor Who issues, Matt Smith looks to be great. Paddy Considine (King Viserys) has been great in everything I’ve seen him in. Rhys Ifans (Hand of the King Otto Hightower) is also always a solid bet. I had to do an IMDb search to figure out where I’d seen Eve Best (Princess Rhaenys, aka the Queen That Never Was); it was Nurse Jackie, a series I can’t recommend enough, and she was amazing in it, as she looks to be here. Though I really, really hope she gets to do more. And I really like Milly Alcock as Princess Rhaenyra: the ambitious and precocious teenage girl who wants more than the life normally allotted a woman is a role we last saw done superbly by Maisie Williams as Arya Stark; Alcock so far isn’t overplaying it, and I appreciate the subtlety she brings, especially considering we can expect it to be contrasted with Matt Smith’s gleeful scenery-chewing as her principal antagonist.
So I guess we’ll see. Not a bad start, but with the way GoT left things, the series has its work cut out for it in convincing fans to let themselves be hurt all over again.
One thing I’ve found interesting: there have been more than a few (by which I mean two or three that I’ve seen) think pieces wondering whether HotD will fill the vacuum left by GoT as a show that gives people a common point of contact—as whatever these days passes for water-cooler conversation, or in the more highbrow terminology, a “monoculture.” As Alyssa Rosenberg writes in the Washington Post, “It’s not just that Game of Thrones left behind unfinished conversations. Rather, the show seemed to mark the end of mass, sustained cultural debate period.”
However much GoT was a hugely popular show, I feel this overstates things … or possibly evinces a critic’s nostalgia for entertainment properties that captured more than the niche attention that has increasingly been the norm since television fractured into a wider cable universe, which itself then gave ground to streaming services and their infinitude of offerings. It’s odd to consider that the ratings for the GoT finale, its most-watched episode at just shy of 14 million viewers, were about the same as those for an average episode of Seinfeld in the 1990s. When I was a TA in the first years of my PhD, I could cite The Simpsons in my classes by way of explaining things and be confident that all my students were familiar with my references.
It’s been a long time since there’s been that kind of touchstone—GoT was the closest thing we’ve had, and I somehow doubt HotD will capture the same lightning in a bottle. But I suppose we shall see.
So, as anyone who knows me will attest, I have issues with brevity. Some people have a talent for the pithy 500-word article or post that gets right to the heart of a thought. I most emphatically do not … I am almost invariably digressive, finding my way down endless rabbit holes of thought, chasing the shiny objects of serendipity where they lead me. Some time ago I got in the habit of having at least two different-coloured pens at hand when I write in my journal; when my train of thought veers in a different direction, I switch pens. That way I have an obvious visual guide when I read over my notes highlighting where I change topics.
On the other hand, serendipity and digression have been my twin engines of discovery over the course of my fifty years, and given that I don’t have to chase large audiences, I get to indulge my tendencies on my blog, which really should have been named “Thinking Out Loud.”
All of which is by way of saying that the second half of my original post has swelled to the point where I need to cut it in half, less for the sake of brevity than because, at that point, I pretty much move on to a new topic. So the original attempt at posting will now be in three parts.
As I said toward the end of part one of this post, I went back and forth a few times on whether to title these two posts “The Banality of Ego.” Why was I uncertain about that title? My main concern is that it will come across as a bit too cute and too clever by half; if perceived as glib, it invites justifiable ire, given that it’s an allusion to the subtitle of Hannah Arendt’s book Eichmann in Jerusalem. It’s also not exactly accurate insofar as the needless death and destruction caused by Harris’ egoistic obstinacy was not itself banal so much as catastrophic. On the other hand, Bomber Harris was hardly a sui generis personality. His particular blend of unearned self-assurance, cultural chauvinism, and bullying arrogance is something of which there are numerous other examples throughout Britain’s military and imperial history. That he rose to prominence in a system that rewards such an ego—which is not, to be clear, peculiar to Britain—indicts the system as much as the individual. It is in this respect that my thinking is consonant with Arendt’s thesis.
Arendt’s argument is that Adolf Eichmann—a Nazi functionary who’d escaped Europe at the end of the war only to be captured by the Mossad in Argentina in 1960 and shipped to Israel to stand trial for war crimes—embodied “the banality of evil.” Arendt, a German Jew who had fled the Nazi regime and made it to the U.S. in 1941, was assigned to cover Eichmann’s trial for The New Yorker. She was struck by how unremarkable Eichmann was: he was, as she described him, a mediocrity of a mid-level manager who seemed befuddled by the gravity of his trial. The quotidian reality of his job was banal: a nine-to-five, paper-pushing, number-crunching, form-filling job that organized the transport of millions of Jews from their homes to the death camps. He seemed out of place: the antithesis of the kind of arrogant, defiantly villainous monster already becoming familiar in cinematic depictions of Nazis. What was truly horrifying was just how ordinary and unremarkable he was, how ordinary and unremarkable were the particulars of his job—in apparent contrast with the unthinkable horror he was instrumental in perpetrating. Except that, as Arendt realized in her reports, such a contrast was illusory; banality was not discordant with but necessary to the organization of systematic mass slaughter.
Her observations about the banality of both Eichmann the person and the job he had performed did not sit well with many people, who found Arendt’s thesis abhorrent. Personally, I have always found Arendt’s thesis persuasive; Eichmann in Jerusalem was for me one of those books that profoundly affected my perspective on the world. This indeed is the nub of why my mind snagged unpleasantly on Malcolm Gladwell’s breezy dismissal of Bomber Harris as a psychopath, and why I find myself writing several thousand words about his otherwise forgettable book: relegating the indiscriminate death and destruction of area bombing to the psychopathy of a single figure is too easy and exonerates everyone else.
People took umbrage at Arendt’s thesis in part because the very notion of banal evil seems a contradiction in terms. It offends the instinctive sensibility that tells us evil is necessarily exceptional. Arendt’s argument that evil on the scale perpetrated by Nazi Germany is necessarily systemic is an assertion that still offends a large number of people, not least because a corollary of systemic is complicit. The current furor over critical race theory in the U.S. is a case in point: the cynical political calculations involved in attacking it aside, it’s an easy target for conservative politicians because it is deeply discomforting for people to consider that they benefit from the legacy of slavery and the long history of racist policies designed to benefit whites. It is much more comforting to think of racism as an individual failing, especially when embodied in Hollywood’s caricatures of crapulescent chaw-chewing Southerners or hate-spitting Klansmen, who share with Nazis the cinematic appeal of being obviously evil.
I should note that I am uncomfortable with using the word “evil” in a straightforward or unproblematic way. There are a number of reasons for this, but two stand out: first, “evil” tends to connote a transcendent, otherworldly quality that identifies those it describes as somehow different in kind rather than degree. Second, this understanding tends to eliminate nuance, as it renders evil necessarily exceptional, sui generis in its discrete instances in spite of its oft-lamented ubiquity. Arendt never downplays the monstrousness of the Final Solution and its principal architects, nor does she forgive or ameliorate Eichmann’s participation in its prosecution. Substitute “systemic” for “banal” and we begin to grasp the most instinctive objection to Arendt’s argument: not just the idea that evil could be banal, but that, being systemic, we are all potentially implicated.
None of the foregoing is about equating Bomber Harris with the likes of Goebbels or Himmler, nor is it to offer some sort of twisted both-sidesism about WWII; nor is it to suggest that the greatest evils of the war were merely a function of thoughtless bureaucratic inertia. I am rather attempting to work through some of the knottier problems attached to thinking the Second World War, at least in terms of how we’ve come to represent it. I’ve fixed on Randall Jarrell’s bomber poems in part because they help elucidate the stark contrast between how art and literature grappled with the First World War and how they engaged with the Second. To oversimplify for the sake of framing the question: why did so much great art come out of WWI and so little from WWII?
Part three will pose some thoughts in answer to that question.
 Whatever else we might say about Bomber Harris, he was vastly more competent than Hermann Goering.
 Though in truth it would be more accurate to say it is Arendt’s thesis and its influence on my thinking that shapes my own discussion here.
 Not for nothing, but this is what Steven Spielberg got exactly right in Schindler’s List. The scenes of brutality and the psychopathy of Amon Göth, the commandant of Płaszów, are harrowing. But more insidiously present throughout the film are the legions of clerks with their pens and papers and blotters and portable desks, who appear at any point when a large number of Jews need to be processed.
 I wrote two fairly long paragraphs at this point in the post that explored this point in greater depth, but decided to leave them out, as they took my discussion down yet another rabbit hole. TL;DR: I explicated my point about complicity by way of the example of the pop-cultural fascination with serial killers, both fictional characters like Hannibal Lecter and real-world examples like Ted Bundy. They are attractive to the imagination, I suggest, because they appear to be embodiments of absolute evil, but our very fascination is an expression of complicity. Saving these thoughts, possibly, for a future post.
Trying something new: In the process of researching for articles in progress, I often come across cool and interesting stuff that, while cool and interesting, I can’t really use. Conversely, I find myself writing myself into tangents of thought that, again, while cool and interesting, are at best ancillary to the project. Sometimes these things make it into footnotes, but more often than not they remain as jots in a notebook, or are lost to my memory when I delete them from a draft.
Hence, “Research Notes,” an outlet for such ancillary thinking and tangents. At this rate, the essay I’m currently working on will produce a few of these posts.
I’m currently working on an article about the war poetry of Randall Jarrell, a mid-century American poet most famous for his “bomber poems.” Jarrell did not himself see combat: he enlisted to be a pilot in the U.S. Army Air Force, but washed out of flight school. He did, however, prove to be quite adept at celestial navigation, and was kept stateside as an instructor. His bomber poems are in part about the depersonalization of soldiers; the heavy bomber as it appears in them is the distillation of the individual’s mechanization and assimilation into the war machine.
Or that’s what I’m arguing, at any rate.
In the process of researching this paper I’ve been immersing myself in the history of the bombing war, which makes for fascinating, if often harrowing, reading. Several weeks ago, I picked up Malcolm Gladwell’s most recent book, The Bomber Mafia. It was with a bit of surprise that I realized, as I read the front-matter list of all the other books Gladwell has written, that I had never read a book by Malcolm Gladwell. It felt as though I’d read several, because I’ve read a lot of his essays in The New Yorker and other publications, I’ve read or listened to a number of interviews with him, and I listened to at least one season of his podcast Revisionist History. And for a while it seemed as if every time I turned around he’d written a new book.
But I’d never read any, until now. The fact that it felt as though I’d read his books but hadn’t is either ironic or appropriate—or ironically appropriate—given that Gladwell’s usual practice is to take a piece of conventional wisdom, something people feel is true, and then show the various ways in which it isn’t. He’s the sage of counter-intuition, and is very deft at building narratives that, though they start out counter-intuitively, come themselves to feel true, becoming (potentially) new nuggets of conventional wisdom.
So it was interesting to read Gladwell’s take on a subject in which I’d been recently immersed, given that it makes his schtick pretty obvious; I can see why he irks a lot of historians, who often take issue with his tendency to oversimplify. Gladwell writes for a broad audience, and his books are basically pop-history (and pop-other academic disciplines). He covers much of the same territory as the chunkier histories I’ve read on the subject, in two hundred breezy pages, and in the process crafts a narrative that, while not historically incorrect per se, prunes the history in ways convenient to his story.
To be fair, all history engages in such pruning to some extent; but Gladwell’s has the effect of eliding much of the nuance of the history he recounts. It was one such oversimplification that snagged my mind and sent me down the rabbit hole of thought that led to this post.
This one detail leapt out at me not least because it was in both the main text of the book and in the summary on the back. The back of the book promises that “In The Bomber Mafia, Malcolm Gladwell weaves together the stories of a Dutch genius and his homemade computer, a band of brothers in central Alabama, a British psychopath, and pyromaniacal chemists at Harvard to examine one of the greatest moral challenges in modern American history.” The research I’d done so far rendered this sentence, designed to be mysterious and tantalizing, satisfyingly transparent: the Dutch genius is Carl Norden, who designed the famous Norden Bombsight (the “homemade computer” in question), which, it was believed, would prove so accurate as to revolutionize warfare; the “band of brothers” in Alabama are the aviators who essentially invented the U.S. Air Force, and who became the generals responsible for prosecuting the air war in Europe and the Pacific; and the “pyromaniacal chemists” were the people who improved and refined the incendiary bombs—eventually inventing napalm—that wrought so much destruction.
That leaves the “British psychopath,” who could only be Air Marshal Sir Arthur “Bomber” Harris, the man who led the RAF’s Bomber Command like a medieval fiefdom. Sure enough, when Gladwell introduces Harris into his narrative, he says unequivocally, “Arthur Harris was a psychopath.”
I found myself somewhat irked by this characterization, and I wondered why on earth it bothered me. By all rights it shouldn’t have: while I’m unqualified to clinically diagnose anyone with psychopathy one way or another, I also wouldn’t want to oblige myself to argue that Bomber Harris wasn’t a psychopath … for the simple reason that he most probably was one, and certainly was in the more colloquial way we mean the term when we refer to someone who is at once cruel and malicious and takes pleasure in inflicting pain and death, and/or is callously indifferent to the suffering he inflicts. Harris was absolutely determined to use his fleet of heavy bombers to reduce every German city of note to rubble and kill as many German civilians as possible; and in pursuing this bloody goal he was also quite cavalier with the lives of his air crews, to the point—as Gladwell notes—that while he was popularly known as “Bomber” Harris to the public, his men called him “Butcher.”
But a psychopath? Well, probably. But it was psychopathy enabled by an unholy combination of military bureaucracy and the inertia it produces, bloody-minded obstinacy, and national and personal ego. And while it’s more narratively and morally satisfying to ascribe the kind of indiscriminate destruction wrought by RAF Bomber Command to an unhinged mind, doing so obscures the ways in which the sheer scale of violence in WWII was made possible only by bureaucratic banality.
Let me back up a bit. The bomber war was divided between two distinct ethoses: “precision” bombing and area bombing. I put the one descriptor in quotes but not the other for the simple reason that precision in high-altitude bombing was a comforting fiction; area bombing, by contrast, is an accurate name for an inherently inaccurate practice. The “bomber mafia” of Gladwell’s title were a small group of American military aviators who came to believe—believe passionately, in fact—that wars of the future could be won by a relatively small number of bombers destroying specific targets crucial to the enemy’s military and civilian infrastructure. Take out power stations and relays, factories manufacturing key items like ball bearings, transportation hubs, and so forth, and you would cut the strings that made the enemy war machine dance. So simple! All you need is a consistent way to deliver munitions with reliable accuracy from a safe altitude.
It should be noted that this blue-sky thinking proceeded from two laudable premises: first, anything that could prevent or circumvent the kind of horrific attrition recently experienced in the trenches of WWI—indeed, anything that could make any war regrettable but short—was a moral good; and second, that precision bombing theoretically allowed you to solely strike military targets and avoid civilian casualties.
The Norden Bombsight seemed to promise such accuracy. It was heralded as a marvel of engineering that would take into account a huge range of variables—among other things, windspeed, forward velocity, air temperature and density, bomb size, even the earth’s rotation—which the bombardier would feed into the machine’s analog computer and, from twenty thousand feet up or more, plant a bomb neatly into a pickle barrel.
Now is when the narrator cuts in to say, “They did not, in fact, plant bombs in pickle barrels.”
Far from it: bombing was never anything close to the exact science imagined by the bomber mafia and touted by wartime propaganda. Leaving aside the fact that the Norden Bombsight was not nearly as accurate as promised, bomber fleets had to deal with complications ranging from the harrowing (enemy fighters, flak) to the quotidian (cloud cover). Sometimes bomber crews lost their nerve and turned tail before reaching the target, often dropping their payloads over farmland or forest. Sometimes their navigation was so off they bombed the wrong city entirely. As Paul Fussell notes in his book Wartime, an exhaustive evisceration of WWII myths, “The fact was that bombing proved so grossly inaccurate that the planes had to fly well within anti-aircraft range to hit anywhere near the target, and even then they very often missed it entirely.” As the war progressed, “‘precision bombing’ became a comical oxymoron relished by bomber crews with a sense of black humor.”
There was a constant push-and-pull between the British and the Americans throughout the bombing war in Europe. The putative psychopath Arthur Harris considered the American insistence on bombing specific targets absurd; he constantly harangued his American counterparts—and harangued Churchill to harangue Roosevelt—to give up their daytime attacks and join the RAF in night-time area bombing to more quickly effect his goal of reducing every German city to rubble. Harris dismissed the American insistence on hitting crucial targets, whether they were travel hubs, ball bearing factories, or oil production, as “panaceas.” That became his favourite word every time he was urged to have his bombers try to hit targets more specific than city centers: he imbued it with a haughty, derisive disdain for what he saw as naïve and even childish American thinking. He embodied the antithesis of the Bomber Mafia’s idealism. War for Bomber Harris was a necessarily brutal affair and every single German was, to his mind, a valid target. American queasiness at the prospect of civilian casualties was, he believed, a weakness that wilfully ignored reality and would prolong the war and, paradoxically, cause greater suffering. Only by breaking the German spirit by pitilessly raining destruction on their cities would the war be won—and ultimate victory, he insisted all the way through, could be achieved by bombing alone.
Again, cue the narrator: “The war could not, in fact, be won by bombing alone.”
However, as the war wore on through 1943 and into 1944 and the behemoth of the U.S. military-industrial complex started producing bombers faster than aircrews could be recruited and trained, the American strategy ultimately proved more effective. To be clear: “precision bombing” was never a reality, never became anything more than an oxymoron to be savoured as gallows humour. But as the USAAF was able to put more and more planes in the air, focused raids on high-value targets increasingly paid dividends that Harris’ night-time area bombing did not, even as Harris’ air fleet similarly grew larger and larger. As Canadian historian Randall Hansen details in Fire and Fury, what kept Albert Speer (who had quickly risen from his role as Hitler’s pet architect to being in charge of all war production) awake at night wasn’t the devastation wrought on civilian populations by the RAF, but the massive blows to industry and, especially, to oil production by the American attacks. Yet in the face of mounting evidence that daytime bombing focused on military and industrial targets was having greater effect than his night-time area bombing—as well as increasing pressure from his superiors to follow suit—Arthur Harris remained obstinate in his rejection of “panaceas.”
Hence Gladwell’s characterization of him as a psychopath: Bomber Harris had a list of German cities, and he was determined to reduce every last one of them to rubble on the premise that it would destroy German morale and result in the collapse of the Nazi regime (“area bombing” was also often euphemized as “morale bombing”). There was a method to the madness, albeit a phantasmic one: the best targets for area bombing, so the argument went, were the densely populated city centers, where a carefully calibrated combination of high-explosives and incendiaries would have the greatest effect, creating firestorms that would overwhelm fire prevention efforts; those who weren’t killed were rendered homeless. The people living in these areas were predominantly working-class; the belief was that killing them and/or destroying their homes would create worker shortages and enervate the German war effort.
Except that this did not happen. In fact, just the opposite happened: the German people held firm in defiance of the bombing. Area bombing, rather than breaking their spirit, stiffened their spines … much as the Luftwaffe’s bombing of London in 1940-41 had done for the British.
All of the histories of the bombing war I’ve read, Gladwell’s included, make this point: Harris’ theory that the bombing of civilians would break a nation’s spirit had already been definitively disproved by the British citizenry. Indeed, this much was pointed out to Harris by Ira Eaker, the commander of the U.S. bomber force. One evening after the two men had eaten dinner together, Harris expounded at length on why the Americans should join the British in night-time area bombing. Eaker pointed out that the Blitz had not broken the British spirit—why would he think it would break the Germans’? According to Eaker’s account, Harris dismissed the comparison, asserting that the Brits were made of sterner stuff; the Germans, by contrast, would surely crack.
It’s important to note that at the outset of the Luftwaffe bombing campaign—itself launched on the premise that it would break the British and soften them up for Operation Sea Lion, Hitler’s planned invasion of Britain—the British high command’s greatest fear was precisely that their people’s spirit would break, and civil society would collapse. The “keep calm and carry on” ethos that has become so identified with the British response to the Blitz was post-facto: the citizenry’s implacability and the RAF fighter corps’ success in repelling the Germans were two rare bright spots in Britain’s darkest hour. The British have always been justified in feeling proud of that stoicism; Harris’ assumption—which he was not alone in making—that this was somehow exceptional to the British is exemplary of cultural chauvinism, national ego replicated in Harris’ personal arrogance and obstinance.
Harris rose through the ranks of the nascent RAF on some of Britain’s many colonial frontiers, cutting his teeth for his later career by putting down local uprisings in Iraq in the early 1920s through intimidation bombing—an early model for area bombing, as it was unconcerned with accuracy and designed to terrify the unruly locals into submission. After the start of WWII, he took advantage of confusion and incompetence in Britain’s then-tiny bomber fleet to advance himself until he took over Bomber Command. He was notorious for his arrogant, bullying behaviour, but developed—and cultivated—a popular public profile as someone who would take the fight to the Germans. This popularity, along with his ability to browbeat both his underlings and superiors, made him a favourite of Churchill’s—and thus effectively impossible to dislodge from his fiefdom, even when it became painfully obvious to everyone concerned that area bombing simply didn’t work.
His monomaniacal determination to destroy every German city coupled with his refusal to see the obvious evidence of his strategy’s failure does suggest a measure of mental derangement, to put it mildly. But it is also an example of ego on a catastrophic scale, enabled by bureaucratic inertia.
For reasons I’ll get into in my next post, I went back and forth on using the expression “the banality of ego” as my title. Suffice to say here, my quibble with Gladwell’s characterization of Harris as a psychopath isn’t that he’s necessarily wrong, or somehow doing Harris a disservice; rather, he’s doing our understanding of WWII a disservice, as reducing Harris’ aims and agency to the vagaries of a single, unhinged man elides the more complex and disturbing reality of the war’s massive scope and scale. Harris was a bad actor, something tacitly recognized by the fact that after the war, much to his annoyance, he was almost entirely ignored in the conferring of honours and the recognitions bestowed upon the U.K.’s national heroes. There wasn’t much appetite among the British elites for celebrating Harris, whose abrasive and bullying manner had made him so many enemies. Like his patron Churchill, he became something of an embarrassment to the ruling class once the indomitable Boche were safely domitable again. But there is also, I would argue, a hint of collective guilt at work in the official ignoring of Harris: his single-minded quest to pound every German city of note into rubble was as much a product of the structures of power facilitating him as it was Harris’ own putative psychopathy. Bomber Harris, far from being stymied by his superiors, was enabled by them.
The nature of this enabling is the subject of part two of this post, so stay tuned.
 I’d been slogging my way through Richard Overy’s excellent The Bombers and the Bombed for about two weeks before picking up Gladwell’s book. Overy’s clocks in at over four hundred dense pages, plus another hundred pages of endnotes. In contrast, I read The Bomber Mafia in slightly more than a day. And I enjoyed it! This post isn’t an exercise in trashing Gladwell by any means. But reading him on a topic you know well is a bit like seeing your profession depicted on TV—you can’t help but get worked up over the details that get missed.
 The ability to put a “bomb in a pickle barrel” was the common boast made in American propaganda vaunting the technological marvel of the Army Air Force’s bomber fleet. Why pickle barrels and not some other receptacle is a question that remains unanswered.
 He came close to getting the Americans to capitulate—Churchill was on his side, and for a time it looked as though Churchill had convinced Roosevelt. General Ira Eaker, commander of the Eighth Air Force, kept the Americans’ daytime bombing alive by convincing Churchill that, with the Americans bombing by day and the Brits bombing by night, German cities would be pounded relentlessly—“round the clock,” as Eaker said. Churchill liked the sound of that.
 Or in the RAF’s parlance, “lack of moral fibre.” This was the verdict leveled against many airmen who cracked up because of the strain of repeated missions. Sometimes, sympathetic psychiatrists found more acceptable diagnoses that spared traumatized flyers’ reputations, but many more suffered the ignominy of a simple “LMF” in their files, a badge of dishonour they wore for years.
 Eaker and Harris, counter-intuitively perhaps, became great friends.
Pride Month is almost over. I might be a basic cishetero dude, but I love Pride—I love the celebration, the camp, the colour, the music (dancing in the London, Ontario Pride Parade to “I Am What I Am” blasting from the speakers remains one of my fondest memories). I love the love. Some of this has to do with me being a liberal lefty bleeding heart social justice type, but then one of the reasons I’m a liberal lefty bleeding heart social justice type is because many of the people in my life I have loved, who have been dearest to me, who have shaped my worldview, have been queer. I want to take this space to say to all of my queer friends: I love you. My world is a better place for you being in it.
This past Monday morning I was at the Starbucks closest to campus, writing my daily entry in my journal. These past several months, I’ve been making a point of writing something every day—even if it’s just a paragraph noting the weather.[i] On this particular morning, I had a lot on my mind, as I had been reading a bunch of articles about the Texas GOP convention that took place in Houston last weekend. It was all through my newsfeeds, especially in regard to the resolution that Texas will hold a referendum no later than 2023 on whether or not to secede from the union; and the addition to their platform asserting that homosexuality is “an abnormal lifestyle choice.”[ii]
I’d read a long thread on Twitter discussing the secession referendum and whether (a) it was legal; (b) it was even feasible; and (c) if refugees from the newly founded Republic of Texas would be supported by a Democratic administration in their relocation. Twitter being Twitter, there was also a lot of sentiment advocating letting Texas go, anticipatory schadenfreude predicting the crash of their economy, and just a lot of general dunking on the stupidity of the whole thing. I had my own thoughts about how Texas Republicans would be making a big bet on being correct in their climate change denial in a state that already has areas very nearly unlivable at the height of summer, as well as a long gulf coast uniquely vulnerable to extreme weather events (to say nothing of a decrepit electrical grid).
Anyway, because I generally avoid tweeting except for the rare occasions when I think of a witty Wildean aphorism, I was sorting my thoughts out in my journal when, looking idly around the Starbucks, I had an odd moment of defamiliarization.
The Starbucks in question is not my favourite coffee shop to work in;[iii] evil corporation aside, I’m generally amenable to Starbucks so long as they at least have the feel of a comfortable café—soft lighting, warmly coloured décor, that sort of thing. This one is new, having opened just a few months ago, and is unfortunately very bright and austere, bordering on institutional. But it is also the coffee shop most proximate to campus,[iv] so I find myself there sometimes.
I had paused in my writing for a moment and was staring into the middle distance, lost in thought. In that moment, Gloria Gaynor came on the sound system—“I Will Survive,” a song that is impossible not to bob along with. And a staple of the Gay Pride soundtrack. Looking around, I suddenly became aware of all the Pride Month paraphernalia decorating the place: there was a big Pride Progress flag behind the counter, another over the chalkboard at the front, crossed with a Trans flag; a row of Starbucks cups was arrayed on the counter in front of the espresso machines, each with a piece of tissue paper corresponding to one of the colours from the Pride flag. There was also the staff themselves, many of whom were, if not actually queer, then certainly rocking the aesthetic.[v]
To be clear: I hadn’t not noticed any of this before. I’ve been there at least once or twice since June began when the profusion of Pride décor went up, but I seem to think they’ve always had a Pride flag at the front of the store, and I had previously taken note that the staff would not be out of place behind the bar or on the dance floor of a gay club that would be way too fashionable for my basic self. But that was the nub of my moment of defamiliarization: I had noticed all these things without noticing them. As I sat there listening to Gloria dunking on her ex, I was taken out of the moment and thought about how when I was the baristas’ age this kind of décor would be limited—even during Pride Month—to specific spots on campus and the “gay ghetto” (as it was then called) of the Church & Wellesley neighbourhood in Toronto. That it was unremarkable that it should be on display in a corporate franchise coffee shop[vi] in St. John’s was, it suddenly occurred to me, remarkable … or remarkable to someone who grew up in the 1980s while attending a Catholic high school, some of whose teachers were quite vocal in their opinion that HIV/AIDS was divine punishment for homosexuality.[vii]
It was an odd moment: what should have been a warm glow of vicarious pride in suddenly seeing such cultural progress that had snuck up on me sat in stark contrast to the recent full-bore attacks on queer people epitomized in the Texas GOP’s delineation of homosexuality as “an abnormal lifestyle choice.” This specific language is telling, as it hearkens back to an earlier era of anti-gay rhetoric, when it was a point of homophobic common wisdom that being queer was purely a matter of choice. The Texas GOP’s phrasing effectively elides at least thirty years of hard-won progress whose signal event, the moment when Obergefell v. Hodges established federal recognition of same-sex marriage, felt less like revolutionary upheaval than simple confirmation of prevailing attitudes.[viii]
One of the frequent rhetorical recourses made by people opposed to social justice progress—or those who profess to be all about social justice but worry that it is progressing too quickly—is to point to all the obvious advances that have been made in terms of feminism, gay rights, cultural diversity, civil rights, and so forth, as evidence that the current crop of “social justice warriors”[ix] and activists are overreacting, making mountains out of molehills, or otherwise being disingenuous in pretending society hasn’t improved over the past decades and centuries. “So what you’re saying is there’s no difference between today and the Jim Crow era?” is how a typical argument attacking Black Lives Matter might go.
One hears such a tone, for example, in critiques of The Handmaid’s Tale when its misogynist dystopia is dismissed as alarmist fantasy; but then a SCOTUS leak reveals the un-edited version of Samuel Alito’s Gileadean thinking about abortion … and then, somewhere halfway through the writing of this post, the Court strikes down Roe v. Wade with nary a word of the opinion edited. For the better part of my adult life, but especially since Obergefell, when the rainbow flags come out (so to speak) in June there have been predictable harrumphs wondering why Pride is even necessary anymore. Haven’t those people got all the rights now? Why do they need this display?
But then the reactionary Right pivots with the agility of a very agile thing from anti-CRT campaigns in schools to Florida’s “Don’t Say Gay” bill and the broader attempt to associate any mention of LGBTQ people, history, lifestyles, or art and literature with the “grooming” of children. Actually, I shouldn’t say this was a pivot “from” anti-CRT campaigns, given that it’s not as if these people have given up on excising mention of slavery and systemic racism from curricula; it’s more a matter of a broadening of the battlefront, with cynical operators like Christopher Rufo setting the plays. What started as the targeting of trans people has, not unpredictably, exploited the momentum of the Right’s broader culture war to open fronts against what had been assumed to be settled issues—gay rights in particular.[xi] It is telling that none other than Donald Trump Jr. rebuked the Texas GOP because, presumably in accordance with their plank about homosexuality being “an abnormal lifestyle choice,” it did not allow the Log Cabin Republicans to set up a booth at the convention. “The Texas GOP should focus its energy on fighting back against the radical democrats and weak RINOs currently trying to legislate our 2nd Amendment rights away,” he said, “instead of canceling a group of gay conservatives who are standing in the breach with us.” That this weak tea was almost certainly the noblest sentiment Don Jr. has ever voiced says as much about the Frankenstein’s monster the MAGA movement has become as it does about Trump’s most toxic spawn.
All of which is by way of saying that, however much progress we as a society have made, there is never cause for complacency. Progress is never inevitable; though there is a tendency to think that it is,[xii] I’d be interested to see how that assumption breaks down on cultural and national lines, and if those who have benefited most from prosperity and the attention to cultural issues it often affords are the most complacent about the moral arc of history bending in their favour. I started writing this post earlier in the week; I picked it up again on Friday and worked on it at one of my favourite downtown spots.[xiii] Whereas Monday morning I’d looked up, lost in thought, and noticed the Pride paraphernalia, on Friday I looked up at one of the televisions over the bar to see the news that Roe v. Wade had been struck down. It was an odd set of bookends to the work week.
As dire as things seem right now—and they are dire, from the worsening climate to the revanchist culture war to the not-zero possibility that Pierre Poilievre could be our next Prime Minister—I do still have hope. I have hope because my local Starbucks, in spite of its unfortunate austere design choices, was decked out more queerly than most gay bars I went to in my 20s. I have hope because my students have hope: this generation is earnest and determined, and they have no patience for our prevarications. I laugh, often out loud, when I read shrill screeds about how youth are being indoctrinated with woke ideology by postmodern neo-Marxist (to coin a term!) professors like me. Dude, I want to say, they’re not the ones being indoctrinated. I am.
[i] This on the premise that it takes my attention away from the computer and my phone and their endless doomscrolling, and also that it will help center me (especially if I get to it early in the day) and make me organize my thoughts. All this has been moderately successful, if for no other reason than when I start missing days that functions as a bit of a canary in the coal mine for my mental state.
[ii] In addition to secession and their homophobic throwback to the heyday of Jerry Falwell, the Texas Republicans also declared that the 2020 election was corrupt and Joe Biden is therefore an illegitimate president; called for the repeal of the 16th Amendment, which established a federal income tax; declared that students should be required to “learn about the humanity of the preborn child”; and demanded the abolition of any and all abortion, the repeal of the 1965 Voting Rights Act, and the reinstatement of school prayer. John Cornyn, the senior Senator for Texas, was booed during his speech because he had indicated he was open to voting for the minimal gun control bill currently before Congress; and perhaps most remarkably, Congressman Dan Crenshaw and his entourage were attacked in the corridors for … being Trumpy but not Trumpy enough, I guess? Crenshaw, who was a Navy SEAL and famously wears an eyepatch because of a wound he received in Iraq, not long ago released a genuinely bonkers political ad featuring himself in full military regalia parachuting out of a plane to attack “antifa and leftists.” He has also tied himself in logical knots trying to be a “reasonable” conservative while not ever saying a bad word about Trump. Apparently this wasn’t enough for his Texas detractors, who harangued him as an “eyepatch McCain,” an epithet they took from Tucker Carlson.
[iii] Coffee shops and pubs will always rival both my home and campus offices as a preferable work space. One might think that now that, in contrast to undergrad and grad school, I do in fact have actual offices—and very nice ones, too!—such public spaces as a local café or pub wouldn’t hold quite the same allure. But they do, for me at least; I like the white noise of these places, and the fact that people-watching can be very zen when you’re paused in thought. I have long been that person you see ensconced in a comfortable seat or booth with a latte or a pint, scribbling in a Moleskine or tapping away at my laptop (though usually the former—working longhand without internet distraction is another point in the coffee shop’s favour).
[iv] For all that Memorial University has to recommend it, it weirdly lacks for cafes and pubs on campus. This, I feel, is contrary to the academic educational process. My undergraduate degree was massively enhanced by the endless discussions and arguments I had with friends at the Absinthe Pub of Winters College at York University; and I’m reasonably certain I wrote at least a chapter’s worth of my dissertation at UWO’s Grad Club.
[v] When I shared this story with my students in my grad seminar later that morning, one of them helpfully said, “Oh, don’t you know? Everybody who works at Starbucks is queer.” I’ll assume this is at least a slight exaggeration and there are a handful of closeted heterosexuals making lattes, but I’ll take his authoritative word on the subject.
[vi] Not that corporate coffee franchises should be antithetical to queer culture: one of the rabbit holes my mind went down in the immediate aftermath of this reverie was the memory of the infamous steps in front of the Second Cup on the corner of Church & Wellesley in Toronto. On pleasant days, the steps would be full of queer folk (with a few straights like myself occasionally thrown in), drinking coffee, chatting, watching the street, as much spectacle as spectators. To anyone who’d ever been on Church Street, those steps were instantly recognizable in the series of Kids in the Hall sketches called, well, “Steps.”
Curious if the Second Cup was still there, I did a quick Google search and discovered that the steps had been removed (how, I don’t know) by the owners to “discourage loitering,” and the café itself had departed in 2005—but that a new Second Cup took up residence just down the block six years later.
[vii] No, seriously.
[viii] Which is not, I hasten to add, to downplay its importance or the enormity of the victory it represented; but the fact that it was met more with a shrug from the balance of straight people than with outrage is pretty extraordinary in itself.
[ix] This is by no means an original observation, but you’ve gotta hand it to the forces of reaction that “social justice warrior,” or SJW for short, was turned so quickly into a pejorative, and that anything and everything related to or involving social justice became so resolutely understood as the precinct of the shrill and the sanctimonious, as well as synonymous with the suppression of free expression. Ditto for critical race theory (or CRT—is there power in reducing something to a three-letter acronym?), and the more recent retread of Anita Bryant’s association of homosexuality with pedophilia, this time with the handy epithet “groomer” attached to anyone committing the sin of admitting to queer identity in the presence of children.
[xi] It would be entertaining, if the circumstances weren’t so distressing, to watch such gay conservatives as Andrew Sullivan—who have taken “gender critical” positions on trans rights—suddenly gobsmacked to find their own subject-positions under attack. Sullivan is a particularly notable example: he has become increasingly strident on what he characterizes as the tyranny of pro-trans discourse, and recently had Christopher Rufo on his podcast for an episode in which he agreed with many of Rufo’s attacks on critical race theory. More recently, however, he has thrown up his hands and said (figuratively) “Whoa, whoa!” in response to Rufo’s latest rounds of anti-gay rhetoric.
I’m thinking of writing a one-act play à la Samuel Beckett’s Krapp’s Last Tape that would feature progressive billionaire George Soros alone on stage in a small pool of light with only a stool and an iPhone for company. The play would mostly be him scrolling and muttering to himself in increasingly unhinged non-sequiturs, the gist of which we eventually glean is his existential angst at being the only progressive billionaire, which means he must shoulder all of the instinctive hatred for billionaires directed at him from right-wing media and politicians. “Mercer, Murdoch, Musk,” he mutters in what becomes a refrain, “Koch. Bezos. They share. They share. (he scrolls for a long moment, then looks out at the darkness in the direction of the audience) Soros. Alone.”
The play ends midway through the pandemic. Soros grows more and more excited as he reads conspiracy theorists’ attacks on Bill Gates accusing him of putting mind-control chips in the covid vaccine. Soros looks up from his phone with a look of fragile hope on his face as he whispers, “Is there another?”
All of which is by way of saying that I’m endlessly fascinated by the way in which George Soros has become the singular bogeyman of the alt-right and Steve Bannon’s cohort, of QAnon conspiracy theorists, and Hungary’s Viktor Orban, current darling of the Tucker Carlson wing of the GOP (lots of overlaps in that Venn diagram, to be sure). When it comes to the question of billionaires, I’m generally in agreement with Elizabeth Warren: that is, the existence of billionaires qua billionaires isn’t a problem; a proliferation of multi-billionaires existing concurrently with systemic poverty and hunger, widespread lack of access to health care, and the ongoing climate crisis is a moral obscenity. And while apologists might point to the Gates Foundation’s work to address some of these problems or the fact that Elon Musk has done more to move us toward electric vehicles than any group of people, well, kudos to them … but they remain a vanishingly small minority of the class.
More common is the Atlas Shrugged brand of libertarianism embraced by the likes of Charles Koch and his late brother David, which frames the making of obscene amounts of money as a form of virtue—and which tirelessly spends a huge amount of that money to ensure that it stays with the ultra-rich and furthers their ability to accumulate even more wealth. New Yorker staff writer Jane Mayer wrote an excellent, exhaustive book titled Dark Money in 2016, which did a deep and detailed dive into the vast sums spent by right-wing billionaires in shoring up conservative politicians at all levels of government—from local school boards to Congress and the White House—as well as funding climate disinformation campaigns, conservative think tanks like the Claremont Institute, anti-tax organizations like Americans for Prosperity, and a huge constellation of other right-wing causes.
If there was something even approaching numerical parity between progressive and conservative billionaires, each advancing their political interests like Olympian gods choosing sides in the Trojan War, that would be one thing (it wouldn’t resolve or really even ameliorate the structural problems of billionaires in an inequitable society, but it would definitely be a thing). But the fact of the matter is that the conservative vilification of Soros’ progressive agenda is profoundly disingenuous, for the simple reason that he’s all they’ve got to attack, while on their side they’ve got Rupert Murdoch, Peter Thiel, Charles Koch, Robert and Rebekah Mercer—who are collectively worth over $100B compared to Soros’ $8.6B—as well as a legion of others who actively spend their money on conservative political causes.
Of course, there’s also Warren Buffett, one of the world’s richest men who tends to voice liberal political opinions and is nominally in favour of higher taxes for the rich, but he’s largely left alone—for reasons I won’t speculate on for at least a few paragraphs—by the right-wing mediasphere.
There’s also the fact that, for all his mouthing of liberal platitudes, Buffett doesn’t do much to put his money where his mouth is, and has frequently been accused of hypocrisy by right and left alike. Indeed, even the most “liberal” of billionaires tend more to gesture at social progressivism while accumulating wealth through the most ruthless means available, espousing a libertarianism that puts free speech and legal weed in the same philosophical framework as industry deregulation and low taxes for the über-rich. Even Bill Gates, arguably the most socially conscious of the billionaire class, dismisses the policies of Elizabeth Warren and Bernie Sanders out of hand on the principle that individuals are better judges of how to spend their money than the government.
And if all billionaires—or really, just some of them—established their own versions of the Gates Foundation, he might have something approaching a point. But of course he and George Soros are the outliers, with most billionaires who engage in politics doing so with an eye to entrenching their wealth and facilitating the means to make more.
(If I’m being conspicuous in not mentioning Elon Musk, it’s because Musk is pretty much sui generis, falling into a category of his own devising that is somewhere between chaos muppet and Bond villain. If it weren’t for the fact that the man can tank the stock market with a tweet, it would be amusing to watch his new alt-right fanboys reconcile their love of Musk’s shitposting with the fact that he’s the godfather of the EV revolution).
So pity poor George Soros, the loneliest billionaire. As the sole progressive plutocrat who actually funds progressive causes, he gets the brunt of the paranoid right’s vitriol. Though if you’re puzzled by the frequency and intensity of the attacks on a man with one sixteenth of Jeff Bezos’ wealth—which is still more money than any one person should be able to possess—you might want to take note of how often the name “Soros” is spoken in the same breath as “globalist.” Or to put it more plainly: it’s the anti-Semitism, stupid.
Two blog posts ago I went on at length about gremlins—both in general, and specific to The Twilight Zone episode “Nightmare at 20,000 Feet.” That episode comprised my most terrifying fictional experience, something that stuck with me for years. My students’ first assignment in my weird fiction class this summer is to write a piece of creative non-fiction describing their most terrifying fictional experience. As I said in that post, I was planning to write my own, by way of example and in the name of fairness—given that we’ll be sharing everyone’s pieces. And I said I would post it here.
So here we are. As might have been obvious from my last gremlins post, this has become something of a very interesting and serendipitous rabbit hole for me, at once touching on a handful of my current research interests as well as jogging a lot of memories that I want to explore. Possibly this turns into a larger project, possibly it becomes an avenue of self-exploration, possibly both.
So what I’m saying is, don’t be surprised to see more posts here vectoring off from this line of inquiry.
One caveat: I made a point of not consulting with my parents as I wrote this. What I’ve related in this essay is purely based on my memory of the summer of 1984, and as such might be wildly off base. I’ll be interested to see what my Mom and Dad have to say and whether their own memories are at all consonant with mine. If not, I will write a follow-up.
Meanwhile, without further ado …
A man sees something on the wing of the airplane, a vague shape in the rain and lightning. It’s impossible that anything alive could be out there, at this speed and altitude. But he sees it again. He’s a nervous flyer; perhaps his mind is playing tricks. But then he sees it again: person-shaped but inhuman, its intent obviously malevolent as it tears into the wing.
He is the only one who sees it. He cries out to the flight attendants, to his fellow passengers in panic, but they think he’s crazy. He wonders himself if he’s losing his mind. He lets himself be calmed down, he closes the window shade, he tries to sleep. But soon enough, he can’t help himself. He opens the shade, and there is the thing, a creature that looks like a demonic goblin, inches from his face on the other side of the window, staring back at him with something like sadistic glee.
It’s a gremlin, of course, a folkloric imp that emerged in the age of flight, invented by RAF aviators in the years before WWII. Heir to a long lineage of mischievous pixies and fey folk, the gremlin is nevertheless a modern creation, blamed for the frequent and seemingly random malfunctions that bedeviled airplanes during the frantic steeplechase of flight technology between the wars. Gremlins weren’t just the comic antagonists of tall tales told by pilots and crew on airbases between missions—enough airmen were genuinely convinced that gremlins were real, swearing up and down that they’d seen the little bastards on their wings, that concerned psychological papers were written.
Roald Dahl’s first novel was about gremlins. Bugs Bunny tangled with one in the Looney Tunes short “Falling Hare.” Like their folkloric predecessors, gremlins were given to mischief and occasional cruelty, but were mostly depicted as annoyances and not threats.
For a time in my childhood, gremlins were a source of abject terror for me.
When I think of gremlins, I think of the summer of 1984. The movie Gremlins was released that June, but I never saw it. I still haven’t. By the time it hit theatres, I’d already been terrified beyond what was strictly reasonable by the gremlin in The Twilight Zone: The Movie, which my father rented for us to watch some time after its release in 1983 and before Gremlins came out. The fourth of the anthology film’s four segments was “Nightmare at 20,000 Feet,” in which a nervous flyer sees a gremlin on the wing of his plane. The moment when he opens the shade to see the demonic creature staring back at him haunted me for years. When I lay in bed at night and the scene came to mind, I hid my head under the covers—trapping myself, for suddenly I couldn’t shake the idea that the gremlin would be perched there, staring at me, if I lowered them.
Lest you assume these were the infantile fears of a young boy, let me clarify: these were the infantile fears of a twelve-year-old.
A vicious heat wave hit our Toronto suburb in 1984. It coincided with the Olympics, which ran from the last week of July into August. It was the kind of heat that pervades my childhood memories of summer: a baking sun in a clear sky, air that was somehow stifling and humid while also drying the grass to brittle blades that abraded bare feet. Even basements were no refuge.
Our house had no air conditioning, so my father brought the television set outside onto the side deck where there was at least some shade and, occasionally, a breeze. This arrangement appealed to my mother: puritanical about not spending summer days indoors watching the tube, she also hated missing even a moment of Olympic coverage. Because the 1984 Games were held in Los Angeles, the time difference meant our Olympics viewing stretched into the darkening evening. We ate dinner on the deck and watched athletes run, swim, hurl, and paddle. Neighbours came over, bringing beer and wine and snacks. An ongoing PG-13 bacchanal took up residence on our deck and spilled out onto the yellowing grass of our corner lot as the neighbourhood kids staged our own Games.
The 1984 Games were notable for the absence of the Soviet Union and most of the Eastern Bloc. They boycotted Los Angeles in retaliation for the United States’ boycott of the 1980 Moscow Games, which had been in protest over the Soviet invasion of Afghanistan.
It was petulance, said one neighbour. Hypocritical, said another. Someone made an off-colour joke I didn’t understand about women’s weightlifting being fair this time. I didn’t grasp the nuances of the politics, but I knew that the Soviet absence tinged everything with vague unease. The 1984 summer Olympics marked the precise midpoint of Ronald Reagan’s presidency and the renewed belligerence of the Cold War. Fears of nuclear conflict that had smouldered like banked coals during the détente years of the 1970s leapt again into open flame. Pious sages of geopolitics kept inching the Doomsday Clock closer to midnight. I was in some ways a literal-minded child and did not quite understand that the clock was metaphorical. Every time its minute hand crept forward, I could not sleep for days afterward.
The Day After, a terrifying depiction of a nuclear exchange, aired in late 1983. It showed the effects of multiple warheads striking in the American heartland, and the immediate aftermath as people suffering from severe radiation poisoning struggled and fought over food and water. The images of the mushroom clouds and their devastation were the most graphic ever portrayed at the time. A disclaimer at the end told the audience that, however ghastly the film’s depiction had been, it was mild in comparison to what the reality would be. With over one hundred million viewers, it was the most-watched television film in history.
I did not see it. I didn’t even know it existed until I heard about it at school from classmates who had watched it. It had been recommended that parents watch it with their children; guides were made available to help with the discussions afterward. But it was not mentioned in my house and I was somehow smart enough not to ask why.
Whatever sleep I lost worrying about the Bomb, my mother’s nuclear anxieties contained multitudes.
Because serendipity is like gravity, that summer one of the television channels aired old episodes of The Twilight Zone. Every night when Olympics coverage ended, when most of the neighbours had gone home, while the lawn and the hedges and the asphalt of the street sighed the stored heat of the day into the darkness, we switched over to the slow cascading fall of the theme music and the studied portent in Rod Serling’s voice.
With one exception, I don’t remember which episodes we watched. I do remember my parents waxing on about episodes we didn’t see. “The Monsters Are Due on Maple Street” was a favourite of theirs. To this day I haven’t seen it, but the plot as they related it stuck in my mind: a quiet suburban neighbourhood like ours suffers an inexplicable blackout on a summer night; the neighbours congregate in the street, anxious but not concerned until the lights go on at one house. Suspicion starts to set in: what’s special about that person’s house? Other houses get power, then lose it, until the previously friendly neighbours descend into paranoid warring factions. The episode ends with aliens in a ship overhead, who have been manipulating the power grid, saying, Look, we don’t have to attack them, these humans will turn on each other.
But because serendipity is the gravity of my life, we did see “Nightmare at 20,000 Feet.” I can’t help but think that if I’d seen the original episode first, the spectre of a gremlin on a wing wouldn’t have stuck so deep in my brain’s fear centers. It featured a pre-Star Trek William Shatner as the nervous flyer, demonstrating that scenery-chewing was always his first and best talent. The gremlin itself looked like a man in a monkey suit, less a demon than an ugly teddy bear. Creepy but hardly terrifying.
But in the context of that uncanny summer fortnight, with the memory of the movie gremlin colouring its black and white predecessor with shades of fear out of space, what was otherwise risible had the effect of driving my original horror deeper into my mind. It became existential. Each night when we changed the channel over and the theme music played I felt ill, and yet could not look away. Then one night, the show began with Serling’s narration: “Portrait of a frightened man: Mr. Robert Wilson, thirty-seven, husband, father and salesman,” showing William Shatner (whom I did not then recognize as Captain Kirk) slouched in his airplane seat. We learned he had recently spent time in a sanitarium—that he had been there because he had a nervous breakdown on a flight “on an evening not dissimilar to this one.” On this night, however, Serling tells us, “his appointed destination … happens to be in the darkest corner of the Twilight Zone.”
I could not look away, even as part of me knew just how much this campy earlier iteration was about to make the lingering effects of the later one indelible. I have little memory of the actual episode. What I have is sense memory: the night’s heavy, suffocating humidity, the creak of crickets in the hedge, the smell of the grass, the bilious weight in my belly, and the dread of knowing I would soon have to try and sleep in my dark and stuffy room.
In German, “uncanny” is unheimlich, literally unhomely, that which makes you feel not at home. Those two weeks that summer were dislocating: I was not at home in my home, and my home itself felt adrift, untethered. Or perhaps what I felt was the dread certainty that it was always untethered on the world’s currents, and that the feeling of safety remote from the larger world was the illusion—that there was always a gremlin on the wing, marking time in missile silos and in the minds of world leaders. RAF airmen invented gremlins in part to resolve a contradiction: flight technology was advancing by leaps and bounds but left them uniquely vulnerable while aloft. What more unthinkable technology has existed than nuclear weapons? Perhaps for me the gremlin was not a narrative comfort as it was for the aviators, but the certainty of the technology’s malevolence.
The Twilight Zone was in many ways the quintessential Cold War TV show, as it embodied the nagging, unhomely sense of something being not quite right, which was the constant undercurrent of the bland suburban order that America was so desperate to convey to itself. It is no surprise that so many of the show’s episodes are set in such innocuous suburbs as Maple Street.
My father, who grew up in just such a suburb, loved The Twilight Zone when he was my age; he told me that he watched the episodes eagerly when they first aired. He was twelve when the show premiered in autumn 1959. He was a different twelve-year-old than me, apparently—I tried to imagine actually enjoying something that unsettling, actually looking forward to seeing what each new episode would bring, but that sensibility was still alien to me. In a few short years I would learn to love horror when I discovered Stephen King and tore through his novels at breakneck speed. But at twelve I had not yet grown out of the nausea the uncanny inspired in me. That two-week stretch of an otherwise idyllic summer was a perfect storm of subtle dislocations: the heat wave, the outdoor television viewing, the constant low-grade party atmosphere, the hours and hours of Olympic coverage, all with the Soviet absence drawing attention to the Cold War’s constant menacing background hum.
It occurs to me that the current state of the U.S. Supreme Court is like climate change … which is to say, it has been ongoing for several decades and visible to anyone willing to see it developing, but it has not prompted anything but the most tepid of responses. And now that we’re experiencing the judicial equivalent of massive flooding, it’s already too late.
(I can’t decide whether this analogy is ironic or appropriate, considering this court is likely to do everything in its power to curtail efforts to reverse climate change).
I remember reading Angels in America for the first time over twenty-five years ago, and coming on the scene in which the notorious lawyer and fixer Roy Cohn—now most famous for having been Donald Trump’s mentor in the 1970s—takes the closeted law clerk Joe Pitt out to dinner and introduces him to a Reagan Justice Department apparatchik who waxes poetic about how they’re seeding the federal bench with conservatives judges. “The Supreme Court will be block-solid Republican appointees,” he enthuses, “and the federal bench—Republican judges like land mines, everywhere, everywhere they turn … We’ll get our way on just about everything: abortion, defense, Central America, family values.”
I remember reading that and thinking, wow, diabolical. And then every time I read a news item about the Federalist Society or the GOP’s SCOTUS-oriented machinations, I thought of that scene. When Mitch McConnell held the late Antonin Scalia’s seat hostage from Merrick Garland, I thought of that scene, and thought of it again through Neil Gorsuch’s hearings and the debacle of Brett Kavanaugh, and of course once again when McConnell rushed Amy Coney Barrett’s nomination through in what ended up being the last days of the Trump Administration. By then, the full crisis of the American judiciary (my first inkling of which was from a play that first ran off-Broadway in 1992) was plain to see. The U.S. has been experiencing extreme judicial weather events for over a decade now; the leak of the Samuel Alito-authored decision overturning Roe v. Wade is like knowing not just that there’s a category 5 hurricane just below the horizon, but that such storms and worse are the new normal for the foreseeable future.
Recently it has not been uncommon, especially at moments of more acute racial discord, for people to post images on social media juxtaposing recent electoral maps with maps circa 1860. The red states east of the Mississippi River match almost precisely with the Confederacy; and though Biden’s win in Georgia in 2020 is a welcome disruption of that consonance, otherwise the geography of red v. blue has been increasingly entrenched since Nixon first embarked on the Southern Strategy and accelerated a shift that, sadly, was probably inevitable the moment Lyndon Johnson signed the Civil Rights and Voting Rights Acts.
There has also been, especially since Trump’s election—and even more so since the January 6 insurrection—the prospect of a “new civil war” bandied about, from thinkpieces to more than a few books. Most such speculations are careful to point out that any such conflict would necessarily be dramatically different from the actual U.S. Civil War—that the seemingly solid blocks of red and blue that replicate the territory of the Confederacy and the Union are deceptive; that however polarized U.S. politics have become, geographically speaking conservative and liberal factions are far more integrated than the maps allow. The divide is more urban/rural than north/south, with substantial blue enclaves in deep red states, like Austin in Texas, or big red swaths in rural California.
The pandemic shook the etch-a-sketch up somewhat, too, as urban professionals, forced to distance socially and work remotely, found the cheaper rents and real estate outside of their cities more amenable (whether the end of the pandemic reverses that out-migration remains to be seen). And when businesses decamp from states like California to states like Texas, they bring with them work forces that tend to be younger and more socially and politically progressive, muddying things further. (Let’s not forget that Florida governor Ron DeSantis’ current feud with Disney over the “Don’t Say Gay” bill was precipitated not by the company’s management, but by its workers, whose hue and cry over what they saw as an unconscionably tepid response prompted the CEO to, one assumes reluctantly, condemn the bill).
What I’m wondering today is: does the imminent repeal of Roe v. Wade herald a 21st century Great Migration? Except this time, instead of Black Americans fleeing the Jim Crow south, will it be liberals and progressives fleeing Republican states for Democratic ones? Possibly that seems like I’m overstating the case, but I think it will depend on just how far this SCOTUS will take the logic of Alito’s rationale, which is essentially predicated on the assertion that there is no right to privacy enshrined in the U.S. Constitution. Numerous legal experts have weighed in on this speculation, running down a list of landmark Supreme Court cases that hinged at least in part on the premise of the right to privacy: legal contraception, the abolition of anti-sodomy laws, interracial marriage, the prohibition of forced sterilization, and same-sex marriage. Even a year or two ago I would have not worried overmuch about such cases being overturned, thinking it unlikely that any high court, however conservative its composition, would be so retrograde. But this court’s conservative majority has demonstrated a shocking unconcern for even the appearance of being measured and apolitical. They’ve pretty much made it obvious that anything and everything is on the table. That goes also for the current spate of legislating being done by Republican-dominated states: injunctions against teaching the history of slavery, the banning of books, the abolition of sex education, and of course the aforementioned “Don’t Say Gay” bill in Florida, which looks ready to be imitated in other red states. Should any challenges to these pieces of legislation make it to a SCOTUS hearing, how likely do we think it is that the current bench would quash them?
Which makes me wonder at what point being a liberal or progressive living in a blue city in a red state becomes untenable? What would that do to the U.S. polity? There would be a significant brain drain from red states; businesses would be obliged to follow when their pool of qualified workers dried up; urban centers in red states would wither; the current political polarization would in fact become geographical, as the states lost their last vestiges of philosophical diversity and became more and more autonomous, no longer subject to any federal law or statute they felt like challenging before a sympathetic Supreme Court.
That might indeed be a recipe for a “traditional” civil war.