Three reflections on the occasion of Remembrance Day

Don’t mention the war!

Last week in my fourth-year seminar, we covered Tim O’Brien’s book The Things They Carried—a not-quite-a-novel, not-quite-a-memoir about his experiences as a combat soldier in the Vietnam War. It’s an extraordinary piece of work that I recommend to everyone. I prefaced our discussion, as I always do whenever I teach a text about war, with an anecdote about James Joyce living in Zurich during the First World War: when asked why his current project (which was to be an obscure little novel titled Ulysses) wasn’t an explicitly anti-war novel—Joyce was, after all, living in the middle of a community of vociferously anti-war artists who had come to Zurich specifically to avoid military service—Joyce responded that the best way to write an anti-war novel was simply “don’t write a novel about war.”

I have always been struck by the simultaneous wisdom and inadequacy of this zen-like assertion. On one hand, Joyce identifies the principal problem of depicting warfare, which is that in aestheticizing it you risk celebrating it. To illustrate this point, I showed my class the notorious helicopter attack scene in Apocalypse Now, in which Robert Duvall’s sociopathic Colonel Kilgore swoops in with his air cavalry on a Vietnamese village to the strains of “The Ride of the Valkyries,” a mission he’d rejected until one of his soldiers told him the beach there was perfect for surfing. The scene is part and parcel of the film as a whole, a commentary on the absurdities and psychopathies of warfare; in and of itself, however, it is one of the most thrilling depictions of combat ever put on celluloid, a point made strikingly in the film adaptation of Anthony Swofford’s memoir of the first Gulf War, Jarhead. When Swofford’s Marine unit is given its orders to ship out to the Middle East, they get the men’s bloodlust up by showing them Apocalypse Now.

Possibly the most famous line from the film—and one of the most famous lines in cinema—is Kilgore saying “I love the smell of napalm in the morning,” just after a flight of jet fighters rains fire down on a treeline.


The moment is iconic: the shirtless Kilgore, wearing a traditional cavalry hat, crouches down beside Captain Willard (Martin Sheen), who, with a handful of other soldiers, lies prone as bullets snap and whiz about them; Kilgore, heedless of or perhaps oblivious to the danger, regretfully tells Willard, “This war’s gonna end one day.” Then he stands and walks off.


As I said, the moment is iconic, but almost certainly for all the wrong reasons. We’re not meant to identify with Kilgore, but with the soldiers cowering on the ground. Between Francis Ford Coppola’s direction and Duvall’s performance, however, Kilgore has become a compelling symbol of badassery, and his iconic line a celebration of warfare.

 

I’m here today because my grandfather was good at math

My maternal grandfather, Norman Brown, was one of the kindest, gentlest, and most intelligent people I’ve ever known. He never went to university because he had to work as soon as he was done school to help support his family, which was something of a minor tragedy—if there was ever anyone who would have loved academe and flourished within its ivied walls, it was my grandfather. Even without postsecondary education, however, he showed his acumen, winning an award in mathematics. When World War Two began, he volunteered for the Royal Canadian Air Force, and became a navigator on Lancaster bombers.

A Lancaster-class bomber

I remember him showing me the various ways in which he would figure out the location of his bomber. He’d pull out pads of graph paper and draw lines with a pencil and ruler, writing numbers in the margin, and show me how to calculate one’s position. All that was lost on me, and would almost certainly be lost on me today, but I loved him so much, and loved listening to his stories, that my grasp of his words was beside the point.

I was a devoted reader of WWII history from an early age, and would have loved even more to hear stories about his bombing missions over France and Germany. But he had none to tell. He never shipped overseas. He was so good and so precise in his navigation that he was kept in Canada to teach others. (My grandfather, modest to a fault, never said as much; it was my grandmother who told me, recounting how, upon receiving his letter with the news, she wept with relief.)

The fact that my grandfather had no actual war stories to share was, to my young self, the sole element lacking in those discussions. It would only be much later, after he had passed, that I fully appreciated the substance of my grandmother’s relief. Bomber crews sustained one of the highest rates of attrition of any combat units of the war. Some forty-five percent of bomber crews were killed in the skies over Europe. That’s a number that gives me what I can only characterize as existential vertigo. Had my grandfather been just decent at math as opposed to extraordinary, there’s close to a fifty-fifty chance I would not be sitting here writing this post.

And as I write this, another eventuality presents itself to me: had he gone overseas and survived, would he have been the same man I loved and idolized? After suffering the trauma and terrors of night-time bombing, seeing crewmates chewed by flak, looking death in the face, seeing the bombers on his wing explode in flame, would that gentleness of spirit have survived?

“Thank you for your service” is the emptiest of sentiments if we’re not going to acknowledge that sometimes the greatest sacrifice is surviving.

 

Seriously, can Don Cherry just retire already?

Almost eighteen years ago, I went to a bar in London, Ontario with a good friend to watch Team Canada play Finland in the Salt Lake City Olympic hockey quarterfinals. Canada won, and went on to take the gold. That evening sticks in my memory because there were two things that struck my friend and me as the quintessence of Canada.

First was a young Sikh man in a turban and a Toronto Maple Leafs hockey jersey; where the player name would be, it said THE HIP. He was with a group of racially and ethnically diverse friends, all cheering exuberantly for Team Canada.

Second was a moment between periods featuring Don Cherry declaiming in his usual strident tones about the game thus far. He was rinkside; a group of Canadians who had made their way to Salt Lake City started cheering and chanting his name, loudly enough that they were drowning him out. Annoyed, Cherry turned and pointed at them and said “Shut up!” And they did. Everyone in the bar burst out laughing, because of course Canadians, however excited to be watching their team hand Finland its ass, would be polite enough to be chagrined to have their rambunctious behaviour called out.

It has been quite a while since I thought of that evening, but it came to mind today when I read and then watched Don Cherry’s asinine suggestion that what he perceived as a dearth of Remembrance Day poppies on the streets of Mississauga and Toronto was attributable to a surfeit of immigrants disrespecting Canadian traditions and history:

You people love – they come here, whatever it is, you love our way of life, you love our milk and honey, at least you could pay a couple of bucks for a poppy or something like that. These guys pay for your way of life that you enjoy in Canada, these guys paid the biggest price.

Don Cherry might be a Canadian institution, but he has been descending into the nativist know-nothing reactionary well for some time now—or, more likely, has always been there and finds the present moment amenable to voicing his antipathy to anyone he perceives as not being a “real” Canadian.

I don’t know that his anti-immigrant comments regarding wearing the poppy are the worst thing he’s said in recent years, but they do serve to exemplify the kind of pernicious nativist thinking that has consumed contemporary conservatism. On the day we put aside to solemnly observe the sacrifice of our nation’s soldiers, it’s worth considering just why Cherry’s rant was not merely asinine and bigoted, but also displayed precisely the kind of ignorance and disrespect of history of which he accuses immigrants.

To start with, I’ve written at some length on this blog about how the assertion that Canadian soldiers fought and died “for our freedoms” irks me. Click the links to read my thoughts in full, but here’s the TL;DR: Canada has never faced existential threat from a foreign foe, whether from the Central Powers or the Axis or North Korea or Saddam Hussein or the Taliban; we have fought for the freedoms of others, which should be celebrated and memorialized, but that is not the same thing; and to unthinkingly celebrate our soldiers’ sacrifice and our veterans’ service without understanding its context and complexity is to do it a disservice.

That is why I wear the poppy, while some on my section of the ideological spectrum consider it an endorsement of war and aggression. I understand that argument while respectfully disagreeing; as I outlined in the first section of this post, the depiction of war is a fraught affair—so too is its commemoration, not least because while November 11th invites us to remember all who have fought for and served Canada, the day specifically honours the First World War.

Given that I’m a third-generation Canadian of predominantly British heritage (with just enough Irish mixed in to sharpen my Catholic guilt), the legacy of WWI resonates with me quite strongly. But I can’t blame people whose families emigrated from formerly colonial nations if they’re ambivalent about commemorating a war that was fought by imperial powers over imperial holdings. There was a host of reasons for the tensions in Europe leading up to the outbreak of war, but a major factor was Germany’s imperial expansion, encroaching as it did on French and British interests in Africa and Southeast Asia. WWI was, as I’ve argued here before, an entirely unnecessary conflagration contested by imperial and generally nondemocratic powers in the name of keeping possession of their colonial interests.

Which is not, I hasten to add, a reason for not commemorating the horrific sacrifice made by soldiers fighting for what poet Wilfred Owen called “the old lie: / Dulce et decorum est / Pro patria mori” (translation: “It is sweet and fitting to die for one’s country”). And the particular idiocy of Don Cherry and his ilk omits how many of “you people’s” forebears fought for and alongside the imperial powers: by conservative estimates, “well over four million non-white men were mobilised into the European and American armies during the First World War, in combatant and non-combatant roles.” (Just going out on a limb here, but I’m guessing the “you people” Cherry’s targeting aren’t Swedish immigrants.) To say nothing of the significant number of Indigenous Canadians who fought in both world wars, to no great benefit to their communities.

Commemoration and memorializing are important, but even more important is having a nuanced appreciation of history. When we boil it down to the mere presence or absence of a flag pin or a poppy on one’s lapel, it is profoundly disrespectful to the very people we’re ostensibly remembering.


Revolutionary Thoughts

76 vs. 89

I had an odd thought yesterday morning, apropos of what I’m about to write about in this post, and the weird connection it makes struck me as funny enough to lead off with.

The musicals Hamilton and Rent don’t have very much in common besides being huge Broadway hits and featuring generally attractive, youthful casts. But they do both focus on ensembles of people who fancy themselves revolutionaries: in the first case, the ardent young men who become the United States’ founding fathers; in the second, a ragtag group of bohemian would-be artists who rebel against the suffocating strictures of mainstream culture. The title song of Rent signals their first act of resistance upon receiving an eviction notice. The song agonizes over how they’re “gonna pay last year’s rent,” but by the end resolves:

When they act tough—you call their bluff
We’re not gonna pay
We’re not gonna pay
We’re not gonna pay
Last year’s rent
This year’s rent
Next year’s rent
Rent rent rent rent rent
We’re not gonna pay rent

Whenever I think of Rent or hear its music, it always puts me in mind of the late great David Rakoff’s eviscerating critique of the musical (which you can listen to here), in which he points out that none of the play’s would-be artists seem ever to want to do the work of being artists. But his key bone of contention is: “Well … why won’t you pay your rent?” At the very end of his essay, he recounts, of his agonistic 20s:

There were days when it hardly seemed worth it to live in a horrible part of town just so that I could go daily to a stupid, soul-crushing, low-paying job, especially since, as deeply as I yearned to be creative, for years and years I was too scared to even try. So I did nothing. But here’s something that I did do. I paid my fucking rent.

It occurs to me, perhaps uncharitably, that the Revolutionary War part of Hamilton is basically the founding fathers chanting “We’re not gonna pay rent!”—albeit with better songs and a somewhat more nuanced rationale for why they’re not gonna pay rent than their bohemian counterparts.

***

I had this weird thought after reading a column by Bret Stephens, one of the New York Times’ representative conservatives, titled “Robespierre’s America.” Happily, the TL;DR is in the subtitle: “We need to reclaim the spirit of 1776, not the certitudes of 1789.”

If you’re at all familiar with Stephens’ columns, you probably know what’s coming: an invective against the woke sanctimony of the politically correct left, compared unfavourably with the reason and rationality of the Enlightenment principles on which the Declaration of Independence and Constitution were based. He enumerates a series of excesses—starting with his own victimization at the hands of a Twitter mob for calling Reza Aslan stupid—mostly recounted in the abstract, referring to professors afraid to offend students and publishers dropping books at the first whiff of controversy, comparing the ideological rigidity of the woke left to that of the Jacobins:

“Armed with the ‘truth,’ Jacobins could brand any individuals who dared to disagree with them traitors or fanatics,” historian Susan Dunn wrote of the French Revolution. “Any distinction between their own political adversaries and the people’s ‘enemies’ was obliterated.”

Leaving aside the egregious comparison of Twitter warriors with people who literally decapitated thousands, let’s address the implicit comparison Stephens makes between the American Revolution and the French—implicit, because he never explains what he means by the “spirit of 1776.” One assumes he’s citing the tacit understanding of America’s founding as rooted in and emerging from Enlightenment principles of reason, rationality, and spirited public debate—the very understanding, indeed, that made it possible for Lin-Manuel Miranda to write compelling rap battles about the creation of a national bank and the wisdom of carrying a national debt. Certainly, that’s the implied contrast with the ideological fanaticism of Robespierre and his murderous Jacobin thugs.

Normally, this sort of thing wouldn’t bother me overmuch—I find Bret Stephens’ columns annoying, but predictable and forgettable—but given that yesterday was the Fourth of July, I found myself in a headspace to think about 1776 and the American Revolution, so to me the most glaring aspect of “Robespierre’s America” is the way it so perfectly recapitulates—albeit implicitly—certain fallacies not just about the American Revolution, but revolutions generally.

I tend to be leery of revolutions, given that history teaches us that, the more extreme they are, the more they tend to turn into versions of their own worst selves. Hence, the French Revolution devolves into the Terror; the Russian Revolution turns into Stalinism. The fact that the American Revolution did not transform into something equally pernicious has been cited as evidence of American Exceptionalism, which is at least partially true; but I would argue that the principal reason the American Revolution had a relatively placid aftermath (yes, a lot of Loyalists were persecuted, often egregiously, but that hardly compares to 1790s Paris) is that nothing really changed. The radicalism of 1776 wasn’t that of material effect, but of promise—not what actually changed on the ground, but what could possibly change in the future.

For all intents and purposes, there were no upheavals in American life after the Declaration of Independence (well, aside from the war itself), by which I mean that the people in charge stayed in charge, and the power structures of the new United States were not appreciably different from the power structures of the Colonies. The King was not beheaded; the King was not even dethroned. George III basically had his status as absentee landlord revoked.

Hence my thought about Hamilton and Rent: the Boston Tea Party was basically a defiant gesture saying “We’re not gonna pay rent! Rent rent rent rent rent!”, as was the conflict that followed, and that defiant gesture is celebrated today as it was then. But after turfing the Brits, you bloody well better believe you’re paying your rent to the new owners.

By contrast, the French Revolution was about the radical overthrow of extant power, power so rooted in history, religion, and tradition that it went by the name of the ancien régime. And because of the weight of that history, it took decades to stabilize, something exacerbated by the fact that the rest of Europe was undergoing similar political upheavals. Is it any wonder that, mere years after guillotining the king, France had an Emperor?

(All of this is very broad strokes and probably has my historian friends pulling their hair out.)

As I said above, the true radicalism of 1776 wasn’t about the founding fathers’ present moment, but about the future—about what the principles of the Constitution and Bill of Rights could and can do when they become uncomfortably unavoidable. I’d argue that the true American Revolution—which is to say, the truly revolutionary moment in American history—wasn’t 1776 and the aftermath, but the Civil War. The Confederates might have been the rebels, but Lincoln was the revolutionary, insofar as the abolition of slavery overturned a foundational basis of American society. No such upending occurred in 1776, and the principle of revolutions turning into their worst selves has been painfully present in the U.S. since Andrew Johnson reversed all of the provisions made for newly freed slaves during Reconstruction, and white people in the South embarked on a sustained campaign of terror against them.

(To say nothing of everything that has happened since then, which I can’t do justice to here. If you haven’t already, read Ta-Nehisi Coates’s landmark essay “The Case for Reparations.”)

Stephens’ opposition of “the spirit of 1776” to “the certitudes of 1789” completely glosses over the material circumstances of both. The Revolutionary era of America comprises one of the most astounding argumentative ferments of history, with the debates over democracy, individual rights, proper governance, the best ways to defy and prevent tyranny, and myriad other considerations, taking place in taverns, drawing-rooms, the streets, and, most importantly, in print, with pamphlets and newspapers flying back and forth in paper fusillades. It was a period that evinced precisely the kind of civic engagement to which we should aspire, but always with one crucial caveat in mind: it was the province of what we today call privilege, and it has largely remained so ever since. The irony of Stephens’ longing for the “spirit of 1776,” inspired as it was by having been savaged on Twitter, is that had the spirit of that era been as inclusive in practice as it was in principle, we might not be experiencing quite the same polarization today. Stephens’ Twitter Jacobins aren’t analogous to Robespierre, but to the citizens who stormed the Bastille: people finding a voice that had previously been denied to them, through newly available means.

Speaking of revolutions that turn into their worst selves: the tech and digital revolution, specifically the rise of the internet, was heralded by many in the early-mid 1990s as a utopian shift in human connection and collective knowledge; a quarter of a century later, we can see clearly how, even where some aspects of that dream have been realized, the benefits are ambivalent at best. But one key element of digital culture is that it has eroded the prominence of traditional gatekeepers of public discourse in print and visual media, allowing for a host of other platforms online or in social media. These platforms give voice to people who long went unheard, and it should not come as a huge shock that a lot of these voices are angry. It is difficult to make the case for “the spirit of 1776” to groups of people for whom, historically, a place within spirited public debate was never an option.

I have to believe, however, that that particular spirit isn’t dead, and if the Bret Stephenses of the world would pay closer attention to the nuanced and thoughtful arguments unfolding both in “legacy” media and the new, insurgent spaces (and less attention to Twitter), they might be less convinced that there’s a tumbrel waiting for them. Of course, that’s likely a futile suggestion: more likely, it is precisely the growing presence of previously marginalized voices that threatens them and gives rise to the spectre of a guillotine with their name on it.


Ranking the Democratic Candidates. Also, What Job They Should Have.

I cannot watch political debates without playing out in my mind what I would have said. Long before the first debates between the painfully swollen field of Democrats, I’d composed a line in my head that went something like this: “I want to say for the record that everyone here on this stage with me is extraordinary, and it gives me great hope for our future that so many talented, intelligent people are vying for the nomination. And if I may add, I make a promise here: if I am so fortunate as to earn the nomination for the presidency, you can bet that everyone on this stage with me will have a role in my administration.”

Of course, there’s a certain amount of bullshit packed into that platitude: I would be deeply suspicious of anybody, for example, who employed Marianne Williamson. But in broad strokes, I think that sentiment works. What I’ve listed here is my ranking of the Democratic candidates, in order of my preference, but also with the jobs I think they should have going forward.


1. Elizabeth Warren and Kamala Harris: President.

Yup, these two are in a dead heat for me. Prior to the first debates, I was totally Team Warren. I still mostly am, but watching Kamala Harris vivisect Joe Biden was a good reminder of her intellect and, perhaps more importantly, her killer instinct. I’m now of the mindset that, even if Harris doesn’t get the nomination, she should still debate Trump: because that is something we all need to see.

I love Elizabeth Warren and have loved her since the first moment I saw her interviewed. She makes billionaires’ bowels turn to water, and in the present moment, that’s a great thing. She’s fearless, she’s brilliant, and she loves a good fight. I think the most endearing moment for me of Hillary Clinton’s campaign was when Hillary got excited over a question posed during one of the debates, and gave a delighted smile and a little shoulder wiggle. Those few seconds are Elizabeth Warren ALL THE TIME. She’s basically Hermione Granger as a presidential candidate.

I do not, however, see them sharing a ticket. I think that if it ends up being President Harris, Elizabeth Warren needs to be either Treasury or Commerce secretary—ideally with the Consumer Protection Bureau once again under her aegis. If it’s President Warren, then Kamala Harris needs to be Attorney General. That’s just science.

2. Pete Buttigieg: Vice President, or, alternatively, Governor of Indiana.

Mayor Pete has been a breakout candidate, largely due to the fact that he’s hellishly impressive. He’s also a wee bit callow and unseasoned, and needs time in an office not oval-shaped to grow into his potential. At the age of thirty-seven, he has an awful lot of years to do so. Practically speaking, I’d like to see him parlay his newfound visibility into a gubernatorial run, which would benefit the Democrats more than almost anything else he could do. On the other hand, the prospect of watching him debate Mike Pence almost overrules practical concerns for me.

3. Cory Booker: Attorney General? I guess? Or possibly VP?

Cory Booker’s an odd figure, for me … I always want to be more impressed with him than I am. He’s a compelling person with an inspiring message, but he lapses too often into vague appeals to love. It’s not that I don’t find that inspiring, it just makes me wonder what’s going on behind the curtains. Early on, I thought of him as Barack Obama’s heir apparent—a telegenic African-American man with a general message of positivity, but he lacks Obama’s gravitas, and Obama’s obvious grasp of the more granular aspects of policy and history.

4. Amy Klobuchar: stay in the Senate.

Before his ignominy, I listened to Al Franken’s book Giant of the Senate on audiobook, and one of the key takeaways was just how impressive Klobuchar is. This was well before she was bandied about as a presidential possibility—indeed, at the time Franken was considered a more likely candidate—so when she rose to prominence during the Brett Kavanaugh hearings, I already felt like I had a good sense of who she was. The picture painted in Franken’s book is of a frighteningly competent legislator. I would not object to her nomination as candidate, but my general sense is that she does enormous good where she is (reported temper tantrums with her staff excepted).

5. Julian Castro: Secretary of Homeland Security.

Julian had a good debate night, and I quite like him. I don’t think he has any traction for the big job, but he’s obviously talented, ambitious, and very smart. I’d be happy to see him run for Senate or take on the task of repairing all the damage Ben Carson’s done at his old post as HUD secretary, but his powerful words on immigration during the debate make me think he might be just the right person to fix all the shit perpetrated by the current administration, starting with the radical reformation or outright abolition of ICE.

6. Bernie Sanders: take one for the team and retire.

Left-wing American politics owes a massive debt to Bernie Sanders: his insurgent challenge to Hillary Clinton in 2016 did more to move the center of political gravity leftwards than anything since FDR. Let’s keep in mind that nothing Bernie proposes is genuinely “radical” or even technically socialist, but tends to conform to the status quo of most of the democratic world. He recently outlined the ways in which his self-applied label of “socialist” applies, but ultimately what he described makes it clear he’s really a New Deal liberal. Which is not, of course, a problem, except that it highlights the degree to which he relishes his outsider status and relies upon a combativeness that he substitutes for policy substance.

He’s a brilliant rabble-rouser, but would make a terrible president.

7. Kirsten Gillibrand: stay in the Senate.

I have always been underwhelmed by Gillibrand, and continue to be. I think her most useful role is to stay right where she is.

8. Joe Biden: respectfully: please just go away. I love you. But seriously.

I was once at the gym, listening to a podcast that replayed, in its entirety, a speech that Biden delivered to an audience of military families who had had relatives killed in Iraq and Afghanistan. He went off-script in the first few minutes, sharing with the people there his own story of grief, of how his wife and child were killed in a car crash just after he had been elected to the Senate. Come on, people, he said, his voice getting husky, commiserating with them about how useless and impotent others’ expressions of grief—however well-meaning—are in the face of such enormous loss.

As I said, I was at the gym as I listened to this, and had to stop what I was doing and face a wall to hide the fact that tears were streaming down my cheeks. And I thought to myself: this is a politician? I had not understood Obama’s decision to go with Biden until that moment, and I have had an abiding love for the man ever since.

But. He might have been a useful and possibly necessary balance to Obama’s cool, but everything Obama brought to the office (i.e. the main reason Biden is still leading the polls), Biden lacks. Even leaving aside his legislative baggage and lack of message discipline, the very premise of his candidacy—that Trump is an aberration and he can return comity to Congress—is, or should be, disqualifying. For one thing, it suggests he wasn’t paying attention during his eight years as Obama’s VP, when congressional Republicans turned themselves into unrepentant obstructionists. His problematic callbacks to halcyon days of cooperation are bizarrely amnesiac.

His choice to make his campaign all about Trump is similarly obtuse. The biggest threat liberals and leftists face—in terms of their own thinking—is to imagine that any one person is the problem, whether it be Trump in the U.S. or Doug Ford in Ontario. Simply removing Trump from office doesn’t return us to a prelapsarian state of bliss and balance. Anyone who doesn’t grasp the fact that Trump is the symptom and not the disease needs to take a powder.

9. Jay Inslee: Secretary of Climate.

So far there has been distressingly little discussion of the climate crisis among the Democratic candidates. If we’re being charitable, we can chalk that up to the fact that there’s probably a consensus that it is a crisis and requires significant governmental action, and hence the candidates understandably choose to put their focus elsewhere. If we’re being uncharitable—which I think is the wiser choice—they’re avoiding the issue because practical solutions lose traction with voters the moment they understand what the cost will be. People want action on climate in the abstract, but become far more reluctant when it means paying more at the pump.

I like that Jay Inslee is in the race as a single-issue candidate. My sense is that he knows he has no chance, but he’s determined to make everyone pay attention to his issue. Good on him. I hope he sticks it out as long as he can, and forces the front-runners to speak to his issue. Hopefully one upshot is the creation of a new cabinet position: a secretary dedicated to climate solutions.

10. Andrew Yang: Secretary of Tech.

Climate is one issue that deserves its own cabinet enclave; the tech industry is another.

One of the things that has become painfully obvious in the past few years is that the tech industry has completely outstripped government’s capacity to understand it. Some of the most cringe-inducing moments of political theatre in recent memory involved septuagenarian lawmakers asking inane questions of people like Mark Zuckerberg. The key part of the problem is how few people—both within government and without—genuinely understand the nuances of Silicon Valley. Elizabeth Warren’s plan to break up such monoliths as Facebook and Amazon is a pretty good start, but I’d say it’s past time there was a part of government solely dedicated to the tech industry, staffed with people who actually understand its ins and outs, but also—and this is a crucial thing—aren’t acolytes of its utopian promises.

Is Andrew Yang that person? Quite possibly. Whenever I see him interviewed, I find myself nodding along to a lot of what he says, while also thinking to myself that he would be a catastrophic president. Like Jay Inslee, he’s too much of a single-issue guy, but has obviously thought long and in great depth on that issue. He’s a tech dude who’s obviously developed a healthy skepticism about tech, which is the kind of thing the world badly needs.

11. Beto O’Rourke: honestly? I don’t care.

This bit is actually an edit, as I forgot about Beto on my first go-around. I think he’s more impressive than most of the field of bland white guys, but at this point? Not by much. He did a great job campaigning against Ted Cruz, and mobilizing a moribund progressive electorate in Texas, but he hasn’t shown much substance since throwing his hat in the presidential ring.

12. Tulsi Gabbard: Secretary of Defense.

Hear me out on this one: she’s a veteran, and made her anti-war sentiments quite plain during the debates. Possibly someone who could shake up the Pentagon in ways it dearly needs. She wouldn’t be my first choice for SecDef, but it would be far preferable to have her there than in the Oval.

13. Miscellaneous white men (Tim Ryan, Bill De Blasio, John Delaney, John Hickenlooper, Michael Bennet, Eric Swalwell, Joe Sestak, Steve Bullock, Seth Moulton, Wayne Messam): RUN FOR SOMETHING OTHER THAN PRESIDENT.

I don’t like lumping all of these guys into a single undifferentiated category, as it’s obvious many of them have talents and intelligence not so obviously on display in such a crowded field, but SERIOUSLY. Democrats have been living the nightmare of having focused too narrowly on presidential races for too long. Who’s in the Oval Office matters less and less when the Senate, the House, and the governors’ mansions, to say nothing of the state legislatures, belong to the opposing party.


14. Marianne Williamson: You’re perfect where you are, don’t change.

Honestly, people: can we in all good conscience allow such a sensitive soul to inhabit the punishing office of the presidency?


Filed under politics

Jordan Peterson’s “Identity” Fallacies

[Image: Pride flag]

Happy Pride Month, everybody! And what better way to celebrate Pride than with a screed by Jordan Peterson in the National Post?

Ugh. Sorry. Bad joke. But still, he has resurfaced and written a column so chock-a-block with Petersonian fallacies that I really couldn’t do anything other than write a post about it.

What ostensibly inspired Peterson to write this was a piece of reporting by Barbara Kay about a case involving a six-year-old girl whose teacher apparently taught a series of classes on gender fluidity and gender identity, and caused the girl distress when she asserted that there was no such thing as gender, no such thing as boys or girls.

Honestly, I don’t know what to make of that story, except that it feels a little hinky, and I habitually take anything Barbara Kay says or writes with a grain of salt. Leaving aside for a moment the question of its substance, it’s safe to say Jordan Peterson knew precisely what to make of the story. What his column basically says is: this is what I’ve been telling you, people! That is, that postmodern neo-Marxist gender theory is dangerous and will lead to psychological distress in society at large.

He ends up writing what we might call a reverse-Wente. Where Margaret Wente’s modus operandi is to cherry-pick a story that reflects badly on leftists and extrapolate out from some isolated incident to bemoan the general idiocy and moral bankruptcy of liberalism, what Peterson does in his column is save the story of the distressed little girl to the end, after he reiterates his arguments against the inclusion of “gender identity” and “gender expression” in legislation pertaining to discrimination laws and hate speech.

You all of course remember that time! September, 2016—Brexit had happened, but Trump wasn’t yet president, and a U of T psychology professor was vaulted from relative obscurity to alt-right superstardom by railing against Bill C-16 and refusing, loudly and often, to refer to his students by their preferred pronouns.

How innocent we were back then.

But about his reverse-Wente: Peterson spends the first two-thirds of his column reiterating what by now is essentially boilerplate for him, and comes to the Kay piece as a vindication for his earlier extrapolation.

What’s interesting to me is not the Kay piece, but Peterson’s boilerplate. I have, for a variety of reasons (self-loathing and masochism not least among them), read an awful lot of Peterson’s work—12 Rules For Life and (gods help me) large swaths of Maps of Meaning—and watched about as much of his YouTube lectures as I can stomach. So when I read his column, it was not at all unlike reading a SparkNotes summary of his, well, everything.

I’m not going to link to his column, but I do reproduce most of its text below. I go through bit by bit, parsing what he says and offering my own perspective. It’s my Pride gift to everyone: I read and respond so you don’t have to.

Peterson opens by reminding us of Bill C-16 and his initial response to it, and then asserts that the most basic problem with the contemporary conception of identity, ostensibly articulated in C-16, is “that ‘identity’ is something solely determined by the individual in question (whatever that identity might be).” This is a dangerous notion, he says darkly, one that not even sociologists agree with, as they know that “identity is a social role, which means that it is by necessity socially negotiated.”

OK, so before we get into it, it’s important to emphasize the distinction he’s making, because everything that follows is more or less predicated on it. On one hand, actual identity is a socially negotiated thing; on the other, the delusional conception of identity as apparently promulgated by C-16 and “the tenth-rate academic dogmas driving the entire charade” emerges entirely as the product of an individual’s whim.

Here’s the thing, and I say this as a paid-up member of the postmodern neo-Marxist club: Huh? Did I miss that section of reading on my comprehensive exams? Because I’m pretty sure that all the theorists and philosophers of note who comprise the pantheon of Peterson’s hated postmodernists are all pretty much in agreement that individual identity is a product of negotiation, of power relationships, of performance, or, to use the term coined by Louis Althusser (who, now that I think about it, might actually have been a postmodern neo-Marxist), “interpellation”—i.e. the process through which the individual is “hailed” by various ideological state apparatuses (e.g. school, family, church, etc.) and forms an identity through these interactions.

The other thing to keep in mind going forward is how slippery Peterson’s prose is. Basically, this column is a repetition of his anti-transgender sentiments. He doesn’t of course say as much, but that’s what forms the substance of his complaint: people who have the selfish temerity to identify as a gender they weren’t born with, or to reject a gender distinction at all.

All set? Let’s dive in.

Your identity is not the clothes you wear, or the fashionable sexual preference or behaviour you adopt and flaunt, or the causes driving your activism, or your moral outrage at ideas that differ from yours: properly understood, it’s a set of complex compromises between the individual and society as to how the former and the latter might mutually support one another in a sustainable, long-term manner.

OK, first of all: can it be more obvious that Peterson is writing this screed during Pride Month? Referring to “fashionable sexual preference” and “behaviour you adopt and flaunt” is really just a more elevated way of castigating the very deliberate and glorious excesses of Pride—another way of phrasing the old complaint “do you have to shove your sexuality in our faces?” Also, let’s parse this for its most telling words: “fashionable” and “adopt,” both of which suggest that queer identity has more to do with individual whim than anything emerging from personal struggle and pain.

That being said, I wonder if Peterson is aware of just how postmodern this formulation is? (Spoiler alert: probably not). The thing is, I have to imagine that he thinks the first part of what he’s saying here is entirely representative of postmodern thought. But really, nobody—at least, nobody with any intellectual credibility—is arguing that identity resides absolutely within a solipsistic conception of self. What he then goes on about—identity as a negotiation between self and society—is actually a central component of what gets bandied about as “identity politics,” the central premise of which (insofar as it has a central premise) is not that identity is wholly subjective, but that it is not determined by any absolute or extrinsic principles.

By contrast, Peterson’s own unreconstructed Jungian psychomythic conception of identity, as outlined in Maps of Meaning and 12 Rules For Life, specifically suggests a sense of immutable identity, mostly rooted in gender. Although we (like the noble lobster) might interact with our culture and society and forge identity by pitting who we want to be against who we are amid whatever unpleasant or uncomfortable realities we might face, all we’re really doing in these agonistic sagas is playing out the timeless conflict between order and chaos. Peterson’s antagonism to questions of transgender identity specifically and feminism more generally becomes more comprehensible once one grasps this basic premise, which Peterson argues through an odd grafting of myth-criticism and biology.

He then goes on to say:

It’s nothing to alter lightly, as such compromise is very difficult to attain, constituting as it does the essence of civilization itself, which took eons to establish, and understanding, as we should, that the alternative to the adoption of socially-acceptable roles is conflict — plain, simple and continual, as well as simultaneously psychological and social.

We start getting into typical Petersonian verbiage here, so let’s start with the first assertion: “It’s nothing to alter lightly.” The “it” of this statement is one’s identity, which in the broader context of his column refers most specifically to one’s gender identity. And if I may say: I agree with Peterson completely on this point. I will hazard a guess that everyone who has struggled with this issue would also agree. There’s an awful lot in this column with which to take issue, but one of the most galling things is the casual suggestion running through it—which runs through most of his arguments on transgender identity—that people who come to identify as a gender other than their birth assignation, or who identify as gender non-binary, do so “lightly.” That it is akin to a “fashionable sexual preference” which one “adopts” for trivial or selfish reasons.

This is always where Peterson and his ilk lose me. (To be certain, they lose me much earlier, but it’s on this point that I can no longer see the taillights of the car and all I’m left with is blessed silence and the stars). I think there are reasonable arguments to be made about speech codes and the excesses of political correctness, but what we’re on about here is basic empathy and compassion. “It’s nothing to alter lightly”? No fucking shit. Show me, please show me, the person who comes out as transgender who hasn’t gone through the emotional and psychological wringer to arrive at the point where they declare to the world who they actually are. That person in your classroom asking you to refer to them by their preferred pronoun didn’t arrive at that request on a whim.

But to move on to the rest of the quoted passage: this is all so very characteristic of Peterson’s prose. Which is to say, it is convoluted and vague, and rooted in a mythic-historical sensibility that makes sweeping pronouncements on the nature of humanity and civilization, and which tends to crumble under scrutiny. If you’re not familiar with Peterson’s pseudo-scholarly schtick (which, make no mistake, totally informs his public persona schtick), he’s basically, as I say above, an unreconstructed Jungian—a myth-critic in the mold of Joseph Campbell and Mircea Eliade. What he says here in his confused, run-on sentence, is typical of his worldview. To break it down:

  • the “essence of civilization itself” resides in the stability of male and female identity
  • this stability? dude, it took EONS, hence has the authority of ANCIENT HISTORY
  • also, this “stability” comprises “socially acceptable roles,” i.e. men and women knowing their place
  • the “alternative” to these “socially acceptable roles” is conflict; which is to say, when men and women forget their roles (but, really, it’s mostly women), society devolves into chaos

(Let’s be clear on something: this is my reading of Peterson’s words based on my reading of his many, many other words, but there’s a method to the madness of his prose. His vagueness and indeed obscurantist writing invariably contains a rhetorical trap door. “You completely misunderstood me!” is his most common riposte when anyone tries to pin him down on anything he says or writes. “‘It’s nothing to alter lightly’ has nothing to do with trans identity,” I can easily hear him complaining, “I was merely referring to the broader currents of postmodernist neo-Marxist thought in society!” Imprecision is this man’s greatest friend).

To the degree that identity is not biological (and much, but not all of it is), then it’s a drama enacted in the world of other people. An identity provides rules for social interactions that everyone understands; it provides generic but vitally necessary direction and purpose in life. If you’re a child, and you’re playing a pretend game with your friends, you negotiate your identity, so the game can be properly played. You do the same in the real world, whether you are a child, an adolescent, or an adult. To refuse to engage in the social aspect of identity negotiation — to insist that what you say you are is what everyone must accept — is simply to confuse yourself and everyone else (as no one at all understands the rules of your game, not least because they have not yet been formulated).

Oh, my … so if identity is a “drama enacted in the world of other people,” does that then make it—oh, what’s the word—PERFORMATIVE? Is Peterson about to invoke Judith Butler?

Just kidding. Of course not—the point isn’t the drama, but the rules of the game. See how he switches the analogy up in the middle there? In order to play a game, we must agree upon the rules, yes; you can’t play a proper game of chess when your opponent decides to randomly change the moves the pieces can make, but that’s not what Peterson’s example evokes. Rather, the kind of “pretend game” he mentions is improvisational, and if the game is ruined because one player can’t stick to the provisional rules, it is also ruined when you have a bossy player who sucks the joy out of the game by refusing to allow any degree of improvisation and flexibility.

That’s where the aspect of negotiation comes in, a term whose meaning seems to have passed Peterson by. The suggestion being made in this analogy is that someone identifying as transgender or non-binary is being perverse in refusing to play by the rules, and instead play only by their own private rules, which Peterson then dismisses as being nonexistent anyway. What seems lost on him is the fact that someone saying “this is who I am” is precisely engaging with “the social aspect of identity negotiation.” It is, in fact, an act inviting a re-negotiation of the “rules.” The problem with what Peterson argues here isn’t the idea that social negotiation of identity is a matter of give and take, it’s that ultimately Peterson refuses to give. He’s the kid taking his ball and going home because he’s upset the gang decided to let a girl play.

Peterson then goes on to list four increasingly dire consequences of individuals asserting identities at odds with normative “rules.” We’ll break those down one at a time, but first we need to address his prefacing assertion: “The continually expanded plethora of ‘identities’,” he writes, “recently constructed and provided with legal status thus consist of empty terms.” Again, imprecision is a hallmark of Peterson’s writing: one is left wondering what “the continually expanded plethora of identities” is, because he never specifies. One assumes he’s referring to the spectrum of LGBTQ+, but he doesn’t say. If, as seems borne out by the substance of the column, his preoccupation is with transgender and gender-fluid identities, then it’s hardly a “plethora”—it’s male, female, and non-binary. I suppose that, theoretically, these three create a spectrum with an infinitude of points between its poles, each representing a possible unique identity, but then we get into a variation on Zeno’s Paradox. Just as Achilles does, in fact, overtake the tortoise, so we know, commonsensically, that people are people.

Also, pay attention to the weaselly “thus” thrown in there: based on what he has said so far, this “plethora” of identities “thus consist of empty terms.” I have to imagine he feels he has proven his point and earned his “thus,” but I beg to differ.

At any rate, he then enumerates the problems that including this putative proliferation of identities in C-16 will cause.

(1) [They] do not provide those who claim them with any real social role or direction.

Remember, what he’s talking about is the inclusion of gender identity and gender expression in the legal questions of discrimination and hate crimes. The legislation wasn’t about giving transgender and gender non-binary people social roles or direction, it was about protecting them from the actions of others.

(2) [They] confuse all who must deal with the narcissism of the claimant, as the only rule that can exist in the absence of painstakingly, voluntarily and mutually negotiated social role is “it’s morally wrong to say or do anything that hurts my feelings.”

“The narcissism of the claimant.” Here it is again: the premise underlying Peterson’s entire argument in this piece is that trans identity isn’t real. Therefore, anyone identifying as anything other than that signified by their genitalia at birth must be an unserious and selfish person choosing an alternate identity for reasons passing imagination, with no consideration for the confusion it causes in the innocent bystanders upon whom they inflict their petulant demands for recognition. Because they’re the real victims.

I’ve read enough of the science on this subject to accept that there are real genetic and biological underpinnings to being transgender, but really all I need to do to accept and respect somebody’s gender expression is answer a commonsensical question: considering the social stigma, the hatred, and the real danger of violence facing the transgender community, why would anyone choose to so identify for reasons other than a deeply felt need to be true to oneself? Peterson frames his opposition to using people’s preferred pronouns as a question of free speech, but in reality it articulates a profound lack of empathy. Know that when you are faced with someone asking that you use a pronoun that seems wrong to you, that person has endured a probably traumatic struggle to arrive at the point where they can voice the request.

Also, not for nothing, but nobody who has this as his author bio should be casually accusing others of narcissism:

[Image: Peterson’s author bio]

(3) [They] risk generating psychological chaos among the vast majority of individuals exposed to the doctrines that insist that identity is essentially fluid and self-generating (and here I’m primarily concerned about children and adolescents whose standard or normative identity has now merely become one personal choice among a near-infinite array of ideologically and legally defined modes of being).

Psychological chaos? Seriously? Seriously. This makes about as much sense as the old chestnut that exposing children to depictions of gay people will somehow turn them gay. Acknowledging and respecting alternative identities and challenging traditional repressive figurations of sex and gender isn’t about to destabilize “the vast majority of individuals”—except perhaps those who incorrectly see their protected status as straight white men threatened. (Considering how many of those dudes probably bought 12 Rules For Life, Peterson might not want to complain too much).

Also, let’s keep some perspective on the size of the issue. Transgender, gender-nonconforming, and non-binary people comprise a tiny fraction of the population, and they suffer disproportionately from violence, sexual assault, and suicide. As a white cishet man, my own quality of life and my own sense of self does not suffer from the presence or visibility of trans people. It behooves those of us with such privilege first to acknowledge it, and secondly to listen and learn. I have, to the best of my knowledge, known six people identifying as transgender. Six people—in my life. So let’s be real here: they’re hardly storming the Bastille, which you would never know from the edge of hysteria in Peterson’s warnings.

(4) [They] pose a further and unacceptably dangerous threat to the stability of the nuclear family, which consists, at minimum, of a dyad, male and female, coming together primarily for the purposes of raising children in what appears to be the minimal viable social unit (given the vast and incontrovertible body of evidence that fatherlessness, in particular, is associated with heightened risk for criminality, substance abuse, and poorly regulated sexual behaviour among children, adolescents and the adults that they eventually become).

All right. You know what, folks? I’m done. This is some Focus on the Family shit he’s now getting into. All I’ll say about this particular head-smacker is that at least it pulls the curtain briefly aside: as I observe above, Peterson frames his anti-trans and anti-feminist rhetoric as being about freedom of speech, railing against the PC left and SJWs for their ostensible attempts to impose Sovietesque speech codes on everyone. But at the heart of it all is the stern 1950s dad persona he has cultivated, and much of his popularity proceeds from nostalgia for a time when white, straight men’s centrality wasn’t questioned or characterized as “privilege.”

***

I’ll end with a message of love. To all of my queer friends: I am in awe of you. You are the embodiment of strength. I hope your month of Pride is fabulous and remains undimmed by such assholes as Peterson or the hate-mongers who disrupted the festivities in Hamilton. I am with you, and I will go with you.


Filed under wingnuttery

The Politics of Meanness

The word “mean” is typical of the glorious clusterfuck that is the English language, insofar as it wears many hats. Generally speaking, our first encounter with the word probably sounded a note of wounded complaint: someone was being mean to us. “Stop being mean!” “He’s such a meanie.” And so on. As our vocabularies grew, we developed a more nuanced quiver of words that spelled out the spectrum of what being “mean” might be, distinguishing among thoughtlessness, selfishness, cruelty, spite, and just general assholery.

But “mean” has its own subtleties as well, connoting not just cruelty but a certain kind of small-mindedness. To be a mean person can entail a sense of willful ignorance, especially ignorance of the value of the intangible or ephemeral. It can also connote a lack of generosity or compassion, the short-sightedness of NIMBYism or the inability to see value in anything that does not yield immediate benefit. To be mean is to dislike seeing others benefit. To be mean is to lack empathy.

I’m ruminating on this semantic question because it helps articulate something about our present moment, which is a moment in which the politics of meanness threatens to become the status quo.

[Image: Doug Ford]

I lived in Ontario for the first thirty-three years of my life; I went to university and grad school there, and now work as an educator. Which means that a significant proportion of people with whom I’m friends on social media live in Ontario and work in education at all levels, from grade school through high school, at colleges and universities and in libraries. What this further means is that for the past year my news feed on Facebook has consistently featured friends’ anger, incredulity, and despair at whatever indignities Doug Ford’s government has recently inflicted on Ontario’s education system.

To many who lived through the 1990s, it feels like déjà vu, a terrible throwback to the Mike Harris years and the assault perpetrated on the educational system by his minister of education John Snobelen, a man who had no background in the field and came into the job determined to “create a crisis in education.”

I felt the effects of Snobelen’s high-handed and contemptuous treatment of teachers and schools very keenly, as my father was a grade school principal at the time. Worst of all was my father’s mounting bafflement as Harris and Snobelen went through the education budgets like buzzsaws, with little to no concern for the effect their cuts had on students. It was particularly hurtful for my dad because he and my mom had both voted conservative in that election, buying into the rhetoric of Harris’ “Common Sense Revolution” and the promise to right the fiscal ship after what they saw as the New Democrats’ feckless mismanagement. I’m probably cutting myself out of the will by revealing this, as the years of Harris’ regime cured them of their conservative leanings, which had at any rate always been more about fiscal conservatism. Their conservative party had always been that of Bill Davis and Joe Clark, the sort of avuncular Tories whose politics a left-wing type might disagree with, but whose compassion and public-spiritedness were not in doubt.

What my parents didn’t grasp until it was too late, and what my father learned particularly acutely, was that Harris and his ilk embodied a politics of meanness. My dad’s bafflement at John Snobelen’s evisceration of the educational system was the confusion of a dedicated educator who could not understand the rationale behind the cruelty of the cuts. It was only after months of the Harris government’s sustained assault that he came to understand that the cruelty was the point. Harris and Snobelen hated teachers, were in fact antagonistic to the very idea of education more broadly, and the “Common Sense Revolution’s” project of budget-slashing austerity was a very blunt tool for carrying out a mean-spirited revenge, one that ultimately drove my father into early retirement.

Fast-forward to the present moment. Doug Ford is basically Mike Harris on steroids, but he lacks even the ideological veneer that informed Harris. Everything he has done since taking office has had the quality of a bully’s taunt. Like Donald Trump, he revels in being antagonistic, and his most devoted followers love nothing better than enraging liberals, leftists, and “elites,” a term that has come to connote not social status but a kind of attitude, one by which a millennial with an MFA in creative writing earning minimum wage qualifies, but not the premier who was born heir to millions and spent the better part of his professional life in the corridors of political power. Ford and his followers really might as well change his slogan from “For the People” to “NERRRRRRDS!” That would at least be more honest about his policy preoccupations, to say nothing of his general disposition.

Whatever complaints one may have about the Liberals’ long tenure from 2003 to 2018 under Dalton McGuinty and then Kathleen Wynne, the province’s investment in education during this period paid great dividends, with high school graduation rates rising from 68% to 86.5%. That fifteen-year interval tells its own story, namely that these changes take time and diligence, and also that the greater effects are likely always going to be intangible. Speaking as a professor in the humanities, I’m all about the intangibles: getting a degree in English or languages or philosophy or history doesn’t necessarily train you for a specific job, but there is an innate value to learning to read and write critically. There is an innate value, also, to taking drama in high school, or learning an instrument, exploring your creativity, or just opening your mind up to new ideas and stories.

[Image: an empty classroom]

Unfortunately, it is always these programs—music, art, drama—that tend to be the first on the chopping block when budgets are slashed, as they are not seen as “useful” topics. I often ask my students how many of them, when they answer “English” to the question “what are you majoring in?” receive one of two responses: either “what kind of job are you going to get with that?” or “so … you’re going to be a teacher?” The response is pretty much always 100%. Since I first started grad school, there have been more and more articles, columns, and think-pieces by prominent businesspeople, tech moguls, and the like, all pleading with universities to stop cutting humanities programs, as these courses of study produce graduates with precisely the kind of communicative skills and creativity otherwise lacking in industry (most recently it was Mark Cuban, predicting that a degree in philosophy will soon enough be more valuable than one in computer science). And yet the predominant administrative priority, both in secondary and postsecondary education, resides in expanding STEM programs.

Which brings us back to the politics of meanness. Doug Ford and his ilk may be mean in that original sense we all learned as children when someone was cruel to us, but they have also weaponized the sense of the word as “miserly, stingy; not generous” (as denoted in the Oxford English Dictionary), both literally and spiritually. Perhaps the epitome of this sensibility was the absurd claim made by Ford’s education minister Lisa Thompson when she announced that average class sizes in Ontario would increase from 22 to 28. When asked about the deleterious effects of larger classes, Thompson suggested that it would make the students “more resilient,” as if a smaller fraction of the teacher’s attention taught toughness of spirit.

I remember quite vividly a question posed by someone during the worst depredations of Mike Harris’ government: do you remember who your MPP was when you were ten years old? Or do you remember who your teacher was? Our education system isn’t perfect—what would that even look like?—but it has a profound effect on literally everyone. Starving it of resources is, well, mean, both in the sense of being short-sighted and in the sense of being cruel.


Filed under politics

Thoughts on D-Day and Generational Memory

When Tom Lehrer was asked why he quit doing political satire, he famously quipped, “Because Henry Kissinger won the Nobel Peace Prize.” Translation: where do you go from there? What kind of parody or satire can rise to the level of the architect of Pinochet’s coup in Chile and the secret bombing of Laos and Cambodia being lauded as a peacemaker?

If the years since Lehrer’s quip have taught us anything, it’s that metaphors of bars being lowered and new depths being plumbed no longer work. There is no bottom, and new normals will always provide a basis for ironic, satirical critique—even if that critique comes to feel more and more like affectless laughter in the dark. Since Kissinger’s peace prize, a B-movie actor was elected president, a subsequent president was essentially impeached for getting a blowjob, and the Terminator was elected governor of California … and that only brings us up to 2003. The fact that a critical mass of liberals would probably be happy to swap Donald Trump for either Reagan or Schwarzenegger speaks both to the fact that those men had depths belied by their prior entertainment careers and to how far down the political slope, arse-first, we’ve slid.

(Just as an aside: I will maintain to my dying day that Saturday Night Live missed a golden comedic opportunity when, apropos of Schwarzenegger's re-election campaign, they did not stage a skit in which the Governator debated political opponents Sylvester Stallone and Jean-Claude van Damme.)


All of this is by way of saying that, if Kissinger’s peace prize was what drove Tom Lehrer out of political satire, I wonder what he makes of the spectacle of President Donald Trump, he of the bone spurs and dictator-envy, speaking solemn words on the 75th anniversary of the D-Day landings. The layers of irony are thicker than the Burgess Shale: a president whose slogan “America First” was originally used by isolationists and Nazi sympathizers like Charles Lindbergh, who wanted to keep the U.S. out of the war; a president who has consistently attacked NATO and the European Union, both of which were established with the express purpose of preventing another war in Europe; a president who has refused to condemn neo-Nazis and white nationalists, and whose presidency has indeed proved to be a clarion call emboldening the racist and anti-Semitic right; a president whose racist populism has been mirrored in the rise of comparable alt-right groups in France, Germany, Hungary, Poland, and in the viler strains of Brexit rhetoric; a president who loves the idea of military thuggery but seems incapable of recognizing honour and sacrifice, who is so thin-skinned that his aides panicked at the thought of him seeing the name “John McCain” on a ship or its sailors’ uniforms; a president who is even now poised to pardon actual war criminals; a president who, sitting mere feet away from the graves of American war dead, petulantly smears the name of Robert Mueller, a decorated veteran; this president recites the prayer delivered on D-Day by Franklin Roosevelt—a president whose legacy is the antithesis of everything Trump embodies—and speaks some boilerplate platitudes before returning to his golf course in Ireland.

I used to get outraged at George W. Bush’s blithe ignorance, but that was before I knew what was coming: first Sarah Palin as a potential VP, but then Trump himself, someone not just ignorant but functionally illiterate. I’m hardly a monarchist, but I do admire Queen Elizabeth’s capacity to deliver a diplomatic fuck-you, as she did in her choice of gift for Trump: a first edition of Winston Churchill’s history of WWII, something entirely appropriate for the occasion, but also painfully discordant with this president’s aggressive, ahistorical ignorance. Back in the halcyon days of late 2016, such a gift might have encouraged the naively optimistic—those poor souls who fervently wanted to believe that assuming the office would transform Trump—to hope that Trump would read and learn. But that was then and this is now, and so the subtler insult of the gift—the Queen gave him the abridged edition—is lost in the mere fact that simply giving Trump a book, any book, is to draw attention not just to the fact that he doesn’t read, but to his arrogant incuriosity. The Queen could have given him a boxed set of the Harry Potter series and made the same point.

The Queen's gift and the insult it delivers are, sadly, a potent symbol for the present moment, in which the felt history of WWII and its transformative effects on the 20th century have become abstract and mythologized. I teach a class on American literature after 1945, and I always begin with a lecture on the sea-change wrought by the Second World War. I ask my students: where do you think the U.S. military ranked, globally, in size and strength in 1939? My students are astute enough to recognize that, if I'm asking the question, there's a trick in there somewhere. But they've also grown up in a world in which American military might is indomitable, and if they know anything about WWII, it's probably through movies like Saving Private Ryan that depict the vastness of the U.S. war machine. So … Fourth? they say, tentatively. Fifth? A more audacious student might suggest tenth.

No, I reply. Nineteenth. And by 1945, a scant four years after Pearl Harbor, it was rivaled only by the Soviet Union. And so went the rest of the century. Without a grasp of the war at mid-century, one cannot properly understand what came next, and indeed what is happening today.

American troops approaching Omaha Beach.

D-Day occurred 75 years ago, which means that the youngest person who stormed those beaches or parachuted in behind German lines would be 93 years old today (assuming they didn't lie about their age). We're on the cusp of losing the last of the Greatest Generation, and when the last of those people die, so too does the generational memory they carry. We're already seeing the effects: there are a lot of reasons for the rise of the alt-right, but baked in there is a cultural amnesia, a collective forgetting that isn't just about the passing of the generation that fought WWII but about an erosion of historical consciousness. Ask any student of mine and they'll tell you (presumably with an eye roll) that I reliably harangue pretty much every class I teach at some point about the need to read history. The last few years I've taught Philip Roth's novel The Plot Against America, an alternative history that imagines what would have happened if Charles Lindbergh, a Nazi sympathizer, had run against FDR in 1940 and won. I taught the novel once before, when I first started my job at Memorial, but it didn't get much traction with students. Now, however, I assumed it would grip them with its eerie prescience: a story about a populist celebrity with autocratic and racist tendencies upsetting an establishment politician with the message "America First."

But no. It did resonate with a few students, but overall the reaction was meh. A colleague of mine has also taught the novel a few times in recent years, and he reports much the same response.

In my very first year here at Memorial, I taught Martin Amis' novel Time's Arrow in a first year fiction course—a Holocaust novel that takes place with time reversed, the conceit being that only when witnessed backwards can the Holocaust be understood. Backwards, it becomes a story of German munificence, in which they call down smoke and ash from the sky to create inert bodies, into which they breathe life and send them on trains out into the world. My argument in lecture was that Amis works to defamiliarize the narrative of the Holocaust as a means of combatting the way in which repeatedly hearing a story inures us to it, and to reawaken the reader to the pure unthinkability of the atrocity. I cited Art Spiegelman's graphic novel Maus and Roberto Benigni's film Life is Beautiful as texts that perform a similar function, but by that point the blank expressions on my students' faces made it clear that defamiliarizing the Holocaust was a bridge too far when mere familiarity was lacking.

To be clear, I don’t blame my students. They have grown up in a culture that has de-emphasized history, both within the educational system and without, and terms like “Nazi” and “fascist” have more traction as online insults than as historical actualities. Millennials are understandably more preoccupied with the future, given that the realities of climate change mean they may not have one. But if the future is to be secured, it must needs be with global and internationalist solutions—we’re well past the point when nation-states can turn inward. The European Union was hardly a perfect construct, but it emerged from the recognition that the world would not survive another conflagration like WWII. Now that that memory has faded, the EU looks to be on a knife’s edge, and nativist autocracies have been making a comeback worldwide. We should of course honour the sacrifices made by those who fought and died 75 years ago, but more importantly we should remember the collective sacrifices of nations mobilized to large-scale action, and the ways in which alliances and cooperation made the defeat of Nazism possible.

The generational memory of WWII is fading. Let’s lose the platitudes about freedom and sacrifice and the why of it all, and honour the dead by not forgetting the how.


The Sense of an Ending

WARNING: this post contains spoilers for, well, everything.

 

When I was eleven years old, my parents allowed me to stay up late and watch the series finale of M*A*S*H. I loved M*A*S*H, and still do—it was, I think, the first bit of television (aside perhaps from The Muppet Show) that was more than just mere entertainment for me … I was deeply invested in those characters and their situations, and when it came to an end I was gutted by the fact that there would never again be new episodes. Hence my parents’ willingness to let me stay up late for once.

The series finale of M*A*S*H, which ran for a feature-length two hours, remains the single most-watched episode of television ever, pulling in over 120 million viewers. I have never again watched it, and only vaguely remember a few key plot points—Hawkeye has a nervous breakdown, Charles teaches North Korean prisoners to play Mozart, Klinger ends up staying behind to help his new Korean bride find her parents. That, and of course the iconic final shot of the word "GOODBYE" spelled out in rocks for Hawkeye as he choppers away.

Endings are tricky things. When done well, they bring everything that has preceded into sharp relief, or deliver a satisfying sense of closure. I tell my students that the period is the most significant bit of punctuation, because it defines the sentence. Without a period, a sentence simply runs on and on and adds more and more possibly extraneous information, or digresses into the eddies of subjunctive clauses, twisting about its length like the confused coils of a snake, which can of course be virtuosic in the hands of a talented writer, but if the sentence, like a story writ small, cannot be brought to a satisfactory conclusion, then, well …

There are two endings in fiction that have devastated me. The first was when I finished The Lord of the Rings, the novel that first taught me that literature can have affect, can change you on the molecular level. In the final chapter, Frodo and Sam, along with Merry and Pippin, ride to the harbour of the Grey Havens; Sam does not know that Frodo means to leave Middle-Earth forever. Along the way they meet up with Gandalf, Elrond, Galadriel, and Bilbo. Frodo and Bilbo depart with the others across the sea to the Undying Lands. Frodo cannot stay—he has been too deeply hurt by his time as Ring-Bearer. In spite of his grief at losing his best friend, Sam watches him go and returns home to Bag End and his wife and baby daughter.

At last they rode over the downs and took the East Road, and then Merry and Pippin rode on to Buckland; and already they were singing again as they went. But Sam turned to Bywater, and so came back up the Hill, as day was ending once more. And he went on, and there was yellow light, and fire within; and the evening meal was ready, and he was expected. And Rose drew him in, and set him in his chair, and put little Elanor upon his lap.

He drew a deep breath. “Well, I’m back,” he said.

It is a simple enough ending, but that is where its power lies—in the sense of return, of homecoming, a narrative depiction of what T.S. Eliot expressed lyrically in "Little Gidding": "the end of all our exploring / Will be to arrive where we started / And know the place for the first time." There is also, however, a profound sense of loss: though Sam is now entering the next, full stage of his life, the world of Middle-Earth has ended—the magic has literally gone out of the world with the destruction of the Ring and the departure of the elves, all of which for Sam is encapsulated in the loss of his beloved Frodo.

The sense of loss I felt at the end of The Lord of the Rings functioned on several levels, not the least of which was the inchoate recognition that I could never again read the novel for the first time. It was, like Sam’s farewell to Frodo, like saying goodbye to a good friend.

The other ending that devastated me was that of One Hundred Years of Solitude by Gabriel García Márquez. While the end of LotR was all about departing a world that had held me in greater thrall than any I've ever read, Solitude was about getting hit with the hammer of narrative virtuosity. A defining text of magical realism, the novel is a multi-generational, sprawling tale about the (fictional) isolated Colombian village of Macondo. Early in the story, an elderly Gypsy man writes out in coded language the very story of Solitude; the text is indecipherable until decades later when a younger scion of the central family cracks the code and realizes that the Gypsy had essentially foretold his family's story down to the last detail. He reads the final lines of the story just as a hurricane strikes the village, and reads of his own death at the conclusion just as the storm kills him:

Before reaching the final line, however, he had already understood that he would never leave that room, for it was foreseen that the city of mirrors (or mirages) would be wiped out by the wind and exiled from the memory of men at the precise moment when Aureliano Babilonia would finish deciphering the parchments, and that everything written on them was unrepeatable since time immemorial and forever more, because races condemned to one hundred years of solitude did not have a second opportunity on earth.

The convergence of that moment left me quite literally breathless—I had to put the book aside and inhale deeply to deal with its emotional impact.

Novels are one thing, as are films: both tend to be self-contained narratives. Television is quite another thing, unfolding as it does episodically and often over multiple seasons. The shift from episodic to serial TV changes this dynamic, but not entirely: the length of a series' run still tends to be determined by its popularity, and even the most rigorously serial series—I'm thinking especially of The Wire, in which the credits at the end of individual episodes often caught me by surprise—tend to have season-long narrative arcs. And one way or another, television tends to have a cumulative effect: even when we're considering classic syndicated TV (in which self-contained episodes don't require you to have seen anything previous), there is still a great emotional weight when it comes to the conclusion of a series. Theoretically, episodic TV shouldn't need a definitive finale: there is really no need to put a bow on a sitcom or a procedural when each episode follows a wash-rinse-repeat formula. In these cases—excluding, of course, series that find themselves cancelled unexpectedly—making a big deal of the finale is largely about fan service. It was unthinkable to end M*A*S*H mid-stream, just as it was unthinkable to end Friends or Seinfeld or Cheers without giving longtime viewers something approaching the closure of an emotional goodbye.

But what makes a "good" series finale? In case it wasn't blindingly obvious, I'm writing this post apropos of the conclusion of Game of Thrones, and the social media backlash that has accompanied not just the finale, but the entire final season. As I made clear in the previous two posts I wrote with Nikki, I have some fairly serious complaints about the way the series was brought to an end, but they are complaints that fall well short of shitting on the entire show retroactively or demanding that HBO entirely redo season eight with "competent writers" (good luck with that, people). That being said, I think that GoT does fall into the category of Very Good Shows That Ended Badly. This is not atypical of HBO, which has tended at times to rush or condense series for budgetary reasons; most notably, Deadwood and Rome had each planned to run one season longer than they were allowed, with the predictable effect that the conclusions the showrunners had planned were arrived at with somewhat less narrative subtlety than was really needed. We see this most egregiously with Rome, whose first season, I will always maintain, is about as perfect a season of television as has ever been made. It was never intended to be a series to run indefinitely: the creators planned a modest three seasons, but HBO stepped in and told them that would be too expensive for too few viewers, and made them end it in two. Hence, they had to cover way too much historical ground: presumably in the original plan, season two would have ended with the defeat of Brutus and Cassius at the Battle of Philippi, giving season three breathing space to explore the fraught story of Antony, Cleopatra, and the rise of Augustus.

As has been made clear, however, Game of Thrones' hasty ending was not a budgetary imperative but the active choice of showrunners Benioff & Weiss. HBO was willing to let them take as much time as they wanted—unsurprising, considering that the show is the most profitable property ever for the network, even with the huge budgets it demanded—but they opted for brevity. This choice makes me a lot less sympathetic to the last two seasons' flaws. Serenity might not be the greatest film ever made, but one can see in it the nascent virtuosity of a final season of Firefly, had certain executives at Fox not been ginormous douchenozzles; similarly, the final few episodes of GoT feel more like plot sketches than fully realized story, but one can see the shape of a subtle and nuanced conclusion, if only it had been given the space to fill it in.

I suppose it should go without saying that none of this would really be noteworthy were it not for the fact of the series' massive popularity. Had GoT only boasted viewership numbers on par with, say, The Wire—which topped out at about two or three million—not only would it have been an extremely different series, it probably would not have survived eight seasons. As it was (the final episode drew over nineteen million viewers), its popularity fed its budget, giving us vastly more lavish set pieces and special effects than anything we saw in season one (if you recall, there were no large-scale battles then: we only saw the aftermath of the Battle of the Whispering Wood, where Robb Stark captured Jaime Lannister; and the climactic battle between the Starks and the Lannisters resorted to the expedient of having Tyrion knocked cold before the battle started, waking up to hear how it had gone). Lacking the viewership it developed, it might well have gone the way of Firefly—a short-lived and cruelly decapitated piece of well-made TV loudly lamented by fans crying for the blood of the studio execs who wielded the axe.

But its popularity also fed its fans’ expectations, and at the time I’m writing this, the Change.org petition to have the entire eighth season re-done has surpassed one million signatures.

In some ways, the unevenness of series finales is simply reflective of the unevenness of television itself. Episode to episode, season to season, the necessarily collaborative nature of the medium and the necessarily sprawling nature of the storytelling lend themselves to a significant ebb and flow of quality and focus. The revolving doors of writers' rooms, the switching up of showrunners, pressures brought to bear by ratings and studio interference, the departures and arrivals of key characters and actors—all of these considerations and more mean that it becomes difficult to look at a television series in its entirety as a cohesive, finished text. (By way of example, a question for passionate fans of Lost: if you could go back and change ONE THING, would you "fix" the finale or excise the protracted Nikki and Paulo storylines?)

The rise of the televisual auteur à la Joss Whedon, Amy Sherman-Palladino, Aaron Sorkin, Shonda Rhimes, David Simon, or Benioff & Weiss has meant that there is more television out there now with more coherence in terms of vision and over-arching narrative, but the flip side of that is when the auteur departs a given show, especially when the departure is acrimonious: fans of Gilmore Girls, The West Wing and Community will all attest to, if not necessarily a decline in quality, then certainly a change in the basic character of these shows when Sherman-Palladino, Sorkin, and Dan Harmon were respectively given the boot. The situation with Game of Thrones was a bit different: it was (the consensus seems to be) at the point when the series definitively outstripped the extant source material that things started to go pear-shaped—perhaps revealing that the showrunners were very good at adapting rich and complex narrative to a more abbreviated format (mostly—I think most of us would agree that the Dorne subplot was something of a failure), but not so good at building out from a thumbnail sketch to a nuanced and textured story.

I suppose the TL;DR of all that is that almost all television, but especially longer-running series, has peaks and valleys, good episodes and bad, stronger and weaker seasons, and that how a series ends is a function of that inconsistency. Game of Thrones always had its work cut out for it, as it is a story that necessitates an end in a way that almost all the other flagship dramas on HBO have not. Deadwood ends with the passing of a lawless order and the establishment of a corrupt legal order; Six Feet Under ends with one stage of Claire's life ending and a new one beginning; The Wire ends with a recognition that nothing really ever changes; and so on. Which is why I think the series finale of The Sopranos—which evoked Lost-level howls of complaint—was particularly brilliant. Cut to black. Wait—what happened? Was Tony about to get whacked in the diner? Analyses of that final scene have been written with Talmudic intensity, trying to come to a definitive answer, but I think the point was that it doesn't matter. The cycle continues one way or another, a point made more lyrically by the montage at the end of The Wire, which shows change at the personal level for some characters, but none at all on the societal level.

In the end, there's a certain truth to the claim that series finales are about fan service. I was thinking about this after watching Avengers: Endgame. I would imagine that, to someone who has been an indifferent and sporadic viewer of the MCU, that film would seem needlessly protracted; speaking for myself, as a fan who has seen all of the preceding films, I felt quite definitely served, to the point where I really could not care less about the glaring time-travel inconsistencies. We do expect a certain emotional punch at the end of things, which was probably why the one part of the Game of Thrones finale I haven't read or heard many complaints about was the final montage of the Stark children: Sansa being crowned, Arya the Explorya heading west on a direwolf-prowed ship, Jon returning north and being reunited with Ghost. Those few minutes, at least, felt something like closure accompanied by a swelling soundtrack.

I think this might be why proleptic endings, i.e. those that project into the future to show you the fates of beloved characters, tend to be the most successful. I asked the question on Facebook of people's favourite series finales, and by far the most common answer was that of Six Feet Under: as Claire drives east to her new life, we have a montage, set to Sia's haunting song "Breathe Me," of the deaths of all of the series' main characters. What makes this work so well is that it is entirely in step with the series' key theme: at the start of every episode we see someone die, who will then end up at the Fisher family funeral home, along with a title card with their name and the years they lived.

Another favourite was the final episode of Parks and Recreation, which similarly looked into the future to show us where and how everyone would end up. And more recently, the series finale of Veep ended twenty-four years in the future, with everyone attending Selina Meyer’s funeral. After seven seasons, Veep has the distinction of being one of the more consistent television series, in both tone and quality, ever made … but the fact that coverage of the Meyer funeral was pre-empted by the death of Tom Hanks at 88 seems like a sly acknowledgement of the fact that the conclusion of Veep was almost certainly going to be overshadowed a week later by the conclusion of Game of Thrones.
