Wildland

WILDLAND: THE MAKING OF AMERICA’S FURY
By Evan Osnos

I’ve written before about how the dominant political emotion of our age is anger, a point brought home just by looking at a list of some of the titles I’ve reviewed: Gavin Esler’s The United States of Anger, Alexander Zaitchik’s The Gilded Rage, Pankaj Mishra’s The Age of Anger, and even the second book of Bob Woodward’s trilogy on the Trump presidency, Rage.

In Wildland: The Making of America’s Fury Evan Osnos tries to come to grips with this same phenomenon. For the most part, his diagnosis runs along what have now become clearly established lines, with the fuel for America’s fury being provided by the decades-long growth in economic and social inequality. As the sociologist Nicholas Christakis has found, inequality is a social cancer, one that has “subverted group cohesion, making people less cooperative, less friendly, and ultimately less able to work together.” Government is no longer able to offer a solution, as one of America’s two effective parties (and in any first-past-the-post electoral system there can only be two effective parties) has now defined itself as at war with the very concept of using federal power for any other purpose than deepening inequality. This becomes a vicious circle. The wrecking crew destroys government, leading to more people blaming government for being ineffective.

The public has given up. Shuttling between Clarksburg, West Virginia, and the South Side of Chicago, Osnos picks up on “a sensation that was calcifying in America’s political culture – a feeling of being trapped by an undertow of economics and history, of being ill-served by institutions, of being estranged from a political machinery that was refined, above all, to serve itself.” Government had become identified with the dreaded elites, while being unresponsive to and unrepresentative of the people.

The larger fact was that, year by year, the West Virginia public was losing faith in politics at all. In 1960, more than 75 percent of eligible voters had cast ballots – almost 14 percent more than the national average. By 2012, West Virginia’s turnout had sunk to 46.3 percent, the second-lowest level in America. Over the decades, the compounding effects of political cynicism and influence had broken public faith in government.

I mentioned Clarksburg and Chicago as two of Osnos’s ports of entry into America’s Wildlands (a term firefighters use to describe dried-out terrain that provides perfect tinder for forest fires). The third place he goes to is Greenwich, Connecticut. This last is a place not like the others, being the sort of Emerald City where the economy’s winners (principally hedge-fund managers and people working in finance) have built their fortress-style McMansions. But though living in another world, the citizens of Greenwich are part of the same story:

As Americans reckoned with the origins of our political moment – the Trump years, the fraying of a common purpose – we tended to focus on the effects of despair among members of the working class who felt besieged by technology, globalization, immigration, and trade. But that ignored the effects of seclusion among members of the governing class, who helped disfigure our political character by thrusting absolutists into positions of power and then ignoring their violence – all while enfeebling the basic functions of the state. They had secured their control over the levers of democracy but disowned the consequences of its deterioration. They had receded behind gracious walls.

The point Osnos is making is that while on the most visible level inequality favours the few at the expense of the many, in fact it’s bad for everybody. I think this is right, and the effects are probably even worse for the ruling class, at least in moral terms. That said, where would you want to live, Greenwich or Clarksburg?

There is also a warning implicit in the metaphor of the wildfire, which burns everything down once it’s lit. The image suggests political revolution, and it may well be that we’ll look back upon the Trump years, culminating in the assault on the Capitol, as a kind of revolution. I dislike revolutions though, preferring the natural evolution of political systems as they adapt to deal with emerging changes and crises. The problem with revolutions is they have a bad habit of spinning off in directions no one anticipated or desired.

But what is to be done? I quoted Osnos earlier talking about America’s “calcifying” political culture. This is a word that brought to mind Ross Douthat’s The Decadent Society, which saw America as sclerotic and sterile. In short: old. This is no longer the America of Paine and Emerson, issuing radical calls to make the world new. Instead it’s an America of affluent retirees, where the average age of a Senator is 63 and the last presidential election was between two men over the age of 70 who were both in pretty obvious mental decline. The greatest threat to such a governing class is change, any change. As Osnos observes, by 2020

Money and concerted obstruction [in Washington] were damming the natural routes of political evolution. This was easy to overlook because it was less a matter of what was happening than what was not happening. Historically, Americans had maintained the fitness of democracy by amending the Constitution, on average, at least once a decade. But that pace had stalled for half a century. Other than a minor amendment in 1992, to raise congressional salaries, the last major change to the Constitution was in 1971, when the voting age was lowered to eighteen. Despite campaigns for the Equal Rights Amendment, to prevent gender discrimination, and for reforming the Electoral College, Americans had entered the longest stretch without a substantive amendment since before the Civil War. The sclerosis extended to the inhabitants themselves. The Senate was the oldest in history, including eight octogenarians, nearly twice the number who had ever served at one time.

Canada is in no better shape. We seemingly can do nothing to make any changes to our dysfunctional electoral system, or reform our Senate, a body that serves no purpose whatsoever. So instead we lurch from crisis to crisis, while our politics, shaped by the first-past-the-post system, become ever more polarized.

Wildland is a well-written and insightful book of on-the-ground reporting. It also gives me no hope for the future. If we can’t choose to change, and direct that change, then change will eventually be thrust upon us. And we aren’t going to like that one bit.

Notes:
Review first published online April 25, 2022.

The Anatomy of Fascism

THE ANATOMY OF FASCISM
By Robert O. Paxton

Robert Paxton begins this authoritative account of fascism by calling it “the major political innovation of the twentieth century.” There’d been nothing like it before, and some would argue we haven’t seen anything like it since 1945.

Grounding fascism’s origins in conditions specific to post-First World War European society may limit the concept somewhat, but I think fairly so. In his penultimate chapter, on fascism as it has appeared in “Other Times, Other Places,” Paxton shows how the movement’s twin ur-types (Italian Fascism and German Nazism) now provide only a toolkit for contemporary authoritarians. But to the question of “Can it happen here?” (meaning the West, and more specifically America) he provides a monitory send-off (and remember, this is 2004):

The well-known warning signals – extreme nationalist propaganda and hate crimes – are important but insufficient. Knowing what we do about the fascist cycle, we can find more ominous warning signals in situations of political deadlock in the face of crisis, threatened conservatives looking for tougher allies, ready to give up due process and the rule of law, seeking mass support by nationalist and racialist demagoguery. Fascists are close to power when conservatives begin to borrow their techniques, appeal to their “mobilizing passions,” and try to co-opt the fascist following.

There is nothing unique to fascism in this. The great political –isms have all gone the same way, evolving into new forms. Communist China has little to do with anything anyone in the nineteenth century, or even much of the twentieth, would recognize as communist. Populism has had its meaning hijacked by its enemies, a process recently described by Thomas Frank in his book The People, No. It’s almost impossible to say where liberalism lines up today, whether it be something progressive or neoliberal or libertarian.

I use the word evolve to describe this transformation, as the great –isms have adapted to a changing political environment while converging in their development into a new species of political power: a global caste of tech-enabled kleptocrats without any political ideology beyond self-enrichment. In this they may be seen as representing what will turn out to be the major political innovation of the twenty-first century.

But the new authoritarians aren’t entirely new. As Ronald Syme put it in his classic work on the end of the Roman Republic: “In all ages, whatever the form and nature of government, be it monarchy, republic, or democracy, an oligarchy lurks behind the façade.” Today’s ruling elites constitute a more cynical and, perhaps paradoxically, less politically engaged class than previous historical examples, but they are no less dangerous (even if less militaristic), and no less efficient in their capture of state resources. They have learned to take advantage of new opportunities and public anxieties, from immigration and economic disruption to increasing inequality, political polarization, and the baneful effects of social media. Fascism, in brief, is no longer the threat it was, but only because it has mutated into something that authoritarians have found works better for them.

Notes:
Review first published online April 18, 2022.

The Decadent Society and On Decline

THE DECADENT SOCIETY: HOW WE BECAME VICTIMS OF OUR OWN SUCCESS
By Ross Douthat

ON DECLINE: STAGNATION, NOSTALGIA, AND WHY EVERY YEAR IS THE WORST ONE EVER
By Andrew Potter

It’s ironic that the age of postmodernism – broadly, the back half of the twentieth century – among whose foundational beliefs is the invalidity of historical meta-narratives, has itself been characterized by many historians as representing one of the clearest, and certainly most recent, examples we have of such a meta-narrative in operation.

What I’m referring to is the myth of a decline from a golden age. The golden age in this context is what Eric Hobsbawm, in his magisterial history of the twentieth century, Age of Extremes, more specifically called the golden age of capitalism, which ran from roughly the early 1950s to the early 1970s. During this period Western economies boomed, there was rapid technological progress, internal improvements were the order of the day, and societies became more egalitarian.

The 1970s saw a swing away from all this, often seen as triggered by the oil shock and identified with a hard turn to the political right and the neoliberal agenda of leaders like Margaret Thatcher and Ronald Reagan. Wages stagnated. Economic inequality grew. Environmental issues like pollution, extinction, and global climate change went from being persistent and intractable problems to lost causes. Even life expectancies began to decline for the first time since we started recording them.

Scientists began talking about “peak science,” the idea that from now on we were going to spend more and more time, money, and effort to learn less and less, while in the cultural field commentators began to take note of the triumph of nostalgia. “As it evolves into the dominant mood of the twenty-first century,” Andrew Potter writes, “nostalgia culture has just become the culture, one where consumer crazes and social media shivers amount to little more than the context-free curation of the past.”

I’ve written a lot about this point myself in regard to Canada’s fetishizing of a golden age of CanLit, but it’s a phenomenon that’s widely attested elsewhere. Kurt Andersen, perhaps the first person to sound the alarm on the nostalgia cult, is fixated on the subject in his book Fantasyland. In Hatchet Job the film critic Mark Kermode makes the argument that movies peaked in terms of their popularity as far back as the 1930s and ‘40s and that “to all intents and purposes we are now merely sifting through the wreckage of an art form whose popular supremacy has long been superseded.” That’s true, and what’s more to the point, how many of our biggest movies today are remakes, reboots, and legacy intellectual properties going on fifty years old that have now been turned into franchises? Quite a lot of them.

Music? As recently reported by Ted Gioia, old songs now represent 70 percent of the U.S. music market, and it’s a trend that’s worsening: “The 200 most popular new tracks now regularly account for less than 5 percent of total streams. That rate was twice as high just three years ago.” I was struck by this most recently when listening in to a neighbour’s house party and hearing nothing but songs from the 1980s. What, I wondered, has happened to today’s kids, to be so much in love with the music of their parents, or even grandparents? But maybe the kids were alright. Perhaps the question I should have been asking is what had happened to their music.

How can one live in such a social, political, and cultural moment and not start to think about stagnation and decline? Ross Douthat and Andrew Potter are two writers whose thoughts have turned in that direction, and in The Decadent Society and On Decline they present very similar takes on the problem. From their analysis only dismal conclusions can be drawn.

Though it’s much shorter (it’s part of the Biblioasis series of Field Notes), On Decline strikes me as having the firmer grip on what’s going on. Front and centre is the historical myth of boom and bust, golden age and fall. For Potter, as for many observers of the period, the golden age wasn’t an example of the inevitability of progress so much as a historical blip brought about largely by a wealth of easily exploited energy resources. In the post-WW2 period we hadn’t advanced to some higher state of civilization but only won a lottery:

Our mistake was believing that the world had figured things out in a way that was more or less stable and permanent. It turns out that this period of stability and growth was temporary. Progress itself was something that fed off a massive one-time windfall we gained access to in the nineteenth century. We didn’t climb a ladder, we stumbled into a buffet. We’ve been feasting off that buffet for a few centuries now. Unfortunately, it looks like the party is coming to an end.

Having gorged ourselves at this buffet, or sucked dry what Potter elsewhere calls the post-WW2 “oasis” of low-hanging fruit, we are now coming up against the hard limits of growth. Douthat likens what’s happening to the frontier thesis of the American historian Frederick Jackson Turner, which saw the American West as giving rise to a spirit of democracy and egalitarianism in that country. In turn, the frontier’s closing (dated 1890) could be taken as marking a high tide in these values. Douthat thinks this “can be usefully applied to the entire modern project” because “bedrock assumptions” like perpetual progress can now be seen as having been based upon our expansion into new worlds that no longer exist. There’s no more free land, or free lunch.

Another analogy that came to mind was the one put forward by Pierre Berton in his book 1967: Canada’s Turning Point. Why, Berton wondered, looking back thirty years later, were we so nostalgic for the Centennial?

By a number of measurements we are a great deal better off today than we were thirty years ago. We are healthier and we are wealthier than we were in 1967. The real net worth of the average Canadian is almost double what it was back then. Babies born today can expect to live longer – six years more than the centennial crop of babies. The death rate for infants has dropped from twenty-two per thousand to six. Far fewer mothers die in childbirth. And, as far as minority groups are concerned, we live in a much more tolerant society and one that is far less repressed.

Why, then, do we look back to 1967 as a golden year compared to 1997? If we are better off today, why all the hand wringing?

In answering that question Berton suggests various reasons, like the fear of the country splitting apart, but more broadly he draws a connection to an aging population. What happened from 1967 to 1997? The Boomers got old, and with their youth went their optimism and dreams for a golden future.

We were all high in 1967, like somebody who has just won the lottery. Expo taught us to go first class, and we reveled in the pride that inspired. In those days we felt secure as Canadians, confident enough to push for a better, freer life. We did not count the cost until the bills began to come in. The years that followed had some of the effects of a hangover after a binge.

The buffet, the oasis, winning the lottery, the drunken binge – they all work as metaphors. The point being that now the party’s over. The optimism, confidence, and sense of security enjoyed in the golden age are gone.

This is bad news because, as Ross Douthat argues, progress is a necessary fiction for modern societies. Indeed, he even goes further and equates the notion of progress with civilization itself. What happens when we stop believing in our very purpose?

The biggest effect this loss of faith has had so far is on our politics. A society that sees itself, correctly or not, as being stuck in a state of (terminal) decline will be first and foremost one that is, paradoxically, resistant to changing course. All change will be seen as change for the worse, or as losing everything in what is a zero-sum game (hence the current vogue for seeing every crisis as “existential”). A voter’s prime directive becomes holding on to one’s privileged lifestyle. The beneficiaries of the banquet/oasis/post-War party were the Boomers and, being old, they are the ones who now have the most to lose. What Douthat means by a decadent society is one that can be characterized more accurately as a society of retirees, with stagnation being synonymous with sclerosis and sterility (both being words that he uses). The whole world, to paraphrase Eliot, is our nursing home. Or, per Douthat:

we are aging, comfortable and stuck, cut off from the past and no longer optimistic about the future, spurning both memory and ambition while we await some saving innovation or revelation, burrowing into cocoons from which no chrysalis is likely to emerge, growing old unhappily together in the glowing light of tiny screens.

Those screens, in turn, are our invitation into more comforting virtual realities, the environment of Andersen’s fantasyland. True belief being no longer necessary for survival, we are cut free to believe anything we want in what Steven Pinker calls the tragedy of the belief commons. Here is Potter on the political endgame brought about by the closing of the Western mind as well as the political frontier:

It’s the simple fact of economic expansion that inclines people towards feelings of openness and toleration and that inspires trust in our democratic institutions. Just as the knowledge the pie will keep getting bigger makes people more generous in the divvying up of that pie, the sense that we can expect things to get even better – no matter where we currently are on the development curve – acts as a sort of bellows of fellow-feeling, making people more hopeful for the future and more generous-minded. More than anything else, the mere fact of growth is a signal that the future will be better than the past.

Unsurprisingly, the opposite holds during periods of stagnation, when zero-sum thinking kicks in. When the economy stops growing or even starts to shrink, people become fearful for the future, suspicious of immigrants and diversity in general, and distrustful of democracy. Stagnation breeds authoritarianism – that, of course, is one of the great lessons of the 1930s, as the Great Depression drove diverse, democratic populations toward nationalism and into the arms of fascist dictators. While there are no iron-clad laws of history, economic stagnation and the decline of liberal democracy are strongly linked.

Not a happy ending, but these are books about the end of the world as we knew it. Is that decline, or decadence, or something new that we can’t identify yet? I think the answer lies in our past, which gives me little hope for the future.

Notes:
Review first published online March 21, 2022.

On Fascism

On Fascism: Lessons from American History
Matthew C. MacWilliams

Fascism is a label that gets thrown around a lot, and while that has diminished some of its impact I think it still has some usefulness. For Matthew MacWilliams it basically means an authoritarian form of government brought about by a demagogue’s manipulation of the electorate’s fear. This fear is, in turn, directed toward a mostly racialized “other.” In the U.S. this means Native Americans, Blacks, Chinese, Japanese, Mexicans, Muslims, and other readily identifiable groups.

MacWilliams draws on various recent polls on America’s authoritarian attitudes and concludes that his country is today facing a real threat to its ideals, particularly in relation to democracy and the rule of law. He provides a quick survey of some of the most significant lowlights of American history, but there’s little deep or connecting analysis showing how these ideas work together to constitute a clear and present danger.

“Broadsword Calling Danny Boy”

“Broadsword Calling Danny Boy”: Watching Where Eagles Dare
Geoff Dyer

Recent years have seen an explosion of monographs on famous (and some not-so-famous) movies, from standalones like Noah Isenberg on Casablanca, Sam Staggs on Sunset Boulevard, Sam Wasson on Chinatown, and W. K. Stratton on The Wild Bunch (these are all on the shelf beside me now) to whole series like the BFI and Soft Skull’s Deep Focus companions. “Broadsword Calling Danny Boy” is a bit like one of these, and may also mark the mid-point of a trilogy of film books by Geoff Dyer, beginning with Zona (on Andrei Tarkovsky’s Stalker) and with the possibility of a follow-up appreciation of John Boorman’s Point Blank teased at the end of this one.

I say this book is like the other books I mentioned, but it’s something quite a bit lighter: nothing scholarly about it but rather just a breezy running commentary on Where Eagles Dare, a 1968 WW2 action film that has gone on to achieve minor cult status, I think mainly for the sense of nostalgia it evokes among men of a certain age. I don’t think Dyer did much if any research into the film, instead choosing to get by with lots of smart talk and breathless run-on sentences. It’s a quick read – quicker than the movie even – and a lot of fun, but don’t be looking to get more out of it than you would re-watching Where Eagles Dare on late-night TV while half-awake. In addition to being irreverent (was Eastwood’s Lieutenant Schaffer fellating Richard Burton in the back of that sedan?) Dyer is also a deeply personal, impressionistic critic and frankly describes the book as yet another chapter in his autobiography. I thought that a welcome change of pace, but if you don’t care for such an approach you might want to take it as a warning.

Reign of Terror

REIGN OF TERROR: HOW THE 9/11 ERA DESTABILIZED AMERICA AND PRODUCED TRUMP
By Spencer Ackerman

Most of the time, when people speak of American exceptionalism they mean it as something to be proud of, if not an outright boast. This positive brand of American exceptionalism refers to the sense of the United States as having a providential purpose and providing a light unto other nations.

There is, however, a darker side. This is what Spencer Ackerman explores in Reign of Terror. The light of nations is only an “exceptionalist euphemism that mask[s] a boundless, direful ambition.” What exceptionalism really refers to is the U.S. being an exception to moral and legal norms, which it feels free to enforce without having to follow. It refers to racial exceptionalism, of the kind that says white nationalist terrorism isn’t real terrorism and can’t be dealt with in the same way (the “foremost lesson of 9/11” would be “the terrorists were whomever you said they were”). And it refers to actions being free of consequences, the idea that jettisoning principle and the rule of law would all work out in the end and that the War on Terror would always be fought “over there” and have no impact on lives at home.

What Ackerman wants to underline is not only the falsity of this belief, but that counterterrorism may in fact be a case of the cure being worse than the disease. It would be the War on Terror that would pose the greatest threat to the fabric of American life, not terrorism itself. American exceptionalism, however, suggested a state of perpetual innocence: no loss, no consequences, no responsibility. Or, in the language of Trump: “I don’t take responsibility at all.”

On the question of whether Trump marked a break with the past or a continuation or logical progression of a rightward political drift Ackerman comes down more on the side of the latter. Not just Republicans but the whole apparatus of the technocratic security state, the military-industrial-information complex, had its fruition in Trump. The so-called “Resistance” to Trump would cheer on the “Adults in the Room, without considering that an earlier set of adults, the adults they esteemed, had already prepared the room.” Trump only took the varnish of the good exceptionalism off. “You think our country’s so innocent?” he would ask, rhetorically. His “great insight was that the jingoistic politics of the War on Terror did not have to be tied to the War on Terror itself.” Instead, he could just plug directly into a racial “war of civilizations” and talk about destroying the Middle East in order to take its oil. Many people found this refreshing.

While there’s much to take note of here, I had the feeling that Reign of Terror was a bit rambling, covering a lot of ground but in need of greater focus. There were times when I thought a long essay might have done the trick. But the book does add to the argument that there has been a through-line, or continuity, in American foreign and domestic policy over the course of the last twenty years, contributing to a period of endless, often invisible wars that would “achieve neither peace nor victory, only prolonged violence.” A result that everyone would complain about, but which might have been the goal all along.

Notes:
Review first published online February 21, 2022.

Making Darkness Light

MAKING DARKNESS LIGHT: A LIFE OF JOHN MILTON
By Joe Moshenska

When the biographer Edmund Morris was given the job of writing an authorized life of Ronald Reagan he found himself at a bit of a loss as to how to draw an honest and accurate picture of the man. In what was a highly controversial move he decided to write Dutch as a fictional historical memoir, telling Reagan’s story from the point of view of a made-up character. Some scenes were also dramatically embellished while others were simply made up.

I had to think of what Morris did in Dutch when reading Joe Moshenska’s new biography of John Milton, Making Darkness Light. Given how much has already been written about Milton, Moshenska found himself feeling a bit like Samuel Johnson, who a quarter-millennium ago faced the same task. Was there anything new to say? And how, in pursuit of that holy grail of every biographer, could he get inside the man?

Moshenska’s answer is to adopt a highly personal approach, and to use a lot of dramatic imagination.

These are both important points that need to be explained. In the first place there is the personal or subjective approach taken to the subject. “The way in which Milton matters to me is now entangled with the whole of my life,” Moshenska begins, “and this means that to write about him, to make any kind of sense of him, is partly to think of his place within this whole.” That is, within the whole of Moshenska’s life, which is what gives this biography its true intellectual context.

I can only write about Milton’s life in his times by reckoning along the way with his place in my own life, in my times. This will mean bringing Milton’s life and his writings into contact with the personal and public worlds that he inhabited, but also showing along the way how his writings have come alive for me . . . If this means staying less than fully focused on the facts of Milton’s life and work, I hope it will be truer to what I see as one of his deepest preoccupations: the place of literature in a life.

What this means in practice is that, for example, Moshenska will not only travel to many of the places where Milton lived or visited, but that he will write directly about his (that is, Moshenska’s) experience as a twenty-first-century literary tourist.

Does it work? Only some of the time. The idea that any biographer or historian or literary critic approaches their subject from a particular, personal point of view that colours their interpretation and understanding of the evidence seems trite to me, and doesn’t justify this amount of self-awareness. Reading non-fiction, one cares about the story being told more than finding out about the storyteller. That said, the blending of biography and memoir is a powerful current in our own time, something seen most obviously in the true crime genre recently.

The second point has to do with dramatically imagining scenes from Milton’s life that may or may not have happened. For example, the question of whether Milton actually got to meet Galileo on his trip to Italy is one that scholars have argued over for centuries, but Moshenska goes ahead and gives us an account anyway, even having the Tuscan artist let the visiting poet gaze through his optic glass. Might this, or something like this, have happened? We can only say that it’s possible. Similarly, it’s not known whether, while visiting Paris, Milton attended a dinner where he sat next to Sir Kenelm Digby. So after presenting an account of just such a dinner Moshenska has this to say:

At this point I need to put my cards firmly on the table. What evidence is there that this dinner, or one like it, ever took place? None at all. What evidence is there that he ever met Sir Kenelm Digby, the man next to whom I seated him? Again, none. So why have I troubled you with it? On one level I must own up to some self-indulgence, though of a sort I am happy to defend. Digby is another figure from Milton’s era in whom I have invested many years of thought, whose writings I have read and in whose footsteps I have sometimes travelled. I’ve often been thinking about the two of them at the same time, and so of course they are braided together in my own mind: in this sense, bringing in Digby reflects my own interconnected preoccupations . . .

Because Moshenska has spent a lot of time studying Digby he finds the idea of such a meeting of intellectual opposites, “the polar extremes of seventeenth-century life,” intriguing. It can also help illustrate the nature of those extremes. In this way, like a good historical novel, the dinner scene can be said to aid our understanding of the social and political milieu of Milton’s life, even if it’s wholly imaginary. But one still wants to object: is it true? And if it isn’t, how misleading might all this be?

These two directions taken by Moshenska are what set Making Darkness Light apart as a Milton biography. I will be honest and admit (in a manner that I think Moshenska, who is interested in how reading happens, would appreciate) that they made me feel at times, especially in the early going, like giving up on the book entirely. There are moments when it becomes impossibly precious. What are we to make of a chapter that begins like this:

Where are we?

Unclear.

Vision is doubled, split, one possibility layered over and vibrating with another, but the two refusing to coalesce into a single scene. As if the two eyes are each seeing something different, peering in different directions like the revolving eyeballs of a chameleon, but into entirely different spaces, worlds.

When are we?

No clearer.

Time is what we’re not supposed to notice; it’s what allows our attention to take place, not something to which we attend. But that seems impossible, here. It too is divided; no, rather, it’s overstuffed, full; there seems to be too much of it. Somehow the flow of time wants and manages to do impossibly different things all at once. It races eagerly ahead like a river swollen after rain; it curls around and back on itself like that river’s eddies and whorls, its turbulent pockets and vortices; it seems to want to freeze into ice, hard enough to skate upon, and pause at the moment of its sudden crystalline stillness. None of these can happen, but none of these possibilities will surrender to the others.

This, to make use of a very technical word, is mush. It’s at moments like these that you have to remind yourself you’re reading a critical biography. A subjective, dramatic approach has collapsed into a sort of poetic expressionism, and it’s a long way from telling us anything about Milton.

That said, I can now happily add that I’m glad I stuck with Making Darkness Light. The thing is, when he gets down to talking about Milton, Moshenska proves himself to be an adept, well-informed, insightful, and sensitive reader. When he has a bit of text between his teeth he can really pull. I learned quite a bit, and found myself being led into thinking about Milton in interesting new ways. There were moments of disagreement – I doubt Edward King ever made any kind of an impression on Milton at all, for example – but I can see where Moshenska is coming from. Indeed, where he’s coming from is often, as we’ve seen, the point.

That can be off-putting and overdone, but given the dilemma faced by any scholar writing on a figure as canonical as Milton, labouring under the weight of centuries of critical overload, one has to accept that some risks must be taken in order to create a Milton for our own time. This is something that Moshenska has done, and as co-creators of the cultural moment I think we have to take the good with the bad.

Notes:
Review first published online February 14, 2022.

When America Stopped Being Great

When America Stopped Being Great
Nick Bryant

Another effort, this time by a Brit, to try to understand what went wrong with America in 2016 and thereafter. That is, how the Age of Trump happened. As the title indicates, the post-mortem looks for continuity, and addresses the question of how much of Trump’s rise was continuous with trends in the Republican Party and how much was a clean break. Bryant sees a through line, calling Trump’s election a revolution “decades in the making.” His was less a hostile takeover and more “a merger and acquisition, with shareholder support and buy-in from a large portion of the customer base.” This is something that had been “brewing for years,” and only came as a surprise to those who had misunderstood and downplayed “the transformative changes that had been overtaking America – politically, economically, culturally and technologically – for the past 50 years.” The roots, in other words, lay in the Reagan revolution. What’s even more disturbing than the process of how we got here, however, is that the “economic, technological and demographic trend-lines all point to politics becoming more polarised and extreme” moving forward. “I fear more American carnage,” Bryant concludes.

Speaking of Universities and The Tyranny of Virtue

SPEAKING OF UNIVERSITIES
By Stefan Collini

THE TYRANNY OF VIRTUE: IDENTITY, THE ACADEMY, AND THE HUNT FOR POLITICAL HERESIES
By Robert Boyers

Headlines about universities, and the Humanities in particular, being in crisis have gone through several phases just in my lifetime. When I was in university in the late 1980s and early ‘90s what I’ve since come to call the first wave of political correctness was all the rage. Dinesh D’Souza’s Illiberal Education: The Politics of Race and Sex on Campus (1991) was a founding document, with Allan Bloom’s The Closing of the American Mind (1987) being the ur-text. What most of this first wave boiled down to was attributing the crisis in the Humanities to their drifting away from teaching the classics of Western Civilization while taking on a lot of postmodern critical theory.

There was a fuss for a while, but the first wave subsided and for many years after political correctness seemed a spent force: little more than a fad without any lasting relevance or impact.

Honestly, at the time that judgment felt like a safe bet.

But new threats were soon discerned on the horizon. Canadian sociologists James E. Côté and Anton L. Allahar wrote a couple of books surveying some of the biggest challenges: Ivory Tower Blues: A University System in Crisis (2007) and Lowering Higher Education: The Rise of Corporate Universities and the Fall of Liberal Education (2011). The main problem, in their view, had to do with declining academic standards and the value (liberally conceived) of the university experience dropping as a result of government underfunding and the “massification” of higher education. This latter was a messy term that basically just meant there were too many unprepared, unqualified, and unmotivated students entering the system. My response at the time was to say that while massification might bring a lot of negative effects with it, it was unlikely our system of higher education could continue without it. This is one reason I’ve never been all that upset about bad students, or even students behaving badly (being rowdy on campus, cheating in their courses, etc.). Their tuition helps keep the universities going just as much as anyone’s.

Underlying these different waves of critiques and analyses, the most essential fact to take note of is that university enrollment in the Humanities has been experiencing a pretty steady decline since the 1970s. Indeed, I’ve been reading about this for so long that I only wonder how it is that many departments, like mainstream churches with aging congregations, still manage to keep going.

Item: In December 2021, the CBC reported that at the University of Western Ontario undergraduate enrollment in the Faculty of Arts and Humanities had dropped by 28 per cent over the last decade.

Item: Writing in the New Yorker, also in December 2021, Louis Menand casually dropped the following alarming numbers:

Between 2012 and 2019, the number of bachelor’s degrees awarded annually in English fell by twenty-six per cent, in philosophy and religious studies by twenty-five per cent, and in foreign languages and literature by twenty-four per cent. In English, according to the Association of Departments of English, which tracked the numbers through 2016, research universities, like Brown and Columbia, took the biggest hits. More than half reported a drop in degrees of forty per cent or more in just four years.

Forty percent (or more!) in only four years? So . . . in another six years there effectively won’t be any degrees being awarded? Or will things level off at some point? A point, I might add, well below where these departments could be sustained.

An earlier signpost was the 2005 PBS documentary Declining by Degrees: Higher Education at Risk, which had a companion volume of essays edited by Richard H. Hersh and John Merrow. Declining by Degrees looked at the higher education industry from a bunch of different angles, beginning with the observation that most people don’t really know what’s going on at universities these days, leaving them “Teflon-coated [and] remarkably immune to criticism.”

If that was true in 2005, and I don’t think it was, it’s certainly no longer the case today amid the tsunami of second-wave PC. Meanwhile, the higher ed industry has been finding itself under increasing financial pressure at the same time as its consumer base, the middle class, is being squeezed more than ever. Those declining enrollment numbers? They’re also an accounting problem.

My use of words like “industry” and “consumer base” in this context is deliberate. That isn’t the sort of language we feel is appropriate to define the mission of our universities. But as David L. Kirp puts it in his Declining by Degrees essay, lacking

a principled defense of nonmarket values, higher education may degenerate into something far less palatable than a house of learning that – as a prophetic report on undergraduate education put it nearly two centuries ago – is “attuned to the business character of the nation.” It may degenerate into just another business, the metaphor of the higher education “industry” brought fully to life. Should that scenario come to pass, America’s undergraduates will be among the biggest losers. But if there is to be a less dystopian future, one that revives the soul of this old institution, who is to advance it – and if not now, when?

It’s a question that’s been hanging now for nearly twenty years. And in truth it’s been kicking around for longer than that. One person who has tried to come up with a response is Stefan Collini, whose essays on the subject are collected in Speaking of Universities. What Collini focuses on are the metaphors Kirp mentions, the language of free-market fundamentalism and of instrumental vs. ideal visions of higher education. The adoption of the language of the market is part of a “great mud-slide” in the vocabulary we use to talk about universities, making it “difficult to find a language in which to characterize the human worth of various activities, and almost impossible to make such assessments tell in public debate.” This is key for Collini, a literary scholar by training and someone sensitive to the different ways language can be used to frame a debate (one chapter in Speaking of Universities is given over to a brilliant close-reading of a government White Paper). For example, the particular debate he’s most invested in, the funding system of British universities, is one now conducted largely in the terms of the market economy, with universities being defined in terms of their contributions to economic growth and how well they serve the needs of industry and commerce.

The matter of funding is indeed important, and ties in with related questions like that of student debt relief in the U.S. No amount of idealism and appeals to nonmarket values will make these issues go away, and there seem to be no practical solutions available. The rising costs of higher education might beggar any attempt at funding or relief. For years now university has been becoming more expensive at a rate far outstripping inflation. Between the end of the Reagan and Obama presidencies, the period of crisis that most of these analyses cover, the cost of attending higher education rose eight times faster than wages. Thomas Frank in Rendezvous with Oblivion is worth quoting at length here:

One thing defenders of the humanities don’t talk about very much is the cost of it all. In the first chapter of her 2010 book Not for Profit, for example, the philosopher Martha Nussbaum declares that while the question of “access” to higher ed is an important one, “it is not, however, the topic of this book.”

Maybe it should have been. To discuss the many benefits of studying the humanities absent the economic context in which the humanities are studied is to miss a pretty big point. When Americans express doubts about whether (in the words of the Obama pollster Joel Berenson) “a college education was worth it,” they aren’t making a judgment about the study of history or literature that needs to be refuted. They are remarking on its price.

Tellingly, not a single one of the defenses of the humanities that I read claimed that such a course of study was a good deal for the money. The Harvard report, amid its comforting riffs about ambiguity, suggests that bemoaning the price is a “philistine objection” not really worth addressing. The document produced by the American Academy of Arts & Sciences contains numerous action points for sympathetic legislators, but devotes just two paragraphs to the subject of student debt and tuition inflation, declaring blandly that “colleges must do their part to control costs,” and then suggesting that the real way to deal with the problem is to do a better job selling the humanities.

Ignoring basic economics doesn’t make them go away, however. The central economic fact of American higher ed today is this: it costs a lot. It costs a huge amount. It costs so much, in fact, that young people routinely start their postcollegiate lives with enormous debt loads.

This is the woolly mammoth in the room. I know that the story of how it got there is a complicated one. But regardless of how it happened, that staggering price tag has changed the way we make educational decisions. Quite naturally, parents and students alike have come to expect some kind of direct, career-prep transaction. They’re out almost three hundred grand, for Christ’s sake – you can’t tell them it was all about embracing ambiguity. For that kind of investment, the gates to prosperity had better swing wide!

No quantity of philistine-damning potshots or remarks from liberal-minded CEOs will banish this problem. Humanists couldn’t stop the onslaught even if they went positively retro and claimed their disciplines were needed to understand the mind of God and save people’s souls. The turn to STEM is motivated by something else, something even more desperate and more essential than that.

What is required is not better salesmanship or more reassuring platitudes. The world doesn’t need another self-hypnotizing report on why universities exist. What it needs is for universities to stop ruining the lives of their students. Don’t propagandize for your institutions, professors. Change them. Grab the levers of power and pull.

What change does Frank want to see the professoriate effect by grabbing those levers of power though? Lowering the cost of higher education isn’t going to be in their self-interest. And while I think Collini’s analysis of the language we use to frame the way we talk about universities is spot on, I agree with Frank that the cost, and even more than that the inflation in the cost, of a university degree is the woolly mammoth in the room. When I attended university nearly forty years ago I could easily make enough to pay for my tuition, rent, and supplies (meaning everything from books to groceries) with the money I made working at a park over the summer, and part-time during the school year doing odd jobs. I don’t think that’s remotely possible for most students today.

The current debate over universities has gone in yet another direction, however, taking us back to relive an earlier crisis in the Humanities. What I mean is the furor over wokeism and cancel culture and Marxism in the academy, which first crested in the early 1990s with that first wave of political correctness. Those glory days are back with a vengeance now, making the idea of universities as Teflon-coated and immune to criticism seem positively quaint. It’s hard to think of a more public battleground.

Why? Because the cannonades of the culture wars are immensely popular. As the Internet has shown, pushing the buttons of outrage is like a drug. No surprise then that The Tyranny of Virtue by Robert Boyers (another English professor) only takes flight when he starts banging his enemies on the head. Those enemies are the commissars of a new political orthodoxy or groupthink consensus “that takes it to be an unconscionable violation of propriety to raise serious questions about anything that has even remotely to do with race or identity when the relevant issues have been officially agreed on by a duly constituted, administratively sanctioned program or committee.” Diversity has become a loaded buzzword, as “at most institutions ‘diversity’ does not refer to diversity of opinion, and . . . diversity officers are often appointed chiefly to ensure that a party line be promulgated and enforced.” What this leads to is a “total cultural environment” (Boyers borrows the term from Lionel Trilling) that sounds like something out of 1984:

What does “a total cultural environment” look like? In the university it looks like a place in which all constituencies have been mobilized for the same end, in which every activity is to be monitored to ensure that everyone is on board. Do courses in all departments reflect the commitment of the institution to raise awareness about all of the approved hot-button topics? If not, something must be done to address that. Are all incoming freshmen assigned a suitably pointed, heavily ideological summer reading text that tells them what they should be primarily concerned about as they enter? Check. Does the college calendar feature – several times each week, throughout the school year – carefully orchestrated consciousness-raising sessions led by human resources specialists trained to facilitate dialogues leading where everyone must agree they ought to lead? Check. Do faculty recognize that even casual slippages in classrooms or extracurricular discourse are to be met with condemnation and repudiation? See to it. Is every member of the community primed to invoke the customary terms – “privilege,” “power,” “hostile,” “unsafe” – no matter how incidental or spurious they seem in a given context? Essential. Though much of the regime instituted along these lines can seem – often does seem – kind and gentle in its pursuit of what many of us take to be a well-intentioned indoctrination, the impression that control and coercion are the name of the game is really hard to miss.

Has it really come to this? I honestly can’t say, as most of what I know about what goes on in universities – meaning what really goes on, on a day-to-day basis, and not the sort of stuff that gets reported on – I only pick up second hand. But my gut feeling is that the kind of thing Boyers describes, an almost violent insistence on performative virtue as the coin of the academic realm, is like the twitch of a death nerve. Tests of moral purity have become a proxy for a sense of public relevance and self-worth.

The future of the Humanities, at least in their academic form, can be summed up in a word: contraction. This is a reversal of the movement toward “massification” that characterized the boom years of academe. It is, in fact, a process that is already well under way, as evidenced by the declining enrollment numbers. Contraction, on the ground as it were, is both depressing and painful for academics to experience, and making matters even worse is the fact that the workplace has become more bitterly divided than ever between the haves and have-nots. In such a toxic environment it’s no surprise that the temperature has risen so much.

I’m glad I went to university when I did. It doesn’t sound like a lot of fun now, and the new directions taken by the Humanities seem to me to be less politically (in)correct than a total waste of time. Talking with today’s undergraduates, it seems as though they’re not reading much of anything. I recently talked to one second-year history student who honestly didn’t know how to sign a book out of the library. This is another major contributor to the crisis of the Humanities: our turning away from a culture of reading. Whatever else they may be, the Humanities are essentially text-based courses of study, which makes them seem positively archaic now.

But even the diminished rump that’s left of the Humanities today is probably unsustainable. In other words, things are not likely to get better. All the railing against cultural Marxism and woke campuses may be inspired by legitimate causes for concern, but in the end it is only sound and fury. Postmodernism is as much a bogeyman today as it was in the ‘90s, and that’s coming from someone who was no fan of it then either. But even better funding models, however well-intentioned, will do nothing to arrest the current decline. The crisis of the Humanities is being driven by broader economic and cultural forces, ones that universities can do little to influence and nothing to stop.

We might still, however, find something worthwhile in the final episodes of what has been a long-running drama that was not without some good seasons. If we live in an age of diagnostics and elegies, these are at least respectable intellectual and artistic occupations, and can be of some consolation in dark times.

Notes:
Review first published online January 25, 2022.