
THE AGE OF AMERICAN UNREASON
By Susan Jacoby

Laments about the Decline of American Civilization are nearly as old as America itself, suggesting that things have pretty much always been going downhill. But even leaving aside the current president’s well-attested lack of “intellectual curiosity,” there has been some recent evidence beyond the merely anecdotal that things are getting worse. Findings include the fact that fewer than half of all Americans believe in evolution, while one out of four public school biology teachers believes that humans and dinosaurs inhabited the earth simultaneously. A 2005 assessment ranked American fifteen-year-olds twenty-fourth out of twenty-nine countries in mathematical literacy. “Literary” reading, a category that includes Harry Potter and Harlequin, has dropped almost thirty percent in the last ten years among those under twenty-five. Three years into the occupation of Iraq, two-thirds of Americans ages eighteen to twenty-four couldn’t find that country on a map.

That young people feature so prominently in these statistics strengthens the narrative of decline: it doesn’t seem as though things will be getting better anytime soon. This conclusion fits with Susan Jacoby’s breakdown of the two main causes of America’s entry into an age of unreason. The first is technology, and in particular electronic media like television and the Internet, which have collapsed attention spans and drowned our ability to think in a flood of images and distracting infotainment. The screen has become the natural habitat of the human organism, the very fabric of our lives, its reach descending to infants who are hooked by “educational” DVDs before they can even walk or talk. Print and social interaction become the opportunity cost, leading to a corresponding decline in our ability to think for ourselves.

Jacoby’s other villain is less obvious. It’s easy to see why technology, being market-driven, is so addictive and makes its greatest appeal to the lowest common denominator. The resurgence of fundamentalism, defined as faith based on a literal interpretation of the Bible, is harder to understand because it involves such a regression. There were opiates of the people before television, but back at the turn of the last century who would have thought Biblical inerrancy was going to be making a comeback? Jacoby invokes her great-uncle, an astronomy professor who died in 1932, as someone who could hardly have “anticipated that American religious denominations in the twenty-first century would continue to concern themselves with the very questions he thought had been settled by the end of the nineteenth century.” But they did. Why? And specifically why in the United States, alone among fully modernized countries? The early history of American religion that Jacoby goes into doesn’t explain such a development, but rather makes it seem all the more perverse. It is the “greatest irony” indeed that the separation of church and state and the progressive religious liberties of the eighteenth century resulted two centuries later in the rise of an intolerant, anti-rational and anti-intellectual faith-based politics typical of the world’s most backward and impoverished nations. “Based on the prevalence of anti-rational religion, a visitor from another planet would have to conclude that the United States must be a nation of poor, hungry, and warring people who can only look to the supernatural for a way out of their miserable earthly existence.” There may be something to that.

Of course there is nothing terribly new in any of this. Jacoby’s project is to bring books like Richard Hofstadter’s classic Anti-Intellectualism in American Life and Neil Postman’s Amusing Ourselves to Death up to date with the latest news from the front. But her analysis is wandering, and on many points she seems overly grumpy and ill-informed. Most of the early historical background is unnecessary, and she might have been better off writing a shorter book starting with the much-lamented demise of America’s “middlebrow” culture, which is one of her most persuasive and engaging lines of attack.

We are left, finally, with the question of what is to be done. Trying to find a silver lining, Jacoby suggests the possibility of an intellectual backlash, led by the “revenge of the reality-based world.” She thinks we might have arrived at a “‘teachable moment’ – a point at which citizens are attuned, as a result of events that cannot be ignored, to the perils of making decisions based on faith and emotion rather than facts and logic.” This will lead to a call for better education, less dependence on technology and religion, and more responsible government held to account by the active, informed and knowledgeable members of a properly functioning democracy.

Which is one way things could go. Unfortunately such a path is all uphill from here, and the perils we face may not be motivation enough. One thinks of all the current interest in crusades to get us eating better food. Yet even the least rational and intelligent among us know that fast food is expensive as well as unhealthy, and that broccoli and whole wheat bread are good for us, just as most people now accept that human activity is contributing to global climate change. These are important matters bearing directly on our own health as well as that of the planet. But ignorance and “unreason” are not what’s standing in the way of solutions, and more knowledge is not going to change anything. We are as stupid, or as unreasonable, as we choose to be.

Notes:
Review first published March 22, 2008.
