Entry from October 25, 2009
The following was a talk delivered by me at the fall regional meeting of the Philadelphia Society in Indianapolis on October 24, 2009, at a panel convened to consider “The Pursuit of Wisdom in the Age of the Internet.”
You might not know it to look at me, but I used to be pretty smart. In fact, if it doesn’t sound too immodest, I could even say that I was known for being smart, which is better than actually being smart. Everybody seemed to know how smart I was. People I had never met before somehow knew it on saying hello. My reputation for braininess had preceded me, I know not how. I remember once I went for a job interview where, out of the blue, I was asked if I had ever met anyone more intelligent than I. Stunned by such a stupid question, I actually tried to answer it. That’s how smart I was. In other words, not very. But if, today, I would know better than to try to answer a question like that, it is nevertheless true that I was in other ways quite a lot brighter then than I am now. Now, I can almost feel my brainpower diminishing by the day. People no longer greet me with that slightly intimidated look that is the reward — if you can call it a reward — of those with a reputation, however undeserved, for intelligence.
The powers of reasoning that remain to me are just about sufficient to enable me to understand that this decline may be owing to natural causes. The late Mitch Hedberg had a joke about how not-smart it is to say, “This is a picture of me when I was younger.” Every picture of you is a picture of you when you were younger, dummy! So to say that I am not so young as I was is similarly tautologous, though it is no less an understatement of the truth for that. Brain cells diminish with age and hard wear, so perhaps I should expect to be stupider than I was in my 20s and 30s. But my deep suspicion is that this stupidity is also owing to the way in which I use my brain these days, which is by sitting in front of a computer screen for 12 or 15 hours a day, clicking between websites and writing or copying based on what I read there.
Back when I was smart, I didn’t do that. I was almost 40 when I bought my first computer, and then, for the next ten or so years, it was more a substitute for a typewriter than the all-consuming time-Hoovering device it has since become. I was past 50 by the time I largely gave up the printed versions of newspapers and periodicals for their on-line versions. Most of my life, and pretty much all of the smart part, has been spent with the printed word, not a cathode-ray tube or LCD. Is this a case of post hoc ergo propter hoc? That phrase, as my colleague Professor Kopff can confirm if I am no longer smart enough to remember, means after this, therefore because of this. In other words, the fact that I am dumber than I was before I took up the computer doesn’t mean that the computer has made me dumb, though I want to believe that it has. Those who dismiss that belief always cite the bit in Plato’s Phaedrus where Socrates says that the invention of writing will wreck people’s ability to remember. To them I say, well, hasn’t it? When Milman Parry visited the Balkans in the 1930s, he found that, among the illiterate, an oral epic tradition like the one that produced the Homeric poems had survived, and that Bosnian bards could reel off volumes of their own and others’ verse from memory. Because they were illiterate.
But “science,” the god of smart people, tells us that the Internet is not making us dumber. A study by Dr Gary Small and others published in The American Journal of Geriatric Psychiatry tells us that surfing the web is better for the brains of middle-aged and older people than reading books. Professor Small, of the Semel Institute for Neuroscience and Human Behavior at the University of California, Los Angeles, told the London Daily Telegraph that “Internet searching engages complicated brain activity, which may help exercise and improve brain function.” Just the other day, Professor Small and his colleague, the UCLA researcher Teena D. Moody, were back in the news with another study to the same effect. “We found that for older people with minimal experience, performing Internet searches for even a relatively short period of time can change brain activity patterns and enhance function,” said the professor. “The results suggest that searching online may be a simple form of brain exercise that might be employed to enhance cognition in older adults,” said the researcher. In the current issue of the Wilson Quarterly, the economist Tyler Cowen points to what’s called the Flynn effect — that is, the tendency throughout the developed world for IQs to rise, generation by generation — as evidence that in the age of the Internet most people, unlike me, are not getting dumber.
Intuitively, however, I feel that my time spent online has robbed me of at least some of my powers of concentration, and I believe that a very significant component of intelligence, if not the principal one, is the power of concentration. Or, to put it the other way around, stupidity is the inability to focus, and my ability to focus has become severely compromised. Professor Cowen pooh-poohs another research finding, that periodically checking your e-mail lowers your cognitive performance level to that of a drunk. “If such claims were broadly correct,” he writes, “multitasking would pretty rapidly disappear simply because people would find that it didn’t make sense to do it.” Well, it doesn’t make sense to get drunk either, but people haven’t stopped doing that, so far as I can tell.
Indeed, the experience of the Internet seems to be like that of a drug in other ways, most notably in being addictive. I am happy noodling away on my computer, but, as with all drugs, the happiness is a product of what that artificial focus keeps you from attending to: the kinds of experience it has taken you away from. All attention is choice, but the easy choices of the online world rob you of the ability to make the harder ones, which produce a different kind of knowing — for example, the kind that comes from taking the time to plow through a Victorian novel, learning about its multitude of characters, absorbing the dense texture of its prose, and picking up the little incidental facts about Victorian life that you might otherwise know only if someone extracted them from the novel for you and put them up on the Internet as bullet points — and then only if you ever happened to stumble on the site.
This is the way in which focus and intelligence are one, since focus is what is required to produce the depth of knowledge it takes to understand a culture and a set of assumptions about the world different from your own. And it is just that which seems to me to be lacking among those who have been educated by or with the Internet. It is the difference between information and knowledge, a difference that few people are any longer well equipped to comprehend. That the distinction is increasingly an arcane one, however, is not the fault of the Internet. Or not primarily, anyway. Our education system, especially education in the arts and humanities, has been doing its level best to obscure it for a generation now. For it, too, has no interest in understanding other cultures, or even our own as it was until 40 years or so ago. That may seem a paradoxical thing to say in the era of multiculturalism, but the level of engagement with other cultures to which the multiculturalists want to take us is pretty superficial, and they tend to ignore or minimize differences, for instance those of primitive honor cultures, that are not politically correct.
Actually, it’s not the Internet itself but the culture of the Internet that is to blame. I hate to keep picking on Tyler Cowen, because I am a fan of his economic thought, but I can’t resist citing his use of the example of Mozart’s Don Giovanni. He acknowledges that the opera “represents a great achievement of the Western canon” — gee, thanks, says Mozart — but he points to the fact that it takes three or four hours to watch and listen to in its entirety, even though this is still a lot less time than it takes to read a novel by Dickens. And it is in Italian. And good seats are expensive. But never mind all that. Just look at what the Internet can give you in return for not making the effort to see and appreciate Mozart’s opera. This is what he writes:
Instead of experiencing the emotional range of Don Giovanni in one long, expensive sitting, on the Web we pick the moods we want from disparate sources and assemble them ourselves. We take a joke from YouTube, a terrifying scene from a Japanese slasher movie, a melody from iTunes, and some images — perhaps our own digital photos — capturing the sublime beauty of the Grand Canyon. Even if no single bit looks very impressive to an outsider, to the creator of this assemblage it is a rich and varied inner experience. The new wonders we create are simply harder for outsiders to see than, say, the fantastic cathedrals of Old Europe.
Can it really be that he is comparing these bits and bobs of electronic effluvia to Chartres cathedral because, to someone who has never seen Chartres cathedral, or Don Giovanni, the “inner experience” they give him is as “rich and varied” as that of a great work of art is to someone who is equipped to appreciate it? I’m afraid he is. And he is very far from being alone. University English courses today routinely treat Shakespeare’s plays and Batman comics as being on the same plane and not meaningfully different from each other.
“It’s not so much about having information as it is about knowing how to get it,” writes Professor Cowen. But if you don’t already know the difference between King Lear and Batman, or between Chartres cathedral and a computer image of the Grand Canyon — or the Grand Canyon itself, for that matter — all the information in the world is going to be useless to you. Or, if not quite useless, useful only in trivial ways. As a professional critic, I notice that criticism itself is changing. I’m old enough to have been trained up to the job in the days when it was still thought by most if not all people that the object of criticism was, to use the title of our conference this weekend, the pursuit of truth.
Not definitive truth, not conclusive truth, not truth that left no room for other truths, but still truth: truth, perhaps, even as beauty, as Keats saw it, which, disagreeing with Peter Wood, who spoke this morning, I take to be not a lie but a poetic truth. In any case, truth certainly as something distinguishable from error. Now that a generation has grown up believing that that kind of truth is invidious, or “privileged,” or authoritarian, or hierarchical, or, God help us, “patriarchal,” and that everybody has a right to his or her own truth, what we have instead of truth as the purpose of criticism — where it is not simply Marxist political analysis — is “intertextuality.” That is, we harvest as many points of connection as we can think of between King Lear and Batman, and between both of them and the universe of texts awaiting us out there on the Internet — and then we put in the links between them.
It’s a highly idiosyncratic exercise, since there are an infinite number of texts and an infinite number of possible connections. That’s how you end up with the grab-bag outlined by Professor Cowen: “a joke from YouTube, a terrifying scene from a Japanese slasher movie, a melody from iTunes,” and so forth. The only real connection between them lies in the fancy of the critic, who thus steps forward as the real hero of a critical enterprise whose only aim is to enhance his own “rich and varied inner experience.” We go to the Internet to find reflections of ourselves, not to acquire knowledge of others — and in particular of those others who impressed previous generations as being among the greatest of their predecessors in the pursuit of truth. If we have stopped valuing their accomplishments, which is what it means to value them as equivalent to a joke on YouTube, we have already reached a middle stage on the road to forgetting them entirely, which is why — I think — so many of the formerly smart are having trouble remembering how to read.