The debate erupted again recently, incited by a column from NY Times editor Bill Keller and a Wired article by Nicholas Carr, both of which suggested that the Internet is eroding our mental capacity.

Both men are serious people, who write well and thoughtfully. They give voice to something we all feel. From senior citizens at the Apple store to high-tech hippies collecting music on vinyl, we all take comfort in the past while we parse the future. However, the evidence is clear: Technology makes us smarter in meaningful ways.

Outsourcing our brains to the cloud

Socrates liked to be a “gadfly”. He went around ancient Athens arguing with everybody he met. He would play possum, asking seemingly innocent rhetorical questions and then berating his victims with ever more queries. It’s no wonder they made him drink hemlock!

So, according to Plato at least, he was not particularly happy about the invention of writing, which he felt could not adequately substitute for human conversation and which he feared would contribute to an overall intellectual decline. As he said in the Phaedrus:

“I cannot help feeling, Phaedrus, that writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence.”

In his NY Times column, The Twitter Trap, Executive Editor Bill Keller points out that, in a similar way, the invention of the printing press reduced our capacity for memory. He also notes that we are freeing up our faculties to do other things (although his examples leave some doubt about whether he actually understands his own point).

Whatever your sentiments, it’s clear that technology does things for us and that we get worse at those things. As Keller says, “we are outsourcing our brains to the cloud.” So it’s natural to feel concerned, but our collective choice to adopt these advancements indicates that, despite our trepidation, we generally see progress as a good thing. I believe it is.

Altering our brains

Nicholas Carr makes a different argument in his article, The Web Shatters Focus, Rewires Brains. He cites research showing that the web encourages us to jump from one thing to another, and that this jumping around inhibits our ability to take in information. Our brains just aren’t that good at switching.

However, what Carr seems to miss, and I consider this a crucial point, is that the research he cites focuses either on unusually enthusiastic multitaskers or on people who were in the midst of web activity. Those are hardly realistic conditions.

We don’t surf the Internet during important meetings (at least not those that we consider important). We don’t multi-task while working on a hard problem. We concentrate on things that we value. Technology merely helps us do that (much to the consternation of those who don’t share our priorities).

It’s worth noting that arguments similar to Carr’s can be made about almost anything. For instance, someone who spends too much time reading will see his social skills diminish (especially if he reads while in conversation with others). And that, ironically, is actually the crux of Keller’s true problem with the web.

Untangling the rainbow

In reality, Keller isn’t really worried that digital technology is making us dumber. He writes, “My own anxiety is less about the cerebrum than about the soul…”

His main concern seems to be that his teenage daughter’s ability to appreciate visceral, emotive meaning will be diminished by joining Facebook and immediately accumulating nearly 200 “friends.”

Again, this is not a novel concern. In fact, it was voiced long ago by John Keats in his poem Lamia, where he decries Isaac Newton’s demystifying of natural wonders through scientific investigation. Nearly 200 years later, Richard Dawkins gave a rebuttal in Unweaving the Rainbow. (As I said, this truly is a never-ending story.) Nobel Prize-winning physicist Richard Feynman also weighed in on the subject.

For my part, I’m not very concerned about digital technology replacing our passions. Computers might crash, but they won’t come in drunk or screw their assistant on the conference room table (when they invent that, I’ll start to worry). Excel might diminish my computational skills, but it has done nothing to tame my nasty temper.

I’d like to think that my more positive emotions are similarly unaffected (although my wife may enthusiastically argue otherwise).

The Flynn effect

One thing that’s conspicuously absent from the discourse is the very real evidence that we’re getting more adept. Our IQs are rising, a phenomenon known as the Flynn effect. Moreover, as Steven Johnson points out in his book, Everything Bad Is Good For You, when you watch old TV shows it’s obvious that media has become more intellectually challenging.

So it’s curious, to say the least, that in his Wired article Carr chooses to define mental capacity in this way: “The depth of our intelligence hinges on our ability to transfer information from working memory, the scratch pad of consciousness, to long-term memory, the mind’s filing system.”

With all due respect, I think the Pulitzer nominee is pulling a fast one here – defining his terms to suit his argument. Transferring information between our short-term and long-term memory is what we’re actually very, very bad at (which is why we take notes). We’re much better off outsourcing that kind of work to computers.

Smart is as smart does

As I explained in a previous post about what makes us smart, humans excel at pattern recognition. A particularly salient example is chess grandmasters, who have no better memories than the rest of us but can perform amazing feats of recall in chess games, thanks to a device that psychologists call chunking.

Further, research into decision making shows that we only transfer information between our working and long-term memories as a last resort. We usually bypass that circuit in favor of accessing patterns we’ve internalized. Neuroscientist Antonio Damasio calls this the somatic marker hypothesis.

So what patterns do we internalize? Naturally, the ones we choose to specialize in. And that is what makes all the difference in the world.

Trading tasks

Despite the hand-wringing about lost skills, the evidence that technology makes us dumber is selective and scant. We don’t kill our own meat (well, except for Mark Zuckerberg) or make our own shoes, but surely that doesn’t make us dumber than prehistoric man.

In his book, The Rational Optimist, writer Matt Ridley remarks that King Louis XIV of France had 498 people preparing his dinner, giving him 40 dishes to choose from. Yet today’s average urban dweller can choose from far more. Instead of doing everything for ourselves, we each now specialize in a few skills that we offer to many.

Web developers bid up the price of art so that artists can go to trendy fusion restaurants, so that the sous chef can spend money on the things that pay newspaper editors’ salaries. They, in turn, show their appreciation by writing snarky pseudo-intellectual columns. That’s the way the world works these days. In other words, what technology does is let us choose the areas in which we want to be smart and the areas in which we would rather be ignorant. That’s a cause for celebration, not lament.

Greg Satell is a blogger and a consultant at the American online media site Digital Tonto. You can read his blog entries at http://www.digitaltonto.com