A weathered man slumps into a chaise, periodical perched on his stomach, remote in hand. The subtle hiss of a cathode ray tube crackles to life, followed by the mirage of a picture becoming more defined by the second — a seemingly ubiquitous daily, post nine-to-five ritual that is equally enjoyed by the unemployed.
The phenomenon that Reader’s Digest subscribers and pop psychologists everywhere enjoy labeling “TV addiction” is nothing new; it is just one of many electronic, or more specifically digital, distractions that have entered the mainstream way of life in the last 50 to 100 years.
Many in their early 20s may remember their childhood interactions with emerging technologies. First there were computers, then a parent’s first cell phone, its function far outweighing the burden of its form. Or maybe their first must-have digital toy was a CD player or the original Game Boy.
Those of us who have grown up with information technology are comfortable with these modern gadgets; tragically, the analog dinosaurs who invented or marveled at the invention of such technologies are not. I am often dumbfounded by comments like, “Control-C is ‘copy’ and Control-V is ‘paste,’ you say? Geez! Back in college, it was a big deal if you had an electric typewriter — and if you made a mistake big enough to be beyond whiting out, you had to start all over.”
If reminiscing about a certain movie or political figure “dates” you, such comments take one back to the age of the Pharaoh. Our generation cannot remember a time when we could not microwave something. We are too jaded and spoiled to even pause and marvel at our technological advancement. Thus, digital entitlement is one of two things that constitute the dividing line between past and future generations.
The other generational divider is the possession of both the temperament and the ability to digest, process, categorize and prioritize the constant barrage of information available thanks to the microprocessor. The frequency with which news and personal communications arrive is staggering, but to 21st century children, it is normal. The rings of incoming calls and the vibration alerts of text messages on a cell phone almost have a circadian quality to them. The long silences that were once standard are now awkward as we expect life to be punctuated by technology.
So when someone like Don Tapscott, deemed “champion of the net generation,” author of Wikinomics and member of the sub-100 periodic element generation, makes the claim that rote memorization in education is no longer necessary thanks to the Internet, I have to keep my inner cynic at bay. To even process what he said without bias, I have to put aside the gut reaction stirred by the horrible self-promoting slogan his publisher probably put in the book’s press release, ignore a title that coattails on the popularity of “Freakonomics” and completely bury the fact that the book’s homepage has a blog, a wiki, an RSS feed and everything else cliché about Web 2.0 — all of which do a great job of making the author the digital equivalent of Larry King in a lowered Integra with street glow, covered in a bad airbrush mural.
After ignoring all of that baggage, I actually agree, at least superficially, with Tapscott. He is only saying what most children of the 21st century are already aware of and have been discussing for quite some time. I was first consciously aware of our newfound dependence on computers in middle school, when a teacher of mine yammered, “What’s wrong with you kids? Does no one teach you how to spell anymore? Why can’t you spell?” I retorted with, “Spell check.”
It is very tempting and empowering to realize that we have the potential to outsource information storage to computers. (Who knew outsourcing could be more than domestic IT-department water-cooler taboo?) Currently, computers are not built like human brains; they excel at linear tasks like data storage and recall, far better than our brains could ever hope to do. Meanwhile, computers designed to think more like us (what computer talk calls neural-net processing) are able to learn, create hierarchies and compensate for missing pieces of information extremely well. So the most efficient use of our brain power is to determine personal preferences about which movies are “best”; menial details like their directors, actors, current show times and theater locations should be left to computers, right?
The answer is: partially. As exciting as it may be to partake in and observe this trend of mental outsourcing, it is the youth of the 21st century who are realizing first that computers can’t replace basic knowledge. While I believe Tapscott still wants us to know difficult dates, people, places and so on, relying on our new digital stores of knowledge would require so many interruptions to pull up the necessary information that we would lose our ability to communicate well.
I noticed this when I purchased my first smart phone. Countless times, I would be watching TV with someone, and when a news anchor would say, “The tornado destroyed the only pizza parlor in Burma,” one of us would think or say, “I wonder what it was called?” or “When did they build that?” only to shrug at a smart phone browser or a sleeping computer nearby. “Just because you can doesn’t mean you should” isn’t parental vernacular without reason. Digital outsourcing can and has led to indifference about knowing anything resembling “meaningless” facts, partly to maintain a flow in conversation and partly because thinking you could know something becomes a fair substitute for actually knowing it. I think any 20-something can admit to making some really scary web searches, such as “how to cook pasta” or “define: (extremely basic two- to three-syllable word).” Queries like these remind us that outsourcing should be used in moderation, not as a substitute for knowing some concrete data.
That is not to say the prospects of digital outsourcing are not exciting; the potential to augment our brain capacity is huge. I just refuse to let myself get too carried away with digital hype or trends until they’ve been well established. And as for learning hard facts in school, I don’t think it’s going away anytime soon, at least as long as those from before the digital divide are still teaching.

Michael Boileau is a third-year business economics major. He can be reached at mboileau@uci.edu.
