The other day, I found myself reading the back-to-school edition of The New York Times’ Circuits section with my usual stunned incomprehension and a heightened sense of alarm. The electronic gadgets that have become standard equipment for a 21st-century undergraduate bear generic names, brand names, acronyms, model and serial numbers (DVP-CX995? PIXMA MP760?) that no doubt mean something to many, but nothing whatsoever to me. A Times reporter interviewed a Duke University undergraduate named Eddy Leal, who confessed to owning three laptops with multifarious accessories (“It’s like another world in my dorm room”) as well as, of course, a cellphone and a 500-song iPod which are, he says, “with me no matter where I am – I wouldn’t mind if I could have them implanted in my body”.
“I know, it’s kind of crazy,” said Leal of his three-computer installation, guessing that he was eccentrically overwired – but guessing wrong. Other students in this same article boasted even more bewildering batteries of personal hardware, far beyond my vocabulary to describe. Returning college students in the United States now spend more than $8 billion to rewire themselves, two thirds of what they’ll spend on textbooks, and of course each year the gap narrows.
The long-term implications of mechanised education are overwhelming, but first let’s deal with the subject of silence. I’m not ancient, yet my college education 40-plus years ago was pretechnological, by current lights antediluvian. Though telephones and television had been invented, none of us, not even the most affluent, had installed them in our rooms, far less on our bodies. My fraternity house contained one of each, a battered basement television set with a small clientele and a payphone next to which we waited for hours, playing cards and drinking beer and coffee, for our turns to call home or plead our cases with girls.
Cellphones and email had not yet made their appearance in science fiction. Ninety-eight per cent of communication was verbal and face-to-face. If you had an urgent message for someone, you stuffed a note in his box at the student union or trudged half a mile across an icebound campus and hoped you’d find him in. Only juniors and seniors were allowed to drive cars.
Winter or summer, that was a lonely walk, silent, a time to think without threat of interruption. Blessedly disconnected. “Alone with his thoughts”, now a literary anachronism, was a commonplace reality. Without that freedom to disconnect, then and now, I for one would have gone mad. And at this point most readers under 45 may disconnect. How could Eddy Leal understand that if a cellphone and an iPod were implanted in my body, I’d pay virtually any price to have them removed?
Computers and allied technologies have created the most intimidating generation gap in human history, one so wide and so rapidly created that I stand staring across the chasm like an aborigine watching Krakatoa split the sky.
Not long ago, it was generally accepted that humanity’s most creative achievements, from art and poetry to major scientific discoveries, were the precious fruits of solitude. But in a single heartbeat on history’s timeline, this sacred, fecund privacy has become the unpardonable social sin for the generation on which future creativity depends. I’ve tried to explain to young people that unspoilt privacy is the most important thing a person like me could ever ask from his life. Just so they know where I stand. Urgent warnings that technology is recklessly exposing our darkest secrets to every eager peeping Tom – official, corporate or criminal – fall on deaf (or at least numb and overtaxed) ears. The traditional concept of privacy, which anchors America’s Bill of Rights, is a tough sell to technophiliacs who spend half their waking hours on sites such as MySpace and YouTube, recklessly exposing themselves.
A recent US study (published in January 2010) found that eight to 18 year-olds log an average daily exposure of just under 11 hours of electronic media. An increase of two hours daily since 2004, it includes computers and social networks, cellphones, instant messaging, television, video games and iPods. Media consume nearly all their waking hours when they’re not in school. Privacy has deep, deep roots in Western civilisation, yet a few mediocre gadgets uprooted it in less than a decade. Who knew the young were so lonely, so susceptible, so desperate for connection? Who’s to blame for their loneliness, for their seduction and metamorphosis into electro-cyborgs who bear only a physical resemblance to their parents? What sort of lives were they leading before they were wired? It’s as if prisoners buried in the dungeons of the Chateau d’If, with no previous communication except tapping on the stone walls of their separate cells, were suddenly issued mobile phones with email. What else but a compulsive frenzy of messaging, no content required?
Digital products seemed harmless enough in the beginning, meeting obvious demands for faster, more efficient commercial communication. Business will have its way. But the personal computer and all its derivative technology were not so obvious, not to most of us now left behind. We were sure it was boring – to liberal arts majors of my vintage, most tools more complex than a hammer are invisible. We never dreamed it was more addictive than heroin. “I lost my cellphone once,” a 25-year-old woman with a master’s degree told a reporter. “I felt like my world had just ended. I had a breakdown on campus.”
Some of the wizards who fathered the digital revolution have had misgivings. The late Joseph Weizenbaum, an MIT mathematician and computer scientist who authored one of the first conversational computer programs, became a profound sceptic about technology’s influence on the human condition. Weizenbaum, who was a child in Nazi Germany, believed that obsessive reliance on technology was a moral failure in society and an invitation to fascism.
Weizenbaum’s scepticism was shared by the American computer pioneer and mogul Max Palevsky, who died recently at 85. Palevsky, founder of Scientific Data Systems and an early backer of the computer-chip giant Intel, told an interviewer in 2008, “I don’t own a computer. I don’t own a cellphone, I don’t own any electronics. I do own a radio.” Given decades to reflect on what they wrought, it’s eerie that many of the scientists who created our electronic cocoon sound like the scientists who worked on the atom bomb at Los Alamos.
The wailing of the wire-wary only aggravates the captive multitudes and widens the dreadful gap. But we can’t just fold our tents and quit the field, because we, the pre-wired generations, bear most of the blame. We betrayed them. We turned them over to habit-forming, mind-altering, behaviour-warping gizmos when they were helpless children. There was almost no resistance. Politicians, colleges, school boards, doomed publishers, libraries and media all welcomed these technologies uncritically, enthusiastically, like Stone Age savages fainting with wonder over a transistor radio. Americans have always been suckers for technology – our love affairs with automobiles, television and nuclear power haven’t turned out well either. But this was the most pitiful submission, and may prove the most fateful.
No one denies the impact of these new devices, or their usefulness. Who at my age, watching precious time fly, wouldn’t bless email for the pointless, time-consuming conversations it replaces? Who denies that Barack Obama’s epic rout of the Republicans would have been impossible without his mastery of internet communication? But with truly revolutionary technology no one stops to factor in the human cost.
Chronic, epidemic obesity among American children, along with unprecedented levels of juvenile diabetes and heart disease, coincides exactly with the advent of “personal technology”. An alarming study that followed 4,000 subjects for three decades indicates that 90 per cent of American men and 70 per cent of American women will eventually be fat.
Worse news is that the American mind is emulating its body – it’s turning to suet. A few years ago the educational benefits of the new technology were hyped hysterically, with futurists and investors predicting an intellectual renaissance anchored by computers.
The reality seems to be just the opposite. Though the educational potential of the internet is limitless, it’s becoming apparent that students use technology less to learn than to distract themselves from learning, and to take advantage of toxic short cuts such as research paper databases and essay-writing websites. Entrance exams administered by ACT Inc establish that half the students now entering college in the US lack the basic reading and comprehension skills to succeed in literature, history or sociology courses. Reading and writing skills among eighth graders decline each year as internet penetration rises. Only three per cent now read at a level scored as “advanced”, and the state of Maine recently scrapped its eighth-grade writing test because 78 per cent of the participants failed. Half the teenagers tested by the advocacy group Common Core could not place the Civil War in the second half of the 19th century, a quarter drew a blank on Adolf Hitler, a fifth failed to identify America’s enemies in the Second World War. A third of America’s high school students drop out – one every 26 seconds – and two thirds prove incapable of higher education.
Doubts are spreading, though perhaps too late. In the spring of 2007, Liverpool High School in upstate New York made national news when it abandoned its laptop programme as a failed experiment and went back to books. “After seven years there was literally no evidence it had any impact on student achievement – none,” said Mark Lawson, president of the Liverpool school board. While their test scores stagnated, Liverpool students used their laptops to cheat on exams, message friends, hack into local businesses, update Facebook profiles and download pornography.
“The teachers were telling us that when there’s a one-to-one relationship between the student and the laptop, the machine gets in the way,” Lawson concluded. “It’s a distraction to the educational process.”
There’s so much more to dislike about our cocoon woven of wires, our house built of chips. Thieves, grifters and predators of every description have flourished in the cyber-forest; the signature crime of the 21st century is identity theft. The internet is the greatest gift to the paedophile community since the Vatican stood its ground on celibate priests.
But if you think these are all quibbles compared with the joy and comfort your hardware provides, try out your polished indifference on the prospect of environmental apocalypse. “E-waste”, as it’s now called, is the sobering dark side to even the rosiest view of an all-wired future. In the US in 2005, more than 1.5 million tons of discarded electronic devices ended up in landfills, where hi-tech’s toxic metals, including lead, mercury, cadmium and beryllium, find their way into the soil, the water tables and the air.
In China, which produces a million tons of e-waste annually and imports, for profit, 70 per cent of the world’s lethal garbage (estimated at as much as 50 million tons), whistle-blowers are already blaming high rates of birth defects, infant mortality and blood diseases on e-waste. With their reliance on instant obsolescence and limited commitment to recycling, hardware manufacturers create an unmanageable flow of poisonous trash that the planet can’t possibly tolerate: Americans alone discard 100 million computers, cellphones and related devices every year, at a rate of 136,000 per day. Half a billion of the US’s old cellphones sit in drawers, dead but not buried. There is no place and no plan for all this stuff. Our world has been wired by wildly inefficient technology – it takes roughly 1.8 tons of raw materials (fossil fuels, water, metal ores) to manufacture one PC and its monitor, and mining the gold needed for the circuit board of a single cellphone generates 220lb of waste. These industries are self-evidently unsustainable; they are not environmentally sane.
The case against technology is not a difficult one to make, not even for someone from a generation like mine, which chose to fry millions of healthy neurons with LSD, psilocybin, cannabis and cocaine. The walking wounded from that excess are still around, but most of us kicked our habits and descended safely from those treacherous highs.
High tech is a habit too new to boast any record of survivors, recovering addicts, successful rehabs. So far, no one’s coming back. In the words of recovery programmes, users have yet to acknowledge that they have a problem. Or that there is a problem. Staring for hours at glowing squares, gossiping with needy strangers, poking away at little keyboards, playing half-assed violent games – does this strike anyone as an interesting and honourable life, or even a preparation for one? And the answer, more often than not, would come back, “Sure, what’s your problem?”
With that last outburst, I probably sacrifice half the readers I have left. But if you’re offended or threatened, console yourself with the impotence and rapid extinction of my kind. We pose no threat to your habit.
Technology’s sceptics are ageing and thinning out. Soon, by conversion or attrition, they will vanish. Soon, when everyone is born wired into the hive, no more of them will appear. All the more reason to have our say, leave our protests on the record, exit cursing and fighting.