Opinion

The texting dead

iZombies, by Leah Tiscione for Sunday PostScript


On July 19, 1989, United Airlines Flight 232 from Denver to Chicago crash-landed in Sioux City, Iowa, broke up and burst into flames. A third of the 296 people on board died, but all of them might well have perished, thanks to information overload.

The failure of the plane’s rear engine led to the extraordinarily unlikely breakdown of all three of the independent hydraulic systems. The two pilots were bewildered. One of them, Alfred Haynes, later testified that the duo had never been trained for such a scenario, and as a result both were called upon to execute hundreds of decisions under unbelievable stress. The difficulty of deciding what to do next caused them to freeze up. British psychologist David Lewis said the two men were suffering from information fatigue syndrome, or IFS.

Perhaps what saved them all was a pilot who wasn’t even flying the plane: Dennis Fitch, an off-duty flight instructor who happened to be aboard as a passenger. He volunteered to help, and the fresh eyes he brought to the task proved critical; he hadn’t been bombarded with all of the information the other two pilots and the flight engineer were trying to handle.

Haynes later estimated that Fitch had cut the pilots’ information-processing requirements by 30%, enabling them to reduce their stress level and save most of the souls on board.

“Having too much information can be as dangerous as having too little,” Lewis concluded.

So if you can’t get any work done because you can’t stop checking your Twitter, Facebook, texts, e-mail and Tumblr: It could be worse. Nevertheless, the stress of constant and infinite connectivity — we’re all RAM and no hard drive, all present and no history, all one-liners and no reflection — will continue to have consequences.

“Our daily schedule, dividing work time from time off, is discarded. Rather we are always on,” writes Douglas Rushkoff in “Present Shock: When Everything Happens Now” (Current), an attempt to grapple with our current predicament. “Our boss isn’t the guy in the corner office but a PDA in our pocket. Our taskmaster is depersonalized and internalized — and even more rigorous than the union busters of yesterday. There is no safe time.”

Some consequences of this permanent state of being plugged in seem bizarre: Take the drop in gum and magazine sales. Consumers waiting in line at the grocery store are ignoring the tempting racks of impulse buys that are meant to attract them while they wait. Instead, they’re checking their smartphones. Gum sales fell 5% last year. Newsstand magazine sales, which were hurt by the recession, looked like they were coming back but plummeted another 12% in just the second half of last year. Magazine companies are scrambling to find other places in the store to sell their wares. One retail analyst calls smartphones “mobile blinders.”

Even if you don’t work at Time or Trident, though, you need to be very worried about what Alvin Toffler called “information overload” way back in 1970’s “Future Shock,” whose title inspired Rushkoff’s.

The increased stress of being always on has been linked to adverse health effects, but what may be more surprising is the drop-off in creativity that comes with present shock: Eureka moments of sublime breakthrough tend to arrive the way Don Draper once described the process on “Mad Men”: “Think about it deeply, then forget it . . . then an idea will jump up in your face.”

A researcher named Kevin Dunbar analyzed hundreds of hours of video of scientists and concluded that “the vast majority of breakthroughs did not occur when the scientists were alone staring into their microscopes or poring over data,” writes Rushkoff, “but rather when they were engaged with one another at weekly lab meetings or, presumably, over lunch.”

The deep concentration that sets up that moment of relaxed inspiration is becoming increasingly elusive in an era of fragmented attention spans.

Teresa Amabile of Harvard Business School studied 12,000 diary entries from 238 workers and found that people create most effectively when they concentrate intensely, not when they’re interrupted frequently. In a University of Michigan study, people whose attention was divided among several projects at once both took longer and made many more mistakes than people who completed each job before moving on to the next.

Yet we’re addicted to the “dopamine squirt” of that little bit of attention paid to us every time someone pings us. Real-world relationships suffer: witness the sad phenomenon of two people sitting in a restaurant together, each staring at a mobile device.

“In the real world,” Rushkoff notes, “94% of our communication occurs non-verbally. Our gestures, tone of voice, facial expressions, and even the size of our irises at any given moment tell the other person much more than our words do.”

As Rushkoff reminds us, the pride we take in our multitasking ability is a lie. You’re either here or there. We are monotaskers, period. You can shift your attention back and forth between e-mail and “Game of Thrones” if you want, but you can’t process both at the same time. A quick glance at the sidewalks of Manhattan, where you can easily pick out the absurdly slow-moving iZombies checking their smartphones, reveals that even the simplest, most brainless task, like walking, is severely impaired when attention is focused elsewhere.

Rushkoff isn’t the first to worry that our brains are being rewired as we struggle to keep up with the moment. Nicholas Carr, for instance, noted in his book “The Shallows: What the Internet Is Doing to Our Brains” that even when he sat down to read a book, he found himself hungering to replicate the online experience of jumping around to different things.

“Any kind of thought process that requires focus on one thing is what is being disrupted,” he told PBS, “and, unfortunately, another thing brain science tells us is that the process of paying attention, paying deep attention, activates a lot of our deepest thought processes. Our long-term memory, the building of conceptual knowledge, critical thinking: all of those things hinge on our ability to pay attention.”

Moreover, our obsession with the present is self-defeating. “Our efforts to keep up with the latest tweet or update do not connect us to the present moment, but ensure that we are remaining focused on what just happened somewhere else,” Rushkoff writes. “We guide ourselves and our businesses as if steering a car by watching a slide show in the rear-view mirror.”

One study found people consumed three times as much information in 2008 as they did in 1960. The machines that (increasingly) rule us, rather than the reverse, seem built to maximize the flow and storage of information. But in the elegant phrase of virtual-reality pioneer Jaron Lanier, writing in his book “You Are Not a Gadget,” “Information is alienated experience . . . Stored information might cause experience to be revealed if it is prodded in the right way. A file on a hard disk does indeed contain information of the kind that objectively exists . . . But if the bits can potentially mean something to someone, they can only do so if they are experienced. When that happens, a commonality of culture is enacted between the storer and the retriever of the bits. Experience is the only process that can de-alienate information.”

The sense of alienation from even our actual, literal space troubles Rushkoff. In a play on schizophrenia, he calls the condition of being mentally in several places at once “digiphrenia.” The neologism is a sign of Rushkoff’s weakness for buzzwords, in the tradition of authors who write not so much to be read as to be hired as gurus. The author expends much of his energy trying to squeeze and reshape modern culture to fit one of his idea-molds (“overwinding,” “fractalnoia,” “chronobiology” and “narrative collapse” are other coinages that seem unlikely to catch on).

Still, Rushkoff makes some resonant points. Such as this one: “If everyone in the world is your Facebook friend, then why have any Facebook friends at all? We’re back where we started. The ultimate complexity is just another entropy.”

Shutting out is the new hooking up: pare those virtual connections down to a manageable level. Google+ is meant to be a more restricted, less frenetic answer to Facebook; two years ago, a group of NYU students launched Diaspora, a less centralized, open-source social network that aims to do a better job than Facebook of respecting privacy.

A risk of overconnection: you may start seeing alarming connections everywhere (this is “fractalnoia”), to the point of becoming a conspiracy theorist. “Simultaneity often seems like all we have,” writes Rushkoff.

“That’s why anyone contending with present shock will have a propensity to make connections between things happening in the same moment — as if there had to be an underlying logic. On the business-news channels, video footage of current events or presidential press conferences plays above the digital ticker tape of real-time stock quotes. We can’t help but look to see how the president’s words are influencing the Dow Jones average.”

Don’t connections strengthen us socially, each filament making a stronger web? Rushkoff says that at the dawn of the 21st century, he was “one of the many pushing for more connectivity and openness . . . It seemed the only answer for our collapsing, top-down society was for everyone and everything to network together and communicate better and more honestly . . . our society could emulate a coral reef — where each organism and colony experiences itself as part of a greater entity.”

Now he takes a more cautious stance: today it’s evident that a tweet can ruin a company (or a congressman), a “fat finger trade” can cause a stock market crash, the collapse of a small economy in the Mediterranean can spark global turmoil or a handful of determined terrorists can bring down a group of office buildings and push a gigantic economy into a recession.

Even bothering to write an entire book about the cult of the present seems a bit eccentric and obtuse. “Most of my audience . . . will not be getting this far into the text,” Rushkoff says, accurately, on the penultimate page. The public will read excerpts and get the gist, like the student of Rushkoff’s who said he got the gist of “Hamlet” by reading only the “to be or not to be” speech.

In the years it took him to write “Present Shock,” Rushkoff could have simply joined the flow of the present: “I could have written dozens of articles, hundreds of blog posts, and thousands of tweets, reaching more people about more things in less time and with less effort. Here I am writing opera when people are listening to singles.”

It looks like there’s no going back now. Not when, for instance, one of humanity’s oldest stories is being pulverized into tweets. Consider the (three, so far) tweets of Pope Francis via @Pontifex. After thanking the faithful in his first tweet, he moved on to his second thought: “Let us keep a place for Christ in our lives, let us care for one another and let us be loving custodians of creation.”

Remember when they came out with those vernacular “Good News” Bibles and the dumbing-down seemed like an affront? Those volumes will soon seem way too challenging now that you can take the Bible in bite-sized form.

Kyle.Smith@nypost.com