The Virtual World: Darwin Says We Can Handle It
Just Behave - A Column From Search Engine Land

One of the things that has always struck me when I look at online user behavior is the scattered, frenetic scanning of the page. This becomes particularly clear when you look at eye tracking results. We quickly shift our eyes from cue to cue, picture to picture, headline to headline, link to link. We focus quickly on what is important and relevant, and discard what is off task. We don’t read in a linear manner; we “berrypick” in a seemingly random way. Many user behavior analysts have suggested that this represents new behavior for humans. But the more I look at it, the more I think it’s actually more natural than the learned behaviors we’ve become accustomed to over the past few centuries.
Multi-tasking has become a buzzword with the advent of computers and the Internet. The ability to split our limited attention among various tasks, doing a quick mental triage to determine which tasks receive slices of our rational focus, has emerged as a core capability for coping in our increasingly harried world. But the ability to multi-task is nothing new. Humans evolved into the dominant species in part because of it. Our brains are marvelously complex mechanisms that allow us to juggle multiple sensory inputs, moving some into the spotlight of rational attention when we need to and shifting others into a subconscious queue. There, without our being conscious of it, the brain continues to monitor their status through our senses, quickly promoting them back to the conscious level when required. The brain is actually a very sophisticated multi-tasking device.
Gord Hotchkiss, please see the gate agent…
Let me give you an example. You’re in the airport working on your laptop, and as you focus on your work, the constant gate announcements blur into the background. If I interrupted you and asked you to tell me what the last one was, you probably wouldn’t remember. You weren’t consciously aware of it. But if suddenly your name is announced over the speakers, your brain immediately alerts you that your attention is needed and your conscious mind shifts from the work you were doing to listen intently to the announcement. Your brain was monitoring it all the time. It’s called selective perception.
Driving and daydreaming
Here’s another example. Ever drive home on a route you take all the time, from work or your children’s school, and arrive only to realize you don’t really remember driving there? You’ve driven the route so often that it’s worn a path in your brain, and you can do it on autopilot. Meanwhile, your mind wanders in a million directions: work, what’s for supper, your next vacation, the marks on your daughter’s report card. But all the time, you’re scanning your environment. If a pedestrian steps in front of you, you slam on the brakes, and you do it faster than you could ever rationally think it through. It’s a hereditary, hardwired shortcut straight to your amygdala, the emergency response center of your brain, bypassing your conscious mind.
By the way, while we’re on the subject of driving: if we’re so good at multi-tasking, why is talking on a cell phone so dangerous when we’re behind the wheel? It’s not because one of our hands is tied up, as we previously thought. Studies have found that even with hands-free devices, we’re four times more likely to be in a car accident when talking on a cell phone, a risk comparable to driving while drunk. And it’s all about reaction time. One study found that a 20-year-old behind the wheel talking on a cell phone has the same reaction time as a 70-year-old who isn’t.
Here’s the reason. It’s one thing to daydream. That happens in a part of our brain that can be instantaneously turned off, when required, to focus on more urgent matters. Daydreaming is like the brain idling. It doesn’t put too much of a cognitive load on the brain. But a conversation puts a much higher load on the brain. You have to focus your attention on what the other person is saying, and the minute we focus one sense on one stimulus, we lose much of our ability to monitor our environment with that sense.
But it’s more than just the act of listening. Carrying on a conversation requires us to process language, to translate what we’re hearing into concepts, and to take our own concepts and translate them back into language. This is one of the most demanding tasks our brain performs. While carrying on a conversation might not seem like much work, it moves the brain from slow idle to 5,000 RPM, firing on all cylinders. Which means there’s less capacity left to process emergency stimuli. In practical terms, we’re talking about a fraction of a second as the brain switches tasks, but at highway speed that delay can translate into several car lengths of stopping distance. It’s the difference between a head-on collision and a near miss.
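To put rough numbers on that claim, here is a quick back-of-the-envelope sketch. The speed, delay, and car-length figures are illustrative assumptions, not values taken from the studies mentioned above:

```python
# Back-of-the-envelope: how far a car travels during a reaction-time delay.
# All numbers below are assumptions for illustration, not study data.

def extra_stopping_distance(speed_m_per_s: float, delay_s: float) -> float:
    """Extra distance traveled before braking begins, given a reaction delay."""
    return speed_m_per_s * delay_s

highway_speed = 28.0   # ~100 km/h, expressed in meters per second
delay = 0.5            # assumed half-second lag while the brain switches tasks
car_length = 4.5       # assumed typical car length in meters

extra = extra_stopping_distance(highway_speed, delay)
print(f"{extra:.0f} m extra, about {extra / car_length:.1f} car lengths")
```

With those assumed figures, a half-second delay costs about 14 meters, roughly three car lengths, before the brakes even engage.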
Steering back to the Internet
So, what does this have to do with online behavior? Well, like everything we do, it comes back to how our brain processes information. As you can see, we’re used to operating in environments where we’re always scanning around, looking for the things that merit our attention. When we use a computer, we have multiple windows open. Everyone at Enquiro has two monitors, so we can have even more windows open. Right now, as I write this, I have a browser window open, my email, my calendar, and, every so often, my instant messaging window pops up with a question from a co-worker. Plus, I have the Google sidebar, which is alerting me to new RSS feeds, various alerts, and incoming email. If I were on an eye tracking machine now, you would see my eyes alternating between focusing on the words of this column and skipping over the two monitors, seeing if there’s anything I need to focus on. The most common flag for our attention, again a product of evolution, is a change in status. We’re conditioned to notice change. Although I’m not paying conscious attention to the RSS feeds, if one changes, I’ll probably notice and briefly glance at it.
Computers offer a rich set of multiple stimuli, and much thought has been given to the nature of our engagement with those stimuli. Also, like humans often do, we’ve compared that engagement with what we’ve previously been used to. Do we interact with a computer screen the same as we interact with a book, or, for that matter, a TV? The assumption was that we should carry forward our behaviors from one medium to another. But in the case of computers and our online world, that assumption may have been mistaken.
Leaving reason behind
As we moved from the 19th to the 20th Century, we were at the tail end of the age of reason. As we discovered our rational minds, we believed that studied, focused thought was the epitome of human development. We discounted the importance of the subconscious and embraced conscious thought. The ideal, we “reasoned,” was the type of engagement we had when we were reading a book; a linear, rational gathering and interpretation of information. As electronic media came along, we kept this linear approach. Stories unfolded in a straightforward way, either through sound on the radio, or sight and sound on TV. The more we paid attention, the better the experience. Advertising took advantage of our focused attention by slipping a few ads into the linear experience. The theory was, now that we have your full attention, let us sell you some stuff. It was the paradigm of “focus on one thing at a time.”
But as we’re now finding out, a linear, step-by-step approach is not how we normally act as humans. We’ve learned to do it over the last few centuries, primarily because we didn’t have any choice. We love to be entertained, and our only entertainment options all took the same format: one idea at a time, carefully crafted to keep your attention. But if you think about it, the more options we’ve been offered, the more we’ve reverted to our old, pre-Age of Reason behaviors. The old days of a family sitting in a darkened living room, watching the gray, flickering screen, have given way to ad zapping, channel hopping, multiple screens, and fleeting attention spans. It’s not just that we have more fighting for our attention; it’s that we were built to handle more.
Survival of the surfer
So, when the computer and the Internet came along, the reason we adapted so quickly is that we were already pre-wired for them. The multi-screen, multi-task environment, while drastically different from reading a book or watching a TV show, was much more like real life. The challenge now is for us to understand this. The golden days of advertising, when we could have a person’s undivided attention, are forever over. They were a temporary anomaly, caused by limited options that temporarily forced our brains to work in an unnatural, singularly focused manner. And, because of our love affair with rational thought, we fooled ourselves into thinking this was the way it should always be. Not so. You can’t discard millions of years of evolutionary development that quickly. Humans will be humans, and given the option, we will naturally go where our genes take us.
So, as interface designers and marketers, we have to understand things like selective perception (the cause of banner blindness), conscious focus, and subconscious monitoring. In the real world, we are in control of what we pay attention to and what we don’t. We pick and choose based on what’s important to us. For a while, our advertising channels fooled us into paying attention to whatever was presented to us. But not anymore. In the multi-tasking world of online, the user is back in control. They may be juggling a lot of things, but it’s up to them which ones they drop and which ones they keep in the air.