Eyetracking & SEO: Fad, Fact, Or Fiction?
Do SEO professionals use eyetracking usability studies as a link-juice fad? Can eyetracking tests yield useful information about Web searchers?
As a search usability professional, I am always interested in the results of high-quality usability tests on search engine results pages (SERPs).
I want to know whether test participants can truly complete their search tasks more efficiently, more effectively and with greater satisfaction. Were any roadblocks encountered? If so, what were they? Can we minimize or eliminate some of those roadblocks? If we can, the result is better search results pages.
For search optimization professionals, the result is better optimization because, for the most part, search engines get content for search listings from your website. Better content, better labels, better aboutness, better search listings.
However, I am always troubled by usability studies that are not conducted properly. So for today’s article, here are some things you should look out for when hiring a usability firm to conduct eyetracking tests on search interfaces.
Characteristics Of Test Participants
Whenever you conduct a usability study, it is very important to conduct the study with participants who fit a persona or profile. For example, if a website’s primary target audience consists of women who make major health care decisions for their families, then adolescent boys should not be among the test participants.
Testing SERPs is no exception. Characteristics of Web searchers vary by gender, age, experience and so forth (see resources below).
Quantity Of Test Participants
I admire the knowledge, experience and writings of usability guru Jakob Nielsen very much. However, I feel that his research, Why You Only Need To Test With 5 Users, has been greatly misinterpreted, especially by SEO professionals.
Nielsen has published more recent and more detailed work on usability testing. In his Eyetracking Methodology Report — How To Conduct And Evaluate Usability Studies Using Eye Tracking (2009), he stated:
“… heatmaps can be dangerous because they appear to be qualitative representations of multiple users’ fixations, when in reality they are quantitative because they are based on statistics. If you are using heatmaps to actually draw conclusions based on an aggregate of users’ experiences, or if heatmaps are the main deliverable, then eyetracking requires many more test users than traditional usability studies. If using heatmaps to analyze data, ensure that you have 30 users per heatmap. Thus, you should include about 39 users (as opposed to five or so for a traditional qualitative study).”
For those of you who wish to conduct valid, high-quality eyetracking tests on SERPs, please recruit the right number of test participants who fit a persona or profile. Don’t overgeneralize.
As I mentioned in a Search Engine Land article earlier this year, usability test participants should be presented with the same scenario. Eyetracking tests are no exception. According to Usability.gov:
“A scenario is a short story about a specific user with a specific goal at your site. Scenarios are the questions, tasks, and stories that users bring to your Web site and that the Web site must satisfy. Scenarios are critical both for designing Web sites and for doing usability testing.” (Source: Create Scenarios.)
“Eyetracking is an interesting technology, but it can be very misleading,” says Dr. Susan Weinschenk, founder of the User Experience Institute. “One problem with eyetracking is that researchers underestimate the effect that the wording of instructions has on where participants look. Early research by Yarbus in the 1960s showed that the pattern of the eyetracking depends on what you say to the participants during the study.”
(Please see Susan’s blog article 100 Things You Should Know About People: #18 — What People Look At On A Picture Or Screen Depends On What You Say To Them, for interesting photos from Yarbus’ research.)
As we all know, search listings are different from person to person. If a keyword or keyword phrase shows local intent, a Web searcher in Chicago will get considerably different search results than a Web searcher in San Francisco.
Furthermore, if a Web searcher is logged in, search results are further personalized. So the usability firm should ensure that test participants are presented with the same scenario, the same instructions, the same SERPs and the same search environment.
Foveal Vision, Peripheral Vision & Attention
According to design researcher Jim Ross in the article Eyetracking: Is It Worth It?, from UX Matters:
“Eyetracking can be misleading, because it does not capture peripheral vision. Eyetracking records and displays foveal fixations, in the small part of our visual field that produces the sharpest vision. It does not record peripheral vision, which makes up 98% of our visual field. This is significant, because we use peripheral vision to choose where to fixate our fovea next.”
Foveal vision, peripheral vision, saccades, fixations, the eye-mind hypothesis — these are all terms that I would expect a firm conducting a usability study to know. In the 1980s, Marcel Adam Just and Patricia Carpenter came up with the eye-mind hypothesis, which states that there is a strong correlation between where one is looking and what one is thinking about.
Well, I am thinking about giant, pink, fire-breathing dragons right now. I’m not looking at one. OK, I admit I am being a bit sarcastic. The eye-mind hypothesis does have some validity.
However, I do believe SEO professionals need to evaluate eyetracking studies more critically. Searchers might appear to ignore a search listing or a search engine ad when, in fact, they see (and remember) it with peripheral vision.
Although it might seem that I am a naysayer of eyetracking studies, I am not. I don’t believe eyetracking is a fad. A well-conducted usability test has always provided me with insights into creating better and more useful search-engine friendly websites.
But I am critical of search firms that conduct this type of research. Make sure they are qualified. Make sure they are not overgeneralizing. Make sure they are conducting eyetracking studies properly.
“Just as with any research, you have to make sure that the research had enough participants, and ask whether the participants were representative of your audience,” Weinschenk concludes. “It’s very important that you stop and think about what the eyetracking results mean and be cautious about changing your whole design strategy based on eyetracking research.”
For those of you who are interested in eyetracking and searcher characteristics, here are some useful resources.
- Eyetracking Web Usability book companion section on Useit.com.
- Fidel, R., Davies, R.K., Douglass, M.H., Holder, J.K., Hopkins, C.J., Kushner, E.J., Miyagishima, B.K., & Toney, C.D. (1999). A visit to the information mall: Web searching behavior of high school students. Journal of the American Society for Information Science, 50(1), 24–3
- Hölscher, C., & Strube, G. (2000). Web Search Behavior of Internet Experts and Newbies. International Journal of Computer and Telecommunications Networking, 33(1–6), 337–346.
- Lazonder, Ard W. et al. (2000). Differences Between Novice and Experienced Users in Searching Information on the World Wide Web. Journal of the American Society for Information Science, 51(6), 576–581.
- Large, A., Beheshti, J., & Rahman, T. (2002). Gender differences in collaborative web searching behavior: an elementary school study. Information Processing & Management, 38 (3), 427-443.
- Lorigo, L., Pan, B., Hembrooke, H., Joachims, T., Granka, L., & Gay, G. (2006). The influence of task and gender on search and evaluation behavior using Google. Information Processing & Management, 42, 1123-1131.
- Maghferat, Parinaz, & Stock, Wolfgang G. (2010). “Gender-specific information search behavior.” Webology, 7(2), Article 80. Available at: http://www.webology.org/2010/v7n2/a80.html
- Poole, A., and Ball, L. J. (2006). Eye tracking in HCI and usability research. In C. Ghaoui (ed.), Encyclopedia of human-computer interaction. Idea Group Inc., Pennsylvania.
- White, R. and Morris, D. (2007). Investigating the Querying and Browsing Behavior of Advanced Search Engine Users. In Proc. SIGIR 2007, 255-262.
- Yarbus, A. L. (1967). Eye Movements and Vision (B. Haigh, trans.), New York: Plenum.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.