Search In The Year 2010
If I ever had to build a search engine, or more precisely, the interface of a search engine, this would be the team I would want to bring together. When I came up with the idea of looking forward three years and speculating on what the search results page may look like in 2010, these are the names that immediately came to mind:
- Jakob Nielsen, the Web’s best-known usability guru
- Marissa Mayer, Google’s VP of user experience and interface design
- Michael Ferguson, one of the architects of Ask’s unique user experience
- Larry Cornett, the VP of search experience at Yahoo!
- Justin Osmer, Product Manager for Microsoft Live search
- Chris Sherman, Executive Editor of Search Engine Land and an always-thoughtful industry observer
- Greg Sterling, another industry analyst who always has interesting insights, particularly in the local and mobile world
- Danny Sullivan, the Go To Guy of search
This would be the dream team for designing the new search interface. So it was with a great deal of anticipation that I threw in front of them the same question: what will the search results page look like in 2010? Here, aggregated and condensed, are their answers. I’ve broken them into themes that consistently came out in these interviews. We covered a lot of ground, so we’ll cover the first half this week, and the next column will run on September 7th.
The look of the search results page
The search results page has settled into an accepted standard. With the exception of Ask 3-D, all the other major players have a very similar look to the page. We have some sponsored ads on top, ten blue organic links and generally some sponsored ads down the right side. It's a very linear format that runs from top to bottom and is almost always composed exclusively of text. And although this format has been refined over the past decade, there haven't been any significant changes to the look. Will that continue to be true over the next three years?
Marissa Mayer: I think it will be, hopefully, a layout that's a little bit less linear and text-based, even than our search results today, and ultimately uses what I call the 'sea of whiteness' more in the middle of the page, and lays out in a more information-dense way all the information, from videos to audio reels to text, and so on and so forth. So imagine the results page, instead of being long and linear, with ten results on the page that you can scroll through, having ten very heterogeneous results, where we show each of those results in a form that really suits its medium, and in a more condensed format. When you start seeing some diagrams, some video, some news, some charts, you might actually have a page that looks and feels more like an interactive encyclopedia. To keep hounding on the analogy of the front page of the New York Times: they have basically the same layout each time, but it's not like they have a column that only has this kind of content, and if it doesn't fill the column, too bad. They have a basic format that they change as it suits the information.
Jakob Nielsen: There could be small changes, there could be big changes. I don't think big changes. The small changes are, potentially, a change from the one-dimensional linear layout to more of a two-dimensional layout with different types of information presented in different parts of the page, so you could have more of a newspaper metaphor in terms of the layout. I'm not sure if that's going to happen. It's a huge dominant user behavior to scan a linear list, and so this attempt to put other things on the side, to tamper with the true layout, the true design of the page, to move from it being just a list, it's going to be difficult, but I think it's a possibility. There's a lot of things, types of information, that the search engines are crunching on, and one approach is to unify them all into one list based on its best guess as to relevance or importance or whatever, and that is what I think is most likely to happen. But it could also be that they decide to split it up, and say, well, out here to the right we'll put shopping results, and out here to the left we'll put news results, and down here at the bottom we'll put pictures, and so forth, and I think that's a possibility.
Larry Cornett: The search experience is becoming a lot more interactive, it's a lot more of a dynamic experience, and we're all experimenting with bringing in a lot of richer content. And rich metadata, structured data, much more than we ever were doing before. It's going beyond the simple text title and abstract and URL.
Does search become our own personal portal page?
As search engines get to know us better, do they become our home page for everything? Do search engines get smart enough to bring together all the information we need about any topic, at our request, and organize it into a rich portal like page that gives us a jumping off point into a number of different types of content? It used to be that search was a “get on, get off” task but increasingly, search is becoming a stickier experience. Google, probably the clearest example of the “tool based” approach to search, has recently acknowledged the importance of personalized homepages with the introduction of iGoogle, moving them much closer to a Yahoo type model.
Larry Cornett: I think one place that you’ve seen us doing a bit of that is with what we call our WOW experiences. And so we launched a few of the movies, it’s like a direct display; it’s a really rich direct display on the search page. We’ve launched movies; we recently launched travel and, most recently music, so music artists. And essentially that’s bringing a lot of very useful information from different sources together in one place. And so users don’t have to click and go look at this, for example the movie trailer, or the movie experience, they don’t have to click and go one place to see the trailer, click and go to another place to see information about the movie and ratings, for example, click to yet another place to get show times for their local city. It’s actually all brought to the user in one compact module so it’s all that information that they would find useful in one place. I can’t speak to what Google is up to but they obviously must realize that a home page with nothing but a search box isn’t quite serving everything that people want. And they launched iGoogle, obviously, for that reason.
So let’s venture one step further. What if this search portal, iGoogle, for example, lets us put in a query and then builds a page for us which appears as a new tab, complete with a mix of results, based on the engine’s understanding of where we’re at and what our intent is. I put that to Microsoft’s Justin Osmer (without the iGoogle part):
Justin Osmer: Yes, I can see that very easily. And we’ve just scratched the surface on that with Live.com. You can set up a personalized page on live.com and pull in search results. You can set it up so that it queries news results for you every day. You get fresh news results on a query that you always search on, you can bring in all sorts of RSS feeds from literally the whole web, so you’re constantly getting updates of feeds very easily, and you can subscribe to an interest area of search and get that populated for yourself. That absolutely is a scenario that makes a lot of sense.
Search as a social experience
Personalization has somewhat pushed social search to the back burner as a promise for the future. But increasingly, the social nature of the web is converging with search in more and more cases. Will the next three years see a furthering of this convergence and the blending of Web communities and search functionality?
Larry Cornett: And that’s having more of a human influence and obviously more of a human presence in the search experience. You’ll see that quite a bit with the way that StumbleUpon has been working, the things that we’ve done with image search with a tighter integration of Flickr images and really showing the attribution that those photographs coming from real people within the network.
Larry’s comment made me think of a future where we may stay within our favorite online communities more and look for search functionality to be brought into that environment. And in that environment, will search become more of a discovery tool than a navigational tool? Will search leverage the benefits of the community to help make suggestions in less task-focused situations?
Larry Cornett: You talk about Facebook, you talk about some of these other examples where people may have a fuzzier information need or a different type of desire for information. In many cases, image search for example, people use it for entertainment and so they’re using it to pass the time or they’re just curious. And in those cases, it’s not like they have really specific targeted goals in mind or very specific queries in mind even, but they’re pretty open to being given information, so you’ll see that, obviously with Facebook, a lot of that is you coming in and just understanding what’s going on within your network and that activity. And so you’re learning about things that you may not have even formed a query about. Your friend’s telling you about a new website that you wouldn’t even know to ask about on a search engine. You discover it because he tells you about it. StumbleUpon is very similar in that you want to see what people consider to be quality content and you don’t know exactly what you’re looking for, but you just press the stumble button and see what are people thinking is interesting right now. So this is definitely, I think, a lot of convergence in some of these areas.
Smarter search engines
Another major theme was not so much what search engines would look like but how they would get smarter in the background. Driving this would be factors like personalization and tweaking of algorithms.
So talk of personalization has pretty much dominated the search engine space for the last two or three months. Our panel seems somewhat split on the promise of personalization to significantly move the needle on relevance in the next three years.
Chris Sherman: I don’t really see any kind of dramatic breakthrough on the horizon. I think as long as we’re limited to the current search form factor, if you will, where we’re encouraged to do the slot machine approach, where we punch in a few keywords, pull the lever and hope to hit the jackpot. Language is so inherently ambiguous that as good as the search engine gets, as good as they are at observing our behavior and our habits of reading and so on, being limited to those very, very short queries—that’s really the governor on the whole thing. And it’s not because people aren’t trying; it’s because of those inherent ambiguities in language.
Danny Sullivan: I think personalized search is going to continue to get strong. I do think that Google is onto something with their personalized search results. I don’t think that they’re going to cause you to be in an Amazon situation where you’re continuing to be recommended stuff you’re no longer interested in. I think that people are misunderstanding how sophisticated it can be.
Jakob Nielsen: All this stuff… all this talk about personalization, that is incredibly hard to do. Partly because it’s not just personalization, based on a user model, which is hard enough already. You have to guess that this person prefers this style of content and so on. But furthermore, you have to guess as to what this person’s “in this minute” interest is and that is almost impossible to do. I’m not too optimistic on the ability to do that. In many ways I think the web provides self personalization, you know, self service personalization. I show you my navigational scheme of things you can do on my site and you pick the one you want today, and the job of the web designer is to, first of all, design choices that adequately meet common user needs, and secondly, simply explain these choices so people can make the right ones for them. And that’s what most sites do very poorly. Both of those two steps are done very poorly on most corporate websites. But when it’s done well, that leads to people being able to click—click and they have what they want, because they know what they want, and its very difficult for the computer to guess what they want in this minute.
Greg Sterling: I think there are some technical issues like, what does it mean, and what does that look like, you and I doing the same queries over a period of time, what would our results look like? Is there a real benefit there for us? We could probably argue in some cases yes: I would point to the local as an example. I think there is a political challenge, right? My point of view is that there is a PR and political challenge around privacy and the so called creep factor that people feel when they think that the engine is studying them and monitoring their behavior and that record somehow makes them vulnerable or makes them uncomfortable. But I do think it is a potentially significant advance in certain contexts, if it really goes to disambiguation. Yes, I think in certain cases it does make a meaningful difference.
Larry Cornett: I think we’re just barely on the tip of the iceberg with how useful it could be. I think traditional approaches to personalization have required a lot of work on the part of the user and I think, just given my experience over the past years in various places at different companies, working in software and the Web, people don’t like to spend a lot of time configuring their preferences. So anytime you try to take an easier approach and say I’ll let the user customize the experience or personalize it, it’ll work for the small number of users that care to invest, but the large majority don’t want to have to spend a lot of time doing that. So I think the key to personalization is actually finding a way to do it that requires very low investment from the user and has a lot of return. And so it’s finding that balance, trying to get it just right, so that the user gets a lot more value than the energy they have to put into it.
I then asked Larry if 3 years was too short a time for personalization to make a significant difference.
Larry: You know, in the past I might’ve said yes but I think there’s an increasing pace of change that’s occurred within this industry. And I think that three years is not too long; within the next three years I think we’ll definitely have a lot more answers, and I think there are so many people springing up in this space, playing around with all these startup experiences for search, that the velocity will definitely increase. I think we’ll see something soon.
Justin Osmer brought up a variation on personalization with mode based search, where engines become smarter at unraveling the intent of the user:
Justin: An area that we’re focusing on over here at Live search is thinking more about the mode people are in when they’re using search. Are they exploring, just kind of poking around, or truly researching something? Are they looking to purchase something? Are they in there simply for entertainment, for refreshment’s sake? So they just need five minutes to goof off and poke around and look up vacations or something. Then being able to present the results in a way that gives you that full spectrum of experience, so that the modes of consumption will dictate how much you get inline, so the verticals in essence will become obsolete. The same rich content that you might get in a vertical experience may be brought inline or brought onto the results page in a way that shows you, wow, the search engine really does have more here, in a unique way.
Osmer said this “mode identification” could be accomplished in a few ways, including personalization:
Justin: I think we’re getting close to a tipping point on personalization where people are going to figure out that, “Wow, I can get a lot more out of my search experience if I tell the search engine more about me.” And so it may require some one-time setup time charge to you, to go in and say you like this or you don’t like this, or you want this or you don’t want this, or simply just clicking a box that says “Yes, I okay the search engine to track my queries, or look at my clickstream and give me more relevant information, or I want to participate in a beta product that allows me to tell the search engine what I’m looking for so it can learn more about me or people like me.”
The other way would be by using the query itself as a determiner:
Justin: We know what the super popular queries are on a day-to-day basis and usually they fall into a category. So if we see “Paris Hilton”, for example, that’s an area where you’re probably in an entertainment mode, and so we would try to offer up a search results page experience that would be more tuned that way.
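The idea Osmer describes, mapping popular queries to a category and letting the category pick the page layout, can be sketched very simply. This is only an illustration of the concept: the query table, category names and layout lists below are hypothetical, not how Live search actually did it.

```python
# Hypothetical category table; a real engine would mine this from query logs.
QUERY_CATEGORIES = {
    "paris hilton": "entertainment",
    "london bombings": "news",
    "hotels in maui": "travel",
}

# Which result blocks to show, in order, for each detected mode.
MODE_LAYOUTS = {
    "entertainment": ["images", "video", "news", "web"],
    "news": ["news", "web"],
    "travel": ["maps", "web", "images"],
}

def layout_for(query):
    """Return the result-block layout for a query, defaulting to plain web results."""
    category = QUERY_CATEGORIES.get(query.lower().strip())
    return MODE_LAYOUTS.get(category, ["web"])
```

A query that matches a known popular category gets a tuned page; anything unrecognized falls back to the standard ten blue links.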
Is Google holding a number of personalization cards up its sleeve?
So obviously, the opinions on the effectiveness of personalization are mixed. But I can’t help wondering if Google is holding a significant portion of the effectiveness of their personalization algorithm in reserve, pending further testing on the beta dataset they’re currently collecting. I posed this possibility to Chris Sherman.
Chris: I suspect that there’s probably quite a bit more that they’re not showing, but I don’t know that it’s necessarily that they’re being secretive. I think there’s that caution of changing things too much and alienating the searcher with the search results. I think if they’ve got things that they’re able to do, we’ll start to see them gradually, in a testing fashion where a few users will be exposed to them and not many other people, and over time, yes, they’ll be rolled out. But honestly, I don’t know, that’s just speculation on my part. But with the number of people that are working on this, they probably have tons of stuff that they’re not showing us.
Search driven by query trends
Danny Sullivan brought up the fact that search engines, with their access to query volumes and trends, should be able to alter the results for extraordinary circumstances:
Danny: I think they’re going to get a lot more intelligent at giving you more from a particular database when they know you’re doing a specific kind of search. It’s not necessarily an interface change, but then again it is. This is the thing I talked about when the London car bombing attempt happened, and I’m searching for “London Bombings”. When you see a spike in certain words you ought to know that there’s a reason behind that spike. It’s going to be news driven probably, so why are you giving me 10 search results, why don’t you give me 10 news results?
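Sullivan's suggestion amounts to anomaly detection on query volume: compare a query's current volume to its recent baseline and, on a spike, promote news results. Here is a minimal sketch of that idea; the class name, window size and threshold are all illustrative assumptions, not anything a search engine has published.

```python
from collections import deque

class QuerySpikeDetector:
    """Flags queries whose current volume far exceeds their recent baseline,
    a rough sketch of the 'swap in news results on a spike' idea."""

    def __init__(self, window=24, threshold=3.0):
        self.window = window        # number of past periods to keep per query
        self.threshold = threshold  # spike = current volume > threshold * baseline
        self.history = {}           # query -> deque of past per-period counts

    def record(self, query, count):
        """Log one period's query count."""
        past = self.history.setdefault(query, deque(maxlen=self.window))
        past.append(count)

    def is_spiking(self, query, current_count):
        """True if this period's volume is far above the rolling average."""
        past = self.history.get(query)
        if not past:
            return False  # no baseline yet; can't call it a spike
        baseline = sum(past) / len(past)
        return current_count > self.threshold * max(baseline, 1.0)
```

When `is_spiking` fires, the results page could lead with the news vertical instead of ten organic web links, which is exactly the behavior Sullivan asks for.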
Will usefulness become part of a search algorithm?
A tantalizing tidbit of prediction was touched on by a few different people, notably Jakob Nielsen and Marissa Mayer. As we begin interacting with our search results and websites, will the notion of usefulness be factored into future search algorithms? Jakob Nielsen first brought up the possibility.
Jakob: I think we can see a change maybe being a more of a usefulness relevance ranking. I think there is a tendency now for a lot of not very useful results to be dredged up that happen to be very popular, like Wikipedia and various blogs. They’re not going to be very useful or substantial to people who are trying to solve problems. So I think that with counting links and all of that, there may be a change and we may go into a more behavioral judgment as to which sites actually solve people’s problems, and they will tend to be more highly ranked.
And then, without prompting, Marissa Mayer indicated this may be in Google’s thinking in the future as well. She talked about people marking up search results and webpages and interacting with them in a way that indicated that they found them useful and valuable.
Marissa: I think the presentation is going to be largely based on our perceived notion of relevance, which of course leverages the user, in the ways they interact with the page, and look at what they do and that helps inform us as to what we should do.
Another area of innovation is launching a search from within the context of a task or an application through application or operating system integration. Not surprisingly, this was brought up by Justin Osmer from Microsoft, who has long been promising this integration. Chris Sherman also brought it up as another signal to disambiguate intent.
Justin Osmer: From a Microsoft perspective, also being able, when you’re in Office, to do it while you’re writing a Word document to just do a right-click and boom, you’re searching on the term you just highlighted. And are able to set up a search default to whatever engine you want to do that. Some of that technology is there today but we’re going to be doing more and more that I think.
Chris Sherman: Until search engines can find a way to let us search by example—submitting a page of content and analyzing the full text of that page and then tying that in conjunction with our past behavior; that’s just one approach…
The semantic search engine?
When will Web 2.0 come to search? To this point, search is still a fairly rudimentary experience, compared to the innovation seen through the rest of the web. The text-based presentation and the typical blue hyperlinks look more like the web of 1996 than the web of 2007. Will that change in the next three years? Well, it actually already has started to change. The experience presented on Ask’s 3-D search rolls in much more functionality than we’ve typically seen on a search results page. And this seems to be acting as a catalyst for all the search engines to look at rolling in more functionality. Ajax and other richer programming environments will make the user experience more intuitive and seamless.
Marissa Mayer: We will be able to have much more rich interaction with the search results pages. There might be layers of search results pages: take my results and show them on a map, take my results and show them to me on a timeline. It’s basically the ability to interact in a really fast way, and take the results you have and see them in a new light.
But it’s not just search engine results pages that Marissa sees a higher level of interaction with. She sees a deeper or more interactive experience with all webpages by being able to annotate and markup pages for future reference.
Marissa: I think that people will be annotating search results pages and web pages a lot. They’re going to be rating them, they’re going to be reviewing them. They’re going to be marking them up, saying “I want to come back to this one later”. So we have some remedial forms of this in terms of Notebook now, but I imagine that we’re going to make notes right on the pages later. People are going to be able to say I want to add a note here; I want to search…Google… something there, and you’ll be able to do that.
Marissa also talked about the ability to sort results based on different dimensions, such as location and time:
Marissa: What I’m sort of imagining is that in the first basic search, you’re presented with a really rich general overview page that interweaves all these different mediums, and on that page you have a few basic controls, so you could say, look, what really matters to me is the time dimension, or what really matters to me is the location dimension. So do you want to see it on a timeline, or do you want to see it on a map? It’s a richer experience. What’s nice about timeline and date, as we’re currently experimenting with them on Google Experimental, is not only do they allow you to sort differently, they allow you to visualize your results differently. So if you see your results on a map, you can see the loci, so you can see this location is important to this query, and this location is really important to that query. And when you look at it on a timeline you can see, “wow, this is a really hot topic for that decade”. They just help you visualize the nut of information across all the results in these fundamentally different ways that ‘sort’ kind of gets at. But it’s really allowing that richer presentation and that overview of results on the meta level that helps you see it.
Danny Sullivan also touched on the same theme:
Danny: I think the most dramatic change in how we present search results really has come out of local. And people go “wow, these maps are really cool!” Well of course they’re really cool; they’re presenting information on a map, which makes sense when we’re talking about local information. You want things displayed in that kind of manner. It doesn’t make sense to take all web search results and put them on a map. You could do it, but it doesn’t communicate additional information for you; location is probably irrelevant there and doesn’t need to be presented in a visual manner. If you think about the other kinds of search that you tend to do, blog search for instance, it may be that there’s going to be a more chronological display, much like what we saw them do with news archive, where they would do a search and they would tell you this happened within these years at this time. Right now when I do a Google blog search, by default it shows me ‘most relevant’. But sometimes I want to know what the most recent thing is, and what’s the most recent thing that’s also the most relevant thing, right? So perhaps when I do a Google blog search, I can see something running down the left hand side that says “last hour”, and within the last hour you show me the most relevant things in the last hour, the last 4 hours, and then the last day. And you could present it that way, almost sort of a timeline metaphor; I’m sure there are probably things you could do with shading and other stuff to go along with that.
If you really want to talk about search interfaces, what will be really fun to envision is what happens when Ajax starts coming along and doing other things. Can I start putting the sponsored search results where they are hovering above other results? Is there another issue that comes with that? There may be some confusion as to why I was getting this and I was getting that, can I pop up a map as I hover over a result? I could deliver you a standard set of search results and I can also deliver you local results on top of a particular type of picture. If I move my mouse along it I could show you a preview of what you get in local and you might go “Oh wow, there’s a whole map there” and jump off in that direction. That would be quite useful to see that stuff come off of there. But right now I just don’t see anything coming out of it. What we typically have had when people have played with the interface is, these really weird things like, ‘well we’ll fly you though the results, or we’ll group them’. None of which is really something that you’d need, that added to the choices, do I want to go vertical, do I not want to go vertical?
More hands-on experience with greater functionality
The nature of our interaction with the search results page is fairly static. We look and we click. Any attempt to incorporate more functionality on the results page, in the form of filtering options, has been met, for the most part, with apathy from users. Even the advanced search functionality that’s been around for over a decade is used by a very small percentage of users. Are we ready as users to get our hands on the buttons and dials that could fine-tune our search? Will search become more of an interactive experience?
Chris Sherman saw us getting our hands on the buttons and levers that power personalization:
Chris: I think what they might do is start to expose some of those algorithms and some of those knobs and dials to let us dial-up or dial-out certain personalization features and fine-tune their search results using controls that are more similar to what you’d find on Photoshop with sliders and dials or various graphic displays. And I think we’ll see search results actually changed dynamically in real time as we apply those various tools.
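Sherman's "knobs and dials" picture is essentially user-adjustable weights over the ranking signals, with the page re-sorting live as a slider moves. A minimal sketch of that re-ranking step follows; the signal names (`relevance`, `freshness`) and the data shape are assumptions made for illustration, not any engine's real scoring model.

```python
def rerank(results, weights):
    """Re-score results as a weighted sum of per-signal scores.

    'results' is a list of (title, signals) pairs, where 'signals' maps a
    signal name to a 0-1 score for that result; 'weights' holds whatever
    the user's sliders are currently set to. Returns results best-first.
    """
    def score(item):
        _, signals = item
        return sum(weights.get(name, 0.0) * value
                   for name, value in signals.items())
    return sorted(results, key=score, reverse=True)
```

Each slider movement just produces a new `weights` dict and a re-sort, which is cheap enough to do client-side in real time, the "changed dynamically as we apply those tools" behavior Sherman describes.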
Danny Sullivan feels it depends on the task.
Danny: If you’re just doing a general search, I don’t think that putting in a whole lot of functionality is going to help you. You could put a lot of options there and historically we haven’t seen people use those things, and I think that’s because they just want to do their searches. They want you to just naturally get the right kind of information that’s there, and a lot of the time they give you that direct answer. You don’t need to do a lot of manipulation. It’s a different thing, I think, when you get into a lot of vertical, very task-oriented kinds of searches, where you’re saying, ‘I don’t just need the quick answer, I don’t just need to browse and see all the things that are out there, but actually I’m trying to drill down on this subject in a particular way’. And local tends to be a great example. ‘Now you’ve given me all the results that match the zip code, but really I would like to narrow it down to a neighborhood, so how can I do that?’ Or a shopping search. ‘I have a lot of results but now I want to buy something, so now I need to know who has it in inventory? Now I really need to know who has it cheapest? And I need to know who’s the most trusted merchant?’ Then I think the searcher is going to be willing to do more work on the search and make use of more of the options that you give to them.
Jakob Nielsen believes it’s a possibility, but one he’s not too optimistic about:
Jakob: The third one is to add more tools to the search interface to provide query reformulation and query refinement options. I’m also very skeptical about this, because this has been tried a lot of times and it has always failed. If you go back and look at old screen shots of all of the different search engines that have been out there over the last 15 years or so, there have been a lot of attempts to do things like this. I think Microsoft had one where you could prioritize one thing more, prioritize another thing more. There was another slider paradigm. I know that Infoseek, many, many years ago, had alternative query terms you could search on with just one click, which was very simple. Yet most people didn’t even do that. People are basically lazy, and this makes sense. The basic information foraging theory, which is, I think, the one theory that basically explains why the web is the way it is, says that people want to expend minimal effort to gain their benefits. And this is an evolutionary point that has come about because the people, or the creatures, who don’t exert themselves are the ones most likely to survive when there are bad times or a crisis of some kind. So people are inherently lazy and don’t want to exert themselves. Picking from a set of choices is one of the least effortful interaction styles, which is why this point-and-click interaction in general seems to work very well. Whereas tweaking sliders, operating pull-down menus and all that stuff, that is just more work.
Michael Ferguson: It’s our job to actually go out of our way to make something as quickly navigable and easy to use as possible without them having to make any effort or set preferences etc. So always through a variety of methodologies, eye tracking etc…but we are always focused on what is the fastest flow for people to get to the core of what they are trying to get out of the search results. So as far as bringing more steps onto the page…it has to be done with respect to end users’ intent and goals and really cannot compete with that. And still, and I think this will be true in 2010, the text results are still a very important visual part of the page. They do not necessarily look as sexy as some video or some audio, but that is the core of the experience and that’s why we still have those in the prominent placing that they do. So my sense is you’re not going to be able to ask the users to do work; their clicks and their footsteps will walk to the experience that is most delightful and easy for them to use, but I would not expect them ever to aid the engine by setting preferences that are going to …self-initiated actions.
Greg Sterling: You will get more participation if it's easy, if it's fun, if it's effective and makes the experience better. Like if I do travel research and I can quickly capture, copy and save hotels, destinations or whatever, and manipulate and come back to those, that kind of thing is valuable. But you make a fair point about putting a burden on users to do stuff, and I think that in light of keyword query string lengths, which I think have stagnated, I don't know where the high point of the bell curve is, it's like two or three words. Norvig in the interview talks about getting people to interact more with the search engine so the result can be better, right, and in his implied conversation about speech recognition or speech input he's sort of admitting that you can't really get people to formulate coherent questions or longer query strings, and you have to find alternative strategies: some mix of active solicitation, or tools that make it fun or interesting, and then passive personalization or other strategies to get people a better result.
Stratification of user functionality
So, if the search engine is going to ask more of us as users, in return for giving us greater control over defining our search experience, will we run into the same problem we currently have with advanced search? Will those features only appeal to a small percentage of users who are comfortable rolling up their sleeves and interacting with the engine? And will this mean that we’ll have two versions of search, one for power users and one for the rest of us?
Chris Sherman: I don't think so. I think most people live with the results that they get. Right now, today, we have advanced search and nobody uses it. There are a lot of tools that really allow power users to get in and do a lot of fine-tuning of their queries, and I'd say less than 1%, on the numbers I've read, actually take advantage of that functionality. So I think as we evolve and those tools do surface, you'll still have the vast majority of people happy with the results they get, but it's still going to be that 1% of people currently using advanced search who will take advantage of the surfacing of those new capabilities. That said, if they make it easy enough, so it's more like playing around with Photoshop or some of the other graphics editors and it's intuitively obvious, then we might see people gradually start gravitating towards those tools and taking advantage of them.
Justin Osmer: There will always be the one-size-fits-all option, just based on pure market dynamics and the size of it, the head and the tail of the web and all those other factors. To tackle the head and to serve most of the queries that everybody, Joe average person, is looking for, you'll need to have a simplified version, or potentially what's available now, as a search experience, because that's what people have come to expect. And I think we all, in the industry, agree that where we are today is great, but it's been a little stale for a while, and being able to level that up a bit and make some major inroads in improvements there is something that we're definitely on the verge of doing. And it may start as some sort of opt-in option, or maybe it's a separate website or a separate engine that's doing that. Or maybe, at some point, it becomes just a toggle between two different ones: you bring up live.com and you get to pick which engine you want, the turbocharged version or the slimmed-down version. The challenge has always been that when you're talking about the early adopters, the real technical elite and the heavy searchers, a lot of those folks would love to use that, but in the grand scheme of things, it's a pretty small portion of the population.
Greg Sterling: I would point to the iGoogle home page and classic Google as validation of that point. The Yahoo homepage is becoming personalized, but I think that Google in particular has bifurcated its search experience. iGoogle is a version of personalization, because you set up all the feeds and widgets and gadgets and so on, and they claim it's the fastest-growing product they've got. Now, the real numbers are going to be minuscule compared to classic Google, so I think you are right that there is some kind of segmentation that may emerge, where you have a class of power users who take full advantage of a bunch of tools, and you have those who use the defaults and don't do much in the way of interacting with the engines.
That's part one of Search In The Year 2010. In the next column I write (September 7), we'll look at whether search will become more mobile in three years; what advertising on the results page might look like and how personalization will impact those ads; whether banner blindness will revisit the search results page; how our interactions on the page may change; who the players driving innovation will be; the convergence of search and entertainment; and various bold predictions for the future. For the next three weeks, we'll be inviting some guest writers to share their thoughts on user behavior, starting with Yahoo's Larry Cornett next week.
By the way, we’ve saved a few surprises as well. Speaking of surprises, if you happen to be at SES San Jose, track me down (I’m doing a number of sessions) and we can share some there as well.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.