Good morning! Welcome to day two of our SMX East Conference in New York City. We’re beginning today with a keynote conversation with Eli Pariser, author of The Filter Bubble, a book in which Pariser argues that search personalization leaves users unaware of viewpoints and opinions different from their own.
We should be getting started at the top of the hour, so feel free to refresh and/or come back to follow along with the discussion.
Chris Sherman and Danny Sullivan will be guiding the conversation. Chris has just introduced Eli, and we’re beginning with what I guess will be a short speech/presentation of sorts.
Eli: I want to talk about the moral consequences of personalization. He quotes Mark Zuckerberg as saying, “A squirrel dying in front of your house today might be more relevant to you than people dying in Africa.” He wants to talk about the impact of that kind of view of relevance.
He noticed one day that updates from his conservative friends were no longer appearing in his Facebook news feed. He’s liberal-leaning, but likes to read the thoughts of his friends who think differently.
Says he did an experiment asking several friends to Google “Egypt” and send a screenshot of what they see. Shows a side-by-side comparison. “Scott” got all sorts of information about the Democratic revolutions, news about the protests, etc. But “Daniel” didn’t get any of that — he got tips about seeing the pyramids and other travel-related links.
“Increasingly, the web is showing us what it thinks we want to see. It’s not showing us what we need to see, or the world as it is.”
Quotes Eric Schmidt: “It will be hard for people to watch or consume something that has not been tailored for them.”
The “filter bubble” concept — you don’t choose what gets in your filter bubble and, more importantly, you don’t know what’s been edited out.
Personalization algorithms typically look at what you click first. On the Internet, code is the new gatekeeper. It’s making value decisions, but it doesn’t have any value system built in. “It may be showing us what we like, but it’s not showing us what matters.”
We need to make sure that these algorithms don’t focus on a very narrow definition of relevance, where relevance is defined by what we click first. It needs to look at what really matters, things that challenge us, other points of view. The Internet needs to be that thing that connects us to new ways of thinking — that’s not going to happen if we’re all stuck in a little personalized bubble of one.
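The “click-first” definition of relevance Pariser criticizes can be sketched in a few lines. This is purely illustrative, not any real search engine’s algorithm; the data, topics, and scoring here are invented for the example.

```python
# Illustrative only: a toy "click-first" personalizer, not any real
# search engine's algorithm. It reranks results purely by how often
# this user has clicked similar items before -- the narrow notion of
# relevance Pariser warns about.

from collections import Counter

def personalize(results, click_history):
    """Rerank results by the user's past click counts per topic."""
    clicks = Counter(click_history)
    # Sort by click count, descending; ties keep their original order.
    return sorted(results, key=lambda r: -clicks[r["topic"]])

results = [
    {"title": "Egypt protests: latest news", "topic": "news"},
    {"title": "Top 10 pyramid tours",        "topic": "travel"},
]

# A user whose history is mostly travel clicks sees travel first;
# the news story sinks, regardless of whether it "matters" more.
ranked = personalize(results, ["travel", "travel", "news"])
```

The “Egypt” screenshot comparison earlier is this effect in miniature: two users, two click histories, two very different result sets for the same query.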
And that ends his brief speech, and now Danny and Chris are going to chat with him.
DS: Do you find that there’s commonality between search engines?
EP: Says results for some searches are common across search engines. Mentions an appearance on a radio show when he was promoting his book. Two of three listeners got the same results for a “Barack Obama” search, but the third got different results focused on the birth certificate issue.
DS: Do people think that everyone should get the same results?
EP: When I was going around, people were shocked that the results aren’t the same for everybody.
CS: Marissa Mayer originally said personalization was very subtle – people wouldn’t notice it at all. She also said it would surface a lot of long-tail pages. And she said that it would only be tied to the individual’s personal history. Is that what’s going on now?
EP: It’s hard to say in any given case what the algorithm is doing. It’s so complex that we don’t know why it’s doing what it does. For some people, it may be subtle. “Overall, I think Google undersells how significant it is. I don’t think Google has malicious motives in doing this. They genuinely think personalized search results will get people coming back to their search engine more. I think they also see this as a way to make it more difficult to deliberately game the results.”
One of the areas where personalization is very strong is vanity searches for your own name. The more you do it, the more the results change.
DS: Don’t you think the bubble gets popped a bit if you visit a Democratic site — they might link over to Republican sites and such. Doesn’t that mitigate the personalization?
EP: The more you click around, you may get exposed to other ideas. But you never know when you’re in the bubble. Yahoo personalizes news headlines based on your Yahoo profile, but you never know when that happens.
If it were easier to see when and how these filters are being applied, and to turn them on and off, it would be easier to stop them from imposing themselves on you.
CS: This raises a control issue. Should Google be the ones controlling this? Is Google becoming something like a utility that should be regulated?
EP: I think it raises these important questions. The algorithm determines how a billion-plus people get where they’re going, but there’s no transparency. The New York Times has an ombudsman for oversight, but there’s nothing similar for Google.
The engineers say that most people don’t understand all this, and Google doesn’t want to make things too complicated.
From a regulatory standpoint, I think there needs to be a reset about the rules on personalization because they were written in 1977.
(Missed question Danny asked about political ads on TV and, I think, disclosure on those ads compared to Internet?)
CS: Wants to go back to discussion of tools. Google offers a variety of tools that show what data it has, but not how it’s being used. Google says this is the “secret sauce” and they can’t reveal too much.
EP: I think the Google Dashboard is a start. It’s not super obvious what Google knows about you, though – it’s a bunch of links. I think Google thinks about this stuff more than other companies, though.
The real challenge is around inferences. Few people realize how far you can extrapolate beyond a few pieces of data in order to run ads. If Google is able to target based on inferred data, you should be able to see what’s being inferred about you. Google will show you what it literally knows about you. (He doesn’t say it, but he’s implying that Google won’t reveal what it infers about you.)
It’s not just the data you literally hand over; it’s also the inferences about things you haven’t told anyone. He mentions Hunch and some of the inferences it can make based on the data users give it.
DS: Asks about the Amazon approach where users can dismiss recommendations. Is that what Google should be doing?
EP: I think Google should give people the tools to understand and play with personalization. See how personalization affects certain queries. “When they want to, these companies are amazingly good at helping people make sense of complex data. Those talents haven’t been turned on this topic.”
Google people say they don’t get many complaints from people about personalization. I tell them that very few people even know about it.
CS: You’re talking to a group of marketers. They rely on this personalization for targeting. So what would you say to them about balance?
EP: My position isn’t that personalization is bad overall, it’s that we need to be careful about how it’s done. Google could do a lot more to explain its philosophy about this, without making it super easy to boost your search rankings. People need to be able to decide when they use these tools.
And with that, the conversation is over. Thanks for following along!