Google To Publishers Concerned Over The Knowledge Graph; Searchers Still Need Your Content

In the keynote session last night at Search Marketing Expo West, Google’s head of search, Amit Singhal, tried to address publishers’ concern that their content is being scraped and used by Google’s Knowledge Graph, with the result that users get the answer without having to click on the search results.

Danny Sullivan, in his interview with Amit Singhal, brought up the tweet that went viral showing how Google is considered by many to be a massive scraper site. Amit addressed the concern with an analogy.

Amit compared Google’s Knowledge Graph to a Swiss Army knife, while likening publishers’ content to specific tools, such as corkscrews, screwdrivers and other specialty tools.

Amit explained that while Google’s goal is to give searchers a quick answer, there is no substitute for doing deeper research by clicking through to the sources provided in Google’s search results. He acknowledged that many searchers will likely just want the quick answer, but said there is still a need for a deeper dive into the sites of the publishers providing that content.

Here is a recap of part of that conversation from our live blog coverage of the keynote:

DS: Are we gonna get to a point where every search gives a direct answer?

AS: If you look at a search engine, the best analogy is that it’s an amazing Swiss Army Knife. It’s great, but sometimes you need to open a wine bottle. Some genius added that to the knife. That’s awesome. That’s how we think of the Knowledge Graph. Sometimes you only need an answer.

The world has gone mobile. In a mobile world, there are times when you cannot read 20 pages, but you need something — an extra tool on your Swiss Army Knife. When you build a better tool, you use it more.

Does this help ease publishers’ concerns that Google will send them less and less traffic over time? And will declining traffic, by making content harder to monetize, reduce publishers’ ability to keep producing the quality content that Google can index and searchers can discover?

What do you think?


About the author

Barry Schwartz
Staff
Barry Schwartz is a technologist and a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics.

In 2019, Barry was awarded the Outstanding Community Services Award from Search Engine Land; in 2018, he was named "US Search Personality Of The Year" at the US Search Awards; and in 2023, he was listed as a top 50 most influential PPCer by Marketing O'Clock.

Barry can be followed on X, and you can learn more about him on his personal site.