A new research paper authored by Matt Schruers and published by the Computer & Communications Industry Association argues that search engines have been unfairly targeted in the quest to impede online copyright infringement.
According to the paper, the perception is that search engines are the primary culprit in copyright infringement cases, offering visibility to unlawful and pirated content. But, Schruers writes, “the contention that disappearing undesirable entries from search results would substantially prevent piracy is flawed” and “the solutions to online infringement have little to do with search.”
Schruers points out that evidence shows actual infringing “rogue” sites, like the Pirate Bay or Isohunt, receive very little traffic from search. The author cites current (and sometimes questionable) Alexa.com ratings that show only about 8 percent of the traffic to the Pirate Bay arrives via Google. Also, according to the paper, Isohunt has claimed that less than a quarter of its traffic comes from search, predicting it could “…survive even a complete search engine ban.”
To further support his case, Schruers references a July 2013 study commissioned by the music streaming service Spotify that reported commercial options for online content proved effective at diminishing online piracy. He also cites research on the introduction of Netflix and Spotify to Norway, which coincided with an 80 percent drop in music piracy and a 50 percent drop in video piracy in that country.
Schruers sums it up this way:
Many music sites now demonstrate an acute awareness of the importance of a strong digital presence, and generally demonstrate effective organic and paid search optimization. Searches for such terms as “music downloads” indicate that lawful platforms such as Spotify, Last.fm, and Rdio aggressively seek to optimize their organic (i.e. “natural”) search results as well as paid search advertising for such terms, including terms that might otherwise lead to unlicensed sites.
Schruers recommends that lawful sites be more aggressive with their SEO tactics, offering up Netflix’s poor optimization around content for the show “Breaking Bad” as an example. According to Schruers’s research, Netflix’s robots.txt file prevents search indexing of the Netflix “Breaking Bad” page. “The fact that Netflix’s ‘Breaking Bad’ description page content goes unindexed may impair the search ranking for that show, from that platform,” writes Schruers.
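For readers unfamiliar with the mechanism, a robots.txt directive of the kind Schruers describes would look roughly like the sketch below. The path shown is hypothetical, not a reproduction of Netflix’s actual file; any site’s real robots.txt lives at its domain root (e.g. example.com/robots.txt).

```
# Hypothetical robots.txt excerpt: a Disallow rule like this tells
# well-behaved crawlers not to fetch pages under the listed path,
# which in turn keeps that content out of search indexes.
User-agent: *
Disallow: /shows/breaking-bad/
```

In practice, a rule like this means the page’s descriptive content never reaches the search index at all, which is exactly the self-inflicted visibility loss the paper warns lawful platforms against.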
The goal is for lawful sites, like Netflix, to create richer, more optimized content and leverage that content to its fullest extent, so that it outranks the rogue sites hosting unlawful content in search results.
In the end, Schruers pushes both content licensees and licensors to leverage more effective SEO tactics to increase the search engine visibility of lawful content, making the case that promoting “the page rank of lawful sites” and increasing the “visibility of legitimate online content offering” is the most effective way to fight online copyright infringement.