Search Engines + Newspapers: Perfect Market’s Delivery System Aims To Please Both

Last year, there was a seemingly endless parade of stories on how aggregators, search engines and news blogs were apparently killing newspapers that publish original content. This year, add the rise of “content farms” to the list. Riding to the rescue, or so it hopes, comes Perfect Market and its new search marketing tool, “The Vault.”

Perfect Market has been working with newspaper publishers such as the Tribune Company to generate more revenue for them from search engine visitors. Tribune Company is also one of Perfect Market’s major investors, along with Trinity Ventures, Rustic Canyon Partners and Idealab.

Earning More Off The “Drive Bys”

The concept behind The Vault is simple. Newspapers get plenty of traffic from search engines but often do a poor job of monetizing that traffic. Indeed, last year News Corp’s chief digital officer Jonathan Miller described visitors from Google as “the least valuable traffic to us.” The Vault aims to change all that by making newspaper content more attractive to “drive by” visitors, especially through better ad positioning and targeting.

“We’re arming newsrooms with the technology and performance intelligence they need to compete, with the content they’re writing today, and we know it’s working,” said Julie Schoenfeld, president and CEO of Perfect Market. “We’ve partnered with some of the best known publications in the world, including The Los Angeles Times and SFGate, and the results speak for themselves. Some of our partners have seen an ad revenue increase by 20X.”

The service is offered on a revenue-sharing basis, with Perfect Market getting a share of any increase in advertising revenue it generates from the stories.

Clearing The Clutter

Vault pages strip away pretty much everything from a “regular” news story page except the article itself and ads. Consider this side-by-side comparison:

On the left is an article from the Los Angeles Times about BP delaying a test for capping the Gulf oil spill. This is what someone would see if they found the article by clicking around at the LA Times web site. On the right is the same article but formatted differently. This is what someone would see if they came to it by doing a search at Google or other search engines.

The search engine version is much simpler. Gone are things like:

  • Social sharing and navigation options at the top of the page
  • The big Los Angeles Times logo and the big “Nation” section heading
  • More sharing options next to the start of the story, along with related stories
  • Links that were in the original story that led to more information within the LA Times site
  • “The Latest” news headlines box on the right side of the page
  • A house ad on the right side of the page
  • The “Most Viewed | Most Commented” box

That’s just what’s been removed from near the top of the original story. At the end, there are related story links, comments and more navigational links.

All of those also get removed. Meanwhile, the search engine version, unlike the original, gets paginated. The story is divided across two pages, so you have to click to read the second part. That’s intended to drive page views and increase the odds of showing ads that the reader will like, assuming they weren’t attracted to any on the first page.
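To make the mechanics concrete, here’s a minimal Python sketch of that kind of pagination. This is purely illustrative, not Perfect Market’s code; the page size and names are my own assumptions. The point is that each extra page is an extra page view, and another chance to serve a fresh set of ads:

    # Illustrative only: split a story's paragraphs into fixed-size pages.
    # Each page served is another page view and another set of ad slots.
    def paginate(paragraphs, per_page=6):
        return [paragraphs[i:i + per_page]
                for i in range(0, len(paragraphs), per_page)]

    story = ["Paragraph %d" % n for n in range(1, 11)]
    pages = paginate(story)
    print(len(pages))   # 2 pages, so a full read-through takes two clicks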

Overall, the aim is to give users a clearer route to what they want, the news story, and to increase the odds that the post-reading click will be on an ad. That means removing elements designed to encourage the more brand-aware reader, someone who comes to the publication regularly, to stick around.

“You can’t transform those who come in from search into ‘name brand’ users,” said Tim Ruder, Perfect Market’s chief revenue officer, a former Washington Post digital media exec.

The Content Farm Alternative?

It’s about more than getting ad clicks, however. Perfect Market says its product is also designed to improve the type of ads shown from ad networks such as Google AdSense, so that they’re better targeted to visitors and produce more revenue per click.

Beyond that, the product provides a “dashboard” designed to show how much particular stories are earning and which search keywords driving traffic to those stories are paying off the most.
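As a rough illustration of what a dashboard like that has to compute, here’s a short Python sketch that totals ad revenue by story and referring keyword. The log format and the numbers are invented for the example; Perfect Market’s actual schema isn’t public:

    # Invented sample data: (story, referring search keyword, ad revenue per visit)
    from collections import defaultdict

    visits = [
        ("oil-spill-cap", "bp oil spill", 0.42),
        ("oil-spill-cap", "gulf spill cap", 0.31),
        ("oil-spill-cap", "bp oil spill", 0.55),
        ("heat-wave", "east coast heat wave", 0.12),
    ]

    totals = defaultdict(float)
    for story, keyword, revenue in visits:
        totals[(story, keyword)] += revenue

    # Report the highest-earning story/keyword pairs first
    for (story, keyword), total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print("%s: '%s' earned $%.2f" % (story, keyword, total))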

Of course, writing content to match high-paying keywords is the hallmark of “content farms,” sites run by companies such as Demand Media or Associated Content, recently acquired by Yahoo. Potentially, Perfect Market’s system could allow traditional publishers to keep up with these new demand-driven publishers. However, that’s not Perfect Market’s main aim. Instead, it sees its tool as allowing publishers to keep writing as they’ve normally done but to earn more for their stories.

“The best antidote to fighting content farm spam in the engines is arming journalists with the tools they need to engage in the web economy,” said Schoenfeld. “When a news organization’s first-rate content is optimized for search, everyone should win. Users have a better search experience when quality journalism appears at the top of the search results. Search advertisers benefit from appearing in well researched, information-rich articles. And publishers benefit from previously untapped revenue streams.”

Removing Clutter Or Context?

What’s not to love? Well, former San Francisco Chronicle sex columnist Violet Blue had a lot of issues when she stumbled upon her columns being Vaultized, as it were, on the San Francisco Chronicle’s SFGate web site.

In March, she wrote about how Google searches began sending her to published columns of hers that were missing her bio and in-content links, lacking punctuation and, in one case, given a keyword-rich URL that had the opposite meaning of the actual story.

Much of that has now been corrected. When I followed up with Perfect Market soon after the issues came up, I was told the decision on whether to include links was down to the individual publications. As for typos, those were attributed to import glitches.

I also talked with Blue soon after her post and found myself struggling. On the one hand, Perfect Market is doing some of the things that I’ve also written that newspapers should do to survive (see If Newspapers Were Stores, Would Visitors Be “Worthless” Then?). If they can’t earn more for their stories, then worrying about how the stories are presented becomes a moot point. There won’t be stories at all.

But Blue makes a valid point that The Vault’s supposed content optimization process actually “decontextualizes” an article in ways that can be harmful to readers, even those “drive by” readers. Comments often add value and content to an article. An author’s bio or links to their past work give context about who they are and why a reader might care about what they’ve written.

“Now my articles are on a site that doesn’t have any of the context,” she said.

Link Love’s Labor Lost

Blue is also concerned that such changes will turn people off from linking to her stories, or to stories published by others.

“I’ve worked really hard as a blogger for 10 years now to understand the value of people who arrive at my site,” she said. “Having multiple stories across multiple pages, that turns into ‘Does your reader feel tricked?’  For me, if something’s broken into 8 pages, I won’t link to it.”

Potentially, the linking situation gets even worse. Links are a key factor in helping content rank better in search engines; getting good links from across the web plays a huge role. Strike one for some of these articles is that if they’re less attractive to some people, they won’t attract links. Strike two is that they compete for links against the “original” articles.

The author, the publication’s social media person or regular brand-driven readers are likely to promote the “original” story, linking to it. But all that “link credit” would seem wasted, since that version of the page isn’t the one that shows in search engines.

Not Cloaking, Says Perfect Market

Perfect Market says this isn’t an issue. That’s because it redirects Google and other search engines if they try to visit the “regular” version. This keeps all the link credit flowing to the page they are allowed to record.

To illustrate, consider the two versions of that LA Times article I mentioned earlier. Here’s the “normal” version, for regular visitors to the site:

http://www.latimes.com/news/nationworld/nation/la-na-oil-spill-20100714,0,1234918.story

The version for “search engine visitors,” people who come to the article from search engines like Google, lives on the articles.latimes.com subdomain.

If Google tries to visit the regular version, in order to record it for its searchable index of documents, it’s redirected to the search engine version. However, humans going directly to the normal version, say from a link shared on Twitter, still see it.

Isn’t that cloaking, something specifically against Google’s rules?

Traditionally, cloaking means showing search engines content that humans never see. For example, Google gets shown a special page that it records, but people clicking on that page when it’s listed in Google’s results get redirected to the “normal” page.

In Perfect Market’s system, humans do see the “search engine” version, when they find it within a search engine. If Google has recorded a special page, people clicking from Google see exactly what Google saw.
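To see why that argument holds, here’s a minimal Python sketch (using Flask) of the redirect logic as described. The route, domain and detection rules are my assumptions for illustration, not Perfect Market’s implementation; the point is simply that crawlers and search clickers get the same page:

    # Illustrative sketch, not Perfect Market's code: send search crawlers and
    # visitors arriving from search results to the "search engine" version.
    from urllib.parse import urlparse
    from flask import Flask, redirect, request

    app = Flask(__name__)

    def is_search_traffic(user_agent, referrer):
        """True for search crawlers and humans clicking through from a search."""
        ua = (user_agent or "").lower()
        ref_host = urlparse(referrer or "").netloc.lower()
        return "googlebot" in ua or ref_host.endswith("google.com")

    @app.route("/news/<path:slug>")
    def article(slug):
        if is_search_traffic(request.headers.get("User-Agent"), request.referrer):
            # 301 so link credit consolidates on the indexed version
            return redirect("http://articles.example.com/%s" % slug, code=301)
        return "Normal version of %s" % slug  # direct visitors still see this

Since Googlebot and anyone arriving from a Google result both land on the same “search engine” page, the crawler never sees content that search clickers don’t, which is the distinction Google’s email below draws.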

Perfect Market said it has explained the situation to its Google AdSense representative, who in turn consulted with a “search specialist” at Google, and things are apparently fine. From the email Perfect Market received:

What you want to do is acceptable and is in line with Google’s guidelines. You can redirect Googlebot (and visitors from organic search) to articles.latimes.com and continue to send direct traffic to www.latimes.com. This isn’t considered deceptive, since Googlebot and users have the same experience.

I can’t tell if that search specialist was with Google’s actual web spam team or instead someone within AdSense (Google’s division for publishers who carry ads) who is knowledgeable about Google’s rules. Matt Cutts, who’s the head of Google’s spam team, was away on vacation when I wrote this and so unable to look at it in more depth. When I get an update from him, I’ll postscript it here.

Perfect Market also said that should Google determine there’s an issue in the future, it can still manage to preserve the link credit from “normal” versions to support the “search engine” versions in another way.

The company also said that should a client stop using its system, it’s easy to redirect all the indexed stories back to their original versions (the articles typically live on their own site, such as articles.latimes.com rather than latimes.com).

Finding The Balance

Clearly, Perfect Market’s system offers some advantages to publishers, but it also comes with some decisions. Do you really assume that no “drive by” visitors can be converted, or should you strike more of a balance with other contextual elements? If you’re excluding links to minimize non-ad clicks, are you gaining that much, given the potential loss of context that links can provide to a story?

These and other questions apply to anyone who embarks on a strategy of effectively maintaining two sites, one designed for search visitors and one for all others, regardless of whether they use Perfect Market’s system. And as Blue notes, implementation shouldn’t be seen as solely a business issue.

“While links within my articles were restored, I think it’s important to remember that when Perfect Market was implemented on all SFGate content, the value of keeping links in articles was either not important or not understood to the decision makers at Hearst. This shows a big disconnect between business and editorial; it’s safe to say that the brands buying the Perfect Market product are either not aware or are unconcerned about online content’s fundamentals,” said Blue.

For its part, Perfect Market agrees that there’s more that can be done.

“Perfect Market helps forward-thinking publishers generate new revenue from search traffic, something they haven’t done very well to date. Our approach is designed to evolve user experience and grow revenues. We continue to innovate in both areas based on actionable data and feedback,” Schoenfeld said.

One change has been to further evolve its story templates since earlier this year. Social sharing buttons are now sometimes part of what’s shown within a story. The company is also open to more customized formatting.

As for comments:

“We would be happy to include comments. We just have a hard time getting comments into our system now. But comments will be coming,” Schoenfeld said.



About The Author: Danny Sullivan is a Founding Editor of Search Engine Land. He’s a widely cited authority on search engines and search marketing issues who has covered the space since 1996. Danny also serves as Chief Content Officer for Third Door Media, which publishes Search Engine Land and produces the SMX: Search Marketing Expo conference series. He has a personal blog called Daggle (and keeps his disclosures page there). He can be found on Facebook, Google+ and microblogs on Twitter as @dannysullivan.

Comments
  • http://searchology.co.uk searchology.co.uk

    Really a thoughtful article. I wonder how much small news businesses can benefit from all this.

  • http://trafficcoleman.com/blog/ TrafficColeman

    My point of view is that online news is really hurting the offline papers. People can now log on and get the latest news instantly.

    Also, if I’m not mistaken, isn’t the New York Times beginning to charge for online subscriptions?

    TrafficColeman “Signing Off”

  • sc

    Raises all kinds of interesting questions about the atomization of content, the intelligent reconfiguration of it for search, performance data in newsrooms and offers a compelling approach to figuring it all out. Good piece.

  • srchgrrl

    Great article about how newspapers can leverage the power of search.

    In the search monetization industry, it’s a given that showing relevant advertising to a user with active search intent is so much more powerful than to the typical content browser with no active signal for what they might be interested in.

    It’s great that companies such as Perfect Market are bringing that kind of power to an industry that truly needs it.

  • bc

    It’s fantastic that Perfect Market is bringing this technology to newsrooms. They’re optimizing on the front creation side as well as pulling actionable insights out of the back-end stats, so I don’t doubt the 20x quote.

    The real question, though, is in content creation. How many reporters and desks can the newspapers fund with this revenue? I’m guessing quite a few, and that’s a really good thing.

  • Tim Ruder

    Having worked in online news since 1995 at sites like washingtonpost.com and latimes.com, I know firsthand that the challenge of generating revenue from search traffic is real and problematic. Publishers are not exaggerating when they say things like “this is traffic that’s not being monetized” (James Moroney from Belo) or that removing pages from the Google index won’t have a big impact on revenues (Jonathan Miller from News Corp.).

    I also know that the solutions many publishers have talked about (like removing content from Google indexes in favor of pay walls) would only ensure that their publications will be left out of the search economy altogether.

    There have been many calls for what better solutions might look like, including from folks like Eric Schmidt (“innovate”). Perfect Market’s response is different from these in three ways: it is actually implemented and working; it addresses the revenue side of the equation directly by focusing on reader intent; and it has built-in tools for experimentation and innovation.

    I joined Perfect Market to bring solutions to the industry and I’m confident that the Perfect Market offerings will help news publishers have a more active role in the search economy and keep quality journalism open, accessible and financially viable.

  • http://www.inlander.com Inlander-Spokane

    I must say that I’m confused about the necessity of creating two completely different versions of each piece of content as appears to be done in the Perfect Market approach. Would it not be more appropriate to have the site maintain the copy of the content in a context-free way and then have the styles applied to the page in one of two ways dependent upon the referencing source of each visit to the site?

    It seems as though this would be immensely less prone to grammatical and styling issues during these “import” processes and would remove any potential gray area in providing Google and other search engines with modified linking.

    Could someone give an example of any possible advantages in maintaining two completely different versions of content within a site? Or perhaps get me up to speed on drawbacks or shortcomings to a page that maintains two referral-based styling methods?

    Much of my efforts of the last few years have been directed at removing the duplication of efforts in providing content to the public; whether it is online or in the paper. This would seem to be a step in the opposite direction.

  • http://perfectmarket.com jaybudzik

    @Inlander-Spokane: If the Perfect Market program were only a matter of styling and re-arranging the page, your suggestion would work really well. But the program needs to do far more than re-format the page — every story is analyzed so that the most relevant and appropriate ads and content links can be presented to the search user. Separate infrastructure has made for a much easier implementation for our customers.

  • Winooski

    Danny wrote, “Matt Cutts, who’s the head of Google’s spam team, was away on vacation when I wrote this and so unable to look at it in more depth. When I get an update from him, I’ll postscript.”

    That was almost a month ago. I’m wondering if there’s been any follow-up with Matt Cutts. The issue is whether the Google AdSense representative (via a consultation with a “search specialist” at Google) who gave a blessing to Perfect Market’s subdomain pseudo-cloaking was correct, or whether Perfect Market’s system, as implemented for LATimes.com, runs a real risk of cloaking penalization.
