• http://searchology.co.uk searchology.co.uk

    A really thoughtful article. I wonder how much small news businesses can benefit from this situation.

  • http://trafficcoleman.com/blog/ TrafficColeman

    My point of view is that online news is really hurting the offline papers. People can now log on and get the latest news instantly.

    Also, if I’m not mistaken, isn’t the New York Times beginning to charge for online subscriptions?

    TrafficColeman “Signing Off”

  • sc

    Raises all kinds of interesting questions about the atomization of content, its intelligent reconfiguration for search, and performance data in newsrooms, and offers a compelling approach to figuring it all out. Good piece.

  • srchgrrl

    Great article about how newspapers can leverage the power of search.

    In the search monetization industry, it’s a given that showing relevant advertising to a user with active search intent is far more powerful than showing it to the typical content browser, who gives no active signal of what they might be interested in.

    It’s great that companies such as Perfect Market are bringing that kind of power to an industry that truly needs it.

  • bc

    It’s fantastic that Perfect Market is bringing this technology to newsrooms. They’re optimizing the content-creation side up front as well as pulling actionable insights out of the back-end stats, so I don’t doubt the 20x quote.

    The real question, though, is in content creation. How many reporters and desks can the newspapers fund with this revenue? I’m guessing quite a few, and that’s a really good thing.

  • Tim Ruder

    Having worked in online news since 1995 at sites like washingtonpost.com and latimes.com, I know firsthand that the challenge of generating revenue from search traffic is real and problematic. Publishers are not exaggerating when they say things like “this is traffic that’s not being monetized” (James Moroney from Belo) or that removing pages from the Google index won’t have a big impact on revenues (Jonathan Miller from News Corp.).

    I also know that the solutions many publishers have talked about (like removing content from Google indexes in favor of pay walls) would only ensure that their publications will be left out of the search economy altogether.

    There have been many calls for what better solutions might look like, including from folks like Eric Schmidt (“innovate”). Perfect Market’s response is different from these in three ways: it is actually implemented and working; it addresses the revenue side of the equation directly by focusing on reader intent; and it has built-in tools for experimentation and innovation.

    I joined Perfect Market to bring solutions to the industry and I’m confident that the Perfect Market offerings will help news publishers have a more active role in the search economy and keep quality journalism open, accessible and financially viable.

  • http://www.inlander.com Inlander-Spokane

    I must say I’m confused about the necessity of creating two completely different versions of each piece of content, as the Perfect Market approach appears to do. Would it not be more appropriate for the site to maintain a single, context-free copy of the content, and then apply one of two styles to the page depending on the referring source of each visit?

    It seems as though this would be far less prone to grammatical and styling issues during these “import” processes, and would remove any potential gray area in presenting Google and other search engines with modified links.

    Could someone give an example of any possible advantages to maintaining two completely different versions of content within a site? Or perhaps get me up to speed on the drawbacks or shortcomings of a page that uses referrer-based styling?

    Much of my effort over the last few years has been directed at removing duplication in providing content to the public, whether online or in the paper. This would seem to be a step in the opposite direction.
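    To sketch what I mean (the template names and the list of search engines here are just illustrative, not any particular system’s):

    ```python
    # Illustrative sketch: one canonical copy of the story is kept, and only
    # the page template varies with the visit's referring source.
    # Template names and the referrer list are hypothetical.

    SEARCH_REFERRERS = ("google.", "bing.", "yahoo.")

    def pick_template(referrer: str) -> str:
        """Choose a layout for the same underlying content based on referrer."""
        # Extract the host portion of the referrer URL, e.g. "www.google.com".
        host = referrer.split("//")[-1].split("/")[0].lower()
        if any(engine in host for engine in SEARCH_REFERRERS):
            return "search_landing.html"   # pared-down layout for search visitors
        return "standard_article.html"     # full site chrome for everyone else

    # The content itself never changes; only the styling applied to it does.
    ```
    
    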

  • http://perfectmarket.com jaybudzik

    @Inlander-Spokane: If the Perfect Market program were only a matter of styling and re-arranging the page, your suggestion would work really well. But the program needs to do far more than re-format the page — every story is analyzed so that the most relevant and appropriate ads and content links can be presented to the search user. Separate infrastructure has made for a much easier implementation for our customers.

  • Winooski

    Danny wrote, “Matt Cutts, who’s the head of Google’s spam team, was away on vacation when I wrote this and so unable to look at it in more depth. When I get an update from him, I’ll postscript.”

    That was almost a month ago. I’m wondering if there’s been any follow-up with Matt Cutts. The issue is whether the Google AdSense representative (via a consultation with a “search specialist” at Google) who gave a blessing to Perfect Market’s subdomain pseudo-cloaking was correct, or whether Perfect Market’s system, as implemented for LATimes.com, runs a real risk of cloaking penalization.