• Michael Gray

    Hypothetically speaking, if you wanted to do it, you could 301 the old URLs to the living URL once they've picked up their links/traffic (transferring all the link juice to the living URL) and then republish each old story under a new URL for archive purposes (a rough sketch follows at the end of this comment).

    Of course, this is a massive headache for the publisher and the CMS, and it would probably cut you out of any long-tail searches.

    Another option would be conditionally redirecting to the living URL based on the referrer … but, well, Google doesn't go for that either.
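
    For illustration, here's a rough sketch of that first idea in Python (Flask and all the URLs are placeholders of mine; in practice this would more likely be a rewrite rule in the web server's configuration):

        # Rough sketch only -- hypothetical URLs, Flask used purely for illustration.
        from flask import Flask, redirect

        app = Flask(__name__)

        # The "living" URL that keeps getting updated in place.
        LIVING_URL = "/california-fires-live.html"

        # An old, archived story URL that has already collected its links/traffic.
        # The original story text would then be republished under a fresh URL
        # purely for the archive.
        @app.route("/2009/08/29/station-fire-morning-update.html")
        def retired_story():
            # 301 = permanent: the old URL's accumulated link equity gets
            # consolidated onto the living URL over time.
            return redirect(LIVING_URL, code=301)

        if __name__ == "__main__":
            app.run()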

  • http://searchengineland.com Jonathan Hochman

    Newspapers are so wrapped up in their paradigm that they can’t see another way to present information. Take a look at the infobox on Wikipedia’s 2009 flu pandemic article. It’s updated daily by compiling info from a variety of sources. There’s another page with pretty color-coded maps of the outbreak. Why can’t those silly California newspapers set up live URLs, just for those stories that need them? Information should not move around when it can be updated in situ.

  • http://www.tag44.com tag44

    Thanks for the info shared, Danny; it’s really very resourceful and to the point.

  • http://www.brentdpayne.com BrentDPayne

    Danny,

    Great info. It’s actually something I emailed Google News and Google Search Quality about yesterday. I’ll forward you the thread (scary that we were on the same wavelength). Please don’t post the email publicly, though, as my relationships at Google are paramount.

    Furthermore, I spoke at SES London (see http://brentdpayne.com/presentations) about the insane process we have to go through in order to rank well in Google News without losing our rankings in Google Web. The process used to work extremely well. The slides in the presentation get quite complicated, so I’ll try to outline in a nutshell what used to work (and explain why I feel it isn’t working as well now).

    1. Watch Google Trends for a new keyphrase.
    2. Do a site: query for that new keyphrase.
    3. Find old content that ranks well for the new keyphrase.
    4. Make a copy of the old content and create a new URL for it (for editorial-history reasons).
    5. Create or locate a new story that is hyper-focused on the new keyphrase you found in Google Trends.
    6. WAIT for Google News to index the new story. (Important to get past the duplicate-content filter; it only takes Trib sites about 15 minutes to get indexed in Google News.)
    7. If the URL doesn’t get indexed by Google News within 30 minutes, build the story on a new URL and change the title tag, H1, H2, and first paragraph to something different. Wait until it is indexed in Google News.
    8. 301 redirect the old story’s URL to the freshest content on the topic.
    9. Repeat #8 on as many old stories as possible (see the sketch after this list).
    10. Link to the old URLs you 301’d from your homepage (or anywhere Googlebot will recrawl them quickly) so your PageRank gets redistributed to the new story.
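
    To make steps 8–10 concrete, here is a rough sketch of the kind of redirect map involved (the URLs are made up and Flask is used purely for illustration; in practice this would more likely live in the web server’s rewrite rules than in application code):

        # Rough sketch of steps 8-10 -- every URL below is hypothetical.
        # Each old story URL 301s to the freshest story on the topic; the old
        # URLs are then linked from the homepage so Googlebot recrawls them
        # quickly and redistributes their PageRank.
        from flask import Flask, abort, redirect

        app = Flask(__name__)

        # Old story URL -> freshest story on the same topic (step 8, repeated per step 9).
        REDIRECTS = {
            "/news/2009/08/28/station-fire-evacuations.html":
                "/news/2009/08/31/station-fire-latest.html",
            "/news/2009/08/29/station-fire-mount-wilson.html":
                "/news/2009/08/31/station-fire-latest.html",
        }

        @app.route("/<path:old_path>")
        def maybe_redirect(old_path):
            target = REDIRECTS.get("/" + old_path)
            if target:
                return redirect(target, code=301)  # permanent: passes the old URL's equity
            abort(404)

        if __name__ == "__main__":
            app.run()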

    BUT . . . here’s the problem: Google Web is taking longer to redistribute the PageRank. It doesn’t happen in a matter of minutes anymore. Thus you are left with the choice of either doing well in Google News or doing well in Google Web. I am not sure whether this was a change made to stop exactly the kind of thing I describe above to rank well in both, but 301s have changed. (Sidebar: I also feel that PageRank isn’t passing unless the title tag and content are similar between the two pages.)

    As for Google News recrawling URLs … I am seeing this on RARE occasions, but not nearly often enough.

    Here’s an example:
    http://www.google.com/search?q=site%3Ahttp%3A%2F%2Fwww.sun-sentinel.com%2Fnews%2Fnationworld%2Fsns-ap-tropical-weather%2C0%2C290319.story&hl=en&ie=UTF-8&rlz=1B3GGGL_enUS337US337&tab=nw

    Versus Google News not having it in its index at all:
    http://news.google.com/news?q=site%3Ahttp%3A%2F%2Fwww.sun-sentinel.com%2Fnews%2Fnationworld%2Fsns-ap-tropical-weather%2C0%2C290319.story&hl=en&rlz=1B3GGGL_enUS337US337&um=1&ie=UTF-8&sa=N&tab=wn

    I could list hundreds if not thousands of these but . . .

    So, why isn’t LAT doing what I listed out above? Because it’s a pain in the ass to do it. They are journalists. They are focused on the news. Furthermore, LAT gets pretty grumpy when I wipe out their old stories to redirect them to a new story.

    Why don’t we link instead? We do on occasion, but it doesn’t work nearly as well. It worked great during the election and inauguration but doesn’t seem to work as well now. Why? Mostly it has to do with PageRank sculpting. I used to be able to wipe all but a few links on our story pages and other pages of our sites, and that would increase the power of the remaining links many times over, but . . . that doesn’t seem to work as well anymore (though it could be other factors; I’m getting the tools back to test it).

    As a note to help with the insanity of trying to update ALL the old stories: you can use a single URL like /latest-california-fires.html and have ALL your stories always point to that location. Then, when a new story comes out that you want to point to, just go into your server configuration and update the target of that redirect to the story you want. This lets you do a broad sweep of all URLs and change where they redirect quite easily. It does create yet another 301 hop, though, so you may want to be cautious about that. I’ve been doing this for over a year, and Google doesn’t seem to have made any changes to kill it (yet).
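
    A rough sketch of that trick, with made-up names and Flask standing in for what would really be a one-line change in the server configuration:

        # Rough sketch of the single evergreen URL -- file name and URLs are hypothetical.
        # Every story links to /latest-california-fires.html; that URL 301s to
        # whatever the current freshest story is, read from a one-line file an
        # editor can update without touching any page templates.
        from pathlib import Path

        from flask import Flask, redirect

        app = Flask(__name__)

        # e.g. the file contains: /news/2009/09/01/station-fire-tuesday.html
        TARGET_FILE = Path("latest_fire_story.txt")

        @app.route("/latest-california-fires.html")
        def latest_fires():
            target = TARGET_FILE.read_text().strip()
            # Caveat from above: this adds one more 301 hop on top of any
            # story-level redirects.
            return redirect(target, code=301)

        if __name__ == "__main__":
            app.run()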

    HuffingtonPost.com has this process of handling breaking and developing news (as well as historical information for the user) down pat. I like their system a lot.

    I’ll fire off the email. Again, don’t share it but it may help answer a few questions on this topic that you can springboard off of.

    Now I need to get ready for work. ;-)

    Again, great article and great critique. It’s gotten a lot of buzz within LAT. They listen to you (not that they don’t listen to me). Of course, it helps that you used to be an LAT reporter too. ;-)

  • vickiporter

    Thank you, Danny, for this meaty article. Much to chew on here.

    My professional perspective is from the user/content/usability side, so please forgive my technical naïveté.

    I have been trying to help my friend whose cabin is in Millard Canyon. Even though she is well plugged in to her community, she has precious little information to go on. Given my background in Web matters, I thought I would be able to get pinpoint info for her, but nooooo, as you have so well pointed out.

    It seems to me that disaster-related content may need special treatment. It really can be a matter of life and death, as we are seeing with the LA fires. Specifically, the country is set up with this whole Incident Command System. However, they seem not to have thought about social media and are still in the twice-daily press conference/top-down model. So my first point is that after this crisis is over, can we find a way to help the governmental agencies modernize their approach to Incident Command communications?

    Second, people who are rattled by being involved in a disaster need REALLY SIMPLE ways to find out vital information. All this Twittering and searching and going to various news sites just isn’t realistic. Even if it were, the information is too diffuse to be of much use.

    Disasters are always very geographically centered, aren’t they? Why not think of the interface, then, as a map? One could filter info by time and by topic (only those relating directly to the disaster, e.g., burn area, evacuation area, supplies, etc.).

    The LA Times map is a step in the right direction, though it is hopelessly out of date. Point to the Millard Canyon icon and the info is from Aug. 28, saying the place is threatened by flames. Shoot! We can do better than that, surely!

    If there can be a Unified Command system, surely we can figure out a Unified Disaster Communications plan!

    Thanks again for this forum.

    Vicki Porter