Maps & Metros: Surviving And Thriving In Local Search


Last week I attended the SMX Advanced session, “Maps & Metros: Surviving And Thriving In Local Search,” moderated by Matt McGee with speakers Mike Blumenthal, Mary Bowling and Mike Ramsey.

It’s extremely challenging right now to recommend new “advanced” SEO methods for local search within Google (this is my take, prior to going into this session). Things in Google have been in flux, transitioning from the old Google Places to Google+ Local pages, and changes have been unpredictable and frequent.

Also, the local algorithms haven’t seen many outright additions; rather, the key ranking factors have had some weighting changes, resulting in relatively minor juggling of rankings, often to a fairly marginal degree.

Mike Blumenthal, Mike Ramsey and Mary Bowling spoke on local search at SMX Advanced, 2013.

Mike Blumenthal spoke first, on “The Anatomy of Local Search Results.” Mike calls himself an “Idiot Savant of Local,” which I think is both amusingly self-deprecating and a definite understatement.

He said that after Google’s Venice algorithm update, about 67% of results were blended results vs. 23% Pack results. Now the ratio is 93% to 77%.

(Charts shown: “Depth of Blend in Local Search Query Results” and “Blend of Depth by City.”)

Mike has asked himself, “How do they rank things in blended results?” He distinguishes “Pure Organic” results vs. “Pure Local Algo” results. He used AOL/Earthlink results because they license Google search results minus the local content. He used this Dr. Pete tip to form the search URLs for his research: /search?q=Honda+dealers&start=1

He then compared Google with the AOL version results and the Map results to try to understand what bits of search are due to the local algorithm influence.
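Mike’s comparison method boils down to fetching the same query from both engines, page by page, and diffing the result sets. As a rough sketch of the Dr. Pete URL trick described above (the base hosts here are my own assumption, not from the session):

```python
from urllib.parse import urlencode

def search_url(base, query, start=1):
    """Build a paginated search URL of the form /search?q=...&start=N,
    as in the Dr. Pete tip (/search?q=Honda+dealers&start=1)."""
    return "%s/search?%s" % (base.rstrip("/"), urlencode({"q": query, "start": start}))
```

Running the same path against Google and the AOL host, then diffing the two result lists, isolates the listings that the local algorithm injected.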

Mike finds that for some keyword searches like “restaurants,” all the results are pure map results.

Finding: Depth of Blend – researched by metro/market areas – he predicted more blended results over time. He believes that with some categories like restaurants, Google has forced more local results into the mix because restaurant sites have trouble competing with the likes of Yelp, Urbanspoon and Yellow Pages, and Google wants more direct local results to show up.

Tactic: Mine Links for Citations – upgrade people currently linking to add address/contact info and/or social annotations.

Tactic: Disrupt display ad competitors by using social to trump local in search results – get social annotations to show up in your local area for category searches to draw clicks.

Tactic: Outside of Search Area:

  • Use long tail categories to expand search radius
  • Extend radius of search by removing spammers
  • Open an office inside search radius
  • Do an SAB radius shift – shift the centroid of the service area over to center upon the important area of search.

Changes In Local

Mike showed the new version of Maps. In the new ranking there’s no A–F ordering; the pins all look the same, and results are heavily personalized. The new map list view is personalized, not pure map results.

The new carousel experiment – pictures across the top of the page – is based on existing data, so the ranking methods haven’t changed all that much.

Mary Bowling spoke next, covering NAP (an acronym for “Name, Address and Phone Number”). In David Mihm’s annual Local Search Ranking Factors survey last year, NAP was ranked the #3 most important factor.

Auditing NAP Info on Local Business Websites for Consistency

Google uses the local phone number as the unique identifier for a business. So why doesn’t everyone work on data cleansing to get NAP data consistent? She showed a case study of dedicated data cleaning over 90 days, which improved visibility for multiple terms and got her client into the 7-pack.

She emphasizes that you can’t just add new good listings; you have to delete old wrong ones. When you think you’re done, check and recheck until you’ve got the data clean across the local ecosystem.
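A minimal sketch of what “checking the data is clean” can look like programmatically: normalize each listing’s NAP and flag the ones that disagree with the majority. The normalization rules below are illustrative assumptions of mine, not Mary’s process.

```python
import re
from collections import Counter

def normalize_phone(phone):
    """Reduce a US phone number to its 10 digits so formatting
    differences ((212) vs 212- vs 212.) don't register as mismatches."""
    digits = re.sub(r"\D", "", phone)
    return digits[-10:] if len(digits) >= 10 else digits

def normalize_nap(name, address, phone):
    """Canonicalize a Name/Address/Phone triple for comparison."""
    clean = lambda s: re.sub(r"\s+", " ", s).strip().lower()
    addr = clean(address)
    # Expand a few common street abbreviations so "St." == "Street".
    for abbr, full in [("st.", "street"), ("ave.", "avenue"), ("rd.", "road")]:
        addr = addr.replace(abbr, full)
    return (clean(name), addr, normalize_phone(phone))

def find_inconsistent(listings):
    """Given (name, address, phone) tuples pulled from many directories,
    return the listings that disagree with the most common NAP."""
    normalized = [normalize_nap(*l) for l in listings]
    canonical, _ = Counter(normalized).most_common(1)[0]
    return [l for l, n in zip(listings, normalized) if n != canonical]
```

A real audit would pull these tuples from the directory sites she names (GetListed, Yext, Google Places) rather than assume the majority is correct, but the flag-the-outliers loop is the same.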

She recommended using the GetListed tool, Yext.com tool and the Google Places tool (phone lookup). If you have many locations, spot check 6-8 in large cities to verify their accuracy. Once you claim a listing you have to keep it updated.

Much of her advice was for agencies on scoping the necessary work and pricing it before work begins. She offered a checklist for getting info directly from the client, and recommends getting on the phone with them to do it.

Use Google to discover inconsistency: search Google Places for all phone numbers associated with the business, look for Google’s suggestions, and search Google for the business name, address and phone number.

Report any problems you find to Google as soon as possible. Google support has improved! You might even get a call-back from Google support within minutes, so try that.

She went on to say that the new Places database is much more stable. Google Maps pins are critical, so check to make sure the pin is showing in the right place for the street address.

Use Localeze, UBL, InfoGroup and Acxiom for bulk listings to manage distribution and correction of NAP across publisher websites.

Mary’s process when working on client sites:

  • Find, check and correct anything confusing on company websites; use schema for structured data.
  • Update or submit listings at Tier 2 local directories – trusted industry directories, trusted local directories, local phone providers, Yahoo/Bing, etc.
  • Scrub your data clean using the Whitespark tool; prioritize updating and submission. Don’t bother updating on low quality sites.
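On the schema point in the first step: the usual approach is a schema.org LocalBusiness JSON-LD block that mirrors the NAP shown on the page. A sketch that generates one (the field values below are placeholders, not a real business):

```python
import json

def local_business_jsonld(name, street, city, region, postal, phone):
    """Build a minimal schema.org LocalBusiness JSON-LD snippet that
    mirrors the NAP displayed on a location page."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal,
        },
    }
    # Wrap in the script tag a location page would embed in its <head> or <body>.
    return '<script type="application/ld+json">%s</script>' % json.dumps(data, indent=2)
```

The key discipline is that the markup must match the visible NAP exactly, or it adds one more inconsistent citation to clean up later.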

How long do updates take? See the GetListed chart, which shows the speed of data updates at major local directory sites. Establish a baseline and compare it to measure success; give yourself 3-6 months to see the impact of changes.

Mike Ramsey offers “Some Nifty Tips For a Local Audit.”

Google+ Local Rankings: David Mihm’s 4 equal ranking factors are website, links, listings, reviews.

For keyword research, he recommends the Local Marketing Source Local Keyword Tool. Enter keyword, specify ZIP Code and radius, include cities/zip, etc.

Also take a look at Blumenthal’s Category Discovery Tool. New Google+ dashboard doesn’t allow for custom categories.

Use Google Trends for terms like “plumber,” then set the location to see how the term performs by city. Google’s new Keyword Planner tool, inside Google AdWords, can show you more info using geographic targeting, and it provides similar/related search terms. Should you use plural or singular, and should the place name come before or after the keyword? Do the research to figure out which keyword sequence is best.

Use Google Analytics segments to sort data: get a list of keywords for a particular state, discover keywords for expanding content, and target the additional areas indicated with matching landing pages.

Use Screaming Frog to find errors in title tags and meta tags across a site. Google now tells people to brand their titles by adding the brand name, and is often rewriting titles to do this; Mike recommends the same. Pro tip: add your phone number in meta descriptions to enable click-to-call in Skype and mobile apps. This may improve CTR, too. It may mean less traffic, but more phone calls!

When it comes to local, most content sucks. The reason is that most people duplicate content, making one version for every location and then swapping out the local keyword. A new tool to find duplicate content across a site: Siteliner. Example: www.anytimefitness.com.

Fix your website before adding markup. Use the Structured Data Testing Tool to test a location page – an easy way to check schema across sites. Also check reviews. Some say you shouldn’t mark up reviews and testimonials on your own site, but it works in many cases; Google’s guidelines are a bit confusing on this. His example used a review from Yelp that might have lent credibility because of the association.

The Schema Creator from Raven Tools is an easy way to create schema code.

Check out Local Site Score at Nifty Marketing and the Anatomy of an Optimal Local Landing Page infographics.

He compares a Google+ Local listing, non-upgraded vs. upgraded. He says there is a new dashboard rolling out for Places Plus Business, offering ten category choices instead of five, with new options for service area businesses. His advice – wait until you’re upgraded to the new dashboard to upgrade your listing.

For troubleshooting with Google Places, go to: bitly.com/placeshelp. Also try BrightLocal’s G+ Local Wizard.

Also, use CognitiveSEO to see what percentage of links are branded vs. keyword anchors; it also shows trend lines for link discovery and the types of linking sites (blogs, forums, sites).

For reviews, use GetListed to find reviews about your business, and use Review Push for ongoing monitoring. Don Campbell’s new tool, “Get Five Stars,” will be released soon. He recommends making reviews part of every transaction.

Summary

I’m a connoisseur of Local Search tactics, so I’m always avid for new, interesting and advanced methods for achieving ranking performance improvements in Google and Bing local search results. Due to personalization and localization effects, and possibly due to Google’s ongoing focus on development of Google+ Local, there’s been little development that would allow for new or different methods for optimization from my perspective.

Because of this, it wasn’t all that surprising that a large amount of the session was focused upon the basic local optimization tactics for local search, such as making certain that NAP is consistent, cleaning up incorrect listing info, claiming and distributing listing info, performing keyword research and discovering new citation sources.

That being said, advanced local SEO right now involves hyper-focus upon these table-stakes activities, because most businesses actually ignore them or have a poor understanding of how to do them, and many simply don’t take them far enough. Even within the table-stakes, each of the presenters provided a few of the more advanced tactics and methods for pushing each activity far enough to achieve advantages in results.

That is where advanced local SEO resides for the moment – truly double checking your assumptions that you’ve been doing the basics sufficiently and correctly enough to give you the best chances of ranking well. Doing it more comprehensively and better than your competition.

Mike Blumenthal’s advanced tips for ranking better for service areas were particularly noteworthy to me, including adding more long-tail categories in order to match user queries that trigger an expanded search radius. Reducing competition in the SERPs by reporting spammers for removal might sound a bit like negative SEO, but Mike capably defended it by asserting that a business that spams or lies in the search results doesn’t deserve to be there for that query.

Opening another office inside search radius or relocating can increase your local query relevancy (a concept I’ve reported upon in the recent past), even if it’s obviously challenging for many businesses to accomplish. And, shifting the center of radius for a service area business to center upon the important area of search is simply a solid-gold idea!

Mary’s recommendation to use multiple tools for checking NAP consistency is one of my secrets for accomplishing good local SEO (damnit, Mary, why’d you let the secret out!).

Her suggestions for using GetListed, the Yext.com tool, the Google Places search tool and Whitespark to find incorrect or outdated listings of your business that need updating with correct, consistent data help align the local search ranking “juice” to push your listings up a few notches in the SERPs. This won’t strike people as sexy as some clever technical trick, but, frankly, it is the basic bedrock of local SEO from my perspective, and will be for quite some time into the future.

Finally, Mike’s presentation rounded up the whole session very nicely with what was very nearly the granddaddy of all local SEO tools lists. Many local search marketers I have run across are not very sure where to look in order to analyze their sites and online listings for areas for improvement, and if you want to get up to speed very rapidly, you could leverage the strengths of these tools to perform very robust local audits.

Of particular interest to me was the possibility of using Siteliner for analyzing a site’s possible dupes of Titles and Meta Descriptions – a very common problem. And, I will be interested in trying out GetListed, Review Push and Get Five Stars for monitoring and analyzing business reviews – an area of local search that is persistently challenging for small businesses to attempt to improve upon.

Overall, I was impressed with the advanced local SEO tactics presented in the session. If you were watching for them, there were indeed nuggets of gold to be found.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About the author

Chris Silver Smith
Contributor
Chris Smith is President of Argent Media, and serves on advisory boards for Universal Business Listing and FindLaw. Follow him @si1very on Twitter and see more of his writing on reputation management on MarTech.
