As search engines rapidly move to offer more locally focused search results, those managing large global, single-domain sites are finding it increasingly difficult to get those sites indexed, detected as local, and ranked in local search results.
This is especially true with Google in English- and Spanish-speaking countries, where it is becoming difficult to find “global company” sites and products in search results outside of the US.
I am seeing an increase in social media posts from searchers in markets like Australia, Singapore and India expressing frustration that they can’t locate companies outside of their markets unless they know the specific company or product name. A few even went so far as to say they would stop using Google if this trend continued.
One of these comments was in response to a localization article I had written. An Australian posted that he was trying to use Google to find a high-end fly fishing rod from a specific US company, namely Orvis. He had read about their product in a magazine but could not remember the name of the company. No matter how he searched he could only locate Australian companies. He further indicated that this was becoming more common with many of the searches he had been doing. He was frustrated since global ecommerce had become the gateway to broader product choices not available in Australia.
On the other side are companies that had enjoyed being one click away from customers around the world. In the past few weeks I have received a number of questions from frustrated webmasters and desperate site owners about how to get their sites back into Google’s local market search engines. These questions often bring more confusion, thanks to answers from “search experts” who are only guessing at how to fix the problem, or from engines and local agencies simply suggesting paid search.
I am less concerned about the companies that have a single-language site and the arrogance to believe that everyone in the world needs their products. It is the truly global companies like Apple, IBM and Nokia, which sell in nearly every market around the world, that are being hurt by this shift. These companies either have hundreds of sites or one very large site segmented to accommodate the multitude of countries, languages and legal entities that make up their multinational existence.
A typical local content detection filter hierarchy
Previously, local content detection was only a critical problem for queries in languages used by multiple countries, such as English and Spanish. Recently, more problems are occurring with other languages like German, French and Arabic. For example, if you do a search in German, the primary result will be the best German language result that is also unique to the IP address of your location. If you enter a German language query from an IP address in a German-speaking region like Germany, Austria or Switzerland, you will see different results unique to your actual location. However, the same search done in the US or UK will simply return the most relevant German language result regardless of location.
It seems to me that most of the confusion is coming from search marketers struggling to understand how Google’s local filters work. Let me try to clarify how they work and what companies should be doing to ensure they get the access to those markets they deserve.
Country code top level domain (ccTLD). The “cc” in “ccTLD” means country code and is based on the ISO 3166-1 standard, which specifies a two-letter code for every country in the world. If a site has a ccTLD like .co.nz, the engines will assume the content is for New Zealand. The same is true for someone using a .us domain: content on that domain would be viewed as a United States-centric site.
That is why most search experts will tell you to use the top-level domain for a particular country if you want a local site. Unfortunately, for both large and small companies, the expense and operational overhead of maintaining multiple sites with unique ccTLDs is often prohibitive.
Site server IP address location. When a site uses a generic top-level domain such as .com or .net and is not associated with a country via a ccTLD, the engines will use the IP address of the server hosting the site as a proxy for local designation.
This means someone with a .com address hosted in Germany would be considered primarily a German site. This is currently the main problem with multinational companies that use .com and then subdirectories for their country sites such as www.ibm.com/uk.
Note that if you are using a ccTLD, Google does not factor in the server IP location as a secondary factor. This was recently confirmed in a Google Webmaster blog Q&A post on local site detection. The Google engineer’s answer caused a fair bit of confusion among those who did not read his response correctly. It is logical that if you are using a .fr domain, the site content should be primarily associated with France, and the location of the server would not be necessary.
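The detection hierarchy described above can be sketched in a few lines of Python. This is only an illustration of the logic the article outlines, not Google’s actual algorithm; the country mappings are a small hand-picked sample rather than the full ISO 3166-1 list, and the server’s country is assumed to be already known (e.g. from a geo-IP lookup).

```python
# Illustrative sketch of the local-detection hierarchy: a ccTLD decides the
# country outright, and only generic TLDs fall back to the server's IP
# location. Sample mappings only -- not the full ISO 3166-1 list.
CCTLD_COUNTRIES = {"fr": "France", "de": "Germany", "nz": "New Zealand"}
GENERIC_TLDS = {"com", "net", "org"}

def detect_country(hostname, server_ip_country):
    tld = hostname.lower().rstrip(".").split(".")[-1]
    if tld in CCTLD_COUNTRIES:
        return CCTLD_COUNTRIES[tld]   # ccTLD wins; server location is ignored
    if tld in GENERIC_TLDS:
        return server_ip_country      # generic TLD: server IP is the proxy
    return None                       # unknown TLD: no local designation

print(detect_country("www.example.fr", "United States"))  # France
print(detect_country("www.ibm.com", "Germany"))           # Germany
```

Note how the second call mirrors the .com-hosted-in-Germany scenario: the server location alone drives the designation once the TLD carries no country signal.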
The Google geographical targeting tool
A few years ago, when Google launched this tool, it saw a significant reduction in complaints from webmasters that they were not adequately represented in local search engines, especially where they had a major market or physical presence. I have talked to a number of large and small companies in the past few months that either were not aware of this feature or had it set incorrectly.
To use it, simply create a Webmaster Tools account, then set up a sub account for each country. Go into the settings for each sub account and set the subdirectory location on the site to the appropriate country. For example, set .com/nz equal to New Zealand and everything in and under that /nz directory will be given the same weight as content hosted on a .co.nz domain.
Geographical proxy servers and local market links
I am seeing more chatter about using local market proxy servers as a way to work around ccTLDs and local hosting. This has been an option, but it seems to be working less effectively in the past few months. Even with proxy servers you are still essentially hosting locally, but only a cached version of the site, which is often not an option for an ecommerce company.
I have also seen an increase in the promotion of link building services promising lots of links from ccTLD sites or sites hosted in a local market. The idea is to make the engines think you must be locally relevant because local sites are linking to you. Be careful with this type of service: many of the links will not come from authoritative sites, which seem to be the only ones that transfer local market link value.
How big is the potential problem?
Last year I reviewed the domain structures of the Global 1000 companies and found that 85 percent were using .com/cc, with a few even using cc.domain.com for their local sites. Further review of the IP locations found that many were hosted in the US or other central hubs, with no company hosting 100 percent of its country sites in the local markets.
We have always thought of our websites as “global” since they are just a search result click away from a searcher in any country around the world. Unfortunately, that definition may not hold much longer, since a multinational’s global site might actually be relegated to a single country without the company even knowing it.
I can only see this problem getting worse as the engines put greater emphasis on local content and as demand increases for location-based results on mobile devices. I suspect the engines will start to make further refinements to what is truly local. If you have a global site, or even a local site, it would make sense to monitor the inclusion and ranking performance of your local content so you can take appropriate action should Google not find your site globally relevant.
Opinions expressed in the article are those of the guest author and not necessarily Search Engine Land.