When Ignorance Isn’t Bliss: What You Don’t Know About Your Web Site Can Hurt You
It’s tough to be a small business in today’s fast-paced world. Small businesses not only have to know their core industry inside out, but now they have the additional burden of being proficient in online marketing. Since many small businesses have limited staff, most people within these companies wear multiple hats, from CEO to webmaster. Unfortunately, this often means the person responsible for the web site knows very little about it. Everything may seem to run flawlessly for a time, but then, when something goes wrong, they are left scrambling for help and at the mercy of their Internet service provider (ISP).
This is the first of two articles designed to raise the awareness of small business owners. Take heed: ignoring the little details associated with your web site can have a negative effect on your site and your business. Some problems affect the business mechanics; others affect your search positioning. The good news is, most are easy to fix once you’re aware of them. Being informed of potential challenges provides you with the opportunity to prepare and avoid serious consequences down the road.
If you wear the webmaster hat, you’ll want to carefully review the following list for some helpful information and tips. The more you know, the less time and money you’ll waste later on.
1. Your domain name is about to expire, and you don’t know it
Every domain name has at least three contacts associated with it: administrative, technical and registrant. When the domain name is about to expire, renewal notices are sent multiple times. Unfortunately, in many cases, the person whose email is on the account no longer works at your company. The notices are still sent, but since that email address is no longer valid, they go unread. What happens? Your domain name expires and your site “mysteriously” goes offline.
Your domain name is extremely important and worth protecting. Don’t assume everything is fine. Do a WHOIS search and find out the details on your account. You may discover the information is wrong, out of date, or not what you expected. More than one company has been shocked to find they didn’t actually own their domain name… and now they have to buy it.
If you are the rightful owner and your ownership lapses, you have a grace period of 30 to 60 days to renew (depending on the registrar). After that time, the domain name becomes available for anyone to purchase. Recapturing your domain name after it has been released and purchased can be an expensive process that often involves lawyers. The secondary domain market is a booming business. The players know the value of established domain names and fully intend to take advantage of them. Avoid this heartache by checking your company’s domain name status today and then registering it for a long period.
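If you'd rather not rely on renewal notices at all, a simple script can watch the expiry date for you. The sketch below is a minimal example: it assumes you've already pulled the expiration date from a WHOIS lookup (the date string and threshold are illustrative), and it simply computes how many days remain.

```python
from datetime import datetime, timezone

def days_until_expiry(expiry_str, now=None):
    """Return the number of whole days until a domain's expiry date.

    expiry_str is assumed to use a common WHOIS timestamp format,
    e.g. '2025-12-20T00:00:00Z'. Adjust the format string if your
    registrar's WHOIS output differs.
    """
    expiry = datetime.strptime(expiry_str, "%Y-%m-%dT%H:%M:%SZ")
    expiry = expiry.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expiry - now).days

# Warn when fewer than 60 days remain (the grace-period window above).
remaining = days_until_expiry("2025-12-20T00:00:00Z",
                              now=datetime(2025, 11, 1, tzinfo=timezone.utc))
if remaining < 60:
    print(f"Renew soon: {remaining} days left")
```

Run something like this on a schedule (and send the alert to a group address rather than one employee's inbox) and the "nobody saw the renewal notice" problem goes away.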
In an article entitled “How to Protect Your Domain Name,” fellow Small Is Beautiful columnist Matt McGee tells a true story of his experience helping a small business owner who lost his domain name.
2. Your robots.txt file has banished search engines from your site
This is one of those invisible problems that can kill your site with regard to rankings. To make matters worse, it can go on for months without anyone knowing there is a problem. I don’t want to sound like a doomsayer, but don’t assume your company is immune to this problem. We’ve even seen it happen to widely known and publicly traded businesses with a dedicated staff of IT experts.
There are numerous ways to accidentally alter your robots.txt file. Most often it occurs after a site update when the IT department rolls up files from a staging server to a live server. In these instances, the robots.txt file from the staging server is accidentally included in the upload. (A staging server is a separate server where new or revised web pages are tested prior to uploading to the live server. This server is generally excluded from search engine indexing on purpose to avoid duplicate content issues.)
If your robots.txt excludes your site from being indexed, your site will drop from the engines’ databases. You may think you did something wrong that got your site penalized or banned, but it’s actually your robots.txt file telling the engines to go away.
How do you tell what’s in your robots.txt file? The easiest way to view your robots.txt is to go to a browser and type your domain name followed by a slash then “robots.txt.” It will look something like this in the address bar: http://www.mydomainname.com/robots.txt.
If you get a 404 error page, don’t panic. The robots.txt file is actually an optional file. It is recommended by most engines but not required.
You have a problem if your robots.txt file says:

    User-agent: *
    Disallow: /

A robots.txt file that contains the above text is excluding ALL robots – including search engine robots – from indexing the ENTIRE site. If you have certain sections you don’t want indexed by the engines (such as an advertising section or your log files), you can selectively disallow them. A robots.txt that disallows the ads and logs directories would be written like this:

    User-agent: *
    Disallow: /ads/
    Disallow: /logs/
The disallow shown above only keeps the robots from indexing the directories listed. Some webmasters falsely think that if they disallow a directory in the robots.txt file that it protects the area from prying eyes. The robots.txt file only tells robots what to do, not people (and the standard is voluntary so only “polite” robots follow it). If certain files are confidential and you don’t want them seen by other people or competitors, they should be password protected.
At SES New York 2007, Danny Sullivan hosted a robots.txt summit where search engine representatives talked about the frequent misuse of the file and how webmasters accidentally excluded their sites from indexing. To learn more about the robots.txt file see http://www.robotstxt.org. Here’s something good to know: If you are using Google Webmaster Tools, Google will indicate which URLs are being restricted from indexing.
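If you want to test your rules before (or after) they go live, Python ships with a parser that applies the same logic the polite robots use. The snippet below is a sketch with illustrative rule sets and URLs; in practice you'd paste in the contents of your own live robots.txt file.

```python
from urllib.robotparser import RobotFileParser

# A file that bans every robot from the entire site:
banned = RobotFileParser()
banned.parse(["User-agent: *", "Disallow: /"])

# A file that only keeps robots out of the ads and logs directories:
selective = RobotFileParser()
selective.parse(["User-agent: *", "Disallow: /ads/", "Disallow: /logs/"])

# can_fetch() answers: may this user-agent crawl this URL?
print(banned.can_fetch("Googlebot", "http://www.example.com/index.html"))         # False
print(selective.can_fetch("Googlebot", "http://www.example.com/index.html"))      # True
print(selective.can_fetch("Googlebot", "http://www.example.com/ads/banner.html")) # False
```

A quick check like this after every site update would catch the "staging robots.txt went live" accident before the engines drop your pages.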
3. Your site is scaring your customers with expired SSL certificate notices
If you’re a small business conducting ecommerce, you’re probably familiar with Secure Sockets Layer (SSL) Certificates. These certificates enable encryption of sensitive information during online transactions. When the certificate is up to date, the technology protects your web site and lets customers know they can trust you. Sadly, many times the person who originally set up the certificate moves on. Because their email no longer works, the renewal notices fall by the wayside. So you plod along unaware of the lurking danger. Sales plummet and no one can determine why.
Finally, someone notices the “scary security messages” that appear when someone starts the checkout process. If you’re lucky, a customer will call and tell you about the problem. If you’re smart, you’ll have an employee periodically verify that your checkout process and SSL certificate are working properly.
To check your SSL certificate, visit a secure page on your site, then double-click the padlock icon in the bottom right corner of your browser. A window will pop up showing the SSL certification details, including the expiration date. If the certificate is set to expire within the next two to three weeks, you should begin working with your IT department or ISP to get the certificate renewed.
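This check can also be automated. The sketch below (the hostname is a placeholder, and the function needs network access when called) connects to a site, reads the certificate's "notAfter" date, and reports how many days remain:

```python
import socket
import ssl
from datetime import datetime, timezone

def cert_days_remaining(hostname, port=443):
    """Fetch a site's SSL certificate and return days until it expires.

    hostname is illustrative; call with your own secure domain.
    """
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            not_after = tls.getpeercert()["notAfter"]
    # Certificates use a format like 'Dec 20 23:59:59 2025 GMT'.
    expires = ssl.cert_time_to_seconds(not_after)
    return (expires - datetime.now(timezone.utc).timestamp()) / 86400

# The helper below converts that expiry format to epoch seconds:
print(ssl.cert_time_to_seconds("Dec 20 23:59:59 2025 GMT"))
```

Wire a call like `cert_days_remaining("www.yourdomain.com")` into a weekly cron job and you'll hear about the expiring certificate before your customers see the scary warnings.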
4. Your content management system (CMS) is limiting your search engine success
Search engine optimizers have a love-hate relationship with CMS. The CMS can make adding content to a site easy for the non-programmer, but often times the system is hostile toward search engines. A CMS that doesn’t allow unique titles, META tags, breadcrumbs, unique alt attributes, and other on-page optimization techniques can limit a site’s success. For more details, I highly recommend you read an article by my colleague, Stephan Spencer, on search-friendly content management systems.
5. When you changed domain names, your redirects were set up improperly
Google and other search engines will treat various types of redirects differently. To ensure that the current domain inherits all the link equity the old domain has earned, verify that your site utilizes “301 permanent” redirects rather than “302 temporary” redirects. These numbers are codes that your web server sends to browsers and search engine spiders telling them how to handle the web page. If your server tells the search engine spiders that the new location is only temporary, the search engines will ignore the redirection and not transfer the existing link equity to the new site.
To properly implement this, you need to ensure that every page of the old site is properly redirected to the corresponding new page. If the domain name changed, but the site architecture did not, then simply redirecting the old domain to the new is sufficient. If the page URLs changed as a part of a larger redesign, ensure that every page in the old site is properly (301) redirected to a page on the new domain.
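On an Apache server, the simple case above (same site architecture, new domain) often comes down to a few lines of configuration. The fragment below is a hypothetical .htaccess sketch, assuming Apache with mod_rewrite enabled and placeholder domain names; your ISP's setup may differ.

```apache
# Hypothetical .htaccess on the OLD domain's server.
# Sends every request to the same path on the new domain
# with a 301 (permanent) status, preserving link equity.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

The `R=301` flag is the crucial part: leave it off and Apache defaults to a 302 temporary redirect, which is exactly the mistake described above.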
Lisa Barone over on Bruce Clay’s blog wrote an excellent “non-scary” article on how to set up a 301 redirect that is easy for even the non-techie to follow. And Aaron Wall at SEObook has a detailed 301 case study on how well the different engines recognized and followed 301s on his site.
6. Your site is sharing an IP address with a spamming site
Many small businesses choose to use a virtual or shared hosting service rather than purchasing their own server. This arrangement is usually less expensive than dedicated hosting and meets the needs of the small business. In many cases a virtual hosting arrangement is fine, but keep in mind that the search engines pay attention to who your neighbors are on that shared server. Some sites have the unfortunate luck of being placed on a server with sites using known spam techniques. Since your site is on the same IP address as the spammy guys, your site may be unjustly penalized by the actions of other sites. In other words, you might be a victim of guilt by association.
Even if you are running your own dedicated server, there is a small chance you’ll face a similar issue. Dedicated servers are grouped into something called “Class C IP blocks.” Basically, all the IP addresses in a block are the same except for the last number. Frequently, the sites within a block are not all owned by the same company, so, in essence, while your site might be legit, there may be 253 other servers out there besmirching your site’s good name. If you are concerned about being in a bad hosting environment, ask your ISP for the names of the other sites being hosted on your IP address (in the case of shared hosting, more than one site may be served from the same IP address). Also ask them for the domain names of the other sites that differ only in the last number of their IP address.
7. You’ve got the overloaded server blues
Does your site take forever to load? If your page file size is reasonable and you have a fast browser connection, the problem may not be with your site, but with the server at the hosting company.
The hosting company may have too many sites hosted on one server. They may also have you on a server with a site that is extremely active and monopolizes the server resources. The overload can result in your server timing out when a request is made from a spider. If this condition is chronic, it could result in the engine thinking your site is down. That could result in your site being dropped from the index.
Another problem with a slow-loading site is that it can cost you business. Most web surfers are impatient. If they don’t see your site loading within a few seconds, they leave. They don’t care what the cause of the slow loading is; they simply move on.
If your ISP provides a service level agreement (SLA) regarding performance, uptime, etc., you are likely OK. Any provider that offers such a guarantee will have implemented procedures that make triggering those SLA thresholds unlikely. If your site is consistently sluggish, however, request that your site be moved to a new server. Note that this will cause some hiccups because your IP address will change, so make sure the server really is the source of the sluggish performance before requesting the switch. Consider moving to a higher class of service or a dedicated server. If your web site is a core part of your business, pay the marginal costs needed to improve the service.
8. Your site is broken on Firefox
During the “browser wars” of the late 1990s, it was important to check your site under multiple browsers (including browsers for Macs and Unix) because many times a site would “break” or render oddly under different browsers. As Internet Explorer (IE) achieved dominance, many IE-centric web designers thought of browser compatibility as an issue of the past because IE was very forgiving and would properly display even sloppily coded sites.
With the enthusiastic spread of the Firefox browser, the compatibility issue has reared its head again. Firefox is a more W3C-standards compliant browser so sites that look great under IE sometimes break under Firefox. Pages that use proprietary tags that only work under one browser (usually IE) or pages that contain syntax errors (especially unclosed tags or strange nesting) can cause a web page to render poorly in Firefox as well as Opera or other standards-compliant browsers.
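To make the "syntax errors" point concrete, here is a hypothetical fragment of the kind of sloppy markup that a forgiving browser quietly repairs but a standards-compliant one may render very differently:

```html
<!-- The <b> tag below is never closed, so everything after it
     may render bold; the misnested tags can also reflow the page
     differently from browser to browser. -->
<ul>
  <li>First special offer
  <li><b>Second special offer
</ul>
<p>This paragraph may inherit the unclosed bold.</p>
```

Running your pages through the free W3C markup validator (validator.w3.org) will flag unclosed and misnested tags like these before your visitors find them.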
In June 2007 a OneStat study on browser use reported that Firefox commanded a 19.65% share of the US browser market and 12.72% globally. If you’re in a high-tech industry, your percentage of visitors using Firefox may be even higher.
In parts of Europe, adoption of the Firefox browser is even higher, especially in Germany where the share is over 26% (that’s better than 1 in 4 visitors!). The browser wars get even more interesting when you consider that the most widespread browser in China is Maxthon, a browser of which most Westerners have never heard.
What this means to the small business webmaster is that you can’t ignore browser compatibility anymore, or you may be giving 20% of your visitors a bad experience.
Enough is enough
These are more than enough problems to keep you busy for now. Next time we’ll have a new batch of lesser-known issues small business webmasters might find lurking in the dark!
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.