What Is In Your Enterprise SEO Toolkit?
Enterprise SEO ain’t like the others. As I’ve said before, simply saying “I’m an enterprise SEO!!!” doesn’t make you one. You need different tools, different skills, and a whole different level of diplomatic kung-fu.
I lack diplomatic-fu. And skills are always open to debate. But the tools are forever. This is a very incomplete list of the tools you probably want to look at if you’re responsible for SEO in an organization so big you get lost finding the restroom. No favorites—if it’s on this list it’ll do the job nicely:
Rank Tracking
For the record, I do not support rank tracking as a success metric. But it’s a fantastic diagnostic tool, so you gotta do it. My favorite tools are:
Advanced Web Ranking. Great since day 1. AWR is especially useful if you’re focused on a small set of sites but want to drill really, really deep into the data. It comes in desktop and server flavors, lets you customize everything down to the color assigned to each keyword, and it’s dang speedy.
Authority Labs. Has an API and allows for massive tracking sets. It has had a few hiccups lately but remains pretty solid.
SEOMOZ Campaign Tracking. Lets you track up to 2,000 keywords, and of course dials into all the other nifty SEOMOZ tools. While it doesn’t have the customizable oomph of AWR (yet), SEOMOZ keeps building stuff out. Definitely worth a look.
Raven Tools. Also awesome, and pulls in all of Raven’s other cool tools. If you’re looking at SEOMOZ, look at Raven, too.
BrightEdge. Less well-known, BrightEdge tracks lots of rankings, but also includes some other competitive stuff you’ll want to look at. Super-powerful and super-tunable. Setup lag is annoying (you have to wait a few days after submitting a keyword set), but if you want big-time share of voice analysis, it’s the winner. It’s also by far the most expensive of the list.
Link Analysis
This is one of the few areas where I recommend having multiple tools. Link data differs from one tool to the next, and you need multiple data sources to get a good sampling.
Majestic SEO continues to be an industry standard, with a great API, and a huge dataset. Use it.
SEOMOZ is the other standard. Use it or you’re nuts.
ahrefs is the new kid on the block. They have some really cool stuff, like the ability to track new link acquisitions. If you’re paranoid about negative SEO, you want this.
Raven Tools will keep popping up on this list. Raven actually pulls its link data from Majestic SEO, then ties it to its other management tools, so it can consolidate the data for you.
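To see why multiple sources matter, here’s a minimal sketch of consolidating linking-domain exports yourself. The CSV layouts and column names below are invented stand-ins; real exports from each tool will differ, so adjust accordingly:

```python
import csv
from io import StringIO

def linking_domains(csv_text, domain_column):
    """Parse a backlink export and return the set of linking root domains."""
    return {row[domain_column].strip().lower()
            for row in csv.DictReader(StringIO(csv_text))}

# Toy stand-ins for real export files; column names vary by tool.
majestic_csv = "Domain,Backlinks\nexample.com,40\nBlog.example.org,3\n"
moz_csv = "Root Domain,Links\nexample.com,35\nnews.example.net,9\n"

majestic = linking_domains(majestic_csv, "Domain")
moz = linking_domains(moz_csv, "Root Domain")

union = majestic | moz   # every domain either tool found
gaps = majestic ^ moz    # domains only one tool reported
```

The `gaps` set is the interesting part: it shows you exactly how much of the link graph each individual tool is missing.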
Statistical Analysis
If you think links and rankings are all you need to track, keep your resume handy. You won’t last long in a serious enterprise. Industrial-strength number-crunching is a must if you want to correlate, say, your last content campaign to leads and sales.
Microsoft Excel is universally accepted and used; it needs no introduction. I still use Excel for 99% of my analytical needs, and chances are it’ll handle it all for you, too. Until you’ve exploited all the tools it contains, don’t bother with the next two, because you won’t get much out of ’em.
If you’re on Linux, consider OpenOffice, or R (below), or any number of other toolsets that get increasingly complex and powerful.
That said, R is an open-source, free statistical engine and programming language that will let you analyze huge datasets. Companies like Revolution Analytics and StatSoft build add-ons and front ends to make it a bit easier and scale it, too. There’s a pretty steep learning curve, though, so only pick up R if you know you’ll be doing some industrial-strength stuff.
SAS is the only other ‘pro’ tool I’ve used. It’s really, really slick, but again has quite the learning curve. It’s especially useful if you want to create interactive reports based on super-advanced statistical black belt-edness.
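The “correlate a campaign to leads” kind of check is simple enough to sketch without any of these tools. Here’s a plain-Python Pearson correlation over invented weekly numbers (organic visits vs. leads after a content push); Excel’s CORREL and R’s cor() do exactly this:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up weekly numbers -- substitute your own analytics and CRM exports.
visits = [1200, 1350, 1100, 1600, 1800, 1750]
leads = [30, 33, 27, 41, 45, 44]
r = pearson(visits, leads)  # close to 1.0 means the series move together
```

Remember that correlation isn’t causation; a high `r` is a reason to dig deeper, not a victory lap.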
Data Visualization & Charting
This category includes structural stuff, like sitemaps, and data presentation.
First off, Excel, R and SAS will all do the job. I’ve rarely seen visualizations I couldn’t do in Excel.
But other tools will let you customize and prettify in ways you otherwise could not:
OmniGraffle is the standard for Mac charting. If you’re on OS X, get it. You can do all manner of relationship diagrams, including site maps, and import outlines or XML to automate a lot of the work. You can also create some pretty nice graphs, but I still prefer Excel.
Visio is the favorite on the PC. Same capabilities and caveats as OmniGraffle.
Gephi is a super-specialized beast. It lets you visualize relationships, like force-directed diagrams. Worth playing with if you’re charting out social media data, semantic relationships, or other stuff I can’t even think of.
Processing is a bit hard to explain. Think of it as a designer’s toolkit for data crunching. You’ll have to try it to understand, but if you need to present data, and you can’t find any other tool, Processing can probably do it.
Tableau is Windows-only (boo) but man, it’s awesome. It adds a level of polish that Excel cannot. If you’re on Windows, you should really take a look.
On Linux, I’d again look at OpenOffice first, as the simplest of the tools.
Web Analytics
I almost skipped analytics. There are plenty of articles about basic Web traffic analysis tools. But it never hurts to revisit the basics, so:
Google Analytics now has an enterprise or ‘premium’ version that lets you process more data, faster, provides more support, gives you a 99.9% data collection guarantee (still not enough, if you ask me) and provides advanced attribution modeling. Definitely worth a look.
Adobe Digital Marketing Suite (formerly Omniture) has quite a learning curve, but it also lets you do a lot with attribution, path analysis and all the basics. It’s an established tool now owned by Adobe; take that as a positive or a negative, based on your own experience with the big A.
These are web-based tools, so no major Linux/Mac/Windows issues.
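Since both tools pitch attribution modeling, it’s worth understanding what the models actually do. Here’s a toy illustration (not either vendor’s implementation) of two common models over invented conversion paths:

```python
from collections import Counter

def last_click(paths):
    """Give each conversion entirely to the final channel touched."""
    credit = Counter()
    for path in paths:
        credit[path[-1]] += 1.0
    return credit

def linear(paths):
    """Split each conversion evenly across every channel in the path."""
    credit = Counter()
    for path in paths:
        for channel in path:
            credit[channel] += 1.0 / len(path)
    return credit

# Invented paths: each list is the channels one converting visitor touched, in order.
paths = [
    ["organic", "email", "paid"],
    ["organic", "organic"],
    ["social", "organic"],
]
```

Run both on the same paths and you’ll see why model choice is political: last-click hands two of the three conversions to organic, while linear spreads credit to email and social as well. Same data, very different budget conversations.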
Log File Analysis
You must analyze your site’s log files. Must, must, must. OK? Now that that’s settled, here are my favorite tools:
Splunk wins for the best name. It can also analyze everything from a server log file to randomly-sorted Nirvana lyrics. It. Can. Do. Anything. I love it. You can process up to 500MB of data per day for free. If you have more than that to process, it may give your CFO a heart attack.
Sawmill is the venerable standard. It’s been around a long time, has a fantastic list of plugins and is super-configurable. As an enterprise tool, it’s cheaper than Splunk, but not by much. A five-profile enterprise install will run you $1,750. Still, I love Sawmill. I’d be sad if I didn’t have it.
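If you just want a feel for what log analysis buys an SEO before committing to either tool, here’s a minimal sketch: counting which URLs Googlebot actually crawled, from Apache/Nginx “combined”-format lines. The regex assumes that format; adjust it to match your servers:

```python
import re
from collections import Counter

# Apache/Nginx "combined" log format (an assumption -- tune the regex to your logs).
LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
                  r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"')

def googlebot_hits(lines):
    """Count which URL paths Googlebot requested."""
    hits = Counter()
    for line in lines:
        m = LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '66.249.66.1 - - [10/Oct/2012:13:55:36 -0700] "GET /products HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [10/Oct/2012:13:55:40 -0700] "GET /products HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (Windows NT 6.1)"',
]
crawled = googlebot_hits(sample)
```

Note that a real crawler audit should also verify the bot via reverse DNS, since anyone can claim to be Googlebot in a user-agent string.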
Change Monitoring
You can ask your IT team to alert you when the robots.txt file changes, sure. But don’t stake your career on it. A few low-cost monitoring tools you can use to do the same thing:
Server Density tracks MD5 hashes of any file, or even a chunk of any file, on your site, so you’d know right away if something changed. Plus, you can track uptime and response time.
Pingdom does uptime and response time monitoring. I haven’t tried to use it to track file changes, though. Comment below if you have any experience with that.
Loggly is a cloud-based logging tool. They take a different approach, letting you generate custom-ish log data and send it to Loggly’s storage. I haven’t used it enough to speak to stability, but it’s super-tweakable, which I love.
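The MD5-hash trick Server Density uses is also easy to DIY if budget is zero. A minimal sketch of the core idea, assuming you’d fetch robots.txt on a cron schedule and persist the last hash between runs:

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """MD5 hash of a file's bytes -- any change to the file changes the hash."""
    return hashlib.md5(content).hexdigest()

def changed(old_hash: str, content: bytes) -> bool:
    """Compare the current file against the last known hash."""
    return fingerprint(content) != old_hash

# In practice: fetch https://example.com/robots.txt on a schedule
# (cron plus urllib, say), store the hash, and alert when it changes.
baseline = fingerprint(b"User-agent: *\nDisallow:\n")
alert = changed(baseline, b"User-agent: *\nDisallow: /\n")  # one-character edit, caught
```

That one-character difference (`Disallow:` vs. `Disallow: /`) is exactly the kind of career-ending change you want an alert for.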
So Much More…
There’s a lot more, of course: competitive analysis, custom solutions you script yourself and who-knows-what-else. I’ll write about those in future columns, so if you have ideas, list ’em below.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.