Search Engine Marketing agencies scale not through creativity or innovation but through overhead. Many agencies deny this, but it is the cold, hard truth, and it holds even for agencies that have developed in-house technology or license technology from others. Even the “advanced” agencies are not very sophisticated: they use rules-based bidding that works half the time, which is why they still need humans to double-check the output if they actually care about their client’s bottom line.
Because these bidding systems are rules-based, they require account managers to make customizations (I believe the term is “settings update,” judging by a recent sales pitch) for each account or campaign.
So what are the flaws with rules-based keyword bidding? Why is it not effective?
Most rules-based bidding systems can only accept a limited amount of data (no matter what search marketing agencies may sell you on) – for example, 7-day, 30-day and lifetime “snapshots” of how your keywords are performing. The agency will check your cost-per-acquisition over a 30-day period, measure how much it needs to change your position based on the last 7 days, and decide whether something has gone wrong and your keyword needs to be paused based on the lifetime data.
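To make that concrete, here is a minimal sketch (in Python, with hypothetical thresholds and field names of my own invention, not any agency’s actual system) of what a snapshot-driven rule engine amounts to:

```python
from dataclasses import dataclass

@dataclass
class KeywordStats:
    cpa_30d: float             # cost-per-acquisition over the last 30 days
    avg_pos_7d: float          # average ad position over the last 7 days
    lifetime_conversions: int  # conversions over the keyword's lifetime

# Thresholds an account manager would hand-tune per account (made up here).
TARGET_CPA = 25.00
TARGET_POSITION = 3.0

def rules_based_bid(bid: float, kw: KeywordStats) -> float:
    # Lifetime rule: pause keywords that have never converted.
    if kw.lifetime_conversions == 0:
        return 0.0  # paused
    # 30-day rule: back the bid off if CPA is over target.
    if kw.cpa_30d > TARGET_CPA:
        bid *= 0.90
    # 7-day rule: push the bid up if average position has slipped.
    if kw.avg_pos_7d > TARGET_POSITION:
        bid *= 1.10
    return round(bid, 2)

print(rules_based_bid(2.50, KeywordStats(30.0, 4.2, 5)))  # 2.48
```

Everything outside those three windows is invisible to the system, and every threshold is a number a human typed in.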
It is impossible for agencies to deliver on the “100,000 keywords X 5 match types X 3 ads X multiple networks” sales pitch, because it requires processing an enormous amount of data. Take a look at the executive teams of these agencies… last I checked, an MBA does not understand machine learning. That requires a PhD from a school such as Stanford, and anyone who actually knows what they are doing is working at Google, not at a search marketing agency (sorry, a PhD from a state school does not necessarily qualify as “world class”).
If an agency wants the system to look at more data, that requires hard-coding more rules (“look at 45 days and weight it X amount,” “look at one year and weight it Y amount,” …). This approach is deeply flawed because accounts vary: some might only have 6 months of history and would not qualify for the one-year rule, while others may have dramatically different cost-per-clicks and cost-per-acquisitions.
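Here is a sketch of what that hand-weighting looks like in practice (the windows and weights below are invented for illustration):

```python
# Hypothetical hand-coded lookback windows; every new window is another
# rule someone has to write, weight and maintain per account.
WINDOW_WEIGHTS = {45: 0.5, 365: 0.5}  # days of lookback -> weight

def weighted_cpa(daily_cpa: list[float]) -> float:
    """Blend CPA over several hard-coded windows.

    daily_cpa[i] is the CPA observed i days ago.
    """
    score, total_weight = 0.0, 0.0
    for window, weight in WINDOW_WEIGHTS.items():
        if len(daily_cpa) < window:
            continue  # account too young: this rule silently drops out
        score += weight * (sum(daily_cpa[:window]) / window)
        total_weight += weight
    return score / total_weight if total_weight else float("nan")

print(weighted_cpa([20.0] * 180))  # 20.0 -- the one-year rule never fires
```

An account with only six months of history silently collapses to the 45-day rule alone, so two accounts running the “same” system can behave completely differently.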
Many agencies adjust keyword bids by a fixed percentage increase or decrease, which may suffice if your keywords are hovering around $1 a click. But you will see serious, volatile changes if your cost-per-click averages around $10 (I have had accounts like this): the same percentage move shifts the bid by several dollars rather than several cents, and that can kill an account.
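The arithmetic is trivial but worth spelling out (the 20% step below is an assumed value, not any particular agency’s rule):

```python
def apply_step(bid: float, pct: float = 0.20) -> float:
    """Apply a flat percentage bid increase, the way a rules engine would."""
    return round(bid * (1 + pct), 2)

print(apply_step(1.00))   # $1.00 -> $1.20: a 20-cent move, barely noticeable
print(apply_step(10.00))  # $10.00 -> $12.00: a $2.00 move from the same rule
```

A few iterations of that rule on a $10 keyword can blow through a daily budget before any human reviews the account.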
Another area to watch is when you have different cost-per-acquisition targets for different products, campaigns or keywords. If you are selling various products, you might have specific margins and be able to spend up to X amount, depending on the product purchased. Rules-based systems cannot handle this, because they are built around a single, account-level CPA target.
Agencies would have to go into their “settings,” enter a ballpark number for your overall average sale, and measure everything from there. That means most of your keywords will not be tracking to their appropriate CPA: the target will either be too low, which means you are not getting all the volume you deserve, or too high, which means you are not actually turning a profit on the sale.
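A toy example (the per-product breakeven CPAs below are invented) shows how one blended number misprices both ends:

```python
# Hypothetical breakeven CPA per product, derived from each product's margin.
true_target_cpa = {
    "budget_widget": 15.00,   # thin margin: can afford $15 per sale
    "premium_widget": 60.00,  # fat margin: can afford $60 per sale
}

# What the rules-based system actually uses: one blended ballpark number.
blended = sum(true_target_cpa.values()) / len(true_target_cpa)  # $37.50

for product, true_cpa in true_target_cpa.items():
    if blended > true_cpa:
        print(f"{product}: bidding to ${blended:.2f} when breakeven is "
              f"${true_cpa:.2f} -- unprofitable")
    else:
        print(f"{product}: capped at ${blended:.2f} when ${true_cpa:.2f} is "
              f"affordable -- lost volume")
```

Under the blended target, the thin-margin product loses money on every conversion while the fat-margin product leaves volume on the table.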
So whenever you hear a sales pitch from an agency touting “proprietary technology” and you are ROI-focused, do what I do and say, “K, thanks, bye!”
Opinions expressed in this article are those of the guest author and not necessarily those of Search Engine Land.