Proactively building negative keyword lists in Google Ads is now more important than ever

Google appears to be pushing for the premature rollout of automated features to further develop its machine learning (at the expense of the advertiser). Here are some ways to review your accounts to avoid pitfalls.


Google plans to begin restricting terms within the search query report to show only those that have been “searched by a significant number of users.”

The search query report is one of the main controls for monitoring and optimizing paid search activity on the platform, and with (unsurprisingly) little clarification around quite how this significance is calculated, the update was met with frustration and anger by the SEM community.

Some went as far as to start a petition asking Google to allow advertisers to opt out.

While this decision from Google’s side was framed as a move to further protect user privacy, it isn’t hard to imagine there are ulterior motives.

All aboard the automation (runaway) train

Changes in the last 12 months alone indicate that Google is pushing for the premature rollout of automated features and reducing advertiser control, in what can only be assumed to be an effort to further improve and develop its machine learning (at the expense of the advertiser).

A prime example of this is Google’s continual effort to expand match type close variance, which started in late 2018 and has been gradually widening since.

While the full impact is not yet clear, this announcement likely poses serious issues for data-hungry advertisers looking to thoroughly understand and improve their account performance through analysis of their search term data.

But how big is the potential impact?

Back at the start of September, we analyzed search activity across our entire agency from August and found that 26% of total spend originated from unique search queries with only 1 impression.
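As a rough illustration of that calculation, here is a minimal Python sketch that works from a search terms report exported as CSV. The file name and column names (`search_term`, `impressions`, `cost`) are assumptions and will vary depending on how you export the report.

```python
import csv
from collections import defaultdict


def single_impression_spend_share(report_path):
    """Estimate the share of spend coming from search terms with exactly one impression.

    Assumes a CSV export of the search terms report with 'search_term',
    'impressions' and 'cost' columns (column names vary by export format).
    """
    totals = defaultdict(lambda: {"impressions": 0, "cost": 0.0})
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            term = row["search_term"].strip().lower()
            totals[term]["impressions"] += int(row["impressions"])
            totals[term]["cost"] += float(row["cost"])

    total_cost = sum(t["cost"] for t in totals.values())
    single_imp_cost = sum(t["cost"] for t in totals.values() if t["impressions"] == 1)
    return single_imp_cost / total_cost if total_cost else 0.0


# Hypothetical export covering August.
share = single_impression_spend_share("search_terms_august.csv")
print(f"{share:.1%} of spend came from single-impression queries")
```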


Without enough post-announcement data at our disposal, the SEM community was left to speculate about exactly how Google would classify ‘significance’ and what the full effect of the change on reporting would be. The prospect of losing roughly one-quarter of our search terms due to volume alone was daunting, to say the least.

Fortunately, this doesn’t seem to be the case: we’re still seeing unique, single-impression search queries filtering through into our reports.

So, Google, how exactly is this significance being calculated? What is actually being removed due to privacy concerns and to what extent is this widening the net of match type close variance ‘under the radar’?

Frederick Vallaeys of Optmyzr was quick to publish a handy script that allows advertisers to report on the percentage of clicks, impressions or cost going to unreported search terms.

Running it on a handful of our larger-spending accounts, we can clearly see the share of clicks no longer being reported in the search query report, peaking above 45% so far in September.
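We won’t reproduce the script here, but the underlying calculation is simple: compare the clicks your campaigns recorded against the clicks accounted for in the search terms report over the same date range. A minimal Python sketch of that same idea (not Vallaeys’ script, and assuming two CSV exports that each contain a clicks column) might look like this:

```python
import csv


def total_clicks(path, clicks_column="Clicks"):
    """Sum the clicks column of a CSV export (column name is an assumption)."""
    with open(path, newline="", encoding="utf-8") as f:
        return sum(int(row[clicks_column].replace(",", "")) for row in csv.DictReader(f))


# Hypothetical file names: a campaign-level export and a search terms export
# covering the same date range.
campaign_clicks = total_clicks("campaign_report.csv")
search_term_clicks = total_clicks("search_terms_report.csv")

unreported = campaign_clicks - search_term_clicks
share = unreported / campaign_clicks if campaign_clicks else 0.0
print(f"{share:.1%} of clicks have no matching search term in the report")
```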


How can we ensure better quality traffic from day one?

While pre-empting search queries with bulk negative keyword lists is nothing new, it’s now more important than ever to ensure traffic quality is maintained as best it can be moving forward.

Aside from the obvious exclusions, such as blocking profanity from triggering your campaigns, leveraging contextual query and search data ahead of pushing new campaigns live can and will save you money in the long run.

Here are just some of the ways you can continue to build comprehensive negative keyword lists:

Keyword Planner

The Keyword Planner tool is baked right into the Google Ads platform and provides keyword ideas complete with volume, expected CPCs and level of competition, based on either a URL or example keywords.

From a negative keyword perspective, keeping the keywords you provide as vague as possible actually works to your benefit here.

It’s worth noting that each keyword you add shapes the overall theme of what Google returns, so I would recommend running this a few times with a few thematically different keyword sets to get as much data as possible before pulling it all into a spreadsheet and finding all the duds.
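For example, once each Keyword Planner run has been exported to CSV, a short Python sketch like the one below can merge the runs, deduplicate the ideas and flag candidates containing words you would never want to pay for. The file pattern, the ‘Keyword’ column name and the unwanted-word list are all assumptions for illustration.

```python
import csv
import glob

# Words you would never want to pay for - purely illustrative.
UNWANTED = {"free", "jobs", "salary", "diy", "definition"}

# Merge and deduplicate keyword ideas across multiple Keyword Planner exports.
seen = {}
for path in glob.glob("keyword_planner_export_*.csv"):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            keyword = row["Keyword"].strip().lower()  # column name is an assumption
            seen.setdefault(keyword, row)

# Flag ideas containing any unwanted word as candidate negatives.
candidates = sorted(k for k in seen if UNWANTED & set(k.split()))
print(f"{len(candidates)} candidate negatives out of {len(seen)} unique ideas")
for keyword in candidates:
    print(keyword)
```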

You can, of course, use alternative keyword research tools such as Ubersuggest, SpyFu or SEMrush.

Keyword Sheeter

This handy semantic keyword tool lets you type or paste in a few example keywords, and it will keep firing less and less relevant searches back at you the longer you leave it running (honestly, it could run forever if you gave it the chance).

This is great at finding those outlandish queries that you may not have necessarily thought about or seen elsewhere.

The initial “ideas” tool is where the real magic happens, but it can also provide volume and CPCs if you so wish.

Answer The Public

AnswerThePublic actively listens to Google’s suggestions API data and offers you prepositions and questions that often appear in conjunction with your keywords.

It approaches this task in a completely different way to the other tools in this list, which makes AnswerThePublic indispensable when searching for negative keywords.

The person staring at you and down towards the search functionality may seem daunting, but don’t let that put you off – it’s a really useful tool!

Microsoft Ads

Microsoft Advertising may follow suit at some stage, but for now you can mine the search terms from your Microsoft campaigns for negatives.

We find that Bing’s match types are a little more liberal and so using this data could fill in some blanks you otherwise wouldn’t have found.
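As a starting point, a minimal Python sketch along these lines can flag search terms from a Microsoft Advertising search term report export that received clicks but no conversions, giving you a list of candidate negatives to review before porting them over to Google Ads. The column names and the click threshold are assumptions.

```python
import csv


def candidate_negatives(report_path, min_clicks=3):
    """Flag search terms with clicks but no conversions as candidate negatives.

    Assumes a CSV export of the Microsoft Advertising search term report with
    'Search term', 'Clicks' and 'Conversions' columns (names are assumptions).
    """
    candidates = []
    with open(report_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clicks = int(row["Clicks"])
            conversions = float(row["Conversions"])
            if clicks >= min_clicks and conversions == 0:
                candidates.append((row["Search term"].strip().lower(), clicks))
    # Most-clicked (and therefore most costly) candidates first.
    return sorted(candidates, key=lambda item: item[1], reverse=True)


for term, clicks in candidate_negatives("bing_search_terms.csv"):
    print(f"{clicks:>4}  {term}")
```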

As well as this, there’s the Microsoft Advertising Intelligence tool, which integrates directly with Microsoft Excel and allows you to gain insights on up to 200,000 keywords. The Associated Keywords and Related Searches functions are very useful for exploring additional potential negative keywords ahead of time.

Google Search Console

You can either do this in Search Console itself or, if you have already linked Search Console, within the Google Ads interface: navigate to Reports > Predefined > Basic > Paid vs. Organic to see queries that triggered paid results, organic results, or both.

If you have a strong organic presence and lots of data to play with, this can be a very useful tool to explore queries that you have ranked for organically, that you may wish to add to your arsenal of negative keywords.
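If you would rather pull this query data programmatically, a hedged sketch using the Search Console API (assuming you have already completed the OAuth setup, and with an illustrative property URL) could fetch your top organic queries for review against your paid keyword lists:

```python
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assumes an OAuth token has already been created and saved locally;
# the site URL and date range below are illustrative.
creds = Credentials.from_authorized_user_file(
    "token.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2020-08-01",
        "endDate": "2020-08-31",
        "dimensions": ["query"],
        "rowLimit": 5000,
    },
).execute()

# Print organic queries for manual review against your paid keyword lists.
for row in response.get("rows", []):
    query, clicks = row["keys"][0], row["clicks"]
    print(f"{clicks:>5}  {query}")
```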

Historical Account Data

If you’ve been running search activity for some time, and happen to have a large list of historic search terms, it may be worth reviewing in case terms that should have been negated fell through the cracks.

This is especially true over longer periods of time, where queries may have slipped past any filters you have in place to prioritize your search query reviews and keyword negation.

A handy way to analyze this is through n-gram analysis, which lets you look at how often a given word or phrase appears across your query data.

By grouping question-led phrases such as ‘how can,’ you can quickly ascertain whether those phrases have delivered good performance over time and whether you should look to exclude them from future activity.
Our free Google Ads Data Studio report has a built-in feature to quickly analyze search queries based on an n-gram word or phrase.
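If you would rather roll your own, the Python sketch below illustrates the n-gram idea: it aggregates cost and conversions by bigram across a search terms export and surfaces the most expensive bigrams with zero conversions as likely negatives. The file and column names are assumptions.

```python
import csv
from collections import defaultdict


def ngrams(text, n=2):
    """Return the n-grams (as strings) of a search term."""
    words = text.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]


# Aggregate cost and conversions per bigram across the search terms report.
# Column names ('search_term', 'cost', 'conversions') are assumptions.
stats = defaultdict(lambda: {"cost": 0.0, "conversions": 0.0})
with open("search_terms_report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for gram in ngrams(row["search_term"]):
            stats[gram]["cost"] += float(row["cost"])
            stats[gram]["conversions"] += float(row["conversions"])

# Surface the most expensive bigrams with zero conversions - likely negatives.
worst = sorted(
    (g for g, s in stats.items() if s["conversions"] == 0),
    key=lambda g: stats[g]["cost"],
    reverse=True,
)
for gram in worst[:20]:
    print(f"{stats[gram]['cost']:8.2f}  {gram}")
```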




About the author

Eliot Shiner
Contributor
Eliot is the co-founder of Bind Media, an independent, specialist biddable media agency based in Bath, UK. A seasoned PPC aficionado, he has developed a passion for B2B & SaaS over his 6+ years in the industry. Sadly, he will never have a social following better than that of his pet dog.
