Google heightens adult ad policy enforcement after Reuters finds illicit ads

“Spotty” enforcement can result in inappropriate ads being shown to minors and give an unfair advantage to advertisers that slip through the cracks.


This week, Google will improve enforcement of ad policies pertaining to underage users, according to Reuters. Google’s renewed focus on these policies came after Reuters discovered ads for sex toys, liquor and high-risk investments in its search results that ran afoul of the company’s efforts to comply with UK regulations.

Why we care. Advertisers in age-sensitive categories are unlikely to deliberately target children, and having their ads shown to minors is a potentially bad look from a brand safety standpoint.

Better enforcement can and should help prevent this scenario, enabling advertisers to better trust Google’s systems. However, the case can also be made that these types of ads should never have made it through Google’s safeguards.

“According to posts on online advertising forums and two advertisers, Google’s enforcement has been spotty,” Paresh Dave wrote for Reuters. “The advertisers . . . said they have been frustrated about significant lost sales due to Google’s search engine correctly blocking their ads from signed-out users while erroneously allowing their competitors’ ads.”

Google: ‘The ads in question were mislabeled.’ “We have policies in place that limit where we show certain age-sensitive ad categories,” Google told Reuters. “The ads in question were mislabeled and in this instance should have been restricted from serving. We are taking immediate steps to address this issue,” the company said.

Privacy and protection for minors. Heightened concern over user privacy has increased scrutiny of how platforms protect underage users.

In August 2021, Google announced that it would block ad targeting based on age, gender or interests of users under 18. It also added the ability for users under 18 (or their parent or guardian) to request removal of their images from Google Images results and automatically enabled SafeSearch for users under 18. Beyond search, the company also made YouTube’s default upload setting private for users aged 13-17.

Instagram announced similar changes, disabling interest and activity-based targeting of underage users in July 2021.




About the author

George Nguyen
Contributor
George Nguyen is the Director of SEO Editorial at Wix, where he manages the Wix SEO Learning Hub. His career is focused on disseminating best practices and reducing misinformation in search. George formerly served as an editor for Search Engine Land, covering organic and paid search.
