Study: Are Public Record Ads Placed On Google Racially Biased?



A study published by Harvard professor Latanya Sweeney claims that companies placing public record ads linked to people’s names through Google may use language that reflects racial bias, though why this happens is unclear.

The study found that ads associated with black identifying names were more likely to contain the word “arrest” than ads associated with white identifying names. From the study:

A greater percentage of ads having “arrest” in ad text appeared for black identifying first names than for white identifying first names in searches on Reuters.com, on Google.com, and in subsets of the sample.

The study involved 2,184 names considered to be either black identifying or white identifying, based on a specific approach outlined in the study. Searches were conducted for these names on Google and on Reuters.com, which displays search results and ads from Google. Page 20 of the study shows the ad delivery results.

Most of the ads were placed by one particular company, Instant Checkmate. The report asks at the end:

Did Instant Checkmate provide ad templates suggestive of arrest disproportionately to black-identifying names? Or, did Instant Checkmate provide roughly the same templates evenly across racially associated names but society clicked ads suggestive of arrest more often for black identifying names?

For its part, Instant Checkmate claims it didn’t try to skew its ads in any particular way. The report notes:

During a conference call with the founders of Instant Checkmate and their lawyer on December 21, 2012, the company’s representatives asserted that Instant Checkmate gave the same ad text to Google for groups of last names (not first names).

Is it down to Google? Google told us this:

AdWords does not conduct any racial profiling. We also have a policy which states that we will not allow ads that advocate against an organisation, person or group of people. It is up to individual advertisers to decide which keywords they want to choose to trigger their ads.

So that’s a no.

Postscript From Danny Sullivan:

It’s possible that Instant Checkmate is providing various types of ad templates to Google and letting the algorithm decide which to show more frequently for certain types of searches. If so, then the racial bias of searchers might be reflected. If searches on “black names” are more likely to get clicks when the word “arrest” appears next to them, then the algorithm might show that version more often.

However, that wouldn’t be the algorithm itself having a racial bias. That would be it having a “conversion” bias, if anything. It’s the same thing that happens if multiple ad templates are submitted but one has the word “free” in the ad copy. If that ad pulls more clicks, then it might get shown more frequently.
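To illustrate the kind of click-driven selection described above, here is a minimal sketch in Python, assuming a simple system that rotates ad templates and gradually favors whichever one earns the higher click-through rate. The template texts, names and numbers are hypothetical; this is not Google’s actual AdWords logic or Instant Checkmate’s real ad copy.

```python
import random

# Hypothetical ad templates an advertiser might submit for the same keyword.
TEMPLATES = {
    "neutral": "Looking for John Smith? Search public records.",
    "arrest":  "John Smith arrested? Search public records.",
}

# Observed impressions and clicks per template.
stats = {name: {"impressions": 0, "clicks": 0} for name in TEMPLATES}


def ctr(name):
    """Click-through rate so far; treat an unshown template as 0."""
    s = stats[name]
    return s["clicks"] / s["impressions"] if s["impressions"] else 0.0


def choose_template(explore_rate=0.1):
    """Mostly show the template with the best CTR, but keep exploring a little."""
    if random.random() < explore_rate:
        return random.choice(list(TEMPLATES))
    return max(TEMPLATES, key=ctr)


def record_result(name, clicked):
    """Update the counters after an impression."""
    stats[name]["impressions"] += 1
    if clicked:
        stats[name]["clicks"] += 1


# Example: simulate a stream of impressions where the "arrest" wording
# happens to get clicked more often, so it gradually wins the rotation.
for _ in range(1000):
    name = choose_template()
    clicked = random.random() < (0.08 if name == "arrest" else 0.03)
    record_result(name, clicked)

print({name: round(ctr(name), 3) for name in TEMPLATES})
```

Nothing in that loop looks at race; it only reacts to which wording gets clicked more often, which is the “conversion” bias described above.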

It’s also difficult, if not impossible, to know whether, if this is happening, it is searches by blacks or whites that are influencing the changes. It could be that blacks are searching for “black names” and are more likely to click on ads if they have “arrest” next to those names. That would influence the results for everyone, since at the time of delivery, Google doesn’t know the race of the person searching. It could be that this happens when whites are searching for “black names.” A combination could also be involved.


About the author

Barry Schwartz
Staff
Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry can be followed on Twitter here.
