Google’s Keyword Planner is a tool that helps advertisers choose search terms to associate with their ads. Researchers found that the tool had a malfunction that consistently paired race-related terms with hundreds of pornographic suggestions.
The pornographic suggestions appeared only when “Black,” “Latina,” or “Asian” was combined with the word “girl” or “boy.”
A similar search for “white girls” or “white boys” returned no suggested terms at all.
The Markup, a nonprofit newsroom that investigates how powerful institutions use technology to change society, discovered and published the findings yesterday, July 23.
The outlet noted that the results were blocked when it repeated the search after reaching out to the company about the issue.
“These findings indicate that, until The Markup brought it to the company’s attention,” the article stated, “Google’s systems contained a racial bias that equated people of color with objectified sexualization while exempting White people from any associations whatsoever.”
“In addition, by not offering a significant number of non-pornographic suggestions, this system made it more difficult for marketers attempting to reach young Black, Latinx, and Asian people with products and services relating to other aspects of their lives.”
Google Ads generated more than $134 billion in revenue in 2019.
“The language that surfaced in the keyword planning tool is offensive and while we use filters to block these kinds of terms from appearing, it did not work as intended in this instance,” Google spokesperson Suzanne Blackburn wrote in a statement emailed to The Markup.
“We’ve removed these terms from the tool and are looking into how we stop this from happening again.”
The Markup pointed out that Google’s algorithms have a long history of racial bias.
The Markup notes that in 2012, Google experienced a similar issue with its search engine when UCLA professor Safiya Noble wrote an article explaining that searches for “Black girls” often returned pornographic results.
In 2013, Harvard professor Latanya Sweeney also wrote that Googling Black names was far more likely to return display ads for arrest records than the same search for white names.
The industry giant told The Markup that many of those issues have been addressed within the company and that it has a permanent, fully staffed team dedicated to the challenge.