
Google Is Training Its Ad Placement Systems to Be Offended

Apr 3, 2017

Google has figured out what's missing from the computers that determine where ads are placed on YouTube: the ability to understand context. Because of that gap, The New York Times reports, ads from major brands such as Coca-Cola, Procter & Gamble and Wal-Mart have been showing up next to racist, anti-Semitic and terrorist videos, which hasn't sat well with marketers.

“Google engineers, product managers and policy wonks are trying to train computers to grasp the nuances of what makes certain videos objectionable,” The Times reports. “Advertisers may tolerate use of a racial epithet in a hip-hop video, for example, but may be horrified to see it used in a video from a racist skinhead group.”

The problem has gained urgency amid recent reports that major brands are inadvertently funding extremist groups through their automated advertising.

“This glitch in the company’s giant, automated process turned into a public-relations nightmare,” The Times reports. “Companies like AT&T and Johnson & Johnson said they would pull their ads from YouTube, as well as Google’s display advertising business, until they could get assurances that such placement would not happen again.”

Google has been implementing changes, including barring ads from appearing alongside hate speech or discriminatory content, and simplifying how advertisers can exclude particular sites, channels and videos across YouTube and Google's display network.

“It is allowing brands to fine-tune the types of content they want to avoid, such as ‘sexually suggestive’ or ‘sensational/bizarre’ videos,” The Times reports. “It is also putting in more stringent safety standards by default, so an advertiser must choose to place ads next to more provocative content.”
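In effect, the stricter defaults The Times describes amount to an opt-in model: provocative categories are blocked unless an advertiser deliberately allows them. As a rough illustration only (the class, method and category names below are hypothetical and not Google's actual API), a brand-safety configuration along those lines might look like this:

```python
# Hypothetical sketch of an advertiser brand-safety configuration.
# All names, categories and defaults are illustrative, not Google's API.

SENSITIVE_CATEGORIES = {"sexually_suggestive", "sensational_bizarre", "hate_speech"}

class PlacementSettings:
    def __init__(self, allowed_sensitive=None, excluded_channels=None):
        # Stricter standard by default: no sensitive categories are
        # permitted unless the advertiser explicitly opts in.
        self.allowed_sensitive = set(allowed_sensitive or [])
        self.excluded_channels = set(excluded_channels or [])

    def permits(self, video_categories, channel):
        """Return True if an ad may run against this video."""
        if channel in self.excluded_channels:
            return False
        flagged = set(video_categories) & SENSITIVE_CATEGORIES
        # Every flagged category must have been explicitly opted into.
        return flagged <= self.allowed_sensitive

# Default settings block a video tagged "sensational_bizarre" ...
default = PlacementSettings()
assert not default.permits({"sensational_bizarre"}, "channel_x")

# ... while an advertiser who opts in can still exclude specific channels.
opted_in = PlacementSettings(allowed_sensitive={"sensational_bizarre"},
                             excluded_channels={"channel_x"})
assert not opted_in.permits({"sensational_bizarre"}, "channel_x")
assert opted_in.permits({"sensational_bizarre"}, "channel_y")
```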
