A settlement requiring Google-owned YouTube to pay $170 million and change how it serves up ads on videos aimed at children marks the latest twist in a series of controversies over content for young audiences.
Here are a few examples:
“I think I cannot call you my beloved bastards anymore,” Philip DeFranco lamented to his millions of YouTube subscribers in late 2016, when Google made moves to make sure ads were paired with “friendly” content.
The popular YouTuber had been informed that one of his videos had been “demonetized” because he used insulting language.
Other creators received similar notices and protested what they saw as censorship on the platform.
Google countered that its policy of keeping money-making ads away from videos that could cause advertisers concern or embarrassment was not new; only the notification process was.
In early 2017, Google and YouTube were hit with controversy after a British newspaper revealed that ads from major brands were paired with hateful content, including video from Sweden’s Felix Kjellberg, who posts under the name “PewDiePie.”
Major advertisers stepped back from YouTube, awaiting assurances that their marketing messages would not be associated with racial slurs and offensive videos.
The scandal spotlighted how computer algorithms fell short at gauging how troublesome or incendiary a video might be when deciding where to show ads.
In late 2017, YouTube removed tens of thousands of children’s videos whose comment sections contained disturbing, inappropriate remarks, and tried to stem the momentum of another ad boycott.
“We have clear policies against videos and comments on YouTube that sexualize or exploit children, and we enforce them aggressively every time we are alerted,” YouTube said.
Google said it had invested in artificial intelligence and additional human reviewers to better detect questionable content.
At the end of December 2017, star video-blogger Logan Paul posted a video in which he came across a body in a Japanese forest where suicides were common.
Online critics pounced on Paul, decrying his actions in the video as insensitive and disrespectful. The video was viewed millions of times before it was removed.
In early 2019, a blogger raised the alarm after noticing that people with seemingly salacious interests in children were using the comment boxes under YouTube videos to communicate and share.
The blogger believed the tactic let pedophiles skirt YouTube’s ban on child pornography and network with one another, with the platform’s recommendation algorithm even serving as a helpful tool as ads brought in money for video views.
YouTube faced a new advertising boycott by big brands.
YouTube quickly took steps that included disabling comments on videos featuring children, removing accounts and videos, and reporting illegal activity to police.