Mark Bergen Bloomberg News
Published 7:30 PM EDT Aug 2, 2019
In early July, YouTube made a significant change to its software to boost what it deems “quality” children’s content, sending waves of traffic to certain video producers and burying other channels. The change came as the company tries to convince parents its service is safe for kids, and convince regulators that it isn’t violating the law.
The update immediately alarmed many YouTube creators, who already feel that their livelihoods hang at the whims of mysterious algorithms.
Kids’ entertainment is massive on YouTube, the internet video-sharing arm of Google. It’s also incredibly controversial. Because YouTube lets people post clips with few limitations, it has faced blistering criticism for making inappropriate and disturbing footage available to kids. In response, in recent years YouTube has made two notable changes.
In 2017, YouTube purged dozens of channels behind violent and sexual videos featuring kids or cartoons. Earlier this year, after evidence surfaced that comments were being used to identify young girls in clips that could be seen as sexually suggestive, the company shut off the ability for users to comment on videos starring children.
YouTube’s software algorithms determine how videos are placed in search results and viewing recommendations, and so the company is notoriously secretive about them. Thousands of video creators rely on YouTube’s cloaked system to reach their audience and earn advertising money. Many adjustments to the software are routine, but the latest change stood out.
“Most of the time, we don’t even notice it,” said Melissa Hunter of Family Video Network, a YouTube multi-channel network and consulting firm. “Whatever was tweaked about a week and a half ago was very noticeable.”
YouTube confirmed the recent software update, but declined to detail the reasons behind it.
“We make hundreds of changes every year to make it easier for people to find what they want to watch on YouTube,” Ivy Choi, a company spokeswoman, said in a statement. “We recently made one such change that improves the ability for users to find quality family content.”
Since the change, some videos aimed at preschoolers have seen a precipitous drop in traffic, while others catering to a similar age group have seen major spikes, Hunter said. The company did not notify creators when the shift occurred, according to Hunter and other YouTube creators, and the viewers have not returned. On July 13, one YouTube creator posted a chart in a private Facebook group showing a 98% drop in viewing traffic over three days.
“Is it time to stop creating kids content?” read the message.
YouTube bars minors under 13 from using the site, and recommends children use YouTube Kids, its app with more content filters and parental controls. But the app’s reach is small relative to YouTube’s main site, and people at the company have privately acknowledged that older children gravitate from the app to the far larger media catalog on YouTube.com.
Sundar Pichai, Google’s chief executive officer, has stressed the “educational” value of YouTube. He told investors last week that YouTube would place “a lot of effort” into its Kids app.
“It’s a product you’re going to see us focus more on and continue to evolve, add more curated content there, and make sure it’s safe for kids and give parents peace of mind,” he said on the company’s earnings call. Pichai said the approach also applied to “family-oriented” videos on YouTube.com. “Rewarding trusted creators is a big way we can help,” he added.
YouTube wouldn’t share examples of “trusted creators.” But the company pointed to a guideline it publishes for making family videos.
The document suggests avoiding footage that looks like it came from “content farms” (repetitious clips that feature the same cartoon truck in different colors, for instance) and “mindless, addictive content that has no substance or developmental value to the viewer.” A spokeswoman said these guidelines do not determine what videos are recommended, removed or eligible for ads.
The company also hasn’t detailed how it defines “quality” or “educational” videos. So one of the best barometers for YouTube’s metric is its Kids app, which places videos front-and-center once a viewer logs in. The educational merits of these choices are up for debate. Heather Kirkorian, an early childhood development professor at the University of Wisconsin-Madison, opened the app this week and found Baby Shark and Lucas the Spider, two global hits.
“I wouldn’t consider them educational. I would consider them wholesome,” she said. “The term ‘educational’ is used as an umbrella for ‘non-harmful.’”
Creators said YouTube’s recent updates impacted both its main site and Kids app.