Google-owned video platform YouTube is taking urgent steps to ensure the safety of the younger members of its audience. A newly developed machine learning algorithm will help classify comments more effectively and filter out content that exploits children aged 13 and younger, The New York Times reports.

One of YouTube’s key measures is disabling comments on videos of children that are “attracting predatory behavior.” The algorithm will be applied to “tens of millions” of videos, though a small number of lower-risk channels will be allowed to keep comments enabled, provided they actively moderate them.

“Recently, there have been some deeply concerning incidents regarding child safety on YouTube,” tweeted Susan Wojcicki, YouTube’s chief executive. “Nothing is more important to us than ensuring the safety of young people on the platform. More on the steps we’re taking to better protect children & families.”

“Over the past week, we disabled comments from tens of millions of videos that could be subject to predatory behavior,” reads a company statement. “These efforts are focused on videos featuring young minors and we will continue to identify videos at risk over the next few months. Over the next few months, we will be broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior.”

The measures to ensure community safety and to address concerns about child exploitation were taken after a video “documenting how pedophiles have used comments on videos of children to guide other predators” went viral. In addition, a number of top advertisers, including AT&T, Disney, Nestlé and Epic Games, pulled their advertising from YouTube after their ads were displayed on videos of minors whose comment sections contained predatory, sexually explicit messages or emojis, The New York Times writes. Other comments included timestamps pointing to compromising poses in the videos.