


Computers Help YouTube Remove 6.7M Problematic Videos

Extremist videos are disappearing from YouTube at a faster clip, all thanks to computer algorithms.

On Monday, the Google-owned platform shared details about its efforts to fight objectionable videos with AI-powered machine learning. And the efforts appear to be paying off.

From last October to December, YouTube's machine learning systems helped it delete 6.7 million videos over sexual imagery, spam or terrorist content. And most of those videos, 76 percent, were removed before they received a single view.

On the flip side, the computer algorithms failed to prevent the remaining videos from gaining a brief audience. In a blog post, YouTube said the platform was originally slower when it came to taking down extremist video clips.

"For example, at the beginning of 2022, 8 pct of the videos flagged and removed for vehement extremism were taken down with fewer than 10 views," it said.

But last June, the video streaming service started using its machine learning flagging system. "Now more than half of the videos we remove for violent extremism have fewer than 10 views," the blog post added.


The larger question is whether the AI-powered systems can improve over time. YouTube's own data might provide an answer. It plans on publishing a quarterly report on the video takedowns, the first of which went out on Monday.

"Our advances in machine learning enabled the states to take down nearly lxx percent of violent extremism content within viii hours of upload and about one-half of it in 2 hours," the quarterly report said.

Unfortunately, YouTube isn't offering any historical data. The latest report only covers the recent October-to-December period, when it removed a total of 8.2 million videos, the majority of which came from its automatic flagging system.

However, YouTube's computer algorithms don't delete any videos on their own. A human will review a flagged clip to confirm it violates the platform's policies. The company plans on hiring 10,000 people this year to help it review content.

YouTube is leaning on machine learning as the streaming service faces scrutiny over content moderation. The platform has long been fighting to keep terrorist content from creeping into the service. But critics have also pointed to the rise of misinformation on YouTube as another disturbing trend.

For instance, in February, a false conspiracy video about the Parkland, Florida high school shooting managed to trend on YouTube before it was taken down.

Whether computer algorithms can be effective in grayer areas of content moderation is its own area of debate. But for now, YouTube said its machine learning system is focused on flagging the most egregious video clips, such as those that incite violence or contain child abuse.

Source: https://sea.pcmag.com/news/20752/computers-help-youtube-remove-67m-problematic-videos

