Wednesday, December 6, 2017

YouTube promises to increase content moderation and other enforcement staff to 10K in 2018


Now that its bottom line is being affected, YouTube says it will begin to take additional steps to protect its advertisers and creators from inappropriate content on its network. In a blog post authored by YouTube CEO Susan Wojcicki on Monday, the company said it will increase its staff to 10,000 in 2018 to help better moderate video content and perform other related tasks.* The news follows a series of scandals on the video-sharing site related to its lack of policing around content aimed at children, obscene comments on videos of children, horrifying search suggestions, and more.
* Update: YouTube reached out to clarify its hiring numbers. Its teams include a mix of different specialists working on these issues, the majority of whom are content reviewers. But the number also includes other policy specialists who review and take action on content, engineers building and refining the machine learning technology, and policy specialists training its machine learning algorithms. The original ‘10K content reviewers’ headline was not accurate, and we’ve corrected it.
The company has been suffering from the fallout of accusations that it has for too long allowed bad actors to game its recommendation algorithms to reach children with videos that aren’t meant for younger viewers. At the same time, it has seemingly fostered a community of creators making videos that involve putting kids in concerning, and even exploitative, situations.
One example, the channel ToyFreaks, was recently terminated after concerns were raised about its videos, in which a father’s young daughters were at times filmed in odd, upsetting and inappropriate situations.
YouTube said at the time that the channel’s removal was part of a broader tightening of its child endangerment policies. Last month, it also implemented new policies to flag videos where inappropriate content is aimed at children.
As a result, it has since pulled down thousands of videos of children, and removed advertising from nearly 2 million videos and over 50,000 channels.
Having policies is one thing, but having staff on hand to actually enforce them is another.
That’s why YouTube says it’s now planning to increase its workforce focused on this task. While Wojcicki’s blog post offered only the total headcount the company plans to have on staff by next year, a report from BuzzFeed notes that this “over 10,000” figure represents a 25 percent increase over current staffing levels.
However, YouTube still relies heavily on algorithms to help police its content. As Wojcicki noted, YouTube plans to use machine learning technology to help it “quickly and efficiently remove content that violates our guidelines.”
This same technology has aided YouTube in flagging violent extremist content on the site, leading to the removal of over 150,000 videos since June.
“Today, 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms,” Wojcicki wrote. “Our advances in machine learning let us now take down nearly 70 percent of violent extremist content within eight hours of upload and nearly half of it in two hours and we continue to accelerate that speed,” she added.
The goal now is to turn those technologies toward a more difficult (and sometimes less obvious) area to police.
While some content is easier to spot – like videos where kids seem to be in pain, or being ‘pranked’ by parents in a cruel fashion – other videos exist in a much grayer area.
So many parents have roped their kids into their quest for YouTube stardom that it’s hard to draw a clear line between what’s appropriate and what’s not.
One question that needs to be raised is to what extent a preschooler or school-age child can really consent to participating in mom or dad’s daily videos. Shouldn’t they be free to play instead of being constantly instructed to act out various skits, or having the camera trained on them nonstop? These channels, after all, aren’t just the occasional fun video – they are often full-time jobs for parents. There are laws in the U.S. around child labor, and child actors in particular, but YouTube has continually danced around that line, as it’s “not really TV” – and that means it doesn’t have to play by TV’s rules regarding deceptive ads, junk food ads, and more.
In addition to the new policies and promises of increased staffing, YouTube also says it will publish regular reports offering transparency into aggregate data about the flags it receives, and the actions it takes to remove videos and comments that violate its content policies.
And most importantly, in terms of its business, YouTube says it will more carefully consider which channels and videos are eligible for advertising, using a set of stricter criteria combined with more manual curation.
“We are taking these actions because it’s the right thing to do,” wrote Wojcicki. “Creators make incredible content that builds global fan bases. Fans come to YouTube to watch, share, and engage with this content. Advertisers, who want to reach those people, fund this creator economy. Each of these groups is essential to YouTube’s creative ecosystem—none can thrive on YouTube without the other—and all three deserve our best efforts.”
Personally, I’d love it if YouTube cut off the ability for creators to make money from videos featuring children, period. Maybe the too-young stars could finally get a break and be allowed to just go be kids again. But I won’t hold my breath.
