YouTube needs 'new set of rules and laws': executive

29 Jul, 2019

"YouTube has now grown to a big city. More bad actors have come into place. And just like in any big city, you need a new set of rules and laws and kind of regulatory regime," chief product officer Neal Mohan said in an interview.

Rising public pressure on YouTube and other social media platforms has prodded them to try to limit their negative aspects, lest governments clamp down with more stringent regulation.

Media reports said last week that the American tech giant Google had reached a multimillion-dollar settlement with the US Federal Trade Commission over alleged violations of children's data privacy laws on YouTube.

YouTube and other platforms have also been seen as havens for conspiracy theorists denying the Holocaust or the September 11 attacks, as well as for Nazi and white supremacist groups.

"We must adapt to make sure that those things don't become rampant on our platform," Mohan said.

YouTube said in June that it would ban videos promoting or glorifying racism and discrimination as well as those denying well-documented violent events, like the Holocaust.

"Two billion users come to the platform every single month," Mohan noted, "so we must take our responsibility as a platform incredibly seriously."

- Where 'the magic comes from' -

"We want to make sure that YouTube remains an open platform because that's where a lot of the magic comes from, even though there may be some opinions and voices on the platform that I don't agree with, that you don't agree with."

Tech giants are facing increased scrutiny, with some political leaders calling for a breakup of the dominant players and others seeking tougher privacy and content moderation rules.

"I think first and foremost that that is our responsibility," Mohan told AFP.

"Community guidelines that were simple and straightforward 10 years ago don't apply the same way. They must be updated, they must be changed."

However, the process is complex, and the executive warned against expectations of a rapid solution.

"You can't just write a hate speech policy in one weekend. It could result in many unintended consequences," he said.

"We update our policies, we consult with many third party experts all across the world" with varied political leanings and points of view to "come up with some language that can then be enforced," he added.

"It's not a trivial process to come up with a new policy and a new enforcement regime."

- 'Borderline' videos -

In addition to hate speech, Mohan pointed out that YouTube also scrutinises videos involving "harassment, harmful and dangerous pranks, spam and abuse, fraud," or threats to child safety.

He highlighted the problem of "borderline" videos that can spread "harmful misinformation" without explicitly violating YouTube's rules.

In the United States, the company has human reviewers evaluate such content and then decides to what extent the videos will be recommended to other users.

That appears to have reduced recommendations of borderline content by around half, according to YouTube, and the system is due to be expanded to other countries.

"People should also be allowed to say things that... not everyone might agree with," Mohan said, while also noting that "we don't have the obligation to recommend every piece of content in a similar fashion."

He suggested that some sort of positive discrimination could be applied to "authoritative sources like AFP or CNN or BBC or the AP or whoever".

In exchange, they would be challenged to provide "interesting and engaging" content.

However, he said YouTube would not shut the door on those outside the mainstream media.

"A new creator can come along and... by building up his credibility, building up his trustworthiness as a news source," could "establish themselves as an authoritative source on a particular topic."

Copyright AFP (Agence France-Presse), 2019
 
