YouTube is facing the problems that come with growth, says chief product officer Neal Mohan. In a recent interview, he emphasized that the video platform needs a "new set of rules and laws".
In recent years, YouTube has been striving to block bad actors who promote racism, violence and disinformation on the world-famous video platform. The Google-owned company has come under increasing scrutiny over many controversial videos, which is why a new set of rules is urgently needed.
"YouTube has now grown to a big city. More bad actors have come into place. And just like in any big city, you need a new set of rules and laws and kind of regulatory regime," Mohan said.
According to recent reports, Google has reached a multimillion-dollar settlement with the US Federal Trade Commission over alleged violations of children's data privacy laws on YouTube. Experts recommend that the video platform curb these negative aspects itself, lest governments clamp down with more stringent regulation.
YouTube is not the only video service seen as a haven for terrorists, Nazis and white supremacists. "We must adapt to make sure that those things don't become rampant on our platform," the YouTube chief product officer added.
YouTube: freedom means responsibility and censorship
YouTube reflects a problem of duality: people hail freedom of information, but they are not ready to accept that censorship is an inevitable part of public information.
Nowadays, tech giants are facing increased scrutiny. Community guidelines that were simple and straightforward 10 years ago no longer apply the same way. "They must be updated, they must be changed," the YouTube official said, adding that there is no quick solution because the process is too complex.
"You can't just write a hate speech policy in one weekend. It could result in many unintended consequences," Mohan said.
Updating YouTube's rules and policies requires extensive consultations with experts around the world. In the US, the company has people evaluate such content and then decides to what extent those videos will be recommended to other users.
That appears to have reduced recommendations of borderline content by around half, according to YouTube, and the system is due to be expanded to other countries.
Mr Mohan also suggested that some form of positive discrimination could be applied to "authoritative sources like AFP or CNN or BBC or the AP or whoever".