YouTube's New Policy to Curb Extremist Videos
After Facebook, which said it uses artificial intelligence to remove terror-related content, YouTube has also introduced four steps to confront extremist activity. YouTube has been working with various governments and law enforcement agencies to identify and remove this content, said Kent Walker, Senior Vice-President and General Counsel of Google.
The first of the four steps is expanding the use of its automated systems to better identify terror-related videos, using machine learning to “train new content classifiers to help us more quickly identify and remove such content”.
The second step is expanding the company's pool of “Trusted Flagger” users: a group of experts with special privileges to review flagged content that violates the site’s community guidelines.
The third step is placing videos that contain inflammatory religious or supremacist content, but do not violate community standards, behind a warning. The fourth step is stepping up counter-radicalisation efforts by building on the “Creators for Change” program, which will redirect users targeted by extremist groups such as Islamic State (IS) to counter-extremist content.
On Sunday, Google, YouTube’s parent company, announced this set of policies aimed at curbing extremist videos on the platform. For videos that clearly violate its community guidelines, such as those promoting terrorism, Google said it would quickly identify and remove them.