Social Media: Regulation

Department for Digital, Culture, Media and Sport written question – answered on 21st May 2021.


Chi Onwurah, Shadow Minister (Business, Energy and Industrial Strategy), Shadow Minister (Digital, Culture, Media and Sport)

To ask the Secretary of State for Digital, Culture, Media and Sport, what assessment he has made of the potential merits of requiring social media companies to report on the algorithms they use to monitor online hate speech on their platforms and any biases found within those algorithms.

Caroline Dinenage, Minister of State

Hate speech is completely unacceptable in an open and tolerant society. Our new laws will mean social media companies must keep their promises to users about their standards and stamp out this sort of abuse. Companies will need to take steps to mitigate the risks of harm associated with their algorithms. This will apply in the case of illegal content, and companies will in particular need to ensure that systems for targeting content at children, such as the use of algorithms, protect them from harmful material.

Ofcom will have a range of powers at its disposal to help it assess whether companies are fulfilling their duties. The largest and highest-risk companies will also be required to produce transparency reports, which will include information about the steps they are taking to protect users. These reports may include information about the processes and tools in place to address illegal and harmful content and activity, including, where appropriate, tools to identify, flag, block or remove such content.
