Facebook and Twitter face €50m fines if they don’t tackle hate speech

New laws in Germany will force Facebook, Twitter and Google to delete hate speech within 24 hours and file quarterly reports

Six months after proposing regulations that would allow it to fine social networks for failing to tackle hate speech, Germany has passed these proposals into law.

Justice Minister Heiko Maas explained that, under the Network Enforcement Act, social media companies will face fines for failing to remove “obviously illegal” content within 24 hours. For content whose illegality is less clear-cut, the companies will have a week to take it down. Examples given by Maas include hate speech, defamation, and incitements to violence.

When the law comes into effect in October, the sites will face fines starting at €5 million, rising to as much as €50 million for repeat offences or extreme cases. The regulations expand on the EU Code of Conduct on Countering Illegal Hate Speech Online, which similarly requires social networks to review and remove hateful content within 24 hours. While social networks agreed to that voluntary code, it included no oversight or enforcement mechanism. German politicians felt the networks had not fulfilled their obligations under the code, which also allowed them to argue that any content left up did not breach their terms.


A recent example of this, separate from the issue of hate speech, was brought to light by a BBC investigation, which revealed that 80 per cent of child abuse images the BBC reported to Facebook were not removed. Facebook responded to the allegations by asking the BBC to send it examples of the material, then reported the BBC journalists to British authorities for sending illegal content.

“We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards. This content is no longer on our platform,” Simon Milner, Facebook’s UK policy director, said at the time. The social network still failed to clarify whether all the reported content had been removed. Under the new German law, Facebook will be required to explain why it didn't remove content when requested.

Since the proposals were made, Facebook, in partnership with the Institute for Strategic Dialogue, has launched the Online Civil Courage Initiative (OCCI) in the UK, a counterspeech programme to help tackle online extremism and hate speech.

The initiative is being jointly announced in London by Facebook's chief operating officer Sheryl Sandberg and the Institute for Strategic Dialogue's CEO Sasha Havlicek, alongside founding partners Brendan Cox, husband of murdered MP Jo Cox and head of the Jo Cox Foundation; Mark Gardner of the Community Security Trust; Fiyaz Mughal of Tell MAMA; and Shaukat Warraich of Imams Online.

The OCCI is being set up to offer "financial and marketing support to UK NGOs working to counter online extremism" and will bring together experts to develop best practice and tools. This includes training for NGOs to help them monitor and respond to extremist content; a support desk so they can contact Facebook directly; marketing support for counterspeech campaigns, including Facebook advertising credits; knowledge sharing with NGOs, government and other online services; and financial support for academic research on online and offline patterns of extremism.

This article was originally published by WIRED UK