EU to fine companies for failing to remove terrorism content
New legislation planned to make internet companies remove terror content within one hour
The European Union is planning to fine social media companies that do not take down terrorist content and related material within one hour.
The European Commission has drafted legislation imposing tough fines on internet companies that fail to remove such content quickly enough.
The Financial Times has reported that the EC has apparently lost patience with industry promises of self-regulation, and has decided to introduce legislation forcing social media and other internet companies to remove terrorist content from their platforms.
In March the EC published guidelines on control of all forms of illegal online content, including terrorist content, incitement to hatred and violence, child sexual abuse material, counterfeit products and copyright infringement.
The EC also specified operational measures that it expected internet companies to implement to improve fast removal of content, which included clearer procedures to flag content, faster response times, more co-operation with authorities, more effective tools for removal and better safeguarding of rights.
The March announcement noted that some companies had made progress in dealing with such content, removing an average of 70% of illegal hate speech flagged to them, but the report added: "Illegal content online remains a serious problem with great consequences for the security and safety of citizens and companies, undermining the trust in the digital economy."
The EC singled out terrorism-related content as a specific area of concern, and recommended more specific provisions to further curb terrorist content online. These included a "one-hour rule", which said that "considering that terrorist content is most harmful in the first hours of its appearance online, all companies should remove such content within one hour from its referral as a general rule."
Other recommendations called for faster detection and removal, including automated detection and means to stop content reappearing; better collaboration between companies and law enforcement; an improved referral system; and regular reporting from EU member countries on progress on referrals and removals.
The Commission said it would monitor industry responses to these recommendations to determine whether legislation was necessary. As the EU has now decided that the industry has not done enough, it seems likely that the new legislation will be based on these recommendations.