Terrorist content online should be removed within one hour, European Parliament decides

Companies that systematically and persistently fail to abide by the law may be sanctioned with up to 4% of their global turnover

A hosting company has to act quickly on any terrorist material posted on its platform after it receives a removal order from the competent national authorities

Internet companies should remove terrorist content within one hour of receiving an order from the authorities, in order to combat radicalisation and contribute to public security.

With 308 votes in favour, 204 against and 70 abstentions, the European Parliament on Wednesday backed a proposal to tackle the misuse of internet hosting services for terrorist purposes. Companies that systematically and persistently fail to abide by the law may be sanctioned with up to 4% of their global turnover.

Once an internet company that hosts user-uploaded content (such as Facebook or YouTube) and offers its services in the EU receives a removal order from the competent national authority, it will have one hour to remove the content or disable access to it in all EU member states.

However, companies will not be under a general obligation to monitor the information they transmit or store, nor will they have to actively seek facts indicating illegal activity.

To help smaller platforms, MEPs decided that when a company has never received a removal order before, the competent authority should contact it at least 12 hours before issuing the first order to remove content it is hosting, to provide information on procedures and deadlines.

If a company has been subject to a substantial number of removal orders, the authorities may request that it implement additional specific measures (e.g. regularly reporting to the authorities, or increasing human resources). MEPs in the Civil Liberties Committee agreed not to impose an obligation to monitor uploaded content or to use automated tools.

The legislation targets any material (text, images, sound recordings or videos) that “incites or solicits the commission or contribution to the commission of terrorist offences, provides instructions for the commission of such offences or solicits the participation in activities of a terrorist group”, as well as content providing guidance on how to make and use explosives, firearms and other weapons for terrorist purposes.

Content disseminated for educational, journalistic or research purposes should be protected, according to MEPs. They also make clear that the expression of polemic or controversial views on sensitive political questions should not be considered terrorist content.

Daniel Dalton, EP rapporteur for the proposal, said: "There is clearly a problem with terrorist material circulating unchecked on the internet for too long. This propaganda can be linked to actual terrorist incidents and national authorities must be able to act decisively. Any new legislation must be practical and proportionate if we are to safeguard free speech. Without a fair process, there is a risk that too much content would be removed, as businesses would understandably take a ‘safety first’ approach to defend themselves. It also absolutely cannot lead to a general monitoring of content by the back door."