State of the Union 2018: Commission takes action to get terrorist content off the web - Questions and Answers

Reproduced with thanks from the European Commission (EC), published on Wednesday 12 September 2018.

Why do we need legislation to tackle terrorist content online and why now?

Rapid detection and removal of terrorist content online is crucial to prevent its further dissemination across other platforms. In January 2018 alone, almost 700 new pieces of official Da'esh propaganda were disseminated online, representing a very real risk to European society. The ability to spread this type of propaganda rapidly across platforms demands an equally rapid response.

While voluntary initiatives, including under the EU Internet Forum, have achieved positive results, terrorist propaganda remains easily accessible online, and the level and pace of response continue to vary. In some cases, internet platforms have not engaged in voluntary efforts or have not taken sufficiently robust action to reduce access to terrorist content online. In addition, diverging procedures and, in some cases, regulatory actions across Member States limit the effectiveness and efficiency of cooperation between authorities and hosting service providers.

Many of the recent attacks within the EU have exposed terrorists' use of the internet to plan attacks, and there is continuing concern about the role of the internet in allowing terrorist organisations to radicalise, recruit, train, facilitate and direct terrorist activity. The European Parliament and the European Council called on the Commission in 2017 and again in 2018 to present proposals to address these issues. These calls were echoed by statements issued by the leaders of the G7 and G20 in 2017 as part of the shared effort to tackle terrorism both offline and online.

This is why the European Commission is today proposing new legislation to get terrorist content off the web, making sure that the same obligations are imposed in a uniform manner across the whole Union. The proposed Regulation builds on the Communication of September 2017 and the Recommendation of March 2018 as well as the work of the EU Internet Forum, and specifies the obligations and responsibilities of hosting service providers and Member States, thereby increasing legal certainty.

Why aren't other types of illegal content included in this legislation?

This proposal takes into account the particular urgency of stopping the dissemination of terrorist content online and aims to put in place uniform measures that address this type of especially harmful content posing an imminent security risk to Europeans.

This does not mean that the Commission's work on other types of illegal content is less important. The Commission is working at full speed to curb the spread of other types of harmful illegal content, including hate speech and child sexual abuse online, through various measures:

  • On illegal hate speech, the implementation of the Code of Conduct on countering illegal online hate speech shows that voluntary collaboration can yield very positive results in a relatively short time. The Commission will continue to monitor the implementation by participating IT companies with the help of civil society organisations. Currently 70% of illegal online hate speech is removed upon notification and in more than 80% of the cases the assessment is made within 24 hours. The initiative has also been extended to other online platforms.
  • On child sexual abuse, the work of the WePROTECT Global Alliance to end child sexual exploitation online, and the long-standing EU support to the INHOPE network of hotlines on child sexual abuse content, continue to make progress. An analysis of the current situation is ongoing and should provide valuable insights into whether further intervention is needed.
  • On counterfeit products, recent guidelines released by the Commission should give further clarity to hosting services and improve the results of the specific measures. In addition, participation under the Memorandum of Understanding on the sale of counterfeit goods via the internet has brought an enhanced framework for cooperation. Other voluntary actions are emerging, most recently a Memorandum of Understanding on online advertising, signed on 25 June 2018.
  • On other types of illegal content, legislative proposals (such as the Copyright Directive and the New Deal for Consumers) and emerging cooperation (such as the Consumer Protection Joint Actions) are expected to yield positive results.

What is online terrorist content?

Terrorist content online refers to material and information that incites, encourages or advocates terrorist offences, provides instructions on how to commit such crimes or promotes participation in activities of a terrorist group. Such material might include texts, images, sound recordings and videos.

The definition outlined in the new rules is fully aligned with the definition of terrorist offences set out in the existing Terrorism Directive.

When assessing whether online content constitutes terrorist content, the responsible authorities as well as hosting service providers should take into account factors such as the nature and wording of the statements, the context in which they were made, including whether the material is disseminated for educational, journalistic or research purposes, and their potential to lead to harmful consequences.

For example, the expression of radical, polemic or controversial views in the public debate on sensitive political questions should not be considered terrorist content.

Which service providers will be affected by the proposals?

The new rules apply to all hosting service providers offering services in the EU, irrespective of their size or where they are based. Today, the hosting service providers most exposed to terrorist content are in fact based outside the EU. Service providers that do not have their headquarters in an EU Member State but offer services within the Union are asked to designate a legal representative within the EU in order to facilitate compliance with the new rules. They will then fall under the jurisdiction of the Member State where the legal representative or their company seat is located.

Hosting service providers - that is, services that store information supplied by users and make it available to third parties - will be bound by the new rules. Examples include social media platforms, video streaming services, video, image and audio sharing services, file sharing and other cloud services, and websites where users can post comments or reviews.

Are there special rules or support for small hosting service providers?

Since terrorist content increasingly affects smaller service providers, the new rules do not exclude companies on account of their size, but instead provide targeted measures depending on the level of risk, while also taking into account their economic capabilities. Prior assessment and the use of standardised removal orders containing all the relevant information will support smaller companies in taking swift action.

Taking account of possible financial and technological burdens, there are mechanisms that small enterprises can benefit from. For example, the EU Internet Forum offers a space for cooperation and exchange between all relevant actors, including companies, Member States and Europol. Furthermore, numerous small and medium-sized enterprises have already benefitted from shared tools, which they have adapted to their own content policies.

What are the measures proposed in the Regulation?

1. Removal orders

The new rules introduce binding removal orders. These orders, issued by national authorities, require hosting service providers to remove terrorist content online or disable access to it within 1 hour. As things stand, hosting service providers face diverging rules on removal orders sent by some Member States, while other Member States refer material on a voluntary basis, either directly to companies or via Europol, and have little or no power to ensure its subsequent removal. Under the new rules, failure to comply with a removal order may result in financial penalties.

Removal orders will be an important tool for Member States, which may also wish to continue using existing voluntary referral arrangements, particularly where hosting service providers respond swiftly and effectively. Referrals of terrorist content, however, are not binding and carry no specific timeframe.
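
For illustration only, a removal order can be thought of as a structured record against which the 1-hour deadline is checked. The Python sketch below is a hypothetical representation: the field names are assumptions made for this example, not the fields prescribed by the proposed Regulation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class RemovalOrder:
    # Illustrative fields only; the proposed Regulation specifies the
    # actual content of a removal order.
    issuing_authority: str   # national competent authority
    content_url: str         # location of the terrorist content
    received_at: datetime    # moment the provider receives the order

    @property
    def deadline(self) -> datetime:
        # The proposal requires removal, or disabling of access,
        # within 1 hour of receipt.
        return self.received_at + timedelta(hours=1)

    def is_overdue(self, now: datetime) -> bool:
        return now > self.deadline

# Hypothetical example values.
order = RemovalOrder(
    issuing_authority="Example National Authority",
    content_url="https://example.com/content/123",
    received_at=datetime(2018, 9, 12, 10, 0, tzinfo=timezone.utc),
)
print(order.deadline)  # 2018-09-12 11:00:00+00:00
print(order.is_overdue(datetime.now(timezone.utc)))
```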

2. Strong penalties

Member States will have to put in place effective, proportionate and dissuasive penalties for not complying with orders to remove online terrorist content. In the event of systematic failures to remove such content within 1 hour following removal orders, a service provider could face financial penalties of up to 4% of its global turnover for the last business year.
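
For illustration (using a hypothetical figure): a hosting service provider with a global turnover of €100 million in the last business year could, in the event of systematic non-compliance, face penalties of up to 4% × €100 million = €4 million.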

3. Duty of care obligation

The new rules require hosting service providers to take proactive measures, including, where appropriate, the deployment of automated detection tools, when they are exposed to the risk of hosting terrorist content. This will ensure that affected service providers do not depend only on the authorities or third parties flagging terrorist content, but take proactive measures to prevent their services from being exploited by terrorists. Service providers should also report on the proactive measures put in place after having received a removal order.

These proactive measures should be proportionate to the risk and to the economic capacity of hosting service providers. They might comprise measures to prevent the re-upload of removed terrorist content or tools to identify new terrorist content, whilst recognising the need for oversight and human assessment to ensure that legal content is not removed. Such measures should be decided primarily by the hosting service providers themselves and, where necessary, in dialogue with national authorities. National authorities may, as a last resort, impose specific proactive measures where the measures put in place by hosting service providers prove insufficient.
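
As a minimal sketch of one such proactive measure - preventing the re-upload of already-removed content by fingerprint matching, with matches routed to human review - the Python example below is illustrative only; the hash set and function names are assumptions, not components specified in the proposal.

```python
import hashlib

# Fingerprints of content previously removed as terrorist material
# (illustrative; in practice providers cooperate through shared
# industry hash databases).
removed_content_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 fingerprint of uploaded content."""
    return hashlib.sha256(content).hexdigest()

def screen_upload(content: bytes) -> str:
    """Flag exact re-uploads of previously removed content.

    Automated matching alone is not decisive: the proposal requires
    human oversight so that legal content is not removed by mistake.
    """
    if fingerprint(content) in removed_content_hashes:
        return "flagged_for_human_review"
    return "published"

print(screen_upload(b"test"))          # matches the example hash above
print(screen_upload(b"other upload"))  # no match
```

Exact matching of this kind only catches identical re-uploads; modified copies require more robust techniques (for example perceptual hashing), which is one reason the rules insist on human oversight before content is removed.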

4. Strong safeguards

The new rules will require hosting service providers to put in place effective safeguards to ensure full respect of fundamental rights, such as freedom of expression and information. In addition to possibilities of judicial redress, such safeguards will include the possibility for hosting service providers and content providers to contest a removal order as well as effective complaint mechanisms for content providers where hosting service providers have taken down content unjustifiably.

5. Increased cooperation

Hosting service providers and Member States will be obliged to nominate points of contact to facilitate the swift handling of removal orders and referrals. This will help improve cooperation between Member States and the companies, where outreach efforts have at times been difficult.

A hosting service provider's point of contact does not have to be located in the EU but should be available 24/7 to ensure that terrorist content is removed, or access to it is disabled, within 1 hour of receiving a removal order.

Cooperation between Europol, Member States and hosting service providers is encouraged and will be further enhanced in the transmission of removal orders and referrals.

6. Transparency and accountability

The new rules will provide for greater accountability and transparency. Companies and Member States will be required to report on their efforts and the Commission will establish a detailed programme for monitoring the results and impact of the new rules. To enhance transparency and accountability towards their users, online platforms will also publish annual transparency reports explaining how they address terrorist content on their services.

Why was the removal timeframe set at 1 hour?

Terrorist content is most harmful in the first hours of its appearance because of the speed at which it can spread. For instance, research has found that one third of all links to Da'esh propaganda are disseminated within 1 hour of release.

Once uploaded, terrorist content is not always immediately detected and is able to move quickly from one platform to another. Furthermore, it is not always quickly removed even when referred to the companies by law enforcement authorities or by Europol. This allows terrorists to operate online with ease: to groom and recruit for terrorist purposes, facilitate terrorist activity, provide instructions on how to conduct attacks, intimidate the public and glorify their atrocities.

The short timeframe for removal is considered necessary to reduce the volume of terrorist content, as well as the number of viewers who can access it.

Does failure to remove all terrorist content online automatically mean fines for the hosting service providers?

No, each case will need to be assessed by the responsible national authorities. Member States will need to set out rules on effective, proportionate and dissuasive penalties. When imposing sanctions, the national authorities are asked to take into account factors such as the gravity, duration and nature of a breach, but also whether the breach was intentional or negligent, whether it was a first breach by the provider in question, and the provider's economic standing and willingness to cooperate with the authorities. Given the particular importance of the swift removal of terrorist content identified in a removal order, specific rules will be put in place on financial penalties for systematic breaches of this requirement, reaching up to 4% of the hosting service provider's global turnover in the last business year.

How will this proposal affect companies?

The new rules will help prevent the dissemination of terrorist content online and make it easier for companies to operate in the Digital Single Market. They will increase legal clarity for companies and help them protect their services against exploitation for terrorist purposes, avoid reputational damage and strengthen the trust of their users.

Hosting service providers will need to put in place adequate and effective operational and technical measures to ensure the swift detection, identification and removal of terrorist content, including the designation of a point of contact and, where necessary, a legal representative.

Who is responsible for detecting and reporting terrorist content?

1. What are the hosting service providers responsible for?

The new rules set a clear legal framework. Service providers need to react and remove content within 1 hour of receiving a removal order from national authorities. They also need to assess referrals and collaborate with both national authorities and Europol. In addition, hosting service providers exposed to terrorist content are expected to take proportionate proactive measures to reduce access to terrorist content on their services and to prevent its hosting, upload and re-upload. These proactive measures might include tools designed to detect new terrorist material. There is, however, no general monitoring obligation. The new rules also require hosting service providers to set out in their terms and conditions their policy with regard to terrorist content.

2. What is the role of Member States?

National authorities are tasked with detecting and identifying terrorist content and issuing removal orders and referrals. In doing so, they should cooperate with hosting service providers, with the authorities in other Member States and with Europol. To avoid duplication and possible interference with investigations, they should inform and cooperate with each other and with Europol when issuing removal orders or sending referrals to hosting service providers.

When supervising the application of the new rules, national authorities can also request hosting service providers to report on the measures they are taking. If such measures are not considered sufficient, Member States should have the possibility to impose certain additional measures on the hosting service providers. Any such measures should, however, take into account the economic and operational capacity of the hosting service provider, so as to limit the risk that legal content is removed.

3. What is the role of Europol?

Europol - particularly its EU Internet Referral Unit - will continue its current work, and will have a supporting role in implementing the new rules. Since its establishment in 2015, the EU Internet Referral Unit has referred over 60,000 pieces of content to over 130 companies and played a crucial role in facilitating cooperation between Member States and hosting service providers. The EU Internet Referral Unit will continue to actively scan the internet for terrorist content and refer it to the hosting service providers, accompanied by an assessment. Member States can channel removal orders and referrals via the tools and platforms to be developed by Europol.

Similarly, national authorities and hosting service providers can make use of Europol's supportive function and existing tools in the identification and implementation of proactive measures. Such cooperation will improve the ability to act collectively against terrorist content, avoiding duplication.

Can hosting service providers and/or content providers challenge the requests or decision to remove content?

Yes. Both hosting service providers and content providers have the possibility to contest a removal order issued by a national authority. In addition, where hosting service providers take a decision to remove content - following a referral or on their own initiative - content providers can ask for a review. For that purpose, the hosting service providers are asked to set up user-friendly complaint mechanisms and ensure transparency and swift follow-up.

Will the new rules censor the internet?

No. The rules have been set out in full respect of fundamental rights protected in the EU - notably those guaranteed in the Charter of Fundamental Rights of the European Union - and only concern the spread of terrorist content online. Additionally, the new rules provide for strong safeguards to guarantee the protection of fundamental rights, particularly the rights to freedom of expression and information, and to ensure that legitimate material is not removed.

What safeguards will be put in place to protect fundamental rights and freedom of speech?

The new rules provide for robust safeguards to ensure that measures to remove terrorist propaganda are necessary, appropriate and proportionate within a democratic society and do not lead to the removal of material that is protected by freedom of expression and information. Any measures should be strictly targeted to curb the proliferation of terrorist content online.

A clear definition and prior assessment by national authorities will ensure that only terrorist content is removed. Increased transparency regarding the hosting service providers' policies and actions taken to remove terrorist content, combined with user-friendly complaint mechanisms, as well as reporting to public authorities, will ensure effective control and accountability.

Where the hosting service providers use automated means to identify and remove terrorist content, they must ensure that any such decisions are accurate, well-founded and subject to human oversight and verification. They need to inform users when content is removed and put in place complaint mechanisms.

Will the Regulation impact Member States' investigations?

The new rules take into account the fact that the removal of some terrorist content could affect ongoing or future criminal investigations and effective prosecution. Member States may therefore request that removed content be retained for such purposes. For that reason, companies are required to retain removed content and related data for 6 months.

The new rules also stipulate that where hosting service providers become aware of material that is evidence of a terrorist offence, they should promptly inform the relevant authorities. This will help ensure that law enforcement and security partners have the best possible chance to take action.

What else is the Commission doing to counter radicalisation online?

Reducing accessibility to terrorist content online is only one aspect of the EU's response to radicalisation online. The EU Internet Forum also empowers civil society partners to increase the volume of alternative narratives online. The Civil Society Empowerment Programme, funded by the Commission, provides the skills and knowledge to deliver effective campaigns online. Following a pan-European training programme for civil society partners, a first call for proposals, with a budget of €6 million, has been launched to support credible voices in the dissemination of positive, alternative narratives throughout Europe.

Will the voluntary framework of cooperation continue?

Yes. The next meeting of the EU Internet Forum will be convened in December 2018. Voluntary arrangements under the EU Internet Forum have produced positive results. The aim of today's proposal is to reinforce these efforts and ensure that all companies at risk comply with a minimum set of requirements. But beyond the legislative proposal, tackling the challenge of terrorist content online is a common effort that will require more, not less, cooperation between the private sector and public authorities. The Commission expects public-private cooperation under the EU Internet Forum not only to continue but to strengthen in the future.

For more information

Webpage on the State of the Union 2018

Press release: Commission proposes new rules to get terrorist content off the web

Factsheets, legal documents and other useful documents - all available here

MEMO/18/5711

 

Press contacts:

General public inquiries: Europe Direct by phone 00 800 67 89 10 11 or by email