The EU approves forcing platforms such as Facebook and YouTube to remove terrorist content within an hour

The plenary session of the European Parliament this Thursday gave the green light to the new regulation, which will force digital platforms such as Facebook and YouTube to remove terrorist content from their websites within a maximum of one hour in order to prevent it from spreading further across the network.

The adoption by the European Parliament of the agreement reached between the institutions last month is the final step before the new regime applies: it will enter into force two months after its forthcoming publication in the EU's Official Journal.

The new rules apply to all internet companies offering their services in the European Union, regardless of where a platform's legal seat is or how large it is, and they contain a detailed definition of what counts as "terrorist content".


Under the new rules, the authorities of a member state can order platforms to remove content used as propaganda or to block access to it from any EU country.

Online providers have a maximum of one hour to comply with such an order. It is up to the Member States to set the framework for penalties in the event of non-compliance, and they are also responsible for ensuring that platforms follow the orders issued to them.

In addition, platforms must take specific measures to prevent misuse of their services and to stop their networks from being used as channels for the dissemination of illegal content. However, it is up to the companies themselves to decide which measures to implement in order to meet this obligation.

The rule also provides safeguards against misuse, including a complaint mechanism, so that content blocked or removed by mistake can be restored as soon as possible.

Finally, educational, investigative, artistic and journalistic material is protected. Companies are not required to filter all content uploaded to their networks or to use automated monitoring tools, although they must act and take specific measures when the authorities flag dangerous content, in order to prevent its wider spread.