
EU’s digital content law now applies to all platforms

The EU’s flagship digital content law, the Digital Services Act (DSA), now applies to all platforms hosting or disseminating digital content within the EU.

Since August 2023, the law has applied to very large platforms – those with more than 45 million monthly active users in the EU.

That initial designation covered fewer than two dozen sites and search engines.

The likes of TikTok, Amazon, Facebook, Instagram and Google all exceeded that threshold.

Well-known websites, including eBay, initially escaped being classed among the biggest online platforms but will now face extra scrutiny and possible sanctions.

From 17 February, the rules apply to all platforms – though companies with fewer than 50 staff and an annual turnover of less than €10m are exempt from many of the more burdensome rules.

They will still have to set clear terms and conditions, however, and provide a contact point for authorities.

Among the obligations now facing digital platforms is a requirement to remove or block access to illegal content promptly once they become aware of it.

They must also quickly inform the relevant authorities if they suspect a criminal offence that threatens the safety of others.

Users who repeatedly share illegal content must be suspended

There is an obligation to suspend users who repeatedly post or share illegal content, including hate speech and fake or fraudulent ads.

Online shopping platforms must also suspend or block fraudsters, as well as take steps to verify the identity of traders selling through their services.

The very large platforms still face the most stringent requirements, including assessing the risks of illegal and harmful content across their services.

They are required to mitigate these risks through a number of measures, such as improved content monitoring and moderation.

They will also be audited annually by independent organisations, at the platforms’ own expense, to ensure that they are complying with their obligations.

Platforms used by minors are required to protect those users’ privacy and security, as well as their physical and mental well-being.

This could include age verification tools, parental controls, and tools that make it easier for users to flag abuse or access support.

