The Digital Services Act (DSA) is a new regulation in EU law that updates the Electronic Commerce Directive 2000. It is a ground-breaking piece of legislation that came into effect on August 25th, 2023 and marks a new era of accountability for online platforms.
The DSA and its sister legislation, the Digital Markets Act, which is also being phased in over the coming months, aim to place greater social responsibility on online platforms and to hold them legally accountable for everything from the content they host to how they target and interact with their users. The DSA aims to create a safer online environment by cracking down on illegal content and disinformation, stopping the sale of illegal goods and services online, stamping out propaganda, hate speech and crimes such as harassment and child abuse, and ensuring that the fundamental rights recognised in law across Europe, including freedom of expression and data protection, are safeguarded. Companies that do not comply with the new law risk hefty fines and even an EU-wide ban.
The DSA applies to all companies serving the EU, but it is really designed to target the very large online platforms that have the most significant impact on the digital market. Small businesses operating within the EU will still need to comply with the DSA but won’t be subject to the same level of scrutiny as larger companies. The toughest obligations apply to the services designated by the European Commission as “very large online platforms”, a group of 17 that includes Facebook and Amazon, and as “very large online search engines”, namely Google Search and Bing.
Here are some of the main areas of focus for the DSA. It is by no means an exhaustive list, but it gives an overview of some of the most important changes.
The DSA is designed to improve content moderation on social media platforms in order to address concerns about illegal content. The new laws require platforms to take proactive measures against illegal content such as hate speech, terrorist propaganda, child sexual abuse material and counterfeit goods. Platforms must also provide users with an easy way of reporting such content, and content that is deemed illegal must be removed.
The DSA requires platforms to be transparent about their policies regarding advertising, content moderation and data collection. Platforms must also provide users with clear information about how their data is being used.
Platforms will be forced to take measures against disinformation campaigns, such as those carried out by foreign actors during elections, and against propaganda.
The DSA gives users more control over their data by requiring platforms to provide them with clear information about how their data is being used. Users must also be given the right to opt-out of targeted advertising.
The DSA gives regulators more power to enforce its provisions by allowing them to impose fines of up to 6% of a company’s global revenue for non-compliance.
Until now, the power of regulators to combat issues such as the spread of misinformation or violations of antitrust laws has been limited, but the DSA is designed to change that by requiring platforms to share more information with regulators about how they moderate content, decide what users see and use artificial intelligence. Companies must grant approved researchers and auditing firms access to their internal data so that compliance can be verified.
The DSA requires companies to ensure a high level of privacy for their users by redesigning their systems to ensure that data protection laws are observed.
Safety of Minors
Platforms will be prohibited from targeting children with advertising based on their personal data or cookies. They will be required to redesign their systems to ensure a “high level of privacy, security and safety of minors” and to prove to the European Commission that they have done so. In addition, they will have to carry out a comprehensive risk assessment of the possible negative effects of their content and services on children’s mental health. This element of the law has a potentially huge impact and has long been fought for.
The DSA requires companies that use recommender systems to ensure that those systems do not promote harmful or illegal content, and to give users the right to opt out of recommendations and of any kind of profiling based on their personal data. Social media companies will not be able to use sensitive personal data, including race, gender, religion or sexual orientation, to target users with adverts or content.
The DSA requires the large online platforms to cooperate with authorities during crises such as terrorist attacks or natural disasters.
As with any new legislation, initial reactions to the act have been mixed. Supporters argue that it is a necessary step towards regulating the digital market and ensuring accountability for illegal content, harmful practices, and crime. They believe that the DSA will help protect fundamental rights recognised in law across Europe, including freedom of expression and data protection, and create a level playing field for all businesses operating within the EU digital market. Critics argue that the DSA is too broad in scope and could lead to censorship of free speech and stifle innovation. The Digital Services Act is a ground-breaking law that cannot be ignored, and it is surely a step in the right direction towards regulating digital services across the European Union, providing a safer and more secure environment for millions of users.