Digital Services Act Information Hub

All the information you need for the European Digital Services Act

The European Digital Services Act is an EU regulation that modernizes the e-Commerce Directive’s rules on illegal content, transparent advertising and disinformation online.
It strives to make the internet a safer place for citizens of, and visitors to, the EU, putting an end to harmful misinformation, the sale and purchase of illegal goods and the broadcasting or sharing of harmful content. The terms of the act will apply to all relevant businesses accessible online to EU citizens, and those businesses will need to ensure that their platforms are compliant.
Visual Content Moderation

Computer Vision and the Digital Services Act

Content moderation will be a key operation in ensuring your business is compliant. Relying on ever more human moderators is not the answer!

As visual media often poses the greatest content moderation challenge, computer vision will play an important role as the DSA comes into effect. The volume of visual media far exceeds human capacity to moderate it. Moreover, the content is often simply too disturbing for most people to handle.

Computer Vision works at machine speed so it can process hundreds of millions of images and videos per day. It has no emotional reaction to the content and it never gets tired. That’s why computer vision is the answer to the increased challenge the DSA may bring to the doors of thousands of companies.
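
To make that concrete, here is a minimal sketch of what high-volume automated visual moderation can look like, assuming a hypothetical HTTP endpoint (api.example.com) that scores an image against moderation categories. The endpoint, field names and thresholds are all illustrative, not a description of VISUA’s actual API.

    # Minimal visual moderation loop against a hypothetical scoring endpoint.
    import requests

    MODERATION_URL = "https://api.example.com/v1/moderate"  # hypothetical

    def moderate_image(image_url: str, threshold: float = 0.8) -> dict:
        """Score one image and flag any category at or above the threshold."""
        response = requests.post(MODERATION_URL, json={"image_url": image_url}, timeout=10)
        response.raise_for_status()
        scores = response.json()["scores"]  # e.g. {"weapons": 0.02, "nudity": 0.91}
        flagged = {label: s for label, s in scores.items() if s >= threshold}
        return {"image_url": image_url,
                "action": "remove" if flagged else "allow",
                "flagged": flagged}

Because each image is scored independently, the loop parallelises trivially: fanning requests out across worker processes or async tasks is how systems reach the hundreds of millions of items per day mentioned above.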

Many types of visual moderation are possible with VISUA’s Computer Vision.

Browse the content throughout this page to get a strong understanding of what will be expected of your business under the Digital Services Act, and how to utilize computer vision to your benefit.

Digital Services Act Factsheet

VISUA Factsheet – DSA

Visual Content Moderation Explained

THE DIGITAL SERVICES ACT EXPLAINED – FAQ

What is the Digital Services Act?

In its simplest explanation, it’s about protecting European users online.

The European Union has been on a mission for the last decade to tackle big issues facing its citizens. The first watershed legislation was the General Data Protection Regulation (GDPR), which imposed a framework and laws to protect the privacy of European citizens. Now it has followed up with the Digital Services Act (DSA), an equally momentous piece of legislation that is designed to protect users from illegal content and goods online.

The DSA establishes a framework of governance that platforms, marketplaces, eCommerce sites and websites in general must follow to gain and retain compliance. This framework covers a wide range of content, from marketplaces selling fake goods, to social platforms allowing the posting of hate speech and disinformation, and even domain registrars and hosting companies allowing the registration and creation of fake sites for phishing.

But it goes a step further, as the DSA will also ensure that online platforms share the specifics of how their systems and algorithms work with respect to protecting users. For instance, how do they detect and remove illegal goods and content quickly, and how do they detect and block misinformation and act on the users who spread it?

More information on the Digital Services Act is available here.

What is the purpose of the Digital Services Act?

Ursula von der Leyen, European Commission President, said that the DSA “gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms.”

As such, the purpose of the DSA is to create a safer digital space for EU Citizens where the fundamental rights of users are protected and to establish a level playing field for businesses. In practical terms, it aims to stop the sale of harmful goods and block the publishing of harmful content by placing liabilities on platforms and service providers around transparency, openness & reporting, and content moderation.

Who does the Digital Services Act apply to?

The DSA touches virtually every type of online platform; however, these can be categorised as platforms or companies that operate within the European Union territories and:

  • run any form of online store
  • are a provider of eCommerce platforms
  • are a social media platform
  • run any form of platform that allows the sharing of content
  • offer network infrastructure and domain name services
  • provide cloud and web hosting services


The DSA has a particular focus on VLOPs (Very Large Online Platforms) and VLSEs (Very Large Search Engines), but small and medium-sized platforms still have significant obligations.

Additionally, the DSA affects a wider group of businesses than ever before, going beyond the obvious social media platforms, eCommerce sites, and marketplaces to include intermediary platforms and services. This includes web hosting providers, email service providers, and any other company that allows goods to be listed, content to be hosted and information to be shared, but does not directly engage with consumers.

A breakdown of the company types and the specifics of their liabilities is provided in this Digital Services Act factsheet.

Who proposed the DSA?

Ursula von der Leyen proposed the concept of a “Digital Services Act” in her 2019 bid for the European Commission’s presidency. Ultimately the original text for the DSA was prepared by the ‘Executive Vice President of the European Commission for A Europe Fit for the Digital Age’, Margrethe Vestager, and by the ‘European Commissioner for Internal Market’, Thierry Breton, as members of the Von der Leyen Commission.

Why did the EU introduce specific rules for digital services?

The World Wide Web and social media platforms have undoubtedly been two of the most influential innovations in human history, enabling easier and faster communication, collaboration, information sharing and commerce. However, they have also allowed bad actors to exploit these platforms for financial, political and ideological gain.

In the last decade, the EU has ratified a number of pieces of legislation to try to curb this abuse of online platforms. Much of this was based on collaboration with platforms and self-regulation. Although partially successful, numerous gaps in efficacy remained. So, just as the General Data Protection Regulation forced platforms and services to comply with a strict framework to protect EU citizens’ privacy, the Digital Services Act will compel them to comply with transparency, openness & reporting, and content moderation requirements to protect their EU users and subscribers from harmful and hateful products and content.

Is the DSA a regulation or a directive?

The Digital Services Act (and its sister Digital Markets Act) is a regulation, not a directive. This means it applies directly and uniformly across all Member States, without needing to be transposed into national law, and it compels online platforms and service providers to take action to be compliant, with non-compliance leading to financial penalties. Previous directives relied on cooperation and self-regulation. These were partially successful, but as the importance of platforms and other digital services has grown exponentially, so have the negative impacts that can shape economic, political, social and cultural life. The EU therefore felt that it was time to align the regulations between political and state authorities in order to assert some control over the ways in which platforms and other digital actors operate.

What’s the difference between the digital services act and the digital markets act?

In short, the Digital Markets Act concerns itself with large online platforms defined as ‘gatekeepers’ of the online world (think Google, Microsoft, Facebook, Amazon). Another distinguishing factor is its ex-ante approach, which bases regulation on forecasts rather than past conduct. In other words, if a company meets the financial and user criteria and could infringe, it is obligated to comply, rather than only facing obligations after it has already infringed.

The key purpose of the DMA is to ensure “contestable and fair” digital markets by imposing strict obligations on these gatekeeper entities. Gatekeeper status is presumed where a company provides a “core platform service” in at least three EU Member States and meets a number of quantitative criteria (including EU turnover of at least €7.5 billion or a market capitalization of at least €75 billion and 45 million monthly active end users and 10,000 yearly active business users in the EU).
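
Because the gatekeeper test above is purely quantitative, it can be expressed as a simple worked check. The thresholds below come straight from the criteria just described; the function itself is only an illustration.

    # Worked encoding of the DMA's quantitative gatekeeper presumption.
    def is_presumed_gatekeeper(member_states: int,
                               eu_turnover_eur: float,
                               market_cap_eur: float,
                               monthly_end_users: int,
                               yearly_business_users: int) -> bool:
        financial = eu_turnover_eur >= 7.5e9 or market_cap_eur >= 75e9
        reach = monthly_end_users >= 45_000_000 and yearly_business_users >= 10_000
        return member_states >= 3 and financial and reach

    # A platform active in 5 Member States with EUR 8bn EU turnover,
    # 50m monthly end users and 12,000 business users meets the presumption.
    assert is_presumed_gatekeeper(5, 8e9, 0, 50_000_000, 12_000)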

The obligations and prohibitions on gatekeepers include how they:

  • process and use personal data
  • determine the ranking of their own and third parties’ offerings
  • negotiate certain conditions with business users
  • impose certain restrictions on end users in terms of access and use requirements 
  • interoperate and share information
  • conduct mergers

More information on the Digital Markets Act is available here.

In contrast, the Digital Services Act impacts a wider group of online services and platforms, from small to very large, with regulations that protect European citizens from harmful content, products and practices online. More information on the DSA is provided throughout this FAQ.

When was the Digital Services Act Ratified?

First proposed in December 2020, the Digital Services Act received political agreement from the European Parliament and the Council on 23 April 2022.

Has the Digital Services Act been adopted?

Yes, the DSA and the DMA (Digital Markets Act) were both adopted on 5 July 2022. However, this does not mean the DSA is enforceable yet: it will apply from 1 January 2024, or fifteen months after its entry into force, whichever is later. It is also important to note that platforms placed under the category of ‘very large online platforms’ will be obligated to comply four months after they have been designated as such.

When will the DSA start applying?

The DSA will become enforceable across the EU from 1 January 2024, or fifteen months after its entry into force, whichever is later. However, there is a distinction for companies designated as VLOPs (Very Large Online Platforms) because they have 45 million or more users in the EU: these will be subject to the terms of the DSA four months after their designation. The designation process is due to be completed in the autumn of 2022.

Will these rules of the DSA apply to companies outside of the EU?

The DSA applies to all companies that sell or provide products or services to European citizens, no matter where the companies are headquartered. However, it does not apply to the same products or services offered outside the EU territories.

The practical implications for companies are that they must therefore either run their services and operations to be compliant with the Digital Services Act separately to their business in other territories or align their global businesses to be DSA compliant, no matter where they sell/provide their services/products.

How will citizens benefit from the DSA’s new rules?

The top-level objective of the Digital Services Act is to make online purchases and the use of online services as safe as their offline equivalents. In other words, to provide EU citizens and companies with the same levels of protections when transacting online as they have offline.

This means better protection from counterfeits and dangerous products when buying online as well as protection from hate speech and misinformation. Brands will also enjoy better protections through greater abilities to act against trademark and copyright infringements.

What measures does the DSA legislation take to counter illegal content?

The measures brought into force by the Digital Services Act are both specific and far reaching. Among other things, the act:

  • makes it easier for users to flag illegal content online, and for platforms to cooperate with specialised ‘trusted flaggers’ to identify and remove it (a sketch of such a notice follows this list);
  • makes it easier to trace sellers on online marketplaces, helping to build trust and go after scammers;
  • obliges online marketplaces to randomly check that products or services on their sites are compliant;
  • allows users to challenge platforms’ content moderation decisions when their content is removed or otherwise restricted;
  • enforces higher levels of transparency through wide-ranging measures, including better information on terms and conditions, as well as on the algorithms used for recommending content or products to users;
  • creates new obligations for the protection of minors on any platform in the EU;
  • protects EU citizens by obliging very large online platforms (VLOPs) and search engines to prevent abuse of their systems through risk-based action (including against potential future risks where highlighted), with oversight through independent audits of their risk management measures. Platforms must mitigate risks such as disinformation or election manipulation, cyber violence against women, and harms to minors online. These measures are carefully balanced against restrictions of freedom of expression and are subject to independent audits;
  • introduces new crisis response mechanisms in cases of serious threats to public health and security, such as a pandemic or war;
  • bans targeted advertising that profiles children or that is based on special categories of personal data such as ethnicity, political views or sexual orientation;
  • enhances transparency for all advertising on online platforms and for influencers’ commercial communications, including clear labelling of paid-for, promoted or sponsored content;
  • bans the use of so-called ‘dark patterns’ on the interfaces of online platforms, meaning misleading tricks that manipulate users into choices they do not intend to make;
  • introduces new provisions allowing researchers access to data from key platforms in order to scrutinise how those platforms work and how online risks evolve;
  • gives users new rights, including the right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, and seek compensation for breaches of the rules. Representative organisations will also be able to defend user rights in cases of large-scale breaches of the law;
  • assigns the Commission as the primary regulator for very large online platforms (platforms exceeding 45 million users), while other platforms fall under the supervision of the Member States where they are established. The Commission will have enforcement powers similar to those it has under antitrust proceedings, and an EU-wide cooperation mechanism will be established between national regulators and the Commission;
  • brings intermediary companies into these regulations, i.e. companies that may not engage directly with EU citizens, but whose services may indirectly impact them (hosting services, cloud storage services, domain registrars, email service providers, etc.).
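
As referenced in the list above, here is a rough sketch of what a user’s illegal-content notice might carry under a notice-and-action mechanism. The DSA describes the substance of such notices (location, explanation, contact details, a good-faith statement) rather than a concrete schema, so all field names below are illustrative.

    # Illustrative notice-and-action submission and a trivial triage rule.
    from dataclasses import dataclass

    @dataclass
    class IllegalContentNotice:
        content_url: str        # exact electronic location of the item
        explanation: str        # why the notifier considers it illegal
        notifier_name: str      # contact details of the person flagging
        notifier_email: str
        good_faith: bool        # confirmation the notice is accurate and complete
        trusted_flagger: bool = False  # trusted flaggers' notices get priority

    def triage(notice: IllegalContentNotice) -> str:
        """Route trusted-flagger notices ahead of ordinary user reports."""
        return "priority_queue" if notice.trusted_flagger else "standard_queue"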

How will the DSA protect people from unsafe or counterfeit goods?

The Digital Services Act places strict requirements on eCommerce sites and marketplaces to stop listings of counterfeit products from appearing on their platforms. They must also show that they can audit, and do regularly audit, the sellers and providers of the products they sell. Large platforms must also allow independent auditors to carry out random reviews of the above.

Additionally, they must provide transparency and traceability of all sellers, providing proof of their legitimacy. Finally, they must allow the creation of ‘Trusted Flaggers’ who can provide feedback when suspected counterfeit products are found on the platform.

What digital services does the act cover?

The DSA touches virtually every type of online platform. But these can be categorised as companies/platforms that operate within the European Union territories and:

  • run any form of online store
  • are a provider of eCommerce platforms
  • are a social media platform
  • run any form of platform that allows the sharing of content
  • offer network infrastructure and domain name services
  • provide cloud and web hosting services


The DSA has a particular focus on VLOPs (Very Large Online Platforms) and VLSEs (Very Large Search Engines), but small and medium-sized platforms still have significant obligations.

The important aspect of this new legislation is the obligations imposed on ‘Intermediary’ platforms and services, such as marketplaces, web hosting providers, email service providers, etc. In other words, any company that allows goods to be listed, content to be hosted and information to be shared, but does not directly engage with consumers, will now have obligations to protect EU Citizens and can be penalised for not doing so.

How can harmful but not illegal content be effectively addressed by the Digital Services Act?

The DSA has been designed to protect freedom of expression with the caveat that any content must not be harmful to or abuse citizens based on race, gender, or age (especially the protection of minors), and must not adversely interfere with political processes, such as elections.

The monitoring and removal of this type of content, colloquially known as ‘awful but lawful’, is ensured by obliging platforms (especially very large online platforms) and search engines to assess potential risks in their systems and algorithms and take appropriate actions, as necessary, to stop it. Importantly, these assessments and actions are not only retroactive but also proactive, covering potential future risks where identified. Platforms must also implement new crisis response mechanisms in cases of serious threats to public health and security, such as a pandemic or war.

Importantly, they must be transparent in their actions, providing proof, where required, to the Commission, and other relevant parties (including individual EU citizens) upon request.

How does the Digital Services Act tackle disinformation?

Disinformation is a scourge on our society. It is distinct from misinformation, but the two are tied. Unfriendly regimes and organisations knowingly spread disinformation with the intent to destabilise the populace and foment anger and mistrust. Individuals then regurgitate and spread that fake content, which at that point becomes misinformation. But there is a very fine line between disinformation/misinformation and freedom of speech.

As early as 2018, the EU created the ‘Action Plan Against Disinformation’. This plan was devised specifically to identify and block this type of harmful content under the following definition:

“Disinformation is understood as verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and may cause public harm.”

But it was only a ‘plan’ that required cooperation by the key platforms and companies that are at the core of this spread of false content, such as search engines and social media sites.

The plan showed positive results, but the EU determined that the threat to democracy needed additional support in three areas, to ensure that:

  • elections are free and fair
  • media freedom and pluralism are strengthened
  • disinformation can be effectively countered


And so the European Democracy Action Plan was created in 2020, which established a framework for collecting evidence specifically about foreign disinformation campaigns and blocking it. One of the most notable outcomes of this plan was the banning across Europe of media sources such as Russia Today and Sputnik.

However, both these plans relied on self-regulation that followed the EU’s framework and guidelines. Platforms were free to interpret the guidelines as they saw fit, which left gaps, such as the under-moderation of non-English content; this has allowed Russian disinformation to spread in countries like Italy and Eastern bloc nations like Poland. As such, more stringent controls were needed to ensure that positive actions could be enforced as required. Hence the DSA legislation includes text and clauses that cover this.

The DSA will compel platforms to take more widespread actions, such as transparency around how they detect and verify disinformation and the subsequent actions they take when detected, which can include warning labels, confirmation steps when a user attempts to share said information, as well as outright blocking of the content and users who persistently disseminate it. It also forces these platforms to stop the advertising of false information through their platforms.

How does the Digital Services Act regulate online advertising?

When it comes to advertising regulation, the DSA builds on the existing GDPR regulations, which already tightly control aspects of consent and transparency in advertising.

The DSA focuses on content and is designed to impose legally binding controls on digital platforms related to illegal content, transparency, and disinformation in advertising. Among these requirements, the DSA imposes new content moderation obligations on digital platforms in the areas of illegal content and algorithmic curation. It also specifically compels them to provide clarity to users around advertising targeting.

These requirements become even more demanding for digital platforms which it terms ‘Very Large Online Platforms’, or VLOPs, and ‘Very Large Search Engines’, or VLSEs, where potential ‘systemic risk’ must be identified and managed.

In essence, platforms that permit the distribution of adverts, both on and off their platforms, must be able to demonstrate transparency on user targeting as well as adequate and effective content controls across every member state to block ads that contain harmful and subversive content.
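
In code terms, the transparency side of that obligation amounts to keeping and surfacing a per-ad record: the viewer must be able to see that something is an ad, who is behind it, and the main parameters used to target them. The structure below is a sketch under those assumptions, not a prescribed format.

    # Illustrative per-ad transparency record with a user-facing label.
    from dataclasses import dataclass, field

    @dataclass
    class AdTransparencyRecord:
        ad_id: str
        advertiser: str                  # on whose behalf the ad is shown
        payer: str                       # who paid for the ad
        targeting_parameters: dict = field(default_factory=dict)

        def label(self) -> str:
            """Human-readable disclosure shown alongside the ad."""
            return f"Sponsored - paid for by {self.payer} on behalf of {self.advertiser}"

    record = AdTransparencyRecord("ad-123", "ExampleBrand", "ExampleBrand Ltd",
                                  {"age_range": "25-40", "interests": ["running"]})
    print(record.label())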


What impact will the Digital Services Act have on businesses?

The DSA boils down to four key areas of impact:

1) Transparency
Being transparent about processes and systems, including how algorithms determine what is published or allowed to be listed and advertised.

2) Openness & Reporting
Making the business available to oversight by government and other organisations, as well as to EU citizens, in terms of why something was published or blocked.

3) Content Moderation
The ability to detect and take action on harmful content, counterfeit products and disinformation.

4) Advertising Controls
Platforms that incorporate any form of advertising element must not allow the targeting of ads at children and must prohibit targeting based on particular characteristics of users.
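
A pre-serve targeting check captures the advertising rule in area 4. This is a minimal sketch assuming the platform knows the user’s age and the keys used to target the ad; the category names are illustrative.

    # Reject targeted ads for minors and any targeting on special categories.
    SPECIAL_CATEGORIES = {"ethnicity", "political_views", "sexual_orientation",
                          "religion", "health"}

    def targeting_allowed(user_age: int, targeting_keys: set) -> bool:
        if user_age < 18:                 # no targeted ads for children
            return False
        return SPECIAL_CATEGORIES.isdisjoint(targeting_keys)

    print(targeting_allowed(30, {"age_range", "interests"}))   # True
    print(targeting_allowed(30, {"political_views"}))          # False
    print(targeting_allowed(15, {"interests"}))                # False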

The specifics of these liabilities are highlighted in this Digital Services Act factsheet.

Which platforms will be caught by the Digital Services Act?

The DSA affects a wider group of businesses than ever before, going beyond the obvious social media sites, eCommerce sites and marketplaces to include intermediary platforms and services. This includes web hosting providers, email service providers, and any other company that allows goods to be listed, content to be hosted and information to be shared, but does not directly engage with consumers.

A breakdown of the company types and the specifics of their liabilities is provided in this Digital Services Act factsheet.

How will the proposed Digital Services Act impact so-called ‘Intermediary’ companies?

The DSA was specifically designed to revamp the principle of limited liability for online intermediaries originally introduced in the e-Commerce Directive, adopted in 2000. The key innovation is the introduction of a new chapter that stipulates standards for the transparency and accountability of all providers of “intermediary services” regarding illegal and harmful content.

This new Regulation is important for the broad category of providers of “intermediary services”, which are subject to new obligations and heightened scrutiny by three key stakeholder groups: new national authorities created to monitor and uphold DSA compliance; rights holders (e.g. holders of intellectual property or image rights); and ultimately users, who will rely on these new mechanisms to protect their rights.

As such, any company that provides intermediary services that has the potential to negatively impact EU Citizens down the line, even if the intermediary does not directly engage with consumers, will need to comply with the obligations of the DSA. This will include obligations of auditing and reporting, transparency (for things like traceability of their direct users/subscribers/sellers) and content moderation (to detect and block negative/harmful content at source).

An obvious example of an intermediary is a marketplace, which will now have responsibilities to detect and block counterfeit products and to fully trace sellers who infringe. Less obvious are providers of hosting services, bulk email platforms and survey systems. These types of solutions are often used for the hosting and dissemination of cyber threats (particularly phishing attacks), and their providers can now more easily be held accountable where bad actors use their platforms for this purpose.

How will the proposed Digital Services Act differentiate between small and big players?

The DSA defines two tiers of online players. The first covers small and medium-sized businesses; the second covers VLOPs (Very Large Online Platforms) and VLSEs (Very Large Search Engines). Any platform with 45 million or more monthly users is deemed a VLOP/VLSE. These very large platforms face increased requirements in order to comply with the DSA.

The specifics of their liabilities are highlighted in this Digital Services Act factsheet.

What penalties will businesses face if they do not comply with the new rules?

Non-compliant platforms and services can be penalised with fines of up to 6% of their global annual turnover.
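To put that in concrete terms: a platform with a global annual turnover of €2 billion would face a maximum fine of €120 million (6% of €2 billion).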

What Are Digital Services Companies Required To Do?

The Digital Services Act has broad and varied requirements and obligations based on the type and size of platform or service:

Obligations for intermediary services (such as ISPs and domain registrars) include:

  • Transparency reporting
  • Requirements that terms of service take due account of fundamental rights
  • Cooperation with national authorities
  • Points of contact and, where necessary, a legal representative


Obligations for hosting services include:

  • All the above
  • Notice-and-action mechanisms and an obligation to provide information to users
  • Reporting criminal offences to authorities


Obligations for online platforms include:

  • All the above
  • Complaint and redress mechanisms and out-of-court dispute settlement
  • Trusted flaggers
  • Measures against abusive notices and counter-notices
  • Transparency of recommender systems
  • User-facing transparency of online advertising
  • A ban on targeting ads at children and on targeting based on particular characteristics of users


Obligations for marketplaces include:

  • All the above
  • Vetting the credentials of third-party suppliers
  • Compliance by design
  • Random checks by EU agents


Obligations for Very Large Online Platforms and Very Large Search Engines include:

  • All the above (where relevant)
  • Risk management obligations and crisis response
  • External & independent auditing
  • Internal compliance function
  • Public accountability
  • Enable users’ choice not to have recommendations based on profiling (see the sketch after this list)
  • Data sharing with authorities and researchers
  • Codes of conduct
  • Crisis response cooperation
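
As flagged in the list above, the recommender obligation means a very large platform must offer at least one feed that is not based on profiling. Below is a minimal sketch of that choice, with a toy relevance score; all field and function names are illustrative.

    # Two feed options: profiled ranking, or a non-profiled chronological feed.
    def score(item, profile):
        """Toy relevance: overlap between item topics and profile interests."""
        return len(set(item["topics"]) & set(profile["interests"]))

    def recommend(items, profile=None):
        if profile is None:  # the non-profiled option users may choose
            return sorted(items, key=lambda i: i["posted_at"], reverse=True)
        return sorted(items, key=lambda i: score(i, profile), reverse=True)

    posts = [{"id": 1, "topics": ["sport"], "posted_at": 100},
             {"id": 2, "topics": ["politics"], "posted_at": 200}]
    print(recommend(posts))                             # newest first
    print(recommend(posts, {"interests": ["sport"]}))   # relevance first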

What technologies and systems should I be investigating to help comply with the Digital Services Act?

Providing a comprehensive list of systems and technologies is beyond the scope of this FAQ. However, it is fair to say that much of the technical work will involve updating and enhancing internal systems to meet the requirements around transparency and openness.

One area that requires very specialised knowledge and technology is content moderation, which will require machine learning systems specialising in contextual text analysis, together with computer vision systems that can help moderate visual content.
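
As a sketch of how those two systems can work together, the function below fuses scores from a text classifier and a vision classifier into one decision, escalating borderline cases to human review. The classifiers are stand-ins for trained models or vendor APIs, and the thresholds are illustrative.

    # Fuse text and image moderation scores into a single decision.
    def moderate_post(text_scores: dict, image_scores: dict,
                      block_threshold: float = 0.9,
                      review_threshold: float = 0.6) -> str:
        worst = max([*text_scores.values(), *image_scores.values()], default=0.0)
        if worst >= block_threshold:
            return "block"          # confident detection: remove automatically
        if worst >= review_threshold:
            return "human_review"   # uncertain: escalate to a moderator
        return "allow"

    # Clean text (0.1) but an image scoring 0.95 for weapons -> blocked.
    print(moderate_post({"hate_speech": 0.1}, {"weapons": 0.95}))  # block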

Useful Resources

European Digital Services Act

TLDR: The European Digital Services Act has been ratified into law by the European Union and will have wide-ranging implications for companies.

Visual Content Moderation

Doing the work that humans just can’t do, and at machine speed

Harmonized rules across the single market will make it easier to provide digital services across borders and ensure the same level of protection to all.

The Digital Services Act and Digital Markets Act aim to create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses.

Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC

Trusted by the world's leading platforms, marketplaces and agencies

Integrate Visual-AI Into Your Platform

Seamlessly integrating our API is quick and easy, and if you have questions, there are real people here to help. So start today; complete the contact form and our team will get straight back to you.
