Content moderation will be a key operation in ensuring your business is compliant. Simply hiring more human moderators is not the answer!
As visual media often poses the greatest challenge in content moderation, computer vision will play an important role as the DSA comes into effect. The volume of visual media far exceeds human capacity to moderate it. Moreover, the content is often simply too disturbing for most people to handle.
Computer Vision works at machine speed so it can process hundreds of millions of images and videos per day. It has no emotional reaction to the content and it never gets tired. That’s why computer vision is the answer to the increased challenge the DSA may bring to the doors of thousands of companies.
Here are a few examples of the many types of visual moderation possible with VISUA’s Computer Vision:
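Whatever the category, detection typically slots into a publishing pipeline via a simple API call. As a purely illustrative sketch (the endpoint, field names and response shape below are invented placeholders, not VISUA's actual API):

```python
# Illustrative only: the endpoint, credential and response fields are
# placeholders for a generic visual moderation service.
import requests

API_URL = "https://api.example.com/v1/moderate"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                         # placeholder credential

def moderate_image(path: str) -> dict:
    """Send one image for analysis and return per-category scores."""
    with open(path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
        )
    response.raise_for_status()
    # Assumed response shape: {"weapons": 0.02, "nudity": 0.01,
    #  "hate_symbols": 0.00, "counterfeit_logo": 0.97}
    return response.json()

if __name__ == "__main__":
    scores = moderate_image("upload.jpg")
    flagged = {k: v for k, v in scores.items() if v >= 0.8}
    print(flagged or "no action needed")
```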
Browse the content throughout this page to get a strong understanding of what will be expected of your business under the Digital Services Act, and how to utilize computer vision to your benefit.
At its simplest, the DSA is about protecting European users online.
The European Parliament has been on a mission for the last decade to tackle big issues facing its citizens. The first watershed legislation was the General Data Protection Regulation (GDPR), which imposed a framework and laws to protect the privacy of European citizens. Now they have followed up with the Digital Services Act (DSA), an equally momentous piece of legislation that is designed to protect users from illegal content and goods online.
The DSA establishes a framework of governance that platforms, marketplaces, eCommerce sites and websites in general must follow to gain and retain compliance. This framework covers a wide range of content, from marketplaces selling fake goods, to social platforms allowing the posting of hate speech and disinformation, and even domain registrars and hosting companies allowing the registration and creation of fake sites for phishing.
But it goes a step further, as the DSA will also ensure that online platforms share the specifics of how their systems and algorithms work with respect to protecting users. For instance, how do they detect and remove illegal goods and content quickly, and how do they detect and block misinformation and the users who spread it?
More information on the Digital Services Act is available here.
Ursula von der Leyen, European Commission President, said that the DSA “gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms.”
As such, the purpose of the DSA is to create a safer digital space for EU Citizens where the fundamental rights of users are protected and to establish a level playing field for businesses. In practical terms, it aims to stop the sale of harmful goods and block the publishing of harmful content by placing liabilities on platforms and service providers around transparency, openness & reporting, and content moderation.
The DSA touches virtually every type of online platform; however, these can be categorised as platforms or companies that operate within the European Union territories and:
The DSA has a particular focus on VLOPs (Very Large Online Platforms) and VLSEs (Very Large Search Engines), but small and medium-sized platforms still have significant obligations.
Additionally, the DSA affects a wider group of businesses than ever before, going beyond the obvious social media platforms, eCommerce sites, and marketplaces to include intermediary platforms and services. This includes web hosting providers, email service providers, and any other company that allows goods to be listed, content to be hosted and information to be shared, but does not directly engage with consumers.
A breakdown of the company types and the specifics of their liabilities is highlighted in this Digital Services Act factsheet.
Ursula von der Leyen proposed the concept of a “Digital Services Act” in her 2019 bid for the European Commission’s presidency. Ultimately, the original text of the DSA was prepared by the ‘Executive Vice President of the European Commission for A Europe Fit for the Digital Age’, Margrethe Vestager, and by the ‘European Commissioner for Internal Market’, Thierry Breton, as members of the Von der Leyen Commission.
The World Wide Web and social media platforms have undoubtedly been two of the most influential innovations in human history, enabling easier and faster communication, collaboration, information sharing and commerce. However, they have also allowed bad actors to exploit these platforms for financial, political and ideological gain.
In the last decade, the EU Government has ratified a number of pieces of legislation to try to curb this abuse of online platforms. Much of this was based on collaboration with platforms and self-regulation. Although partially successful, numerous gaps in efficacy remained. So, just as the General Data Protection Regulation forced platforms and services to comply with a strict framework to protect EU citizens’ privacy, the Digital Services Act will compel them to comply with transparency, openness & reporting, and content moderation requirements to protect their EU users and subscribers from harmful and hateful products and content.
The Digital Services Act (and its sister legislation, the Digital Markets Act) is a regulation, not a directive. This means it is a directly binding regulatory framework for the digital world that compels online platforms and service providers to take action to be compliant, with non-compliance leading to financial penalties. Previous directives were put in place that relied on cooperation and self-regulation. These were partially successful, but as the importance of platforms and other digital services has increased exponentially, the negative impacts that can shape economic, political, social and cultural life have also grown. The EU Government, therefore, felt that it was time to align the regulations between political and state authorities in order to assert some control over the ways in which platforms and other digital actors operate.
In short, the Digital Markets Act concerns itself with large online platforms, defined as ‘gatekeepers’ of the online world (think Google, Microsoft, Facebook, Amazon). Another distinguishing factor is its ex-ante approach, which bases its regulations on forecasts rather than past behaviour. In other words, if you meet the financial/user criteria and could infringe, you are obligated to comply, rather than becoming liable only after an infringement has occurred.
The key purpose of the DMA is to ensure “contestable and fair” digital markets by imposing strict obligations on these gatekeeper entities. Gatekeeper status is presumed where a company provides a “core platform service” in at least three EU Member States and meets a number of quantitative criteria: EU turnover of at least €7.5 billion or a market capitalisation of at least €75 billion, together with at least 45 million monthly active end users and 10,000 yearly active business users in the EU.
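As a rough illustration of how these thresholds combine (a simplified sketch, not legal advice; the real test also considers turnover across the last three financial years and allows rebuttal):

```python
def is_presumed_gatekeeper(eu_turnover_eur: float,
                           market_cap_eur: float,
                           monthly_end_users_eu: int,
                           yearly_business_users_eu: int,
                           member_states_served: int) -> bool:
    """Simplified sketch of the DMA's quantitative gatekeeper test."""
    financial = eu_turnover_eur >= 7.5e9 or market_cap_eur >= 75e9
    reach = (monthly_end_users_eu >= 45_000_000
             and yearly_business_users_eu >= 10_000)
    presence = member_states_served >= 3  # core platform service in 3+ states
    return financial and reach and presence

# Example: EUR 9bn EU turnover, 50m monthly end users,
# 12,000 business users, operating in 5 Member States.
print(is_presumed_gatekeeper(9e9, 0, 50_000_000, 12_000, 5))  # True
```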
The obligations and prohibitions on gatekeepers include how they:
More information on the Digital Markets Act is available here.
In contrast, the Digital Services Act impacts a wider group of online services and platforms, from small to very large, with regulations that protect European citizens from harmful content, products and practices online. More information on the DSA is provided throughout this FAQ.
First proposed in December 2020, the Digital Services Act was ratified by the European Parliament on the 23rd of April, 2022.
Yes, the DSA and the DMA (Digital Markets Act) were both adopted on the 5th of July 2022. However, this does not mean the DSA is enforceable yet. It will apply from January 1st, 2024, or fifteen months after entry into force, whichever is later. It is also important to note that platforms placed in the category of “very large online platforms” will be obligated to comply four months after they have been designated as such.
The DSA will become enforceable across the EU from 1 January 2024, or fifteen months after entry into force, whichever is later. However, there is a distinction for companies designated as VLOPs (Very Large Online Platforms) because they have 45 million users or more in the EU; these will be subject to the terms of the DSA four months after their designation. The designation criteria are due to be completed in the autumn of 2022.
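To make the “whichever is later” rule concrete, here is a small sketch of the timing logic (illustrative only; the entry-into-force and designation dates are left as inputs rather than assumed):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Return the same day-of-month `months` later, clamped to month end."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    for day in (d.day, 30, 29, 28):  # e.g. 31 Jan + 1 month -> 28/29 Feb
        try:
            return date(year, month, day)
        except ValueError:
            continue

def dsa_applicable_from(entry_into_force: date) -> date:
    """Later of 1 January 2024 and fifteen months after entry into force."""
    return max(date(2024, 1, 1), add_months(entry_into_force, 15))

def vlop_applicable_from(designation_date: date) -> date:
    """VLOPs/VLSEs must comply four months after designation."""
    return add_months(designation_date, 4)
```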
The DSA applies to all companies that sell, or provide, products or services to European citizens, no matter where they are headquartered. However, it does not apply to those same products or services outside the EU territories.
The practical implication for companies is that they must either run their services and operations to be compliant with the Digital Services Act separately from their business in other territories, or align their global businesses to be DSA compliant, no matter where they sell/provide their services/products.
The top-level objective of the Digital Services Act is to make online purchases and the use of online services as safe as their offline equivalents. In other words, to provide EU citizens and companies with the same levels of protections when transacting online as they have offline.
This means better protection from counterfeits and dangerous products when buying online as well as protection from hate speech and misinformation. Brands will also enjoy better protections through greater abilities to act against trademark and copyright infringements.
The measures brought into force by the Digital Services Act are both specific and far-reaching, and include provisions that make it easier to/for:
The Digital Services Act places strict requirements on eCommerce sites and marketplaces to stop listings of counterfeit products from being placed on their platforms. They must also show that they are able to audit, and do regularly audit, the sellers and providers of the products they sell. Large platforms must also allow independent auditors to carry out random reviews of the above.
Additionally, they must provide transparency and traceability of all sellers, providing proof of their legitimacy. Finally, they must allow the creation of ‘Trusted Flaggers’ who can provide feedback when suspected counterfeit products are found on the platform.
The DSA touches virtually every type of online platform. But these can be categorised as companies/platforms that:
The DSA has a particular focus on VLOPs (Very Large Online Platforms) and VLSEs (Very Large Search Engines), but small and medium-sized platforms still have significant obligations.
The important aspect of this new legislation is the obligations imposed on ‘Intermediary’ platforms and services, such as marketplaces, web hosting providers, email service providers, etc. In other words, any company that allows goods to be listed, content to be hosted and information to be shared, but does not directly engage with consumers, will now have obligations to protect EU Citizens and can be penalised for not doing so.
The DSA has been designed to protect freedom of expression, with the caveat that content must not harm or abuse citizens based on race, gender, or age (with particular protection for minors), and must not adversely interfere with political processes, such as elections.
The monitoring and removal of this type of content, colloquially known as ‘awful but lawful’, is ensured by obliging platforms (especially very large online platforms) and search engines to assess potential risks in their systems and algorithms and take appropriate actions, as necessary, to stop it. Importantly, these assessments and actions are not only retroactive but also proactive, in order to include potential future risks where identified. Platforms must also implement new crisis response mechanisms in cases of serious threats to public health and security, such as a pandemic or war.
Importantly, they must be transparent in their actions, providing proof, where required, to the Commission, and other relevant parties (including individual EU citizens) upon request.
Disinformation is a scourge on our society. It is distinctly different to misinformation, but the two are tied. Unfriendly regimes and organisations knowingly spread disinformation with the intent to destabilise the populace and foment anger and mistrust. Individuals then regurgitate and spread that fake content, which then becomes misinformation. But there is a very fine line between disinformation/misinformation and freedom of speech.
As early as 2018, the EU Government created the ‘Action Plan Against Disinformation’. This plan was devised specifically to identify and block this type of harmful content under the definition of:
“Disinformation is understood as verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and may cause public harm.”
But it was only a ‘plan’ that required cooperation by the key platforms and companies that are at the core of this spread of false content, such as search engines and social media sites.
The plan showed positive results, but the EU Government determined that the threat to democracy needed additional support in three areas to ensure:
And so the European Democracy Action Plan was created in 2020, establishing a framework for collecting evidence specifically about foreign disinformation campaigns and blocking them. One of the most notable outcomes of this plan was the Europe-wide banning of media sources such as Russia Today and Sputnik.
However, both of these plans relied on self-regulation within the EU’s framework and guidelines. Platforms were free to interpret the guidelines as they saw fit, which left gaps, such as the under-moderation of non-English content, which allowed Russian disinformation to spread in countries like Italy and Eastern bloc nations like Poland. More stringent controls were therefore needed to ensure that positive actions could be enforced as required; hence the DSA includes text and clauses that cover this.
The DSA will compel platforms to take more widespread action, including transparency around how they detect and verify disinformation and the subsequent actions they take when it is detected, which can include warning labels, confirmation steps when a user attempts to share such information, and outright blocking of the content and of users who persistently disseminate it. It also forces platforms to stop the advertising of false information through their services.
When it comes to advertising regulation, the DSA builds on the existing GDPR regulations, which already tightly control aspects of consent and transparency in advertising.
The DSA focuses on content and is designed to impose legally binding controls on digital platforms related to illegal content, transparency, and disinformation in advertising. Among these requirements, the DSA imposes new content moderation obligations on digital platforms in the areas of illegal content and algorithmic curation. It also specifically compels them to provide clarity to users around advertising targeting.
These requirements become even more demanding for digital platforms which it terms ‘Very Large Online Platforms’, or VLOPs, and ‘Very Large Search Engines’, or VLSEs, where potential ‘systemic risk’ must be identified and managed.
In essence, platforms that permit the distribution of adverts, both on and off their platforms, must be able to demonstrate transparency on user targeting as well as adequate and effective content controls across every member state to block ads that contain harmful and subversive content.
The DSA boils down to four key areas of impact:
1) Transparency
Being transparent about their processes and systems, including how algorithms determine what is published or allowed to be listed and advertised.
2) Openness & Reporting
Making themselves available to oversight by government and other organisations, as well as to EU citizens, in terms of why something was published or blocked.
3) Content Moderation
Detecting and taking action on harmful content, counterfeit products and disinformation.
4) Advertising Controls
Platforms that incorporate any form of advertising must not allow the targeting of ads at children and must prohibit targeting based on particular characteristics of users.
The specifics of their liabilities are highlighted in this Digital Services Act factsheet.
The DSA affects a wider group of businesses than ever before, going beyond the obvious social media sites, eCommerce sites and marketplaces to include intermediary platforms and services. This includes web hosting providers, email service providers, and any other company that allows goods to be listed, content to be hosted and information to be shared, but does not directly engage with consumers.
A breakdown of the company types and the specifics of their liabilities is highlighted in this Digital Services Act factsheet.
The DSA was specifically designed to revamp the principle of limitations of liability for online intermediaries originally introduced in the e-Commerce Directive, ratified in 2000. The key innovation is the introduction of a new chapter that stipulates standards for transparency and the accountability of all providers of “intermediary services” regarding illegal and harmful content.
This new regulation is important for the broad category of providers of “intermediary services”, which are subject to new obligations and heightened scrutiny by three key stakeholder groups: the new national authorities created to monitor and uphold DSA compliance; rights holders (e.g., holders of intellectual property rights or image rights); and, ultimately, users, who will rely on these new mechanisms to protect their rights.
As such, any company that provides intermediary services that has the potential to negatively impact EU Citizens down the line, even if the intermediary does not directly engage with consumers, will need to comply with the obligations of the DSA. This will include obligations of auditing and reporting, transparency (for things like traceability of their direct users/subscribers/sellers) and content moderation (to detect and block negative/harmful content at source).
A marketplace is an obvious intermediary: it will now have responsibilities to detect and block counterfeit products and to fully trace sellers who infringe. Less obvious are providers of hosting services, bulk email platforms and survey systems. These types of solutions are often used for the hosting and dissemination of cyber threats (particularly phishing attacks), and their providers can now be more easily held accountable where bad actors use their platforms for this purpose.
The DSA defines two tiers of online players. The first covers small and medium-sized businesses; the second covers VLOPs (Very Large Online Platforms) and VLSEs (Very Large Search Engines). Any platform with 45 million or more monthly users in the EU is designated a VLOP/VLSE. These very large platforms have increased requirements in order to comply with the DSA.
The specifics of their liabilities are highlighted in this Digital Services Act factsheet.
Non-compliant platforms and services can be penalised with fines of up to 6% of their annual worldwide turnover.
The Digital Services Act has broad and varied requirements and obligations based on the type and size of platform or service:
Intermediary services (such as ISPs and domain registrars) include:
Hosting services include:
Online platforms include:
Marketplaces include:
Very Large Online Platforms and Very Large Search Engines include:
Providing a comprehensive list of systems and technologies is outside the scope of this FAQ. However, it is fair to say that much of the technical work will centre on updating and enhancing internal systems to meet the requirements around transparency and openness.
One area that requires very specialised knowledge and technology is content moderation, which will call for machine learning systems that specialise in contextual text analysis, alongside computer vision systems that can help moderate visual content.
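As a minimal sketch of what such a system’s decision layer might look like (the categories, thresholds and toy text classifier below are invented for illustration, not any specific vendor’s method):

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    # Per-category scores (0.0-1.0) from a computer vision model.
    image_scores: dict = field(default_factory=dict)

def classify_text(text: str) -> dict:
    """Stand-in for a contextual NLP classifier (toy keyword check)."""
    suspicious = ("replica", "counterfeit")
    hit = any(word in text.lower() for word in suspicious)
    return {"counterfeit_language": 1.0 if hit else 0.0}

def moderate(post: Post, block_at: float = 0.8, review_at: float = 0.4) -> str:
    """Merge text and vision signals into a single moderation action."""
    scores = {**classify_text(post.text), **post.image_scores}
    worst = max(scores.values(), default=0.0)
    if worst >= block_at:
        return "block"         # clear violation: remove and log for audit
    if worst >= review_at:
        return "human_review"  # borderline: escalate to a human moderator
    return "allow"

post = Post("Genuine replica watches, best prices!",
            {"brand_logo_misuse": 0.91})
print(moderate(post))  # -> "block"
```

In practice the thresholds, categories and escalation paths would be tuned per platform and per jurisdiction; the point is that machine decisions, human review and audit logging all have a place in a DSA-compliant pipeline.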
TLDR: The European Digital Services Act has been ratified into law by the European Union and will have wide-ranging implications for companies.
Doing the work that humans just can’t do, and at machine speed
Harmonized rules across the single market will make it easier to provide digital services across borders and ensure the same level of protection to all.
The Digital Services Act and Digital Markets Act aim to create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses.
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC
Seamlessly integrating our API is quick and easy, and if you have questions, there are real people here to help. So start today; complete the contact form and our team will get straight back to you.