
The European Digital Services Act – How it Affects You

Reading Time: 6 minutes

TL;DR: The European Digital Services Act has been agreed by the European Union and will have wide-ranging implications for companies across the digital sector. From social media and ecommerce platforms to marketplaces, and even hosting, cloud and domain name providers, the requirements are extensive and the penalty for non-compliance is up to 6% of annual turnover. The EU’s aim is to stop the flow of harmful content and misinformation and the sale of counterfeit goods. Although the legislation is European, other countries are likely to follow suit, so affected companies need to start implementing the numerous requirements as soon as possible, and certainly before the deadline of January 2024.

How The European Digital Services Act Affects You

Over the years, the European Parliament has been criticized for some of the legislation it has passed, such as rules on the curvature of bananas and cucumbers (if they are too curvy, they cannot be sold). No need to check the date: even if you happen to be reading this on any given April 1st, this is 100% true. However, the EU has also been a trailblazer on real issues facing the people of Europe, such as the elimination of roaming charges for calls and data across European Union countries, and who can forget the lauded and dreaded (depending on which side of the table you sit) GDPR.

But in December 2020, a new bill was proposed that would have seismic implications for ecommerce, content platforms/companies and cloud/hosting/infrastructure providers, and on 23 April 2022 political agreement was reached on the proposal, clearing its path into law. The new DSA (Digital Services Act) legal framework sets out an ‘unprecedented new standard for the accountability of online platforms regarding illegal and harmful content’. This covers both goods sold online and the hosting of content on social media and other content-sharing platforms. And a major element of this legislation is that the bigger you are, the harder you fall!

Be sure to also read: Visual Content Moderation Use Case Overview

Why Should I Care About The European Digital Services Act?

In simple terms, if you:

  1. Operate within the European Union territories and you:
  • run any form of online store
  • are a provider of ecommerce platforms
  • are a social media platform
  • run any form of platform that allows the sharing of content
  • offer network infrastructure and domain name services
  • provide cloud and web hosting services

…you will be held accountable for the authenticity of the goods you sell on your site or allow to be sold through your platform/systems, and for the nature and authenticity of the content you allow to be published on your platforms (e.g. harmful/hateful content and misinformation).

  2. Are an individual or brand owner concerned about the authenticity of products sold online, or about content that affects you, this new, firm and binding European law gives you an unprecedented ability to request information directly from the provider/platform and to take action against them.
  3. Are an individual or organization that feels your content is being blocked or taken down incorrectly, you now have a proper framework to request a full and transparent explanation of why, and even how, the determination was made, if it was algorithmic.

In short, this is a big deal that shifts the power away from service providers, platforms and marketplaces and into the hands of the people. And it comes with severe penalties for non-compliance, with fines of up to 6 percent of annual turnover. So, the bigger you are, the more you have to lose – literally!


How Does This Affect Me As A Business? 

In practical terms, businesses that fall under the reach of this new framework MUST take action to implement:

  • Measures to counter illegal goods, services or content online, such as:
    • a mechanism for users to easily flag such content and for platforms to cooperate with so-called ‘trusted flaggers’ (a minimal sketch of such a flow follows this list);
    • new obligations on traceability of business users in online marketplaces;
  • New measures to empower users and civil society, including:
    • the possibility to challenge platforms’ content moderation decisions and seek redress, either via an out-of-court dispute mechanism or judicial redress;
    • provision of access for vetted researchers to the key data of the largest platforms, and access for NGOs to public data, to provide more insight into how online risks evolve;
    • transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users;
  • Measures to assess and mitigate risks, such as:
    • obligations for very large platforms and very large online search engines to take risk-based action to prevent the misuse of their systems and undergo independent audits of their risk management systems;
    • mechanisms to adapt swiftly and efficiently in reaction to crises affecting public security or public health;
    • new safeguards for the protection of minors and limits on the use of sensitive personal data for targeted advertising.
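
To make the first of these requirements a little more concrete, here is a minimal sketch of a notice-and-action flow with trusted-flagger prioritization and recorded statements of reasons. Everything in it – the class names, fields and decision labels – is a hypothetical illustration for this article, not the DSA’s technical specification or any particular platform’s implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import heapq

# Hypothetical, simplified model of a DSA-style "notice and action" flow:
# any user can flag content, notices from vetted "trusted flaggers" are
# prioritised, and every decision is stored with a statement of reasons
# so it can later be explained or challenged.

@dataclass(order=True)
class Flag:
    priority: int                              # 0 = trusted flagger, 1 = ordinary user
    submitted_at: str = field(compare=False)
    content_id: str = field(compare=False)
    reporter_id: str = field(compare=False)
    reason: str = field(compare=False)

class ModerationQueue:
    def __init__(self, trusted_flaggers: set[str]):
        self.trusted_flaggers = trusted_flaggers
        self._queue: list[Flag] = []
        self.decisions: list[dict] = []        # audit trail for transparency reporting

    def submit_flag(self, content_id: str, reporter_id: str, reason: str) -> None:
        priority = 0 if reporter_id in self.trusted_flaggers else 1
        flag = Flag(priority, datetime.now(timezone.utc).isoformat(),
                    content_id, reporter_id, reason)
        heapq.heappush(self._queue, flag)

    def review_next(self, decision: str, statement_of_reasons: str) -> dict:
        flag = heapq.heappop(self._queue)      # trusted-flagger notices surface first
        record = {
            "content_id": flag.content_id,
            "reported_by": flag.reporter_id,
            "report_reason": flag.reason,
            "decision": decision,              # e.g. "removed", "kept", "restricted"
            "statement_of_reasons": statement_of_reasons,
        }
        self.decisions.append(record)
        return record

queue = ModerationQueue(trusted_flaggers={"ngo-hotline-42"})
queue.submit_flag("listing-123", "user-7", "suspected counterfeit goods")
queue.submit_flag("post-456", "ngo-hotline-42", "illegal hate speech")
# The trusted-flagger notice is reviewed first, and the decision is recorded
# with a statement of reasons that the uploader can later challenge.
print(queue.review_next("removed", "Content violates our hate speech policy."))
```

In a real system the queue would be persisted, the decision records would feed transparency reporting and any redress process, and trusted-flagger status would be granted through a proper vetting process rather than a hard-coded set.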

Additionally, the European Digital Services Act provides for enhanced supervision and enforcement by the Commission when it comes to very large online platforms. The supervisory and enforcement framework also confirms the important role of the independent Digital Services Coordinators and the Board for Digital Services.

As well as the legal repercussions around compliance with this new framework, there will also be a shift in public opinion. The whole issue of privacy and the rights of European citizens shifted up a gear after the introduction of GDPR. The same will undoubtedly happen as awareness of the DSA grows and consumers and brands learn more about their rights and avenues of legal recourse when needed.

So This Only Affects Europe?

Technically speaking, yes. But in practical terms, are companies really going to run different systems that provide varying adherence to different regulations around the world, or will they adapt their systems to adhere to the most stringent one, currently the EU’s DSA?

It should also be noted that similar bills have been proposed in the US; America’s SHOP SAFE Act proposes very similar measures for the same cohort of businesses. But many countries, including the US, are now poring over the EU’s legislation to see what, and how, it could be adopted in their own territories.

In other words, the writing is on the wall and ultimately the rules and laws will align across all territories.

What Is The EU’s Goal/Vision?

Three quotes from EU Parliament leaders highlight the EU’s goal very clearly:

“It gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms.”

European Commission President Ursula von der Leyen

“With the DSA we help create a safe and accountable online environment. Platforms should be transparent about their content moderation decisions, prevent dangerous disinformation from going viral and avoid unsafe products being offered on marketplaces. With today’s agreement we ensure that platforms are held accountable for the risks their services can pose to society and citizens.”

Executive Vice-President for a Europe Fit for the Digital Age, Margrethe Vestager

And finally, the most ominous quote for impacted business leaders:

“With the DSA, the time of big online platforms behaving like they are “too big to care” is coming to an end.”

Commissioner for the Internal Market Thierry Breton

In other words, if a bricks-and-mortar store is found to be selling counterfeit goods – even if they come from another seller within the store – the store is held accountable. The same will now apply to online sellers and marketplaces. Equally, print publications are held accountable for what they publish, even if the content came from a third party. The same will now apply to social media and content-sharing platforms.

More importantly, all affected companies and platforms must take active steps to ensure compliance and provide complete transparency in their policies relating to sellers and content moderation, along with the ability to report on it all.

Visual Content Moderation And The European Digital Services Act

Moderation of content, whether in posts or product listings, is going to be crucial, and a key aspect of that is reviewing visual content to ensure that it too complies with the regulations. This is no easy feat, as detecting non-compliant content in images and videos is both a broad and complex challenge. But it should be very high on the priority list of key stakeholders in affected companies.

People are ingenious, so whether they are trying to spread hate or misinformation, or to sell fake products, they’ll use every trick in the book to avoid detection. In a pre-DSA world, where businesses and platforms were not so easily held accountable for the acts of their users, this was not such a big problem. But post-DSA, it could be very costly to ignore!

The answer, of course, is computer vision. Many companies already have trust and safety teams and moderation teams, but the challenges of moderating content with humans alone are many. The main issue is the proliferation of images and videos across content platforms and ecommerce alike. Text-based moderation is a mature technology and companies have learned to use it well, but these days posts almost always contain images and/or videos, often with very little text. Bad actors will also hide important brand and product information in images, sometimes burning text into the visual itself. As such, any successful content moderation and compliance system must incorporate strong and effective computer vision to block or flag infringing visual content, as illustrated in the sketch below.
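
As a simplified illustration of how such a pipeline might be wired together, the sketch below combines a visual classifier with OCR so that text burned into an image is checked against the same rules as ordinary text. The functions classify_image and extract_text_from_image, along with the labels, keywords and threshold, are hypothetical placeholders for this example, not a description of VISUA’s API or any specific product.

```python
from dataclasses import dataclass

# Placeholder stand-ins for a real computer vision service. In production these
# would call an image-classification model and an OCR engine; here they are
# hypothetical stubs so the pipeline logic can be shown end to end.
def classify_image(image_bytes: bytes) -> dict[str, float]:
    """Return hypothetical label -> confidence scores for the image."""
    return {"weapon": 0.02, "hate_symbol": 0.91, "nudity": 0.01}

def extract_text_from_image(image_bytes: bytes) -> str:
    """Return any text detected inside the image (e.g. burned-in captions)."""
    return "limited offer: genuine RolexX watches 90% off"

BLOCKED_LABELS = {"weapon", "hate_symbol", "nudity"}
SUSPECT_KEYWORDS = {"rolexx", "90% off", "replica"}   # illustrative only
VISUAL_THRESHOLD = 0.85

@dataclass
class ModerationResult:
    action: str            # "allow" or "flag_for_review"
    reasons: list[str]

def moderate_image(image_bytes: bytes) -> ModerationResult:
    reasons = []

    # 1. Check the visual content itself against prohibited categories.
    for label, score in classify_image(image_bytes).items():
        if label in BLOCKED_LABELS and score >= VISUAL_THRESHOLD:
            reasons.append(f"visual match: {label} ({score:.2f})")

    # 2. Run OCR so text hidden inside the image goes through text rules too.
    embedded_text = extract_text_from_image(image_bytes).lower()
    for keyword in SUSPECT_KEYWORDS:
        if keyword in embedded_text:
            reasons.append(f"embedded text match: '{keyword}'")

    if not reasons:
        return ModerationResult("allow", [])
    # Anything caught is routed to human review rather than silently removed.
    return ModerationResult("flag_for_review", reasons)

print(moderate_image(b"fake-image-bytes"))
```

In practice the stubs would be backed by production models or an external computer vision service, the keyword list would come from brand-protection and policy teams, and anything flagged would flow into the same review and statement-of-reasons process described earlier.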

Want To Talk?

Computer vision is complex and must deliver specific features and capabilities tuned to the task. It must be capable of delivering the highest accuracy at a large scale and you may need to implement it on-premise or even on-device. 

If you’d like to learn more about the various options for accessing our Visual-AI (computer vision) technology just fill in the form below. You can also check out this helpful video to see how visual content moderation can help in a wide range of use cases.

Book A Demo

