The days of self-regulation for big tech companies are coming to an end, at least for those operating in or providing services to users in the EU. On April 23, the European Commission announced that a political agreement had finally been reached on the Digital Services Act (DSA). Deemed “a world first in the field of digital regulation”, the DSA aims to create a safer digital space in which the fundamental rights of all users of digital services are protected, and to establish a level playing field that fosters innovation, growth, and competitiveness, both in the European Union and globally. While the agreement remains largely the same as the original draft proposed in 2020, the latest round of modifications focused on accountability standards for online platforms with regard to harmful or illegal content.
According to a statement issued by European Commission President Ursula von der Leyen, “The DSA will upgrade the ground-rules for all online services in the EU. It will ensure that the online environment remains a safe space, safeguarding freedom of expression and opportunities for digital businesses.”
Recent Changes to the Digital Services Act
Immediate Takedown. Content targeting victims of cyber violence (e.g., “revenge porn”) must be removed “immediately”; other content deemed illegal must be removed “swiftly.”
Online Marketplaces. Operators will be required to establish know-your-customer-style protocols for the merchants using their platforms, both to ensure that consumers are properly informed and to prevent the sale of illegal products and services through their marketplaces.
Search Engines. The scope of the current agreement now includes very large online search engines (VLOSEs) alongside very large online platforms (VLOPs). Search engines will be held accountable for conducting routine risk analyses to mitigate risks associated with the dissemination of illegal content; adverse effects on fundamental rights; manipulation of services that affects democratic processes and public security; and adverse effects related to gender-based violence, the protection of minors, and serious consequences for users’ physical or mental health.
Dark Patterns. Dark patterns are deceptive design practices deliberately crafted to obscure, mislead, manipulate, or coerce users into making unintended or potentially harmful choices. The DSA will prohibit these misleading interfaces and practices to the extent they are not already covered by the EU GDPR or the Unfair Commercial Practices Directive. Further guidance on which practices constitute dark patterns is expected to be issued under the final DSA.
Algorithm Accountability. The DSA will prevent online platforms from using algorithms that rely on sensitive characteristics such as gender, race, or religion to target users with online ads. Targeting minors with advertising based on their personal data, as defined in EU law, will also be prohibited.
Transparency & Profiling. Platforms must clearly describe their recommendation systems in the platform’s terms and conditions. Platforms also must allow users to modify the parameters used in the recommendation systems, and must offer at least one choice that is not based on profiling.
Crisis Mechanism. A new article introduces a crisis response mechanism, prompted by Russia’s ongoing aggression in Ukraine and the spread of misinformation through digital platforms. It will make it possible to analyze the impact of the activities of VLOPs and VLOSEs on the crisis in question, so that proportionate and effective measures can be put in place while respecting fundamental rights.
Supervisory Fee. VLOPs and VLOSEs must pay the European Commission an annual supervisory fee of up to 0.05% of their global annual revenue to cover the costs of enforcing the DSA.
Digital Regulation: Oversight and Enforcement
The DSA will create an oversight framework under which regulators can access the algorithms of VLOPs on request and within a reasonable time, where necessary to monitor and assess compliance with the DSA. Online platforms will have to meet risk management obligations, undergo external risk audits, and share data with authorities and researchers so that they can scrutinize how the platforms work and how online risks evolve.
According to information available through the European Commission’s website, enforcement of the new digital regulations will rely on cooperation at the national and EU levels to supervise how platforms adapt to the new requirements. Each Member State will need to appoint a Digital Services Coordinator, an independent authority responsible for supervising the intermediary services established in that Member State and/or for coordinating with specialist sectoral authorities. To enforce compliance, the Coordinator will be able to impose penalties, including financial fines or even suspension of services for egregious violations. Each Member State will clearly specify these penalties in its national laws in line with the requirements set out in the Regulation, ensuring they are proportionate to the nature and gravity of the infringement, yet dissuasive enough to ensure compliance.
Despite the steep financial penalties for failure to comply, up to 6% of a company’s global annual revenue, doubt remains over the EU’s ability to carry out the more challenging task of enforcing the new laws. While there are plans to add 230 new staffers to aid enforcement, some believe that number is nowhere near sufficient for going up against tech giants like Amazon, Google, and Meta. Despite several multi-billion-dollar rulings levied against Google in recent years, regulators have not forced the company to make major structural changes. And since the GDPR took effect in 2018, there has been a lack of meaningful action against tech giants such as Facebook and Google over data collection practices that violate that legislation. Supporters of the new legislation claim to have learned from the mistakes of the GDPR rollout and to have adjusted accordingly: rather than leaving enforcement up to regulators in individual countries, as was the case for the GDPR, the DSA will be enforced primarily from Brussels by the European Commission.
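For a rough sense of scale, the sketch below works out what those two percentage caps, the 0.05% supervisory fee and the 6% penalty ceiling, would amount to for a hypothetical company; the revenue figure is an illustrative assumption, not drawn from any actual filing.

```python
# Illustrative only: the revenue figure below is hypothetical and not tied to any real company.

SUPERVISORY_FEE_CAP = 0.0005  # supervisory fee of up to 0.05% of global annual revenue
PENALTY_CAP = 0.06            # fines of up to 6% of global annual revenue

def dsa_exposure(global_annual_revenue_eur: float) -> dict:
    """Return the maximum supervisory fee and maximum fine implied by the DSA caps."""
    return {
        "max_supervisory_fee_eur": global_annual_revenue_eur * SUPERVISORY_FEE_CAP,
        "max_fine_eur": global_annual_revenue_eur * PENALTY_CAP,
    }

if __name__ == "__main__":
    # Hypothetical very large platform with EUR 100 billion in global annual revenue.
    exposure = dsa_exposure(100_000_000_000)
    for label, amount in exposure.items():
        print(f"{label}: EUR {amount:,.0f}")
```

Under that assumed EUR 100 billion in revenue, the supervisory fee would top out at EUR 50 million, while a maximum fine would reach EUR 6 billion.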
The DSA still needs to be sent to the European Parliament and EU Member States for final approval, but it is widely anticipated to pass without issue. Once approved, the DSA will enter into force 20 days after publication in the Official Journal, and online platforms will then have up to 15 months to comply with the new legislation. As the global marketplace expands digital regulation to improve internet safety, whether the Digital Services Act will live up to its promises on enforcement remains to be seen.