The DSA Draft: Ambitious Rules, Weak Enforcement Mechanisms
The decades of platforms playing only by their own rules and under outdated laws are coming to an end in the European Union (EU). The basic framework for European platform regulation, enacted in 2000, has long needed an update and expansion in light of privacy breaches, disinformation campaigns and algorithmic discrimination related to online platforms. To tackle this task, the European Commission presented its Digital Services Act (DSA), an important legislative proposal that could help rein in big tech companies. The draft DSA contains new due diligence rules for platforms that have so far not been part of legislative efforts elsewhere in the world. For instance, platforms would need to conduct risk assessments and explain their algorithmic recommender systems. Many of these rules need clarification, but they are a step in the right direction.
As progressive as the new rules are compared to other legislative initiatives around the world, they risk falling victim to an inconsistent and complicated enforcement regime. Enforcing the DSA is the responsibility of several national regulators in the member states (such as media regulators, consumer protection agencies, competition authorities and telecommunications regulators) and the Commission. Member states would have to designate one of their regulators as the “Digital Services Coordinator” (DSC). Each DSC is meant to be the single national point of contact for DSA matters for the Commission and for platforms. It can enforce DSA rules and sanction platforms for violations. However, when very large online platforms (i.e., those with at least 45 million monthly active users in the EU) are concerned, national regulators are supposed to coordinate with the Commission in a long, multi-step process. In addition, there is a new European Board for Digital Services which advises the Commission. It is made up of national regulators, but chaired by the Commission, and can only issue opinions.
This enforcement structure relies strongly on existing regulators to expand their staffs and take on new tasks. In some member states, regulators might be capable and willing to do this, and might have already taken up some of these tasks. In other countries, regulators might be willing but need more time to secure funding and build up the necessary expertise; in still others, regulators may not be willing at all. As a result, EU-wide rules from the DSA could be unevenly enforced. This would considerably weaken the DSA’s impact and repeat some of the issues plaguing another landmark piece of EU legislation, the General Data Protection Regulation (GDPR). European data protection rules grant citizens important rights and impose duties on companies processing personal data, including big online platforms. Yet the GDPR’s potential positive effects are seriously hampered by different levels of enforcement across EU member states.
By involving the Commission and not leaving everything to national regulators, the DSA draft seems intent on learning from issues with GDPR enforcement. This structure, however, creates its own problems. The Commission is an executive body under political leadership, not the independent expert regulator that platform oversight requires. The well-meaning attempt to prevent another GDPR scenario could backfire and weaken enforcement of the DSA if national regulators do not step up and if turf wars erupt among national regulators, and between national regulators and the Commission. In the end, the beneficiaries would be the tech platforms, whose corporate decisions would still set the tone for platform design and whose business practices would still not be overseen consistently by an expert regulator.
To solve this problem, policymakers in the member states and at the EU level should overhaul their current enforcement plans and build a dedicated European-level agency charged with enforcing the DSA’s new due diligence rules. Transparency reporting, explanations for recommender systems and audits are matters of EU-wide concern and apply to tech companies with an EU-wide reach, so they should be overseen by an EU body: a European Digital Services Coordinator. This body should focus specifically on platforms offering citizens digital spaces where they exchange views with one another, consume and share news, and receive and send political messages. This would include search engines, social media sites, video platforms and messenger services with public, social networking functions. Such platforms are important enough, and different enough from other platforms and other industries, that they require their own specific oversight regime. Their design and business model also carry specific individual and societal risks, such as the amplification of disinformation, algorithmic bias and privacy violations, that necessitate their own oversight. The DSA provides the rulebook for addressing these risks, but not the right mechanisms to enforce the rulebook. Instead of relying on at least 27 different national regulators, the Commission and a new European advisory board, the EU should build a single European Digital Services Coordinator to deal with social networks and search engines.
A strong, well-staffed, independent European Digital Services Coordinator could focus squarely on social media sites and search engines, dealing with issues specific to these types of platforms and their frequently changing design and technology. While such expertise could also be built up among existing regulators, a dedicated agency has the advantage of not being distracted by other regulatory tasks. It could implement new processes of knowledge-gathering and knowledge-sharing with external experts from a diverse set of fields, processes that would be harder to establish across 27 separate national regulators and the Commission. Crucially, a European DSC would allow the EU to speak with one voice when addressing big tech companies, and it would prevent companies from settling in the country with the regulator most favorable to them.
The DSA draft contains important and ambitious rules for platforms. Policymakers in the EU should now work towards making the enforcement mechanism just as ambitious. But building a new agency such as a European DSC will face serious hurdles, including legal questions, and will not be completed overnight. Path dependencies in member states and at the EU level make any move towards a new agency difficult, as existing regulators and governments will likely cling to their power. This is visible, for example, in Germany: the country has taken a lead on platform regulation, and state and federal actors are now keen on keeping their own rules. These national efforts, however, are at once innovative and short-sighted: progressive rules for platforms are enforced by regulators that must multitask across regulatory fields unrelated to platform oversight. The DSA risks reproducing this structure if it is enforced by national regulators and the Commission, which is the likely scenario. Even if this system might work, the DSA should be seen as an opportunity to evaluate how much longer legacy regulators can be retrofitted to address platform issues and enforce new rules.
Future discussions about the DSA and its enforcement should include considerations of a dedicated, specialized EU agency that can focus solely on ensuring transparency and accountability for corporate decisions that shape the architecture of digital information spaces for millions of people in the EU.