Responses to the European Commission’s consultation on the European Democracy Action Plan (EDAP)
The following statement summarizes some of the key points from the response to the European Commission’s consultation on the planned European Democracy Action Plan (EDAP). The entire submission to the consultation, including this statement, can be downloaded as a PDF via the link on the right-hand side. The statement was not part of the official submission to the Commission.
In addition, a coalition of European civil society and think tank organizations developed a comprehensive policy paper on the EDAP, which can also be downloaded via the link on the right-hand side.
The European Commission’s push, with the European Democracy Action Plan (EDAP), to contribute to the debate on strengthening European democracy and to offer concrete proposals is welcome and necessary. An ambitious action plan, encompassing measures that range from enforcement of existing laws to new legislative proposals to cooperation among member states and EU institutions, can help address some of the challenges democracy has faced in recent years. One of the many developments that can both boost and undermine democratic processes is the online media and information space, which has become central to many EU citizens’ opinion formation. Questions of election integrity, political advertising and disinformation can all be viewed in light of this digital public sphere.
This summary highlights two key responses to topics addressed in the EDAP consultation. First, it lays out the need for enhanced transparency in political advertising, both for large platforms and for political advertisers. Second, it argues that procedural accountability should be established in Europe in order to tackle disinformation.
The summary, which was not part of the official submission to the consultation, draws on the earlier contribution to the Commission’s EDAP roadmap and on the submission to the Commission’s consultation on the Digital Services Act (DSA), written with Aline Blankertz. Moreover, the responses to this consultation benefited greatly from discussions in a group of civil society and think tank representatives convened by the European Partnership for Democracy. A statement and a joint position paper emerged from these discussions, further addressing the overarching themes of the EDAP.
We thank the Commission for providing the opportunity to participate in the consultation and look forward to engaging further on this important Action Plan in the future, not only with the Commission and the European Parliament but all interested stakeholders.
Establishing binding rules and oversight for online advertising
Online political advertising has proven to be a valuable means of communication for reaching voters and supporters. However, it also carries certain risks for democratic processes. Without proper transparency and financial accountability mechanisms, big-money interests can take over citizens’ digital information spaces, for instance, on social media or on search engine results pages. In such zone-flooding efforts, pop-up campaigns can run thousands of ads in a short period of time. There is little opportunity for public interest scrutiny of zone-flooding or of online advertising more generally: researchers, journalists, regulators and voters themselves largely rely on the transparency tools voluntarily offered by tech companies, which have been found to be insufficient, and on oversight structures that were established for traditional journalistic media and are therefore outdated.
Online advertising, moreover, is most often targeted at rather homogeneous groups based on their inferred interests, fears and preferences (unlike more traditional contextual advertising, which uses less personal behavioral data). Such microtargeting can lead to a strong segmentation of the citizenry online into homogeneous groups and hinder public interest counterspeech against ads. Crucially, it may allow campaigns to tailor their political messaging to the presumed fears and preferences of narrowly targeted groups, drawing on the many personal behavioral data points gathered on citizens as they go about their daily lives online.
Candidates, political parties and other campaigners are not trying to sell products and services when advertising, but are instead paying to shape political debates and influence voting decisions. A lack of options for public interest scrutiny of online political ads can therefore weaken the legitimacy and integrity of elections and political campaigning more generally.
The EDAP, in sync with other Commission initiatives, especially the DSA, should define the baseline requirements for transparency and accountability for paid online political messaging in Europe. To establish meaningful transparency, the following measures should be implemented:
- Mandatory, expanded and vastly improved ad archives including information on targeting and engagement metrics, data sources, and ad financing
- Mandatory, expanded transparency reporting on processes for ad targeting and ad delivery
- Mandatory, improved ad disclaimers
- Mandatory advertiser verification
Such transparency standards should be applied to all online advertising, not just paid political messages. There could be additional rules for political advertising, including restrictions on behavioral microtargeting and expanded financial accountability reporting by platforms and political advertisers such as European parties and candidates.
Compliance with these requirements should be checked by an independent oversight body that has the technical expertise as well as staff and budget resources to audit transparency reports. Relying solely on self-regulatory measures by private corporations is not enough.
Tackling the spread of disinformation online
Risks associated with the spread of disinformation include ever more fractured political debates, a weakening of democratic processes such as elections, infringements on citizens’ fundamental right to form their opinions free from interference, and, as was the case with disinformation on the COVID-19 pandemic, harms to individual and public health. However, there are also risks associated with tackling disinformation, such as infringing on citizens’ right to freedom of expression and strengthening governmental and corporate control over speech. Tackling disinformation is thus an important, yet difficult and delicate task. In the digital public sphere, dealing with disinformation as well as discriminatory content has become even more pressing due to the widespread use of social media, search engines and video apps, where such content can spread easily and in a targeted manner. Platforms offering these online communication and media spaces can assist democratic processes, but they can also amplify the dangers stemming from disinformation and discriminatory content.
There is a clear need for EU-wide regulation of these platforms, which could be addressed with Commission proposals in the EDAP and the DSA. A continuation of current regulatory and self-regulatory practices is ill-advised. Continuing to rely only on national (criminal law) rules to tackle these challenges is misguided and insufficient. Such rules largely focus on removing individual pieces of harmful or illegal content without addressing the overarching market failures that create the incentives not to tackle disinformation more effectively. Moreover, neither governments nor companies should be left on their own to decide what content to delete, and thus how to balance free speech concerns against the potential harms associated with disinformation and discrimination. Self-regulatory measures such as the Commission’s Code of Practice on Disinformation were valuable and welcome first steps towards addressing these issues but have failed to effect meaningful change at the companies addressed.
The EDAP should offer a clear, human rights-based framework for the DSA and other proposals concerning large online platforms. It should ensure that the DSA delivers clear legal guidance for platforms that does not center on enforcing decisions on individual pieces of content, but instead focuses on the processes for accountable corporate decision-making. This could include a common EU framework for content moderation policies and practices based on international human rights standards, mandatory transparency and accountability reporting, and independent oversight.