3.1 What expertise on the tasks of the DSC exists in Germany, where are gaps?
Supervision and enforcement
By making the term “coordinator” part of the name of the new position, the aspect of coordination at the DSC is emphasized. However, the DSC is not only supposed to coordinate the enforcement of the DSA but also to take on enforcement responsibilities itself. For platforms that are not considered “very large”, the DSC provides oversight, and it may also be involved in the oversight of “very large” online platforms (see figures 1 and 2 in section 1). In addition, there are specific areas of responsibility in which the DSC is to be active regardless of platform size. In Germany, expertise in some areas the DSA covers is missing, while in others it is scattered across many bodies. As a result, a holistic view of platform supervision, which the DSA at the very least encourages, is lacking.
An important task of the DSC concerns data access and analysis. There is little experience in this area in Germany to date, which is why existing structures would have to be significantly expanded. The DSC must be able to request and analyze large amounts of data from platforms. The DSA stipulates that, upon request, very large online platforms must provide data so that the DSC where the platform is headquartered, as well as external researchers, can conduct investigations. In the past, misconduct by tech companies mainly came to light when journalists or researchers obtained internal data through leaks or whistleblowers. For example, Sophie Zhang exposed misconduct at Facebook in dealing with disinformation[5], and Frances Haugen denounced the platform for condoning the potential negative impact of its service on minors[6]. The DSA is intended to make it easier to obtain data from the platforms and thus better understand how they work, which should ultimately also help to improve the rules for content moderation and transparency, among other things, in the long term. Even considering that many very large platforms do not have their headquarters in Germany, the question remains: Which German authority would currently be able to handle large amounts of data, analyze them and draw conclusions from them?
In many German authorities, data-driven regulation is still in its infancy. The need for and potential of data science in regulatory agencies are recognized, but the structures for this work are still being built in many cases. One example of an agency that already works with data and is expanding these activities is the BNetzA. For example, it receives extensive data sets as part of market analyses. Moreover, a sub-department is being set up there specifically to focus more on data science: In addition to the historically established sector-specific oversight tasks for electricity grids or telecommunications, the “Internet, Digitization Issues” sub-department does not look at individual sectors, but rather works on studies and market analyses on platforms or on certain topics such as “artificial intelligence”, independently of preexisting regulatory rules. Other agencies also work with large data sets and their own databases, for example, the media authorities with their media database or the market observations conducted at the Federation of German Consumer Organisations (“Verbraucherzentrale Bundesverband”, vzbv). Such expertise would need to be significantly expanded so that data can be used to better understand systems for content moderation, algorithmic recommendations or the placement of online advertising. This requires specialized experts in computer and data science[7]. In addition, practical experience and knowledge of sociology, anti-discrimination, human rights and psychology are needed. Apart from the necessary expertise, there must also be the motivation to request and evaluate platform data, which has so far rarely been part of the self-image of German authorities.
However, it is not only the DSC that will be able to request platform data in the future. One of the most important innovations of the DSA is that researchers will also have this option. Here, too, the DSC has a role to play: Before researchers can obtain data, the DSC must vet the applicants, for example, by checking their data protection concepts and research purposes. Such prior vetting of researchers for data use is not yet provided for in German law, at least not by authorities. Each platform can decide according to its own rules whether to make data available and, if so, to whom and in what way. There are guidelines in Germany on how to deal with requests for data access. However, firstly, it is up to the companies and not the authorities to check these requests and, secondly, there is hardly any experience to date in this regard, as the rules have only been in force since the beginning of 2022 (based on the NetzDG; see below). Certification procedures, for example, at the BNetzA or at the media authorities, offer an indirect parallel to this vetting process, although they often concern technical systems, for instance, regarding age verification, and not people. This means there is also a competence gap in Germany with regard to vetting researchers. Here, it might be worth taking a look at the strengths and weaknesses of the different approaches taken by tech companies.
Shaping the rules on data access is thus an important task specifically assigned to the DSC, and one for which there is no long-standing experience in Germany. Expertise already exists on other supervisory tasks of the DSA, although this expertise is spread among several bodies and only covers parts of the DSA in each case. For instance, this is true for the issue of content moderation and the removal of possible illegal content.
The DSA stipulates that platforms must explain their content moderation to users in a comprehensible way and report on it annually. There should also be notice-and-action mechanisms for potentially illegal content. If platforms have been informed by authorities about illegal content, the companies must delete this content. These rules affect many authorities and areas of law in Germany – they may have to do with product safety, freedom of expression, other fundamental rights protections, the criminal code or all of the above. Different sets of rules and authorities deal with these diverse aspects. Some important German laws that touch on these parts of the DSA are the Interstate Media Treaty (“Medienstaatsvertrag”, MStV), the Network Enforcement Act (“Netzwerkdurchsetzungsgesetz”, NetzDG) and the Youth Protection Act (“Jugendschutzgesetz”, JuSchG). They explicitly address some types of online platforms that are also covered in the DSA. Based on these laws, the strengths and weaknesses of German platform supervision become visible.
The MStV is the key piece of legislation on media regulation in Germany. It is an “interstate treaty” because it is agreed upon by the German federal states. After years of reform, it came into force at the end of 2020, with some of the associated statutes not coming into force until the beginning of 2022. With this reform, the state media authorities are responsible for “media intermediaries” for the first time, a category that partly overlaps with “online platforms” from the DSA. Media intermediaries include social networks, video portals and search engines, but not online marketplaces, which are also covered in the DSA. For example, the media authorities are to ensure that these services explain their recommendation systems. In contrast to the broader goals of the DSA, however, this transparency requirement is mainly about ensuring media pluralism and diversity of opinion. Other issues of fundamental rights protection are not explicitly addressed, whereas the DSA refers to all fundamental rights and specifically mentions consumer protection and the right to privacy, for instance.
The Federal Office of Justice (BfJ) also focuses explicitly on the supervision of online platforms due to the NetzDG.[8] The NetzDG applies to social networks, not to online marketplaces, which are also covered by the DSA. Since 2017, the NetzDG has stipulated, among other things, that platforms must provide users with notice-and-action mechanisms for potentially illegal content and submit reports on content moderation and deletion. These rules apply to platforms with at least two million registered users in Germany, a much lower threshold than the 45 million monthly users in the EU that, according to the DSA, constitute a “very large online platform”. The BfJ is supposed to ensure compliance with the rules, but for a long time it had only limited powers to do so. It was only through a subsequent NetzDG reform, in force since the end of 2021, that the BfJ was given any supervisory powers at all for online platforms. Before that, the office was a “prosecuting authority” and therefore was not allowed to actively contact tech companies on regulatory matters, but only to communicate with them in lengthy, formal processes regarding possible violations of the law.[9] This weakened NetzDG oversight for a long time.[10] But there were weaknesses even in the measures that were allowed before the reform: For instance, the BfJ imposed a fine on Facebook, but the corporation refused to pay it for years without any consequences.[11]
Unlike the BfJ, the Federal Agency for Youth Media Protection (“Bundeszentralstelle für Kinder- und Jugendmedienschutz”, BzKJ) is explicitly following a regulatory approach focused on a dialogue with companies. It is based on the Youth Protection Act, which came into force in 2021 after long and sometimes conflictual discussions between the federal and state governments. Similar to the state media authorities, the thematic focus here is very narrow, as it deals exclusively with the protection of children and young people.
These examples highlight that there is an awareness in Germany of the need for separate rules for platforms. This is fundamentally in line with the DSA. In addition, the laws have had the effect of building up specialized expertise on platforms in the relevant authorities. This also applies to other authorities, for example, the BNetzA and the BKartA. However, weaknesses also become apparent when the tasks of the DSC are considered: One lesson from the development of the BfJ is that the DSC should be allowed to communicate with platforms and needs sufficient clout to be able to assert itself against them, if necessary. Media regulation and youth media protection are concerned with key aspects of the protection of fundamental rights on platforms, which is in line with the goals of the DSA. However, they only cover partial aspects of the DSA, which goes beyond social networks and beyond issues of media pluralism and youth media protection. Another challenge for the DSC is to take into account the special features of smaller platforms. In Germany, platform oversight often focuses on very large online platforms, but after the entry into force of the DSA, these are to be supervised mainly by the Commission. This is the case, for example, with the BKartA, where the field of activity, by its very nature, mostly encompasses larger companies and corporate mergers. After the entry into force of the Interstate Media Treaty, the media authorities also focused their attention on very large platforms[12] on the one hand and on individual blogs and websites[13] on the other. Yet, in Germany, there are also many smaller social networks and online marketplaces that could be affected by the rules of the DSA.
Unlike large, global tech corporations, they are less in the spotlight, often follow in the tradition of medium-sized businesses (“Mittelstand”) and have fewer resources for political and economic networking.[14] Even more than larger platforms, they could benefit from exchanges with an authority that knows these conditions and takes them into account in its own communications work.
Considering this, it becomes clear that there are promising approaches to platform oversight in several places in Germany, whether in the development of data-based supervision, on important topics covered by the DSA, or in regulation that relies on a dialogue with platforms. However, there is a lack of focus on a holistic, fundamental rights-based platform oversight structure that particularly takes into account small to medium-sized platforms. The DSA does not directly demand such a focus from an individual DSC, but the envisioned oversight tasks at least encourage building the relevant expertise. It is therefore worthwhile to draw lessons from existing structures. This is not to say, however, that these structures are or should be the only blueprint. Parallels to other industries and regulatory approaches need to be considered, but social networks are not television broadcasters and online marketplaces are not postal service providers. Accordingly, the “institutional design” of platform regulation must be based on a “holistic” rather than a fragmented understanding of platforms and content moderation.[15] Thanks to several reforms in media regulation, the NetzDG and also in digital youth and consumer protection, Germany is further along than other countries, but it is still far from such a holistic approach to platform oversight.
Coordination and cooperation
Germany already has a lot of experience regarding the coordination of different agencies, which could help the DSC. However, even with this experience, it is an open question whether the coordination tasks of the DSC are already being fulfilled by existing bodies.
Examples of intra-German coordination mechanisms can be found in many places. The need for this is particularly pronounced in policy areas in which the federal states play an important role. This applies, for example, to data protection: The state authorities are responsible for supervising the private sector in their federal state. In the Data Protection Conference (“Datenschutzkonferenz”, DSK), they draw up joint statements or resolutions under an annually rotating chairmanship. The Federal Commissioner for Data Protection and Freedom of Information (“Bundesbeauftragter für den Datenschutz und die Informationsfreiheit”, BfDI) is also represented in the DSK and is responsible for data protection supervision of the federal authorities (and some private sectors).
In media regulation, too, institutions at the state level are responsible for the supervision, in this case of TV stations, radio stations and some online services. But there is no federal authority as there is in data protection. At the federal level, however, there is a “Joint Management Office” of the state media authorities, which was created expressly to serve as a “central point of contact” for the state media authorities and to coordinate their work. It also supports bodies of the state media authorities at the federal level, such as the Directors’ Conference (“Direktorenkonferenz”, DLM), the Commission for the Protection of Minors in the Media (“Kommission für Jugendmedienschutz”, KJM) and the Commission on Licensing and Supervision (“Kommission für Zulassung und Aufsicht”, ZAK). These bodies not only coordinate the exchange of information or the drafting of opinions, but also make regulatory decisions on media supervision. Their structures vary: In some cases, they consist exclusively of representatives of the state media authorities (as in the case of ZAK; the costs for this are borne by the state media authorities); in other cases, other representatives of authorities and interest groups also participate (as in the case of KJM).
These examples reveal how differently coordination structures can be set up in Germany, depending on the legal basis and also historical developments. The “Joint Management Office” is a permanent point of contact and does not itself issue coordinated opinions. This means that it is set up differently from the Data Protection Conference, whose office changes annually depending on the chairmanship and whose coordination work consists, among other things, of drafting joint opinions. Both, in turn, differ from the other bodies of media regulation, which not only bring together representatives of the media institutions, but also have a say in supervision and enforcement.
There is an additional level of coordination that is required of the DSC: Its task is decidedly not only a matter of coordinating authorities from one policy field, as is the case with data protection and media regulation, respectively. The DSC must deal with issues that were previously the responsibility of different authorities. For example, the DSA regulates which data may be used for targeted online advertising, which has a strong connection to data protection, and it also prescribes reporting channels for potentially illegal content, which touches on issues of criminal law.
There are also examples of this type of cross-sectoral communication between authorities in Germany. These range from informal and sporadic discussions to regular meetings and formalized cooperation. At the working level in particular, employees from different authorities engage in informal exchanges. At the management level, there are both ad-hoc meetings (for example, when representatives of media institutions and the Federal Office of Justice discuss the NetzDG[16]) and regular formats (such as the annual talks between the state media authorities and the Federal Cartel Office[17] or the exchange between the BfDI and the Federal Office for Information Security (“Bundesamt für Sicherheit in der Informationstechnik”, BSI)). Formal cooperation can relate, for instance, to a joint investigation into messenger and video services (as happened between the Federal Cartel Office and the BSI[18]) or a joint procedure for dealing with complaints (as agreed by the BNetzA and the media authorities[19]).
These very different coordination mechanisms should be thoroughly evaluated for the establishment of the DSC: What kind of coordination should the DSC actually take on? How does this function relate to its own enforcement tasks? What forms of information exchange and what coordination mechanisms have proven effective? What degree of institutionalization is needed? Does a chair make sense, and if so, what kind?
The DSC would have to combine several components of previously known formats: Like the Data Protection Conference, it would have to bring together federal and state authorities. Like the KJM, it would have to combine coordination and supervisory tasks. Like the procedural rules of the BNetzA and the media authorities, it would have to enable the formal exchange of information between policy areas (see also section 4.1). Such an oversight body at the federal level, which assumes both coordination and supervisory functions across policy areas, does not yet exist for platforms.
In addition to coordination within Germany, exchange at the European level is also a task for the DSC. German authorities have experience in this area, too, particularly because of their work in European regulatory networks. Such networks exist on almost all topics (see Figure 4 in section 3), but they vary in strength and institutionalization. The European Regulators Group for Audiovisual Media Services (ERGA), for example, is still relatively young, has no office of its own and can issue opinions to the Commission upon request. The Body of European Regulators for Electronic Communications (BEREC), meanwhile, was established as an EU body in a legal text, maintains an office and its opinions must be taken into account by the Commission. In both cases – and also, for example, in competition law or consumer protection – such EU networks enable German bodies to exchange information with other European authorities as well as the European Commission. This is an important task of the DSC, where many German bodies already have experience.
Some of the European networks are also linked to each other or at least exchange information with each other. Examples include meetings between BEREC and ERGA[20] or the participation of the BNetzA in the Europe-wide network of consumer protection authorities (Consumer Protection Cooperation, CPC) on geoblocking. This type of exchange across several topics is less pronounced at the EU level, however. Yet the DSC will have to engage in precisely this kind of exchange, for instance, as part of the newly created “European Board for Digital Services”. This body is to consist of all DSCs, which means that regulators from different areas could be represented here. For example, France had brought its reformed media regulator into play as a DSC. In other countries, meanwhile, it could be consumer protection or telecommunications regulators, or completely newly created agencies. Beyond cooperation within the body, there may also be specific cases in which different DSCs jointly conduct investigations or exchange information. For the German DSC, it is therefore useful to collect best practices on interdisciplinary and cross-border oversight structures.