Talk at re:publica23: What a Strong German Platform Oversight Should Look Like

SNV in the Media

SNV project director Julian Jaursch gave a talk and engaged in a discussion with the audience on how to set up Germany’s platform oversight governance in the wake of the EU’s Digital Services Act (DSA). The German-language video is below, along with a transcript in English. Some sources and further information on both the talk and the thoughtful audience questions are provided as well.

 

Summary of the main points from the talk and discussion 

  • The DSC (Digital Services Coordinator) does more than just coordinate: It has important enforcement and oversight responsibilities. 
    • E.g., complaints body: The DSC can only identify systemic failures if it develops sufficient platform expertise of its own and is the central point for consumer complaints. 
    • E.g., data access: Without its own expertise and network, it will be difficult for the DSC to conduct data analyses and process data requests. 
    • It therefore makes sense to bundle competencies for platform oversight at the DSC and to keep the list of other authorities responsible for the DSA as short as possible. Otherwise, there is a risk that platform oversight could become fragmented. The DSC would then be no more than a secretariat, barely able to take on oversight tasks itself. 
  • The text of the DSA and the practicalities of DSA enforcement require that the DSC obtain external expertise. 
    • An advisory board may be appropriate for this purpose if it exists alongside other formats for exchange, can act as a technical body to provide expertise and its membership is driven by topical expertise, not political considerations. 
    • A research unit at the DSC that conducts its own studies and builds a network with experts from academia and civil society could help build platform expertise and thus enable the DSC to engage at the EU level. 

 

Video (in German) 

Translated transcript of the video 

The transcript has been lightly edited for clarity. 

 

Hello and welcome to the talk “Germany’s Search for a Digital Services Coordinator”. 

 

This Digital Services Coordinator is a new public authority that is to be created in Germany to enforce EU rules for online platforms. This involves transparency for advertising, complaint mechanisms and risk mitigation. 

 

That all sounds a bit dry now. Authorities. Regulation. Risk mitigation. But this set of EU rules has the potential to touch many of us in our everyday lives, because many of us probably use platforms in some way: search engines, online marketplaces, booking sites, forums, social networks. 

 

The goal of this set of rules is, among other things, to protect users’ fundamental rights. But this can only work if these rules are enforced well. That’s what I’ll be talking about in the next 15 minutes or so. I will first briefly describe these EU rules as such, then discuss the question of who should enforce them at the EU and national levels and at the very end I will focus very specifically on Germany. 

 

We have been dealing with these questions at SNV for a while now. SNV – the “Stiftung Neue Verantwortung” – is a not-for-profit think tank. We work on various digital policy issues. My name is Julian Jaursch and, together with some of my colleagues, I am focusing on the topics of platform oversight and platform regulation. Our analyses and proposals are all available publicly and free of charge on the website. When it comes to platform oversight, we have done our own research, talked to many people, conducted expert interviews with people from academia, civil society, business, and conducted a survey of regulators. Based on that, I’d like to present a few of the conclusions we’ve drawn here. 

 

 

Overview of the DSA: EU rules for online platforms 

 

I’ll start with the point about the EU rules: This is the DSA, the Digital Services Act. The DSA is a set of rules for online platforms that has been in force since the end of 2022 with the stated goal of creating a “transparent and safe online environment” for us users. 

 

The DSA aims to achieve this goal by setting rules for platforms. This is a departure from letting platforms decide things on their own. A few examples of this: How do platforms decide what content they delete, what content they recommend, what content they block? Do platforms explain their algorithmic recommendation systems? Do platforms have complaint channels for users? These are all questions that, before the DSA, were in most cases left to the platforms: There were voluntary commitments and measures, and sometimes national laws on individual aspects. With the DSA, there should now be an EU-wide set of rules for such issues. Platforms in the EU are thus given a framework, and so are member states. 

 

The DSA certainly has weaknesses. There are issues where some people would have liked to go further. So, the law is not perfect. But even if the DSA were perfect, even if there were perfect rules – the best rules are of no use if they are not enforced well. 

 

 

Enforcing the DSA: Shared responsibilities for the Commission and member states 

 

This brings us to the second part and the second question, the enforcement of these new EU rules. Here, a very rough distinction must be made between very large online platforms, which are mentioned in the DSA, and all those that are not considered very large. Very large online platforms are those that have at least 45 million users in the EU per month. These are well-known services such as Facebook, Instagram, TikTok, Wikimedia, Zalando, Booking and others – currently 19 in total. 

 

For these very large online platforms, the European Commission is mainly responsible for checking their compliance with the DSA. This is a relatively new task for the Commission. The Commission has rarely acted as an actual supervisory or enforcement authority because that is not its role. That’s why there’s quite a bit of restructuring going on in Brussels right now, new teams are being set up, new people are being hired. The Commission is also getting support, including from the European Centre for Algorithmic Transparency. That was opened in Seville in April. It’s a research facility that’s there to help the Commission with enforcement for these very large online platforms. 

 

But for the online platforms that are not very large, the member states are responsible, i.e., the member state where that platform is based. In practice, this means that the member states can appoint several authorities to enforce the rules of the DSA. 

 

However, the distinction I have drawn here is not so strict. There are also ways in which the Commission and the member states can work together. First of all, the member states can sometimes help with enforcement regarding very large online platforms. Secondly, the Commission and the member states form a new European body, the European Board for Digital Services. This is an advisory body in which they also have to work together. So, the Commission and the member states are part of it. However, only one authority from each member state can participate in this. I just said that the member states can appoint several authorities. However, for the Board, there can only be one authority per member state. 

 

Now this is probably not a big surprise – this is exactly the Digital Services Coordinator from the title of the talk. This Coordinator has the task of coordinating the national authorities, it has the task of sitting on the Board, but the Coordinator does more than just coordinate. The Coordinator has also been given a whole range of important oversight and enforcement tasks. I’d like to provide some examples to briefly show what exactly the DSC has to do. 

 

 

The DSC for users: Key complaints body 

 

For the people in the audience who use platforms – be it search engines, online marketplaces, whatever – the DSC is pretty important. The example that I picked out here to show that is the DSC as a complaint body. The DSA does require that platforms themselves must have complaint mechanisms. But what happens, for example, if precisely these internal complaint channels do not work? Then consumers can turn to the DSC. What if a booking platform or a social network does not explain the algorithmic recommendation systems well? Then consumers can turn to the DSC. 

 

The German consumer protection organization vzbv has emphasized very strongly that it is important here that there is a central complaints body – so that users are not sent from one authority to the next, so that they can get feedback in a timely manner. That, in turn, means to me that the DSC needs to build expertise on different risks on different platforms for different groups of people. So, some specialization and building expertise at the DSC is important. 

 

 

The DSC for research: Own analyses and networking tasks 

 

For those who not only use platforms, but also do research on platforms, the DSC is even more important, namely to help open up the oft-cited “black box” of algorithmic platforms. 

 

Until now, if researchers wanted data from platforms, they usually had to rely on the good will of the platforms. They had to make a case to the platform if they wanted to explore, for example: How does certain content spread? How do the recommendation systems work? Sometimes that worked – there is research on platforms, after all – but in many cases it didn’t: The data was flawed or not complete, the data interfaces were not good, it was too expensive, there were issues about who got what data and who didn’t. 

 

That’s why the approach in the DSA was: There must be legally guaranteed access for researchers to data. This is roughly stated in the DSA and will be refined in further laws – but the actual implementation of the whole thing lies with the DSC. The Commission and the Coordinators can request data themselves, but it is only the DSCs that are also responsible for ensuring that researchers, whether from universities or civil society, can request data from platforms. This means that here, too, it is very important that the DSCs build up expertise in order to understand platform risks, to be able to evaluate such applications and to develop a network with academia and civil society. 

 

If all these tasks are taken together, the result is a kind of blueprint or ideal Coordinator. That’s what I’ll try to highlight in the following. 

 

 

Envisioning a strong DSC: Collaborative, specialized, networking 

 

The DSC must be collaborative. This mainly concerns its coordinating function. It has to bring the different authorities together and be in exchange with them. But I think the examples of data access and the complaints mechanism have also shown that it has to build up expertise. The DSC must have some kind of specialization in platforms, platform risks and platform business models so that the potential benefits for users and researchers can be realized. The last part is networking. This is important not only because the DSA requires the DSC to network, but also for practical reasons. Expertise cannot be built if the Coordinator is not in communication with external experts. 

 

This brings me to the last part and to the question: What is the situation here in Germany? And that also brings me to the last abbreviation for today: KDD, the “Koordinierungsstelle für digitale Dienste”, Germany’s potential DSC. Unfortunately, I have to put a very large asterisk next to it, because it’s not official yet. For months, there have been discussions about where the German DSC should be set up and how it should be set up. There will be a draft law on this from the Federal Ministry for Digital and Transport, which will then be debated in the Bundestag, but this draft law does not yet officially exist. This means that many of the things that are coming now are speculative and are intended to enable people who are interested in this to later evaluate this draft law or at least the reporting on it. 

 

I will now try to point out some criteria for the German DSC and how platform oversight might succeed in Germany, and I will refer to the three areas I just mentioned. 

 

 

The German case: The need to combine competencies 

 

In Germany in particular, it is important to keep the coordination effort to a minimum. This is partly due to Germany’s federal structure. At the state level and at the federal level, there are many authorities, institutions and bodies that somehow have to do with platforms. It was always clear that there would not be just one regulator in Germany that would be responsible for platform oversight. But it should really be ensured here that as many competencies as possible are collected at the DSC and that as few exceptions as possible are made for when the DSC is not responsible for DSA enforcement. The risk that I see otherwise is that because of all the coordination work, the DSC will not be able to perform all the other important tasks that I have presented, which would be very helpful to users and researchers. 

 

The second point, on expertise: There has been some talk of providing a research budget for the German DSC. I think that would make a lot of sense, and it would be great if something like that were to happen. I would even go further and say that the DSC needs its own research or data unit, where data analyses can be carried out and where a network with the scientific community can be built. 

 

On the last point of networking, it is important to have different exchange formats to involve civil society. That also seems to be the wish of the legislators. There have been discussions about setting up an advisory board at the DSC. I think that makes sense in principle, too, if it exists alongside other exchange formats, so that there’s no overreliance on this one rather rigid format of an advisory board. 

 

What I also see as a small risk with an advisory board is that it could become a kind of fake integration of external expertise. Unfortunately, that is not uncommon. The Maecenata Institute did a study in which it looked at all advisory boards at the federal level and found three things, among others. First, civil society is almost never involved. Second, there is very little transparency about who sits on these advisory boards and what they are allowed to do. Third, the advisory boards are almost never evaluated. The German DSC could do better on all counts. It could involve civil society from the outset, provide a clear role for the advisory board, set up the nomination process so that it is really driven by substantive rather than political questions, and there could be a legally required evaluation of the advisory board. 

 

If these three points are taken into account – i.e., bundling competences, developing expertise and truly involving civil society – then the Coordinator in Germany can hopefully play a strong role for us as platform users. 

 

 

Taking a long-term view: Towards an independent platform agency in Germany 

 

Personally, however, I would also say – and here I will allow myself a bonus abbreviation – that in the long term it would make sense to establish a truly autonomous, independent, specialized new agency: a German Digital Services Agency. 

 

I’m well aware that a new agency is not a panacea. It’s expensive, it takes a long time to build, it’s bureaucratic. But in this particular case for the DSA, it really makes sense to consider it. After all, the DSA is calling for independent platform oversight anyway, and it’s not just the DSA that deals with platforms, data protection and the data economy. There are a number of German and EU proposals that all touch on this: the AI Act, the regulation on political advertising, the reform of the GDPR. It would make sense to think about the advantages of a German Digital Services Agency for the medium or long term. 

 

I’ll close with that for now, I’m looking forward to your questions, comments and criticism, and I’d like to thank you very much. 

 

Remarks on audience questions 

Cases of conflict and centralization 

I would be interested to know whether there is a mediation procedure if the authorities in the EU do not quite agree. I would also be interested in your assessment of whether centralizing German enforcement at one authority makes sense, considering the mixed results of decentralized data protection enforcement. 

 

On the first point, if the different DSCs from the member states don’t agree, my understanding is that the Board plays a role. There is a process for how these authorities can come to an agreement among themselves. But it’s also important to note that for the very large online platforms, which will be the focus of a lot of questions, the Commission is responsible anyway. I would have to think about which cases could involve cross-border issues at all, where, for example, the German DSC has issues to clarify with the Spanish DSC that then have to be resolved. 

 

[Note after the talk: Article 58 creates conditions for one DSC to “nudge” the other in certain cases. So, if a DSC is inactive, there is a procedure on how to proceed in this case. What hurdles there are in practice is still unclear.] 

 

To your second question about my opinion on centralization: I only have an opinion on that and no empirical evidence as to which is better or worse. I’ll come back to what I mentioned about combining competencies. Data protection oversight, which is decentralized in Germany, as well as media law enforcement, all has its advantages, but I think we’ve also seen in both cases that it sometimes takes a long time. It all works, you can see that in practice, but it just takes time, and I do think that there would be advantages to centralizing enforcement. That’s why I find this idea of a Coordinator who really takes on an interface function interesting. But this is only possible if the DSC can really do this and if a procedure is not created where there is the DSC and, in parallel, ten other authorities with whom it always has to exchange information. That is precisely the role of the DSC: it talks to the various regulators in advance and can then represent Germany at the European level. The DSC can’t do that if it’s not actually allowed to speak and has to consult with the other authorities across two or three political levels. 

 

[Note after the talk on the centralization at the EU level: The centralization of enforcement at EU level makes sense, as it ideally ensures a uniform interpretation and the services addressed move across the entire EU market anyway. The fact that centralization initially takes place at the Commission is also the pragmatic solution in the short term. Nevertheless, the role of the Commission should be critically questioned and accompanied, as this institution is not an independent supervisory body (see the SNV paper on the original DSA draft and the Center for Democracy and Technology’s statement on it). It is therefore important to also consider alternative bodies for the enforcement of the DSA in the long term.] 

 

DSC advisory board and coordination 

I have two more questions as well. One is about the advisory board. Such an advisory board cannot, I think, provide the function of coordination among the authorities. The advisory board can be a control body of the DSC as an independent authority, which it is supposed to be. You have to look at that: Is it all working? Where are there difficulties? There are obviously very different ideas about how it should be staffed. I think the advisory board can only take on such a task because it can’t meet every week. What do you think, for example, of the idea that the various authorities at the state and federal level have mirror units at the DSC, where they are responsible for the exchange of communications and then sit together at the DSC? Mirror units are common at the state chancellery level and at the federal chancellery. I believe that this ensures the best possible flow of information and that the link back to the authorities is guaranteed. 

 

On the first point about advisory boards: A control function would be possible, but I would disagree that that is the only possibility. The advisory board could also be there as a substantive expert body and – I would agree with you on this – not to take decisions away from the authorities or to regulate them in any way. There is no authorization or legitimation for that. But if you see the advisory board as an expert body that brings expertise to the DSC that the DSC doesn’t have in-house, I think that makes sense. For me, that would be more than a control body. For me, that would then also suggest that the DSC itself should have some say in who sits on the advisory board. Because in the best case, the DSC knows, “Okay, I have units XY and ABC here, I have people there who are experts. But right now, I have a problem on a completely different topic, I need expertise, I can get it from the advisory board, I can get it from a conference, I can get it from a consultation.” There really need to be different formats to bring that expertise to the DSC. 

 

On the second question about the mirror units: I definitely think it’s interesting, but I’m just not familiar enough with how it works in practice, for example, in the chancellery. In my ideal scenario, you wouldn’t need them so much, because a lot is left to the DSC anyway. If the DSC can take on many of the responsibilities that are actually granted to it in the DSA and these do not go to other authorities, then the mirror units are not quite so important. 

 

Regarding the interjection that media regulation is a matter for the federal states: Yes, that’s true, but the bill could state right there that this remains unaffected. Everything that the states have done so far, they continue to do if it is not affected by the DSA, and that is often the case. I’m not sure that the mirror unit is necessarily the only option or the necessary option then. But as I said, I think it is one possibility. We would have to see whether this could be transferred from the state chancelleries and the federal chancellery. As far as I know, there is no such thing at regulators yet. There are written agreements and there are also informal meetings between authorities, but I am not aware of any federal authority having a kind of mirror department with another federal authority. It would be an open question whether this could be tested here with the DSC. 

 

Sanctions 

Who is responsible when platforms don’t follow the rules? Does the DSC play a role here, too, and if so, what leeway does it have to impose penalties? 

 

That depends on the platform. In the cases of the very large platforms I just mentioned, violations are not necessarily detected by the DSC. There are processes for their involvement in certain cases but in the vast majority of cases, the Commission will do that. The Commission will look at the TikToks and Amazons and App Stores and Google Searches of this world and try to detect violations and make improvements or just penalize them. The DSCs are responsible for the platforms that are not very large. Here we have the caveat again – that was also the discussion just now – for the violations that are already regulated by other authorities, these authorities remain responsible. If a smaller social network violates data protection rules, this has nothing to do with the DSA and the data protection authorities continue to be responsible. If a not-very-large search engine violates competition rules, that has nothing to do with the DSA and the competition authorities remain responsible. Only for issues that are really regulated in the DSA – on content moderation, complaint channels, trusted flaggers, transparency of online advertising – would the DSCs then try to detect and penalize violations. 

 

Research units at the DSC 

I wanted to come back to the research units that you suggested at the DSC. I would be interested to know what exactly you envision there, what the task of such a research unit could be and also ask for a bit of a delineation. As far as I know, the DSC is also responsible for answering research requests and checking their eligibility. To what extent could there be a conflict of interest, if you have your own in-house research unit, but also have to decide whether researchers’ requests are justified? 

 

The question revolves around the role of the DSC on the one hand as a vehicle for external researchers to obtain data and on the other hand the possibility for the DSC to request data from the platforms, for example, to study how content is disseminated. Regarding the role of the research unit on this: I just wanted to make it clear that, from my point of view, the DSC is not just coordinating, but also has a major role in data analysis. It can’t do that if the organizational structure is only geared to exchanging information with other authorities or moving possible infringement procedures back and forth between authorities. That’s why I think such a research unit is important, if only to show that the DSC does more than just coordinate, that the DSC is also there to conduct data analyses or at least to develop a kind of research network or research community to better study platforms. 

 

On the second question about possible conflicts of interest: I would have to think about that in more detail, but I’m not convinced that they exist so directly. Ideally – and perhaps this is more wishful thinking on my part – a research unit at a strong DSC would ensure that there is a research network and that there are no conflicts of interest, so that the DSC, for instance, does not block the request of a university or another organization because it wants to do the research itself or because it does not think it is a legitimate request. With a strong, well-established, long-term network or research community, something like that could ideally be avoided. As I said, it remains important to ensure that there are no blockades and that the DSC does not, through its position of power, hinder rather than inspire research. That would not be in the spirit of the DSA, and not in the spirit of researchers. 

 

EMFA 

I have a question not so much about the DSC, but maybe you have an assessment anyway. I hear concerns that the European Media Freedom Act will in a way counteract the DSA because media could be exempt from regulation or also from deletion obligations. There are concerns that any Russian disinformation can masquerade as media and that then that doesn’t have to be deleted or downranked. How do you assess that? 

 

I have a similar assessment, but perhaps I should briefly step back. The European Media Freedom Act, or EMFA, is a completely different piece of legislation than the DSA. But there is an overlap at one important point – that was precisely the question. In the DSA, there was once a discussion about exempting media from the rules on content moderation. That is, if an organization declares itself a medium, then platforms do not have to, or are not allowed to, engage in content moderation regarding its content – for example, ranking this media content up or down in the feeds.

The danger that many have seen behind this, and which I also largely share, is that someone simply says, “I am a medium,” and can then say whatever they want, and the platform is no longer allowed to do content moderation in any way. It’s not even just about deletion, it’s also about downranking content. This is a very delicate balance between freedom of expression, which can always be impaired by content moderation in one way or another, and the great danger of people or organizations disguising themselves as media and thus spreading disinformation.

There is empirical evidence for this latter case: organizations – be they governmental or non-governmental – declare themselves media, set up websites that look like websites of national or European media and then use them to disseminate discriminatory, in some cases also false or hateful, content. That is why a regulation that exempts media does not seem sensible to me. Content moderation should be designed to treat users equally and not to create exceptions. Creating exemptions was largely done by platforms for a long time, and if this is now enshrined in law, I’m not sure that this is going in the right direction. 

 

Sources and further information on the DSA/DSC 

DSC and DSA enforcement in general 

 

Role of the DSC for users, especially as a complaints office 

 

Role of the DSC for researchers 

  • AlgorithmWatch article on DSA rules on data access 
  • SNV submission to the Commission’s consultation 
    • Explains why data access rules of the DSA are so important and how their implementation can succeed 
  • SNV analysis of research tasks in the DSA 
    • Argues that DSCs should establish their own research units 

 

Involvement of civil society in the DSC, especially through an advisory board 

  • Maecenata Institute study on advisory boards at the German federal level 
    • Shows that advisory boards often work with little transparency, oversight and evaluation 

 

Very Large Online Platforms and Search Engines (VLOPs) 

Published by: 
re:publica
June 06, 2023