Transcript for the Background Briefing "Enforcing the DSA: Online talk with Irene Roche Laguna"


The first deadline of the EU’s Digital Services Act (DSA) has passed: By February 17, online platforms had to publish their user numbers, a key prerequisite for the European Commission and the member states to enforce the new rules for platforms. Yet, the EU still has a long way to go to ensure strong platform oversight.

What first steps towards enforcing the DSA will the Commission take, now that the user numbers are out? How can potential hurdles in enforcement be overcome? What are ways to foster cooperation between the Commission, national authorities and external experts from civil society and academia? On March 1st, Dr. Julian Jaursch, Project Director for "Policy | Platform regulation" at SNV, discussed these issues with Irene Roche Laguna, deputy head of the Commission unit responsible for the DSA.

The transcript has been edited for clarity.

- Start of transcript - 

Dr. Julian Jaursch, project director at SNV: Hello and welcome everyone to this SNV online background talk. Thank you so much for joining us. My name is Julian. I’m a project director here at SNV, an independent think tank in Berlin working on various tech policy issues. Before I welcome our guest in just a moment, I’ll give you a brief introduction to the topic and the format for today. 

One of the areas we focus on at SNV is platform regulation and platform oversight. As many of you might know, the EU’s Digital Services Act (DSA) has been one of the most closely watched draft laws in this space. And now, it’s not a draft law anymore. It’s been in force since late 2022 and all the rules will finally apply early next year. The law distinguishes between different types of platforms, such as hosting providers, online marketplaces and platforms that are typically known as social media or social networks. And it also distinguishes by size. There are specific rules for so-called very large online platforms and search engines. And there are also exceptions for very small ones. Some of the rules in the DSA are fairly new and fairly untested: They concern transparency in terms of service, easy ways for users to report potentially illegal content, more clarity around online advertising as well as auditing and risk assessments. So really, you have a very broad set of rules for a broad set of platforms. 

Now, what has been most interesting to us at SNV is not just the rules themselves, but also the question of how the rules will be enforced and by whom. There’s an interesting mix of responsibilities in the DSA on this question. The Commission will have an important role for the very large online platforms, or VLOPs for short. The member states will have to support the European Commission on that, and they will additionally have to look at smaller platforms and see how they follow the rules. 

So now, the negotiations for the DSA are long over and the enforcement work starts. What does the Commission have to do to get up and running? How can it build up the necessary expertise, and establish good networks with external experts? I am very, very thankful that I get to discuss these questions with my guest today. Irene Roche Laguna is joining us, welcome Irene. And thank you so much for being here! 

Irene is the deputy head of precisely the unit in the Commission that is responsible for implementing the DSA. She’s been with the Commission for almost 10 years now and worked as a law professor and at the European Court of Justice beforehand. And in addition to having helped draft the DSA, she’s also written academic articles on it. So very clearly, she really has a deep understanding of the legal text. And now, she’s also helping to translate the legal text into practice. So once again, I’m very glad that we get to discuss how that might work, and what hurdles remain. And also, that she’ll take some questions from the audience. 

Irene Roche Laguna, deputy head of unit, DG CONNECT, European Commission: Thank you very much, Julian. And good afternoon, everyone. 

Designating VLOPs: “We are very confident with the results.” 

Julian: Let’s get started with the most current development. Very recently, just a week and a half ago, online platforms had to publish their user numbers. February 17 was the deadline to do that, one of the very first key deadlines from the DSA. Knowing this number of users is pretty important because once a platform hits the threshold of 45 million monthly users in the EU, it is considered “very large”. And then, it must fulfill certain additional or specific obligations. So, probably not a huge surprise, some of the well-known platforms reported big numbers: YouTube, Instagram, TikTok, Amazon, and Booking.com. Twitter also qualifies as a VLOP – that had received a lot of attention beforehand, especially with the changes in leadership there. 

Other platforms, however, were late or haven’t reported yet or their numbers were a bit below the VLOP threshold. That was especially noticeable for porn sites, which tend to get a lot of clicks, but have not reported the numbers or say they are not a very large online platform. That has drawn some criticism, as it could be an attempt to dodge some rules for the VLOPs. Irene, if I could ask you to please briefly walk us through this process at the Commission regarding user numbers and what you make of it. Specifically, how can you be sure that you’ve covered all the very large online platforms in the EU and that the numbers are correct? 

Irene: Thank you very much for the million-dollar question, or the 45-million-dollar question. Because indeed, the 17th of February has been a very important moment, the moment that we were expecting in order to kick off the enforcement of the DSA and to bring results on the ground, which is, at the end of the day, the ultimate goal. The number of users, or the number of active EU citizens using the service, was a difficult proxy from the beginning. This is a reality in the text of the regulation. Because so far, this information has always been only in the hands of the platforms: Only a platform knows exactly how many users it has. Sometimes the platforms do not know exactly, because they don’t have the means or they have not really looked into that; they have approximate numbers and they have to develop the methodology to count those users. 

The DSA already provides clear guidance on how it expects the platforms to count users, and we have seen that on most occasions this guidance has been respected. We also published a Q&A collecting all the questions that we have received from platforms themselves or from groups or associations that were trying to find out how to count users. We are very confident with the results. We think that these results prove that the DSA is very well-designed: It defines the scope and it captures those platforms that were at the origin of the Commission proposal. 

We cover quite a wide spectrum also in geographic terms. And in terms of societal risks, we have social networks, we have online marketplaces and we do not see a big surprise. You mentioned some issues. But first, I want to say that obviously, I cannot disclose in detail the work that we are now conducting in the next step, which is the designation process. Obviously, the designation depends solely on whether the platform or the search engine meets that threshold. We need to check whether the service that they provide is a platform or a search engine. We are on that. 

We are also checking the methodology that they have used. The Commission always has the power under Article 24 to request supplementary information or to request access to more information in case we think that the data that they have provided is insufficient or inaccurate. We also have alternative sources of information. Sometimes the platforms themselves publish some numbers for other purposes, for investment purposes and for advertising. And there are also databases that allow for some extrapolation and approximation to a given number. We are working on that as we speak. 

The next step will be the designation of those platforms that meet that threshold, and the designation will kick off, after four months, the application of the full regulation to those platforms. So, we are very much looking forward to that step. 

Julian: That’s actually what I wanted to get to next. But let me follow up on one of the things you mentioned. I know you said you can’t disclose the designation status, but you also did mention that you can request extra data and you can consult other sources of data. Can you at least confirm that that is happening? Are you looking into certain platforms where you’re not sure, “These numbers don’t seem right” or “We are asking for additional information.” Is that happening at the moment? 

Irene: I think that’s our due diligence to go through that process, yes. 

Building new expertise at the Commission: “Lawyers and economists, we are great, but we cannot solve all problems in the world.”

Julian: Okay. Let’s pick up on what you mentioned regarding the designation. I mentioned the deadline for the report of the user numbers. Then the next step, as you mentioned, is the designation, which means the Commission says, “This is a VLOP. This is a very large search engine.” And then, you mentioned the four-month period. So, after four months, for example, these VLOPs have to provide risk assessments. That means that tech companies have to report what risks their platforms entail for users, for instance, regarding youth protection, risks to privacy and potential fundamental rights violations. That concerns a lot of different areas: It’s a technical question regarding what the recommender systems do and how they work. It’s a legal question of determining when companies are in breach of certain DSA rules. It also probably requires some general risk assessment expertise and then, depending on the case, knowledge on public health or youth protection or anti-discrimination. It’s then up to the Commission to evaluate these reports and also evaluate the mitigation measures. 

I’m wondering how you’re preparing to do that. This is not something that the Commission had to do before, this is not one of the core tasks of the Commission. So, what are some of the steps you are taking or will take to ensure strong enforcement? At this point, are you in a good place already when it comes to the necessary expertise on this? 

Irene: Yes, thank you for the question. Because it’s indeed the next step. I just want to stress one thing: After those four months, it’s not only the risk assessment obligations that kick in, or the obligations that apply only to the very large online platforms. Those designated companies will have to comply with the full regulation, starting with having a contact point, or a legal representative if they are not established in the Union, a notice and action mechanism, complaint mechanisms, etc. So, the full regulation will start applying. 

The novelty is this risk-based approach for the very large ones, where they will have to do a self-assessment of the risks that they pose or can pose and propose mitigation methods. This is the most novel element of the DSA, in my view, also because it is subject to a dialogue. There are no platforms out there that we don’t know, at least among the most popular ones. We know how they act, we already have expertise on the problems they pose and, to a certain extent, on why these problems are happening, why the recommender systems could lead to some risks, why advertising can lead to risks. That is also why the DSA reads as it reads. Building further on this expertise, we have worked together with internal services in the Commission, and we have created the European Centre for Algorithmic Transparency. 

We are also working with national centers that have similar expertise, and even with centers in third countries, alongside our own work. We are also recruiting data scientists because obviously, lawyers and economists, we are great, but we cannot solve all problems in the world. And we need data scientists and data analysts who can understand why those algorithms can create problems and how to avoid them. We are there, we are ready. And actually, we are very much looking forward to those risk assessments. 

I would like to add that, as a regulator, we don’t intend to wait and see what comes out; we intend to open a dialogue upfront with the designated companies to try to guide them on the deliverables that we expect from them. I think this is the fair way to act as a regulator, keeping in mind that obviously, for the first exercise, we will be very, very careful with the results and with the next steps, if those are necessary. Also, for us, it’s very important to start the proceedings toward an audit. That is also quite a novelty, and we expect it to open up public oversight. Because regulatory oversight is a good thing, but if we also open up to auditors, third parties, researchers and civil society, all of them can help us get findings on whether what the platforms are promising or proposing as mitigation measures works or doesn’t work. I know that we have ears and eyes everywhere. So, there will be means to find out in any case. 

Julian: Okay, interesting to hear you talk about the interdisciplinary expertise that you are trying to assemble and about recruiting, which is difficult no matter whether you’re a regulator or another kind of organization. And then, you also mentioned the European Centre for Algorithmic Transparency, which is a new institution. I’m curious about that, if you have a word or two on its status: whether it’s up and running already, or when you expect it to start working and what you expect of its work. 

Irene: Yes, it is not a new institution. It is inside the Joint Research Centre, which is part of the Commission, a service of the European Commission. And it is already established, but we will have the launch event pretty soon, in April. It is also recruiting, building up and creating the necessary networks, also with researchers and experts on algorithmic accountability. So, this is a work in progress. They will be in the different seats that the JRC has across Europe: in Sevilla, in Ispra and in Brussels. 

Cooperation among member states and the Commission: “It is going to be sophisticated, complicated, but not impossible.”

Julian: Thanks for those points. And maybe let’s jump off of that. So, let’s say the Commission is successful in continuing to find expert staff, in securing enough resources, in building the network at the Centre that you’ve just mentioned, and in building expertise in general. To me, what is important beyond that is also the cooperation between the Commission and the member states. You alluded to that already a little bit. 

One way that the DSA foresees this cooperation working is through an information system between the Commission and the member states, which still needs to be built. Another one is the new Board, a European Board for Digital Services, which is made up of the Commission as well as the national regulators, which are called Digital Services Coordinators. This new Board does have some important functions. I really wouldn’t underestimate it at all. Nonetheless, it is only an advisory body. So, I would again be curious about the status of these things, the information system and the Board, and ask you: Is that really enough to ensure a good exchange between the member states and the Commission, just an information system and the Board? 

Irene: I am confident that it will be enough. I think this cooperation is already working and it was already working before the DSA was in place. We are in constant contact with member states. The difference is that there has been a tendency for member states to try to solve the problems created by the large providers on their own. This has been quite recurrent in national legislation. And under the DSA now, it is quite clearly established who has to do what and how. So, for the very large online platforms, it will be a shared effort between the Commission and the member state of establishment. This is for me a major clarification. And we will work, we are already working, very closely together with those member states that we expect to host the platforms that may be designated. This is one thing. 

The information-sharing system that you mentioned is an IT tool to share information in real time, which is very useful, but it is not a goal in itself. We could share the information by other means, but I think we are in the 21st century, and if we regulate platforms, we have to work a bit like them and be a bit more modern. So, this system will also allow us to have real-time information on when a member state opens a case, against whom they open a case, when there is a court injunction and what the result is. And we will have a clear picture of what is happening in the intermediary world. I think this is also a sort of internal transparency and oversight that right now is a bit incomplete. 

As to whether this is enough: I think it is. And in the whole preparation towards the implementation of the DSA, what we see when we talk with the member states is that they are also getting ready. They are getting ready not only by appointing the Digital Services Coordinator that you mentioned. For instance, in the exercise of counting users, they have had the opportunity to get a bit of an overview of the ecosystem in their member state. Because if you only focus on the big ones, you miss the startups, the national champions, those that are starting locally. And now, through the exercise that they have conducted to inform, to pass the word, to create awareness of this obligation, they are more aware of who the addressees on their side are, because they will also have to ensure compliance with the DSA by the smaller platforms, even if, as you said correctly, the small and micro ones are normally exempted. 

Julian: It’s good to hear that you’re confident and eager to enforce this, and to hear all this from the member states. I hope that it will still ring true early next year as well. And I think, for national regulators, for academics, for civil society and other observers who follow along, but also push along a little bit, this is probably good. Because I do think that all these things that you mentioned, the information system, the Board and also the information sharing that’s already happening, will probably still take some time for you to find the groove. Am I correct in that? 

Irene: Well, undoubtedly, it will be new for everyone, and the first steps will be more difficult until we settle into a routine. But I am very confident. First, we are working quite hard on building this IT ecosystem to share with member states; we are working closely together with them. We are also working on a database for the statements of reasons, which is also a mandate for the Commission, and which is again quite an interesting transparency tool that is quite often neglected. We are also working internally on the procedures towards the designated companies. 

But obviously, when we start, member states will also need to adjust internally, because this is a huge task, and you know that very well with regard to the Coordinator. The DSA does not come into a field that is unregulated. At the national level, and also at the European level, there are plenty of regulations on hate speech, consumer protection, product safety, criminal law, everything. All these laws also have an enforcement structure, but obviously, these illegal activities also happen online, via online platforms. And now, the Digital Services Coordinators will have to enforce the DSA with due regard to the competences of other authorities that already have some competences in the field. It is going to be sophisticated, complicated, but not impossible. 

Various roles for civil society: “The DSA is a great tool for civil society to get more integrated in the enforcement.”

Julian: I would close my part of this with a question on civil society. That is because the DSA includes a lot of very clear mentions of civil society and a role for civil society. So, for example, civil society organizations can be trusted flaggers, they can request data from very large online platforms, and in certain cases, they can advise the Commission, and they can potentially advise member states. 

So, when I read the DSA, I read it as calling for strong regulators and for strong external organizations that work together to form the EU’s platform oversight system. Is that a correct impression? Was that the intention all along? And if so, what does the Commission do to ensure that civil society actors are included in DSA enforcement, while at the same time not overburdening civil society with tasks that should lie with strong independent regulators and not with volunteer organizations? 

Irene: Yes. Again, it’s a very good question. It was indeed the intention of the Commission when preparing the proposal, and throughout the legislative process, the co-legislators agreed that civil society has a role to play. It’s not that we want to delegate enforcement. It’s not that we want civil society, users or organizations to do our work. But the more people watch what happens, the better, and there are several means to get a thorough understanding of the problems: the transparency reporting, the ad repositories, the database of the statements of reasons. All this information will be public. And if civil society organizations have a specific interest in a topic, they have the means to understand what is working and what is working less. And then they can participate in the oversight and also in improving the situation. 

They can participate by getting involved in the elaboration of the risk assessments of the very large platforms. For instance, in recital 90, there is a call, an encouragement [to VLOPs] to involve civil society associations when preparing those risk assessments. We are going to encourage the very large online platforms and ask them how they are introducing this step in the process, because these are also the ones representing the addressees and the beneficiaries of the regulation. The beneficiaries should be the users. Civil rights associations and civil society organizations are the ones representing these interests, which usually are not around the table. So, I don’t think it’s a case of delegated enforcement. I think it’s providing civil society organizations the tools to participate when they have the means, and also providing the means. But all in all, I think the DSA is a great tool for civil society to get more integrated in the enforcement. 

Julian: You mentioned the tools or the DSA as a tool and the participation. Is there also an idea to kind of institutionalize that a bit? Is there a way to establish expert groups or advisory councils to the Commission that are more permanent and structured versus the options that you laid out? Like, “If you have information, you can reach us.” Is there a way to institutionalize this? Is that being discussed? 

Irene: That’s a good question. It reminds me that there was an amendment proposal, I think from the Parliament, to have a civil society council built into the DSA, and it was rejected. The truth is that it is always a possibility for the Commission to establish expert groups. We are, for instance, preparing the possibility to have workshops. There are always means, and I don’t think the Commission is shy about consulting third parties and stakeholders, including civil society. In this particular field, it’s essential to get the views of the final user, and views that are as pluralistic as possible. So, I cannot say whether this will be institutionalized. The question is whether there is added value and a call to institutionalize, because otherwise, it would not work. 

Julian: That point is well taken. I think institutionalization for the sake of institutionalization doesn’t make sense. It was also just a question, since consultations sometimes come as rather short-notice, ad hoc requests for information, both at the national and at the Commission level, which are difficult for NGOs and other organizations to follow through on. That’s why I was wondering if there was a way to improve that situation. I think this is something that civil society and other organizations can and should make proposals on. 

I’m going to turn over to the very many audience questions now. 

Dealing with illegal content: “There is no impact on the geographic scope of national law.”

Julian: So, one of the questions concerns the definition of illegal content. How is that going to work out, given that the wording in the DSA seems to broadly take in all national legal provisions? This would mean that the very strictest national laws would need to be applied union-wide, which was probably not intended, this person asks. Is there a clear way out or an argument against this? Can you speak to the definition of illegal content in the DSA? 

Irene: Yes, this is a key question and was heavily discussed in the negotiations, because it is not really a definition. I mean, it is a definition of illegal content, but it is a cross-reference to anything that is illegal; it does not define what is illegal in X or Y situation. That would be way beyond the scope of the DSA as a horizontal instrument. So, given that the DSA is horizontal, it cannot go into that. And also, the legal basis would be a problem. 

So indeed, if a piece of content is illegal in one member state, then it gets a notice, and the notice explains why it is illegal under the law of that member state, not across Europe. It is for the company to decide whether to remove the content in that member state, elsewhere, etc. But the notice and action mechanism is there to clarify the exemption of liability for that intermediary; it is not there to enforce the law. The enforcement of the law usually happens through public channels. But because there is a notice making that intermediary aware, it triggers the actual knowledge that that content is illegal in that member state. Then it is for the intermediary to decide whether or not to act. So, there is no impact on the geographic scope of national law. 

Julian: Okay, so you take away the fear that the very strictest definition of illegal content will become the standard across the EU. That’s not what you’re describing right now. Okay, thank you. We’ll move on. I’ll try to group the questions a little bit, but we might jump between topics here. 

Support for researchers: Delegated act on data access only in 2024 

Julian: This next one is on data access. The person asks if there will be any support from the Commission in expanding researchers’ access to VLOP data, for instance regarding what type of research questions might be relevant. I would also add from my side: the templates to even submit the data request. Could you please speak to that, any support from the Commission for researchers? 

Irene: We are empowered under Article 40 of the DSA to develop some of those points in a delegated act. And we are starting the work on that delegated act, which would allow and support the exercise of that important right. So, I cannot give more details at this stage, because these are initial, first steps. We have until February 2024, because the delegated act has to be subject to the consultation of the Board, which means that no matter how fast we run in preparing the delegated act, we would have to wait anyway. In that sense, we will for sure provide support, in helping create the necessary structures and also the necessary safety for the platforms, by the way, because they also need to be sure that what they are sharing is compliant with all Union laws. 

Julian: So, realistically, the delegated act will probably come next year rather than this year, if you say that you have to wait on the Board? 

Irene: Yes. 

Julian: You could also read that question maybe a little bit differently: Is there some sort of outreach by the Commission? Do you see it as the responsibility of the Commission, or even of the national regulators, to go out there and inform people about their rights under the DSA, and researchers about their rights to data access, with some type of public-facing campaign? Is that part of your task and responsibility as well, or not so much? 

Irene: I would not say it’s our task or responsibility; I would say that it is in our interest to increase awareness. Because access to data for researchers is not a present for researchers, I’m sorry to say. It is a means to get interesting, important research that will prove or assess the existence of risks created by the very large online platforms, so it is adding to the compliance. It is quite an instrumental objective. It is in our interest that every researcher who has something to say and something to contribute in this field will know about this right. 

Julian: We have two questions on timelines, let’s see how far you can answer them: First, when do you expect the Code of Practice on Disinformation to become a code of conduct under the DSA? Second, when do you expect the designation of the VLOPs and the very large online search engines? 

Irene: For the second one, we are on it, and I cannot give a date. It takes the time needed to conduct the necessary steps with all necessary due process; we also need to be sure that the decisions are robust. On the first question, the article related to codes of conduct gives an important role to the Board. And again, because of the final agreement in the last trilogue, it was decided that there would be this two-stage implementation or entry into application, which means that only when the Board is there, in February 2024, will we be able to trigger this exercise, which will be subject to the conditions established under the DSA. 

National laws like the NetzDG: “We have improved the national laws, not only the German ones.”

Julian: The next question concerns national laws. So, the DSA is in place, and this person asks what happens to other laws that concern platform regulation, specifically content regulation. The example given is the German Network Enforcement Act, the NetzDG, which covers some of the same things that the DSA does. So, what will happen to those laws, in your view? 

Irene: That’s a very important question, because legal fragmentation was one of the reasons behind the DSA in the Commission proposal. We saw that several member states were legislating, not only Germany and not only on hate speech. That was the last wave, let’s say. But already since the E-Commerce Directive, the possibility to establish notice and action procedures was left to member states. Several member states had notice and action procedures for copyright issues. And at the end of the day, we had 20 member states with different rules. And because the internet has evolved as it has, all of this applies to the same companies. 

So, it was unsustainable from a single-market perspective. Now we have a regulation, which is a full harmonization measure. It applies directly to everything within the objectives covered by the DSA, which means that member states will have to repeal national laws that overlap: not only those that contradict the DSA, but also those that overlap with it. Even if they copied and pasted the DSA, it would go against European law, because it would be a duplication of sources of law. So, we will look very carefully into the existing laws at the national level. This will not come as a surprise either. 

Then, there are also the articles that the DSA takes from the E-Commerce Directive, the liability exemptions, which pass from a directive, which had to be transposed at the national level, to a regulation, which is directly applicable and does not need to be transposed. Those articles also need to be repealed at the national level. 

You mentioned the NetzDG in Germany. Indeed, this law has served as an inspiration for the DSA, I think I can safely say. On the other hand, I think that we have improved the national laws, not only the German ones; there are several out there. And there is no need for national instruments if we now have a European one that covers the territory. 

Julian: Thanks for that. I think I can answer that from the German side: It is widely expected that the NetzDG will, for the most part, be repealed. 

The DSA already inspires bills in other parts of the world. Platforms, especially very big ones, operate globally anyway. How does the Commission consider the importance of international regulatory dialogue and dialogue with international civil society groups? This is a question from the Electronic Frontier Foundation. 

Irene: Yes, we are very active on that front. I think it’s not a secret that we have the aspiration for the DSA to become a gold standard, as was the case for the GDPR [General Data Protection Regulation] some years ago. We have regular contact with countries outside the EU. The TTC dialogue with the US is one example [Trade and Technology Council]. And we have seen, for instance at recent events organized by UNESCO, that the DSA is taken very seriously by countries all over the world as a good example to follow. I think this is important to mention, because when we proposed the DSA, we also had in mind how the DSA would play out in less democratic hands. And I think the DSA passed that test. This is something that I would also be happy to discuss if someone disagrees; it was an important element throughout the negotiations, too. In terms of civil society, we also see a lot of interest, for instance from researchers from third countries. And this is something that we intend to cover in the work that we are starting on the delegated act. 

Julian: There are two questions regarding the DSA in relation to the Digital Markets Act, the DMA. One question came from Asha Allen, who asked if you could comment on the disparity in civil society consultation between the DMA and the DSA. For the DMA, there are open working groups, which are public on the website, and civil society can provide expertise for the development of the mechanism. That is not the case for the DSA so far. So, if you could speak on that. 

There was another question regarding an overlap of the actual regulations when it comes to recommendation algorithms on marketplace platforms specifically. Is it the case that there is an overlap here between the DMA and the DSA, for example, that Amazon’s recommendation engine might fall under both? This is a question from Jannis Brühl. 

Irene: On the first question, on whether the DSA and the DMA follow the same approach as regards civil society consultation: I don’t think it’s very different; I think we are at different stages. In the DMA field, my unit is working together with the responsible units in [the Commission’s directorates-general] Connect and Competition, and we have worked closely together to organize workshops to get the views of final users on specific issues, where we need to understand from the final user where the problem created by the potential gatekeepers lies, in order to target a solution. Organizing workshops is something that we are also considering for the DSA, but I think that the DSA gives a stronger role to civil society, because they are fully integrated in several instruments. Workshops for different stakeholders are also foreseen for later, once we have passed the moment of designation of the very large online platforms. 

Then the question on the potential overlap: There is potential overlap with many instruments of Union law. And it is not really an overlap; it is the fact that the two regulations approach the problems created from two different angles. One is about the competitiveness that is affected by a given tool or a given core platform, and the other one is about the societal issues created by that online platform. Whether the same platform is designated under one or the other instrument, this can happen or not; it is a case-by-case issue, and they will be designated on the sole basis of that instrument. Both instruments are independent, even if we are close friends, even family; they are independent. And if there is closer oversight of a given tool or feature, like a recommendation mechanism, under the two instruments, I think it can only benefit the final user. It is not an overlap in the sense that they are colliding; it is adding supervision from two different angles. 

Enforcement at the national level: “The resources have to be proportionate to the responsibilities that that Digital Services Coordinator will have.”

Julian: The next two questions concern the Commission and its relationship to the Digital Services Coordinators. Florian Schweitzer asks if the Commission will take responsibility, at least temporarily, if member states do not appoint their Coordinator by the deadline. The deadline for the member states to nominate their Digital Services Coordinator is February 2024. What would the Commission do about member states that don’t meet the deadline? 

The other question is regarding resources. Is there anything the Commission can and will do to encourage member states to equip their Digital Services Coordinators with enough resources, which is another requirement in the DSA? What will happen if member states don’t do that? 

Irene: Yes, I am afraid that the DSA, or the general structure of European law, does not allow us to take over the responsibilities of the member states. If a given member state does not appoint a Digital Services Coordinator, it will be in violation of Union law, and the Commission, as guardian of the treaties, has tools to act. We cannot say, “Now we are the national Digital Services Coordinator”, because that is not intended to be the case. This is on the first question. 

On the second question, on the resources of the Digital Services Coordinator: This is a very important question, because resources, as you mentioned at the beginning, are always scarce, and more will always be needed than provided. In the DSA, there is even wording in the recitals explaining that the DSC, in order to be independent, needs to be properly staffed and properly resourced, not necessarily in proportion to the size of the country, but in proportion to the ecosystem that it hosts. A country with many platforms has a bigger responsibility, as regards the DSA, than a very big country that hosts very few platforms or intermediaries. So, the resources have to be proportionate to the responsibilities that that Digital Services Coordinator will have. 

Julian: One question here doesn’t concern a specific part of the DSA but is a general question: What do you think is the weakest point of the DSA, where the European Court of Justice might be skeptical in the future? 

Irene: You’ve caught me by surprise. I am quite confident that the Court of Justice will not find anything in the DSA that is challengeable. 

Data provided by platforms to researchers: Exemption for trade secrets “absolutely necessary”

Julian: Okay, confident. I mean, that’s the theme of what you said earlier. 

I have three questions on very specific articles; if possible, just brief answers. Ilaria Buri asks about access to data for researchers: Article 40 of the DSA includes a trade secrets exemption. So, platforms can say, “We can’t share this data because it’s a trade secret.” This might undermine the actual reach of this important provision in the DSA. What are your views on this conflict? And what are potential remedies? 

Irene: Yes. This exemption is absolutely necessary, because what we don’t want is to kill the whole business model by just giving out all the trade secrets that any company, online or offline, may have. I think this is important to stress. Obviously, this can also be misused, in the sense that information that does not represent a trade secret is held back on that ground. So, this is an element that we will also look into in the delegated act, in order to spell out the conditions: What are the equivalents or the alternatives that can be provided? This will also be subject to regulatory oversight. 

Julian: We’ll keep going with Article 25, which is the provision on deceptive design. It states that platforms should not design their online interfaces in a way that deceives or manipulates users, or in a way that otherwise materially distorts the ability of users to make free decisions. This person says that “materially distorts” is an unclear term. They ask: Are the examples given in the article, giving prominence to certain options and repeatedly asking for choices that have already been made, prohibited per se, or only if they materially distort the user’s choice? 

Irene: I think I will use the joker of the guidance on this particular question. No, seriously, this is a provision that was not in the Commission proposal; it was added afterwards, and it was the result, if I may say so, of a certain frustration with practices that are out there. “Dark patterns” are well-known practices and are already prohibited under consumer protection and unfair commercial practices rules and also under the GDPR. 

The DSA covers “dark patterns” to the extent that they are not already prohibited under other instruments of Union law, and I mentioned the Unfair Commercial Practices Directive and the GDPR. Then, what is a dark pattern? That will have to be established on a case-by-case basis. I cannot say, and even less in an open and public event, what my views are on what constitutes or does not constitute a banned “dark pattern”. I think there will be lots of examples, and new examples that are not used today, and this will evolve over time. And it made no sense to add a full list of dark patterns to the article, because it would be outdated in 15 days. 

Julian: The other specific question goes back to our conversation about risk assessments, Articles 34 and 35. How do you see the delineation of the risk assessments when they touch upon media pluralism, which is a national prerogative in many cases? That goes back to the discussion we had about both risk assessments and the cooperation between the Commission and member states. 

Irene: I don’t know if it’s an enforcement overlap or a competence overlap. Media pluralism is a fundamental right, and actions or measures or designs that negatively affect fundamental rights have to be assessed in the risk assessment and mitigated with risk mitigation measures. So, this is fully covered under the DSA as regards risk assessment and risk mitigation measures. That does not mean that the member states will not have the possibility to cover other aspects of media pluralism. There are also other instruments of Union law being negotiated in that field. 

In any event, media pluralism has to be read in conjunction with the right to information and freedom of expression. The designated company, when conducting the risk assessment, will have to analyze to what extent its design, algorithms, recommender systems and advertising model have an impact on every single fundamental right. Media pluralism will be one, freedom of expression and information will be another, and all the rest will follow. I do not see any overlap in the enforcement between the Commission and the member states in this regard. 

Julian: It also goes back to what you said earlier: Decisions regarding individual pieces of content, no matter what they are being looked at for, whether it be privacy violations or media pluralism or illegal material, will be left to national authorities and national courts, just like before, correct? 

Irene: Yes. And that reminds me of a very important point that I always stress and this time forgot: The DSA is not about the content, it’s about the processes. So, it is about the procedures established by a given platform, about transparency, about due diligence, about giving agency to the user. It’s not about deciding what to do and what not to do with a given piece of content. I think it would be ill-advised if, even at the national level or at the Commission level, we started checking what platforms did as a consequence of one notice or as a consequence of a complaint. 

On working with Irish regulators: “I think the Irish authorities are taking it really seriously.”

Julian: I will close with one last question regarding enforcement and the Digital Services Coordinators. You stated earlier that you are confident in what the Commission is doing and what the member states are doing. A participant today asked: “How confident are you in regard to the cooperation with the Irish regulatory authorities as a future DSC?” I think this person is alluding to the GDPR, the data protection rules, where there were some issues in enforcement. So, maybe close with a short answer on your confidence in working with the Irish regulator. 

Irene: I will be totally honest: I think the Irish authorities are taking it really seriously. From what we have heard, they have been the first ones to appoint a Digital Services Coordinator. They are staffing the Digital Services Coordinator, and if you look at the numbers that have been published, Ireland will not be the only member state hosting very large online platforms. So, I would not focus only on Ireland; I think it is a joint effort. Indeed, we are working closely together not only with Ireland, but with all member states, to be sure that they are on time and that they are ready. 

Julian: Thank you. Yeah, well, that’s right: It will be more than the Irish regulator, but a lot of it will be Ireland. I think the Dutch and the Luxembourgers so far have also had VLOPs or search engines in their countries. But again, point taken that preparations are underway in Ireland, which is, I think, a promising answer to this last question. 

I do apologize, there were a ton of questions that we didn’t get around to. I can offer from my side: Please feel free to be in touch. 

At this point, Irene, I want to thank you very, very much for discussing the DSA with me, and DSA enforcement, and also for answering so many of our audience questions. This is much appreciated. I also want to thank my colleagues Josefine and Justus for their great support behind the scenes. And thank you, the audience. It was great having you here. Please do let us know what you thought of this format and what you thought of the discussion. We’d appreciate any feedback that you might have. 

If you’re interested in receiving invitations for upcoming background talks, or SNV papers, please sign up for our newsletter. Other than that, once again, many thanks to our guest, Irene, and have a great rest of your day to all of you. 

Irene: Many thanks to you also Julian and to all of the audience. 

Julian: Thank you. Bye, bye. 

Irene: Goodbye.

- End of transcript -

Published by: 
Stiftung Neue Verantwortung
March 16, 2023
Author: 

Dr. Julian Jaursch