The Digital Services Act (DSA) was published in the Official Journal of the European Union at the end of October 2022. Seen as an “impossible reform” when the European Commission started to work on it in 2020, it ultimately took a relatively short journey to get all actors to agree on a European regulation requiring online platforms and search engines to improve their content moderation. The goal is to address the very real and serious issues arising today from illegal content, disinformation and filter bubbles in these digital, privately-owned social media spaces. Sciences Po held a two-day event on 12 and 13 January to discuss the many details of this new regulation, inviting the wide range of actors who made it come true: European and national regulators, digital platform representatives, researchers, specialised digital consultants, citizens, and more.
The “Content Moderation in the Age of DSA” event was organised by the Good in Tech research network, in partnership with the Sciences Po Chair Digital, Governance and Sovereignty, the McCourt Institute, and the MSH Paris-Saclay, with the support of ARCOM, the DE FACTO project, Institut Mines-Télécom Business School and the Sciences Po médialab.
One of the round tables that took place on 13 January focused on the European dynamics that led to the DSA and that will now be in motion to apply this pioneering regulation. Another focused on the part civil society and academics can play in this new regulation.
Nathaniel Persily (James B. McClatchy Professor of Law at Stanford Law School and Co-Director of the Stanford Cyber Policy Center) explained that the key value at the heart of the DSA is transparency: it is a “metareform”. Social media platforms and search engines have become public spaces, and the data they collect can help “craft laws” relating to child safety, privacy, terrorism and so on. Access to this data is needed to “study social data and society” and to “understand issues”. This need for transparency must nevertheless be balanced against users’ privacy, perhaps with different levels of access to the data depending on the audience: citizens, governments or researchers.
Shaden Shabayek (post-doctoral researcher at Université Paris-Saclay and associate researcher at the Sciences Po médialab) confirmed that there is “low sensibility and high sensibility data” and that the less sensitive data could already be made available, offering a powerful tool to analyse, for example, “the influence of misleading content” by disclosing metrics such as “reach” alongside the “engagement metrics” already available on Meta’s platforms.
As underlined by Frédéric Bokobza (Deputy Director General of Arcom, the French regulator for online media), “The right scale is the European one”. France can be a useful partner in applying the DSA, as the French regulators already have experience in the domain (with the hate speech legislation of 2019 and 2021), but because the platforms are so global, a European regulation is the right call. The French regulator sees the DSA as a “very powerful toolbox” that national regulators will have to learn to use. They should always aim for balance: between protecting users and freedom of expression, but also through dialogue between regulators, platforms and civil society.
Indeed, Prabhat Agarwal (Head of Unit at the European Commission), one of the main architects of not only the DSA but also the Digital Markets Act (DMA), recalled that “passing legislation is a constant matter of compromises”. He quoted Bismarck: “Making law is a bit like making sausages, it’s nice to discuss the end product but you probably don’t want to know how they were made”. The complexity of this almost impossible task was to find common ground with every actor involved: the platforms as well as the governments. Celene Craig (Chief Executive of the Broadcasting Authority of Ireland), who is currently working on implementing the DSA nationally, agreed with the other speakers that “all legislation is a political compromise”. Users must be protected, but national regulators need to be able to work with the platforms; they must beware of the “magic regulator” syndrome and leave enough space for these very complex platforms to evolve and adapt to the DSA. That is why the route chosen by the DSA was to put in place a self-regulatory structure that platforms can use to assess the risks they and their users are facing. To carry out this risk assessment, they must collect a significant amount of data under the DSA, and that may be the main difficulty of this new regulation.
Two of the regulators present that day, Celene Craig in Ireland and Frédéric Bokobza in France, both warned about the human and financial resources the DSA will require. The “DSA Coordinator” that will be part of the “European Board” will need enough support and a strong team to be “consistent and efficient”, said Frédéric Bokobza. Celene Craig hopes that the national regulators will “really work together and not bring their national agenda”, to deliver “the high level objectives” expected. She insisted on the need to “develop cohesion in the group” by bringing the focus back to their “common purpose”. Prabhat Agarwal cast some light on how the cooperation between national regulators will be built: digitally. “We cannot regulate digital economy with paper”, added the Head of Unit at the European Commission; a “web functioning IT system” should be created.
The other actors that will be strongly impacted by the DSA, and that are wary of the resources its implementation will require, are the platforms themselves. Two guests worked for Google and the Wikimedia Foundation (the organisation behind Wikipedia). Both see the DSA as an innovation that reflects the public interest in transparency and the danger of misinformation. Clément Wolf (Head of Information Quality Policy, Google) welcomed the fact that the platforms were invited to discuss and work on the DSA’s code of practice. Dealing with disinformation might be a priority for those platforms, along with protecting principles such as diversity and freedom of content. Jan Gerlach (Public Policy Director at the Wikimedia Foundation) also believes that the platforms and their limitations were taken into account with the DSA’s “notice and take down regime”, which recognises that a website such as Wikipedia, whose content is mainly created by its users, cannot “monitor and intervene everywhere”.
The two platform representatives from Google and Wikimedia did see some risks in this new regulation: the huge amount of time that this “data generation machine” (as Prabhat Agarwal called it) will demand from legal teams, a managerial and financial liability, but also the infringement of users’ privacy that collecting data from a deleted account, for example, would represent. Jan Gerlach explained that this regulation is, like every European regulation, a “blueprint for the world”, and he worried about the situation some businesses might find themselves in if every country starts to craft its own DSA and expects extensive data reports… Nate Persily also warned European regulators about “boiling the ocean” by applying to smaller-scale organisations a legal regime that was “thought for Facebook”.
Serge Abiteboul (member of the executive board of ARCEP and researcher at Inria Paris, Valda team, DI ENS, CNRS, PSL University) had another worry in mind. The “self-assessment” mechanism required by the DSA implies that the platforms will need to self-assess their content moderation and then ask for an audit. He wondered what the ideal team to audit and help platforms assess their risks would look like. He believes the best solution would be to assist national regulators with a “social media of experts doing content moderation (for digital platforms or NGOs) and animated by academics”.
In a world where digital platforms’ algorithms change every day, along with their rules, regulating this fast-paced industry is a real challenge but a necessary one. Shaden Shabayek insisted on the importance of “reflecting upon the rules, as a community of researchers and as a community, period”. In this area, Sciences Po takes its role as a leading research university very seriously, as shown by the organisation of this flagship event by three innovative in-house units: the Sciences Po médialab, the McCourt Institute and the Digital, Governance and Sovereignty Chair.