
[ANALYSIS] Digital Fairness Act: the EU’s next project to ensure a fairer digital environment for consumers

With the Digital Services Act (DSA) and the forthcoming Digital Fairness Act (DFA), the European Union is stepping up its efforts to ensure a fairer and more transparent digital environment. Based on the recent findings of the Digital Fairness Check, the DFA aims to protect consumers from unfair online practices such as dark patterns (misleading and manipulative interfaces) and influencer marketing.

As digital experiences become ever more central to our lives, the European Union is committed to building a fairer and more transparent online environment. With the entry into application of the Digital Services Act (DSA), it has begun a shift towards greater digital responsibility. On 3 October, the European Commission also published the results of its ‘Digital Fairness Check’, an assessment of whether current consumer protection rules are still suited to changing online practices. The report identifies a number of unfair practices, including dark patterns, addictive interfaces and complex subscription cancellation procedures, which are likely to guide the preparation of the future Digital Fairness Act (DFA). The text, which will be put forward by the new European Commissioner for Democracy, Justice and the Rule of Law, aims, on the whole, to strengthen consumer rights, while adding another layer to an already dense regulatory landscape. For businesses operating in the EU and beyond, these new rules represent a unique opportunity to demonstrate a concrete commitment to genuinely user-centric practices.

10 months on, what impact has the Digital Services Act had on transparency and fairness online?

The Digital Services Act (DSA), which became fully applicable in February 2024, represents a major change in the way online platforms operate. Many companies, particularly Very Large Online Platforms (VLOPs), have had to adapt their practices to comply with the new rules. The initial phase focused on transparency, content moderation and user empowerment. Platforms must now provide clearer information about their advertising practices and algorithmic recommendations.

One of the main objectives of the DSA is to empower users, and the first results are starting to show. Users now have more direct channels for reporting illegal content. This change aims to create a more user-friendly environment that encourages consumers to play an active role in their online experiences. The European Commission has also begun to put in place the frameworks needed to monitor compliance, including the appointment of dedicated authorities to oversee the implementation of the DSA and ensure that platforms meet their new obligations.

Action has already been taken against some companies, demonstrating the EU’s commitment to holding digital service providers accountable. Meta is being investigated over misleading advertising practices and its handling of political content, with the aim of protecting democratic processes. TikTok has already been the subject of two investigations, opened in February and April, for failing to comply with privacy and transparency standards, particularly with regard to minors. Google is under scrutiny by the Italian competition authority for ambiguous consent practices relating to personal data. X (formerly Twitter) is under investigation for the dissemination of illegal content. Amazon is the target of a class action over a change to its Prime subscription deemed misleading. Finally, at the end of October, the European Commission launched an investigation into Temu’s compliance with the DSA, covering listings of illegal products, addictive features, the transparency of recommendation algorithms and restricted access to data for researchers. This series of investigations underlines the EU’s determination to enforce the DSA in favour of consumer rights.

The Digital Fairness Check: what assessment can be made of the rules protecting digital consumers’ freedom of choice?

In 2022, the European Commission launched a public consultation known as the Fitness Check, aimed at assessing how effectively the existing consumer protection rules, which predate the DSA, protect online consumers. Specifically, the Fitness Check examined three key EU directives: the Unfair Commercial Practices Directive, the Consumer Rights Directive and the Unfair Contract Terms Directive. On 3 October, the Commission published its long-awaited report.

The report focuses on deceptive, manipulative and addictive practices known as ‘dark patterns’ – unfair digital interface design techniques that induce consumers to make decisions they would not otherwise have made. The impact of these practices, intensified by personalisation based on behavioural data, poses a major challenge for online consumer protection.

Although these practices are not new, their prevalence and effectiveness have increased, first prompting concern in the United States, the United Kingdom and South Korea and now attracting attention worldwide. The OECD has devoted several reports to the subject. In Europe, a large number of texts already apply to dark patterns, not only as unfair practices, but also as breaches of personal data protection law and of the prohibition of abuse of a dominant position. In view of the significant risk of harm, the Commission has nevertheless deemed it essential to prohibit them more explicitly in the DSA (1).

Indeed, the recent report highlights the main harms associated with dark patterns: loss of autonomy and privacy, cognitive overload, mental harm, and reduced collective well-being through adverse effects on competition and price transparency. Significantly, the Commission estimates the cost to consumers of being misled online at €7 billion in 2023, compared with a compliance cost to businesses estimated at between €511 million and €737 million – a strong argument in favour of regulation.

Digital Fairness Act: why do we need European regulation of online commercial fairness?

Following this assessment, Ursula von der Leyen’s mission letter to Michael McGrath, the European Commissioner for Democracy, Justice and the Rule of Law, published last September, set out strategic priorities for strengthening consumer protection and promoting democratic integrity within the EU. In this letter, the Commission President refers to the need for a future Digital Fairness Act, ‘to fight against unethical commercial techniques and practices, such as dark patterns, influencer marketing on social networks, addictive design of digital products and online profiling, in particular when consumers’ vulnerabilities are exploited for commercial purposes.’ At his hearing before the European Parliament on 5 November 2024, McGrath then set out his vision for the future Digital Fairness Act (DFA). In his opening statement, he stressed the need to strengthen consumer rights in the digital marketplace (2). In response to concerns expressed by several MEPs, he said that the DFA would be designed to fill gaps rather than duplicate existing regulations (3).

Protecting the most vulnerable: targeting minors

One of the major challenges of the project is to protect minors from harmful online practices. In response to questions about the protection of minors, McGrath highlighted the unique challenges they face (4). The DFA will be an opportunity to add an extra layer of accountability for platforms. They are likely to be required to take user vulnerabilities into account in the design of their interfaces, on the basis of a risk assessment. It is also conceivable that the DFA will create new subjective rights for minors or their parents. We could, for example, imagine a right to configure settings, enabling users – and therefore also the parents of minors – to act directly on the choice architecture.

Elsewhere in the world, other jurisdictions have already adopted rules specifically protecting minors as particularly vulnerable users. For example, the United Kingdom’s Age-Appropriate Design Code (UK AADC) and the California Age-Appropriate Design Code (CAADC) require online companies to assess the potential risks their platforms pose to young users and to apply the strongest protective settings for these users by default.

Taking vulnerabilities into account at the design stage: towards a ‘Fairness by Design’ approach to online consumption practices

To combat dark patterns and addictive design, the European Commissioner could also draw inspiration from the work of the UK Competition and Markets Authority (CMA), which, in a report on choice architecture (5), proposed a new guiding principle for a fairer digital environment: ‘fairness by design’. This principle invites interface designers to take into account, from the design stage, human cognitive limitations, people’s sensitivity to the way information is presented, and the gap between consumers’ intentions and their actions.

This approach encourages design that facilitates informed decisions by making information accessible with minimal cognitive effort and by offering contextual guidance and protective default options. It is in line with Value-Sensitive Design (VSD) (6) methods, which rest on the ability to adapt design practices to the technology, the values at stake or the context of use. One example of such an adaptation is ‘Privacy by Design’, a guiding principle of the GDPR, which focuses on respecting the confidentiality of personal information in systems and processes.

Better regulation of online influencer marketing: the challenge of harmonisation

In recent months, several European countries, including France and Italy, have stepped up regulation of online influencer marketing to protect consumers. European rules already govern online marketing, including the Audiovisual Media Services Directive, the DSA and the Unfair Commercial Practices Directive. However, tensions persist over the harmonisation of national legislation with European law. For example, references to the Audiovisual Media Services Directive and the DSA have been a point of contention between France and the Commission, as the French Influencers Act makes no reference to the former and overlaps with the latter. A new European text could help to harmonise the field by providing an explicit regulatory framework and a common roadmap.

However, the idea of a new regulation is not unanimously supported. The European Regulators Group for Audiovisual Media Services (ERGA) considers that online commercial influence is already well regulated by existing laws, and therefore recommends strengthening national regulators with more resources and specialised staff rather than introducing new legislation. In an environment where laws are multiplying, the risk of legislative overlap and inconsistency is real. In conclusion, the DFA is ambitious: it will not only have to align with existing legislation, but also contribute to its harmonisation by specifying its scope and clarifying its interaction with other texts. The major challenge for the text is therefore to reduce regulatory fragmentation and offer companies a unified framework that makes compliance clearer.

Notes

(1) The DSA enshrines a ban on such practices in Article 25: ‘Providers of online platforms shall not design, organise or operate their online interfaces […] in a way that deliberately or in fact deceives or manipulates recipients of the service, impairing or compromising their autonomy, decision-making capacity or choices’.

(2) ‘I will bring forward a Digital Fairness Act to strengthen consumer protection in targeted areas, complementing the existing EU digital regulatory framework.’

(3) ‘The Digital Fairness Act is not about adding new requirements already covered by other regulations, […] It is about filling existing gaps to better protect and support consumers,’ he clarified.

(4) ‘The business model of the tech giants… they want to keep people online all the time, including our children, and that’s how they increase the advertising revenue spent on their platform. […] We understand this model, and we’re going to have to do something about it under the Digital Fairness Act. I can assure you that we will do so, because some of the harmful features on these platforms have an impact, particularly on children in their formative years, and these effects can be long-lasting,’ he explained.

(5) The concept of ‘choice architecture’ was popularised by Richard Thaler and Cass Sunstein; see Thaler, R. H. & Sunstein, C. R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness. Yale University Press.

(6) Value Sensitive Design is a theoretical approach to technology design that incorporates human values. It was developed at the University of Washington by Batya Friedman; see Friedman, B. & Hendry, D. G. (2019). Value Sensitive Design: Shaping Technology with Moral Imagination. MIT Press.


Fabien Lechevalier is a doctoral student in law at the Université Paris-Saclay, a researcher at the Centre d’études et de recherche en droit de l’immatériel (CERDI Université Paris-Saclay), and an affiliate of the Transatlantic Technology Law Forum at Stanford University.