Guillaume Guinard is a Sciences Po graduate and former research assistant at the Chair. He has just completed his master’s thesis, which analysed and anticipated regulatory challenges brought by existing metaverses, and his research interests include the ethics of AI, the governance of platforms, and technology as a tool to influence behaviour.
What made you want to write about Roblox?
My main motivation was to ground my analysis of the ‘metaverse’ in a tangible and current case study. The term ‘metaverse’ is used loosely these days to describe a range of seemingly inevitable evolutions in tech, from the replacement of almost every existing online service with a blockchain-based app, to Meta’s sci-fi-like depictions of concerts attended both in person and remotely through holograms. As such, analysis often amounts to speculation that is optimistic or pessimistic depending on one’s worldview. I believe a more rigorous approach is possible, as technological evolution does not arrive out of nowhere or through a single linear process. It follows from the incentives of specific actors within the parameters of the law, or its absence.
Ignoring this runs the risk of giving a second wind to the laissez-faire approach to tech regulation which has allowed companies like Meta to trade social wellbeing for financial profit in recent years. Given that Meta is investing massively in its own ‘metaverse’ project, I took their stated goals and economic model as a starting point. I found many similarities between these and the economic model of Roblox, an online sandbox-like universe whose developer became one of the largest video game companies in the world during the pandemic. I was initially introduced to Roblox through a series of YouTube videos by People Make Games’ Quintin Smith, which argued that the company profited by putting children at risk of abuse and exploitation. Given that Roblox had received little scrutiny from researchers and policymakers up to that point, I found it to be an exciting case study for my thesis.
What are some of the policy problems we see arising in this early version of a metaverse?
Roblox functions as a gig economy for children, where no income is guaranteed, and minimal restrictions are placed on users in order to maximise the incentives to pour time and labour into the platform. As such, the platform incorporates the harms linked to the status of gig workers in the real world, as well as new ones linked to its status as an online environment.
Roblox provides users with free tools to make their own games (labelled ‘experiences’). Through the platform, they can design structures and assets, create original gameplay mechanics, and incentivise other users to spend virtual currency. This virtual currency can be bought with real currency, and then converted back into real currency for a profit, at a rate and under conditions set by the platform.
As such, this system profits from the labour of users, who are mostly children. Roblox has total and unrestricted authority to ensure that it keeps most of this revenue, and that the share going to the children is as small as possible. Additionally, the system incentivises strangers to work on games together in order to increase their chances of making a better and more profitable game. This puts credulous children at risk of providing labour to teams that never pay them, or that abuse them in other ways, as these organisations operate outside of Roblox’s platform. Still, Roblox enables and incentivises these arrangements by letting users recruit each other on its dedicated ‘job board’. If a company were to do this in the real world, it would undoubtedly be deemed illegal. As real money, real labour and real children are involved, it is urgent for regulators to catch up with what is happening online.
There are also new questions related to data ownership, as users can create a Roblox account and explore experiences without providing any personally identifiable information (unless they purchase virtual currency). This means every aspect of their behaviour can be scrutinised and used by the platform according to its own goals, without requirements for transparency or the protection of data linked to children, as they aren’t necessarily ‘identifiable natural persons’ (the condition which triggers the application of GDPR) in this context. As a result, data about children can be used to influence them for commercial purposes.
In fact, my research shows that multiple aspects of Roblox’s business model rely on influencing the behaviour of users without any age-based safeguard, in line with the platform’s financial incentives. This includes gambling and financial speculation, for example when children are encouraged to purchase expensive virtual assets in the hope that their value will increase. Again, this would be considered scandalous if it were happening in another context.
Finally, the platform’s emphasis on minimising the barriers to creation means the amount of content added to the platform is far higher than can be effectively moderated, especially given that users have found ways to circumvent algorithmic means of moderation such as word censors. Roblox thus provides the means and financial incentives for an underage user to create an experience with sexual themes – which, even if it is blocked, can be indefinitely re-uploaded by creating new accounts behind a VPN. Even if these sexual experiences remain online for no more than thirty minutes, they provide potential means for adults to meet underage users in an environment where no child should ever be.
What do you think we can learn from this in regulating the broader industry?
Methodologically, we can gain better insight into how to effectively regulate the future of technology by looking at which existing economic models companies are seeking to reproduce and amplify, rather than through speculation following from their own sensationalist PR. Given how wealthy and popular Roblox has become, and its relevance to Meta’s own metaverse project – which has garnered far more attention from policymakers – it is quite troubling that Roblox has been so neglected within politics and academia so far.
As early adopters of online multiplayer immersive environments, children under 16 are the most at risk from the harms I have outlined above. This means policies which seek to mitigate these harms should be designed with the highest standards of safety, considering the long-term impacts these harms can have on individuals’ intellectual and psychological development. As an example, restrictions related to online purchases by children rely on the collection of the child’s data, which becomes entirely irrelevant within an online environment where purchases are made with a virtual currency through an avatar. Current legal frameworks therefore fail to protect children being incited to gamble or to purchase obscene content.
Regulations related to the status of gig workers which are currently being developed ought to cover labour performed in online worlds, as such labour can be disguised as mere ‘fun’ that users are having, which puts children at risk of exploitation. Moreover, since such labour is compensated with virtual currency, it is essential to pass regulations ensuring that this currency has a fair conversion rate when used as the default mode of compensation for labour.
To remain relevant, regulations related to personal data should be updated to cover the specific ways data is collected from individuals within online environments. This means establishing a clear link between an individual and their avatar, even though the avatar might not be linked to the individual in any identifiable way. The behaviour of an avatar can nonetheless be tracked far more intensively than that of a user of a traditional website, as it includes locations, interactions, and physical movement, all within a single platform.