Staying civil in Roblox: Video Games Industry Memo - 01/02/2024
Taking safety beyond rules and tools
Laura Higgins from Roblox chats to me about fostering digital civility
Xbox lay-offs, Apple shenanigans and a State of Play make for a busy news week
Suicide Squad: Kill the Justice League lands after troubled development
Bloomin’ heck, it’s February already. That means that we’re barely one month away from one of the most significant events in the global games industry: my birthday.
Ok, so maybe I’m actually talking about GDC 2024 in San Francisco. Fortunately though, I can confirm that I’ll be there and I am looking to interview lots of people for a) future VGIM long reads and b) another fun little project that I’m working on.
If you’re heading out to GDC and want to chat, or have a relevant bigwig in town that you’d love to put on the record with your pal at VGIM, drop me an email at videogamesindustrymemo@substack.com.
And while you ponder that, how does a long read about creating civility in the online world sound to you? Enjoy!
The big read - Staying civil in Roblox
This year’s Safer Internet Day takes place on Tuesday 6th February. The day, which is marked across the world, aims to raise awareness of efforts to create a safer and better internet for all: bringing together young people, teachers and educators, policy makers, Non-Governmental Organisations and industry to achieve the goal.
This year’s campaign comes at a particularly important time for online safety rules across the world. The EU’s Digital Services Act, which contains provisions to create a safer digital space for all, comes fully into force on 17th February, while the UK’s Online Safety Act is set to come into force fully later in the year.
Compliance with laws such as these is becoming an increasingly hot topic in the industry, as seen in recent VGIMs. But based on conversations I’ve had over the years with a range of online safety experts working within both the games and tech spaces, it’s clear that attempting to foster a positive culture within the digital world is at least as important as putting laws in place - and perhaps even more so.
So how exactly do major companies approach the challenge of fostering a positive culture within their platforms and the games that exist on them? What’s a sensible balance between developer- or company-led enforcement of positive spaces and cultivating that culture organically? And how does regulation increasingly fit into all of these decisions?
Laura Higgins is a Senior Director of Community Safety and Digital Civility at Roblox. She’s worked in safeguarding both on and offline for two decades, setting up helplines for the UK Safer Internet Centre - one of the organisations behind Safer Internet Day - to tackle issues such as revenge porn years before Governments were even considering regulating the issue.
I talked to her about Roblox’s effort to create what it calls ‘digital civility’ and how that has shaped a much wider approach to the question of trust and safety.
Getting civvy with it
So first things first, what does Roblox mean by digital civility? What does it actually encompass? And are we definitely sure it’s not just a thing they’ve invented to dazzle regulators while they monetise a wild west?
While the answer to the last question is that the concept of “digital civility” has existed for more than a decade, Roblox’s answer to the challenge of developing it, according to Laura, is to attempt to achieve what sounds like a relatively simple goal in a necessarily complicated manner.
“Essentially, it's about people feeling safe and welcome”, she begins. “But it's also about human interactions and how people communicate. You know that you don't have to agree with everybody but you have to have the skills to know how to communicate appropriately.”
Those skills are varied. While Roblox is seeking to develop some basic digital skills amongst users to give them the means to protect themselves online - such as how to file a report about toxic behaviour - its ambition is to cultivate a community of online citizens who have the nous to read a digital room, the ability to act properly within it and the strength of character to stand up to challenges when necessary.
“There is definitely an element of knowing what's appropriate. A football match is going to be different to what's appropriate in church, for example. So understanding that and having the skills to navigate managing conflicts, managing disagreements online, keeping yourself safe as well as building resilience skills…equips the community to work together to create safe and welcoming experiences within Roblox where people can really thrive.”
Practically building digital civility, however, is hard. The process of developing a digital citizen requires the creation of a wider culture within society that prioritises positive online engagement, co-operation between government and industry to protect that space and interpersonal relationships which make constructive behaviour possible.
Roblox, understandably, can’t control all of that. And even when it comes to the space it can control - namely, the platform it has built - its enormous user base, the breadth of experiences available to them and the ever-increasing age range of those within the platform mean that its ambition for fostering digital civility must adapt to a dizzying array of contexts.
Delivering a digital civility strategy has forced Roblox to develop a range of tactics to foster it within the platform. The foundation of its efforts to do this is the range of technologies it has developed to try to protect and empower players in online spaces.
Roblox has implemented tools to support players within its online spaces, ranging from familiar user-led functions like muting and blocking through to the implementation of machine learning technologies to tackle the most severe harms - such as child sexual abuse material - as quickly as possible. It also offers parental controls to help guardians develop rules around the use of the platform with younger players.
Laura, however, describes these as “table stakes” for creating digital civility on any online platform. While technology creates the framework within which a digitally civil environment can develop, she gave me an example showing that it is only one part of the wider process of empowering users to stay safe - not the end of it.
“Kids particularly like to use block and mute functions, probably more than they will report people. And when we think about child development and interpersonal relationships, there are some cues there to why they might choose those options. If you fall out with your friend, it’s temporary. You don’t want to get them in trouble, so having those things in place [rather than a blocking function alone] is essential.”
This means that building digital civility can’t be done through tools alone. Instead, it’s about recognising the complexity of interpersonal relationships and attempting to build a cultural compact between Roblox, its users and audiences who can influence them (such as teachers and parents who need to know about it for the people in their care) to create the conditions through which civil behaviour is the norm.
Partly, this involves adopting community management tactics familiar to businesses working across the games industry. Laura said that regular check-ins with the community are “really key” to her work and to determining its effectiveness in tackling the problems users may be facing. The platform also aims to ‘nudge’ users towards civil interactions at relevant moments, such as upon sign up and through ongoing in-game messaging encouraging positive behaviour and decisions that align with ‘real world’ norms.
However, Roblox’s vision for digital civility has extended its tactical approach to trust and safety beyond industry best practices to develop wide-ranging interventions across society in a manner similar to a government behaviour change campaign.
Its strategy stretches to fostering media literacy amongst younger users through resources aimed at teachers thinking about how to tackle online safety generally. Roblox partnered with Boston’s Digital Wellness Lab to work with a range of NGOs, academics and industry partners to develop a broader understanding of digital civility and methodologies for achieving it in different digital and national contexts across the world. It even creates paths to rehabilitation for users who have received sanctions for smaller offences within Roblox, to prevent them from falling into darker spaces - something that is akin to wider societal strategies to prevent reoffending.
“We have to do those other things,” Laura explains. “We have to teach people the skills. We have to set good policies and have good community standards that are clear and understandable so people know what the expectations are when they come on the platform.”
In short, digital civility is “not just the tech, it’s interpersonal.” But achieving a kinder interpersonal experience on a platform with a daily active user base of 70 million people is much easier said than done - raising questions about whether such an approach can ever be truly effective.
A delicate balance
How do we measure if Roblox’s digital civility strategy is actually working? The problem with an ambitious, broadly intangible goal such as fostering digital civility is that it can be hard to demonstrate one way or the other whether it is being achieved. Some critics may suggest that’s the point of it.
However, there are some notable arguments to suggest that the overall direction of travel for Roblox’s digital civility project has been positive for both the platform and the people on it.
The simplest argument to show that the plan is working - and one, for the record, that I’m advancing here - is Roblox’s growth in both user numbers and commercial output.
An August 2023 paper about toxic communities in games produced by Take This, a non-profit organisation looking at the intersection between mental health and game communities, showed that allowing toxic players to rule the roost within games communities - and commandeer the spaces and cultures within them - alienates the core player base, hitting a business’s bottom line.
Therefore, Roblox’s user growth over the past year, its increasing revenues and the frankly absurd range of reputationally aware brands, businesses and organisations moving onto the platform in the past few years suggest that it has produced enough of a positive atmosphere amongst users to avoid the issues outlined in the paper.
Laura also demonstrated that there is grassroots buy-in to the concept from the Roblox developer community, with her team’s stand at last year’s internal developer conference unexpectedly swamped by experience creators wanting to implement the philosophy within the worlds they created.
But despite this, problems undoubtedly remain. At a wider level, a range of audiences - including parents, campaigners and policy makers - continue to be unconvinced that harms aren’t occurring in Roblox. For example, class action lawsuits have been filed in the US against the company by parents who allege that their children have been exposed to harmful content online.
Beyond these accusations of broader, less tangible harms levelled at the platform, there are also accusations of specific harm that Roblox has caused within society, particularly with regard to extremist communities using the platform to groom and radicalise people. And while it is right to say that the links to Roblox form part of a much longer chain of digital and real-world interactions, the fact that these issues have occurred is the strongest - and most problematic - challenge to its vision of a digitally civil platform.
Addressing this is a tricky challenge for Roblox. Speaking to the extremism point specifically, Laura says that Roblox has a zero tolerance policy towards such content and has developed specific tactics for addressing significant illegal harms like this and child sexual exploitation as rapidly as possible.
“We have a dedicated team who are focused on this all the time. If there is any content that comes up, we take really swift action on it and ban any individuals who are responsible for creating it,” she explains.
She also contextualised the challenge by saying that such extreme content is rarely seen or experienced by most users. This aligns with conversations I’ve had with other sources, who say such experiences are typically accessed by tens or hundreds of people worldwide at most before they’re removed.
And without denying that problems with extremist content do occur on the platform from time to time, Laura also pointed to a report from the Simon Wiesenthal Center into Digital Hate and Terrorism, which gave Roblox the highest mark of all the major tech platforms it examined for tackling such content on its platform.
These points are important to consider. But they also reveal the limits of digital civility: it accepts that fostering a positive culture cannot ever truly eradicate the risks of harm and illegal activity on platforms, asking us all to ponder how much societal risk we’re willing to swallow without regulatory input in return for freer online spaces.
Complicated situations
So what role should regulators play in all of this? While Laura does believe there is an important role for instruments such as law and technology in creating the boundaries around online spaces, she also thinks that treating either as a silver bullet for challenges online ignores the fact that the problems occurring there are mostly the same ones that we face in real life.
“I was involved in conversations with regulators, with the policy makers about those online harms when they would say ‘this company should just take it down, they should just deal with it’ and I would be able to even say ‘well, I understand why it's not that easy’. And having sat, as I say, in dozens and dozens and dozens of round tables and briefings and meetings with people, there is still a misunderstanding and a disconnect between what's possible and what is realistic.”
This point is an important distinction when it comes to policy. If we step back and think about extremism again (sorry everyone), it’s possible to stop most threats of a terror attack on the London Underground by putting bag scanners at every Tube station. We accept, however, that it isn’t realistic to do so because the level of disruption caused to everyday life is disproportionate to the potentially tragic risk of an incident occurring.
The same calculus applies when it comes to thinking about online safety. And while Roblox’s vision of digital civility isn’t perfect, it at least appears to be an effort to realistically engage with the truth that human lives - and all the messiness that comes with them - are occurring in online spaces.
Because the reality is that online harms and hate can’t be eradicated. As long as people are people, which they always will be, the tools and channels they communicate through will necessarily be subverted in increasingly clever, sophisticated and dangerous ways.
The question is what we can do collectively to limit the effectiveness of those harms. And while some of the answer inevitably must (and should) come through law - especially when it comes to identifying reasonable responsibility for addressing illegal activity and ensuring that the technological “table stakes” to protect users are in place - governments also need to invest as much effort into developing life-long education to help citizens develop the flexibility and critical thinking skills to navigate risks that emerge much more quickly than the regulation to manage them does.
For now though, Laura’s purview remains restricted to Roblox. She believes that the platform is a “safe and welcoming place”, that people will appreciate this more if they join it and that she’s willing to bet her standing in the online safety world on these points.
“I've worked in digital safeguarding all my career. I wouldn't be placing my reputation and integrity on the line if I didn't truly believe what I'm saying,” she told me.
And whether or not you dive into Roblox on the back of what Laura says, the concept of digital civility she’s played a role in developing can at least be credited as a thoughtful way of conceptualising challenges within the online world - providing us all with at least some sort of framework for thinking more deeply about the issues that surround it.
Note: this piece was updated at 13:15 on Tuesday 1st February to change the header image to a fresher one because Roblox’s PR team asked nicely (which, ironically, is a good example of digital civility)
News in brief
Xboxed off: Microsoft announced 1,900 layoffs across both Activision Blizzard and Xbox last week. Yes, we did expect this to happen. No, it doesn’t feel great that industry layoffs have easily crossed the 6,000 mark for 2024 already.
Apple bites back: Apple has published its “new set of business terms” for developers who want to utilise alternative app stores or billing methods - terms that look remarkably similar to (and as punitive as) the first draft of Unity’s run-time fee. Eric Seufert has the best summary of what the new rules are and why Apple is clearly flipping the bird towards the EU’s Digital Markets Act.
Game passed: In more Apple chicanery news, the company has walked back its previous ban on video game cloud streaming apps such as Xbox Game Pass on the App Store. The decision was announced last week, a day after the UK Competition and Markets Authority resumed an investigation into the distribution of cloud gaming services through app stores. Funny that…
Not really Palworlds: The Pokémon Company has said that it is “investigating” whether claims that Palworld has ripped off its IP are justified. While this will likely quietly rumble on for a while, it has already slapped down a modder for much more egregiously infringing its IP with a Pokémon-laden mod of the game.
What do you mean they’ve already got a pun in the event name?: PlayStation’s latest State of Play will have taken place by the time you read this, lifting the lid on games such as Stellar Blade and Rise of the Ronin. Was I excited enough by the news to stay up until 10pm to watch the showcase and update this newsletter text accordingly? The answer, as you’ve probably guessed, is no.
On the move
Johanna Faries is the new President of Blizzard Entertainment…Steph Rogers has been appointed Senior Director of Developer Engagement, Global at Couchbase (and has video games in her brief)...Paul Gouge and Alex Rigby have opened a new mobile free to play studio called Forthstar in Manchester…
Jobs, jobs, jobs
Assembly has an opening for an Associate Director to work on the Xbox account…Splash Damage is hiring a Head of Global Brand Management…Rockstar Games is seeking a Production Coordinator in New York…Netflix needs help expanding its reach on mobile and wants a user acquisition type to become its new Creative Producer, Games Optimization Marketing…and if you like actual football and enjoy Vancouver, Electronic Arts is hiring a Producer - Content to help create marketing materials for EA Sports FC.
Events and conferences
DICE Summit, Las Vegas - 13th-15th February
Guildford Games Festival, Guildford - 16th February
Game Developers Conference, San Francisco - 18th-22nd March
London Games Festival, London - 9th-25th April
London Developer Conference, London - 11th April
Games of the week
Suicide Squad: Kill The Justice League - Reportedly troubled Rocksteady title releases to market without the beta ‘feature’ of a bug that automatically completes the game…
Persona 3 Reload - Remake of pivotal Persona game overhauls a classic’s mechanics for a modern audience.
Death Stranding Director’s Cut - 2021 version of Hideo Kojima’s courier ‘em up arrives (and it still feels remarkable saying this) on iOS.
Before you go…
A hat-tip to Michael French, master of ceremonies at London Games Festival, for spotting that Harry Clark - one of the villains from Werewolf-like TV show The Traitors - is pondering a career shift into the world of video game streaming.
It’s a great opportunity for games companies seeking a PR boost, provided they’re willing to work with someone who has proven themselves to be a compellingly untruthful git to millions of people.