Irrotational

Prophet of Truth
Member
Oct 25, 2017
7,788
I thought this was a really good step in showing just how much enforcement is being done, along with stats and data on the different types of issues people have run into online.

They've had a massive increase in taking down fake accounts (4.33m in 6 months), using a "proactive" approach to find those accounts. They've also made 2.5m enforcements based on player reports, which they describe as "reactive" enforcement. The main categories for those actions are swearing, sexual content, and bullying.

The total number of reports by players was surprisingly high to me, at 33m in a 6 month period, but that's down by almost half from the same 6-month period in 2020, when it was 59m!

I like the stats, but more importantly the report has a good summary of links to all of their policies, as well as helplines and other resources people might find useful. Apparently they have referred just over 3,000 people to the Crisis Text Line over concerns around self-harm.

 

Seraphs

Banned
Sep 22, 2022
640
Xbox Wire article:

news.xbox.com

Xbox Shares Community Safety Approach in Transparency Report - Xbox Wire

At Xbox, we put the player at the center of everything we do – and this includes our practices around trust and safety. With more than 3 billion players around the world, vibrant online communities are growing and evolving every day, and it is our role to foster spaces that are safe, positive...

  • We are taking action to offer better experiences. The Xbox team issued more than 4.33M proactive enforcements against inauthentic accounts, representing 57% of the total enforcements in the reporting period. Inauthentic accounts are typically automated or bot-created accounts that create an unlevel playing field and can detract from positive player experiences. Our proactive moderation, up 9x from the same period last year, allows us to catch negative content and conduct before it reaches players. We continue to invest and improve our tech so players can have safe, positive, and inviting experiences.
  • Players are stewards of the community. Player reporting is a critical component in our safety approach. Alongside our increased proactive safety measures, investments in scanning and filtering technologies, and education from the Xbox Ambassador community, reporting helps us improve the work that we do to better protect our players. Our players provided over 33M reports this period, with communications (46%) and conduct (43%) accounting for the majority of player concerns. Content moderation agents are on-staff 24 hours a day, 7 days a week, 365 days a year to make sure the content and conduct found on our platform adheres to our Community Standards.
  • Players are in control. Every player is different and preferences on content and experiences are not one-size-fits-all. We offer many ways for players to customize settings, from message filters to parental controls, at any point in your Xbox experience. These settings allow players to manage the type of content they see and experience across all the ways they play, whether on PC, console, or anywhere with Xbox Cloud Gaming (Beta). It's up to you, the players, friends, and family members to take control and implement the settings that are right for you.
 
OP

Irrotational

Prophet of Truth
Member
Oct 25, 2017
7,788
I also wanted to say a huge thanks to all the folks that have to sift through that stuff and make the decisions. It must be a really hard job :(