I thought this was a really good step in showing just how much enforcement is being done, along with some stats and data on the different types of issues people have had online.
They've had a massive increase in takedowns of fake accounts (4.33m in six months), using a "proactive" approach to find those accounts. They've also made 2.5m enforcements based on player reports, which they describe as "reactive" enforcement. The main categories for those actions are swearing, sexual content, and bullying.
The total number of reports by players was surprisingly high to me, at 33m in a six-month period, but that's almost half the 59m reported in the equivalent six months of 2020!
I like stats, but just as importantly the report includes a good summary of links to all of their policies, as well as helplines and other resources people might find useful. Apparently they have referred just over 3,000 people to the Crisis Text Line over concerns around self-harm.