Oct 2, 2018
3,902


Former Facebook content moderators are speaking out about their working conditions in the United States for the first time. The Verge traveled to Tampa, FL, to visit Facebook's worst-performing content moderation site and learn what life is like for hundreds of moderators. They told us that the office is filthy, the work is grim, and the side effects of doing the job last long after it is over.

Earlier this year, The Verge's Casey Newton broke the story about the working conditions of Facebook moderators in Phoenix, AZ. Newton published a follow-up today revealing that the pattern of severe workplace conditions extends to a second campus in Tampa, FL, in even more extreme ways.

You can read the full exclusive here: http://bit.ly/2IqQHqb
You can find Casey's piece from earlier this year here: http://bit.ly/2WSKvA9


OP may want to give some kind of warning on this as there's some awful shit described in both the video and written story.

Here's the written story for those who don't have time for the video: https://www.theverge.com/2019/6/19/...-interviews-video-trauma-ptsd-cognizant-tampa
 

Razorrin

Member
Nov 7, 2017
5,237
the HELP Menu.
Thank fudge this is finally coming out. I had always wondered whether Facebook even treated its content moderators properly, given all the stuff that gets by, and I am disappointed to learn that it does not.
 

Kalor

Resettlement Advisor
Member
Oct 25, 2017
19,694
That previous piece they did was bad enough, but this is even worse. It's essentially a job that systematically breaks people down and then brings in fresh faces to continue the cycle, which I imagine is the same at YouTube, Twitter, and any other platform that allows user-created content.
 

Camwi

Banned
Oct 27, 2017
6,375
They post a trigger warning before the video starts, but yeah, they go into graphic detail about babies and animals being tortured or killed. Don't watch it if you're sensitive to that kind of thing.

I'm so glad I got off that fucking website, too. I mean sure, you can have a good experience in your little bubble of friends, but in the meantime there's shit like what's mentioned in the video, and an insane amount of misinformation making the general populace dumber and more radicalized.
 

entremet

You wouldn't toast a NES cartridge
Member
Oct 26, 2017
61,092
This is the video version of the written piece, correct?
 

HamSandwich

Banned
Oct 25, 2017
6,605
Ugh, I'd love to get off Facebook, but I use their Marketplace. Other than that I have no use for it.
 

TheGhost

Banned
Oct 25, 2017
28,137
Long Island
Sucks to hear continued negativity about Facebook. It's great when you just stick to the community groups for hobbies; people don't bring their at-home drama or politics in there, just the love for said hobby.

Being a moderator for Facebook, though, must be mentally draining. Seeing the worst of humanity day in and day out, plus shit work conditions? No thanks.
 
Mar 29, 2018
7,078
Oh this is grim

Like, admittedly, it's not Facebook but a Facebook contractor.

But it stands that Facebook "didn't want to see" the worst shit on its own site, so it outsourced the job and clearly paid bottom dollar for it.
 
Jan 29, 2018
9,486
Oh this is grim

Like, admittedly, it's not Facebook but a Facebook contractor.

But it stands that Facebook "didn't want to see" the worst shit on its own site, so it outsourced the job and clearly paid bottom dollar for it.

Facebook also sets the guidelines these contractors are moderating by, and built a platform that ended up needing a rule that determines whether a video of fetal abuse can remain online by whether the skin is translucent or not.
 

Fatoy

Member
Mar 13, 2019
7,289
I was confused for a moment because the video thumbnail uses artwork from a story the same author did on this subject earlier in the year, but this is a follow-up and contains new interviews and insights. Just as an FYI for anyone else who thinks they've already read / watched this.
 

TheIlliterati

Banned
Oct 28, 2017
4,782
It's funny (terribly sad) that the one woman points out that you could hire people who aren't emotionally bothered by this sort of content. However, that would just mean hiring psychopaths to police the content. They need to hire sensitive people and then emotionally/mentally destroy them to remove this filth from Facebook. So sad and disturbing.
 

Bear

Member
Oct 25, 2017
10,976
At the end of the article the author states they believe that changes will be implemented "as best as they can". I do not share this belief.
So you're insinuating that holding that belief makes him a bad reporter? That's a judgment based on hours upon hours of research, interviews, and reporting...
 
Mar 29, 2018
7,078
Facebook also sets the guidelines these contractors are moderating by, and built a platform that ended up needing a rule that determines whether a video of fetal abuse can remain online by whether the skin is translucent or not.
😬😬😬😬😬😬😬😬😬😬😬😬😬😬😬😬😬😬😬😬

It's funny (terribly sad) that the one woman points out that you could hire people who aren't emotionally bothered by this sort of content. However, that would just mean hiring psychopaths to police the content. They need to hire sensitive people and then emotionally/mentally destroy them to remove this filth from Facebook. So sad and disturbing.
We gone full Black Mirror over here
 
Oct 25, 2017
20,261
OP may want to give some kind of warning on this as there's some awful shit described in both the video and written story.

Here's the written story for those who don't have time for the video: https://www.theverge.com/2019/6/19/...-interviews-video-trauma-ptsd-cognizant-tampa

And here are some excerpts:

A Facebook content moderator working for Cognizant in Tampa had a heart attack at his desk and died last year. Senior management initially discouraged employees from discussing the incident, for fear it would hurt productivity.

My initial report focused on Phoenix, where workers told me that they had begun to embrace fringe views after continuously being exposed to conspiracy theories at work. One brought a gun to work to protect himself against the possibility of a fired employee returning to the office seeking vengeance. Others told me they are haunted by visions of the images and videos they saw during their time on the job.

Conditions at the Phoenix site have not improved significantly since I visited. Last week, some employees were sent home after an infestation of bed bugs was discovered in the office — the second time bed bugs have been found there this year. Employees who contacted me worried that the infestation would spread to their own homes, and said managers told them Cognizant would not pay to clean their homes.

They described a filthy workplace in which they regularly find pubic hair and other bodily waste at their workstations. Employees said managers laugh off or ignore sexual harassment and threats of violence. Two discrimination cases have been filed with the Equal Employment Opportunity Commission since April.

"At first it didn't bother me — but after a while, it started taking a toll," Bennetti told me. "I got to feel, like, a cloud — a darkness — over me. I started being depressed. I'm a very happy, outgoing person, and I was [becoming] withdrawn. My anxiety went up. It was hard to get through it every day. It started affecting my home life."

A worker named Lola* told me that health problems had resulted in her receiving so many occurrences she was at risk of being fired. She began going into work even when she felt ill to the point of throwing up. Facebook contractors are required to use a browser extension to report every time they use the restroom, but during a recent illness, Lola quickly took all her allotted breaks. She had previously been written up for going to the bathroom too many times, she said, and so she felt afraid to get up from her desk. A manager saw that she was not feeling well, and brought a trash can to her desk so she could vomit in it. So she did.
 
Oct 25, 2017
20,261
At the end of the article the author states they believe that changes will be implemented "as best as they can". I do not share this belief.

I believe Chandra and his team will work diligently to improve this system as best as they can. By making vendors like Cognizant accountable for the mental health of their workers for the first time, and offering psychological support to moderators after they leave the company, Facebook can improve the standard of living for contractors across the industry.

But it remains to be seen how much good Facebook can do while continuing to hold its contractors at arm's length.

Casey is specifically speaking to Chandra and FB doing their part, but then goes on to note that it's hard to say how much good that will do, given how FB has kept its contractors at a distance.
 
Oct 25, 2017
20,261
Content moderation is just an awful job, and people should be compensated way more for it than they are. The focus here is on Facebook, but I have no doubt the same kind of low-wage, thankless shit is going on at every other major platform that has to moderate content.

The story just serves as another example of exploiting the workforce by deceiving workers and stoking fear of lost wages, given how stark the wage gap is in the United States. Fresh grads at Facebook can make $300K in total comp to move some buttons around for marketing tests, while these people get awful work environments and have to filter out the worst of humanity.
 

Fatoy

Member
Mar 13, 2019
7,289
In the starkest terms, this is low-paid temp workers with zero training being forced to watch things (often repeatedly) that would faze even seasoned medical and law enforcement professionals. It's giving the job of policing the worst of human behaviour to people who would otherwise be doing database admin or working in retail, with the added cruelty that they are all but certain their actions will never make a difference.
 
OP
OP
Oct 2, 2018
3,902
The content moderation situation here is probably the same across YouTube (maybe). The thing is, here it's an external contractor that does the monitoring, and it sounds like they do not give a shit about the welfare of their employees, what with the nine minutes of mental "me time" a day and the "quota" for moderation.

I also think the inability to flag and boot users, given the sort of content on display, is alarming. That's a huge part of the problem. Moderators need to be given the power to ban and block content and users for good.

The bit about the woman who was choking and abusing the child was the hardest to take for me. The bit about the iguana/animal abuse was also difficult.

In the starkest terms, this is low-paid temp workers with zero training being forced to watch things (often repeatedly) that would faze even seasoned medical and law enforcement professionals. It's giving the job of policing the worst of human behaviour to people who would otherwise be doing database admin or working in retail, with the added cruelty that they are all but certain their actions will never make a difference.

That is correct. They made it quite clear that they ended up doing things that were never discussed in their interviews.
 

entremet

You wouldn't toast a NES cartridge
Member
Oct 26, 2017
61,092
So they get paid $28,800 per year. Obviously underpaid for what they do.

An employee had a heart attack and died, which his parents blame on the job and its conditions.

Bathrooms were covered with feces and menstrual blood.

What the heck is going on there?
 
OP
OP
Oct 2, 2018
3,902
Actually, the article is way more sickening to read.

"
For the six months after he was hired, Speagle would moderate 100 to 200 posts a day. He watched people throw puppies into a raging river, and put lit fireworks in dogs' mouths. He watched people mutilate the genitals of a live mouse, and chop off a cat's face with a hatchet. He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook "as long as the skin is translucent." He found that he could no longer sleep for more than two or three hours a night. He would frequently wake up in a cold sweat, crying.

fuck.

In June 2018, a month into his job, Facebook began seeing a rash of videos that depicted organs being harvested from children. So many graphic videos were reported that they could not be contained in Speagle's queue.

JESUS FUCKING CHRIST!
 
Oct 28, 2017
5,800
There's plenty of insane shit that gets posted on places like Facebook. They really need a scorched-earth approach to moderation: ban IPs, ban device fingerprints, just do something to prevent people from posting shit like that, because you don't belong on the service if you do.

But nah, profits.
 

Fatoy

Member
Mar 13, 2019
7,289
A rash of videos featuring organs being harvested from children who were alive and screaming??

What the unholy fuck??
The world is still a horrible place in parts. And we're paying people peanuts to make sure we only ever see a sanitised view of it.

Whichever way you slice it, this is a travesty. Either we accept that social media should be properly policed and cleansed - in which case the social media companies need to hire experienced, seasoned professionals and invest heavily in improving computer vision models to do this vital task - or we realise that having platforms where anyone can share anything might not be quite the utopian vision it's been sold as.
 

Big-E

Member
Oct 25, 2017
3,169
A rash of videos featuring organs being harvested from children who were alive and screaming??

What the unholy fuck??

Normally, how these things play out is that whatever I imagine is worse than what the article describes or the video shows.

Not this time. This is worse than what I thought possible. Like, pure insanity. I couldn't make that up.
 

Ashlette

Member
Oct 28, 2017
3,254
This is so infuriating. The whole system is designed to prolong the suffering of these poor people, e.g. fearmongering supervisors who can fire them for any reason, the risk of staying unemployed, the pay that's barely "good enough" for a single person to pay their bills. These individuals should be proud of exposing Facebook like this. Nobody should get PTSD from doing their 9 to 5.

That thumbnail needs to change, though. Gritty and that dumb V for Vendetta mask are piss-poor representations of the actual NSFL content presented in the video.
 

Fatoy

Member
Mar 13, 2019
7,289
Sorry for the double post, but here's a worrying thought.

A lot of the flood of awful content presumably comes from the ease with which anyone can open a Facebook account, and the lack of repercussions for having an account closed. One 'solution' would be to apply a more rigorous KYC (know-your-customer) process to opening a Facebook account, tying online identities more closely to real-world ones. In one fell swoop, the need for moderation slows to a trickle.

Of course, that then plays into Facebook's hands by enriching the data set they already hold, and allowing them to be even more intrusive and embedded in everything from commerce to healthcare.

It's morbidly interesting to think about, though, because the relative anonymity and disposability of online identities is what allows horrific content to be so widely shared. Remove that, and you can remove the root cause of the problem - but at the obvious expense of an apocalyptic loss of privacy.

Part of me feels like there's no other way to put the genie back in the bottle when it comes to social media, though. We're not going back to a world where publishing something for others to consume has to go through a formal approval process - the volume is just too high. So we need to shift accountability back to content producers / uploaders, and that's impossible unless we know who they are. I feel like everything else talked about in this article, and anything else the companies involved are doing, is just stalling until online and real world identities converge.
 

Marvelous

Member
Nov 3, 2017
361
Fucking insane. Everything in the article gets exponentially worse with every paragraph, and when you think it can't get more fucked up, it absolutely does. Truly heartbreaking. The people they grind through just end up being fucked up for life to earn a few dollars an hour.
 

Qikz

Member
Oct 25, 2017
12,623
That previous piece they did was bad enough, but this is even worse. It's essentially a job that systematically breaks people down and then brings in fresh faces to continue the cycle, which I imagine is the same at YouTube, Twitter, and any other platform that allows user-created content.

This is sadly true of a lot of service-industry jobs around the world. Anything dealing with customers is bad enough by default, but many companies don't care. There are always going to be more people to fill the gaps when workers burn out.
 

Ernest

Member
Oct 25, 2017
7,605
So.Cal.
I've whittled Facebook down to just following my favorite bands and musicians, so I know when they have new music and a tour coming up. Otherwise, it's useless to me.
 
Oct 25, 2017
20,261
There's plenty of insane shit that gets posted on places like Facebook. They really need a scorched-earth approach to moderation: ban IPs, ban device fingerprints, just do something to prevent people from posting shit like that, because you don't belong on the service if you do.

But nah, profits.

This isn't a FB problem so much as a fingerprinting problem. The videos all get hashed for easier identification, but even a slight modification to a video creates a new hash. It's a hard thing to solve because the media is constantly being re-edited and re-encoded, which defeats exact matching.
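For anyone curious why that's so hard: an exact cryptographic hash like SHA-256 changes completely if a single byte changes, while perceptual hashes (the rough idea behind matching systems like PhotoDNA) are designed so that near-duplicates land on nearby values. Here's a toy Python sketch with made-up pixel data, just to show the contrast; it's a simplified "average hash," not anything like the real pipeline:

```python
import hashlib

def crypto_hash(frame: bytes) -> str:
    # Exact-match (cryptographic) hash: flipping one byte yields a totally new digest.
    return hashlib.sha256(frame).hexdigest()

def average_hash(pixels: list[int]) -> int:
    # Toy perceptual hash over a grayscale thumbnail: each bit records whether
    # a pixel is brighter than the mean, so small edits flip few or no bits.
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits between two perceptual hashes.
    return bin(a ^ b).count("1")

# A made-up 8x8 grayscale "video frame" and a slightly re-encoded copy.
frame = [10 * (i % 16) for i in range(64)]
tweaked = frame.copy()
tweaked[0] += 3  # a one-pixel compression artifact

print(crypto_hash(bytes(frame)) == crypto_hash(bytes(tweaked)))  # False: exact hash breaks
print(hamming(average_hash(frame), average_hash(tweaked)))       # 0: perceptual hash survives
```

Real systems typically hash many frames per video and match within a small Hamming distance, but uploaders keep re-cropping and re-filtering to move the target, which is why the queue never empties.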
 

Kthulhu

Member
Oct 25, 2017
14,670
I really need to disable my account. The network effect is super real and it fucking sucks.
 

leafcutter

Member
Feb 14, 2018
1,219
In addition to the horrifying content described already, they have to watch at least 15-30 seconds of each video, with full audio? And the same videos keep popping up repeatedly all day? Jesus Christ.