Deleted member 54292

User requested account closure
Banned
Feb 27, 2019
2,636
Ugh... I feel really bad for those moderators. They are essentially bombarded with humanity's darkest acts constantly. Fuck Facebook. Delete that shit if you really care.
 
It's all a massive indictment of social media companies. This isn't just about ensuring that contract workers are treated better. The underlying problem is that these companies have created machines that spread unspeakable content and damage.

And they don't want to fundamentally rethink the machines because they're making money from them.
 

MercuryLS

Banned
Oct 27, 2017
5,578
This is one job that computers with advanced algorithms need to take over completely. I feel for those people. Fuck Facebook and fuck these subcontractors; they are evil and take advantage of people.
 

Fatoy

Member
Mar 13, 2019
7,267
It's all a massive indictment of social media companies. This isn't just about ensuring that contract workers are treated better. The underlying problem is that these companies have created machines that spread unspeakable content and damage.

And they don't want to fundamentally rethink the machines because they're making money from them.
I reckon they're in the process of rethinking them as we speak, and reorienting them to create revenue in different ways.

Right now, social media companies make advertising money from the sheer volume of their users, and some level of market segmentation. In the near future, I see online and real-world identities converging so that user volume will decline considerably, but user quality (from a targeting point of view) will skyrocket. This will solve the moderation issue - which is currently taking a horrific toll on people - and also extend Facebook's tendrils into being able to target and sell / resell absolutely anything with an incredible degree of granularity, from face cream to financial instruments.

I don't envy whichever social media company has to come out first with its 'quality over quantity' message, because shareholders will not respond well in the short term. In the longer term, though, I genuinely don't see any other way this can go. Like I said on the previous page, there's no going back to the era of publisher approval and accountability.
 

Gaius Cassius

Member
Oct 28, 2017
1,871
Oregon
Sorry for the double post, but here's a worrying thought.

A lot of the flood of awful content presumably comes from the ease with which anyone can open a Facebook account, and the lack of repercussions for having that account closed. One 'solution' to this would be to apply a more rigorous KYC process to opening a Facebook account, and tying online identities more closely to real-world ones. In one fell swoop, your need for moderation dries to a trickle.

Of course, that then plays into Facebook's hands by enriching the data set they already hold, and allowing them to be even more intrusive and embedded in everything from commerce to healthcare.

It's morbidly interesting to think about, though, because the relative anonymity and disposability of online identities is what allows horrific content to be so widely shared. Remove that, and you can remove the root cause of the problem - but at the obvious expense of an apocalyptic loss of privacy.

Part of me feels like there's no other way to put the genie back in the bottle when it comes to social media, though. We're not going back to a world where publishing something for others to consume has to go through a formal approval process - the volume is just too high. So we need to shift accountability back to content producers / uploaders, and that's impossible unless we know who they are. I feel like everything else talked about in this article, and anything else the companies involved are doing, is just stalling until online and real world identities converge.

You're right, there is no fixing it, at least for now. I used to have the same job for a year, but for a different contractor company than the one in the video.

I will tell you now that 99% of the pages we took down would write an 'appeal response'. It was always the same: either they claimed 'I didn't break the rules', or 'I didn't know it was against the rules'. And that's after they had posted some crazy shit.

Some people were sneaky with how they posted their shit, though. For example: you could have a person with a legit, innocent page advertising their business. However, they are attached to someone known as a business manager, and he has other people with their own pages too. Money is paid to Facebook to keep the business pages up; typically the business manager does the paying. I got the feeling that many scammers and posters of illegal content would seek out innocent people and attach them to their account in order to sneak past our censors. Using them, essentially. It was tough reading a message from a family business that really had no idea what the business manager or someone on another page on the business account was up to. They'd write in their appeal message how they were starving and their families depended on their page store. That we were essentially killing them. It was a fucking stressful job and I'm glad to be rid of it.
 

Morrigan

Spear of the Metal Church
Member
Oct 24, 2017
34,472
So much insanity in there. From the graphic atrocities they have to watch and moderate, the horrid working conditions and shit pay, to the employers policing employees' bathroom times (seriously, WTF is this?)...

One part that really depressed me is how futile it all seems to be. One mod says reporting content to law enforcement rarely results in action being taken by police. And the example they gave that DID lead to arrests? It was two women filming themselves encouraging kids to smoke pot. Which, ok, is bad, but it won't give the moderators PTSD or anything. Compared to the sheer filth and depravity of some of the other stuff they see, like animal abuse, child abuse, etc., this is what got police action?

And the iguana part... christ. A mod had to "leave" a gross animal cruelty video up so that "police could catch them more easily". Except police did nothing and the video kept getting re-reported. WTF Facebook? Why can't you at least hide the content and report it to police yourself?

Fuck Facebook.
 
Oct 26, 2017
16,409
Mushroom Kingdom
Not even touching upon the absolutely disturbing content they have to deal with, the work conditions sound horrendous. What in the fuck.

It opens with an employee dying from stress on the job, and managers then downplaying that an employee had just died... to keep productivity up. Yet it just gets worse and worse.

I hope this puts hard pressure on Facebook.
 

Kill3r7

Member
Oct 25, 2017
24,628
That sounds like one of the worst jobs imaginable. Cognizant should be sued out of business.
 

shiba5

I shed
Moderator
Oct 25, 2017
15,863
I feel sick after reading that. I would have PTSD within hours of working there.
 

RedMercury

Blue Venus
Member
Dec 24, 2017
17,734
I can't imagine the shit those people see. I mod on reddit, and even that is toxic as hell as far as doxxing, threats, and abuse go - just the torrent of terrible things you see every day. People never stop finding different ways to be terrible; it's easy to get a dim view of humanity.


PTSD is a real thing with moderators, there was a woman who was doing a study on it but I unfortunately can't find the link.
 

Unicorn

One Winged Slayer
Member
Oct 29, 2017
9,653
The iguana story is bad, but wtf at the translucency policy and the organ harvesting.

Scarred just from reading.
 

Morrigan

Spear of the Metal Church
Member
Oct 24, 2017
34,472
Worth reiterating that the organ harvesting part was fake, at least.

But it doesn't make the rest of it any less sickening.
 

Palantiri

Member
Oct 25, 2017
545
Cognizant sounds like a horrible company that is largely uninterested in taking responsibility for the type of work its employees are required to perform. And Facebook seems only marginally interested in having a coherent and robust stance on content moderation and its own responsibility in the matter.

The position of content moderator seems to be unclassified and unregulated. Trauma specialists need extensive training, and subjecting people to this type of material is obviously a kind of torture that not every person can endure. This position and its responsibilities need to be properly classified, with an appropriate pay scale and access to the required health services. Ultimately, this should be seen as a highly specialized position that is sufficiently regulated to ensure the health and well-being of employees. Just hiring a person above minimum wage and then washing your hands of them once they get PTSD should be criminal - like practicing a licensed profession while unlicensed.

Content delivery and moderation require updated standards. Social media does not deserve a pass on content regulation. The idea that everyone has a right to make virtually any type of content available on these platforms under the auspices of free speech needs to change. Any questionable content should be sequestered immediately, reviewed by qualified legal and trauma specialists, and then forwarded to law enforcement as required. Law enforcement also needs to get with the century and take any such occurrence of illicit content distribution very seriously.

If users have to wait for content to be thoroughly vetted then that should hopefully bring their expectations back in line with those of a sane society that understands that programming like this needs extensive moderation.
 

jelly

Banned
Oct 26, 2017
33,841
How come they have AI for lots of shit but not this?

Those rules, I mean wtf. Facebook is abhorrent. It's dollars over everything.
 

Donos

Member
Nov 15, 2017
6,543
I think that one worker had the right idea in his interview: shut down Facebook
That's not solely a Facebook problem. These firms would then get hired by other content platforms where vids have to be checked and deleted. The people who take these vids and the ones who keep uploading them again and again are the cause. The world is fucked up.
 

golem

Member
Oct 25, 2017
2,878
They should make Zuck do this for a week.

Wait, never mind - it probably wouldn't bother him at all.
 

FinKL

Avenger
Oct 25, 2017
2,987
Man, this is so bad. Sounds just like the MK artists forced to watch sickening stuff.
 

Deleted member 2533

User requested account closure
Banned
Oct 25, 2017
8,325
I have some sympathy for FB, Reddit, Twitter, Youtube et al when it comes to people just yelling "get more moderators." There is no easy solution. AI isn't magic, they need people, and thousands of hours are being uploaded online every minute.

Resetera isn't moderated effectively because this site has cracked some code, or built up some kind of "woke" community; Era's moderated effectively because we're small. That's the only reason. Even then, mods have given up their duties here because of the time commitment. Now imagine if Era had hundreds of millions of users. It would be impossible.

And someone mentioned deepfakes as well. That's a whole 'nother can of worms. I read an article on Youtube having a problem with those weird CGI Youtube Kids vids where it's like Spider-Man and Elsa driving taxis while pregnant and shit, and it's clearly being generated using tools to make it as quickly as possible. We're at the point where this stuff might be being auto-generated and auto-uploaded already. Now imagine instead of poorly animated cartoons, it's sexually explicit or violent media featuring politicians, activists, and other public figures.

In closing, there is no way for Facebook, or any other tech company, to fix this. When your community reaches a certain size, moderation becomes impossible. The solution rests less on corporations and more on users choosing to stop engaging with these platforms. It's like littering or eating junk food: a little is manageable, but if everyone is doing it all the time, it's devastating.
 

Shodan14

Banned
Oct 30, 2017
9,410
I have some sympathy for FB, Reddit, Twitter, Youtube et al when it comes to people just yelling "get more moderators." There is no easy solution. AI isn't magic, they need people, and thousands of hours are being uploaded online every minute.

Resetera isn't moderated effectively because this site has cracked some code, or built up some kind of "woke" community; Era's moderated effectively because we're small. That's the only reason. Even then, mods have given up their duties here because of the time commitment. Now imagine if Era had hundreds of millions of users. It would be impossible.

And someone mentioned deepfakes as well. That's a whole 'nother can of worms. I read an article on Youtube having a problem with those weird CGI Youtube Kids vids where it's like Spider-Man and Elsa driving taxis while pregnant and shit, and it's clearly being generated using tools to make it as quickly as possible. We're at the point where this stuff might be being auto-generated and auto-uploaded already. Now imagine instead of poorly animated cartoons, it's sexually explicit or violent media featuring politicians, activists, and other public figures.

In closing, there is no way for Facebook, or any other tech company, to fix this. When your community reaches a certain size, moderation becomes impossible. The solution rests less on corporations and more on users choosing to stop engaging with these platforms. It's like littering or eating junk food: a little is manageable, but if everyone is doing it all the time, it's devastating.
Yup, and all the disgusting stuff described isn't because "Facebook bad" - though that may also be true - it's because people are horrible regardless of Facebook.
 
Oct 28, 2017
5,210
Thank fudge this is finally coming out. I had always wondered whether Facebook even treated its content moderators properly, given all the stuff that gets by, and I am disappointed that they do not.
These are Cognizant employees. I'm all in favor of people pushing Facebook to push companies like Cognizant to get their shit together or lose their business. This is like when Foxconn does something gross and people blame Apple for it, even though Apple isn't their employer or even the only company doing business with Foxconn.

Content moderation is a ridiculously tough job and I don't expect it ever to pay well because of the way the work scales. But I do think a lot of work needs to be done to help avoid these really bad mental health problems. It feels like the kind of job that should be very generous with paid time off.
 
Oct 25, 2017
8,257
The Cyclone State
I'd remind everyone, too, that this isn't just Facebook. These contract companies exist for every major site. My friend did this kind of work for Photobucket (admittedly at a better workplace) and it's torture after a while. People who do this should be paid extremely well and have fantastic health insurance and access to therapy. They also need to limit workers to a certain number of hours per day viewing graphic content. My friend reported really disgusting stuff every day; that takes a toll on anyone.
 

Aureon

Banned
Oct 27, 2017
2,819
I get that, but you might as well be permanently banning the people who post this stuff.

Fuck that.
We need real, actual laws against public indecency on the internet.
Post that shit anywhere that's not a specific, trigger-warned place for it?
You get lawful consequences.

That wild west phase has to end.
 
Oct 28, 2017
5,210
There's plenty of insane shit that gets posted on places like Facebook. They really need a firebrand approach to moderation. Ban IPs, ban MAC addresses, just do something to prevent people from posting shit like that, because you don't belong on the service if you do something like that.

But nah, profits.
Banning IPs and MAC addresses is absurd. IP addresses aren't even permanent and don't represent an individual. A MAC address identifies a unique device, but a device doesn't represent an individual either.

Content and user moderation at a global level is a really tough problem. People often want to hand-wave solutions like this that won't work. And banning anything more than the user themselves is not feasible.
 

Aldi

Member
Oct 27, 2017
4,635
United Kingdom
I'd like to think that this content is passed on to the police, who use technology to find the people involved in making the videos described and arrest them, but I honestly don't think anything is done.

I saw some sick shit in the early days of the internet. My brother accidentally downloaded a video from limewire that was some guy being shot to death and gasping for air as he slowly died and that has always stuck with me. I can't imagine what psychological damage watching a load of these videos would do to someone.
 

rein

Member
Apr 16, 2018
713
I don't think I could handle watching hundreds of animal abuse videos every day.

This is like something from a dystopian novel. Sadly no one will be held accountable for this...
 
Oct 28, 2017
5,210
Fuck that.
We need real, actual laws against public indecency on the internet.
Post that shit anywhere that's not a specific, trigger-warned place for it?
You get lawful consequences.

That wild west phase has to end.
Why? I can freely write a letter to somebody without explicitly giving my confirmed identity. I can distribute advertisements to a large community without giving my identity. There is no legal requirement for me to state who I am publicly to a government body unless I am suspected of a crime. But you want us to have to confirm our identity when acting on the internet? You really see no potential issues with this?


People keep looking at this content moderation problem and thinking it's as simple as making one or two radical changes, ignoring how those changes damage other things.
 
Oct 28, 2017
5,800
Banning IPs and MAC addresses is absurd. IP addresses aren't even permanent and don't represent an individual. A MAC address identifies a unique device, but a device doesn't represent an individual either.

Content and user moderation at a global level is a really tough problem. People often want to hand-wave solutions like this that won't work. And banning anything more than the user themselves is not feasible.

I'm not daft when it comes to this. I know MAC addresses represent a device, but once something is burned from being able to spew on Facebook, how many more devices is a person gonna own to keep this up? Also, with IP addresses, I know they're not tied to an individual, but you can at least stop them temporarily. A lot of these people spin up new accounts to post the same shit over and over.
 
Oct 28, 2017
5,210
I'm not daft when it comes to this. I know MAC addresses represent a device, but once something is burned from being able to spew on Facebook, how many more devices is a person gonna own to keep this up? Also, with IP addresses, I know they're not tied to an individual, but you can at least stop them temporarily. A lot of these people spin up new accounts to post the same shit over and over.

VPNs make the IP idea pointless.

Many people share devices like computers and tablets. Even phones change hands a lot. You're going to have a lot of users banned from one of the world's biggest services because a previous user posted something vile. The issue here is that you're hell-bent on fixing this problem while ignoring the problems your solutions create.
 

Deleted member 9838

User requested account closure
Banned
Oct 26, 2017
2,773
I fucking hate Facebook with a passion. If it isn't obvious at this point that their narrative about connecting the world and making it a better place is complete bullshit, look at the parallels between this job and third-world trash sorting. The shadow follows. This is the garbage, dehumanizing job of the digital world, which gets you barely $3,000 a month for decades of trauma and mental strain. Sue Facebook now; time to break them up and regulate them to fuck.
 
Oct 26, 2017
3,925
I made the mistake of reading the comments when The Verge posted the story on Facebook, and the corporate shilling/lack of empathy was somehow more insane than I was already anticipating.
 
OP
Oct 2, 2018
3,902
I can't imagine the shit those people see. I mod on reddit, and even that is toxic as hell as far as doxxing, threats, and abuse go - just the torrent of terrible things you see every day. People never stop finding different ways to be terrible; it's easy to get a dim view of humanity.


PTSD is a real thing with moderators, there was a woman who was doing a study on it but I unfortunately can't find the link.

Well, that was another horrible read. It's like the things you hear about on the dark web literally surfacing at the edge of normal social media - that bit about the handcuffed woman at the end. Horrible.

The world is fucked up.
 

Heckler456

Banned
Oct 25, 2017
5,256
Belgium
Actually, the article is way more sickening to read.

fuck.

In June 2018, a month into his job, Facebook began seeing a rash of videos that depicted organs being harvested from children. So many graphic videos were reported that they could not be contained in Speagle's queue.

JESUS FUCKING CHRIST!
What in the everloving fuck at that last part. The world is a fucked up place.

edit: Actually, looking that one up, it seems those videos might have been fake. It doesn't take away from the gruesomeness or shocking nature of it, but it's something, I guess.