Oct 25, 2017
13,240
Arizona
Just like in the Slavery thread, this wouldn't matter as much if people were properly educated enough to verify information and properly identify credible sources.
I know this post is a week old, but I disagree. The problem isn't just that people don't know how to spot false information and shoddy sourcing; it's that even when they do, they don't want to. People believe what they want to believe, regardless of everything else.
 

Lime

Banned for use of an alt account
Banned
Oct 25, 2017
1,266
Who knew that a mass media platform programmed by anti-humanist engineers and funded by neoliberal venture capitalists would be an unethical algorithmic monster.

 

ISOM

Banned
Nov 6, 2017
2,684
This isn't really how it works. Most of the time, the biases are informed by behavior. Say I want to use machine learning to predict who's a criminal by looking at them (a sort of precog algorithm). Well, it's likely to discover that being Black is a good predictive feature simply because more Black people get arrested. That's not creator bias, that's societal bias. Or remember Microsoft unleashing Tay Tweets on Twitter as a blank slate and it repeating bigoted garbage? In fact, part of the problem is that we wish algorithms would change biases, and that requires specific intervention to produce the thing we want to see, not the thing that actually happens.

So what you're saying, using the example in the OP, is that the people who watched Hillary Clinton or Trump vids on YouTube tended to be Trump supporters? Or that the data skewed that way?
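To make the quoted point about societal bias concrete, here's a minimal sketch, with entirely invented numbers and scikit-learn assumed as the tool (none of this is from the thread): a model trained on arrest records rather than actual offenses "discovers" group membership as a risk feature.

```python
# Toy illustration of societal bias in the labels: the true offense rate is
# identical across two fictional groups, but one group is arrested far more
# often. A model trained on arrests learns "group" as a predictive feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 50_000
group = rng.integers(0, 2, size=n)                   # group 0 or 1
offended = rng.random(n) < 0.05                      # same 5% rate for everyone
arrest_rate = np.where(group == 1, 0.60, 0.15)       # unequal enforcement
arrested = offended & (rng.random(n) < arrest_rate)  # the label we actually have

model = LogisticRegression().fit(group.reshape(-1, 1), arrested.astype(int))

# Predicted "criminality" by group: group 1 looks several times riskier,
# purely because of who gets arrested, not who offends.
print(model.predict_proba([[0], [1]])[:, 1])
```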
 

TokyoJoe

Member
Oct 28, 2017
1,044
There sure are a lot of disgruntled employees from Google. They should keep the juice coming. Haha
 

the-pi-guy

Member
Oct 29, 2017
6,429
Algorithms are written by people and have implicit biases bestowed upon them by their creators.

That isn't quite how it works with learning algorithms. Biases in learning algorithms come from the data. All they do is take inputs, make predictions, and make mathematical adjustments whenever the expected output doesn't match the actual one. With the exact same learning algorithm, you can end up with completely different biases. If all your data is of a certain type, then using those algorithms on new data can often give bad results.

You could, for example, give it 1,000,000 pictures of criminals who are all women, and the program will conclude that only women can be criminals, so men get a pass. With the exact same algorithm, you could do the same thing with any other group.

It's not really the algorithm itself. It has to do with the data.
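Here's a tiny hand-rolled sketch of that "adjust when the prediction is wrong" loop, with made-up data (this is purely illustrative, not anyone's real system): the identical training code, fed only examples where one group is labeled "criminal", ends up using the group alone to decide.

```python
# Minimal learner: adjust weights only when the expected output doesn't
# match the actual prediction. Everything here is invented for illustration.
import random

def train(examples, steps=20_000, lr=0.1):
    """examples: list of (features, label), features = [group, other_feature]."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(steps):
        x, y = random.choice(examples)
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        err = y - pred                     # zero when the prediction is right
        w[0] += lr * err * x[0]            # mathematical adjustment on mismatch
        w[1] += lr * err * x[1]
        b += lr * err
    return w, b

# Skewed data: every "criminal" example happens to come from group 1.
data = [([1, random.random()], 1) for _ in range(500)] + \
       [([0, random.random()], 0) for _ in range(500)]

w, b = train(data)

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

print(predict([1, 0.5]), predict([0, 0.5]))  # -> 1 0: group membership decides
```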

I'm assuming YouTube simply gets its data from how other people click. It's not necessarily that YouTube did a bad thing or did anything intentionally, but they could probably do more, which might include adding more data to the network, such as a "truth value" or something like that.
 

Totakeke

Member
Oct 25, 2017
1,687
This is what current AI really does when objectives can be ambiguous. Image recognition and playing games are easy because there's very little dispute about what the right result is. Current AI will never be the objective god some people imagine it to be.


If your goal is not to exacerbate actual or manufactured bias, then you need humans to bias the results another way. But when human subjectivity is involved, it's far harder to reach a consensus. In that sense it seems funny that the more accurate view of AI is the one in 80s anime robot shows, where how benevolent a robot is depends entirely on its creator.
 

4859

Banned
Oct 27, 2017
7,046
In the weak and the wounded
As a representative of someone who doesn't do Facebook or Twitter, and has never signed into YouTube, liked, or subscribed to anything....

I look down upon these masses from high above the mucky muck.

Until I realize I keep all my shit down there.

Yo, stop fucking up the world with your bullshit. It's all fuckin' stupid anyway, even if it wasn't weaponized!!!

Just sign the fuck out and don't look at it again. Cut off the money supply.

You don't need any of those 'friends'. You have ResetEra. And more importantly, me. I will be all the 'friend' you need. No one else understands you, no one else knows what you need. Not even you.

Only me.
 

idlewild_

Member
Oct 29, 2017
355
So what you're saying, using the example in the OP, is that the people who watched Hillary Clinton or Trump vids on YouTube tended to be Trump supporters? Or that the data skewed that way?

These recommendation engines are typically giant black boxes; they just continuously run micro-experiments that adjust what people are recommended. Strong emotional responses are good for watch time. Pro-Hillary videos just didn't elicit the same response as anti-Hillary videos, anything Trump, or fake news in general, so people were less likely to watch them when they were recommended.
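For what it's worth, here's a toy sketch of that kind of continuous micro-experiment, written as an epsilon-greedy bandit with invented watch-time numbers (it is not YouTube's actual system): whichever option reliably produces longer watches ends up being served almost all the time.

```python
# Toy epsilon-greedy loop: each candidate video is an arm, the reward is
# simulated watch time, and the arm with longer watches gets recommended more.
import random

mean_watch_time = {"calm_factual": 3.0, "outrage_bait": 7.0}  # invented means

estimates = {k: 0.0 for k in mean_watch_time}
counts = {k: 0 for k in mean_watch_time}
epsilon = 0.1

for _ in range(10_000):
    if random.random() < epsilon:
        arm = random.choice(list(mean_watch_time))   # explore occasionally
    else:
        arm = max(estimates, key=estimates.get)      # exploit the current winner
    reward = random.gauss(mean_watch_time[arm], 1.0) # simulated watch time
    counts[arm] += 1
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

print(counts)  # "outrage_bait" ends up served the vast majority of the time
```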
 

True Prophecy

Member
Oct 28, 2017
1,963
I noticed the same thing. You watch something factual/scientific, and the next thing you're linked to is fake nonsense/clickbait.
That's why I keep deleting the cache/collected data of the third-party app I use to play YouTube videos on Android; it keeps the nonsense at bay for a while.

No wonder so many weak-minded people without tech knowledge start believing 9/11 conspiracy theories etc. It's basically all they're shown.

When I was looking up the Falcon Heavy and Starman stuff from SpaceX, almost all my recommendations were YouTube videos and channels saying it was all fake.

Made me really sad.
 

Totakeke

Member
Oct 25, 2017
1,687
These recommendation engines are typically giant black boxes; they just continuously run micro-experiments that adjust what people are recommended. Strong emotional responses are good for watch time. Pro-Hillary videos just didn't elicit the same response as anti-Hillary videos, anything Trump, or fake news in general, so people were less likely to watch them when they were recommended.

They are black boxes to an extent, but you can look at the data closely enough to spot such biases using just basic statistics. Do Trump videos get more and longer views? Do people who watch Trump videos come from certain geographies and have certain user attributes? Those are simple questions to ask if someone were really interested.
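As a rough sketch of the kind of slicing being described, assuming a hypothetical export of watch events with made-up column names (pandas used purely for illustration, not anything YouTube actually exposes):

```python
# Hypothetical basic-statistics pass over a table of watch events.
import pandas as pd

events = pd.read_csv("watch_events.csv")  # hypothetical export: one row per view

# Do Trump videos get more and longer views?
by_topic = events.groupby("video_topic")["watch_seconds"].agg(["count", "mean"])
print(by_topic.sort_values("mean", ascending=False))

# Do viewers of a given topic cluster by geography or user attributes?
by_geo = (
    events[events["video_topic"] == "trump"]
    .groupby("viewer_region")["viewer_id"]
    .nunique()
    .sort_values(ascending=False)
)
print(by_geo.head(10))
```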

The problem is that there's no real incentive for employees to do so; it's hard to measure what the right thing to do actually is, and it's not as simple as saying everyone who watches Trump videos is bad.
 

idlewild_

Member
Oct 29, 2017
355
The problem is that there's no real incentive for employees to do so; it's hard to measure what the right thing to do actually is, and it's not as simple as saying everyone who watches Trump videos is bad.

I think it's less that there's a lack of incentive and more that there's disagreement about what should be done, as you say in the latter half. I don't think the company wants to be in the business of arbitrating truth or morality, but obviously the status quo is having a toxic effect on society.
 

Totakeke

Member
Oct 25, 2017
1,687
I think it's less that there's a lack of incentive and more that there's disagreement about what should be done, as you say in the latter half. I don't think the company wants to be in the business of arbitrating truth or morality, but obviously the status quo is having a toxic effect on society.

I mean, there are plenty of natural incentives to make users spend more time on YouTube. Imagine the employee bonuses if someone made everyone watch twice as many YouTube videos. Now imagine that whatever "truth" algorithm you want to introduce reduces or prevents that increase in video views; people would naturally be incentivized to fight you on it, and you'd get nothing in return for doing the "right" thing.
 

Fuzzery

Member
Oct 25, 2017
492
I think all of this could be solved if the algorithms were improved and got really good at recommending what you want / new content.

Instead of optimizing for watch time in the short term, optimize for cumulative watch time months or even years out
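A small illustration of the difference between those two objectives, with purely invented numbers (nothing here reflects any real metric): clickbait can win the single-session objective while losing badly on a discounted long-horizon one, because it burns the user out.

```python
# Immediate watch time vs. discounted cumulative watch time over future sessions.
def immediate_objective(session_watch_minutes: float) -> float:
    return session_watch_minutes

def long_term_objective(future_sessions_watch_minutes: list[float],
                        discount: float = 0.99) -> float:
    # Sessions further in the future count slightly less; churn (zero-watch
    # sessions) drags the total down a lot.
    return sum(discount ** t * w
               for t, w in enumerate(future_sessions_watch_minutes))

clickbait = [12.0] + [0.0] * 50   # long session today, then the user churns
healthy = [6.0] * 51              # steady engagement for months

print(immediate_objective(clickbait[0]), immediate_objective(healthy[0]))  # 12.0 6.0
print(long_term_objective(clickbait), long_term_objective(healthy))        # ~12 vs ~240
```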
 

Totakeke

Member
Oct 25, 2017
1,687
I think all of this could be solved if the algorithms were improved and got really good at recommending what you want / new content.

Instead of optimizing for watch time in the short term, optimize for cumulative watch time months or even years out

Companies, especially publicly traded companies, are rarely incentivized to favor long-term strategies over short-term gains.

It's also much harder to measure something that is years out, because a lot of fundamental changes are inherently hard to track. It's difficult to say that YouTube users from 5 years ago look similar to YouTube users today and then ask your algorithm to optimize on the users back then. Maybe there are far more kids on YouTube nowadays and whatever works for them isn't in the data from years ago. Maybe there used to be more YouTube alternatives. Maybe changes on other platforms such as Facebook drive a lot more YouTube traffic nowadays. A lot of these things aren't really reflected in the data by default and are hard to gauge without an incredible amount of analysis.
 

Deleted member 10060

User requested account closure
Banned
Oct 27, 2017
959
Not surprising at all. I never get recommended things I would actually like based on what I watch, but as soon as I watch one questionable video I will get recommended alt-right stuff for weeks. Like what the hell is going on there?

This has happened to me so much I have had to stop using youtube for anything but pure entertainment.

I used to watch the odd video here and there just to see what the opinions of people I disagree with really are, and it ended up flooding my YouTube with racist, misogynistic drivel for weeks and months. Even after clearing my watch history I still get recommended shit I do not want.
 

Jebusman

Member
Oct 27, 2017
4,246
Halifax, NS
I need to understand how YouTube is making this connection:

[attached screenshot: PedSdo6.jpg]


Where is the algorithm making this connection? What even is that recommended video? I don't even want to click it because I'm sure it's just going to make YouTube's already bad recommendations worse.

Edit: Based on some of the YouTube comments, I'm not alone in wondering why the fuck I'm being directed there.

Maybe YouTube was a mistake. Maybe the world needs books again.
 
Last edited:

Dingens

Circumventing ban with an alt account
Banned
Oct 26, 2017
2,018
I need to understand how YouTube is making this connection:

[attached screenshot: PedSdo6.jpg]


Where is the algorithm making this connection? What even is that recommended video? I don't even want to click it because I'm sure it's just going to make YouTube's already bad recommendations worse.

Edit: Based on some of the YouTube comments, I'm not alone in wondering why the fuck I'm being directed there.

Maybe YouTube was a mistake. Maybe the world needs books again.

Nobody knows. The people working for YouTube don't know, the guys who wrote the algorithm don't know. That's why this stuff is so concerning.
see:
https://www.resetera.com/threads/te...stopia-just-to-make-people-click-on-ads.4640/