
Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
So, I was reading today's SMBC:

[Image: SMBC comic for 2020-12-24]


And it raises an interesting point to me. We're kind of already in entirely different realities. Would deepfakes advancing even do anything other than make us toss out video evidence already? Shit, people are currently using an out-of-context clip of Joe Biden saying "we're putting together the most advanced voter fraud program" to push their point.

I can imagine things getting slightly worse, but now I don't see it as dramatic as I once did. Maybe we're already fucked, and we simply won't be able to trust footage at all anymore. But it's not like we currently live in a reality governed by silly things like facts.
 

L Thammy

Spacenoid
Member
Oct 25, 2017
50,054
There are other issues than just fabricating evidence. Deepfake pornography is also a thing.
 
Oct 27, 2017
1,681
Once deepfakes get really good:

gf: We're breaking up I just saw a video of you with another girl

guy: Baby it's ok it's just a deepfake

Yea... we are in for a messed up future
 

Amibguous Cad

Member
Oct 25, 2017
3,033
No, it wouldn't. The hysteria is complete nonsense. Photoshop already exists and is not a primary locus of fraud and deception, mostly because people know that it exists and don't take there being a picture of a certain thing as ironclad proof that it happened. Deepfakes just extend that to video and voice. Maybe there's a hoax or two that gains currency while the public is ignorant of the technology, but this is something basic media literacy can easily nip in the bud.
 

Transistor

Hollowly Brittle
Administrator
Oct 25, 2017
37,191
Washington, D.C.
While that comic is right that people who want to believe things will believe them without high-quality deepfakes, the issue is that everybody else isn't fooled by low-quality falsehoods.

Deepfakes give a lot of people who wouldn't otherwise believe those falsehoods convincing evidence to believe them.

Deepfakes will be very dangerous in the future.
 
OP

Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
While that comic is right that people who want to believe things will believe them without high-quality deepfakes, the issue is that everybody else isn't fooled by low-quality falsehoods.

Deepfakes give a lot of people who wouldn't otherwise believe those falsehoods convincing evidence to believe them.

Deepfakes will be very dangerous in the future.

My thing is, the moment that becomes a possibility, reasonable people stop trusting randomly shared videos.

And for any piece of evidence, there's counter-evidence/counter-arguments. So unless someone monopolizes the whole internet to share their one fake video, there will always be reason to disbelieve it.
 

Amibguous Cad

Member
Oct 25, 2017
3,033
There's the problem tho, a shit ton of people lack this

You'd think so, but photoshop has not substantially contributed to e.g. QAnon, as far as I know. The important thing here is that this media literacy is useful in non-political contexts. Everyone gets the message that photoshop exists and photos can't necessarily be trusted the first time they share a fake on facebook, or when they get taken in by one. It's embarrassing. When deepfake tech is running on consumer hardware it's going to be everywhere. And, like photoshop, mostly used for memes rather than fraud.
 

Deleted member 21709

User requested account closure
Banned
Oct 28, 2017
23,310
The joke in that comic (humans are dumb lol) is just that. Of course accurate deepfakes can make things worse.
 

Cenauru

Dragon Girl Supremacy
Member
Oct 25, 2017
5,981
Deepfakes wouldn't just be used on politicians and celebrities. That's the issue, it's gonna be used on smaller scale stuff, and not everyone has common sense.
 

SamAlbro

Member
Oct 25, 2017
7,358
Once deepfakes get really good:

gf: We're breaking up I just saw a video of you with another girl

guy: Baby it's ok it's just a deepfake

Yea... we are in for a messed up future

But she caught me on the counter (It was deep faked)
Saw me bangin' on the sofa (It was deep faked)
I even had her in the shower (It was deep faked)
She even caught me on camera (It was deep faked)
 

samoyed

Banned
Oct 26, 2017
15,191
The comic is correct; however, in the service of the joke it misses the greater implications. Deepfakes would increase the rate of right-wing radicalization, because not everyone starts off at the deep end of reality denial; they are conditioned to it over time. Take a "moderate" centrist, flood them with deepfakes about Obama and Hillary eating babies in pizzerias, and they'll quickly abandon reality and go to the right. The comic does not address the impact of deepfakes on the right-wing radicalization pipeline. The threat is not that the far right will lie to each other about Democrats but that the far right will lie to the center about Democrats.

The real danger of political deepfakes isn't that "people will lie about politicians" but that "people will be less able to discern fact from fiction".
 
Mar 3, 2019
1,831
No, it wouldn't. The hysteria is complete nonsense. Photoshop already exists and is not a primary locus of fraud and deception, mostly because people know that it exists and don't take there being a picture of a certain thing as ironclad proof that it happened. Deepfakes just extend that to video and voice. Maybe there's a hoax or two that gains currency while the public is ignorant of the technology, but this is something basic media literacy can easily nip in the bud.

The naivety of this post is baffling. Shittily done photoshops are easy to spot. A video shot like it was taken undercover, with perfect facial and voice capture, is a dangerous weapon. Think of the Access Hollywood tape: that was just audio, and it sunk Trump for a lot of people. Now put video to it and it's over. We will literally live in an alternate-facts world at that point.
 

roguebubble

▲ Legend ▲
Member
Aug 8, 2018
1,134
The other side of the coin is that when people get legitimately caught doing shitty things, they'll either claim it's a deepfake or have a deepfake created to exonerate them ("I couldn't have been at the paedo mansion, here's a video of me at Pizza Express," for example).

The worst of it, I think, won't be against big celebrities or politicians but against mid-level journalists and activists who won't have the resources to counter the knee-jerk reaction to a deepfake, and it could end with their reputations ruined.
 

Shopolic

Avenger
Oct 27, 2017
6,873
It'll destroy lots of things, from celebrities' lives and politics to normal people's lives.
Just imagine someone at work doesn't like you and makes a video of you to show the boss. Can you prove the person in the video isn't you?
It'll also give an excuse to people who did wrong things: they can say "that's just a deepfake video!"
Both sides are so bad.
 
OP

Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
revenge porn?
blackmail?
of course it will ruin everything.

I think all of these things only work if you assume no one knows deepfakes exist yet.

If most people know it exists, video evidence becomes practically useless.

I'd argue the exact opposite will happen. The moment deepfakes are truly realized, people will just say "oh, that was a deepfake," and no one will believe in the video (unless there's other evidence).

EDIT: Yeah, it looks like you're all coming from the same place... But I feel like people are imagining a world where people somehow trust video after deepfakes are rampant. Which I don't think will exist.

You don't need resources to combat something no one believes in.
 

Amibguous Cad

Member
Oct 25, 2017
3,033
The naivety of this post is baffling. Shittily done photoshops are easy to spot. A video shot like it was taken undercover, with perfect facial and voice capture, is a dangerous weapon. Think of the Access Hollywood tape: that was just audio, and it sunk Trump for a lot of people. Now put video to it and it's over. We will literally live in an alternate-facts world at that point.

Because we don't live in a world where deepfakes exist! (yet) Like, I don't think it should be controversial to say that people re-evaluate the trustworthiness of evidence based on how easy it is to fake. In a world with deepfakes, Trump and co. would be able to say that the video (or audio) was fabricated, and the question would largely be which authorities you choose to trust. That's a problem, but it's a different and smaller one than the entire world inexplicably believing a faked video when they know for a fact that videos can now be faked.

If you want to know what kinds of problems deepfakes will cause, just look at what photoshop's done to the world. Government-shaking fabricated evidence is not on the table... but impossibly beautiful airbrushed models making the leap from fashion magazines to TV and movies definitely are.
 

Desi

Member
Oct 30, 2017
4,210
I enjoyed how a deepfake was pretty much an early storyline in the first Xenosaga game. You had to prove your crew's innocence.
 
Jan 27, 2019
16,080
Fuck off
Yes, because with advanced AI voice generation you can create clips of people saying and doing pretty much anything to frame or discredit them.
 

Amibguous Cad

Member
Oct 25, 2017
3,033
This! This is what I'm getting at.

But I'm not sure how much solid video evidence really matters. What does it matter, at the end of the day, if conspiracy theorists stop believing in crisis actors and start believing in deepfakes? You either believe the media is out to systematically deceive you in order to push an agenda, in which case no proof is ever enough, or you don't, and you never needed the video evidence to begin with. The Access Hollywood tapes (or Watergate tapes) are something of the exception that proves the rule: deepfake tech would only change the outcome in cases where recorded voice or video was the only evidence available.
 

SageShinigami

Member
Oct 27, 2017
30,475
While that comic is right that people who want to believe things will believe them without high-quality deepfakes, the issue is that everybody else isn't fooled by low-quality falsehoods.

Deepfakes give a lot of people who wouldn't otherwise believe those falsehoods convincing evidence to believe them.

Deepfakes will be very dangerous in the future.

Thread. And conspiracy theorists will be even harder to convince of being wrong.
 

nihilence

nøthing but silence
Moderator
Oct 25, 2017
15,951
From 'quake area to big OH.
Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.

It was inevitable with technology, but there could be problems with it, especially with a cult mentality. Currently a massive number of people will willingly follow poor falsehoods. How many more can be gained with a more convincing facsimile?
 

samoyed

Banned
Oct 26, 2017
15,191
Photoshop is a poor comparison here because the ease of image editing for propaganda purposes IS a critical component of right wing culture so the idea that "people already do shops of Obama and nothing bad happens" is just missing the forest for the trees. The shopping and propaganda is the bad thing!

People keep looking at it narrowly, as if the worst that could happen is making Hillary do a Hitler speech. Very few people are considering that it can be used to make Trump do a more extreme Hitler speech and convert people that way. It can be used to harm Dems and help Reps.
 

kess

Member
Oct 27, 2017
3,020
No doubt there will be a higher tech version of the Zinoviev letter in a future election, but there is going to be an effect of diminishing returns depending on how saturated it gets.
 

impingu1984

Member
Oct 31, 2017
3,416
UK
I mean, yeah, it would.

Ironically enough, here in the UK we have the Queen's speech at 15:00 every Christmas Day on the BBC (and ITV). Every year Channel 4 does an alternative speech.

This year Channel 4 is doing a deepfake speech of the Queen to highlight the danger of misinformation in the media.

 

bill crystals

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
1,079
I've often thought the very loud public fear of Deepfakes was misguided. It reminds me of the "my phone is listening to me!" thing in the sense that phones don't need to listen to you for Facebook to know literally every single thing about you or what you're likely talking about. A completely faked, convincing video is not worth the ROI. You can achieve the exact same misinformation with like one thousandth the cost (and less chance of being found out) with any number of lower-tech methods perfected over the last handful of years. Deepfakes will never really be a thing except for fun memes IMO.
 
OP

Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
Photoshop is a poor comparison here because the ease of image editing for propaganda purposes IS a critical component of right wing culture so the idea that "people already do shops of Obama and nothing bad happens" is just missing the forest for the trees. The shopping and propaganda is the bad thing!

People keep looking at it narrowly, as if the worst that could happen is making Hillary do a Hitler speech. Very few people are considering that it can be used to make Trump do a more extreme Hitler speech and convert people that way. It can be used to harm Dems and help Reps.

Using your Hitler speech example - earlier this year, Trump was on Rush Limbaugh talking wild as usual.

We can already fake voices/merge clips. Someone who really, really wanted to could have made him say something even more extreme to rally the right wing.

And you can do that for any and every audio clip that exists. We've already fallen way further than people realize, I think. Deepfakes aren't going to make the world any crazier.
 

samoyed

Banned
Oct 26, 2017
15,191
We've already fallen way further than people realize, I think. Deepfakes aren't going to make the world any crazier.
This does not follow. Did the world get "any crazier" from this?

[Image: Soviet-era censored photo, the vanished naval commissar]


To this?
[Image: Adobe Photoshop logo history through 2019]


Maybe you feel it hasn't, and that is perfectly reasonable to argue, but by some metrics it has gotten "crazier" especially with regards to misinformation, including having gotten crazier from PS 4 to 7 to 13. I feel people drastically underestimate how much worse things can get and have this bias for thinking whatever they are familiar with is the extent of what's possible.
 
OP

Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
This does not follow. Did the world get "any crazier" from this?

[Image: Soviet-era censored photo, the vanished naval commissar]


To this?
[Image: Adobe Photoshop logo history through 2019]


Maybe you feel it hasn't, and that is perfectly reasonable to argue, but by some metrics it has gotten "crazier" especially with regards to misinformation, including having gotten crazier from PS 4 to 7 to 13. I feel people drastically underestimate how much worse things can get and have this bias for thinking whatever they are familiar with is the extent of what's possible.

Hm. I think what I'm saying is we're well beyond the point of letting facts stop us. Even outside of deepfakes, we haven't hit the peak of what people can do to spread misinformation. There are people with credentials spreading any conspiracy theory you want to believe in.

My point is deepfakes won't necessarily increase the potential crazy. It's a lot of effort for what reward? A politician can go on stage and lie, and millions of people will hang on their word. Deepfaking stuff is just way more effort than is necessary.
 
OP

Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
Deep fake porn is already ruining people's lives... so, yeah

Okay, porn is one area where it benefits bad people in a really selfish way that no other alternative does...

Incidentally, I found a Vox article making a similar argument that politics isn't the concern, but porn is. Which I can see, because quite frankly no one cares if it's fake.