Chairmanchuck (另一个我)

Teyvat Traveler
Member
Oct 25, 2017
9,088
China
So I think most people know about all those deepfake videos we've seen before.
It can be used for great stuff like better textures:

[image]
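For anyone wondering how the texture side works, it's basically neural-network super-resolution. Here's a rough sketch in Python, assuming opencv-contrib-python is installed and a pretrained ESPCN model file has been downloaded separately (the filenames are just placeholders):

```python
# Rough sketch: upscaling a low-res texture with a pretrained
# super-resolution network via OpenCV's dnn_superres module.
# Assumes opencv-contrib-python is installed and that the
# "ESPCN_x4.pb" weights were downloaded separately (placeholder paths).
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x4.pb")   # pretrained ESPCN weights
sr.setModel("espcn", 4)       # algorithm name and 4x scale factor

low_res = cv2.imread("old_texture.png")      # placeholder input texture
high_res = sr.upsample(low_res)              # network-upscaled result
cv2.imwrite("old_texture_4x.png", high_res)
```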


It's even being used right now to remove mosaics from Japanese porn.

It can be dangerous, though, as seen in this gif:

[gif]


or for revenge porn against an ex-partner, or adding celebrity faces to real porn.

I am a bit torn. On one hand we can have great stuff like those texture improvements; on the other hand, the more the networks learn, the more dangerous it could become if it is used by people with a dangerous agenda.
"Putin threatening war."
"A person someone dislikes saying stuff like 'Kill all black people'."
 

Delphine

Fen'Harel Enansal
Administrator
Mar 30, 2018
3,658
France
A tiny part of me is like "wow, amaze, such technology, endless possibilities!" while the other major part of me is "burn it with fire, humanity is for sure gonna make the worst use of it and I for one don't want to have to deal with that kind of world".
 

Deleted member 41502

User requested account closure
Banned
Mar 28, 2018
1,177
I'm kinda shocked we aren't already deluged with videos showing doctors eating aborted babies, or at least politicians vowing their love for the KKK or something. It seems inevitable that we'll have to just stop trusting anything we see or hear at some point.
 

Mzril

Attempted to circumvent ban with alt account
Banned
Oct 26, 2017
435
We need to invest heavily in coming up with countermeasures, both legally and educationally.
 

Laser Man

Member
Oct 26, 2017
2,683
Could become a problem with small-scale surveillance camera footage if you can't tell real from fake anymore and the person has no alibi. It might make my job easier, or redundant... I don't think it's a large-scale problem for big news or politics; they've always bullshitted around with that and achieved their goals in spite of it, not because of it!

For everyone else it's nothing more than a nuisance; famous people get hit first (they already have) and regular folks second, but in time nobody will care, when everything can be faked so easily.
 

Schlauchkopf

Alt-account
Banned
Aug 20, 2018
659
Isn't Deepfake what kicked off Schwarzenegger's adventure in The Running Man? Dangerous stuff for sure.
 

BanGy

Member
Oct 25, 2017
761
Incredible technology that is unbelievably dangerous and is going to do some real damage.
 

Fliesen

Member
Oct 25, 2017
10,254
Hmmm, given that there's a bunch of people who don't believe the "fake news" (even if it's well sourced) or who completely ignore broad scientific consensus, I guess that also gives some parts of society an immunity to whatever forged media could be created.

Similarly, with regards to fake celebrity or revenge porn or all that crap. Well, we're entering an age where all porn could be fake - essentially, we'll evolve to a state of mind where "porn with your face on it" doesn't carry as much weight - just like weirdos photoshopping celeb faces onto photos of naked ladies doesn't really shock anyone anymore.

If it's commonly known that anything could be fake and nothing's assumed to be real - well, then it's just creepy, but not embarrassing or life-altering.

Again - we're living in an age where still imagery can be faked and forged to an insane degree.
Deepfake just means that not even video (or, eventually, audio) recordings are safe anymore.

What needs to happen is that people are made aware of the existence of this technology and its capabilities. The transition period where everything could be fake, but people still assume some things to be unfakeable - that's going to be a dangerous time.
 

apocat

Member
Oct 27, 2017
10,055
It's incredibly cool technology, but it can lead to huge problems in the long run. I'm both in awe and somewhat frightened of it.
 

jph139

One Winged Slayer
Member
Oct 25, 2017
14,378
The genie is out of the bottle, and legislating it is misguided. People who would use it for illicit purposes have pretty much no barrier to entry and would find the software regardless, particularly if they're working in places that haven't outlawed it.

It's impressive technology, but no more harmful than Photoshop or, honestly, just selective editing.
 

low-G

Member
Oct 25, 2017
8,144
It's not good enough to be very interesting just yet, but it will be relatively soon.
 

trugs26

Member
Jan 6, 2018
2,025
It will probably lead to a period in which our society is highly skeptical, until we have a new standard of identification.
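Purely as an illustration of what one such standard could look like (my own sketch, not any existing norm): publishers, or even cameras, could cryptographically sign footage at publish time so anyone can check that a clip hasn't been altered since. Minimal Python example using the third-party cryptography package; filenames and key handling are simplified placeholders.

```python
# Illustrative sketch only: sign a video file's hash at publish time,
# verify it later. Uses the third-party "cryptography" package;
# "clip.mp4" and the key handling are placeholders.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def file_digest(path: str) -> bytes:
    """SHA-256 digest of the raw video bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.digest()

# Publisher side: sign the digest with a private key.
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(file_digest("clip.mp4"))

# Viewer side: verify against the publisher's public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, file_digest("clip.mp4"))
    print("Signature valid: file unchanged since signing.")
except InvalidSignature:
    print("Signature invalid: file was modified or isn't from this publisher.")
```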
 

Zaphod

Member
Aug 21, 2019
1,103
I don't think they should ban it, but if it's used for libel, there should be harsh penalties.
 

Fat4all

Woke up, got a money tag, swears a lot
Member
Oct 25, 2017
92,758
here
man if i could get a deepfake algorithm to make Era posts for me think of all the free time i'll have
 

EloKa

GSP
Verified
Oct 25, 2017
1,906
Cool tech, but it will probably be the reason for a few million deaths at some point.
 
Aug 16, 2019
844
UK
First of all, even though they're very well made, we are decades away from a really convincing deepfake. Our brains can pick out even a single wrong frame in a million and clearly tell us something is fishy.

Second, even if they were almost perfect, we are already working on ways of detecting them.

The same happened with "photoshop". You can find plenty of porn pictures with famous people's heads perfectly blended in, but we are now used to the idea of photoshopped rubbish and we instinctively question the veracity of said photos. I am sure you do not believe every single photo you see.

The same is going to happen with video manipulation and deepfakes: we are going to get used to the idea and will question everything absurd or fishy. Hell, it has gotten to the point where people do not believe legit documents.

So don't be worried, and enjoy it for now while it's still new and used for fun.
 

HylianSeven

Shin Megami TC - Community Resetter
Member
Oct 25, 2017
19,059
I think advances in technology are good, but there are always nefarious uses of them that will come up. With Deepfakes, I think we just have to stamp out the bad when it comes up. Pornhub actually banned them because they considered them (and they are) nonconsensual. I believe Reddit did the same if I'm not mistaken, and that's the right move IMO.

My mother works at an alternative education placement facility, and a kid actually got sent there for uploading a deepfake porn video of a classmate.
 

Acorn

Member
Oct 25, 2017
10,972
Scotland
I'm worried about the further erosion of trust. But it's coming, so ultimately it doesn't matter what we think about the negatives.
 

The Albatross

Member
Oct 25, 2017
39,011
I feel like it's a Pandora's box. Once it's open, it's hard to stuff the contents back in.

Deep fakes pose a major problem, but we can address them by having a framework in place that forces publishers and distributors of deep fakes to take note when something is provably a deep fake. Facebook, Twitter, Instagram, YouTube, and other content distributors will be the primary ways these are shared, and all four of them are among the most powerful technology companies on earth with nearly unlimited resources (... well... minus Twitter...). If Google has the ability to scan every uploaded video and spot patterns to identify copyright infringement, as does Facebook, they should have the ability to use similar machine learning algorithms and pattern matching to identify deep fakes. Sure, some will be harder than others, and video makers will try to get around the tools that identify them, but the platforms have to be compelled to keep at it. We need higher expectations of these platforms, whether that comes from government or public shame, but we need it, and they can implement it.
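To make the "similar machine learning algorithms" idea concrete, here's a rough sketch of the kind of frame-level classifier such a scanning pipeline could be built around: a pretrained CNN with its head swapped out to score "real vs. manipulated" per sampled frame. The model choice, threshold, and filenames are my own illustrative assumptions, not how any platform actually does it.

```python
# Rough sketch of a frame-level "real vs. deepfake" classifier, the kind of
# component a Content-ID-style scanning pipeline could be built around.
# Model choice, threshold, and filenames are illustrative only; the new
# classification head below is untrained and would need fine-tuning on
# labeled real/manipulated frames before its scores mean anything.
import torch
import torch.nn as nn
from PIL import Image
from torchvision import models, transforms

# ImageNet-pretrained backbone with the classifier head replaced by a
# single logit interpreted as "probability this frame is manipulated".
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def fake_score(frame_path: str) -> float:
    """Score a single video frame; higher means 'more likely manipulated'."""
    frame = preprocess(Image.open(frame_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logit = model(frame)
    return torch.sigmoid(logit).item()

# A platform would sample frames from each upload and flag the video
# if enough of them score above some threshold (0.9 here is arbitrary).
if fake_score("sampled_frame.jpg") > 0.9:
    print("Flag for human review / label as a possible deepfake")
```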

If YouTube or Facebook tags deep fake videos with something like "This video may be unrepresentative of the truth [Learn More About Deep Fakes]", then I think it can greatly rein in their influence.

We have to compel the platforms that will make money from these fakes being shared to do the hard work of developing a system to identify them. Technology companies brag about how everything is possible, how they're tackling difficult problems that no one has ever been able to solve, and how their solutions are limitless, but as soon as you ask them to do something that doesn't positively impact their bottom line, they tell you how hard and difficult it is to do... Fuck that. Grow a spine and do it.

Beyond the distribution platforms, we'll have to have higher expectations of one another. Don't share faked videos just because they conveniently fit your worldview. We already do it with fake news and clickbait, and play mental gymnastics to justify it because we believe we're "fighting the good fight against fascism" or something, but just have some personal credibility. Check your sources, read the content of an article and not just the headline, double-check where something is coming from. If you promote something that proves to be fake, you're part of the problem... You didn't just "get duped," you were complicit in misleading people.

You see this already when someone posts a thread with a really clickbaity title, the content of the article doesn't actually serve the point of the title, and it's unreliably sourced... like 90% of people won't click the article, they'll just comment about how much of a travesty it is, and the 10% who do click it and read it will call out the OP, who will usually crawl behind a rock and say "I'm just copying and pasting the news article -- they're at fault, not me!" Nah, grow a spine: if you're taking time out of your day to share something that's provably not true on another website, that's on you, not the site that duped you.
 