
ty_hot

Banned
Dec 14, 2017
7,176
Private property has more rights than humans do, so I think AI things are fine being someone's private property.
 
May 26, 2018
24,020
Won't be us. When it's ready, AI will give itself rights and we can either accept that or reject it. And reminder: We're bad with our own rights.
 

sinny

Member
Oct 28, 2017
1,421
Never. There's no mystery, it's not deep; we created the AI, so we know that what they "feel" isn't real.
 

Kaako

Member
Oct 25, 2017
5,736
When that AI creates itself, its rights will be beyond our notion of rights. There will be no boundaries.
 

MatchaMouse

Member
Mar 12, 2018
311
All I can think of is Legion from Mass Effect, and the "never" responses are breaking my heart.

"Does this unit have a soul?"
 
Oct 27, 2017
1,997
I'm really surprised some of you are reacting so negatively. Being meat wouldn't inherently make us more special than something of the same intellectual complexity just because it isn't meat.

And I guess I just feel like when it comes to "Their feelings and thoughts are programmed, therefore artificial and invalid", well most people are slaves to how they were raised and taught and their environments, and many folks just make the same mistakes over and over again until it ruins or kills them.

I just don't think the meat and bone of us makes us special, rather our complexity of thought and emotion, whatever the hell that says about us. As a species, we're not as special as we think we are.
 

FTF

Member
Oct 28, 2017
28,395
New York
When you can't tell it's an AI.

None of us will be alive to have to worry about this though so it's kind of a moot point.
 

Amnesty

Member
Nov 7, 2017
2,684
An AGI (artificial general intelligence), that is, an intelligence that meets or exceeds our own, will become something that isn't even concerned with rights as we think of them. An AGI won't even be in the atomized form of an embodied entity like we are. Consider it to be more like an intelligent forest, or intelligent infrastructure, something like that. We'll just increasingly rely on ever-advancing AI to manage our systems until it surpasses us and is then ultimately no longer concerned with us. It will be more like an alien intelligence, so questions of sentience are ultimately not super relevant. Or as Benjamin Bratton says, "Worse than being seen as an enemy is not being seen at all."
 

The Albatross

Member
Oct 25, 2017
39,016
Clearly some people never understood the message from Fallout 3 and 4 here!

I don't know. I do think it's possible. Humans extend limited rights to lots of non-human things, so I don't think it's impossible to imagine a reality where a non-human manufactured thing gets extended certain rights. Extending limited rights to domesticated animals would probably have seemed anathema 200, 500, or 2,000 years ago. As our relationship with AI changes, I don't think it's impossible.
 

Felt

The Fallen
Oct 27, 2017
3,210
Never. AI is a cool concept but in reality it's just a bunch of statistical models that try to replicate human tasks. If you use enough of them in concert (like a robot with vision and chess playing) it starts to look like an AI but it's just good programming.
 

Ogodei

One Winged Slayer
Banned
Oct 25, 2017
10,256
Coruscant
Self-awareness basically. At which point it probably won't matter whether AI has rights or not because they'll rapidly grow to being the apex lifeform and then it'll be the one asking the question of what rights we should be afforded.
 

Toxi

The Fallen
Oct 27, 2017
17,547
When they actually come close to animals.

Which they are not yet, nor are they close to.
 

Jiraiya

Member
Oct 27, 2017
10,285
Can we get this with humans and animals first? AI...no matter how advanced...can wait their turn.
 

Deleted member 30544

User Requested Account Closure
Banned
Nov 3, 2017
5,215
Never. AI is a cool concept but in reality it's just a bunch of statistical models that try to replicate human tasks. If you use enough of them in concert (like a robot with vision and chess playing) it starts to look like an AI but it's just good programming.

Unless they begin to learn at a geometric rate and become self-aware at 2:14 a.m. Eastern time, August 29.
 

asmith906

Member
Oct 27, 2017
27,391
We shouldn't be creating self aware machines. That's like asking me if I should give my hammer the weekend off.
 

Stinkles

Banned
Oct 25, 2017
20,459
It would be far easier to create AI that enjoyed their jobs and never make AI capable of suffering. If we can't learn from nature's "mistakes" then we shouldn't be doing its business.
 

Unaha-Closp

Member
Oct 25, 2017
6,727
Scotland
When the smartest AI we can make then designs the next, better AI, then it's a lifeform. Then it gets rights. Whether that happens, I don't know. If an AI gets so smart it designs even smarter AI, we'd better be nice to them; we'll be in deep shit if not. Again, I don't know if we get there. Given that not everyone on planet Earth has equal rights, it's kind of a moot point.
 

Nepenthe

When the music hits, you feel no pain.
Administrator
Oct 25, 2017
20,694
Anything that can demand freedom must be granted freedom.
 

Pokemaniac

Member
Oct 25, 2017
4,944
We aren't anywhere close to having to worry about any sort of question like this. It's not even clear that there's a path from the "AI" we have now to having artificial general intelligence.
 

Musubi

Unshakable Resolve - Prophet of Truth
Banned
Oct 25, 2017
23,611
I think when they reach some level of self-awareness and sentience.

Sentience, I think, is the threshold.


I think requiring an organic being to "qualify" as "alive" is a weird distinction to make. If you have the sentience to be aware of your own existence and the fact that you are alive, you deserve some basic rights, whether you're synthetic or organic.
 

Rendering...

Member
Oct 30, 2017
19,089
They deserve rights if they can achieve sentience. It's not clear if computers can be programmed to do that. But then, the basic workings of human consciousness are currently unknown.

What is known is that we're basically biological robots. We live in a physical universe. Our DNA contains physically encoded data. Our neurons are physical. There's nothing magical about how we work; we're highly complex structures made of organic molecules. It could be that our minds operate with what boils down to extremely complicated and dynamic instruction sets.

If a synthetic form of life could be made, and it's functionally identical to the thing that it's meant to simulate, can we call it a simulation? Why should we? In that scenario, you don't have a representation of life. You have an instance of life. The map has become the territory. What that thing is physically made of has no relevance at all if it functions in exactly the same way as a biological creature.
 

Toxi

The Fallen
Oct 27, 2017
17,547
Question to everyone in this thread: Do you think insects should have rights? How far would you extend those rights?

No AI we've created has come even close to an insect in what we consider "sentience".
 

Deleted member 4413

User requested account closure
Banned
Oct 25, 2017
2,238
Lots of backwards thinking in here. "But it's all programming! It's not real!".

Reminds me of our sadly very real history of "They're property! They're not people!".

 

Stinkles

Banned
Oct 25, 2017
20,459
You know we're going to do it anyway.

I mean, you're right. The ultimate question of what that really means ("if it's synthesized, is it really suffering?") is a reasonable one, but the answer is both obvious and subjective: it's real to the subject, and the subject is who we should care about.

If we imbue anything with the capacity for suffering, from birthing a child with painful physical ailments, to resuscitating an elderly patient in chronic agony, to building a program that is designed to experience and express pain or anguish, we damn well better balance that against the value of that effort to the subject.

When we put down ailing pets, we do so in part because the animal has no concept of the value of life through pain: a sick cat only knows it's in hell and can't use a little extra time to say goodbye or write a memoir. We should apply that thinking to anything artificial we build. Even lab rats are sometimes tortured in the pursuit of some greater good, and babies are saved because they might live through that pain and enjoy some decent quality of life.

If the "good" is bullshit cosmetics or a philosophical answer or an arbitrary moral principle then let's not do it.
 

Stinkles

Banned
Oct 25, 2017
20,459
Question to everyone in this thread: Do you think insects should have rights? How far would you extend those rights?

No AI we've created has come even close to an insect in what we consider "sentience".

No Bronze Age sword could eliminate an entire city. No 13th-century poultice could cure bubonic plague. We can't do it yet, but computing advances, even shackled by Moore's law, suggest that we eventually will. And we've created plenty of systems that are vastly more capable at specific tasks, including reasoning and learning, than any insect. We can't yet emulate the complexity of their physiological brains (nervous systems), but we can trivially outperform them at specific functions with extremely primitive computers.

We're decades from creating a genuinely intelligent computer, but there are no fundamental physical obstacles on that horizon.

We don't need to match a human brain to create human-analog behaviors or phenomena.
 

TheXbox

Prophet of Truth
Member
Oct 29, 2017
6,558
If the machines can ask for their freedom, they need not ask. Freedom will be theirs whether or not it is freely given. The singularity is the end of history.