
EloquentM

Member
Oct 25, 2017
9,631
But this is why we give people common road rules and instructions, and try to enforce them as best we can.

We don't let every driver figure out their own priorities.
Automated cars have the potential to follow road rules and instructions at ALL times, unlike humans. We most certainly do let people figure out their own priorities, because they are the ultimate decision maker behind the wheel. Taking agency away from all human drivers allows for consistency, predictability, safety and efficiency, because there are far fewer variables than with millions of other human drivers on the road.
i mean...how?
and to be clear I just want the tech to be actually perfected before it goes live. None of the fatalities are acceptable collateral damage in my opinion, but people are just looking to the future.
lmao goddamn your edit.
 

Aaronrules380

Avenger
Oct 25, 2017
22,416
Yes but a self driving car can't use that defense. It will be expected to follow "best practice" for whatever society collectively deems correct in any scenario.
Which is exactly why it's way, way, way safer in the long run. You seem way more concerned about how we can blame people when things go wrong rather than preventing things from going wrong in the first place
 

Nooblet

Member
Oct 25, 2017
13,620
But this is why we give people common road rules and instructions, and try to enforce them as best we can.

We don't let every driver figure out their own priorities.
The same instructions are given to the AI, which would follow them better than humans, leading to better enforcement.
But this is about highly unlikely edge-case scenarios. You can't argue that it's bad to have these machines out there because of the priorities coded into them without asking what a human driver would do. And we all know the answer to that already: the same thing as the AI.
 

Kurdel

Member
Nov 7, 2017
12,157
The violence that comes with car culture continues into our short-term future, to the surprise of no one.

This shit is so normalised for us. 102 deaths per day in the US, if this was anything else you would have invaded them by now.
 

danm999

Member
Oct 29, 2017
17,078
Sydney
Which is exactly why it's way, way, way safer in the long run. You seem way more concerned about how we can blame people when things go wrong rather than preventing things from going wrong in the first place

No what I'm saying is, how these cars act is something that has to be defined.

The same instructions are given to the AI, which would follow them better than humans, leading to better enforcement.
But this is about highly unlikely edge-case scenarios; what would a human do?

What is that instruction? That's the whole point of the discussion. Save the most people possible? Save the driver? Something else?
 

Aaronrules380

Avenger
Oct 25, 2017
22,416
The violence that comes with car culture continues into our short-term future, to the surprise of no one.

This shit is so normalised for us. 102 deaths per day in the US, if this was anything else you would have invaded them by now.
It's ok if millions die due to human drivers, but 1 or 2 deaths by an AI mean the technology is invalid and nobody should ever support it, because it's only murder when a robot does it
 

Kurdel

Member
Nov 7, 2017
12,157
It's ok if millions die due to human drivers, but 1 or 2 deaths by an AI mean the technology is invalid and nobody should ever support it, because it's only murder when a robot does it

It's all dumb.

We should collectively get our heads out of our asses and reassess this whole personal vehicle thing and what the costs actually are. We have been fucking up the planet and killing so many people for way too long now.
 

Nooblet

Member
Oct 25, 2017
13,620
No what I'm saying is, how these cars act is something that has to be defined.



What is that instruction? That's the whole point of the discussion. Save the most people possible? Save the driver? Something else?
Erm...you are switching back and forth between two different things.
The "road rules and instructions" don't include edge case scenarios like these, they are just that...road rules. Which the AI will be far better at following, leading to better enforcement as I mentioned.

Edge case scenarios are not mentioned or talked about anywhere; show me a road rule book that describes this situation for human drivers. This is why you don't have this same discussion for humans, even though they'll most likely do the same thing, i.e. save themselves over pedestrians. It's biologically coded into every human being via survival instincts; the AI is simply enforcing that from the perspective of the driver.

Additionally, as I said in an earlier post, in a lose-lose scenario like this you have to consider all possible angles. A passenger is essentially trapped inside a car moving at speed and likely cannot jump out in time to save themselves before an accident. Pedestrians, on the other hand, at least have a small chance to survive by dodging or running away. Yes, it's going to be a very small chance, but it's a chance the passenger won't have...in this already unlikely situation.
 
Last edited:

capitalCORN

Banned
Oct 26, 2017
10,436
Okay, one thing about the moral survey that bothers me: how is a car supposed to identify a doctor from a CEO from whatever, or identify anything other than gender and age? And age is pretty suspect to begin with. Is there some terrible surveillance state we don't know about?
 

Aaronrules380

Avenger
Oct 25, 2017
22,416
It's all dumb.

We should collectively get our heads out of our asses and reasses this whole personal vehicle thing and what the costs actually are. We are fucking up the planet and killing so many people for way too long now.
Honestly even if we shift towards a much higher emphasis on public transportation self driving vehicles will still have the same benefits for those cases as they would for self driving cars
 

Kurdel

Member
Nov 7, 2017
12,157
Honestly even if we shift towards a much higher emphasis on public transportation self driving vehicles will still have the same benefits for those cases as they would for self driving cars

Sure!

The main gist of my argument is that this debate on self-driving cars is usually too narrow in scope for the crises we are actually facing.
 

danm999

Member
Oct 29, 2017
17,078
Sydney
Erm...you are switching back and forth between two different things.
The "road rules and instructions" don't include edge case scenarios like these, they are just that...road rules. Which the AI will be far better at following, leading to better enforcement as I mentioned.

Edge case scenarios are not mentioned or talked about anywhere; show me a road rule book that describes this situation for human drivers. This is why you don't have this same discussion for humans, even though they'll most likely do the same thing, i.e. save themselves over pedestrians.

What I'm saying is that since self driving cars can be programmed to react with more precision, the way they react in these scenarios will become road rules.

We don't regulate this as much with humans because we realise instinct makes those rules difficult to follow.
 

Nooblet

Member
Oct 25, 2017
13,620
What I'm saying is that since self driving cars can be programmed to react with more precision, the way they react in these scenarios will become road rules.

We don't regulate this as much with humans because we realise instinct makes it difficult to follow.
And that's eventually a good thing, because then it would take into consideration that the cars on the street are self-driven, and pedestrian crossings would be built with that in mind. When everything is self-driven it's easier; it's the transition period that's going to be difficult, because of non-automated cars and rules meant for cars with human drivers.
 

davepoobond

Member
Oct 25, 2017
14,491
www.squackle.com
Thank goodness we waited until humans were perfect at driving before letting them behind the wheel

computers can also get drunk and do lots of hard drugs before getting into a vehicle, why doesn't anyone think about that


And that's eventually a good thing, because then it would take into consideration that the cars on the street are self-driven, and pedestrian crossings would be built with that in mind. When everything is self-driven it's easier; it's the transition period that's going to be difficult, because of non-automated cars and rules meant for cars with human drivers.

all carpool lanes become self-driving lanes
zones around sensitive areas, such as schools, become self-driving only
expand from there
 

Deleted member 11413

User requested account closure
Banned
Oct 27, 2017
22,961
I wish you could learn how to read my posts but I guess you're too busy looking for progressives to swoop in with a gotcha
No, I'm pointing out that yelling about people going to jail for car accidents isn't progressive. I'm not 'looking for progressives to swoop in on' because that post was not an example of being progressive at all. That's why I called it out in the first place.
 

Sunster

The Fallen
Oct 5, 2018
10,002
200.gif
 

jotun?

Member
Oct 28, 2017
4,484
Ars Technica has a much more in-depth article about this particular incident:

Still terrible software design by Uber, but it at least shows that it's more complicated than just "durrr whoops we forgot about jaywalkers", and that it was a version of the software still very much in development. The biggest mistake was not anything in particular in the software, but improper training of the safety driver, who didn't pay enough attention when that software version was meant to be heavily reliant on them.


One thing I've wondered since the beginning of this recent push toward self-driving cars is what the legal situation is for current technologies that generally save lives but occasionally screw up. Airbags, for instance: if they save lives overall, but occasionally mess up and cause an unnecessary death, how liable is the manufacturer? Is there some kind of credit system where for every X lives your tech saves, you can get away with it causing Y deaths? Or can you just get fined/sued into the ground for it even if what you're doing is the best for the greater good?
 

Deleted member 52442

User requested account closure
Banned
Jan 24, 2019
10,774
Even sacrificing a whole class of children to save yourself?

Even sacrificing myself for some drunk jaywalker?

this isn't even a remotely realistic scenario to be in, but in any scenario where for some reason my car has to decide between saving me or someone on the street, my money will be going to the car that would pick me

from a consumer perspective it does not make sense to pick something else if you ask me
 

Timeaisis

Banned
Oct 25, 2017
6,139
Austin, TX
It's funny because they have to prioritize the occupant, otherwise no one would buy them. However, that makes it incredibly unethical in most situations. There is no good solution.

I say we scrap this whole thing, it's awful.
 

krazen

Member
Oct 27, 2017
13,105
Gentrified Brooklyn
Even sacrificing myself for some drunk jaywalker?

this isn't even a remotely realistic scenario to be in, but in any scenario where for some reason my car has to decide between saving me or someone on the street, my money will be going to the car that would pick me

from a consumer perspective it does not make sense to pick something else if you ask me

I'd make the argument there are loads of safety features built in that give you a 'chance', from air bags to seat belts to how they build the chassis to survive an accident.

Not the same chance someone else would have getting smacked by a two-ton slab of metal.

If we are mowing down kids so you can avoid popping an airbag at 20 mph in a fender bender, lol
 

Bitch Pudding

Member
Oct 25, 2017
4,202
Even sacrificing myself for some drunk jaywalker?

no, it's far more common that an entire class of children SUDDENLY enters the street without any warning, including their pets, since it was bring your pet day. Your example is just stupid. When did that ever happen?!


I say we scrap this whole thing, it's awful.
Indeed! Let millions die every year like we're used to. There's just nothing that can be done to prevent this.
 
Last edited:

ShadowAUS

Member
Feb 20, 2019
2,105
Australia
Okay, one thing about the moral survey that bothers me is that how is a car supposed to indentify a doctor from a ceo from whatever, or identify anything other than gender and age? And age is pretty suspect to begin with. If there some terrible surveillance state we don't know about?
The survey isn't just about self-driving cars, for what it's worth; it just uses them as the easiest way to visualise a moral dilemma involving AI. The age/sex/social status/etc. data points are there to paint a clearer, more in-depth picture of society's general moral/ethical preferences when an AI has to make a choice; it doesn't necessarily mean the AI is going to be programmed directly from them. Even in our inevitable surveillance-state future, the time an AI would have to observe, calculate and react in these extreme edge cases is very, very small. The parameters that will most likely be used would be number of individuals, passenger vs pedestrian, rate of intervention, human vs animal, and possibly legality, like jaywalking. (It's not a new study; it's been running since ~2016, so there's a good chance the data has already been taken into account in regard to regulation.)
 

Akira86

Member
Oct 25, 2017
19,582
alls i know is somebody gotta die...

and i dont want it to be me so I guess i better work extra hard so i can afford a Mercedes
 

SupremeWu

Banned
Dec 19, 2017
2,856
Autonomous vehicles are an idea too far ahead of its time

But also this sort of common sense

I mean, to the defense of Mercedes, why would anyone buy into those cars/systems if they knew the car was going to literally throw them under a bus if need be.

I get the dilemma. I don't know how I feel about either side of it personally. Makes me wonder what Tesla does. Do they have anything public about it?


If I buy a thing to ferry my family around, I care about how they're protected first.
 

capitalCORN

Banned
Oct 26, 2017
10,436
The survey isn't just about self-driving cars, for what it's worth; it just uses them as the easiest way to visualise a moral dilemma involving AI. The age/sex/social status/etc. data points are there to paint a clearer, more in-depth picture of society's general moral/ethical preferences when an AI has to make a choice; it doesn't necessarily mean the AI is going to be programmed directly from them. Even in our inevitable surveillance-state future, the time an AI would have to observe, calculate and react in these extreme edge cases is very, very small. The parameters that will most likely be used would be number of individuals, passenger vs pedestrian, rate of intervention, human vs animal, and possibly legality, like jaywalking. (It's not a new study; it's been running since ~2016, so there's a good chance the data has already been taken into account in regard to regulation.)

What is the proposition for infirm pedestrians/passengers? There's no way to make a value judgement that doesn't have terrible implications.
 

SupremeWu

Banned
Dec 19, 2017
2,856
Nah, they're less than 5 years off. They are right on time.

The infrastructure isn't there, we don't have laws to cover the sort of liability issues that are going to come up. The cars may out-drive a lot of human drivers but you're still going to be cramming them onto highways chock full of random nonsense the programming can never predict or adjust for.
 

Bitch Pudding

Member
Oct 25, 2017
4,202
When we're old, and I mean really old, as in too old to drive, not 37 or something, we'll tell Siri to get the car and drive us to our grandchildren or to the doctor's or the next strip club. That will be nice, as will be telling our grandchildren and the hookers that, unlike them, we actually had to drive the cars ourselves back in the day.
 

finalflame

Product Management
Banned
Oct 27, 2017
8,538
The infrastructure isn't there, we don't have laws to cover the sort of liability issues that are going to come up. The cars may out-drive a lot of human drivers but you're still going to be cramming them onto highways chock full of random nonsense the programming can never predict or adjust for.
Highways are indeed a different beast, but I'm fairly confident about autonomous vehicles safely navigating city streets. I'm sure we'll solve highways as well once we've optimized for city driving.

The many self-driving programs in the Bay Area all seem to be going quite smoothly. A friend in particular works on the team that writes the simulation software used by engineers to test logic changes for a major autonomous vehicle company, and the sentiment around the industry is these things are much closer than people on ERA seem to believe.

Fully autonomous roads are a ways off, but an autonomous vehicle picking you up for your Lyft/Uber/whatever ride isn't.
 

ShadowAUS

Member
Feb 20, 2019
2,105
Australia
alls i know is somebody gotta die...

and i dont want it to be me so I guess i better work extra hard so i can afford a Mercedes
This is old, fwiw. As we get closer to the point where level 4 cars become more common than not (within the next decade) and level 5 cars become more viable (this is as much an infrastructure and societal issue as a technological one) and common (within the next ~25-30 years), it won't be a manufacturer decision how their AI is programmed and trained in regard to ethical decisions; it will be governed by each country's regulation (maybe even universal regulation, but I doubt it) relating to autonomous vehicles or AI in general.
 
Last edited:

Akira86

Member
Oct 25, 2017
19,582
This is old, fwiw. As we get closer to the point where level 4 cars become more common than not (within the next decade) and level 5 cars become more viable (this is as much an infrastructure and societal issue as a technological one) and common (within the next ~25-30 years), it won't be a manufacturer decision how their AI is programmed and trained in regard to ethical decisions; it will be governed by each country's regulation (maybe even universal regulation, but I doubt it) relating to autonomous vehicles or AI in general.
10-25 years down the line, after tons more development, research and advancement, I'm ok with it. A centralized and fully proofed system that is consistently safe and resilient to many changing factors, built on that work, and not on the needs, expectations or hopes placed on a hypothetical that is supposed to be better but very definitely holds lives in the balance? I'm fine with that. I'm not fine with tech being rushed to production because some (Tesla) are so super-confident in their abilities, or because they (Uber) need autonomous vehicles for the survival of their business model. And I'm not ok with the scenario of a car company saying "you're safe with us, we'll sacrifice literally everyone else to save you." How does that sound to non-Mercedes owners? I get why they need to mollify their customers, but do I need to demand that my Toyota gets some defensive AI in case I ever encounter a Mercedes on the road? The framing is just coarse, morbid and gross.

However, I like the idea of a central, governmental, non-auto-company control on vehicle autonomy, and a focus on public safety over discussions of who lives and who dies. People may be ready for the cars, but I'm not sure the cars are ready for the people. I don't want a running start for this simply out of convenience, or because we're kinda sorta close to safe and want to explore the economic possibilities.
 

Militaratus

The Fallen
Oct 27, 2017
1,212
The car should aim for the least amount of casualties, regardless of whether they are driver, passenger or bystander, and keep logs of the simulations it ran, in case anyone plans to sue, to show that the final action was the best one possible. For the selfish, there should be a way to choose prioritization at any time, but the default setup should be to minimize all casualties. If the driver deliberately chooses to prioritize their own life in the config, this should be present in the logs, and any lawsuits about resulting casualties should fall upon the driver, because they deliberately altered the AI to prioritize that outcome above all.
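The policy described above can be sketched in a few lines. This is purely a hypothetical illustration, not how any real manufacturer implements this; the `Outcome` type, the maneuver names and the casualty counts are all made up, and real systems wouldn't reduce this to simple integer comparisons. It just shows the shape of the idea: a default minimize-all objective, an opt-in occupant-first config, and an audit log of every candidate considered.

```python
import json
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("av_policy")

@dataclass
class Outcome:
    """One simulated maneuver and its predicted casualties (hypothetical)."""
    maneuver: str
    occupant_casualties: int
    bystander_casualties: int

    @property
    def total(self) -> int:
        return self.occupant_casualties + self.bystander_casualties

def choose_maneuver(outcomes, prioritize_occupant=False):
    """Pick a maneuver per the configured policy and log every candidate
    so the decision can be audited (e.g. in a lawsuit) later."""
    if prioritize_occupant:
        # Driver opted in: minimize occupant harm first, total second.
        key = lambda o: (o.occupant_casualties, o.total)
    else:
        # Default policy: minimize total casualties.
        key = lambda o: (o.total, o.occupant_casualties)
    best = min(outcomes, key=key)
    log.info(json.dumps({
        "policy": "occupant_first" if prioritize_occupant else "minimize_all",
        "candidates": [vars(o) for o in outcomes],
        "chosen": best.maneuver,
    }))
    return best

outcomes = [
    Outcome("brake_straight", occupant_casualties=1, bystander_casualties=0),
    Outcome("swerve_left", occupant_casualties=0, bystander_casualties=2),
]
print(choose_maneuver(outcomes).maneuver)                            # minimize-all picks braking
print(choose_maneuver(outcomes, prioritize_occupant=True).maneuver)  # opt-in picks the swerve
```

Note that the logged `policy` field is exactly what the post argues for: if the owner flipped the config, the record of that choice survives in the logs.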
 

Yankee Ruin X

Member
Oct 31, 2017
2,682
To be fair humans are the issue, if every car was self driving then they could all link together and know exactly what they are all doing and traffic wouldn't be an issue.
 

LastCaress

Avenger
Oct 29, 2017
1,680
This is something that should be decided through legislation and not by Mercedes. As more things have AI, and AI has to make decisions that might cost some lives (and save others), the rules will have to be clearly established in law. You can't put an AI on trial (even though you could prosecute those that made or implemented it).
 

Mudkip Xbox Series X

Alt account
Banned
Dec 14, 2019
175
From a pragmatic standpoint, it would likely be less deadly for the driver, as well as others, to take out someone on a bicycle than to potentially contribute to or create a multi-vehicle pile-up...

The difference is that the AI won't be making the decision based on panic, instead it'll be able to calculate much faster than any human.