
delete12345

One Winged Slayer
Member
Nov 17, 2017
19,697
Boston, MA
When they crash, self-driving Mercedes will be programmed to save the driver, and not the person or people they hit. That's the design decision behind Mercedes-Benz's future Level 4 and Level 5 autonomous cars, according to the company's manager of driverless car safety, Christoph von Hugo.

One of the biggest debates about driverless cars concerns the moral choices made when programming a car's algorithms. Say the car is spinning out of control, and on course to hit a crowd queuing at a bus stop. It can correct its course, but in doing so, it'll kill a cyclist for sure.

Autonomous murder machines, just rolling into the shop!
 

Tbm24

Banned
Oct 25, 2017
16,329
I mean, in defense of Mercedes, why would anyone buy into those cars/systems if they knew the car was going to literally throw them under a bus if need be?

I get the dilemma. I don't know how I feel about either side of it personally. Makes me wonder what Tesla does. Do they have anything public about it?
 

Jarmel

The Jackrabbit Always Wins
Member
Oct 25, 2017
19,338
New York
As awful as that sounds, it's probably the right call, mainly because people won't trust autonomous cars if they don't know the car will place the highest priority on the driver.
 

Gallows Bat

Banned
Nov 3, 2017
343
I doubt people would buy a car that could choose to sacrifice them haha, that's probably what influenced this.
 
Oct 27, 2017
42,700
Someone's gotta be. You guys act like they're programming it to mow over pedestrians. It's not an easy decision, but they went with the one that won't tell potential buyers they won't be prioritized in a potential accident.
 

Killthee

Member
Oct 25, 2017
4,169
Makes sense. One's a paying customer and the other one is a dirty poor in the way of a paying customer.

/s
 

ry-dog

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
2,180
I mean they'll be programmed not to crash. It's not like:
if (pedestrians > cyclists) {
    kill(cyclists);
}
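Joking aside, a real planner is more likely to score candidate maneuvers by expected harm and pick the least bad one than to branch on who to hit. A minimal sketch of that idea (all names, weights, and numbers here are hypothetical, not any manufacturer's actual logic):

```python
# Hypothetical sketch: score each candidate maneuver by expected harm
# (collision probability x severity) and choose the minimum, instead of
# hard-coding "kill X" branches. All numbers are invented.

def expected_harm(trajectory):
    """Sum collision probability x severity over everything this
    trajectory might hit, occupants included."""
    return sum(r["p_collision"] * r["severity"] for r in trajectory["risks"])

def pick_maneuver(candidates):
    # Choose the trajectory with the lowest total expected harm.
    return min(candidates, key=expected_harm)

brake = {"name": "brake", "risks": [{"p_collision": 0.9, "severity": 2}]}
swerve = {"name": "swerve", "risks": [{"p_collision": 0.3, "severity": 8}]}

print(pick_maneuver([brake, swerve])["name"])  # brake (1.8 < 2.4)
```

The point being: the "who do we hit" choice falls out of a continuous cost function, not an explicit if-statement about victims.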
 

Musubi

Unshakable Resolve - Prophet of Truth
Banned
Oct 25, 2017
23,611
That's the way it should be. Like, if the car that is driving you isn't programmed to save YOU, what kind of fucking car is that? When automated cars become a thing, these are the kinds of fucking dark decisions that are going to have to be made. The car should always prioritize saving the driver first.
 

Dennis8K

Banned
Oct 25, 2017
20,161
Of course. You don't expect the kind of person who owns a Mercedes to be the one to go, do you?

Come on, plebs.
 
Oct 27, 2017
1,608
This seems cartoonishly monstrous.
I mean it's a tricky moral quandary. Would you buy a car that would prioritize the life of a pedestrian over the occupants and trust it to put your kid in it? Should we have sensors in a car that change the morality logic? If there's only a single occupant save the pedestrian or if there's two in the car prioritize them?

As much as it sucks I do get the logic of not wanting to sell people a car that will decide to kill you.
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
I mean, at the end of the day these cars will probably save way more lives by greatly reducing the number of accidents that occur in general, because humans are pretty shit drivers.
 

Freezasaurus

Member
Oct 25, 2017
57,002
I mean it's a tricky moral quandary. Would you buy a car that would prioritize the life of a pedestrian over the occupants and trust it to put your kid in it? Should we have sensors in a car that change the morality logic? If there's only a single occupant save the pedestrian or if there's two in the car prioritize them?

As much as it sucks I do get the logic of not wanting to sell people a car that will decide to kill you.
Maybe autonomous cars are just a bad fucking idea. The only reason it's a thing is because some shitty fucks can make money off of it.
 

GMM

Member
Oct 27, 2017
5,484
It's sadly marketing suicide not to do this. Markets like the EU should, however, impose strict guidelines on how AIs should prioritize such cases regardless of what the manufacturer might say; at the end of the day, they just want to sell a product.
 

Gouty

Member
Oct 25, 2017
1,658
I think who the car favors should be randomized every time you start it up. Keep it interesting.
 

EvilChameleon

Member
Oct 25, 2017
23,793
Ohio
It's funny this thread appears on the same day as the Carlin thread, since he liked to joke about using the sidewalks to get around traffic.
 

LazyLain

Member
Jan 17, 2019
6,500
This seems... fine? Anybody dying would be tragic, and certainly more deaths would be worse... but obviously a vehicle's top priority should be the safety of its occupants.

Maybe give passengers the option to lower their own safety priority for such scenarios, in case anybody wants to be selfless.
 

Juice

Member
Dec 28, 2017
555
These sorts of edge cases are programming the computer to make decisions a human literally could not make because the time duration is so brief that a human couldn't think and react at all in time.

So while this feels like "mow down" vs. "don't mow down" people, the real choice is:

1. human driver makes no decision within 300ms at best and whatever outcome fate decides is the result (worth noting that outside these philosophical edge cases, human drivers will be much more dangerous most of the time)

2. computer driver uses that 300ms to identify and take an action that would save driver. This is probably easier to do because the driver is stationary and their actions don't need to be predicted when factoring this

3. computer driver uses that 300ms budget to put the lives outside the vehicle ahead of the driver. People outside the vehicle are not stationary so their reaction has to be predicted which is harder and more expensive, so if the computer tried to prioritize them it would do a worse job of it

I think the marketing of this would be more successful if we literally said "we'll just randomize what happens inside the space where humans can't react so that we don't have to pick winners and losers", but that's of course a totally irrational position.

The most important thing is that regulations guide things to make car behavior PREDICTABLE, so that cars can predict the behavior of other cars and people can also base their actions on an understanding of how cars will act.
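The trade-off between points 2 and 3 above can be modeled as a single tunable weight on occupant harm inside the same cost function. A toy sketch (all weights and numbers invented for illustration, not any real system's parameters):

```python
# Toy model of points 2 vs. 3: the same harm-minimizing planner, with a
# tunable weight on occupants relative to people outside the car.
# All numbers are invented.

def weighted_harm(option, occupant_weight=2.0):
    """Expected harm, with occupant harm scaled by occupant_weight.
    occupant_weight > 1 encodes 'prioritize the driver' (point 2);
    occupant_weight < 1 would encode point 3."""
    harm = 0.0
    for risk in option["risks"]:
        w = occupant_weight if risk["occupant"] else 1.0
        harm += w * risk["p"] * risk["severity"]
    return harm

def choose(options, occupant_weight=2.0):
    return min(options, key=lambda o: weighted_harm(o, occupant_weight))

# Swerving endangers the occupant; braking endangers a pedestrian.
swerve = {"name": "swerve", "risks": [{"occupant": True, "p": 0.5, "severity": 5}]}
brake = {"name": "brake", "risks": [{"occupant": False, "p": 0.4, "severity": 5}]}

print(choose([swerve, brake], occupant_weight=2.0)["name"])  # brake
print(choose([swerve, brake], occupant_weight=0.5)["name"])  # swerve
```

Flipping one constant flips the outcome, which is exactly why regulators rather than marketing departments should pin that constant down.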
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
Who gets arrested when the program neglectfully runs over someone?
I mean, that would depend. If it was a freak accident, I see no reason for anyone to be punished. If it's found to be a frequent problem that is the result of a coding oversight, then the company should get in trouble. If the programming was hacked by the driver, the driver is responsible.
 

Freezasaurus

Member
Oct 25, 2017
57,002
I don't like the idea of autonomous machines like this mingling with the public at large. I'd be fine with it if the driver was legally responsible for any death / dismemberment / injury that resulted from the car's actions. Because you can't put cars in prison.
 

Heshinsi

Member
Oct 25, 2017
16,093
Maybe autonomous cars are just a bad fucking idea. The only reason it's a thing is because some shitty fucks can make money off of it.

Well, considering the tens of thousands who die each year due to how shitty human drivers are, even with this protocol the expectation should be that a significant number of car-accident-related fatalities would be avoided. If automated cars were the future of all the cars on the road, you'd be saving thousands of lives.
 

fireflame

Member
Oct 27, 2017
2,275
Perhaps there could be an option: "my life before the life of other people, or after"? Also, in defense of Mercedes, just imagine if there are passengers; situations can be grey.
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
These sorts of edge cases are programming the computer to make decisions a human literally could not make because the time duration is so brief that a human couldn't think and react at all in time.

So while this feels like "mow down" vs. "don't mow down" people, the real choice is:

1. human driver makes no decision within 300ms at best and whatever outcome fate decides is the result (worth noting that outside these philosophical edge cases, human drivers will be much more dangerous most of the time)

2. computer driver uses that 300ms to identify and take an action that would save driver. This is probably easier to do because the driver is stationary and their actions don't need to be predicted when factoring this

3. computer driver uses that 300ms budget to put the lives outside the vehicle ahead of the driver. People outside the vehicle are not stationary so their reaction has to be predicted which is harder and more expensive, so if the computer tried to prioritize them it would do a worse job of it

I think the marketing of this would be more successful if we literally said "we'll just randomize what happens inside the space where humans can't react so that we don't have to pick winners and losers", but that's of course a totally irrational position.

The most important thing is that regulations guide things to make car behavior PREDICTABLE, so that cars can predict the behavior of other cars and people can also base their actions on an understanding of how cars will act.
Yeah, people tend to forget that humans aren't exactly safe behind the wheel. Plus, a good self-driving car program will be less likely to bend the rules of the road the way people do all the time, so cases where issues like this come up in the first place will go down once the tech is available.