
Cation

The Fallen
Oct 28, 2017
3,603
Tbh, the logic in the article kinda makes sense. This isn't really a trolley problem because we don't know if you can even save the people outside the vehicle. The logic seems to be that the person in the car has a binary fate, whereas the people outside do not.

Rather interesting regardless. Still better than a human driver who would inevitably fuck all sides of the situation up and kill everyone in their path.
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
I don't like the idea of autonomous machines like this mingling with the public at large. I'd be fine with it if the driver was legally responsible for any death / dismemberment / injury that resulted from the car's actions. Because you can't put cars in prison.
Putting people in prison after an accident happens doesn't actually do anything to stop the accident, since it already happened. Saying that you don't want the technology to exist because it's harder to punish someone if things go wrong, even if it will most likely lead to way fewer people dying overall, seems really petty and vindictive.
 

RiOrius

Member
Oct 27, 2017
6,081
Perhaps there could be an option: "My life before the life of other people, or after"? Also, in defense of Mercedes, just imagine if there are passengers; situations can be grey.
I'd also appreciate an option to allow the car to sacrifice my life if in so doing it can save the Protoss homeworld.
 

Freezasaurus

Member
Oct 25, 2017
57,002
Putting people in prison after an accident happens doesn't actually do anything to stop the accident, since it already happened. Saying that you don't want the technology to exist because it's harder to punish someone if things go wrong, even if it will most likely lead to way fewer people dying overall, seems really petty and vindictive.
I think it would make people think twice before buying one.
 

BAD

Member
Oct 25, 2017
9,565
USA
They said this years ago. I remember a thread with Mercedes stating this at the old place.
 

JABEE

Member
Oct 25, 2017
9,854
I guess the question is would you put your children in a car that would sacrifice their lives for strangers?
 

Vyse

One Winged Slayer
Member
Oct 25, 2017
1,392
This is almost a generic supervillain statement: Mercedes is basically saying that the ends justify the means, and that whatever damage may come from this won't outweigh the damage prevented by replacing the average human driver.
 

hydruxo

▲ Legend ▲
Member
Oct 25, 2017
20,441
Well this will definitely go over well for Mercedes. Good luck to the PR team on this if they want to try and defend it lmao.
 

Slayven

Never read a comic in his life
Moderator
Oct 25, 2017
93,143
I am waiting for insurance companies and laws to wake up; then you will see some stuff.
 

Deleted member 18944

User requested account closure
Banned
Oct 27, 2017
6,944
"Will be programmed to sacrifice pedestrians"

The article does a great job of fear mongering a bit, especially when it provides the Trolley problem as a real world scenario that most likely will not happen.

Here are some tidbits about safety features and ethical concerns of self-driving cars.

The ethical considerations of autonomous vehicles cannot be overstated, especially when a situation involves choosing between two victims, a pedestrian or a passenger. As well, if a vehicle crash is unavoidable, does the car choose to crash into a small vehicle versus an SUV?

If algorithms are generally designed to do the former, not the latter, isn't this a form of discrimination against the owners of small vehicles who might not be able to afford larger ones?

Dieter Birnbacher, a professor of philosophy at the University of Duesseldorf, and Wolfgang Birnbacher, an FPGA system designer at IBEO Automotive Systems GmbH, concede that, as with any safety-critical technology, the general public is usually willing to accept a certain level of risk in exchange for the benefits of said technology.

"Despite all security measures, a residual risk is unavoidable, which raises questions: How safe is safe enough? How safe is too safe?" they wrote recently in the IEEE Intelligent Systems magazine.

Smart cars could reduce travel times, improve air quality, and eliminate virtually all accidents caused by human error. Ultimately, if public consensus determines an acceptable level of risk, the public becomes responsible for what happens in the face of that risk, the two experts said.

"It goes without saying that an egalitarian decision algorithm along these lines would lead to a radical shift of responsibility from the individual to the public. Neither the owner nor the passengers could be held responsible for the behavior of the vehicle any longer since risk preferences and conflict solving are determined in advance by societal consensus, leaving no room for individual intervention," the authors say.

The top three causes of car accidents are distracted driving, speeding, and drunk driving, according to the National Highway Traffic Safety Administration. Human error—which accounts for over 90% of accidents—would be largely eliminated with self-driving cars, leaving any remaining likelihood of accidents to autonomous vehicle design quality.

And there's the rub. How can car manufacturers make self-driving cars safer?

The answers are complex, but researchers from the University of Sao Paulo, Brazil propose one solution: an independent module, the Autonomous Vehicle Control (AVC).

The AVC is a safety system designed and built independently from the vehicle's uniquely manufactured system. The AVC can be installed into any vehicle and tested for industry safety standards across the board, no matter who the manufacturer is.

The idea is to implement an AVC that will both interact with the vehicle's systems and create a protection layer that is independent of the way the vehicle's system was developed, ensuring that, no matter how the car is designed by a manufacturer, it will meet all safety standards.
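The quoted article doesn't give the AVC's actual interface, but the "protection layer" idea can be sketched roughly like this, with hypothetical class and field names standing in for whatever the real module uses:

```python
# Hypothetical sketch of an AVC-style protection layer: it wraps any
# manufacturer's controller and vetoes or clamps unsafe commands, so the
# safety envelope does not depend on how the vendor's system was built.

from dataclasses import dataclass


@dataclass
class Command:
    speed_mps: float  # requested speed in meters/second
    brake: bool


class SafetyLayer:
    """Independent check applied to every command, regardless of vendor."""

    MAX_SPEED_MPS = 35.0  # illustrative industry-wide limit

    def __init__(self, controller):
        self.controller = controller  # any vendor's control system

    def next_command(self, sensor_data) -> Command:
        cmd = self.controller.decide(sensor_data)
        # Protection layer: override anything outside the safety envelope.
        if sensor_data.get("obstacle_close"):
            return Command(speed_mps=0.0, brake=True)
        if cmd.speed_mps > self.MAX_SPEED_MPS:
            cmd = Command(speed_mps=self.MAX_SPEED_MPS, brake=cmd.brake)
        return cmd


class VendorController:
    """Stand-in for a manufacturer's proprietary system."""

    def decide(self, sensor_data) -> Command:
        return Command(speed_mps=50.0, brake=False)


avc = SafetyLayer(VendorController())
print(avc.next_command({"obstacle_close": False}).speed_mps)  # clamped to 35.0
print(avc.next_command({"obstacle_close": True}).brake)       # True
```

The point of the design is that the same `SafetyLayer` tests can be run against any vendor's controller, which is what "tested for industry safety standards across the board, no matter who the manufacturer is" amounts to.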


What researchers found when they quizzed people on the Trolley problem is, essentially, that people prefer saving the many over the few.

What they found
The researchers identified three relatively universal preferences.

On average, people wanted:

  • to spare human lives over animal lives
  • to save more lives over fewer
  • to prioritize young people over old ones
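Those three survey preferences amount to a ranked ordering, which could be encoded as a simple lexicographic comparison of outcomes. The field names and sample numbers below are made up for illustration; the study only reported the preferences themselves:

```python
# Hypothetical sketch: rank crash outcomes by the three survey preferences,
# in priority order: humans over animals, more lives saved, younger over older.

def outcome_key(outcome):
    """Build a sort key; Python compares tuples element by element."""
    return (
        outcome["humans_saved"],                              # 1) humans first
        outcome["humans_saved"] + outcome["animals_saved"],   # 2) more lives
        -outcome["avg_age"],                                  # 3) younger first
    )

# Two made-up outcomes: a saves one human and three animals,
# b saves two humans and nothing else.
a = {"humans_saved": 1, "animals_saved": 3, "avg_age": 40}
b = {"humans_saved": 2, "animals_saved": 0, "avg_age": 70}

best = max([a, b], key=outcome_key)
print(best["humans_saved"])  # 2: human lives outrank the larger animal count
```

Because the human count is the first tuple element, outcome `b` wins even though `a` saves more lives in total, matching the "spare humans over animals" priority.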

 

Deleted member 48897

User requested account closure
Banned
Oct 22, 2018
13,623
If y'all don't count any fatalities caused by these vehicles as deaths explicitly done under capitalism then you do not understand economics.
 

Eeyore

User requested ban
Banned
Dec 13, 2019
9,029
"Will be programmed to sacrifice pedestrians"

The article does a great job of fear mongering a bit, especially when it provides the Trolley problem as a real world scenario that most likely will not happen.

Here are some tidbits about safety features and ethical concerns of self-driving cars.

What researchers found when they quizzed people on the Trolley problem is, essentially, that people prefer saving the many over the few.
Have you seen the Good Place episode on this? It's pretty good.
 

capitalCORN

Banned
Oct 26, 2017
10,436
Some of you have some weird hope about machines and AI. I've witnessed plenty of tech go to shit for non-attributable reasons. Coming to terms with the idea that a hairline fracture across the public space now defines life and death is a frightening jump for me.
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
This is almost a generic super villain statement, Mercedes is basically saying that the ends justify the means and regardless of what damage may come from this it won't outweigh the damage prevented by trusting the average human driver.
The average human driver is fucking awful and doesn't deserve anyone's trust
 

Deleted member 48897

User requested account closure
Banned
Oct 22, 2018
13,623
Who is more likely to survive, the person with airbags or people being mowed down by a chunk of metal?

This is the thing that really gets me. There's almost no circumstance in which a car is in danger of hitting a pedestrian and the driver is most likely to face certain death, unless either the driver or, for God knows what reason, the autopilot is driving irresponsibly. Favoring the driver here is Mercedes writing a check that they SHOULDN'T be able to cash. I didn't think I'd see a more obvious technofascist wet dream than the Cybertruck this year, but here we are.
 

Opto

Banned
Oct 28, 2017
4,546
I mean, that would depend. If it was a freak accident, I see no reason for anyone to be punished. If it's found to be a frequent problem that is the result of a coding oversight, then the company should get in trouble. If the programming was hacked by the driver, the driver is responsible.
I'm not talking about an accident. I'm talking about a situation where, beyond reasonable doubt, the AI was negligent. Who goes to jail? If no one, then we're just diffusing responsibility to the point where no one is responsible for deaths.
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
Some of you have some weird hope about machines and AI. I've witnessed plenty of tech go to shit for non-attributable reasons. Coming to terms with the idea that a hairline fracture across the public space now defines life and death is a frightening jump for me.
I think you're missing the part where humans are already terrible drivers as a whole, and while tech malfunctions may be a thing, this technology won't realistically be greenlit for public use until it's significantly safer than the typical human driver anyway.
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
I'm not talking about an accident. I'm talking about a situation where beyond reasonable doubt the AI was negligent. Who goes to jail? If no one then we're just disseminating responsibility to the point where no one is responsible for deaths
If there was negligence in the coding of the AI then any reasonable person would put the liability on the company responsible. It's not particularly hard to figure out
 

capitalCORN

Banned
Oct 26, 2017
10,436
I think you're missing the part where humans are already terrible drivers as a whole, and while tech malfunctions may be a thing, this technology won't realistically be greenlit for public use until it's significantly safer than the typical human driver anyway.
Regulation will be a human input. Bug-test all you want. One is negligence, one is design. One is a moment, the other a purpose. We're coming hot on the heels of the 737 MAX here.
 

Vyse

One Winged Slayer
Member
Oct 25, 2017
1,392
That's not what's being said at all
Except it is; the article explains that the AI will choose to save the driver in any scenario, but it's "ok" because it will pose significantly less danger than human drivers.
Mercedes's von Hugo, then, thinks that the ethical problems will be outweighed by the fact that cars will be better drivers overall. "There are situations that today's driver can't handle, that . . . we can't prevent today and automated vehicles can't prevent, either. The self-driving car will just be far better than the average human driver."
 

Aaronrules380

Avenger
Oct 25, 2017
22,472
Bug-test all you want. One is negligence, one is design. One is a moment, the other a purpose. We're coming hot on the heels of the 737 MAX here.
I'm finding it hilarious how you seem to think it's more ok if many people die due to incompetence than if a few people die due to a decision-making protocol, people who would probably die anyway if a human was behind the wheel.