
dammitmattt

Member
Oct 28, 2017
246
11 Autopilot deaths in 6 years seems not so terrible. 1 is too many of course, but that's less than 2/year. How many Autopilot-enabled Teslas are on US roads today?

Also, the victims were 59 & 69, and had the car out on a test drive. Not like they were kids messing around in dad's Tesla. Unless the dealer was like... 'yeah, go to sleep in the backseat, it's all good'

There are more than 1.5 million Autopilot-enabled vehicles on the road today, with 2,000 new vehicles added to that pool every day. All models have basic Autopilot features enabled (Autosteer, Traffic-Aware Cruise Control), and a large chunk have more advanced features enabled (like Navigate on Autopilot, Auto Lane Change, Summon, and more). Billions and billions of miles have been driven with Autopilot enabled, and yet we still only hear about the handful of crashes and deaths that happen, almost all of which have been due to gross negligence by the driver.
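For scale, a rough back-of-the-envelope using just the numbers quoted in this thread (illustrative only, not official stats; the fleet was smaller for most of those six years, so the per-car rate here is on the low side):

```python
# All figures are the ones quoted in this thread, not official statistics.
deaths_6yr = 11            # Autopilot-linked deaths over ~6 years
fleet = 1_500_000          # Autopilot-enabled Teslas on the road today
added_per_day = 2_000      # new Autopilot-enabled vehicles per day

print(f"deaths per year: {deaths_6yr / 6:.1f}")                        # ~1.8/year
print(f"deaths per 100k enabled cars (6 yr): {deaths_6yr / fleet * 100_000:.2f}")
print(f"fleet growth per year: {added_per_day * 365:,}")               # 730,000
```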

The comparison to the J&J vaccine reaction is spot-on...
 

Bob Beat

Member
Oct 25, 2017
3,916
My 2008 car can detect if someone is sitting in the front passenger seat to determine something about the airbag, but Tesla can't figure out if a driver is sitting behind the wheel?
 

GearDraxon

Member
Oct 25, 2017
2,786
For context: I drove ~200mi down I-5 on Wednesday. ~175mi of that was on Autopilot - for those of you unfamiliar, lots of I-5 is basically flat and straight, but it's not an airstrip. Any time I didn't put positive pressure on the wheel for more than a short bit, the car alerted me to do so. If I had ignored that enough, the Autopilot would have turned off, and not been available for the remainder of that journey. When I've got it engaged, I keep a hand on the wheel and my foot in the same position I would for a "normal" cruise control. Like any new tech, when I first started using it, I was incredibly cautious: foot on accelerator, almost-steering "just in case," etc. Over time, I've put enough miles on it to recognize that it's not perfect by any means, but it definitely keeps me more centered in a lane than I would normally, especially on long stretches.

The number of hoops you have to jump through to defeat the systems is pretty high, but there are always going to be people who go to those lengths - especially with social media, where it becomes a weird flex to show yourself doing things "while the car drives itself." That being said, acting like Autopilot (or other assisted-driving modes) is suddenly going to lead to people looking at their phones while driving is hilarious - I see people staring at their phones while pulling up to / away from stop lights every day, in every make and model of car. That ship has already sailed.

Part of the issue is people's piecemeal knowledge of what the cars can and can't do, or of basic features - this thread, where people are thinking that Teslas don't have weight sensors in the seats (of course they do), or the other week, where someone legit thought that you might not be able to drive one through a car wash "because it's electric."
 

jdawg

Banned
Nov 26, 2020
511
There was literally no one in the driver's seat in this case. This take is not relevant.

If you can't focus on the road while you're not the one driving, then sitting in the driver's seat is pointless, so why would you sit there? Not that hard to understand.

Requiring drivers to be at the helm of self-driving cars doesn't work, because they will quickly lose focus and attention; it doesn't matter if someone is sitting in the chair or not. If you want people to pay attention and stay focused, especially for extended periods of time, they have to be driving.
 

Gigglepoo

Banned
Oct 25, 2017
8,317
Also, the victims were 59 & 69, and had the car out on a test drive. Not like they were kids messing around in dad's Tesla. Unless the dealer was like... 'yeah, go to sleep in the backseat, it's all good'

It's almost like being moronic doesn't have an age limit.

The comparison to the J&J vaccine reaction is spot-on...

That'll be my go-to example of blowing shit way out of proportion for years to come. 1 in 1,000,000 chance and they pulled it from the market!
 

ascii42

Member
Oct 25, 2017
5,798
This is why Cadillac's system uses sensors to verify that you are still paying attention to the road.
 

tenchir

Member
Oct 25, 2017
43
... if autopilot wasn't enabled then how was the car moving without anyone in the driver's seat?

Teslas with and without AP still have cruise control. If the two people were test-driving the car, maybe they didn't realize they had enabled CC rather than AP by mistake?
 

TheYanger

Avenger
Oct 25, 2017
10,153
I don't think the name of the feature is the problem here. When using Autopilot in a Tesla, the car communicates very clearly that you need to stay in control at all times. And if you ignore it, the car will start to get loud, will come to a stop if needed, and will actually disable the feature for a period of time. The drivers here must have intentionally subverted the system, and thankfully they didn't hurt anyone else.
Yep. My mom's Tesla tests whether you're at the wheel if it doesn't think you're paying attention; if you fail, it shuts off, and if you fail too many times it claims it will disable your Autopilot permanently.
 

Biggersmaller

Banned
Oct 27, 2017
4,966
Minneapolis
A lot of bad assumptions here about Tesla's Autopilot. Some criticize Elon Musk or the company without even a simple search to understand how these cars work. It was 100% the driver's fault. The deceased weren't simply unobservant while using Autopilot; they were dangerously reckless.

Basically this:

[animated GIF]
 

GYODX

Member
Oct 27, 2017
7,244
A lot of bad assumptions here about Tesla's Autopilot. Some people here really want to criticize Elon Musk or the company without even a simple search to understand how these cars work. It was 100% the driver's fault. The deceased weren't simply unobservant while using Autopilot; they were dangerously reckless.

Basically this:

[animated GIF]
Yeah, after a certain point, you can't save stupid people from themselves.
 

spam musubi

Member
Oct 25, 2017
9,381
Requiring drivers to be at the helm of self-driving cars doesn't work, because they will quickly lose focus and attention; it doesn't matter if someone is sitting in the chair or not. If you want people to pay attention and stay focused, especially for extended periods of time, they have to be driving.

It clearly does work though, given that hundreds of thousands of other people have Teslas and haven't had accidents.
 

Sandstar

Member
Oct 28, 2017
7,744
Yeah, I do question the sense of these intermediate levels of assisted driving. "Feel free to let the machine do this for you... but stay super alert to take over at a moment's notice." As you say, people just aren't wired that way. Though, it's not like normal unassisted driving is terribly safe either—so I guess it's fair to use that as a comparative benchmark for partially-autonomous, rather than perfection.

Right, regular driving isn't safer, but I do think that this "assisted driving, where the driver can take over in an emergency" is a fiction made up so they can market their autonomous vehicles sooner. People get distracted while actively driving; they're absolutely going to get distracted while "assisted" driving.


It clearly does work though, given that hundreds of thousands of other people have Teslas and haven't had accidents.

Yet people will claim that driving is dangerous because of all the accidents, despite the millions of people who drive and haven't had accidents.
 

spam musubi

Member
Oct 25, 2017
9,381
Right, regular driving isn't safer, but I do think that this "assisted driving, where the driver can take over in an emergency" is a fiction made up so they can market their autonomous vehicles sooner. People get distracted while actively driving; they're absolutely going to get distracted while "assisted" driving.




Yet people will claim that driving is dangerous because of all the accidents, despite the millions of people who drive and haven't had accidents.

Sure? That's a dumb argument too?
 

Armadilo

Banned
Oct 27, 2017
9,877
The only real problem is that Tesla calls this Autopilot when it's just supposed to assist the driver; the real, fully autonomous version hasn't even been shown off, let alone made available to anybody yet.
 

nullref

Member
Oct 27, 2017
3,055
Right, regular driving isn't safer, but I do think that this "assisted driving, where the driver can take over in an emergency" is a fiction made up so they can market their autonomous vehicles sooner. People get distracted while actively driving; they're absolutely going to get distracted while "assisted" driving.

I agree, and would feel more comfortable with a far more conservative approach to marketing these features. I imagine that's at odds with Tesla's brand and appeal, unfortunately.
 
Oct 28, 2017
967
Yeah, but the vast majority of people likely aren't familiar with the nuances of airplane autopilot - except for the assumption that you can just set it and forget it, even if that's wrong in practice.

The name isn't the primary problem at all, but it could be a minor part of it. Also, a prompt every two minutes to put your hands on the wheel is an eternity when you're driving.

It's not every two minutes; it's usually random, and around every half mile.

As an owner who uses it every day: this isn't a naming issue but a lack-of-brain issue. The car makes it very clear what it can and can't do. If you ignore the prompts, it does lock you out of Autopilot.

In order to circumvent the safety features, a person needs to be fully aware of what those features are and work to override them so they can sit in the passenger seat.
 

Biggersmaller

Banned
Oct 27, 2017
4,966
Minneapolis
Right, regular driving isn't safer, but I do think that this "assisted driving, where the driver can take over in an emergency" is a fiction made up so they can market their autonomous vehicles sooner. People get distracted while actively driving; they're absolutely going to get distracted while "assisted" driving.




Yet people will claim that driving is dangerous because of all the accidents, despite the millions of people who drive and haven't had accidents.

"Regular" driving is very quickly becoming the more dangerous option...if we aren't there already. Tesla is largely responsible for that jump. Like an airplane, requiring a driver to be ready in the event Autopilot fails is reasonable and a necessary expectation in this stage of development. To suggest it's a cynical "fiction" to push untested tech doesn't make sense when these deaths were caused by people intentionally circumventing safety measures.
 

TheYanger

Avenger
Oct 25, 2017
10,153
"Regular" driving is very quickly becoming the more dangerous option...if we aren't there already. Tesla is largely responsible for that jump. Like an airplane, requiring a driver to be ready in the event Autopilot fails is reasonable and a necessary expectation in this stage of development. To suggest it's a cynical "fiction" to push untested tech doesn't make sense when these deaths were caused by people intentionally circumventing safety measures.
"Regular" driving is already demonstrably worse in situations where the car can handle with autopilot - the reason you have to be there is because there are enough situations that autopilot can't handle appropriately - this is easily visible driving one in places with awkward streets and markings where you see the 'autopilot available' icon flip on and off. Like that video above of the guy on twitter sasying he can use it on streets without lines - yeah, sometimes it does flip and see the sides of the road as lines, but it also flips OFF very fast in those situations because it's not reallysure wtf is going on.

Something like freeway driving though? Autopilot works fantastically and there's no chance it's worse than a human being driving at this point.
 

ThatCrazyGuy

Member
Nov 27, 2017
9,867
When the battery in an electric car catches fire, is it harder to put out than a normal engine or gas-tank fire?
 

mAcOdIn

Member
Oct 27, 2017
2,978
Saying Autopilot is fine because it functions similarly to an aircraft's autopilot conveniently leaves out how much training airline pilots go through before they can fly an aircraft like a 737 for an airline. It's like saying "if trained professional pilots can handle this feature, then any idiot with a driver's license should be able to handle it," which is crazy to me.

I don't think these types of solutions should exist at all on consumer cars. I'm not a Luddite either; I welcome self-driving cars. But this period in the middle, where the technology clearly isn't ready and we're basically treating every driver as a beta tester, is negligent and idiotic.

When the car can truly be expected to drive itself while you're sleeping or whatever then you can put the feature into consumer cars.
 

ElNerdo

Member
Oct 22, 2018
2,233
I highly doubt changing the name "autopilot" to something else is going to stop people from being stupid.
 
Feb 26, 2021
234
There is some discussion on The Verge about this, and people are claiming that Autopilot was not even engaged when the crash happened. I don't know if this was something reported, but the claim there is that Autopilot cannot be engaged on the type of road where this crash happened, and that if it were engaged, it would never drive a car that fast on that type of road.

Then again, it's of course possible that the Autopilot for whatever reason malfunctioned and didn't 'realize' what road it was on.

I was thinking that it takes so much work to circumvent the driver's-seat check for Autopilot to run that these guys were either grade-A idiots who brought a sack of potatoes to put in the driver's seat and a water bottle to tape onto the wheel, or something more sinister happened - like there being a third person in the car who was driving but managed to escape, or the driver not being buckled and the side impact throwing him into the passenger seat - but the report sounds pretty definitive that simply no one was in the driver's seat.

Tesla's system is the easiest one to circumvent; it's not that hard.

Other Level 2 systems have a camera that verifies the driver's eyes are on the road while the system is on. Tesla chose not to do this to save a few bucks.
 

Sandstar

Member
Oct 28, 2017
7,744
"Regular" driving is very quickly becoming the more dangerous option...if we aren't there already. Tesla is largely responsible for that jump. Like an airplane, requiring a driver to be ready in the event Autopilot fails is reasonable and a necessary expectation in this stage of development. To suggest it's a cynical "fiction" to push untested tech doesn't make sense when these deaths were caused by people intentionally circumventing safety measures.

It's really not. People aren't going to stay aware and engaged when they're not doing any of the driving. It's a fiction designed solely to let them push something that's not autonomous driving as autonomous driving. And maybe Musk should make safety measures that aren't easily defeated.
 
Feb 26, 2021
234
Yeah, I have a feeling this is the more relevant data point. Not that this particular incident isn't relevant enough to be reported, but we would get a very different mental picture of the relative safety of each if we had a thread for every person killed in a plain ol' manual car accident. Like, just in the US there's around 30-40 thousand driving deaths a year; that'd be over a hundred new threads a day.

You couldn't compare the two rates in any meaningful way. Teslas probably don't even make up 1% of the total cars on the road. Even then, you can't assume the Teslas always have self-driving on. Of course human drivers would have more deaths.
 

GMM

Member
Oct 27, 2017
5,484
Regardless of whether the Autopilot failed and Musk is lying about the car involved not having bought assisted driving, it doesn't change the fact that these systems are not meant to function fully autonomously just yet, and the driver would have to have intentionally tricked the system if no one was really in the driver's seat at the time of the accident. This story and the potential faults in Tesla's systems can be talked about when we know more about what actually happened in this incident; until then, it's unfair to pass judgement on the people who actually died here or on Tesla's software/hardware.
 

Weltall Zero

Game Developer
Banned
Oct 26, 2017
19,343
Madrid
You couldn't compare the two rates in any meaningful way. Teslas probably don't even make up 1% of the total cars on the road. Even then, you can't assume the Teslas always have self-driving on. Of course human drivers would have more deaths.

"You couldn't compare the rates in any meaningful way, except by performing straightforward division by the number of respective owners". Wow, what insurmountable mathematical challenge; truly the Fermat's Last Theorem of our time.

If the death rate per self-driving car is lower than the rate per non-self-driving car, then it immediately follows that self-driving is safer, because no matter what percentage of the time its owners are actually using self-driving, it's certainly higher than zero. And, of course, the opposite is true if the death rate is actually higher for self-driving cars.
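A minimal sketch of the per-fleet division being described, with invented placeholder numbers (the method is the point here, not the figures):

```python
# The comparison as described: deaths divided by fleet size.
# All numbers are invented placeholders, purely for illustration.
tesla_deaths, tesla_fleet = 10, 1_500_000
other_deaths, other_fleet = 35_000, 275_000_000

tesla_rate = tesla_deaths / tesla_fleet   # deaths per car, Tesla fleet
other_rate = other_deaths / other_fleet   # deaths per car, everything else

# The claim: if tesla_rate < other_rate, the "human + Autopilot" combination
# is safer per car, whatever nonzero fraction of miles Autopilot covers.
# Note this compares whole cars, not Autopilot in isolation - the point the
# reply below pushes back on.
print(f"per-Tesla rate: {tesla_rate:.2e}")
print(f"per-other rate: {other_rate:.2e}")
```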
 

chuckddd

Member
Oct 25, 2017
23,132
I'm less concerned about the self driving feature and much more concerned about the thermal runaway battery fire.
 

AIan

Member
Oct 20, 2019
4,867
I still find it baffling that people put so much faith in autopiloted cars. They're not bulletproof unless the road/parking system is redesigned to compensate.
 
Aug 30, 2020
2,171
I just think it'd be great if we could focus on features that make the public safer, rather than frivolous corporate beta testing for profit at the expense of public safety.

Tesla Autopilot as a feature right now does not make anyone safer; it makes driving more dangerous. There are other fine safety features, but this one is dangerous. And now people have died. There are degrees of irresponsibility on the drivers' part if they indeed did anything wrong (probably best to wait for the final investigation), but for a company to make such functionality public is simply irresponsible too.
 

medinaria

Member
Oct 30, 2017
2,544
User Banned (3 Days): Hostility; Prior Ban for Hostility
"You couldn't compare the rates in any meaningful way, except by performing straightforward division by the number of respective owners". Wow, what insurmountable mathematical challenge; truly the Fermat's Last Theorem of our time.

If the death rate per self-driving car is lower than the rate per non-self-driving car, then it immediately follows that self-driving is safer, because no matter what percentage of the time its owners are actually using self-driving, it's certainly higher than zero. And, of course, the opposite is true if the death rate is actually higher for self-driving cars.

it actually doesn't "immediately follow" at all

consider this entirely hypothetical example I've made to disprove you, with very simple and pleasantly round numbers - we'll assume self-driving usage rate of 50%:

50/10000 non-self-driving cars have a fatal accident in a given number of miles
5/10000 non-self-driven teslas have a fatal accident in half the given number of miles
40/10000 self-driven teslas have a fatal accident in half the given number of miles (you can assume they're the same teslas if you want, it makes no difference really)

the end result is that teslas are more safe (45/10000 vs. 50/10000 over the same number of miles) but automated driving is less safe (80/10000 vs 50/10000 over the same number of miles)

obviously with more realistic numbers the margins are different, but it doesn't "immediately follow" in any kind of way. in order to make it "immediately follow" you have to assume that the underlying driven-tesla vs driven-non-tesla fatality rates are the same, but you haven't really shown that? in fact, we have a lot of reason to believe they wouldn't be, in that teslas are generally newer, driven by people in lower risk groups (rich middle-aged men), and driven in urban areas or on highways (which are significantly less likely to have fatal car accidents than rural areas per mile!). musk himself loves to talk about how safe his car is! it's possible that the underlying rate is much lower, at which point nothing you've said is accurate whatsoever
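and here's that hypothetical as a quick sketch, same made-up numbers as above, just to make the bookkeeping explicit:

```python
# the hypothetical above, normalized to the same total distance.
# non-teslas: 50 fatal accidents per 10,000 cars over the full distance.
non_tesla = 50 / 10_000

# teslas split the same distance 50/50 between manual and self-driving;
# the two rates below are each over HALF the distance.
tesla_manual = 5 / 10_000
tesla_auto = 40 / 10_000

tesla_overall = tesla_manual + tesla_auto  # 45/10,000 over the full distance
auto_full = tesla_auto * 2                 # 80/10,000 if self-driven the whole way

print(tesla_overall < non_tesla)  # True: teslas overall come out safer
print(auto_full > non_tesla)      # True: self-driving alone comes out less safe
```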

in general, "is automated driving safer than manual driving" is a very thorny question that it's difficult to get a real answer to, because there are a large number of confounding variables. the best data we could have in theory are rates on self-driven automated vehicles vs manually-driven automated vehicles of the same type, and even then you run into the problem of "you can't use modern self-driving in many of the more dangerous situations, so your data is likely skewed to begin with" and "tesla enjoys putting out very misleading articles that don't say what they claim they say".

also like, just gonna be real here, as a neutral observer I'm basically only writing this reply in the first place because you're being an enormously smug asshole without any real ability to back it up, and you should maybe reconsider how you talk to people or like pick up a statistics book or something idk
 

dammitmattt

Member
Oct 28, 2017
246
I just think it'd be great if we could focus on features that make the public safer, rather than frivolous corporate beta testing for profit at the expense of public safety.

Tesla Autopilot as a feature right now does not make anyone safer; it makes driving more dangerous. There are other fine safety features, but this one is dangerous. And now people have died. There are degrees of irresponsibility on the drivers' part if they indeed did anything wrong (probably best to wait for the final investigation), but for a company to make such functionality public is simply irresponsible too.

What you're saying is demonstrably wrong, not to mention you are being hypocritical.

Demonstrably wrong - "Autopilot makes driving more dangerous." Tesla has plenty of data (~4 billion miles) showing that you're wrong. Do you have any data to back up your assertion?

Hypocritical - you are assuming that Autopilot is at fault, even though the data as of now says it wasn't engaged. Yet you want to withhold judgement on the responsibility of the "driver," who is absolutely at fault whether he was using Autopilot or not (unless a rogue AI took over the car...)?

Come on...