
Tesla Autopilot Crash and Death

I cringe every time I read articles about self-driving, self-parking, self-stopping, self-anything cars. If you can't drive, or if you're too lazy to perform all of the duties that driving involves, then don't fucking drive.
 
LOL at people who think having control of your car means you're immune from dying in a crash. You're really going to denounce a system because of ONE death?

"On the day the Tesla driver died, he said, approximately 100 other people died on U.S. roads. No one knows how many of those deaths could have been prevented by cars that could predict crashes before they happen and brake by themselves."

Sure, the technology isn't perfect, but it's getting better, and it will continue to improve with connected vehicles and fixed-point interfaces. I don't know if I buy the explanation, though.
 
I'd get into a semi-auto/auto vehicle tomorrow if there were more of them on the road. This is not a tech I want to be an early adopter of. I have a long commute, and being able to work while commuting would be wonderful. Once there are more semi-auto/auto cars on the road, each situation a car encounters will be more predictable, making it safer.
 
A Tesla with the semi-autonomous Autopilot system engaged failed to brake or perform other crash-avoidance maneuvers when the car could not differentiate between the side of a white trailer and the bright sky. The driver of the car also did not react.

http://www.cbsnews.com/news/this-fatality-could-slam-the-brakes-on-driverless-cars/

Strange... this accident/death happened almost two months ago, but Tesla didn't announce it until yesterday.

Sad that someone lost their life in this "driverless" accident.
 
Do you get on a plane, like ever?
That technology has been around for a very long time. It has slowly been perfected over the years.

This is brand new. The technologies aren't comparable at this point. I don't want to be a test user for brand-new technology that could put my life at risk. Maybe in 5-10 years, when it's been better tested, I'd feel comfortable with it. Just not at this time.
 
That technology has been around for a very long time. It has slowly been perfected over the years.

This is brand new. The technologies aren't comparable at this point. I don't want to be a test user for brand-new technology that could put my life at risk. Maybe in 5-10 years, when it's been better tested, I'd feel comfortable with it. Just not at this time.
It's not that new; it has been tested, and I think more is being made of this in the media than it warrants.

http://www.wftv.com/news/local/self-driving-car-driver-died-after-crash-in-florida-a-first/375092000

"The company said this was the first known death in over 130 million miles of Autopilot operation. It said the NHTSA investigation is a preliminary inquiry to determine whether the system worked as expected.

Tesla says that before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time, the statement said.

Autopilot makes frequent checks, making sure the driver's hands are on the wheel, and it gives visual and audible alerts if hands aren't detected, and it gradually slows the car until a driver responds, the statement said."

The car isn't completely autonomous and there's a large degree of responsibility that the driver must still assume. The same driver had previously posted a video of how the autopilot had saved his life in another instance.
 
A plane cruises on autopilot with a dedicated air space and nothing in front or surrounding it. A car has to navigate a sea of assholes.
And a pilot pays attention during the flight, as there is unseen turbulence, changes in other flight patterns, weather, and other obstacles that a plane's autopilot does not account for. It does not appear that the driver was paying attention in this case, despite the limitations of the car's Autopilot, which is only intended (for now) to assist a driver.
 
"The company said this was the first known death in over 130 million miles of Autopilot operation. It said the NHTSA investigation is a preliminary inquiry to determine whether the system worked as expected.

I think I read that the US average is one death per 90 million miles. Then again, the Model S is far and away the safest car in an accident on the road today. So being anywhere close to the average looks bad. On the other hand, this was a pretty freak set of circumstances. Had he hit the trailer wheel or cab, he probably would have walked away. The crumple zone drove under the trailer and the windshield took the full impact.

For autonomous driving to really go anywhere we need vehicle to vehicle communication, which I think is going to be mandated on new cars soon.
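The miles-per-death comparison in the post above can be checked with a quick back-of-the-envelope calculation. This is just a sketch: both figures are as quoted in the thread (Tesla's claimed 130 million Autopilot miles and a US average of one death per 90 million miles), not independently verified.

```python
# Rough miles-per-death comparison using the two figures quoted in the
# thread; neither number is verified here.
autopilot_miles = 130e6        # Tesla's claimed Autopilot miles at the time
autopilot_deaths = 1           # the single known fatality
us_miles_per_death = 90e6      # US average quoted in the post

autopilot_miles_per_death = autopilot_miles / autopilot_deaths

print(f"Autopilot:  one death per {autopilot_miles_per_death / 1e6:.0f}M miles")
print(f"US average: one death per {us_miles_per_death / 1e6:.0f}M miles")
print(f"Autopilot interval is {autopilot_miles_per_death / us_miles_per_death:.2f}x the US average")
```

By these numbers Autopilot's interval between fatalities is about 1.4x the national average, but a single fatality is far too small a sample to draw any statistical conclusion from.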
 
It's not that new; it has been tested, and I think more is being made of this in the media than it warrants.

http://www.wftv.com/news/local/self-driving-car-driver-died-after-crash-in-florida-a-first/375092000

"The company said this was the first known death in over 130 million miles of Autopilot operation. It said the NHTSA investigation is a preliminary inquiry to determine whether the system worked as expected.

Tesla says that before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time, the statement said.

Autopilot makes frequent checks, making sure the driver's hands are on the wheel, and it gives visual and audible alerts if hands aren't detected, and it gradually slows the car until a driver responds, the statement said."

The car isn't completely autonomous and there's a large degree of responsibility that the driver must still assume. The same driver had previously posted a video of how the autopilot had saved his life in another instance.
Google has stated from their own self-driving program that oftentimes it's not the autopilot that is the problem, but the other cars on the road. It's just not good enough at reacting to other vehicles on the road.

Please feel free to put your own personal life at risk then.
 
BTW, if you want to get all Final Destination freaked out: this same guy apparently posted a YouTube video of the Autopilot system saving him when an 18-wheeler swerved into his lane.

EDIT: Here's the video; it wasn't an 18-wheeler after all.
 
Google has stated from their own self-driving program that oftentimes it's not the autopilot that is the problem, but the other cars on the road. It's just not good enough at reacting to other vehicles on the road.

Please feel free to put your own personal life at risk then.
Every time you get in your car and get on the road with others, you're putting your life at risk. Autopilot or not. Google's driverless fleet has had one at-fault accident in over 1.5 million miles of on-road use, and only 14 accidents total, with only one resulting in injury. The at-fault accident was a result of the car deciding to swerve into another vehicle instead of hitting some sandbags. Nobody was injured. The other accidents were mostly other drivers rear-ending the Google car which was legally stopped at a light or stop sign, or accidents that occurred while the vehicle was in manual mode. You can see all of their accident reports here: https://www.google.com/selfdrivingcar/reports/
 
That technology has been around for a very long time. It has slowly been perfected over the years.

This is brand new. The technologies aren't comparable at this point. I don't want to be a test user for brand-new technology that could put my life at risk. Maybe in 5-10 years, when it's been better tested, I'd feel comfortable with it. Just not at this time.

How many deaths can be attributed to the early development of airplane autopilot? There have been many airplane crashes that can be at least partly blamed on autopilot. There has been one death for autonomous cars, and practically zero other injuries caused by the Autopilot system.

A plane cruises on autopilot with a dedicated air space and nothing in front or surrounding it. A car has to navigate a sea of assholes.
The sea of assholes is the biggest issue with autonomous cars and we will likely never get away from that.

You do realize that most planes are equipped with autoland as well, though it's typically only used when the pilot is physically unable to see the runway. With as much as you fly (especially to places like Heathrow, which has continually shitty weather), I'd be willing to bet that you've landed with autoland and the pilot's hands off.
 
How many deaths can be attributed to the early development of airplane autopilot? There have been many airplane crashes that can be at least partly blamed on autopilot. There has been one death for autonomous cars, and practically zero other injuries caused by the Autopilot system.

The sea of assholes is the biggest issue with autonomous cars and we will likely never get away from that.

You do realize that most planes are equipped with autoland as well, though it's typically only used when the pilot is physically unable to see the runway. With as much as you fly (especially to places like Heathrow, which has continually shitty weather), I'd be willing to bet that you've landed with autoland and the pilot's hands off.

Yep, I'm well aware of the Instrument Landing System (ILS) on planes. I've landed with my pilot friend many times using it. It's one of the reasons that landing a plane, which should be insanely difficult, happens so easily and so routinely every day. However, even with the ILS, the pilots' hands are on the controls at all times, and they monitor the pitch, altitude, speed, etc. like a hawk, with the ability to make adjustments as they see fit.
 
Yep, I'm well aware of the Instrument Landing System (ILS) on planes. I've landed with my pilot friend many times using it. It's one of the reasons that landing a plane, which should be insanely difficult, happens so easily and so routinely every day. However, even with the ILS, the pilots' hands are on the controls at all times, and they monitor the pitch, altitude, speed, etc. like a hawk, with the ability to make adjustments as they see fit.
Right, "hands off" is a cliché. The pilots aren't stepping in unless they have to. If their hands are on the controls, whatever; the pilots aren't doing shit except watching and waiting to step in. The guy who crashed his Tesla was supposed to have his hands on the wheel and be paying complete attention, but alas, he didn't, and now he's dead. So who do we blame, the driver or the car?

And I don't think we'll be sitting back and relaxing while a car drives for a long while. The first autonomous cars on the market will require constant driver contact until you have a good saturation of autonomous cars (or at least cars that stream telemetry data to nearby cars).
 
Something else to keep in mind when comparing early airplane autopilot and early car autopilot is the drastic difference in the "pilots". Airplane pilots went to flight school and chose the career because they love to fly. In contrast, a lot of Tesla drivers choose the Model S because they hate to drive. The airplane pilot is looking, maybe even hoping, for a reason to take over. The Tesla driver is annoyed that they can only read three Facebook posts before they have to put their hands back on the wheel, probably never looking up.

 
Much like lane-change warnings, auto-braking, and other "smart" technology in cars, this is only supposed to aid the driver. It's likely that the driver wasn't paying attention and just relied on the Autopilot.

It's also been reported that the truck was white and it was very bright out, so it wasn't easy to see until it was already almost in the next lane. The Autopilot likely didn't "see" it because of the lack of contrast, but it's debatable whether a human would have noticed it coming over in time either.
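The lack-of-contrast failure mode described above can be illustrated with a toy sketch: a brightness-based detector only flags an object whose luminance differs enough from the background's. This is purely illustrative, not Tesla's actual vision pipeline; the threshold and luma values are invented for the example.

```python
# Toy illustration (NOT Tesla's actual vision system): a camera-based
# detector that relies on brightness contrast can miss an object whose
# brightness is close to the background's.
def detect_by_contrast(background_luma: int, object_luma: int, threshold: int = 30) -> bool:
    """Flag an object only if it differs enough from the background.

    Luma values are 0-255 grayscale; the threshold is an invented
    tuning parameter for this sketch.
    """
    return abs(object_luma - background_luma) >= threshold

bright_sky = 240      # brightly lit sky
white_trailer = 235   # white trailer side, nearly as bright as the sky
dark_trailer = 60     # a dark trailer would stand out against the sky

print(detect_by_contrast(bright_sky, white_trailer))  # False: low contrast, missed
print(detect_by_contrast(bright_sky, dark_trailer))   # True: high contrast, seen
```

The point is just that any detector keyed on contrast has a blind spot when object and background brightness converge, which is consistent with Tesla's stated explanation, and with the follow-up question below about why the radar (which doesn't depend on color at all) didn't catch it.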
 
Much like lane-change warnings, auto-braking, and other "smart" technology in cars, this is only supposed to aid the driver. It's likely that the driver wasn't paying attention and just relied on the Autopilot.

It's also been reported that the truck was white and it was very bright out, so it wasn't easy to see until it was already almost in the next lane. The Autopilot likely didn't "see" it because of the lack of contrast, but it's debatable whether a human would have noticed it coming over in time either.

That's what Tesla says, but how would the color of the truck prevent the radar from seeing it?

 
LOL at anyone who would trust their life to an autopilot car...
We can't even get road signs correct or consistent. How can machines manage in an already inconsistent, error-ridden world?
 
a sea of assholes.
Which is the title of the movie Barrister always watches whilst in his self driving Tulsa (the Oklahoman version of a Tesla. Hint: it's a wheelbarrow pushed by a fake Indian that's trying to earn college credit through a summer internship).
 