Tesla autopilot doesn't notice a tractor trailer, fatal crash
Tesla’s Autopilot being investigated by the government following fatal crash | Ars Technica
Sounds like the tractor trailer turned left across the oncoming lanes, and the Tesla's Autopilot drove right into it without applying the brakes; the car went under the trailer and the driver was killed, presumably decapitated. Tesla says Autopilot did not notice the white trailer against the bright sky. I thought the Tesla had front-facing radar? |
It was a matter of time before someone killed themselves over that stupid auto pilot feature.
****er deserves it. More $$$ than brain cells. |
Strange... and unfortunate. I have to imagine the driver was not paying attention?
|
I think the whole idea of "semi-autonomous" is a really, really, really bad road to be semi-going down.
|
He was a test pilot... test driver?
That sucks, and I don't think he deserved it. |
He definitely did not deserve it. That statement is beyond reprehensible.
RIP |
Tragic way to find flaws in the system. The driver did not deserve to die while advancing technology.
I'm very excited about autonomous cars in hardcore traffic areas; I think they might be able to knock out traffic jams. Got to think that if the truck had been autonomous too, they would have been "talking" to each other, making it safer. The transition from a few autonomous vehicles to most being autonomous will be the most dangerous. |
Quote:
****er deserves it. |
I think it is weird that forward radar wouldn't detect a broadside tractor trailer.
|
Guns don't kill people, automated driving cars do.
|
As a computer programmer, automatic cars scare the crap out of me. They hit a situation they don't understand, and this is what happens, every time the situation is encountered. I'm not comfortable with the idea that I may be the first one to hit a situation the developers didn't think about.
|
Reads like the color of the truck and the lighting account for the human error. Why the Tesla's radar didn't pick it up, at least a few meters before impact, is the real question. Was the truck too tall?
|
The human not noticing is easy. He may have been dozing, reading the paper, or surfing the web.
The car not noticing is weird. I can see the autopilot being programmed to disregard bridges and overpasses, but the truck was low enough to take off the top of the car, so it wasn't that high off the ground, and obviously its wheels would have been at ground level. I don't think this should be an unanticipated situation . . . a large vehicle turning in front of the car seems pretty foreseeable. There have been many reports of the Tesla autopilot doing scary things, entering curves way too fast, etc. I don't think the autopilot can possibly have been thoroughly tested in autonomous mode; look at how long Google has been testing its self-driving cars. Has Tesla been reckless in enabling and advertising Autopilot? |
Quote:
Really, man? You guys are nuts. |
What does Tesla claim this system can do? Does it assist the driver or actually drive? I would welcome a backup system to a human driver, but there is no way I'm going to sit in a car and ignore the road while it drives. Too many variables out there.
|
This is going to be a new and lucrative field for liability and personal injury lawyers. A car manufacturer simply cannot advise you to trust their autopilot without backing it up (financially). The mix of autonomous cars and human-controlled ones will be a litigator's wet dream. Hell, even automated rail transit screws up occasionally, and those trains are on tracks going in one direction!
|
Sounds like the error may lie heaviest on the truck that cut across multiple lanes of traffic.
|
Cars have a steering wheel and foot pedals for a reason.
|
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1
It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand, just like flying. But in both those examples it's out of our control, so we'd like to think we would do better than a computer. Sometimes yes, most times no.
edit: 156,495,106 since 5 mins ago |
Quote:
1.08 deaths per 100 million miles driven (2014): http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/overview-of-fatality-facts
Tesla: about 100 million miles driven by customers, now 1 fatality. Tesla reveals new details of its Autopilot program: 780M miles of data, 100M miles driven and more | Electrek
So statistically, about the same with automated navigation. Unless you consider most of us above average ;)
Edit: remove drunk drivers and/or other human "preventable" causes of accidents, and human drivers would be statistically safer. |
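To make the comparison above concrete, here is a quick back-of-the-envelope calculation, sketched in Python using only the figures quoted in this thread; the numbers are approximate and not independently verified.
Code:
# Rough comparison of fatality rates, using the figures quoted above.
# The numbers come from the IIHS and Electrek links cited in this thread;
# treat them as approximate, not authoritative.

human_deaths_per_100m_miles = 1.08   # IIHS figure for 2014, all US driving
autopilot_miles = 100_000_000        # approximate Autopilot miles driven
autopilot_fatalities = 1             # the single fatality discussed here

autopilot_rate = autopilot_fatalities / (autopilot_miles / 100_000_000)

print(f"Human drivers: {human_deaths_per_100m_miles:.2f} deaths per 100M miles")
print(f"Autopilot:     {autopilot_rate:.2f} deaths per 100M miles")
# Roughly 1.08 vs 1.00, about the same as noted above, and with a sample
# size of one fatality the comparison is extremely noisy.
|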
To prevent future drivers from dozing off, the system should be designed so that they remain in control, with the car taking over only if it senses an emergency that the driver has failed to respond to. Each instance of system intervention should be logged and the data collected against the driver's license. Too many interventions and bye-bye license.
|
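A minimal sketch of the intervention-logging idea in the post above. Everything here (class names, the threshold, the decision function) is hypothetical and purely for illustration; it is not how any shipping system works.
Code:
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class InterventionLog:
    """Hypothetical record of emergency takeovers, tied to a driver's license."""
    license_id: str
    events: List[str] = field(default_factory=list)
    max_interventions: int = 3  # arbitrary threshold, for illustration only

    def record(self, reason: str) -> None:
        self.events.append(f"{datetime.now().isoformat()} {reason}")

    def license_at_risk(self) -> bool:
        return len(self.events) > self.max_interventions

def monitor_step(emergency_detected: bool, driver_responded: bool,
                 log: InterventionLog) -> str:
    """The driver stays in control; the car takes over only when an emergency
    is detected and the driver has failed to respond, and logs the event."""
    if emergency_detected and not driver_responded:
        log.record("automatic emergency intervention")
        return "CAR_TAKES_OVER"
    return "DRIVER_IN_CONTROL"
|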
I've seen a Tesla driver reading a book on the I-17. On YouTube you can see people taking a nap.
But good idea nevertheless... |
Keeping the hands (or one hand) on the wheel should remain "mandatory"; not every invention is a good invention!
|
Is it possible that the car's radar saw the clear space underneath the broadside trailer and didn't recognize it as a threat? Maybe the system needs to be adjusted to treat things 4'-5' off the road surface as threats also.
|
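If the speculation above is right and the radar return from the open space under the trailer was filtered out the way overhead signs and bridges are, the fix would amount to treating anything in the height band the car actually occupies as a threat. A toy sketch of that kind of height filter, with invented numbers and names, purely to illustrate the idea; this is not a description of Tesla's actual sensor processing.
Code:
# Toy illustration of the height-band idea from the post above.
# The thresholds and structure are invented for illustration only.

ROAD_CLEARANCE_FT = 1.0  # ignore returns this close to the road (debris, reflections)

def is_threat_old(obstacle_bottom_ft: float) -> bool:
    # Oversimplified "old" rule: only objects reaching down near road level count,
    # so high-riding obstacles get ignored along with bridges and signs.
    return obstacle_bottom_ft <= ROAD_CLEARANCE_FT

def is_threat_new(obstacle_bottom_ft: float, obstacle_top_ft: float,
                  vehicle_roof_ft: float = 4.7) -> bool:
    # Revised rule: anything overlapping the band the car occupies (roughly
    # road level up to roof height) is a threat, even if its underside is
    # several feet off the road, like a broadside trailer.
    return obstacle_bottom_ft < vehicle_roof_ft and obstacle_top_ft > ROAD_CLEARANCE_FT

# A trailer bed roughly 4 ft off the ground, 13 ft tall overall:
print(is_threat_old(4.0))        # False: filtered out, the failure mode described above
print(is_threat_new(4.0, 13.0))  # True: it overlaps the car's height band
|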
On a semi-related note, I was putting gas in my truck yesterday, and a Tesla Roadster pulled up to the next pump! That was a first! The driver then pulled his small gas can for his lawn mower out of the trunk. |
You don't fly an airplane on autopilot and leave the cockpit to take a nap or a dump.
Just like a car that can drive itself, an airplane can take off and land itself. But the pilots are supposed to monitor all aspects of the operation and be prepared to take over and fly the damn thing if the autopilot goes crazy. Auto flight has been around a long time, but it does foul up on occasion. The mentality of everyday car drivers is in no way prepared for anything. |
Quote:
In the history of bad ideas, automated cars are a really, really bad idea (IMHO). |
If the Tesla is using radar, then why would the color of the truck matter? If it is using an optical "radar", then the color might matter. Regardless, it's a mistake to think you are driving a car from the Jetsons. I feel so sorry for the driver and his family.
|
Here's a fun fact: seat-belted drivers take more driving risks than when not seatbelted.
Same goes for, say, how hard linebackers hit when the amount of their (safety) gear goes up or improves. IOW, it's human nature to take more risks when there is the proverbial net under the high wire than when not. This Tesla software [alleged] death was akin to putting a net under the high wire... a net made of silly string. The user thought: oh, I have technology on my side, I will happily move along knowing the net is there for me. A double-fault catastrophe. :-\ |
Quote:
Autopilot implies the car is completely in control; on the Tesla it is clear (or it's supposed to be clear) that the driver is supposed to pay attention and be ready to override what the car is doing. |
Quote:
Regardless, it is terribly sad. |
Quote:
What is problematic, IMHO, is that self-driving cars now get a bad rep because of this. Other high-line manufacturers have emergency braking assist just like Tesla; they just don't market it aggressively as "auto" anything. G |
Quote:
And that's all you need to know. Next topic. |
I would venture to say that owning a Tesla is probably out of reach for the masses.
That being said, a new Subaru with EyeSight is much more affordable. And if you want the Subaru EyeSight technology, you'd better try it out first.
My older brother down in New Orleans recently bought a new Outback with the EyeSight equipment. He found out that the "safety" of the system came with issues. For example, if someone cut in front of him on the highway, it would hit the brakes. If someone in the adjacent exit lane was slowing down, his car would slow down as well. It looks down the road quite a ways and adjusts your speed to the traffic speed.
Then the final deal breaker: the rear corner sensor unit failed. He took it to the dealership. They asked him if he had been on a certain bypass in New Orleans at a certain spot. Exactly. They then informed him that he certainly wasn't the first to have the issue in that same location. About 6 months in, he traded it for an Audi. |