Pelican Parts Forums

Pelican Parts Forums (http://forums.pelicanparts.com/)
-   Off Topic Discussions (http://forums.pelicanparts.com/off-topic-discussions/)
-   -   Tesla autopilot doesn't notice a tractor trailer, fatal crash (http://forums.pelicanparts.com/off-topic-discussions/920147-tesla-autopilot-doesnt-notice-tractor-trailer-fatal-crash.html)

jyl 06-30-2016 03:46 PM

Tesla autopilot doesn't notice a tractor trailer, fatal crash
 
Tesla’s Autopilot being investigated by the government following fatal crash | Ars Technica

Sounds like tractor trailer turned left through oncoming lanes, and Tesla autopilot drove right into it without applying brakes, car went under trailer and driver was killed. Presumably decapitated. Tesla says autopilot did not notice the white trailer against the bright sky.

I thought the Tesla has a front facing radar?

Arizona_928 06-30-2016 03:48 PM

It was a matter of time before someone killed themselves over that stupid auto pilot feature.
****er deserves it. More $$$ than brain cells.

TheMentat 06-30-2016 03:51 PM

Strange... and unfortunate. I have to imagine the driver was not paying attention?

Gogar 06-30-2016 04:00 PM

I think the whole idea of "semi-autonomous" is a really, really, really, bad road to be semi-going down.

vash 06-30-2016 04:08 PM

He was a test pilot... test driver?

That sucks, and I don't think he deserved it.

Chocaholic 06-30-2016 04:16 PM

He definitely did not deserve it. That statement is beyond reprehensible.

RIP

impactbumper 06-30-2016 04:37 PM

Quote:

Originally Posted by AZ_porschekid (Post 9181692)
It was a matter of time before someone killed themselves over that stupid auto pilot feature.
****er deserves it. More $$$ than brain cells.

Wow

rwest 06-30-2016 04:46 PM

Tragic way to find flaws in the system. Driver did not deserve to die while advancing technology.

I'm very excited about autonomous cars in hard core traffic areas, I think they might be able to knock out traffic jams.

Got to think that if the truck was autonomous too, they would have been "talking" to each other making it safer. The transition between a few autonomous vehicles and most being autonomous will be the most dangerous.

JD159 06-30-2016 05:58 PM

Quote:

Originally Posted by AZ_porschekid (Post 9181692)
It was a matter of time before someone killed themselves over that stupid auto pilot feature.
****er deserves it. More $$$ than brain cells.

Somebody is a POS.

****er deserves it.

jyl 06-30-2016 06:06 PM

I think it is weird that forward radar wouldn't detect a broadside tractor trailer.

Arizona_928 06-30-2016 06:11 PM

Guns don't kill people, automated driving cars do.

legion 06-30-2016 06:14 PM

As a computer programmer, automatic cars scare the crap out of me. They hit a situation they don't understand, and this is what happens, every time the situation is encountered. I'm not comfortable with the idea that I may be the first one to hit a situation the developers didn't think about.

aigel 06-30-2016 06:22 PM

Reads like the color of the truck and the lighting account for the human error. Why the Tesla's radar didn't pick it up is the real question, at least a few meters before impact. Was the truck too tall?

jyl 06-30-2016 06:58 PM

The human not noticing is easy to explain. He may have been dozing, reading the paper, or surfing the web.

The car not noticing is weird. I can see the autopilot being programmed to disregard bridges and overpasses, but the truck was low enough to take off the top of the car, so it wasn't that high off the ground, and obviously its wheels would have been at ground level.

I don't think this should be an unanticipated situation... a large vehicle turning in front of the car seems pretty foreseeable.

Seems there have been many reports of the Tesla autopilot doing scary things, entering curves way too fast, etc. I don't think the autopilot can possibly have been thoroughly tested in autonomous mode; look at how long Google has been testing its self-driving cars. Has Tesla been reckless in enabling and advertising Autopilot?

rusnak 06-30-2016 07:11 PM

Quote:

Originally Posted by rwest (Post 9181762)
Tragic way to find flaws in the system. Driver did not deserve to die while advancing technology.

I'm very excited about autonomous cars in hard core traffic areas, I think they might be able to knock out traffic jams.

Got to think that if the truck was autonomous too, they would have been "talking" to each other making it safer. The transition between a few autonomous vehicles and most being autonomous will be the most dangerous.

You do realize that the guy who died probably thought like you do. And yet, somehow, you seem not to notice, other than to call it a tragic way to find flaws in the system.

Really, man? You guys are nuts.

wdfifteen 06-30-2016 09:35 PM

What does Tesla claim this system can do? Does it assist the driver or actually drive? I would welcome a backup system to a human driver, but there is no way I'm going to sit in a car and ignore the road while it drives. Too many variables out there.

jyl 06-30-2016 09:40 PM

https://www.teslamotors.com/presskit/autopilot

dentist90 06-30-2016 11:03 PM

This is going to be a new and lucrative field for liability and personal injury lawyers. A car manufacturer simply cannot advise you to trust their autopilot without backing it up (financially). The mix of autonomous cars and human controlled ones will be a litigator's wet dream. Hell, even automated rail transit screws up occasionally, and they are on tracks going in one direction!

Tervuren 06-30-2016 11:09 PM

Sounds like the error may lie heaviest on the truck that cut across multiple lanes of traffic.

dewolf 06-30-2016 11:38 PM

Cars have a steering wheel and foot pedals for a reason.

dentist90 06-30-2016 11:52 PM

Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand, just like flying. But in both those examples it's out of our control, so we'd like to think we would do better than a computer. Sometimes yes, most times no.

edit: 156,495,106 since 5 mins ago

dad911 07-01-2016 12:21 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand, just like flying. But in both those examples it's out of our control, so we'd like to think we would do better than a computer. Sometimes yes, most times no.

edit: 156,495,106 since 5 mins ago

Nope. You would need to divide those totals by miles driven for them to be meaningful.

1.08 deaths per 100 million miles driven (2014): http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/overview-of-fatality-facts

Tesla: about 100 million miles driven by customers, now 1 fatality. Tesla reveals new details of its Autopilot program: 780M miles of data, 100M miles driven and more | Electrek

So statistically, about the same with automated navigation. Unless you consider most of us above average ;)

Edit: remove drunk drivers and/or other human-preventable causes of accidents, and human drivers would be statistically safer.
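dad911's normalization can be sketched in a few lines of Python. The 1.08 figure is the IIHS 2014 rate quoted above, and the 100 million miles is the customer-mileage estimate from the linked Electrek article; this is a rough illustration, not a rigorous statistic:

```python
# Normalize fatality counts to deaths per 100 million miles driven,
# using the rough figures quoted in this thread (IIHS 2014 rate;
# Electrek's ~100M customer miles for Tesla). Illustrative only.

def deaths_per_100m_miles(deaths, miles):
    """Fatality rate normalized to 100 million miles driven."""
    return deaths / (miles / 100_000_000)

us_rate = 1.08                                       # IIHS, all US driving, 2014
tesla_rate = deaths_per_100m_miles(1, 100_000_000)   # 1 fatality in ~100M miles

print(f"US average: {us_rate:.2f} deaths per 100M miles")
print(f"Tesla:      {tesla_rate:.2f} deaths per 100M miles")
```

With these inputs the two rates come out nearly identical, which is the point: the raw crash tallies mean nothing until divided by exposure.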

petrolhead611 07-01-2016 01:10 AM

To prevent future drivers dozing off, the system should be designed so that they remain in control; the car should only take over if it senses an emergency that the driver has failed to respond to. Each instance of system intervention should be logged and the data recorded against the driver's license. Too many interventions and bye-bye license.

Arizona_928 07-01-2016 03:24 AM

I've seen a Tesla driver reading a book on I-17. On YouTube you can see people taking a nap.
But good idea nevertheless...

Holger 07-01-2016 03:40 AM

Keeping the hands (or one hand) on the wheel should remain "mandatory", not every invention is a good invention!

ckelly78z 07-01-2016 05:12 AM

Is it possible that the car's radar saw the clear space underneath the broadside trailer and didn't recognize it as a threat? Maybe the system needs to be adjusted to scan for things 4'-5' off the road surface as threats, too.
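ckelly78z's geometry question can be sketched numerically. Every dimension below (sensor height, beamwidth, trailer clearance, range) is invented for illustration, not a Tesla specification; the point is only that a narrow, bumper-mounted radar fan could plausibly pass entirely under a broadside trailer's box:

```python
import math

# Hypothetical check: does a bumper-mounted radar's vertical fan clear the
# underside of a broadside trailer? All numbers are made up for illustration.

radar_height_ft = 1.5     # sensor mounted low in the bumper
half_beam_deg = 0.5       # half of the vertical beamwidth
trailer_floor_ft = 4.0    # bottom edge of the trailer box
distance_ft = 150.0       # range at which braking would need to begin

# Height of the upper edge of the beam at that distance
beam_top_ft = radar_height_ft + distance_ft * math.tan(math.radians(half_beam_deg))

if beam_top_ft < trailer_floor_ft:
    print("Beam passes under the trailer floor: no radar return from the box")
else:
    print("Beam intersects the trailer: the radar should see it")
```

With these numbers the beam tops out around 2.8 ft at 150 ft, below the 4 ft trailer floor, which is consistent with the "saw clear space underneath" guess; the wheels would still reflect, but only over a narrow slice of the scene.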

Dantilla 07-01-2016 05:41 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand,

Quote:

Originally Posted by dad911 (Post 9182043)
Nope, You would need to distribute/divide those (total)numbers by miles driven to be meaningful.

1.08 death per 100 million miles driven (2014)

Tesla - about 100 million miles driven by customers, now 1 fatality,

So statistically, about the same...

But how many of those Tesla-driven miles were using autopilot? A very small fraction, I would guess.
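The objection above can be made concrete: if only a fraction f of Tesla's ~100 million customer miles were actually driven on Autopilot, the Autopilot-specific fatality rate scales as 1/f. The fractions below are placeholders, not reported figures:

```python
# Scale the fatality rate by the (unknown) share of miles on Autopilot.
# TOTAL_MILES is the ~100M customer-mile figure from the thread; the
# fractions are hypothetical placeholders.

TOTAL_MILES = 100_000_000
FATALITIES = 1

for f in (1.0, 0.5, 0.25, 0.10):
    autopilot_miles = TOTAL_MILES * f
    rate = FATALITIES / (autopilot_miles / 100_000_000)
    print(f"Autopilot share {f:.0%}: {rate:.1f} deaths per 100M Autopilot miles")
```

At a 10% share, the implied rate would be roughly ten times the 1.08 human baseline, so the comparison hinges entirely on that unknown fraction.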

On a semi-related note, I was putting gas in my truck yesterday, and a Tesla Roadster pulled up to the next pump! That was a first!
The driver then pulled his small gas can for his lawn mower out of the trunk.

pksystems 07-01-2016 06:17 AM

Quote:

Originally Posted by Dantilla (Post 9182193)
On a semi-related note, I was putting gas in my truck yesterday, and a Tesla Roadster pulled up to the next pump! That was a first!
The driver then pulled his small gas can for his lawn mower out of the trunk.

Does your government subsidize electric lawn mowers? (They don't where I live; they also don't subsidize electric cars.) That may be why he's still using a gas-powered one. :)

FLYGEEZER 07-01-2016 06:28 AM

You don't fly an airplane on autopilot and leave the cockpit to take a nap or a dump.
Just like a car that can drive itself, an airplane can take off and land itself. But the pilots are supposed to monitor all aspects of the operation and be prepared to take over and fly the damn thing if the autopilot goes crazy. Auto flight has been around a long time, but it does foul up on occasion. The mentality of everyday car drivers is in no way prepared for anything like that.

scottmandue 07-01-2016 06:47 AM

Quote:

Originally Posted by legion (Post 9181854)
As a computer programmer, automatic cars scare the crap out of me. They hit a situation they don't understand, and this is what happens, every time the situation is encountered. I'm not comfortable with the idea that I may be the first one to hit a situation the developers didn't think about.

I'm not a programmer, just a lowly service tech (sort of), but yeah, if computers were infallible I wouldn't have a job (been doing this for about 20 years and no sign of my job going away).

In the history of bad ideas, automated cars are a really, really bad idea (IMHO).

1990C4S 07-01-2016 07:22 AM

Quote:

"Tesla noted that when a driver activated the system, an acknowledgment box popped up, explaining that the autopilot mode “is an assist feature that requires you to keep your hands on the steering wheel at all times.”
It is 'Driver Assist', not 'Auto-pilot'.

FLYGEEZER 07-01-2016 07:39 AM

Quote:

Originally Posted by 1990C4S (Post 9182336)
it is 'Driver Assist' not 'Auto-pilot'.

That's a play on words... "that depends on what the meaning of is...is"

PetrolBlueSC 07-01-2016 08:15 AM

If the Tesla is using radar, then why would the color of the truck matter? If it is using an optical system, then the color might matter. Regardless, it's a mistake to think you are driving a car from the Jetsons. I feel so sorry for the driver and his family.

island911 07-01-2016 08:27 AM

Here's a fun fact: seat-belted drivers take more driving risks than unbelted ones.

Same goes for how hard linebackers hit as their safety gear improves.

IOW, it's human nature to take more risks when there is a proverbial net under the high wire than when there isn't.

This [alleged] Tesla software death was akin to putting a net under the high wire... a net made of silly string. The user thought: oh, I have technology on my side. I will happily move along, knowing the net is there for me.

A double-fault catastrophe. :-\

1990C4S 07-01-2016 08:34 AM

Quote:

Originally Posted by FLYGEEZER (Post 9182356)
Thats a play on words... "that depends on what the meaning of is..is"

No, they are two very different modes of operation.

Auto-pilot implies the car is completely in control; on the Tesla it is clear (or it's supposed to be clear) that the driver is supposed to pay attention and be ready to override what the car is doing.

95avblm3 07-01-2016 08:45 AM

Quote:

Originally Posted by jyl (Post 9181910)
The human not noticing, is easy. He may have been dozing, reading the paper, surfing the web.

The news report mentioned the truck driver saying he heard a Harry Potter movie playing when he got out to go check on the Tesla driver. Not sure if I believe that because I thought NHTSA didn't allow videos to be played in a vehicle within the field of view of the driver.

Regardless, it is terribly sad.

aigel 07-01-2016 08:56 AM

Quote:

Originally Posted by 1990C4S (Post 9182444)
No, they are two very different modes of operation.

Auto-pilot implies the car is completely in control; on the Tesla it is clear (or it's supposed to be clear) that the driver is supposed to pay attention and be ready to override what the car is doing.

This is key. Calling it Autopilot may have been smart marketing, but driver assist would be the correct term, so people understand that the car adds a layer of assistance to the driver, not vice versa.

What is problematic, IMHO, is that now self-driving cars get a bad rep because of this. Other high-line manufacturers have emergency braking assist as well as Tesla; they just don't market it aggressively as "auto" anything.

G

BE911SC 07-01-2016 09:19 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1


And that's all you need to know.

Next topic.

onewhippedpuppy 07-01-2016 09:41 AM

Quote:

Originally Posted by legion (Post 9181854)
As a computer programmer, automatic cars scare the crap out of me. They hit a situation they don't understand, and this is what happens, every time the situation is encountered. I'm not comfortable with the idea that I may be the first one to hit a situation the developers didn't think about.

Quote:

Originally Posted by dentist90 (Post 9182025)
This is going to be a new and lucrative field for liability and personal injury lawyers. A car manufacturer simply cannot advise you to trust their autopilot without backing it up (financially). The mix of autonomous cars and human controlled ones will be a litigator's wet dream. Hell, even automated rail transit screws up occasionally, and they are on tracks going in one direction!

Both of these nail it. It's impossible to program a computer to anticipate and react to every variable that might ever present itself in every situation without some variety of AI. So at that point, every accident is the responsibility of the automaker. I think liability, not technology, will ultimately be the limiting factor for automated cars.

SCadaddle 07-01-2016 09:43 AM

I would venture to say that owning a Tesla is probably out of reach for the masses.

That being said, a new Subaru with "Eyesight" is much more affordable.

And if you want the Subaru "Eyesight" technology, you'd better try it out first. My older brother down in New Orleans recently bought a new Outback with the "Eyesight" equipment, and he found out that the "safety" of the system came with issues. For example, if someone cut in front of him on the highway, it would hit the brakes. If someone in the adjacent exit lane was slowing down, his car would slow down as well. It looks down the road quite a ways and adjusts your speed to the traffic speed.

Then the final deal breaker: the rear corner "sensor" unit failed. He took it to the dealership, and they asked him if he had been on a certain bypass in New Orleans at a certain spot. Exactly. They then informed him that he certainly wasn't the first to have the issue in that same location. About 6 months in, he traded it for an Audi.

