Pelican Parts Forums

Pelican Parts Forums (http://forums.pelicanparts.com/)
-   Off Topic Discussions (http://forums.pelicanparts.com/off-topic-discussions/)
-   -   Tesla autopilot doesn't notice a tractor trailer, fatal crash (http://forums.pelicanparts.com/off-topic-discussions/920147-tesla-autopilot-doesnt-notice-tractor-trailer-fatal-crash.html)

dentist90 06-30-2016 11:52 PM

Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand, just like flying. But in both those examples it's out of our control, so we'd like to think we would do better than a computer. Sometimes yes, most times no.

edit: 156,495,106 since 5 mins ago

dad911 07-01-2016 12:21 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand, just like flying. But in both those examples it's out of our control, so we'd like to think we would do better than a computer. Sometimes yes, most times no.

edit: 156,495,106 since 5 mins ago

Nope. You would need to divide those (total) numbers by miles driven to be meaningful.

1.08 deaths per 100 million miles driven (2014): http://www.iihs.org/iihs/topics/t/general-statistics/fatalityfacts/overview-of-fatality-facts

Tesla - about 100 million miles driven by customers, now 1 fatality, Tesla reveals new details of its Autopilot program: 780M miles of data, 100M miles driven and more | Electrek

So statistically, about the same with automated navigation. Unless you consider most of us above average ;)

Edit: remove drunk drivers and/or other human "preventable" causes of accidents, and human drivers would be statistically safer.
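dad911's normalization is simple enough to sketch in a few lines. This is a back-of-the-envelope check using only the figures quoted in the post (the IIHS 2014 rate and Tesla's reported ~100M customer miles); neither number is independently verified here.

```python
# Normalize raw fatality counts to a per-100-million-mile rate so the
# US fleet and Tesla figures are comparable, per dad911's point.

def fatalities_per_100m_miles(deaths, miles_driven):
    """Deaths normalized to a per-100-million-mile rate."""
    return deaths / (miles_driven / 100_000_000)

# U.S. average, 2014 (IIHS figure quoted above): 1.08 per 100M miles.
us_rate = 1.08

# Tesla: ~1 fatality over ~100M customer-driven miles (Electrek figure above).
tesla_rate = fatalities_per_100m_miles(deaths=1, miles_driven=100_000_000)

print(f"US average: {us_rate:.2f} deaths per 100M miles")
print(f"Tesla:      {tesla_rate:.2f} deaths per 100M miles")
```

Both rates come out near 1 per 100 million miles, which is why the post concludes "about the same."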

petrolhead611 07-01-2016 01:10 AM

To prevent future drivers dozing off, the system should be designed so that they remain in control: the car should take over only if it senses an emergency that the driver has failed to respond to, and each instance of system intervention should be logged and the data recorded against the driver's license. Too many interventions and bye-bye license.

Arizona_928 07-01-2016 03:24 AM

I've seen a Tesla driver reading a book on I-17, and on YouTube you can see people taking naps.
But good idea nevertheless...

Holger 07-01-2016 03:40 AM

Keeping the hands (or one hand) on the wheel should remain "mandatory", not every invention is a good invention!

ckelly78z 07-01-2016 05:12 AM

Is it possible that the car's radar saw the clear space underneath the broadside trailer and didn't recognize it as a threat? Maybe the system needs to be adjusted to scan for things 4'-5' off the road surface as threats as well.

Dantilla 07-01-2016 05:41 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

It is easy to be alarmist, but statistically your chances are way better with automated navigation than by your own hand,

Quote:

Originally Posted by dad911 (Post 9182043)
Nope, You would need to distribute/divide those (total)numbers by miles driven to be meaningful.

1.08 death per 100 million miles driven (2014)

Tesla - about 100 million miles driven by customers, now 1 fatality,

So statistically, about the same...

But how many of those Tesla-driven miles were using autopilot? A very small fraction, I would guess.

On a semi-related note, I was putting gas in my truck yesterday, and a Tesla Roadster pulled up to the next pump! That was a first!
The driver then pulled his small gas can for his lawn mower out of the trunk.
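Dantilla's objection about the fraction of miles driven on Autopilot can be made concrete with a quick sensitivity check. This sketch reuses the thread's ~100M-mile figure; the Autopilot fractions below are illustrative guesses, not reported data.

```python
# If only a fraction of Tesla's ~100M customer miles were on Autopilot,
# the per-mile Autopilot fatality rate scales up inversely with that fraction.

def rate_per_100m(deaths, miles):
    """Deaths normalized to a per-100-million-mile rate."""
    return deaths / (miles / 100_000_000)

TOTAL_MILES = 100_000_000  # ~100M customer miles, per the earlier post

for frac in (1.0, 0.5, 0.25, 0.1):
    rate = rate_per_100m(1, TOTAL_MILES * frac)
    print(f"{frac:4.0%} on Autopilot -> {rate:.1f} deaths per 100M Autopilot miles")
```

At a 10% Autopilot share, the implied rate would be roughly ten times the US average, which is the crux of the objection.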

pksystems 07-01-2016 06:17 AM

Quote:

Originally Posted by Dantilla (Post 9182193)
On a semi-related note, I was putting gas in my truck yesterday, and a Tesla Roadster pulled up to the next pump! That was a first!
The driver then pulled his small gas can for his lawn mower out of the trunk.

Does your government subsidize electric lawn mowers? (They don't where I live, they also don't subsidize electric cars) That may be why he's still using a gas powered one. :)

FLYGEEZER 07-01-2016 06:28 AM

You don't fly an airplane on autopilot and leave the cockpit to take a nap or a dump.
Just like a car that can drive itself, an airplane can take off and land itself. But the pilot(s) are supposed to monitor all aspects of the operation and be prepared to take over and fly the damn thing if the autopilot goes crazy. Auto flight has been around a long time, but it does foul up on occasion. The mentality of everyday car drivers is in no way prepared for anything.

scottmandue 07-01-2016 06:47 AM

Quote:

Originally Posted by legion (Post 9181854)
As a computer programmer, automatic cars scare the crap out of me. They hit a situation they don't understand, and this is what happens, every time the situation is encountered. I'm not comfortable with the idea that I may be the first one to hit a situation the developers didn't think about.

I'm not a programmer, just a lowly service tech (sort of), but yeah, if computers were infallible I wouldn't have a job. (I've been doing this for about 20 years and there's no sign of my job going away.)

In the history of bad ideas, automated cars are a really, really bad idea (IMHO).

1990C4S 07-01-2016 07:22 AM

Quote:

"Tesla noted that when a driver activated the system, an acknowledgment box popped up, explaining that the autopilot mode “is an assist feature that requires you to keep your hands on the steering wheel at all times.”
It is 'Driver Assist', not 'Auto-pilot'.

FLYGEEZER 07-01-2016 07:39 AM

Quote:

Originally Posted by 1990C4S (Post 9182336)
it is 'Driver Assist' not 'Auto-pilot'.

That's a play on words... "that depends on what the meaning of is..is"

PetrolBlueSC 07-01-2016 08:15 AM

If the Tesla is using radar, then why would the color of the truck matter? If the Tesla is using an optical sensor, then the color might matter. Regardless, it's a mistake to think you are driving a car from the Jetsons. I feel so sorry for the driver and his family.

island911 07-01-2016 08:27 AM

Here's a fun fact: seat-belted drivers take more driving risks than unbelted drivers.

Same goes for, say, how hard linebackers hit as the amount of their (safety) gear increases/improves.

IOW, it's human nature to take more risks when there is a proverbial net under the high-wire than when there isn't.

This [alleged] Tesla software death was akin to putting a net under the high-wire... a net made of silly string. The user thought: I have technology on my side; I will happily move along, knowing the net is there for me.

A double-fault catastrophe. :-\

1990C4S 07-01-2016 08:34 AM

Quote:

Originally Posted by FLYGEEZER (Post 9182356)
That's a play on words... "that depends on what the meaning of is..is"

No, they are two very different modes of operation.

Auto-pilot implies the car is completely in control; on the Tesla it is clear (or it's supposed to be clear) that the driver is supposed to pay attention and be ready to override what the car is doing.

95avblm3 07-01-2016 08:45 AM

Quote:

Originally Posted by jyl (Post 9181910)
The human not noticing is easy. He may have been dozing, reading the paper, surfing the web.

The news report mentioned the truck driver saying he heard a Harry Potter movie playing when he got out to check on the Tesla driver. I'm not sure I believe that, because I thought NHTSA didn't allow videos to be played within the driver's field of view.

Regardless, it is terribly sad.

aigel 07-01-2016 08:56 AM

Quote:

Originally Posted by 1990C4S (Post 9182444)
No, they are two very different modes of operation.

Auto-pilot implies the car is completely in control; on the Tesla it is clear (or it's supposed to be clear) that the driver is supposed to pay attention and be ready to override what the car is doing.

This is key. Calling it autopilot may have been smart marketing, but "driver assist" would be the correct term, so people understand the car adds a layer of assistance to the driver, not vice versa.

What is problematic, IMHO, is that now self-driving cars get a bad rep because of this. Other high-line manufacturers have emergency braking assist as well as Tesla; they just don't market it aggressively as "auto" anything.

G

BE911SC 07-01-2016 09:19 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1


And that's all you need to know.

Next topic.

onewhippedpuppy 07-01-2016 09:41 AM

Quote:

Originally Posted by legion (Post 9181854)
As a computer programmer, automatic cars scare the crap out of me. They hit a situation they don't understand, and this is what happens, every time the situation is encountered. I'm not comfortable with the idea that I may be the first one to hit a situation the developers didn't think about.

Quote:

Originally Posted by dentist90 (Post 9182025)
This is going to be a new and lucrative field for liability and personal injury lawyers. A car manufacturer simply cannot advise you to trust their autopilot without backing it up (financially). The mix of autonomous cars and human controlled ones will be a litigator's wet dream. Hell, even automated rail transit screws up occasionally, and they are on tracks going in one direction!

Both of these nail it. It's impossible to program a computer to anticipate and react to every variable that might ever present itself in every situation without some variety of genuine AI. So at that point, every accident is the responsibility of the automaker. I think liability, not technology, will ultimately be the limiting factor for automated cars.

SCadaddle 07-01-2016 09:43 AM

I would venture to say that owning a Tesla is probably out of reach for the masses.

That being said, a new Subaru with "Eyesight" is much more affordable.

And if you want the Subaru "Eyesight" technology, you'd better try it out first. My older brother down in New Orleans recently bought a new Outback with the "Eyesight" equipment. He found out that the "safety" of the system came with issues. For example, if someone cut in front of him on the highway, it would hit the brakes. If someone in the exit lane adjacent to him was slowing down, his car would slow down as well. It looks down the road quite a ways and adjusts your speed to the traffic speed.

Then the final deal breaker: the rear corner "sensor" unit failed. He took it to the dealership. They asked him if he had been on a certain bypass in New Orleans at a certain spot. Exactly. They then informed him that he certainly wasn't the first to have the issue in that same location. About 6 months in, he traded it for an Audi.

