Pelican Parts Forums

Pelican Parts Forums (http://forums.pelicanparts.com/)
-   Off Topic Discussions (http://forums.pelicanparts.com/off-topic-discussions/)
-   -   Tesla autopilot doesn't notice a tractor trailer, fatal crash (http://forums.pelicanparts.com/off-topic-discussions/920147-tesla-autopilot-doesnt-notice-tractor-trailer-fatal-crash.html)

cairns 07-03-2016 09:26 AM

Quote:

Any "autopilot" is not a "close my eyes and go to sleep and I will be alright" replacement for a driver/pilot. How can anyone think it is?
Easy. They believed Elon Musk. The guy who died wasn't necessarily stupid. But he believed Elon and his marketing claims. And he paid the price for that.

Contrary to what Elon says, his "Autopilot" is not better than the average driver. I don't know any average driver who would deliberately drive into a truck.

1990C4S 07-03-2016 09:29 AM

Quote:

Originally Posted by 95avblm3 (Post 9182464)
The news report mentioned the truck driver saying he heard a Harry Potter movie playing when he got out to go check on the Tesla driver. Not sure if I believe that because I thought NHTSA didn't allow videos to be played in a vehicle within the field of view of the driver.

Regardless, it is terribly sad.

It appears he may have had a portable DVD player on board...

KNS 07-03-2016 11:57 AM

Quote:

Originally Posted by ossiblue (Post 9184560)
drivers ignored all the specified warnings and instructions about being ready to take immediate control of the vehicle. They began diverting their attention from the direction of travel and even taking their hands off the wheel.

I can see it now - People will complain that they have to pay attention while in the car: "I have to look outside? - That's not why I bought this car..."

onewhippedpuppy 07-03-2016 12:15 PM

Quote:

Originally Posted by ossiblue (Post 9184560)
Interesting article in the Los Angeles Times this morning regarding the different ways Tesla and Google are approaching the driverless car. In the article, a Google spokesman mentioned that during their testing they noticed that within a few minutes of being in the car, drivers ignored all the specified warnings and instructions about being ready to take immediate control of the vehicle. They began diverting their attention from the direction of travel and even taking their hands off the wheel.

I am in the camp that believes this death is the result of a failure by both parties--the technology and the driver. We don't know the details yet, but if the driver was following the protocol Tesla outlines for its "autopilot" mode, it's going to be hard to argue he would not have noticed a semi turning in front of him. Whether Tesla should have made the system inoperable unless the driver actually followed those protocols is a question for the lawyers.

Assuming that this makes it to court, I suspect the argument will be that Musk's hyperbole contradicted the fine print in the owner's manual and lulled the owner into a false sense of security. Or something like that.

wdfifteen 07-03-2016 12:27 PM

Quote:

Originally Posted by KNS (Post 9184696)
I can see it now - People will complain that they have to pay attention while in the car: "I have to look outside? - That's not why I bought this car..."

Exactly. The driver of this car was playing a Harry Potter movie. If the car had been delivered without a windshield, steering wheel, and brake pedal, Tesla might be culpable.

ossiblue 07-03-2016 12:31 PM

Quote:

Originally Posted by onewhippedpuppy (Post 9184713)
Assuming that this makes it to court, I suspect the argument will be that Musk's hyperbole contradicted the fine print in the owner's manual and lulled the owner into a false sense of security. Or something like that.

Pure speculation on all of this, of course, but to counter that argument, Tesla has ample documentation from the deceased's own videos that he was well aware of the shortcomings of the system and knew it was not completely safe without driver intervention. To say that this particular driver was lulled into a false sense of security would be difficult, IMO, to establish.

Take a few minutes and read Tesla's manual for the autopilot system (it's online); there are plenty of warnings about the system not always "seeing" objects in the road, mistaking fixed objects for cars, etc. Anyone who actually reads the manual and all the warnings would never think that driving without immediate attention to the surroundings and hands on the wheel is safe or acceptable. Not defending Tesla here, but they never claim, or even imply, that the car is completely autonomous.

KNS 07-03-2016 02:52 PM

Quote:

Originally Posted by ossiblue (Post 9184560)
Anyone who actually reads the manual and all the warnings would never think that driving without immediate attention to the surroundings and hands on the wheel is safe or acceptable. Not defending Tesla here, but they never claim, or even imply, that the car is completely autonomous.

Unfortunately, people stopped reading the owner's manual years ago (I still do).

rusnak 07-03-2016 04:05 PM

Quote:

Originally Posted by 1990C4S (Post 9184574)
It appears he may have had a portable DVD player on board...

Yeeeesh. This is where I use a voice like the dad on "That 70s Show" and call the dead guy a dumbass.

jyl 07-03-2016 06:52 PM

I used to be a product liability lawyer, defending auto companies. In a former life.

Warnings are of only limited help to the defense, especially if the manufacturer knows that users routinely disregard them. Which Tesla certainly knew: Musk's wife posted a video of herself driving on Autopilot with her hands off the wheel (on Instagram, since deleted), and Musk tweeted about the hands-off driving shown in users' YouTube videos. He said things like the car could drive from San Diego to San Francisco with the driver hardly having to touch the wheel.

I'm sure there is plenty of research showing that humans ignore warnings about staying ready to take control, similar to what Google found, and Tesla employs experts in human interface design who knew it (if it doesn't employ such experts, that's even worse). A search of Tesla's internal emails may well turn up discussion of this by executives and engineers.

This would be a delicious case to have on the plaintiff side, and a difficult case to have on the defense side.

jyl 07-16-2016 12:21 AM

Boy, Tesla is really taking a legalistic, disingenuous approach to Autopilot accidents.

On July 1, a Tesla went off a highway in Pennsylvania and hit a guardrail. The driver told police the car was on Autopilot. Tesla denied this, saying the car's logs proved it was not on Autopilot. Now Tesla has said what it meant was: the car was on Autopilot, but Autopilot TURNED ITSELF OFF, leaving the car driving unguided, and 40 seconds later the car went off the road and crashed.

Musk tweeted:
"Onboard vehicle logs show Autopilot was turned off in Pennsylvania crash. Moreover, crash would not have occurred if it was on."

Details here:
http://jalopnik.com/musk-autopilot-was-off-in-pa-tesla-model-x-crash-acco-1783695454

I think what likely happened is this: the car was driving along the (pretty straight?) freeway on Autopilot, the driver dozed off or zoned out and didn't touch the steering wheel for a while, the car started telling the driver to take the wheel, the alerts finally woke the driver and snapped him out of his reverie, he grabbed the wheel and jerked it, and the car went off the road, Autopilot having turned itself off.

Which is worse: an Autopilot that can't keep the car on the road, or an Autopilot that suddenly turns itself off and forces a groggy, startled human to "take manual control"?

Yeah, the driver should have remained alert for hours, always ready to instantly resume manual control . . . but untrained people just don't, and can't, do that. Google figured this out as it developed self-driving cars, which is why Google is no longer developing "semi-autonomous" cars. Tesla is digging itself deeper by continuing to run its beta Autopilot experiment on public roads.
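
To make that handoff sequence concrete, here is a rough sketch of the kind of hands-off escalation logic I'm describing. To be clear, this is purely hypothetical on my part; Tesla's actual implementation, states, thresholds, and timings are not public, and every number below is invented for illustration.

Code:

# Hypothetical sketch of a hands-off escalation sequence. NOT Tesla's
# code; the states and thresholds are made up for illustration only.

from enum import Enum, auto

class AlertLevel(Enum):
    NONE = auto()             # driver recently touched the wheel
    VISUAL_WARNING = auto()   # dash message: "Hold the steering wheel"
    AUDIBLE_ALARM = auto()    # escalating chimes
    DISENGAGED = auto()       # system gives up and hands control back

def escalation(hands_off_seconds: float) -> AlertLevel:
    """Map continuous hands-off time to an alert level (made-up numbers)."""
    if hands_off_seconds < 15:
        return AlertLevel.NONE
    if hands_off_seconds < 30:
        return AlertLevel.VISUAL_WARNING
    if hands_off_seconds < 60:
        return AlertLevel.AUDIBLE_ALARM
    # The step I'm questioning: the groggy driver suddenly owns the car.
    return AlertLevel.DISENGAGED

for t in (5, 20, 45, 75):
    print(f"{t:3d}s hands-off -> {escalation(t).name}")

However the real thresholds are tuned, that last transition is the problem: it converts a lane-keeping failure into an unprepared human takeover at highway speed.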

sc_rufctr 07-16-2016 12:27 AM

Quote:

Originally Posted by jyl (Post 9201064)
Boy, Tesla is really taking a legalistic, disingenuous approach to Autopilot accidents.

On July 1, a Tesla went off a highway in Pennsylvania and hit a guardrail. The driver told police the car was on Autopilot. Tesla denied this, saying the car's logs proved it was not on Autopilot. Now Tesla has said what it meant was: the car was on Autopilot, but Autopilot TURNED ITSELF OFF, leaving the car driving unguided, and 40 seconds later the car went off the road and crashed.

http://forums.pelicanparts.com/uploa...1468657626.jpg

Seahawk 07-16-2016 04:59 AM

Quote:

Originally Posted by jyl (Post 9185054)
I used to be a product liability lawyer, defending auto companies. In a former life.

This would be a delicious case to have on the plaintiff side, and a difficult case to have on the defense side.

Quote:

Originally Posted by jyl (Post 9201064)
Boy, Tesla is really taking a legalistic, disingenuous approach to Autopilot accidents.

Tesla is digging itself deeper by continuing to run its beta Autopilot experiment on public roads.

Those were very informative posts. Thanks.

As a comparison, in Navy flight test we divide test into two separate and distinct parts: Developmental Flight Test (DT) and Operational Flight Test (OT).

DT is a very controlled, measured build-up with instrumented aircraft and sensors, lots of data collection and precise flight on instrumented ranges.

OT is a less controlled (but heavy on data collection) series of tests that takes the aircraft into its intended operational environment. We work hard at trying to discover both the human interface issues and the engineering issues before the system hits the fleet.

The "auto-pilot" turning itself off would be a complete "unsuitable" in both DT and OT. It would have to be fixed and strenuously tested.

The other, more insidious problem is the "driver" transition period from inattention to resuming control of the vehicle. I would really like to see how that is going to be dealt with, and where they get the math on human performance in that particular regime.
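
For a rough sense of scale (my back-of-the-envelope numbers, not anyone's published data): at 70 mph a car covers about 31 meters per second, so even a quick 3-second transition from inattention back to full control means roughly 90 meters traveled with nobody effectively flying the vehicle, so to speak.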

jyl 07-23-2016 10:34 PM

Tesla working on changes to Autopilot driver-assistance software

Tesla (Musk) is acknowledging, I think, that the car's radar detected the tractor trailer but that the car was programmed to ignore overhead objects, and so it ignored the trailer.
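
To illustrate what that programming decision might look like (purely a sketch on my part; Tesla's actual signal processing is not public and is surely far more involved), the failure mode is a filter that discards returns classified as overhead structure and, as a side effect, discards a high-riding trailer:

Code:

# Hypothetical illustration of the reported failure mode: radar returns
# classified as "overhead" (bridges, signs) are ignored for braking, and
# a trailer's side, high off the road with open air beneath it, gets
# binned the same way. NOT Tesla's code; threshold and fields invented.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float       # distance to the detected object
    elevation_m: float   # estimated height of the return above the road

OVERHEAD_CUTOFF_M = 1.2  # made-up cutoff: higher returns = "overpass/sign"

def is_braking_target(r: RadarReturn) -> bool:
    """Treat only low returns as traffic; ignore 'overhead' structure."""
    return r.elevation_m < OVERHEAD_CUTOFF_M

trailer_side = RadarReturn(range_m=60.0, elevation_m=1.4)  # open air below
sedan_ahead = RadarReturn(range_m=60.0, elevation_m=0.6)

print(is_braking_target(trailer_side))  # False -> no braking, per this theory
print(is_braking_target(sedan_ahead))   # True

However the real pipeline works, the gist Musk described is the same: the sensor saw something, and the software classified it as ignorable.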

I remain astonished that Tesla's lawyers are permitting the company's CEO to tweet about a fatal accident involving Autopilot, much less to tweet details about why the Autopilot didn't work and how the software could have been designed to prevent the accident.

Florida, like most states, imposes strict liability for design defects. "Under Florida law of strict product liability, a defendant is strictly liable for a plaintiff's injury if the product is in a condition that is unreasonably dangerous. A product is unreasonably dangerous if the product fails to perform as safely as an ordinary consumer would expect when used as intended or in a manner reasonably foreseeable by the manufacturer or if the risk of danger in the design of the product outweighs the benefits."

What are the chances that a Florida jury will think that an "ordinary consumer" would expect a $100,000 car, with an "Autopilot" that is advertised as safer than the average driver, to drive at full speed into a tractor trailer that was detected by the car's radar?

Won 07-24-2016 05:54 AM

Quote:

Originally Posted by jyl (Post 9201064)
Which is worse: an Autopilot that can't keep the car on the road, or an Autopilot that suddenly turns itself off and forces a groggy, startled human to "take manual control"?

A friend of mine does machine learning and visual processing work on autonomous driving for German OEMs. According to him, the most difficult part right now is figuring out how to keep the human engaged, or at least to get them to refocus effectively on driving should the need arise - exactly as you say.

Like the old IT saying - Problem Exists Between Chair And Keyboard.

intakexhaust 07-24-2016 08:13 AM

What the heck are these companies thinking? I can see them focusing on automatic steering and braking ONLY to avoid a collision, but not on autonomous point-A-to-point-B driving. Nutz.

This is a world of people who become complacent about nearly everything. Just look at the zombie pedestrians glued to their phones; and now the future might put them behind the wheel of an auto-steer vehicle?

Wait until the day a parent manually steering a van full of family gets crashed into and burned to death by some failed autonomous Tesla.

jyl 07-24-2016 08:50 AM

When I represented a domestic auto OEM in product liability lawsuits, plaintiffs' lawyers often sought to force senior management, up to and including the CEO, to give lengthy videotaped depositions. Our client fought these attempts, as you would expect, since they were simply efforts to harass the executives and burden the company. We usually succeeded in preventing those depositions, because the courts recognized the pointlessness of questioning top executives about engineering decisions they had no knowledge of or involvement in.

Musk's tweets, in contrast, show that he is fair game for depositions about Autopilot's programming, the decision to enable (and not disable) Autopilot, the choice to include or omit lidar and similar sensors, and so on. Musk already divides his time between Tesla and SpaceX. He has only 365 days available each year, and if he has to spend two days preparing for and sitting through each deposition, that is a significant drain on the company resource that is his time, not to mention the reputational risk of having your star CEO interrogated under oath.

And what is Tesla thinking when it deletes Musk's tweets? Do they think that makes them unavailable?

Holger 09-29-2016 07:06 AM

A Tesla drove into a bus near Hamburg today.
The Autopilot was active.
The driver only got minor injuries.

manbridge 74 09-29-2016 09:00 AM

Quote:

Originally Posted by jyl (Post 9211992)
When I represented a domestic auto OEM in product liability lawsuits, plaintiffs' lawyers often sought to force senior management, up to and including the CEO, to give lengthy videotaped depositions. Our client fought these attempts, as you would expect, since they were simply efforts to harass the executives and burden the company. We usually succeeded in preventing those depositions, because the courts recognized the pointlessness of questioning top executives about engineering decisions they had no knowledge of or involvement in.

Musk's tweets, in contrast, show that he is fair game for depositions about Autopilot's programming, the decision to enable (and not disable) Autopilot, the choice to include or omit lidar and similar sensors, and so on. Musk already divides his time between Tesla and SpaceX. He has only 365 days available each year, and if he has to spend two days preparing for and sitting through each deposition, that is a significant drain on the company resource that is his time, not to mention the reputational risk of having your star CEO interrogated under oath.

And what is Tesla thinking when it deletes Musk's tweets? Do they think that makes them unavailable?

You used to be a lawyer or still are?

I ask because I find your analytic skills somewhat lacking.

Pazuzu 09-29-2016 09:08 AM

Quote:

Originally Posted by Holger (Post 9299416)
A Tesla drove into a bus near Hamburg today.
The Autopilot was active.
The driver only got minor injuries.

News is slow to get here...

Autobahn speeds, Autopilot engaged, holding the wheel but clearly not concentrating, and he runs into the back of a bus that changed lanes in front of him. I can't find any reports here on how fast he was going; can the Autopilot system even handle autobahn speeds?

911 Rod 09-29-2016 09:54 AM

https://www.youtube.com/watch?v=NqnJRo4FQNo

This is crazy.

