Pelican Parts Forums

-   Off Topic Discussions (http://forums.pelicanparts.com/off-topic-discussions/)
-   -   Tesla autopilot doesn't notice a tractor trailer, fatal crash (http://forums.pelicanparts.com/off-topic-discussions/920147-tesla-autopilot-doesnt-notice-tractor-trailer-fatal-crash.html)

Por_sha911 07-01-2016 10:09 AM

Quote:

Originally Posted by dentist90 (Post 9182039)
Tally so far:
Human error caused auto crashes: 156,494,896
GPS/AutoPilot crashes: 1

Quote:

Originally Posted by BE911SC (Post 9182506)
And that's all you need to know.
Next topic.

There is a heck of a difference between total hits and batting average.
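
To make that concrete, here is a back-of-the-envelope sketch in Python. Every number in it is invented for illustration only; none of it is real crash or mileage data.

Code:

# Raw counts say nothing without exposure. Compare rates, not totals.
# ALL figures below are hypothetical, for illustration only.

human_crashes = 156_494_896        # hypothetical crash count, human drivers
human_miles = 3_000_000_000_000    # hypothetical total human-driven miles

autopilot_crashes = 1              # hypothetical Autopilot crash count
autopilot_miles = 130_000_000      # hypothetical total Autopilot miles

human_rate = human_crashes / human_miles
autopilot_rate = autopilot_crashes / autopilot_miles

# "Total hits" vs "batting average": the totals differ by eight orders
# of magnitude, but the per-mile rates are what you'd actually compare.
print(f"Human crashes per million miles:     {human_rate * 1e6:.3f}")
print(f"Autopilot crashes per million miles: {autopilot_rate * 1e6:.3f}")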

Quote:

Originally Posted by 1990C4S (Post 9182336)
it is 'Driver Assist' not 'Auto-pilot'.

The problem is that there is a huge temptation not to dial in your attention. This is akin to the lowered mental alertness that comes from texting.

jyl 07-01-2016 03:28 PM

Quote:

Originally Posted by 1990C4S (Post 9182336)
it is 'Driver Assist' not 'Auto-pilot'.

Tesla calls it "Autopilot". They advertise that term, Musk brags about it, he says it is already better than the average driver . . .

No one can expect an untrained driver to remain alert after hours of being a passenger on autopilot.

aap1966 07-01-2016 04:00 PM

An insurmountable issue is the car making a moral decision.
Kid runs out in front of the car; a human swerves into oncoming traffic (risking their own life) to avoid the kid.
Would the car?
I can also foresee scenarios where the technology would fail.
What if a dog runs out rather than a kid? How does the car reliably differentiate kid (avoid at all costs) from dog (don't swerve, sorry Rex)? A toy sketch of that decision follows below.
What about scenarios where a bit of rule-breaking is required? (Driving through a red light to clear the way for an ambulance.)
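
As a rough illustration of why the kid-vs-dog question is hard for software, here is a toy decision policy in Python. The class labels, confidence numbers, and thresholds are all invented; no real system works this simply.

Code:

# Toy swerve policy keyed to object classification.
# Labels, confidences, and thresholds are invented for illustration.

def swerve_decision(obj_class: str, confidence: float,
                    oncoming_traffic: bool) -> str:
    """Toy policy: respond to a detected obstacle in the lane."""
    if obj_class == "pedestrian" and confidence >= 0.9:
        # "Avoid at all costs": swerve even into oncoming traffic,
        # as a human driver might for a kid.
        return "swerve"
    if obj_class == "animal" and confidence >= 0.9:
        # Only swerve for an animal when it costs nothing.
        return "brake_only" if oncoming_traffic else "swerve"
    # Below threshold the classifier can't be trusted either way.
    # A kid tagged "animal" at 0.6 confidence takes the wrong branch,
    # which is exactly the reliability problem described above.
    return "brake_only"

# Example: a marginal detection on a rainy night.
print(swerve_decision("animal", 0.62, oncoming_traffic=True))  # brake_only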

rusnak 07-01-2016 04:22 PM

Quote:

Originally Posted by jyl (Post 9182960)
Tesla calls it "Autopilot". Musk brags about it, he says it is already better than the average driver . . .

I hope that freakshow peddler of overhyped gizmos uses his own Autopilot, if it's so righteous.

Noah930 07-01-2016 06:49 PM

Granted, none of us know what truly transpired. At the same time, it's a bit Darwinianly stupid to go down the highway in a car (while sitting in the left front seat) and not pay immediate attention to what's going on around you, Tesla-magic autopilot or not. With the new rumor that the guy may have been watching a movie on a portable DVD player, it's even more tragically sad.

sc_rufctr 07-01-2016 07:01 PM

Joshua Brown (RIP) really should have known better. He had a YouTube channel almost entirely devoted to Tesla.

https://www.youtube.com/channel/UCLqNcKzWLAVb1Ez6SbshI7g

https://www.youtube.com/watch?v=9I5rraWJq6E

cairns 07-02-2016 05:08 AM

I feel bad for the guy and his family, but he's dead because he totally bought Tesla's BS.

Tesla shouldn't be hyping this the way they are, and I sure don't want it around me when I'm driving. There are already millions of drivers who don't pay attention... and here we have a car company that actually encourages inattention, and a government that has given them untold millions.

Mercedes and others have similar systems, but no one other than Tesla calls theirs an autopilot.

island911 07-02-2016 06:36 AM

Yep. What we have is a big push, by Tesla, to embrace the belief in unicorns.

As in: this device will transport you and your loved ones without a worry, and without a drop of that evil oil. Arching rainbows abound.

Other car companies have the product-development maturity to move slowly with this type of technology, whereas Tesla has hype technology... feeding on people's desire for unicorns.

Brian in VA 07-02-2016 06:39 AM

The other problem is that now you'll be cruising along paying even less attention than usual, and suddenly there will be an emergency and you'll have to instantly switch on and take over from the computer in a stressful, split-second, critical situation. Not going to happen for most people.

1990C4S 07-02-2016 07:05 AM

Quote:

Originally Posted by jyl (Post 9182960)
Tesla calls it "Autopilot". They advertise that term, Musk brags about it, he says it is already better than the average driver . . .

I agree, they are using the wrong terminology. If you need to pay attention and be ready to take over control, as Tesla claims you do, then they are misleading people as to the abilities of the vehicle.

Seahawk 07-02-2016 09:53 AM

The number of external variables that need to be programmed into autonomous driving is exponentially greater than the number handled by today's aviation autopilots.

In my commercial UAS world, the current autopilot capabilities are truly revolutionary. No one really "flies" the UAS. It accepts a flight plan and executes it faithfully, managing wind, temperature, elevation, aircraft systems status, etc. with ease.
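
In flavor, that kind of flight-plan execution reduces to something like the toy waypoint follower below. It is pure 2-D geometry; a real stack also blends wind, temperature, elevation, and system health, and nothing here is any actual UAS code.

Code:

import math

# Toy "accepts a flight plan and executes it" loop: march through a
# list of waypoints, closing on each until within an arrival radius.
# Pure 2-D geometry; not any real autopilot or flight stack.

def step_toward(pos, wpt, speed):
    """Advance one tick of `speed` meters toward the waypoint."""
    d = math.dist(pos, wpt)
    if d <= speed:
        return wpt
    t = speed / d
    return (pos[0] + t * (wpt[0] - pos[0]), pos[1] + t * (wpt[1] - pos[1]))

def fly_plan(pos, waypoints, speed=25.0, arrive_radius=30.0):
    """Execute the plan waypoint by waypoint; return the final position."""
    for wpt in waypoints:
        while math.dist(pos, wpt) > arrive_radius:
            pos = step_toward(pos, wpt, speed)
    return pos

# Example: a three-leg plan flown from the origin (units: meters).
print(fly_plan((0.0, 0.0), [(1000.0, 0.0), (1000.0, 1000.0), (0.0, 1000.0)]))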

The next phase for us is beyond-line-of-sight flight, but that is still sooo much easier than the dynamics of driving.

As a former flight test guy, my concern is that the driver systems are not being driven and tested with the same protocols as in flight test.

The problems can be solved.

jyl 07-02-2016 02:27 PM

I'm sure the problems can be solved, and will be solved, and autonomous cars will be safe and widely used. Eventually. In the meantime, I'm appalled by Tesla's overconfident, arrogant, and cynical approach to this.
- They marketed the self driving system and called it "Autopilot", a name that invites drivers to sit back and let the car do the driving.
- They did so even though they knew the system was not fully tested, and we know that because they explicitly designated it "beta": how much more of an admission can there be?
- Musk bragged that Autopilot was better than the average driver, that it was 50% safer to use Autopilot than to drive yourself, and tweeted his approval of drivers who posted YouTube videos of their Teslas driving autonomously, including a video by the now-dead guy.
- When the accident happened, Tesla kept quiet about it. I haven't read about any change they made to Autopilot software in response to the accident, and they didn't tell owners that Autopilot couldn't cope with a big truck turning in front of the car: they let those owners go right on using Autopilot, although any one of them could have been killed just like Brown was.
- When the news of the accident broke, they issued a statement that is disingenuous and weaselly. By saying neither the driver nor the car noticed the white semi trailer against a bright sky, they are suggesting that a human wouldn't have detected the truck either, so how can you blame Autopilot? But this is bull****: a human who was paying attention would easily have seen the white semi trailer turning in front of him. The whole point of Autopilot is that it is always paying attention - yet it can't detect what an attentive human can.

Tesla either has terrible lawyers, or they don't listen to their lawyers. Maybe Brown was such a big fan of Tesla that his family will quietly settle. But the next dead person may not be, especially if they are in the car that the Tesla hits.

SpyderMike 07-02-2016 03:45 PM

Autopilot is a name associated with a variety of flight control devices. Not all of them cover all three axes of steering (vertical and horizontal). So I wouldn't get too hung up on the use of the term "autopilot". Sure, there are current systems with amazing abilities, but that term was used to cover single-axis systems too. It is a device to reduce pilot/driver workload, not one to replace pilot/driver responsibility. By the way, plenty of pilots have met their fate using autopilot in planes (e.g., controlled flight into terrain). There is no explainable reason why you wouldn't pay attention to your surroundings while it is engaged (in a plane or a car). I will lean towards the side of "pilot error" on this one.

I like the Tesla product; it is a hoot to drive.

By the way, this is what Tesla is "touting":

Autopilot allows Model S to steer within a lane, change lanes with the simple tap of a turn signal, and manage speed by using active, traffic-aware cruise control. Digital control of motors, brakes, and steering helps avoid collisions from the front and sides, and prevents the car from wandering off the road. Autopilot also enables your car to scan for a parking space and parallel park on command. And our new Summon feature lets you "call" your car from your phone so it can come greet you at the front door in the morning.

Autopilot features are progressively enabled over time with software updates.

onewhippedpuppy 07-02-2016 05:01 PM

Interesting that they blame the lack of visual contrast, because I would be shocked if the car didn't have some sort of proximity radar that has nothing to do with visual contrast. I suspect it has more to do with the large open gap under the trailer, and the fact that the radar simply didn't pick it up.

KNS 07-02-2016 05:01 PM

^^ But the general, non-flying public for the most part equates "Auto Pilot" with: the machine is doing the flying (or driving).

SpyderMike 07-02-2016 05:07 PM

The general "non flying" public should understand that the plane they are in can crash on autopilot (and they have and will). That is why there is a pilot or two in the cockpit. Someone needs to be paying attention. I don't blame the technology here.

Any "autopilot" is not a "close my eyes and go to sleep and I will be alright" replacement for a driver/pilot. How can anyone think it is? I can't even trust my car's cruise control. Sorry, but I hope his end was swift and painless.

island911 07-02-2016 05:24 PM

Quote:

Originally Posted by SpyderMike (Post 9184028)
.... I don't blame the technology here.

So just how much TSLA do you own?

Seriously, it seems you should blame BOTH the driver and the tech. This sounds like what is referred to as a double-fault condition: one where both the tech failed and the user failed.

I understand that people like to assign blame to just one item, but if the "Auto pilot" was on and didn't help, then it's clear as day that both failed.

Now one could argue that the driver made the decision to buy, drive, and autopilot that gear, and therefore Tesla bears no fault for how it was used. We will see just how far that argument goes in court.

SpyderMike 07-02-2016 06:00 PM

Quote:

Originally Posted by island911 (Post 9184044)
So just how much TSLA do you own?

None that I know of, unless it is buried in a mutual fund. Not that it matters. I don't fault the technology. I, for one, am not an early adopter of any technology. Let the first users find the limitations. I still run Win7.

Let the lawyers feed on it... that is what they do and how they make their nut. They will slow down the technology gains a bit, and make them costlier, but there is a movement toward autonomous travel that will find its way.

Hard-Deck 07-03-2016 08:53 AM

The driver who was killed was a former Navy SEAL.

https://www.sofx.com/2016/07/02/ex-navy-seal-killed-crash-using-teslas-autopilot-u-s-stripes/

ossiblue 07-03-2016 09:17 AM

Interesting article in the Los Angeles Times this morning regarding the different ways Tesla and Google are approaching the driverless car. In the article, a Google spokesman mentioned that during their testing they noticed that within a few minutes of being in the car, drivers ignored all the specified warnings and instructions about being ready to take immediate control of the vehicle. They began diverting their attention from the direction of travel and even taking their hands off the wheel.

I am in the camp that believes this death is the result of a failure by both parties: the technology and the driver. We don't know the details yet, but it's going to be hard to argue the driver was following the protocol Tesla outlines for their "autopilot" mode if he did not notice a semi turning in front of him. Whether or not Tesla should have made the system inoperable unless the driver followed the protocols is a subject for the lawyers.

cairns 07-03-2016 09:26 AM

Quote:

Any "autopilot" is not a "close my eyes and go to sleep and I will be alright" replacement for a driver/pilot. How can anyone think it is?
Easy. They believed Elon Musk. The guy who died wasn't necessarily stupid. But he believed Elon and his marketing claims. And he paid the price for that.

Contrary to what Elon says, his "Autopilot" is not better than the average driver. I don't know any average driver who would deliberately drive into a truck.

1990C4S 07-03-2016 09:29 AM

Quote:

Originally Posted by 95avblm3 (Post 9182464)
The news report mentioned the truck driver saying he heard a Harry Potter movie playing when he got out to go check on the Tesla driver. Not sure if I believe that because I thought NHTSA didn't allow videos to be played in a vehicle within the field of view of the driver.

Regardless, it is terribly sad.

It appears he may have had a portable DVD player on board...

KNS 07-03-2016 11:57 AM

Quote:

Originally Posted by ossiblue (Post 9184560)
drivers ignored all the specified warnings and instructions about being ready to take immediate control of the vehicle. They began diverting their attention from the direction of travel and even taking their hands off the wheel.

I can see it now - People will complain that they have to pay attention while in the car: "I have to look outside? - That's not why I bought this car..."

onewhippedpuppy 07-03-2016 12:15 PM

Quote:

Originally Posted by ossiblue (Post 9184560)
Interesting article in the Los Angeles Times this morning regarding the different ways Tesla and Google are approaching the driverless car. In the article, a Google spokesman mentioned that during their testing they noticed that within a few minutes of being in the car, drivers ignored all the specified warnings and instructions about being ready to take immediate control of the vehicle. They began diverting their attention from the direction of travel and even taking their hands off the wheel.

I am in the camp that believes this death is the result of a failure by both parties: the technology and the driver. We don't know the details yet, but it's going to be hard to argue the driver was following the protocol Tesla outlines for their "autopilot" mode if he did not notice a semi turning in front of him. Whether or not Tesla should have made the system inoperable unless the driver followed the protocols is a subject for the lawyers.

Assuming that this makes it to court, I suspect the argument will be that Musk's hyperbole contradicted the fine print in the owner's manual and lulled the owner into a false sense of security. Or something like that.

wdfifteen 07-03-2016 12:27 PM

Quote:

Originally Posted by KNS (Post 9184696)
I can see it now - People will complain that they have to pay attention while in the car: "I have to look outside? - That's not why I bought this car..."

Exactly. The driver of this car was reportedly playing a Harry Potter movie. If the car had been delivered without a windshield, steering wheel, and brake pedal, Tesla might be culpable.

ossiblue 07-03-2016 12:31 PM

Quote:

Originally Posted by onewhippedpuppy (Post 9184713)
Assuming that this makes it to court, I suspect the argument will be that Musk's hyperbole contradicted the fine print in the owner's manual and lulled the owner into a false sense of security. Or something like that.

Pure speculation on all of this, of course, but to counter that argument, Tesla has ample documentation from the deceased's own videos that he was well aware of the shortcomings of the system and knew it was not completely safe without driver intervention. To say that this particular driver was lulled into a false sense of security would be difficult, IMO, to establish.

Take a few minutes and read Tesla's manual for the autopilot system (it's online); there are plenty of warnings about the system not always "seeing" objects in the road, mistaking fixed objects for cars, etc. If someone actually reads the manual and all the warnings, there is no way they would ever think driving without immediate attention to their surroundings and hands on the wheel is safe or acceptable. I'm not defending Tesla here, but they never claim the car is completely autonomous.

KNS 07-03-2016 02:52 PM

Quote:

Originally Posted by ossiblue (Post 9184560)
If someone actually reads the manual and all the warnings, there is no way they would ever think driving without immediate attention to their surroundings and hands on the wheel is safe or acceptable. I'm not defending Tesla here, but they never claim the car is completely autonomous.

Unfortunately, people stopped reading the owner's manual years ago (I still do).

rusnak 07-03-2016 04:05 PM

Quote:

Originally Posted by 1990C4S (Post 9184574)
It appears he may have had a portable DVD player on board...

Yeeeesh. This is where I use a voice like the dad on "That '70s Show" and call the dead guy a dumbass.

jyl 07-03-2016 06:52 PM

I used to be a product liability lawyer, defending auto companies. In a former life.

Warnings are of only limited help to the defense, especially if the manufacturer knows that users routinely disregard the warnings. Which Tesla certainly knew: Musk's wife posted a video of herself driving on Autopilot with hands off the wheel (on Instagram, since deleted), and Musk tweeted about the hands-off driving shown in users' YouTube videos. He said things like the car could drive from San Diego to San Francisco with the driver hardly having to touch the wheel. I'm sure there is plenty of research showing that humans ignore warnings about staying ready to take control, similar to what Google said, and Tesla employs experts in human interface design who knew it (if it doesn't employ such experts, that's even worse). A search of Tesla's internal emails may well find discussion of this by executives and engineers. This would be a delicious case to have on the plaintiff side, and a difficult case to have on the defense side.

jyl 07-16-2016 12:21 AM

Boy, Tesla is really taking a legalistic, disingenuous approach to Autopilot accidents.

On July 1, a Tesla went off a highway and hit a guardrail. The driver told police the car was on Autopilot. Tesla denied this, saying the car's logs proved it was not on Autopilot. Now Tesla has said what it meant: the car was on Autopilot, but Autopilot TURNED ITSELF OFF, leaving the car driving unguided, and 40 seconds later the car went off the road and crashed.

Musk tweeted:
"Onboard vehicle logs show Autopilot was turned off in Pennsylvania crash. Moreover, crash would not have occurred if it was on."

Details here:
http://jalopnik.com/musk-autopilot-was-off-in-pa-tesla-model-x-crash-acco-1783695454

I think what likely happened is this: the car was driving along the (pretty straight?) freeway on Autopilot; the driver dozed off or zoned out and didn't touch the steering wheel for a while; the car started telling the driver to take the wheel; the alerts finally woke the driver, or snapped him out of his reverie; he grabbed the wheel and jerked it; and the car went off the road, Autopilot having turned itself off.
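
If that reconstruction is right, the failure mode looks something like the sketch below. The thresholds, and the idea that the system simply gives up after enough hands-off time, are my guesses for illustration, not Tesla's actual logic.

Code:

# Hypothetical hands-off-wheel escalation logic. All thresholds and
# behavior are invented; this is not Tesla's actual implementation.

HANDS_OFF_WARN_S = 15.0        # start nagging after this long hands-off
HANDS_OFF_DISENGAGE_S = 60.0   # give up and disengage after this long

def autopilot_step(hands_off_seconds: float, engaged: bool):
    """Return (still_engaged, action) for one control-loop tick."""
    if not engaged:
        return False, "manual"
    if hands_off_seconds >= HANDS_OFF_DISENGAGE_S:
        # The dangerous moment: steering assistance stops exactly when
        # the driver has demonstrably checked out, and a groggy human
        # inherits the car at highway speed.
        return False, "disengage_and_alert"
    if hands_off_seconds >= HANDS_OFF_WARN_S:
        return True, "chime_and_keep_steering"
    return True, "steer"

# Example: the driver has been hands-off for 61 seconds.
print(autopilot_step(61.0, engaged=True))   # (False, 'disengage_and_alert')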

Which is worse: an Autopilot that can't keep the car on the road, or an Autopilot that suddenly turns itself off and forces a groggy, startled human to "take manual control"?

Yeah, the driver should have remained alert for hours, always ready to instantly resume manual control . . . but untrained people just don't, or can't, do that. Google figured this out as it developed self-driving cars, which is why Google is no longer developing "semi-autonomous" cars. Tesla is digging itself deeper by continuing to run its beta Autopilot experiment on public roads.

sc_rufctr 07-16-2016 12:27 AM

Quote:

Originally Posted by jyl (Post 9201064)
Boy, Tesla is really taking a legalistic, disingenuous approach to Autopilot accidents.

On July 1, a Tesla went off a highway and hit a guardrail. The driver told police the car was on Autopilot. Tesla denied this, saying the car's logs proved it was not on Autopilot. Now Tesla has said what it meant: the car was on Autopilot, but Autopilot TURNED ITSELF OFF, leaving the car driving unguided, and 40 seconds later the car went off the road and crashed.

http://forums.pelicanparts.com/uploa...1468657626.jpg

Seahawk 07-16-2016 04:59 AM

Quote:

Originally Posted by jyl (Post 9185054)
I used to be a product liability lawyer, defending auto companies. In a former life.

This would be a delicious case to have on the plaintiff side, and a difficult case to have on the defense side.

Quote:

Originally Posted by jyl (Post 9201064)
Boy, Tesla is really taking a legalistic, disingenuous approach to Autopilot accidents.

Tesla is digging itself deeper by continuing to run its beta Autopilot experiment on public roads.

Those were very informative posts. Thanks.

As a comparison, in Navy flight test we divide testing into two separate and distinct parts: Developmental Flight Test (DT) and Operational Flight Test (OT).

DT is a very controlled, measured build-up with instrumented aircraft and sensors, lots of data collection and precise flight on instrumented ranges.

OT is a less controlled (but data-collection-heavy) series of tests that takes the aircraft into its intended operational environment. We work really hard at trying to discover both the human-interface issues and the engineering issues before the system hits the fleet.

The "auto-pilot" turning itself off would be a complete "unsuitable" in both DT and OT. It would have to be fixed and strenuously tested.

The other, more insidious problem is the "driver" transition period from inattention to resuming control of the vehicle. I would really like to see how that is going to be dealt with, and where they get the math on human-system performance in that particular regime.

jyl 07-23-2016 10:34 PM

Tesla working on changes to Autopilot driver-assistance software

Tesla (Musk) is acknowledging, I think, that the car's radar detected the tractor trailer, but the car was programmed to ignore overhead objects and thus ignored the trailer.
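
If that is what happened, the failure is easy to picture as a crude height filter, sketched below. The geometry, the threshold, and the single-number "return height" are all assumptions for illustration; Tesla's actual radar processing is certainly more involved.

Code:

# Hypothetical radar-return filter that discards "overhead" objects
# (bridges, signs) to avoid false braking events. Numbers are invented.

OVERHEAD_HEIGHT_M = 3.5   # returns at or above this treated as overhead

def is_braking_target(return_height_m: float) -> bool:
    """Decide whether a radar return should trigger braking."""
    if return_height_m >= OVERHEAD_HEIGHT_M:
        # Classified as a bridge or sign: ignore it. A high, flat
        # trailer bed spanning the lane can look exactly like this,
        # which would be the failure mode described above.
        return False
    return True

# Example: a trailer return at 3.8 m is silently dropped.
print(is_braking_target(3.8))   # False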

I remain astonished that Tesla's lawyers are permitting the company's CEO to tweet about a fatal accident involving Autopilot, much less to tweet details about why the Autopilot didn't work and how the software could have been designed to prevent the accident.

Florida, like most states, imposes strict liability for design defects. "Under Florida law of strict product liability, a defendant is strictly liable for a plaintiff's injury if the product is in a condition that is unreasonably dangerous. A product is unreasonably dangerous if the product fails to perform as safely as an ordinary consumer would expect when used as intended or in a manner reasonably foreseeable by the manufacturer or if the risk of danger in the design of the product outweighs the benefits."

What are the chances that a Florida jury will think that an "ordinary consumer" would expect a $100,000 car, with an "Autopilot" that is advertised as safer than the average driver, to drive at full speed into a tractor trailer that was detected by the car's radar?

Won 07-24-2016 05:54 AM

Quote:

Originally Posted by jyl (Post 9201064)
Which is worse: an Autopilot that can't keep the car on the road, or an Autopilot that suddenly turns itself off and forces a groggy, startled human to "take manual control"?

A friend of mine does machine learning and visual processing for autonomous driving, for German OEMs. According to him, the most difficult part currently is figuring out how to keep the human engaged, or at least to make them re-focus effectively on driving should the need arise - exactly as you say.

Like the old IT saying - Problem Exists Between Chair And Keyboard.

intakexhaust 07-24-2016 08:13 AM

What the heck are these companies thinking? I can see them perhaps focusing on automatic steering and braking ONLY to avoid a collision, but not on autonomous point-A-to-B driving. Nutz.

This is a world of people who become complacent about nearly everything. Just look at the zombie pedestrians glued to their phones - and now the future might put them behind the wheel of an auto-steer vehicle?

Wait until the day a parent manually driving a van full of family gets crashed into and burned to death by some failed autonomous Tesla.

jyl 07-24-2016 08:50 AM

When I represented a domestic auto OEM in product liability lawsuits, plaintiffs' lawyers often sought to force senior management, up to and including the CEO, to give lengthy videotaped depositions. Our client fought these attempts, as you would expect, since they were simply efforts to harass the executives and burden the company. We usually succeeded in preventing those depositions, because the courts recognized the pointlessness of questioning top executives about engineering decisions about which they had no knowledge or involvement.

Musk's tweets, in contrast, show that he is fair game for depositions about Autopilot's programming and about the decisions to enable and not disable Autopilot, to include or omit lidar and similar sensors, and so on. Musk already divides his time between Tesla and SpaceX. He only has 365 days available each year, and if he has to spend two days preparing for and submitting to each deposition, that is a significant drain on the company resource that is his time, not to mention the reputational risk of having your star CEO interrogated under oath. And what is Tesla thinking when it deletes Musk's tweets? Do they think that makes them unavailable?

Holger 09-29-2016 07:06 AM

A Tesla drove into a bus near Hamburg today.
The Autopilot was active.
The driver only got minor injuries.

manbridge 74 09-29-2016 09:00 AM

Quote:

Originally Posted by jyl (Post 9211992)
When I represented a domestic auto OEM in product liability lawsuits, plaintiffs' lawyers often sought to force senior management, up to and including the CEO, to give lengthy videotaped depositions. ... And what is Tesla thinking when it deletes Musk's tweets? Do they think that makes them unavailable?

You used to be a lawyer or still are?

I ask because I find your analytic skills somewhat lacking.

Pazuzu 09-29-2016 09:08 AM

Quote:

Originally Posted by Holger (Post 9299416)
A Tesla drove into a bus near Hamburg today.
The Autopilot was active.
The driver only got minor injuries.

News is slow to get here...

Autobahn speeds, autopilot engaged, holding the wheel but clearly not concentrating, and he runs into the back of a bus that changed lanes in front of him. I can't find any reports here on how fast he was going; can the autopilot system handle autobahn speeds?

911 Rod 09-29-2016 09:54 AM

https://www.youtube.com/watch?v=NqnJRo4FQNo

This is crazy

