Pelican Parts Forums

Pelican Parts Forums (http://forums.pelicanparts.com/)
-   Off Topic Discussions (http://forums.pelicanparts.com/off-topic-discussions/)
-   -   Why Self-Driving Cars Must Be Programmed to Kill (http://forums.pelicanparts.com/off-topic-discussions/888469-why-self-driving-cars-must-programmed-kill.html)

wayner 10-26-2015 04:07 AM

I fear that the insurance companies will remove the unpredictable human element from the equation and we will have no choice but to be driven.

SiberianDVM 10-26-2015 05:04 AM

Quote:

Originally Posted by on2wheels52 (Post 8851272)
For myself, as someone who rides a motorcycle every day, I rather look forward to the predictability of said self-driven cars. It would beat wondering what maneuver the yo-yo next to me is going to do next.
Jim

I'll be glad knowing that the self-driven motorcycle behind me isn't going to try to scare me by passing me in a double yellow zone while simultaneously doing a wheelie.

stomachmonkey 10-26-2015 05:34 AM

Quote:

Originally Posted by aap1966 (Post 8851226)
How will a self-driving car differentiate between a shopping cart running into its path ("OK to hit") and a pram running into its path ("Not OK to hit")?
If a soccer ball suddenly appears on the road, will the car know it's OK to continue (good field of view, soccer players not nearby) or slam on its brakes (poor field of view, kid may be chasing the ball)?
Will a car sacrifice itself? (Hit the kid, or miss the kid by swerving into the path of the truck?)
Can the car differentiate between animate obstacles? (Hit the dog, miss the truck vs. miss the kid, hit the truck.)

Self driving cars will need to make moral decisions.
We have enough trouble deciding which human decisions in a crisis are morally justified.

What about the necessary breaking of road rules?
Driving through a red light to make way for the ambulance to get through the intersection?

Some of that is pretty simple: thermal imaging rules out the shopping cart.

A soccer ball has no thermal signature, but as you point out, is there a clear field of view, or did the ball appear from behind a large object like a parked car?

When is it appropriate to break the established rules of the road to avoid a collision?

Now we are getting into order of priority: a decision tree.

We can certainly create the if/and/or tree, but that brings with it another thing to deal with: reaction time.

What is the processing delay in the car reaching a decision and then acting on it?

If the process starts to exceed human reaction time, it's a no-go.
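
To make that concrete, here's a minimal sketch of what such an if/and/or tree plus a reaction-time budget could look like. All of it (the Obstacle fields, the thresholds, the 0.5 s budget) is invented for illustration, not any real vehicle's logic:

import time
from dataclasses import dataclass

@dataclass
class Obstacle:
    has_thermal_signature: bool   # warm body (kid, dog) vs. shopping cart / soccer ball
    clear_field_of_view: bool     # or did it appear from behind a parked car?
    closing_fast: bool            # is impact imminent at current speed?

REACTION_BUDGET_S = 0.5  # needs to beat typical human reaction time (~1.0-1.5 s)

def decide(obstacle: Obstacle) -> str:
    """Walk the decision tree and return an action."""
    if obstacle.has_thermal_signature:      # could be a person or an animal
        return "brake_hard"
    if not obstacle.clear_field_of_view:    # ball from behind a parked car:
        return "brake_hard"                 # assume a kid may be chasing it
    if obstacle.closing_fast:
        return "brake_moderate"
    return "continue"

start = time.monotonic()
action = decide(Obstacle(has_thermal_signature=False,
                         clear_field_of_view=False,
                         closing_fast=False))
elapsed = time.monotonic() - start
print(action, elapsed < REACTION_BUDGET_S)   # brake_hard True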

Brian in VA 10-26-2015 05:46 AM

A lot of philosophical questions so that we can sit back and do work on our laptops on the way to work. Great.
Can your car drive you home if you've been drinking?

Holger 10-26-2015 06:04 AM

Quote:

Originally Posted by stomachmonkey (Post 8851403)
Now we are getting into order of priority, a decision tree.

Yes, but what happens if several things happen simultaneously?

wayner 10-26-2015 06:23 AM

Quote:

Originally Posted by stomachmonkey (Post 8851403)
When is it appropriate to break the established rules of the road to avoid a collision?...

How about accelerating through a red light from a dead stop...when the ABS in the car behind you has kicked in, increasing its stopping distance as it is careening down a snowy hill directly towards you?

I'm glad that my car still lets me make decisions.;)

stomachmonkey 10-26-2015 06:45 AM

Quote:

Originally Posted by Holger (Post 8851458)
Yes, but what happens if several things happen simultaneously?

Exactly.

The simplest way to think about it is a scoring system. Could be 1-10, 1-100, 1-1,000, etc.

Factors are weighted and combined into a cumulative score; some things increase the score, others have a negative value and decrease it.

The more granular the scoring, the less likely you are to end up with scenarios that tie, although you still have to assume a tie will eventually happen.

The other thing is that every action the car takes may introduce new parameters that need to be calculated.

The system could become so complex that it ends up being detrimental, to the point that you resort to basic logic: take the action that represents the least loss of life, with no consideration for any other factors.
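
A toy version of that weighted scoring, with the factors, weights and candidate actions all made up purely for illustration:

# Each candidate action gets a cumulative score; higher is better.
WEIGHTS = {
    "occupant_risk": -40,     # negative factors drag the score down
    "pedestrian_risk": -60,
    "property_damage": -5,
    "stays_in_lane": 10,
    "obeys_signal": 5,
}

def score(factors: dict) -> int:
    """Sum the weighted factors into one cumulative score for an action."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())

candidates = {
    "brake_straight": {"occupant_risk": 1, "pedestrian_risk": 0,
                       "property_damage": 1, "stays_in_lane": 1, "obeys_signal": 1},
    "swerve_left":    {"occupant_risk": 0, "pedestrian_risk": 1,
                       "property_damage": 0, "stays_in_lane": 0, "obeys_signal": 1},
}

scored = {name: score(f) for name, f in candidates.items()}
print(scored, "->", max(scored, key=scored.get))
# {'brake_straight': -30, 'swerve_left': -55} -> brake_straight
# Ties would fall through to the basic "least loss of life" rule above.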

stomachmonkey 10-26-2015 06:59 AM

Quote:

Originally Posted by wayner (Post 8851486)
How about accelerating through a red light from a dead stop...when the ABS in the car behind you has kicked in, increasing its stopping distance as it is careening down a snowy hill directly towards you?

I'm glad that my car still lets me make decisions.;)

Again, that's something you can get lost in when trying to set the logic for it.

The basics are simple: the car has range finders that can measure closing distance and speed, and from those it can calculate whether the oncoming car will stop in time.
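
That check is just kinematics. A back-of-the-envelope sketch, with the deceleration and sensor numbers assumed for illustration:

def stopping_distance(speed_mps, decel_mps2, latency_s):
    """Distance covered during system latency plus braking to a stop: v*t + v^2 / (2a)."""
    return speed_mps * latency_s + speed_mps ** 2 / (2 * decel_mps2)

gap_m = 40.0        # range finder: distance to the closing car
closing_mps = 15.0  # ~34 mph closing speed
decel_snow = 2.0    # m/s^2 on packed snow, well below what dry pavement allows
latency_s = 0.3     # sensing plus decision delay

# 0.3*15 + 15^2/(2*2) = ~60.8 m needed vs. a 40 m gap: it will not stop in time.
print(stopping_distance(closing_mps, decel_snow, latency_s) < gap_m)   # False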

But you are at a four-way intersection with cars coming from the left and right. Can your car find a hole to get you through?

Your car needs to calculate the closing rates of those cars and whether or not it can provide the timed acceleration necessary.

Now it needs to make basic assumptions. What are the road conditions? Snow covered, so a hard launch is out of the question. What condition are the tires in? How much weight (passengers, cargo) is in the vehicle, adding to the gross weight that will affect traction and acceleration?

The AI has to have some behavioral modeling: what are the oncoming cars likely to do when they see you pull into the intersection, and how will their actions affect the result of the initial go/no-go decision?

If those other cars are also self-driving, then it's a bit easier because you have a predictive model to calculate against, but what programming are they carrying? That opens a whole other can of worms.
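
A back-of-the-envelope version of that go/no-go check, using a friction-limited launch on snow (all numbers assumed; a fuller model would also fold in tire condition, vehicle weight and the behavioral predictions above):

import math

MU_SNOW = 0.3   # assumed friction coefficient on a snow-covered road
G = 9.81        # m/s^2

def time_to_clear(distance_m, mu):
    """Time to cover distance_m from a standstill at traction-limited
    acceleration a = mu * g (from d = 0.5 * a * t^2)."""
    return math.sqrt(2 * distance_m / (mu * G))

def time_to_arrive(gap_m, speed_mps):
    """Time before a cross-traffic car gap_m away reaches the intersection."""
    return gap_m / speed_mps

clear_t = time_to_clear(12.0, MU_SNOW)          # ~2.9 s to clear the box
cross_t = min(time_to_arrive(60.0, 14.0),       # car coming from the left
              time_to_arrive(45.0, 12.0))       # car coming from the right
SAFETY_MARGIN_S = 1.0

# ~2.9 s + 1.0 s margin vs. ~3.75 s until the nearest car arrives: no-go.
print(clear_t + SAFETY_MARGIN_S < cross_t)      # False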

There will need to be an industry standard for the logic; you can't let everyone develop their own. One base set of code that all the cars use.

There would certainly be periodic updates required. How do we ensure all cars are running the same version? Real-time over-the-air updates, or is it enough to flash the computer during annual vehicle inspections?

We would need an agency that oversees the whole thing. More bureaucracy.
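
The version question at least is mechanically simple. A hypothetical gate (the versioning scheme here is made up) could just refuse to engage autonomy on stale logic:

REQUIRED_LOGIC_VERSION = (2015, 10, 26)   # version mandated by the (hypothetical) standards agency

def may_engage_autonomy(installed_version):
    """Only allow self-driving if the car runs at least the mandated logic version."""
    return installed_version >= REQUIRED_LOGIC_VERSION

print(may_engage_autonomy((2015, 9, 1)))    # False -> fall back to manual driving
print(may_engage_autonomy((2015, 10, 26)))  # True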

Jay Auskin 10-26-2015 07:04 AM

I just hope cars are programmed to fulfill the stereotypes attached to them. You know: Mustangs rev their engines 3-4 times minimum at a stoplight, minivans ignore yield signs, and BMW cars self-park, taking up two spaces. :)

GH85Carrera 10-26-2015 07:22 AM

Quote:

Originally Posted by Jay Auskin (Post 8851554)
I just hope cars are programmed to fulfill the stereotypes attached to them. You know: Mustangs rev their engines 3-4 times minimum at a stoplight, minivans ignore yield signs, and BMW cars self-park, taking up two spaces. :)

And for sure the BMWs don't use turn signals. :eek:

cashflyer 10-26-2015 07:52 AM

In the situations posed by the article, I think the cars should have a randomizer routine in the program so that you can never really predict the outcome. That way you can wager or turn it into a drinking game.

Which brings up another question... will self driving cars be allowed to fuel with ethanol? Or would that be drinking and driving?

Quote:

I predict that self-driving cars will be given their own strip of the road to separate them from 'driven' cars and trucks.
I'm happy enough for them to give the bicycle lane to the self driving cars, as long as they program the cars to knock over the bicyclists and then honk and rev tauntingly at them.

pcardude 10-26-2015 08:02 AM

Interesting. I thought it would just try not to crash. Calculating whose life to try to save is crazy

stomachmonkey 10-26-2015 08:09 AM

Quote:

Originally Posted by pcardude (Post 8851645)
Interesting. I thought it would just try not to crash. Calculating whose life to try to save is crazy

Liability.

Imagine the lawsuit when a mother pushing her baby stroller across an intersection gets taken out by a self-driving car that was avoiding a head-on crash with another vehicle.

The plaintiff's argument will be, "you were able to program it to avoid a head-on, so why not to avoid __________"

Nickshu 10-26-2015 08:15 AM

I don't think we'll ever see true self-driving cars. Most likely more driver aids. Automakers will never accept total liability for every accident that occurs. Lawmakers will never give automakers total immunity.

mikehinton 10-26-2015 08:16 AM

Sounds like a bountiful new revenue stream for trial lawyers.

dad911 10-26-2015 09:37 AM

I hope PP and I will be around in 20 years to revisit this thread, lol. Autonomous cars are inevitable; Google (and soon Apple) will prove it. Collisions will be avoided, and response will be faster and more reliable than human control.

In the 1980s, Rutgers University had less storage and processing power in its mainframe than my iPhone has now. I remember a 'friendly debate' with a professor over real-time language translation, which seemed (to me) highly unlikely at the time. If you had given an engineer the specs for one of today's phones back then, they would have said impossible.

I can only think of one event in my driving career (conservative estimate: 900,000 miles) where I am not sure a computer would have been able to avoid the collision: a tractor trailer sliding down the hill sideways in front of me in snow and ice. But I'm pretty sure sensors and a computer would have found the two cars that almost hit me head-on sooner, and done a better job than I did the countless times I have driven stupidly late, sick, tired, mad, or slightly inebriated.

If an autopilot can land a passenger jet, surely an autonomous car can avoid collisions safely.

Less than 20 years, more likely 10. I'd be interested in 1 today, for my mother.

wayner 10-26-2015 09:39 AM

I wonder from a liability standpoint how this is any different than self-landing airplanes?

I wonder if insurance companies like the predictable risk better when a human is removed?

I wonder how upgrades will be controlled without making things accidentally worse?

Microsoft couldn't do it, but Jobs could, as the ultimate QA guy at Apple; since he is gone, that competitive advantage is now out the window.

In theory this could all work, but humans are behind it all anyway, and one mistake could now mean hundreds or millions of lives.

Dantilla 10-26-2015 11:34 AM

Quote:

Originally Posted by dad911 (Post 8851818)

If an autopilot can land a passenger jet, surely an autonomous car can avoid collisions safely.

Quote:

Originally Posted by wayner (Post 8851822)
... how this is any different than self-landing airplanes?

An airplane with autoland capability requires both software (the easy part) and a lot of hardware, in the airplane and on the ground in the immediate area of the runway. Not just a simple GPS. It is not reasonable to outfit every road and intersection with the equipment necessary for a self-guided car to match autoland capability.

Besides that, the airplane is not cleared to land until all other airplanes are out of the way, hence no risk of collision. Only one airplane uses the runway at a time.

While traffic alerts are common now in airplanes (even my little 4-seater has them), no airplane takes action on its own. It's just an alert for the human at the controls.

......

Imagine a self-driving car cruising along the highway at 60 mph or so. Suppose a mosquito full of 98-degree blood flies near the car, so that the approaching mosquito is quickly gaining in size as seen by the small infrared sensor. The car goes into threshold braking, the mosquito splats right on the sensor, and the car fires the airbags.

Now the car gets rear-ended, and causes a chain reaction pile-up.

I'm sure the engineers working on this have figured out what to do about wayward bugs.

I hope so......
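
One guess at the answer, purely speculation on my part: temporal filtering, where a return has to persist across several consecutive sensor frames before the car is allowed to act on it. The frame count and rate here are made-up parameters:

from collections import deque

CONFIRM_FRAMES = 5   # at ~30 Hz, ~170 ms of persistence before the car believes a detection

class DetectionFilter:
    def __init__(self):
        self.history = deque(maxlen=CONFIRM_FRAMES)

    def update(self, detected):
        """Return True only if the object was seen in every recent frame."""
        self.history.append(detected)
        return len(self.history) == CONFIRM_FRAMES and all(self.history)

f = DetectionFilter()
# A bug splatting on the lens shows up as a one- or two-frame blob:
frames = [False, True, True, False, False, False, False, False]
print([f.update(d) for d in frames])   # never True -> no panic braking

A real perception stack is obviously far more involved, but it shows why a single splatted mosquito shouldn't be enough to fire the airbags.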

stomachmonkey 10-26-2015 12:45 PM

Quote:

Originally Posted by Dantilla (Post 8851995)
An airplane with autoland capability requires both software (the easy part) and a lot of hardware, in the airplane and on the ground in the immediate area of the runway. Not just a simple GPS. It is not reasonable to outfit every road and intersection with the equipment necessary for a self-guided car to match autoland capability.....

To your point, a more realistic parallel would be aircraft collision avoidance systems, scaled up by the difference between planes and cars, and even then it only assists with vehicle-on-vehicle avoidance; it won't help if Bambi decides to cross the road at an inconvenient time.

https://en.wikipedia.org/wiki/Airborne_collision_avoidance_system

Holger 10-27-2015 01:29 AM

Quote:

Originally Posted by Nickshu (Post 8851664)
I don't think we'll ever see true self-driving cars. Most likely more driver aids. Automakers will never accept total liability for every accident that occurs. Lawmakers will never give automakers total immunity.

^this

Or self-driving cars get their own lanes and are only self-driving on highways and in other safe environments!


Regarding airplanes: I have never understood why there are no restrictions on what the pilots can do and override. How the hell can it be possible to crash a perfectly working plane into a mountain?! This should be prevented by the electronics in all scenarios! Can't be that hard!

