Topic: That trolley problem and self-driving cars
Message: Posted by: landmark (Sep 25, 2017 07:33AM)

Synopsis: When confronted with alternate tragic scenarios, a self-driving car will have to make a decision as to what to do. That decision will be governed by some algorithm programmed by human beings. In effect, the programmers will be making some very intense, complex moral decisions...
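To make the point concrete, here is a purely hypothetical sketch of what such a "consistent rule-based algorithm" could look like. Every name and weight below is invented for illustration; the point is that someone has to choose and encode the weights, and that choice is the moral decision.

```python
def choose_action(outcomes):
    """Pick the action whose predicted outcome has the lowest total cost.

    `outcomes` maps an action name to a dict of predicted harms.
    Illustrative only: real systems and their criteria are not public.
    """
    # The moral judgment lives entirely in these hypothetical weights.
    weights = {
        "pedestrian_injuries": 100.0,
        "occupant_injuries": 100.0,
        "property_damage": 1.0,
    }

    def cost(harms):
        return sum(weights[k] * v for k, v in harms.items())

    return min(outcomes, key=lambda action: cost(outcomes[action]))


# Two tragic alternatives, as in the trolley framing:
decision = choose_action({
    "swerve":   {"pedestrian_injuries": 0, "occupant_injuries": 1, "property_damage": 5},
    "continue": {"pedestrian_injuries": 2, "occupant_injuries": 0, "property_damage": 0},
})
print(decision)  # "swerve" -- because the weights say so, not because it is "right"
```

Change the weights and the "moral" decision flips, which is exactly why who writes them matters.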
Message: Posted by: Jonathan Townsend (Sep 25, 2017 07:39AM)
But that self driving Mercedes costs more to repair than the pedestrians have in health insurance...

Message: Posted by: landmark (Sep 25, 2017 08:06AM)
First rule to be encoded: "Kill all the lawyers..."
Message: Posted by: Jonathan Townsend (Sep 25, 2017 09:37AM)
Sadists getting dolts to squeal about imaginary suffering and outcomes.

Correct answers include:
Are they poor / black?
Will my insurance rates go up?
Would it make you unhappy if I just stopped the trolley and nobody had to die?
How about we do this with your family, pets and some visiting dignitaries' vehicle which is unmanned but delivering a pizza?
At an intersection does Amazon.com have to yield to Pizza Hut?

Back in 2008 there was a TV story, "Poison Sky", about worse outcomes for navigation and autopilot in cars.

What about two ambulances at an intersection?
What if someone has a car that registers as priority traffic to the automated system?

Old questions. meh
Message: Posted by: landmark (Sep 25, 2017 12:09PM)
Jon, I always thought the framing of the trolley car problem was useless, precisely because humans are [i]not[/i] self-driving cars who make deep decisions on some sort of consistent rule-based algorithm. So agreement on the meh as it applies to the abstract formulation.

But, I don't think the subject is dismissable, because, in fact, self-driving cars have the corporate, military, and spook communities strongly in favor of the technology's dissemination. They will be a large reality soon enough. And unlike humans, they [i]will[/i] have to have consistent algorithms built in.

The one thing, though, that always amused me was wondering how they will be programmed to respond to road rage after they cause a major accident--or simply cut someone off a little too quickly. A sensor for a bat or gun approaching the window?

And your Amazon vs. Pizza Hut comment I think is prescient and very relevant. Looking forward to the Supreme Court cases, as one company tries to sabotage another company's right of way.
Message: Posted by: Jonathan Townsend (Sep 25, 2017 12:42PM)
It's pre-programmed overrides that worry me. And the likely inclusion of code to receive overrides.

Folks might recall police car-stopping technology attempts from decades ago. Now it's no big deal to lock the passengers inside, tell the vehicle where to go and then let them stew till they're ready to cooperate.

Elementary, Watson...the motive is the means.
Message: Posted by: landmark (Sep 29, 2017 07:40AM)
[quote] And the likely inclusion of code to receive overrides.[/quote]
And of course you're implying here not only gifts from the Police State, but non-aligned mischief as well.
Message: Posted by: tommy (Sep 29, 2017 08:44AM)
Don't worry, it will be expertly programmed like the green people are by the IPCC.