Friday 8 July 2016

Self-driving cars - fatal Tesla car crash

Fatal Tesla Self-Driving Car Crash Reminds Us That Robots Aren't Perfect: The first fatal crash involving Tesla's Autopilot system highlights the contradictory expectations of vehicle autonomy (IEEE Spectrum)
On 7 May, a Tesla Model S was involved in a fatal accident in Florida. At the time of the accident, the vehicle was driving itself, using its Autopilot system. The system didn’t stop for a tractor-trailer attempting to turn across a divided highway, and the Tesla collided with the trailer. In a statement, Tesla Motors said this is the “first known fatality in just over 130 million miles [210 million km] where Autopilot was activated” and suggested that this ratio makes the Autopilot safer than an average vehicle.
The crash is also discussed by Kaydee in the Engineering Ethics Blog:
By all accounts, Brown [the 'driver' of the car, Joshua Brown] was a generous, enthusiastic risk-taker (his specialty when he was in the military was disarming weapons, according to a New York Times report), and hands-free driving went against the explicit instructions Tesla provides for the autopilot feature. But Tesla owners do it all the time, apparently, and until May 7, Mr. Brown had gotten away with it. ...
Still, telling drivers how great a self-driving feature is, and then expecting them to pay constant attention as though the car were a driver's ed student and you were the instructor, is sending a mixed message.
Kaydee makes an interesting comparison with the first recorded steam-locomotive railway fatality, which was:
...that of the English politician William Huskisson, who attended the opening ceremonies of the Liverpool and Manchester Railway on Sept. 15, 1830, which featured inventor George Stephenson's locomotive the Rocket. Wanting to shake the hand of his former political enemy the Duke of Wellington, Huskisson walked over to the Duke's railway carriage, then saw that the Rocket was bearing down on him on a parallel track. He panicked, tried to climb onto the carriage, and fell back onto the track, where the locomotive ran over his leg and caused injuries that were ultimately fatal. Passengers had been warned to stay inside the train, but many paid no attention.
If Huskisson's death had been mysterious and incomprehensible, it might have led to a wider fear of railways in general. But everyone who learned of it took away the useful lesson that hanging around in front of oncoming steam locomotives wasn't a good idea, and railways became an essential feature of modern life. Nevertheless, every accident can teach engineers and the rest of us useful lessons in how to prevent the next one, and the same is true in Mr. Brown's sad case.

Huskisson's accident - source: http://www.kidderminstershuttle.co.uk/news/regional/11805260.The_Walk__Under_the_shadow_of_death/
The particular interest for this blog, though, is the information ethics question of how responsibility for the accident should be attributed - and whether the fact that the car was driving itself makes any difference. In The Ethics of Information, Floridi draws a distinction between moral accountability and moral responsibility; perhaps in this case the car is accountable while the driver or Tesla (or both) are responsible, though I'm not sure whether that distinction really contributes anything useful here.
