John Ducker: Who's to blame in crash with driverless car?

A parking lot is full of Uber self-driving Volvos in Pittsburgh in March 2020. Both Uber and Lyft have since sold off their self-driving vehicle projects to bigger corporations. THE ASSOCIATED PRESS

Technology is moving fast in the vehicle world. Even before truly autonomous vehicles hit our streets in large numbers, big issues have popped up — things that cannot be ignored, even though we are still years and perhaps decades away from having to face these problems as individual drivers.

For example: Who’s to blame in a crash involving a driverless car?

Already several U.S. jurisdictions are trying to wade through this legal morass, even as the whole driverless car concept is still in its infancy.

The highest-profile case so far comes from Phoenix, Arizona.

Rafaela Vasquez, a “back-up” driver in an Uber company vehicle, was charged with negligent homicide after the autonomous Volvo she was overseeing failed to spot a woman walking a bicycle across a nighttime street in 2018.

The technological and legal factors woven through this incident are mind-blowing.

Uber officials had disabled the manufacturer-installed automatic braking system because, for some unexplained reason, it wouldn’t work with the vehicle’s self-driving system.

Then the vehicle’s detection system initially failed to identify the woman crossing in front of it as a hazard. When the car finally determined that an emergency stop was required, it did not alert Vasquez. The car plowed into the woman at 60 km/h without braking, killing her instantly.

Complicating things further were the on-board readings and video from inside the Uber car showing that Vasquez may have been viewing an episode of The Voice when the victim was struck. She and her lawyers deny that, saying that she was only listening to the program while attending to an Uber app as required by her employer.

Some legal experts and pundits say it was Uber that should have been charged. The company’s requirements for operating the vehicle, combined with its deliberate disabling of obvious safety features, make it morally, if not legally, culpable.

While there is something to that, it will be difficult for the defence to argue that a person inside a car, however diminished their role, bears no responsibility for the conduct of that vehicle.

Yet Vasquez still has her defenders. Pundits and researchers say what happened made her a “moral crumple zone.”

Coined by Madeleine Clare Elish, co-founder of the AI on the Ground Initiative at the Data and Society Research Institute, the term describes instances where a corporation places a driver inside an autonomous vehicle not to be a fail-safe in times of trouble, but simply to be a scapegoat when things go wrong.

In this scenario, a complex system, monitored by more than one person, failed. As it stands for the moment, the person inside the car is deemed liable, although she arguably had no genuine control over the vehicle’s actions.

Uber was quick to turn all of its data over to investigators, including video that shows Vasquez looking at something other than the road ahead at the time of impact. In doing so, Uber has escaped prosecution for now and ensured Vasquez became the moral crumple zone, protecting the company from criminal liability the way a bumper or an airbag protects vehicle occupants in a conventional crash.

The case has dragged on now for four years, with no end in sight.

So where to from here? In terms of figuring out criminal liability — who knows?

The U.S. National Transportation Safety Board issued a damning report on the incident, finding that Vasquez would still have had time to react properly had she been paying attention. But the Board also lambasted Uber for “inadequate safety cultures” and for the lax oversight of the program that led to “shortcomings.”

Meanwhile in Europe (once again), the UK and France have already laid down strict legal frameworks to govern autonomous vehicles. The UK has said that when a vehicle crashes while in autonomous mode, responsibility will rest solely with the self-driving car’s manufacturer.

France has already approved regulations that clearly waive responsibility for a human operator inside a vehicle operating in autonomous mode. Germany and Japan are on a similar track.

These moves by governments, and the series of high-profile crashes in the U.S., have given Silicon Valley pause. Both Uber and Lyft have sold off their autonomous vehicle projects to bigger corporations.

Tech watchers predict that the crashes, other major technical glitches and ongoing legal troubles mean the truly self-driving car is still years away.

That’s a good thing. We need a product that, at a minimum, matches airliner safety standards before opening the floodgates any further.

I hope regulators here are watching closely. The pause is a big opportunity to earnestly prioritize the safety of Canadian road users in the future.

johntcdriving@gmail.com