A recent statement by the Yavapai County, Arizona attorney, reported by Reuters, reveals the decision that Uber will not be held criminally liable in the crash one year ago that killed Elaine Herzberg in Tempe, Arizona. I have written extensively about the crash, which became famous as the first fatality in the testing of robocars.

The police are still considering charges against Uber’s sole safety driver, who is alleged to have been watching a streaming TV show on her phone prior to the crash. Uber settled quickly with some family members of the deceased, but was told to stop testing in Arizona by the governor. They voluntarily stopped in all other locations.

I agree with the police that most of the culpability lies with a safety driver who would completely ignore her job and watch a video. The safety driver’s job is to diligently watch the road and be ready to intervene in the event of any problem, or even likely problem. An alert safety driver would almost surely have prevented this crash.

While I and many others have written a great deal about the serious flaws in Uber’s software, which were the proximate cause of the crash, it is important to understand that these were entirely secondary to the apparent real cause: negligent safety driving. The robocars on the road today are all prototypes undergoing testing. Because of that, like teen drivers in driving school, they are watched at all times by (usually) two safety drivers, who watch the road and the software and are trained and ready to intervene in the event of any problem. I did some of this training in my time at Google (now Waymo).

All the prototypes on the road have had many software errors which would have caused a crash without intervention. This is true even of Waymo, the most advanced team. All teams have these errors quite frequently in the early days of their testing, and have fewer of them as they improve. Some teams, with Waymo again in the lead, are going tens of thousands of miles between needed interventions. Leaked reports suggested Uber was going about 13 miles between interventions at the time of the incident. Reports to California indicated that Apple’s cars were going less than a mile between interventions for much of last year.
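To put those reported figures on one scale, here is a quick sketch. The Waymo number below is my own assumed round figure in the "tens of thousands" range, and the Apple figure is treated as an upper bound of one mile; only the Uber number comes from the leaked report cited above.

```python
# Hedged comparison of reported intervention rates. Only the Uber figure
# comes from the leaked report; the Waymo and Apple values are assumed
# round numbers consistent with the ranges described in the text.
reported_miles_per_intervention = {
    "Waymo (assumed round figure)": 10_000,
    "Uber (leaked report)": 13,
    "Apple (upper bound)": 1,
}

uber = reported_miles_per_intervention["Uber (leaked report)"]
for team, miles in reported_miles_per_intervention.items():
    # Express each team's interval as a multiple of Uber's interval.
    print(f"{team}: {miles:,} miles per intervention "
          f"({miles / uber:.2f}x Uber's interval)")
```

Even with generous rounding, the gap between the leader and Uber at the time was on the order of three orders of magnitude, which is the point of the comparison.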

In spite of this, the safety driver approach works remarkably well. Waymo has now passed 10 million miles of testing with only one minor accident in which it was at fault. That’s a much better safety record than the average human has; we would probably have 10 at-fault accidents in that time period serious enough to get reported to police. That’s good news, because it shows that testing with safety drivers does not put the public at any unusual risk. In fact, it’s less risky than any other driving a company might have its employees do.
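As a rough sanity check on that comparison, the arithmetic can be sketched as follows, assuming a ballpark human rate of about one police-reported at-fault crash per million miles driven. That rate is my own assumption for illustration, not an official statistic.

```python
# Rough comparison of Waymo's test record with an assumed human baseline.
# The human crash rate is a ballpark assumption (about one police-reported
# at-fault crash per million miles), not an official figure.
waymo_test_miles = 10_000_000       # miles of testing cited in the article
waymo_at_fault_crashes = 1          # minor at-fault incidents cited

human_crashes_per_million_miles = 1.0   # assumed ballpark rate

# Expected at-fault crashes for average human drivers over the same miles.
expected_human_crashes = (waymo_test_miles / 1_000_000
                          * human_crashes_per_million_miles)
print(f"Expected human at-fault crashes: {expected_human_crashes:.0f}")
print(f"Observed in Waymo testing: {waymo_at_fault_crashes}")
```

Under that assumed rate, the expected human figure works out to about 10 crashes over the same mileage, matching the rough number in the text.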

Uber was not doing safety driving well at all. While most teams use two safety drivers (the second one can’t take the wheel but can shout) and Uber used two until a few months before the accident, Uber decided to switch to only one. Rumors suggest this was done in order to speed up testing in advance of a big demo, but this would be odd, because for most large-company projects, the limiting factor in their testing is the number of vehicles, not the number of safety drivers. Even so, they did drop to just one.

In addition, it seems clear in hindsight that they were not vetting the drivers well, nor supervising them well. Uber had no tools in place to monitor the safety driver and ensure her attention stayed on the road. To be fair, many teams did not have such monitoring in place at the time, though free software packages to do this have since been released, and teams have integrated or are integrating this approach. Uber did keep a video recording of the cabin so they could see, after the fact, what the safety drivers were doing, but apparently they did not use it to enforce discipline.

One can speculate that the reason the state attorney is not charging Uber is the conclusion that they were betrayed by the safety driver they hired. (She is an ex-con, which Uber was very aware of because she was hired through a program to help rehabilitate such people.) There is no excuse for watching a video when you are being paid to watch the road to ensure safety.

There is no excuse, but we can’t ignore the forces that might lead to such action. We see the same forces in Tesla drivers who, told that they must constantly watch the road while using Tesla Autopilot, look away from it and suffer some crashes and probably a few fatalities. It is human nature to slack off at jobs requiring constant concentration. And there is also a human failing when it comes to trusting technology. When the car is driving itself, it’s easy to believe you can get away with looking away. Usually just for a short time (people regularly type text messages in cars that can’t drive themselves at all) but in some cases longer. Even when you’ve been very clearly instructed that you must not do that, people will not reliably follow that instruction. This is one reason that most teams use two safety drivers. (The other core reason is that the second person monitors the software to make sure it is operating properly and not giving warnings. The driver behind the wheel must not do that.)

Two drivers also form a social bond and keep each other alert. A second staffer would surely never have allowed the main driver to watch a TV show.

Uber put their safety drivers in a position where it should be expected they would fail — even if rarely. But this driver went beyond that, putting herself, her employer and the public at risk with a dereliction of duty that goes beyond the pale. Most culpability lies with her, but Uber is not innocent, either.

There’s other blame too. Because the victim was a little high and crossing at a “do not cross here” sign, legally she did not have right-of-way and the fault was hers under the traffic code. And the city of Tempe is being sued by some family members because this location, in spite of the “do not cross” signs, has a paved sidewalk in the median which just begs “here is where to cross,” though not in words. Most of the family settled quickly with Uber.

And yes, Uber’s software screwed up royally too. While we await the NTSB investigation, the failings in that software have not been fully disclosed, but it seems clear that Uber’s project was quite immature. Waymo operates with one safety driver (and sometimes no safety driver) in the same area, but with a system that is vastly more mature. Uber’s project was so immature that it was, one can argue, negligent for them to operate it with only one safety driver, though not criminally so, according to the county attorney. I have an outline of the major issues with the software for those with interest. Many ascribe blame to the fact that Uber’s system was not reliable enough to do emergency braking, and so they depended on safety drivers to do that, but again, these are prototypes which have these sorts of faults. It could surely have given a warning, however. And disturbing suggestions have come out that all this happened in order to make an important demo for the CEO appear smoother. We hope to learn more about that in the final NTSB document.

The fatality here set the industry back. It increased public fear and distrust of the vehicles and the testing, in Arizona and beyond. That slowdown will, perversely, have a serious cost in lives, because it means the technology will reach saturation deployment later, when it could be saving huge numbers of lives by taking cars out of the hands of reckless human drivers. Each year of delay in deployment will result in the deaths of 20,000 to 30,000 Americans and perhaps 500,000 or more people worldwide. You read me right: if Uber’s errors here delay all deployment by a year, half a million people will die needlessly. Not at the “hands” of robocars, but at the hands of people manually driving because their robocar was delayed. It’s a sobering number.
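The claim above can be reproduced as back-of-envelope arithmetic. The baseline fatality counts below are approximate public figures, and the "avoidable fraction" values are my own illustrative assumptions about what share of deaths robocars could eventually prevent, not measured quantities.

```python
# Back-of-envelope version of the delay-cost argument. The baseline
# counts are approximate public figures; the avoidable fractions are
# assumptions for illustration only.
us_road_deaths_per_year = 37_000         # approximate recent US figure
world_road_deaths_per_year = 1_250_000   # approximate worldwide figure

avoidable_fraction_us = 0.7      # assumed share robocars could prevent
avoidable_fraction_world = 0.4   # assumed lower share (slower rollout)

us_cost_per_year_of_delay = us_road_deaths_per_year * avoidable_fraction_us
world_cost_per_year_of_delay = (world_road_deaths_per_year
                                * avoidable_fraction_world)

print(f"US lives per year of delay: ~{us_cost_per_year_of_delay:,.0f}")
print(f"Worldwide lives per year of delay: ~{world_cost_per_year_of_delay:,.0f}")
```

With those assumed fractions, the numbers land in the 20,000–30,000 US range and near half a million worldwide, which is how one arrives at figures of the magnitude cited in the text.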

Of course, if the safety driver faces criminal charges, it won’t answer many questions about how robocars should be developed. We already know that watching a TV show was a bad thing to do. In that sense, the criminal part of this story is over. A few civil issues remain. Because Elaine Herzberg was homeless, a wrongful death suit would have been unlikely to win major damages, and because she was crossing at the wrong place, the vehicle code was not in her favor. (The Arizona vehicle code still requires a driver to make reasonable efforts to avoid a pedestrian even if they should not be there, but police decided not to apply this.) This event won’t end up telling us very much about the law of robocar accidents in the long run. That is yet to come. Uber has vowed to return to testing on the streets, but a year later has done little more than let human drivers operate their test vehicles manually. For a time it seemed as though Uber’s project might suffer a death penalty because of this incident but Uber continues on.

Let’s hope it’s a long time before another challenge of this sort arises.