On March 18, 2018, one of Uber’s autonomous vehicles struck and killed a 49-year-old pedestrian in Tempe, Arizona (a Phoenix suburb). A “safety” driver was in the driver’s seat and was found to be unimpaired, though they did not take the wheel to avoid the crash. The death is considered the first pedestrian fatality associated with autonomous vehicles. After the crash, Uber, a ride-for-hire service, halted all self-driving vehicle testing in Tempe, San Francisco, Pittsburgh, and Toronto, Canada. Police have yet to determine fault.
Driver error causes 94 percent of all vehicular crashes, an astonishing number, according to the National Highway Traffic Safety Administration. We’ve been told that autonomous vehicles might eliminate driver-error crashes; that claim simply isn’t true, though a significant reduction seems likely. The software that runs driverless cars is far from perfect, because no software is perfect; ask anyone who’s ever experienced the frustration of the “blue screen of death” on a PC. Driving requires so many simultaneous decisions involving others’ unpredictable behaviors that it may well be impossible for technology to protect against every possibility of a crash.
But we need to ask why the car in Tempe did not stop as it was supposed to, as well as why the backup driver didn’t take control of the car, which was traveling at approximately 38 miles per hour at the time of impact. In such a situation, is the crash due to negligence on someone’s part, or is it the technology’s fault (product liability)?
A Complex Question
Normally, a driver and his or her insurance company would be considered responsible for a driver-error collision—in other words, it would be negligence. But with an autonomous vehicle crash, it’s possible that, legally, a crash is not due to negligence, because technically the backup driver isn’t driving. Additionally, the collision may or may not be due to product liability, depending on legal interpretations. Clearly, there is fault at play—but how do you sue a “robocar”?
Sergei Lemberg, a California-based attorney, claims that Uber, Volvo (the maker of the SUV in the accident), the companies that supplied the self-driving technology, and even the backup driver could be held liable in a negligence scenario. Product liability does not loom large in his reading of the situation.
Others believe that the legal spotlight will focus on product liability, specifically on a theory of defective design, which imposes liability without fault. A defective design claim would need to demonstrate that an inherent flaw arising from the product’s design rendered it unsafe.
Legal scholar Bryant Walker Smith, a University of South Carolina assistant professor of both law and engineering, thinks that crashes involving driverless cars will shift fault from negligence to product liability. He has noted that drivers in Arizona (as in most states) are required by law to exercise “due care” to avoid striking pedestrians, even those outside a marked crosswalk, as was the case in Tempe. Therefore, the software in an autonomous vehicle must be designed to notice pedestrians who are crossing in unmarked areas.
Regardless of fault, however, Smith expects any litigation to be settled quickly by Uber to avoid public scrutiny: “Only if Uber believes that it was wholly without fault could I see this case going to trial.”
Who is Bryant Walker Smith?
Smith is a nationally known expert in autonomous driving and the law. He taught the first-ever course on self-driving vehicles and legal issues and led the autonomous driving and legal issues program at Stanford University. Before his legal career, Smith was a transportation engineer.
Locally, Smith may well become influential in South Carolina as our state attempts to determine what kinds of laws and regulations, if any, should be imposed on autonomous vehicles, and where liability should fall in cases of serious crashes involving such vehicles.