The futuristic inventions of your childhood sci-fi movies, from video calls to self-driving cars, have all become our reality. However, the laws needed to manage the complications of these inventions are not fully developed yet. Unlike a Zoom call or a Google smartwatch, self-driving cars require heavy regulation because lives are at stake.
There isn’t much legal precedent for what happens in self-driving car accidents across the US. The technology is too new. Simply put, legal precedent is how past rulings of court cases define the law. The outcome of these cases can decide the laws moving forward and define who is liable in accidents, what settlements may be like, and the culpability of corporations.
Laws and regulations surrounding autonomous vehicles are still not consistent across all states. So if you get into a car accident in Indiana with a Tesla, you may face different consequences than in Arizona. It’s important to have an idea of the laws in your state governing liability in self-driving car accidents before purchasing your first autonomous vehicle.
Any accident can be scary and overwhelming. When you add a car with advanced technology like a Tesla, that can add an extra layer of stress. Will you have to file a lawsuit against the car company to cover medical bills? Or worse, are you required to cover any damages if there’s a technological error? Who is at fault and who is responsible? That’s still somewhat up for debate.
This article will give you a picture of some past rulings and the scope of the law as it relates to self-driving cars and liability. Whether you're at fault or seeking damages, it's important to get a clear understanding of the law regarding cars with autopilot features, along with some statistics about their safety.
Are self-driving cars legal anywhere? (And does it matter?)
Anyone who has spent time in the long lines at the DMV knows anything automobile-based requires a lot of bureaucracy and regulation. While self-driving cars are a great invention, they are not fully cleared for the road at the federal level. Only a handful of states have officially authorized self-driving cars via legislation.
Technically, it's not illegal to own a self-driving car anywhere in the US. Currently, they are cleared for the road as long as someone is in the driver's seat. There's also no clear language in the law about how much of the driving the driver must do versus the car. This introduces a gray area. After all, autonomous vehicles were not something lawmakers even imagined when initially writing vehicle and safety regulations. This ambiguity means there are more autonomous cars on the road than our government may be ready for.
While there are NHTSA guidelines on self-driving cars in the US, these are just a framework. Liability is still decided on the state level based on what your state decides when drafting regulations on autonomous vehicles. Are they interpreting the law as-is to accommodate self-driving vehicles or will new laws need to be enacted to cover this changing landscape? Your fender bender with a self-driving Uber could become a landmark case that defines the law in your area. This is why finding the right law firm is essential.
The law is simply not as quick as cutting-edge technology. The NHTSA has just updated its regulations to include self-driving cars that do not have standard driver controls. This would be to accommodate cars like GM’s Cruise Origin which no longer has a steering wheel. While the NHTSA is still working on regulations and collecting data, it’s clear technology is developing faster than legislators are able to enact laws concerning AVs (autonomous vehicles).
The challenges of liability
The challenge autonomous vehicles pose with regard to liability is that more than one liability framework can apply to an AV accident. Self-driving cars may involve both tort liability and product liability. Tort liability covers your standard driver-based liabilities: negligence, no-fault, and strict liability.
But if you aren't driving, are you at fault as the owner of the car, or is it a technical issue? That brings in product liability, which covers negligence, strict liability, manufacturing defects, design defects, and failure to warn, and which would fall on the car manufacturer or its subsidiaries. The cause of the accident could be computer-based, a manufacturing issue, or negligence in preparing AV drivers for the road. This can also complicate a self-driving car lawsuit or accident case by introducing multiple parties.
Accident Statistics for Self-Driving Cars
How safe are self-driving cars? Here are some helpful statistics to give a clearer understanding of the landscape.
- In Phoenix, Arizona, Waymo's vehicles were involved in 47 collisions and near-misses over the course of 20 months. Luckily, none of these accidents resulted in serious injuries, and "nearly all" of the collisions were the fault of the other driver.
- Since 2016, three Tesla drivers have died in crashes in which the Autopilot feature was engaged and failed to detect obstacles in the road.
- Between September 2016 and March 2018, there were 37 crashes involving Uber test vehicles in autonomous mode, 33 of which involved another vehicle striking the test vehicle.
Ultimately, it does seem that self-driving cars aren't necessarily more dangerous. But what can go wrong ranges from simple driving mistakes and accidents to issues involving both technological and human error. The following case studies explore liability. (Trigger warning: the following case studies reference wrongful death lawsuits and people who lost their lives in accidents.)
Case Study: Pedestrian killed by an Uber self-driving car
Elaine Herzberg's death in 2018 is the first known case of a pedestrian killed by a self-driving car. In this case, the backup driver faced criminal liability while Uber did not. The accident was ruled driver negligence because the backup driver was watching a television show on a phone while driving. The backup driver is meant to be the final check, so while the computer did not register Herzberg, the driver was supposed to.
This case opens up an interesting potential precedent with regard to self-driving cars and liability. Essentially, while you may not be actively driving the car, you are still taking on the liability and responsibility for the vehicle. This dispels fantasies of doing something else, like working on a laptop or watching YouTube videos, while your car drives for you. Also, given that this was an Uber and not a personal vehicle, the liability fell on the driver, who was there in a professional capacity.
It also suggests an interesting future for self-driving cars: rather than individuals purchasing, owning, and operating their own AVs, companies like Uber and Waymo may own fleets of self-driving cars that people can use as taxis.
Case Study: Tesla driver killed 10 seconds after initiating Autopilot
Jeremy Banner’s 2019 death occurred after initiating autopilot on his 2018 Tesla Model S, which then crashed into a semitrailer crossing its path on a Florida highway. The subsequent lawsuit references an almost identical accident in 2016 involving a Tesla S on autopilot crashing into a semitrailer on the same Florida highway, killing driver Joshua Brown.
The lawsuit claims Banner believed Tesla’s vehicle autopilot system and various safety components would “prevent fatal injury resulting from driving into obstacles and/or vehicles in the path of the subject Tesla vehicle.” It also claimed that Tesla “specifically knew that its product was defective and would not properly and safely avoid impacting other vehicles and obstacles in its path,” and “had specific knowledge of numerous prior incidents and accidents in which its safety systems on Tesla vehicles completely failed causing significant property damage, severe injury and catastrophic death to its occupants.”
This case opens up the topic of negligence. Does fault fall on Banner, similar to the driver in the Herzberg case, for not having his hands on the wheel? Or did he in good faith believe that Tesla's safety measures and Autopilot feature would protect him on the road? Is Tesla at fault for not being clear about the full scope of the Autopilot feature and the actual responsibility of drivers? The fact that a similar accident also caused a death does open Tesla up to some liability in this case.
Case Study: Crash involving a Tesla Model 3
In 2019, on a California freeway, Benjamin Maldonado turned on his turn signal, and as he moved right he was hit by a Tesla Model 3 traveling about 60 miles per hour on Autopilot. His 15-year-old son, who was in the passenger seat and not wearing a seatbelt, was thrown from the vehicle.
Suffice it to say, these accidents can be cause for concern. While Tesla may not take on full culpability since the passenger was not wearing a seatbelt, the case still raises the question of whether this accident would have happened if Autopilot had not been engaged or if the backup driver had been fully attentive.
How could a car with multiple checks, including its own safety systems and a backup driver, not be at least partly at fault in this case? It also opens Tesla up to possible culpability for not clearly marketing a feature that still requires you to actively engage in the driving and safety of your vehicle.
Does it fall on Tesla for not having fully explained to drivers that autonomous driving doesn’t mean that you can take your eyes off the road or off the wheel?
While autonomous driving can seem like a flashy new feature, its liability and safety are still in question, as human, technical, and vehicle error can all result in injury or death. The law is still being written, and if you are involved in an accident with an autonomous or self-driving vehicle, your case may just decide the law for future victims.