In November 2016, Ohio Governor John Kasich announced a $15 million investment in a 35-mile stretch of road, dubbed the “Smart Mobility Corridor,” for the testing of self-driving vehicles. The corridor runs between Dublin and East Liberty and will be lined with fiber optic cable to transmit data on self-driving vehicle operations.
The data will go to researchers at the Transportation Research Center at East Liberty and the Ohio State University’s Center for Automotive Research. As Ohio barrels toward the future, some may be curious about what laws are in place to govern these vehicles should they get into an accident.
Federal Government Recently Released New Automated Vehicle Policy
The U.S. Department of Transportation (DOT) released its first automated vehicle policy in September 2016. The policy includes a 15-point safety assessment to set clear expectations for manufacturers developing and deploying automated vehicle technologies, and it differentiates federal responsibilities from state responsibilities.
Federal responsibilities, for example, include setting safety standards for new motor vehicles and their equipment, enforcing compliance with the safety standards, investigating any recall and remedy of non-compliances and safety-related vehicle defects, educating the public about safety issues, and issuing guidance as needed to achieve national safety goals.
State responsibilities, on the other hand, include licensing drivers and registering motor vehicles, enacting and enforcing traffic laws and regulations, conducting safety inspections as needed, and regulating motor vehicle insurance and liability.
Everything that’s in this policy so far, however, is voluntary. Meanwhile, it’s still up to states to create their own automated vehicle laws.
Who is Liable in an Accident with a Self-Driving Vehicle?
As of this writing, Ohio has no approved laws regulating self-driving vehicles. The new corridor under development is the only place where these vehicles may legally be driven.
Because no laws address self-driving vehicles specifically, assessing liability after an accident may be difficult; in the meantime, existing traffic and negligence laws still apply. Negligent behavior may include failing to signal when turning, speeding, driving recklessly, or disobeying traffic signals.
It would seem that a self-driving car that violates any of these laws and gets into an accident would at least share responsibility, but so far we don’t know how this will play out.
In May 2016, a Tesla Model S operating on Autopilot collided with a tractor-trailer crossing a Florida highway, allegedly because the system failed to detect the trailer in its path. The driver did not override the system, allegedly because he was distracted by a movie at the time. He tragically died in the crash.
This wasn’t the only fatal accident involving Tesla’s Autopilot. Another occurred in China in January 2016, and again the driver died in the crash. The man’s family has filed a lawsuit against Tesla. The company says it is not liable because drivers agree by contract to keep their hands on the wheel even when Autopilot is engaged.
Since self-driving cars are marketed as requiring no driver, the only parties left to blame in an accident would be the technology and the automaker. As self-driving vehicles become a reality on our highways, automakers may actually face greater liability than they do today, when drivers themselves are usually held responsible.
Exclusively focused on representing plaintiffs, especially in mass tort litigation, Eric Chaffin prides himself on providing unsurpassed professional legal services in pursuit of the specific goals of his clients and their families. Both his work and his cases have been featured in the national press, including on ABC’s Good Morning America.