The Legal Examiner

In November 2016, Ohio Governor John Kasich announced a $15 million investment in a 35-mile stretch of road, dubbed the “Smart Mobility Corridor,” for the testing of self-driving vehicles. The corridor runs between Dublin and East Liberty and will carry fiber-optic cable that transmits data on self-driving vehicle operations.

The data will go to researchers at the Transportation Research Center at East Liberty and the Ohio State University’s Center for Automotive Research. As Ohio barrels toward the future, some may be curious about what laws are in place to govern these vehicles should they get into an accident.

Federal Government Recently Released New Automated Vehicle Policy

The U.S. Department of Transportation (DOT) released its first automated vehicles policy in September 2016. The policy includes a 15-point safety assessment that sets clear expectations for manufacturers developing and deploying automated vehicle technologies, and it differentiates federal responsibilities from state responsibilities.

Federal responsibilities, for example, include setting safety standards for new motor vehicles and their equipment, enforcing compliance with those standards, managing the recall and remedy of noncompliant and defective vehicles, educating the public about safety issues, and issuing guidance as needed to achieve national safety goals.

State responsibilities, on the other hand, include licensing drivers and registering motor vehicles, enacting and enforcing traffic laws and regulations, conducting safety inspections as needed, and regulating motor vehicle insurance and liability.

Everything in this policy, however, is voluntary for now. Meanwhile, it remains up to the states to create their own automated vehicle laws.

Who is Liable in an Accident with a Self-Driving Vehicle?

As of this writing, Ohio has no approved laws regulating self-driving vehicles, and the new corridor is the only place where these vehicles may legally be driven.

The lack of specific legislation may make it difficult to assess liability in the case of an accident; for now, existing traffic and negligence laws still apply. Negligent behavior may include failing to signal when turning, speeding, driving recklessly, or disobeying traffic signals.

It would seem that a self-driving car that violates any of these laws and gets into an accident would at least share responsibility, but so far we don’t know how this will play out.

In May 2016, a Tesla Model S operating in self-driving mode crashed into a truck, allegedly because its system failed to detect the truck ahead of it. The driver did not override the system, allegedly because he was distracted by a movie at the time, and he tragically died in the crash.

This wasn’t the only fatal accident involving a Tesla in self-driving mode. Another occurred in China in January 2016, and again the driver died in the crash. The man’s family has filed a lawsuit against Tesla; the company says it is not liable because drivers agree by contract to keep their hands on the wheel even when the autopilot is engaged.

Since self-driving cars are marketed as requiring no driver, the only thing left to blame in an accident would be the technology and, by extension, the automaker. As self-driving vehicles become a reality on our highways, automakers may actually face greater liability than they do today, when drivers themselves are usually held responsible.



One Comment

  1. Matt Monroe

    Though it's become a bit of a cliché to use "the trolley problem" in discussions involving self-driving cars, the reality is that, at some point in the very near future, a self-driving car will have to make a life-or-death decision quite similar to the trolley problem, and that decision will be something along the lines of: "Do I kill my passengers OR do I kill the pedestrian who has just stepped into the crosswalk in front of me?"

    And let's be blunt here: it's the self-driving car itself that will have to make this "who to kill" decision.

    And when an incident like this does take place (and I believe a scenario like this will end up taking place quite soon), there will be complete and total legal turmoil when a court decides on final liability. It doesn't matter whether the car manufacturer is pinned with the liability, or the software developer(s), or even the owner of the car. Raging debates, and clumsily enacted legislation, will take place, followed by even more debates and more clumsily enacted legislation.

    I suspect there will be a good five- to ten-year sorting-out process, first with the states and then at the federal level, before sane and well-thought-out legislation and policies are put into place with regard to self-driving car accidents, injuries, and deaths.
