On September 14, 2016, Uber put its new self-driving taxis on the road in Pittsburgh. On the one hand, the city feels like it's on the cutting edge of technology, paving the way for newer, safer modes of transportation. On the other hand, safety experts remain concerned that Uber is jumping the gun and putting new technology on the road before it's ready.
Part of the problem is that the state of Pennsylvania hasn't yet passed laws that would provide guidance should one of these self-driving cars become involved in a crash or other incident.
That may soon change, as federal regulators are due to release new guidelines for automakers and tech companies requiring, among other things, that the vehicles meet a 15-point safety assessment.
Self-Driving Cars May Be Confused by Pittsburgh’s Bridges
Some citizens are excited about stepping into a self-driving car, but others feel like unwitting participants in a giant experiment launched without their consent. Though Uber has promised that its self-driving vehicles will still have a driver behind the wheel, safety experts are concerned about unexpected road conditions that can throw a computer off and potentially cause an accident and injuries.
According to the Washington Post, researchers have found that bridges can confuse these types of vehicles, and Pittsburgh has more bridges than any other major U.S. city. The National Highway Traffic Safety Administration (NHTSA) stated that commuters in the city are essentially "guinea pigs," and that they can expect that there will be crashes.
Uber is the first company to include everyday commuters in a self-driving vehicle test. Other companies, including Google and GM, are also conducting driving tests on real roads, but they're not inviting passengers to come along for the ride.
Uber, however, is now giving its loyal customers in Pittsburgh the choice of a regular Uber driver or a self-driving car. The company is starting out with four vehicles in the city.
Autonomous Vehicles Aren’t Perfect Yet
Other studies have shown that self-driving cars have trouble in bad weather. Snow and rain puddles can make it harder for them to detect lines on the pavement, which may cause them to drift out of their lane.
Basically robots on wheels, self-driving vehicles don't understand human gestures well, such as those given by a human crossing guard. Some experts have recommended that the vehicles not be allowed near schools. A GPS jammer (used to block police from tracking a vehicle) will also confuse a self-driving vehicle's navigation system. Tests in California found that Google's autonomous vehicles failed 341 times between September 2014 and November 2015.
Lawmakers in Pennsylvania have proposed legislation that would require companies testing self-driving cars to have insurance, as well as to report crashes, times when a human driver had to take over the wheel, and security breaches. Those laws haven’t been passed yet, however, so Uber is not subject to them.
President Obama, commenting on the forthcoming federal guidelines concerning automated vehicles, stated in an editorial posted September 19, 2016, “And make no mistake: If a self-driving car isn’t safe, we have the authority to pull it off the road. We won’t hesitate to protect the American public’s safety.”
The first-ever White House Frontiers Conference will be held in Pittsburgh on October 13, 2016. The purpose of the event is to explore the future of innovation in America, with a focus on science and technology.
Exclusively focused on representing plaintiffs, especially in mass tort litigation, Eric Chaffin prides himself on providing unsurpassed professional legal services in pursuit of the specific goals of his clients and their families. Both his work and his cases have been featured in the national press, including on ABC’s Good Morning America.