Plaintiffs' lawyers are pushing back against a number of regulatory proposals governing liability for autonomous vehicles and their manufacturers as these "robot cars" inch closer to becoming a reality on U.S. roads.
One main point of contention is a proposal to grant automobile manufacturers blanket legal immunity in cases where their autonomous vehicles crash and cause personal injury. The proposal comes as industry groups, legislators, lawyers, and others hash out the legal and regulatory framework for self-driving cars, which is proving to be as complex as the technology itself.
“Alternative immunity schemes” being considered include no-fault insurance systems, auto industry self-regulation, complete and partial immunity for automakers, and federal preemption of state laws, Law360 reports.
The American Association for Justice (AAJ), a national organization of plaintiffs' lawyers, issued a report called "Driven to Safety: Robot Cars and the Future of Liability," making the case for why blanket immunity proposals are a bad idea for everyone.
“Every time a new auto technology has been introduced, the civil justice system has played a key role in ensuring its safety,” American Association for Justice President Julie Braman Kane said in a statement. “Robot cars show tremendous promise for saving lives, but policymakers must ensure that when a robot car crashes, the injured, the families of those killed, or taxpayers don’t get stuck with the bill for the manufacturer’s failing.”
According to Law360, the report heralds the safety benefits of autonomous cars, such as reducing human error and the number of crashes, but insists that the U.S. civil justice system should serve as the main forum for determining safety requirements and liability issues when crashes occur. Some of the proposed regulations leave those decisions to regulators, legislators, and the automobile industry itself.
“Liability questions abound as to who is to blame when an autonomous, or robot, car crashes,” Law360 reports. “For instance, is the human to blame even if they’re not actually driving or is the car manufacturer or software designer to blame? If a car is conditionally or semi-autonomous and alerts the human driver to take over, when is the machine no longer responsible for driving?”
The AAJ argues that "the civil justice system is well-placed to handle such ambiguity," noting that the courts have proven to be not just the most capable agents for regulatory change, but also arbiters of what changes are needed in the technology itself.
“The courts have faced disruptive technologies many times before, and proved themselves able to adapt,” the AAJ report states. “The peculiarities of each innovation have been worked out by the common law on a case-by-case basis until a legal consensus is reached. While legislative bodies and government agencies often end up playing catch-up to technological change, the law is a living thing and is capable of evolving with technology.
“Attempts to circumvent accountability, through reform proposals that grant corporations immunity, will eliminate incentives to make vehicles safe, and almost certainly result in more lives lost,” AAJ said in its report.
Law360 notes that as robot cars come to dominate U.S. roads and highways, human-error crashes will fade, "leaving only crashes caused by design and manufacturing defect." This change will shift litigation away from personal injury lawsuits and toward product liability claims.
“If there is one proposal that might fit in an eventual driverless world it is strict liability,” the AAJ report states. Strict liability holds manufacturers accountable for all crashes their vehicles cause, thereby ensuring victims access to justice, while incentivizing corrections and improvements in the technology, the report argues.