A Utah woman whose Tesla Model S sedan crashed into a firetruck near Salt Lake City in May is suing the automaker, claiming the vehicle’s Autopilot system drove her into stopped traffic.
Heather Lommatzsch, 29, alleges in her lawsuit that Tesla salespeople told her when she was buying the car in 2016 that she would need to touch the steering wheel only occasionally with Autopilot activated. But instead of helping her avoid a collision, her vehicle drove her into the back of an idling firetruck, the lawsuit alleges, according to the Associated Press.
Tesla’s Autopilot is a semi-autonomous system that is supposed to keep the vehicle centered in its lane, maintain a safe distance from other vehicles on the road, change lanes when necessary, and avoid crashes.
Ms. Lommatzsch broke her foot in the crash, and police issued her a misdemeanor traffic citation for failure to keep a proper lookout while driving. The driver of the firetruck was also injured but did not require hospitalization.
Electronic data taken from Ms. Lommatzsch’s Tesla shows that it picked up speed for 3.5 seconds before it crashed into the firetruck and that the brakes were applied less than a second before impact.
“Police suggested that the car was following another vehicle and dropped its speed to 55 mph (89 kph) to match the leading vehicle. They say the leading vehicle then likely changed lanes and the Tesla automatically sped up to its preset speed of 60 mph (97 kph) without noticing the stopped cars ahead,” the AP reported.
Something similar played out when a Tesla Model X SUV crashed into a California highway barrier on March 23, killing the driver. The National Transportation Safety Board (NTSB) investigated the crash and found the vehicle was on Autopilot when it accelerated in the seconds before it collided with the barrier.
Tesla disputes allegations that its semi-autonomous Autopilot system is dangerous, saying that “drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times.”
But the driverless features of Tesla’s vehicles combined with the continued need for driver input may be at the heart of Tesla Autopilot crashes. After investigating the deadly crash of a Tesla Model S in Florida last year, the NTSB found that the vehicle’s “operational limits” played a major role in the accident.
According to the NTSB, the main shortcoming of Tesla’s semi-autonomous technology was the vehicle’s inability to monitor and ensure driver attention. Investigators found that Tesla’s system worked exactly as intended and did not identify any defects in the design or performance of the vehicle’s Autopilot systems.
However, they found fault in the “operational design” because the vehicle’s semi-autonomous driving technologies rely on driver input in certain situations, yet they cannot prevent drivers from fully relying on the vehicle to self-drive in all circumstances.