Jason Bolton turned on Tesla’s controversial “Autopilot” system while driving his Model 3 from his Fresno home to a business meeting in the Bay Area. But just as he reached the Route 152 summit, the system malfunctioned, sending the car careening into a fatal crash, a lawsuit against Tesla by his wife and two daughters claims.
The black 2023 sedan smashed down onto its roof, then rolled onto its wheels, according to the lawsuit filed Tuesday in Santa Clara County Superior Court by Bolton’s widow Linda and daughters Rowan and Willow. Photos included with the lawsuit show the top of the car pancaked, with the windshield and windows blown out.
Bolton, a 49-year-old employee benefits administrator, “suffered gruesome and ultimately fatal injuries” in the July 2023 crash, the wrongful death lawsuit said.
The family claims in the lawsuit that the electric car maker, seeking profits and in “blind pursuit of market dominance,” released the “Autopilot” system when it was unready for public use. Tesla, led by CEO Elon Musk, used “deceitful marketing tactics” to “deliberately misrepresent” the capabilities and limitations of its technology, and “manipulated consumers into believing [‘Autopilot’] is capable of hands-free driving.”
The lawsuit accused Tesla of “a stunning disregard for basic ethics and consumer safety.”
The legal action comes on the heels of numerous lawsuits, and state and federal probes, related to the “Autopilot” system, which does not automatically pilot a car. The basic system provides cruise control and steering assistance; an enhanced version adds navigation, automated lane changes and automated exiting.
Tesla did not immediately respond to requests for comment on the lawsuit and the government investigations. Its owner’s manuals warn “Autopilot” users to keep their hands on the steering wheel at all times and “be mindful of road conditions, surrounding traffic, and other road users.”
In April, on the eve of a jury trial expected to delve deeply into the “Autopilot” system, Tesla settled, for undisclosed terms, a lawsuit filed by the family of Apple engineer Walter Huang, a married father of two from Foster City. Huang was killed on Highway 101 in Mountain View in 2018 after “Autopilot” steered his Tesla Model X SUV into a freeway barrier. Federal investigators had cited Huang’s “over-reliance” on “Autopilot” while distracted, probably by a game on his phone, but lawyers for his family have noted that there was no definitive finding he was playing a game.
The first lawsuit to blame “Autopilot” for a deadly crash — filed by the family of the deceased driver, Micah Lee, and two passengers seriously injured in the Southern California crash — went to trial last year, with the jury siding with Tesla.
Central to the lawsuits and investigations of “Autopilot” is whether Tesla’s marketing and the system’s name encourage drivers to take their hands off the wheel and their attention off the road. The system’s ability to recognize stopped emergency vehicles and respond appropriately has also been called into question.
A 2019 survey by the Insurance Institute for Highway Safety found that Tesla’s “Autopilot,” more than any other manufacturer’s driver-assistance system, led drivers to overestimate its capabilities, with 48% saying they believed it would be safe to use the system hands-free.
The National Highway Traffic Safety Administration has been investigating “Autopilot” since August 2021, at first looking into 17 incidents in which a Tesla on “Autopilot” ran into a parked emergency vehicle on a highway. In 2022, the agency said it had widened the probe “to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”
In March 2023, the agency said it was launching a special investigation of a fatal crash the previous month in which a Tesla Model S sedan collided with a ladder truck from the Contra Costa County fire department. The driver of the Tesla was killed, a passenger was critically injured, and four firefighters suffered minor injuries.
In 2022, the California Department of Motor Vehicles filed an administrative complaint claiming Tesla deceptively advertised “Autopilot” and its “full self-driving” system in ways that contradicted its own warnings that the features require active driver supervision.
Tesla in December recalled nearly all 2 million of its vehicles in the U.S. after a two-year investigation by federal regulators into some 1,000 crashes occurring during “Autopilot” use. The recall took the form of an over-the-air software update intended to give drivers more warnings when they fail to pay attention while using the “Autosteer” function of “Autopilot.”