Tesla Knew About Autopilot Defect In 2019 Crash

While using the driver assist system marketed as “Autopilot” in his red Tesla Model 3 on a dark 2019 morning in Delray Beach, Fla., Jeremy Banner took his hands off the wheel and trusted the system to drive for him. It had been pitched to him as a system that could do just that. But the system’s sensors missed a tractor trailer crossing both lanes in front of him, and the car ran at full speed under the side of the trailer. The roof was ripped from the car, Banner was instantly killed, and the car continued driving for nearly a minute before coming to a stop at a curb. A judge ruled last week that Banner’s wife’s negligence lawsuit against Tesla can proceed to trial.

Bryant Walker Smith, a University of South Carolina law professor, told Reuters that the judge’s summary is significant because “it suggests alarming inconsistencies between what Tesla knew internally, and what it was saying in its marketing.”

“This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Smith continued. “And now the result of that trial could be a verdict with punitive damages.”

The judge cited Tesla’s 2016 video unveiling of its so-called Full Self-Driving driver assistance capability as part of his reasoning. “Absent from this video is any indication that the video is aspirational or that this technology doesn’t currently exist in the market,” he wrote.

“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” according to the judge’s ruling.

The plaintiff should be able to argue to a jury that Tesla did not provide sufficient warning that Autopilot and Full Self-Driving require the driver to remain attentive and ready to take over in an emergency. Even today, after dozens of related deaths, I still hear from Tesla drivers who trust FSD to drive them home when they are impaired (whether by fatigue or alcohol) or who simply engage in other activities behind the wheel.

According to TechCrunch:

The judge compared Banner’s crash to a similar 2016 fatal crash involving Joshua Brown in which Autopilot failed to detect crossing trucks, which led to the vehicle crashing into the side of a tractor trailer at high speed. The judge also based his finding on testimony given by Autopilot engineer Adam Gustafsson and Dr. Mary “Missy” Cummings, director of the Autonomy and Robotics Center at George Mason University.

Gustafsson, who was the investigator on both Banner’s and Brown’s crashes, testified that Autopilot in both cases failed to detect the semitrailer and stop the vehicle. The engineer further testified that despite Tesla being aware of the problem, no changes were made to the cross-traffic detection warning system from the date of Brown’s crash until Banner’s crash to account for cross traffic.

The judge wrote in his ruling that the testimony of other Tesla engineers leads to the reasonable conclusion that Musk, who was “intimately involved” in the development of Autopilot, was “acutely aware” of the problem and failed to remedy it.

The case — No. 50-2019-CA-009962 — will go to trial in the Circuit Court for Palm Beach County, Florida.
