California Jury Exonerates Tesla In Autopilot Crash

On July 6, 2019, Justine Hsu of Los Angeles was driving a 2016 Model S on city streets using Autopilot. She alleged in a lawsuit brought against Tesla in 2020 that the car suddenly swerved to the right and hit a curb, which caused the driver’s airbag to deploy “so violently it fractured Plaintiff’s jaw, knocked out teeth, and caused nerve damage to her face.”

In her complaint against Tesla, Hsu claimed that the company was on notice prior to the accident of certain defects in its Autopilot and frontal airbag systems on her 2016 Model S. The complaint asked for sanctions against Tesla, saying the company was aware on the date of the accident of similar failures of the Autopilot system on city streets.

Hsu’s lawsuit said she had undergone three surgeries and continued to require medical treatment. “Because of the Autopilot failure, and the improper deployment of the airbags, Plaintiff Hsu suffered severe injuries, resulting in a broken jaw, broken teeth, and multiple injuries to her face, hands, and legs,” the complaint stated. The suit asked for more than $3 million in damages. In response to Hsu’s complaint, Tesla argued “that Hsu used Autopilot on city streets, despite Tesla’s user manual warning against doing so,” according to ArsTechnica.

The Jury Speaks

The case finally came to trial this spring. During three weeks of testimony, the jury heard from three Tesla engineers who testified about how the Autopilot system works and the safeguards built into the system to remind drivers to always be attentive to the road ahead and ready to assume control of the vehicle at any time. Hsu testified that the system only prompted her to take control of the car about one second before the crash occurred.

On Friday, April 21, the jury rendered a verdict in favor of Tesla. After the trial ended, several jurors spoke with Reuters about their decision. They said Tesla clearly warned drivers that the partially automated driving software was not a self-piloted system and that driver distraction was to blame for the accident. In addition to rejecting Hsu’s Autopilot claim, the jury “found that the airbag did not fail to perform safely, and that Tesla did not intentionally fail to disclose facts to her,” Reuters reported. The jury awarded Hsu nothing in damages.

Juror Mitchell Vasseur, 63, told Reuters that he and his fellow jurors felt bad for Hsu, but ultimately determined that Autopilot was not at fault. “Autopilot never confessed to be self pilot. It’s not a self-driving car,” Vasseur said. “It’s an auto assist and they were adamant about a driver needing to always be aware.” Vasseur said the accident would not have happened if Hsu had been more attentive, though hers was a mistake anyone could make. “I personally would never use Autopilot,” he said. “I don’t even use cruise control.”

Jury foreperson Olivia Apsher, 31, said the Autopilot system reminds drivers when they are not adequately taking control. “It’s your vehicle,” she said. “There are audible warnings and visual warnings both for the driver, indicating that it is your responsibility.” She said she would love to have Autopilot features in her own car but added, “The technology is something that’s assisting you and we want that message to be clear. Drivers should understand that before they sit behind the wheel and take control of the vehicle using those features.”

The Effect On Other Tesla Autopilot Suits

(Photo caption: A Tesla rear-ended a fire truck parked on the 405 Freeway in Culver City while it was responding to an accident on Jan. 22, 2018. Tesla’s Autopilot was later found partly to blame for the crash. Image credit: Culver City Firefighters Local 1927)

The debate continues over whether the name “Autopilot” implies the system is capable of more than what the company says it can do in the owner’s manual. Several other pending Autopilot-related lawsuits make claims similar to those in Hsu’s complaint. People also confuse the Autopilot system with Tesla’s Full Self Driving suite of automated driving features. It may be that some drivers have the intellectual capacity of peat moss, but Tesla over the years has made any number of confusing and sometimes contradictory claims about what the two systems can and cannot do.

Reuters says this is an important victory for Tesla because it is the first Autopilot lawsuit to go to trial in the US and could influence juries in other pending cases. One thing we know for certain is that a decision by a state court in Southern California is not binding on other courts, even other courts in California, and certainly not on courts in other states or in federal court. Nonetheless, Tesla must be breathing a sigh of relief that it survived this first round in what promises to be a protracted battle.

ArsTechnica points out that Tesla is the defendant in a case brought by five Texas police officers who were injured in February 2021 when a Tesla Model X operating in Autopilot mode crashed into several police vehicles that were stopped on a highway with their emergency lights activated. Teslas do have a rather distressing habit of running into emergency vehicles, a factor in the decision by the National Highway Traffic Safety Administration (NHTSA) to open an investigation into crashes involving Tesla cars using Autopilot.

Tesla may have won this round, but Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University, told Business Today that the company’s technology is “far from becoming fully autonomous” despite Musk’s repeated promises over the years. “When fatalities are involved, and they are on highways, jury perspectives can be different. While Tesla won this battle, they may end up losing the war,” he said.

The Takeaway

It seems pretty clear that one should not be using Autopilot in city driving. It is less capable of controlling a car in an urban environment than Tesla’s vaunted Full Self Driving suite, and the company says explicitly in the owner’s manual not to do it. And yet the system can be engaged in city driving. If it’s so smart, shouldn’t it know it is in an inappropriate environment and refuse to activate itself?

Around the CleanTechnica writer’s lounge, we have a few questions but no answers. First, should an airbag break bones and loosen teeth when it deploys? According to Forbes, although injuries from malfunctioning airbags tend to be the most serious, airbags can cause damage even when they work properly. The 10 most common airbag injuries include:

  • Facial injuries including bruising and fractures to the small bones of the face due to the impact of the airbag
  • Chest injuries including heart injuries due to the impact of the airbag against the chest
  • Burns on the chest, hands, arms, or face when the fabric of the airbag moves along the skin
  • Fractures including to the skull, ribs, and wrist bones
  • Traumatic brain injuries
  • Eye injuries from chemical irritation or from the pressure of the airbag
  • Ear trauma, including hearing loss
  • Internal bleeding if organs are damaged by airbag deployment
  • Asthma attacks and other respiratory issues due to the chemicals involved in deployment
  • Injury to the fetus in pregnant women

The second question we have is, why was a person who admits he never uses cruise control allowed to serve on a jury asked to decide whether an electronic system he clearly doesn’t understand functioned properly? Somewhere in the jury selection process, someone missed that bit of information.

Justine Hsu continues to receive medical treatment and was devastated by the jury verdict. It is hard not to have compassion for someone who has suffered so much physical pain and suffering. Perhaps other plaintiffs in other lawsuits will get different verdicts, but that won’t help Ms. Hsu.

The lesson for all of us is that automobiles are dangerous instrumentalities that can cause serious harm or death. In the fullness of time, the law will decide when computers and their manufacturers will be held liable for the harm they cause. Car companies are making ever bolder claims about what their automated driving systems can do. Someday, perhaps soon, they will be held to account for those claims when those systems fail to provide the level of protection promised. Whether that will be a good thing or not will depend on your perspective.



Steve Hanley

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. He is proud to be "woke" and doesn't really give a damn why the glass broke. He believes passionately in what Socrates said nearly 2,500 years ago: "The secret to change is to focus all of your energy not on fighting the old but on building the new." You can follow him on Substack and LinkedIn but not on Fakebook or any social media platforms controlled by narcissistic yahoos.
