tesla autopilot

tesla-loses-autopilot-wrongful-death-case-in-$329-million-verdict

Tesla loses Autopilot wrongful death case in $329 million verdict

Tesla was found partially liable in a wrongful death lawsuit in a federal court in Miami today. It’s the first time that a jury has found against the car company in a wrongful death case involving its Autopilot driver assistance system—previous cases have been dismissed or settled.

In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran a stop sign and drove through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed, and her partner Dillon Angulo was left with a severe head injury.

While Tesla said that McGee, as the driver of the car, was solely responsible, McGee told the court that he thought Autopilot “would assist me should I have a failure or should I miss something, should I make a mistake,” a perception that Tesla and its CEO Elon Musk have done much to foster with highly misleading statistics that paint the brand as much safer than it really is.

The jury heard from expert witnesses about Tesla’s approach to human-machine interfaces and driver monitoring, as well as its use of statistics. It considered its verdict on Thursday afternoon and Friday before deciding that, while McGee was two-thirds responsible for the crash, Tesla bore the remaining third of the responsibility for selling a vehicle “with a defect that was a legal cause of damage” to Benavides’ relatives and Angulo. The jury awarded the plaintiffs $129 million in compensatory damages and a further $200 million in punitive damages.
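For readers keeping track of the numbers, here is a minimal sketch of how the figures above fit together. The award amounts and the fault split come from the verdict as reported; apportioning the compensatory award by those fault percentages is purely an illustrative assumption, not something spelled out here.

```python
# Rough arithmetic behind the headline figures, as reported above.
# Splitting the compensatory award by the jury's fault percentages is an
# assumption made purely for illustration.

compensatory = 129_000_000  # awarded to Benavides' relatives and Angulo
punitive = 200_000_000      # punitive damages

tesla_fault = 1 / 3   # jury: Tesla bore a third of the responsibility
mcgee_fault = 2 / 3   # jury: McGee was two-thirds responsible

print(f"Headline verdict: ${compensatory + punitive:,}")  # $329,000,000
print(f"Tesla's share of compensatory (assumed split): ${compensatory * tesla_fault:,.0f}")
```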

Tesla loses Autopilot wrongful death case in $329 million verdict Read More »

experts-lay-into-tesla-safety-in-federal-autopilot-trial

Experts lay into Tesla safety in federal autopilot trial

For example, Cummings told the court that Tesla “clearly recognized that mode confusion is an issue—this is where people, for example, think the car is in Autopilot and don’t understand that the Autopilot has disengaged.”

Cummings also referred to the deposition of Tesla Autopilot firmware engineer Ajshay Phatak. According to that deposition, the company did not keep good track of Autopilot crashes prior to 2018, and Cummings pointed out that “it was clear they knew that they had a big problem with people ignoring the warnings. Ignoring the hands-on requests. And…as you know, prior to this accident. It was known to Tesla that they were having problems with people ignoring their warnings.”

Tesla’s abuse of statistics to make misleading claims about safety is nothing new: in 2017, Ars found that Tesla’s claims about Autopilot reducing crashes were not at all backed by the data, which in fact showed the driver assist actually increased crash rates.

Mendel Singer, a statistician at the Case Western Reserve University School of Medicine, was very unimpressed with Tesla’s approach to crash data statistics in his testimony. Singer noted that he was “not aware of any published study, any reports that are done independently… where [Tesla] actually had raw data and could validate it to see does it tend to make sense” and that the car company was not comparing like with like.

“Non-Tesla crashes are counted based on police reports, regardless of safety system deployment,” Singer said. Further, Tesla kept misleading claims about safety on its website for years, Singer pointed out. When asked whether he would have accepted a paper from Tesla about its safety reports for peer review, Singer said “that would have been a really quick and easy rejection.”

While it’s possible that Tesla will still settle this case, we may also see the trial carried out to its conclusion.

“The plaintiffs in this instance have already received compensation from the driver of the Tesla in question, apparently in a decent amount. My understanding is that this makes them much less likely to take the kinds of offers Tesla has been making for settlements, and this is more about the justice,” said Edward Niedermeyer, author and long-time Tesla-watcher.

“That said, the judge in the case has made some frustrating rulings around confidentiality on key issues, so it’s possible that may be in Tesla’s favor. They could also just up their settlement offer enough to be impossible to refuse,” Niedermeyer said.

Experts lay into Tesla safety in federal autopilot trial Read More »

elon-musk-makes-bold-claims-about-tesla-robotaxi-in-hollywood-backlot

Elon Musk makes bold claims about Tesla robotaxi in Hollywood backlot

“It’s going to be a glorious future,” Musk said, albeit not one that applies to families or groups of three or more.

Musk claims that Tesla “expects to start” fully unsupervised FSD next year on public roads in California and Texas. A recent analysis by an independent testing firm found the current build requires human intervention about once every 13 miles, often on roads it has used before.

A rendering of the two-seat interior of the Tesla Cybercab

Only being able to carry two occupants is pretty inefficient when a city bus can carry more than 80 passengers. Credit: Tesla

“Before 2027” should see the Cybercab, which Musk claims will be built in “very high volume.” Tesla-watchers will no doubt remember similar claims about the Model X, Model 3, Model Y, and most recently the Cybertruck, all of which faced lengthy delays as the car maker struggled to build them at scale. Later, Musk treated the audience to a video of an articulated robotic arm with a vacuum cleaner attachment cleaning the two-seat interior of the Cybercab. Whether this will be sold as an aftermarket accessory to Cybercab owners, or if they’re supposed to clean out their robotaxis by hand between trips, remains unclear at this time.

Musk also debuted another autonomous concept, the Robovan. It’s a small bus with no visible wheels but a brightly lit interior with room for up to 20 occupants. Musk said little about the Robovan or how it figures into Tesla’s future. In 2017 he revealed his dislike for public transport, saying “it’s a pain in the ass” and that other passengers could be serial killers.

After promising that “unsupervised FSD” is coming to all five of Tesla’s models—”now’s not the time for nuance,” Musk told a fan—he showed off a driverless minibus and then a horde of humanoid robots, which apparently leverage the same technology that Tesla says will be ready for autonomous driving with no supervision. These robots—”your own personal R2-D2,” he said—will apparently cost less than $30,000 “long-term,” Musk claimed, adding that the robots would be the biggest product of all time, as all 8 billion people on Earth would want one, then two, he predicted.

Elon Musk makes bold claims about Tesla robotaxi in Hollywood backlot Read More »

tesla’s-2-million-car-autopilot-recall-is-now-under-federal-scrutiny

Tesla’s 2 million car Autopilot recall is now under federal scrutiny

maybe ban it instead —

NHTSA has tested the updated system and still has questions.

A 2014 Tesla Model S driving on Autopilot rear-ended a Culver City fire truck that was parked in the high-occupancy vehicle lane on Interstate 405.

Tesla’s lousy week continues. On Tuesday, the electric car maker posted its quarterly results showing precipitous falls in sales and profitability. Today, we’ve learned that the National Highway Traffic Safety Administration is concerned that Tesla’s massive recall to fix its Autopilot driver assist—which was pushed out to more than 2 million cars last December—has not actually made the system that much safer.

NHTSA’s Office of Defects Investigation has been scrutinizing Tesla Autopilot since August 2021, when it opened a preliminary investigation in response to a spate of Teslas crashing into parked emergency responder vehicles while operating under Autopilot.

In June 2022, the ODI upgraded that investigation into an engineering analysis, and in December 2023, Tesla was forced to recall more than 2 million cars after the analysis found that the car company had inadequate driver-monitoring systems and had designed a system with the potential for “foreseeable misuse.”

NHTSA has now closed that engineering analysis, which examined 956 crashes. After excluding crashes where the other car was at fault, where Autopilot wasn’t operating, or where there was insufficient data to make a determination, it found 467 Autopilot crashes that fell into three distinct categories.

First, 211 were frontal crashes in which the Tesla hit a car or obstacle despite “adequate time for an attentive driver to respond to avoid or mitigate the crash.” Another 111 Autopilot crashes occurred when the system was inadvertently disengaged by the driver, and the remaining 145 Autopilot crashes happened under low-grip conditions, such as on a wet road.
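As a quick sanity check, the three categories above account for the full set of 467 Autopilot crashes NHTSA analyzed:

```python
# Consistency check on NHTSA's breakdown of the 467 Autopilot crashes,
# using the figures as summarized above.
frontal = 211                # hit a car or obstacle despite time to react
inadvertent_disengage = 111  # driver unintentionally switched Autopilot off
low_grip = 145               # wet roads and other low-traction conditions

assert frontal + inadvertent_disengage + low_grip == 467
```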

As Ars has noted time and again, Tesla’s Autopilot system has a more permissive operational design domain than any comparable driver-assistance system that still requires the driver to keep their hands on the wheel and their eyes on the road, and NHTSA’s report adds that “Autopilot invited greater driver confidence via its higher control authority and ease of engagement.”

The result has been disengaged drivers who crash, and those crashes “are often severe because neither the system nor the driver reacts appropriately, resulting in high-speed differential and high energy crash outcomes,” NHTSA says. Tragically, at least 13 people have been killed as a result.

NHTSA also found that Tesla’s telematics system has plenty of gaps in it, despite the closely held belief among many fans of the brand that the Autopilot system is constantly recording and uploading to Tesla’s servers to improve itself. Instead, it only records an accident if the airbags deploy, which NHTSA data shows only happens in 18 percent of police-reported crashes.
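That 18 percent figure is what makes comparisons against police-report-based crash rates so misleading. Here is a minimal sketch of the effect, using an entirely made-up underlying crash rate purely for illustration:

```python
# A minimal sketch of the counting mismatch described above: if crashes are only
# logged when the airbags deploy, which NHTSA says happens in about 18 percent of
# police-reported crashes, then a telemetry-derived crash rate will understate the
# true rate. The underlying rate below is hypothetical.

AIRBAG_DEPLOYMENT_SHARE = 0.18  # NHTSA: share of police-reported crashes with airbag deployment

true_rate = 2.0                                         # hypothetical crashes per million miles
telemetry_rate = true_rate * AIRBAG_DEPLOYMENT_SHARE    # what airbag-only logging would capture

print(f"True rate:            {true_rate:.2f} per million miles")
print(f"Telemetry-based rate: {telemetry_rate:.2f} per million miles")
print(f"Understated by roughly {1 / AIRBAG_DEPLOYMENT_SHARE:.1f}x")
```

In other words, airbag-only logging alone could make a fleet look several times safer than it actually is, before any other methodological issues are even considered.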

The agency also criticized Tesla’s marketing. “Notably, the term ‘Autopilot’ does not imply an L2 assistance feature but rather elicits the idea of drivers not being in control. This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation,” it says.

But now, NHTSA’s ODI has opened a recall query to assess whether the December fix actually made the system any safer. From the sounds of it, the agency is not convinced it did, based on additional Autopilot crashes that have occurred since the recall and on its own testing of the updated system.

Worryingly, the agency writes that “Tesla has stated that a portion of the remedy both requires the owner to opt in and allows a driver to readily reverse it” and wants to know why subsequent updates have addressed problems that should have been fixed with the December recall.

Tesla’s 2 million car Autopilot recall is now under federal scrutiny Read More »