The Street
Rob Lenihan

Tesla faces major roadblocks to its full self-driving plans

Elon Musk was looking for a fight...sort of...again.

On Aug. 26 Tesla's (TSLA) CEO was behind the wheel of his personal Model S during a demonstration of the company's Full Self-Driving Version 12. The demo was live-streamed on X, formerly Twitter, which Musk bought last year.

Leaving from Tesla's offices in Palo Alto, Calif., Musk jokingly suggested visiting Meta Platforms (META) CEO Mark Zuckerberg and challenging him to a fight. This, of course, harked back to Musk's earlier challenge to meet Zuckerberg in a cage match, an idea the Meta chief dismissed.

"We'll say 'hi'," Musk says. "We'll be friendly. ... It's a polite inquiry as to whether you would like to engage in hand-to hand combat, you know if it's not inconvenient perhaps you would like to."

But extracurriculars aside, during the 45-minute ride meant to show off technology the company expects will one day drive vehicles fully autonomously, Musk had to take control when the vehicle tried to go through a red light.

Although the fight with Zuckerberg was a joke, Tesla is facing serious legal battles as the electric-vehicle maker gears up, for the first time, to defend itself at trial against allegations that the failure of its Autopilot driver-assistance feature led to a death.

Tesla’s standard Autopilot and premium Full Self-Driving systems control braking, steering and acceleration in limited circumstances. The company makes clear that drivers using the systems must keep their hands on the wheel and be alert at all times.

Lawsuits challenge safety of self-driving systems

Tesla is facing two trials in quick succession, Reuters reported, with more to follow.

The first, scheduled for mid-September in a California state court, is a civil lawsuit charging that the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour, strike a palm tree and burst into flames.

Lee was killed in the 2019 crash, and the two passengers, identified in court records as Lindsay Molander and her 8-year-old son, Parker Austin, were seriously injured. The boy was reportedly disemboweled.

"The autopilot failed due to negligent design, manufacture, assembling, testing, and marketing," the complaint said. 

The suit said that Tesla should have known that the Autopilot system was “insufficiently designed to keep a vehicle from suddenly leaving a lane of travel,” lacked adequate warnings and instructions, and was not properly tested.

The complaint also names Lee's estate as a defendant, alleging he was under the influence of alcohol at the time of the accident.

The second trial, set for early October in a Florida state court, involves a 2019 crash north of Miami. Jeremy Banner’s Model 3 drove under the trailer of an 18-wheel big rig that had pulled into the road, shearing off the Tesla's roof and killing Banner.

Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit, filed by Banner's wife.

“This was completely senseless," Trey Lytal, attorney for Banner's widow, said in 2019, according to CBS12 News. "There’s no question at all that it was defective. It did not work properly. In fact, it didn’t work at all."

Banner’s attorneys argued in a pretrial court filing that internal emails show Musk is the Autopilot team's "de facto leader."

Tesla has denied liability for both accidents, attributing them to driver error, and said Autopilot is safe when monitored by humans. Tesla said in court documents that drivers must pay attention to the road and keep their hands on the steering wheel.

Aiming for self-driving ability; missing targets

"There are no self-driving cars on the road today," the company said.

Musk has for years promised that Tesla would achieve self-driving capability but has missed his targets.

Last month, Musk told investors that “I know I'm the boy who cried FSD, but, man, I think we'll be better than human by the end of this year.”

Tesla’s website states that “the currently enabled Autopilot, Enhanced Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous.”

Tesla won a case in Los Angeles in April by arguing that it tells drivers that its technology requires human monitoring, despite the "Autopilot" and "Full Self-Driving" names. 

The case involved an accident in which a Model S swerved into the curb and injured its driver. Jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and driver distraction was to blame. 

The National Highway Traffic Safety Administration will resolve its two-year investigation into Tesla Autopilot and could make a public announcement soon, the agency's acting head told Reuters on Aug. 24.

"We'll get to a resolution [of the Tesla probe]," Acting NHTSA Administrator Ann Carlson said.

She declined to discuss how the Tesla investigation might be resolved, but added, "Hopefully you'll hear something relatively soon."
