- Tesla's Full Self-Driving (Supervised) advanced driver assistance system was tested over more than 1,000 miles by AMCI, an independent automotive research firm.
- During testing, drivers had to intervene more than 75 times.
- FSD (Supervised) can handle the same scenario flawlessly dozens of times before glitching unexpectedly and requiring driver intervention.
Tesla and its outspoken CEO have long promised self-driving cars, but we're still not there yet. Despite being called Autopilot and Full Self-Driving (Supervised), the company's two advanced driver assistance systems (ADAS) are still not classified as Level 3 systems on SAE's levels of driving automation chart, meaning the driver must remain attentive and ready to take over at any time.
While the so-called FSD can run flawlessly in the majority of situations, as attested by numerous testing videos, it sometimes misses the mark, and it's these occasional hiccups that can become dangerous.
That's what AMCI Testing, an independent research firm, concluded after putting Tesla's FSD through over 1,000 miles of city streets, rural two-lane roads, mountain roads and highways. The company used a 2024 Tesla Model 3 Performance fitted with the automaker's latest hardware and running the latest software versions, 12.5.1 and 12.5.3.
During testing, AMCI drivers had to intervene more than 75 times while FSD was active, an average of one intervention every 13 miles. In one instance, the Tesla Model 3 ran a red light in the city at night even though its cameras clearly detected the lights. In another, with FSD (Supervised) enabled on a twisty rural road, the car crossed a double yellow line into oncoming traffic, forcing the driver to take over. Another notable mishap happened in the city, when the EV stopped even though the traffic light was green and the cars ahead were accelerating.
Here’s how Guy Mangiamele, Director of AMCI Testing, put it: "What's most disconcerting and unpredictable is that you may watch FSD successfully negotiate a specific scenario many times–often on the same stretch of road or intersection–only to have it inexplicably fail the next time."
AMCI released a series of short videos, which you can watch embedded below (just try to ignore the background music). The clips show where FSD (Supervised) performed very well, like pulling to the side of a narrow road to let oncoming cars pass, and where it failed.
"With all hands-free augmented driving systems, and even more so with driverless autonomous vehicles, there is a compact of trust between the technology and the public,” said David Stokols, CEO of AMCI Testing's parent company, AMCI Global. “Getting close to foolproof, yet falling short, creates an insidious and unsafe operator complacency issue as proven in the test results," Stokols added.
AMCI's results come as Tesla prepares to unveil its Robotaxi on October 10. On several occasions, CEO Elon Musk has suggested that the company's cab will be able to drive autonomously anywhere because it doesn't rely on pre-mapped data to make decisions, instead using a camera-based system that intelligently assesses situations and decides on the fly.
However, Bloomberg and famed Tesla hacker Green The Only recently reported that Tesla is actively collecting data in the Los Angeles area, where the Robotaxi event is scheduled to take place. Keen-eyed Redditors also spotted several test vehicles on the same roads where a bright yellow mule resembling a two-door Cybercab was photographed.