- Tesla has been hit with a new federal investigation, this time into how its automated driving assistance tech works in low visibility.
- It's the only automaker to have abandoned radar entirely, shunning lidar as well, in favor of a camera-only approach.
- Cameras don't do a great job of depth perception, so keeping radar as an extra safety net may have been a good idea for now.
A fan of Tesla might think the automaker just can't catch a break when it comes to its autonomous driving tech. It's already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week we can add another to the list, this one covering around 2.4 million Tesla vehicles. This time, regulators are assessing the cars' performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.
The National Highway Traffic Safety Administration (NHTSA) says the new probe is looking at instances where FSD was engaged in fog, in airborne dust, or when glare from the sun blinded the car's cameras and contributed to a crash.
What the car can "see" is the big issue here. It's also what Tesla bet its future on.
Unlike the vast majority of its competitors, which are giving their cars with autonomous driving capabilities more ways to "see" their surroundings, Tesla removed ultrasonic and other sensors in favor of a camera-only approach in 2022.
This means there isn't any real redundancy in the system: if a Tesla with FSD enabled drives through dense fog, it may struggle to keep track of where the road is and stay on it. Vehicles that pair cameras with radar and lidar can make more sense of their environment even through dense fog, although those sensors are affected by the elements too. Inclement weather seems to sometimes make FSD go rogue.
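To make the redundancy argument concrete, here's a minimal sketch of a confidence-weighted fusion scheme. Everything in it (sensor names, numbers, fusion logic) is our own illustration, not anything from Tesla's or anyone else's software: when fog cripples the camera, a radar return can still anchor the distance estimate, while a camera-only stack is stuck with its low-confidence guess.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    distance_m: float   # estimated distance to the nearest obstacle
    confidence: float   # sensor's self-assessed confidence, 0 to 1

def fuse(camera: Reading, radar: Optional[Reading]) -> Reading:
    """Confidence-weighted fusion; without radar, the output is only
    ever as good as the camera."""
    if radar is None:
        return camera
    total = camera.confidence + radar.confidence
    distance = (camera.distance_m * camera.confidence
                + radar.distance_m * radar.confidence) / total
    return Reading(distance, max(camera.confidence, radar.confidence))

# Dense fog: the camera's confidence collapses, but radar sees through it.
foggy_camera = Reading(distance_m=80.0, confidence=0.1)  # mostly a guess
radar = Reading(distance_m=42.0, confidence=0.9)

print(fuse(foggy_camera, radar))  # dominated by radar: ~45.8 m at 0.9 confidence
print(fuse(foggy_camera, None))   # camera-only: the 0.1-confidence guess stands
```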
When you enable FSD in your Tesla, the car is programmed to follow traffic rules and obey all road signs, though it also knows when not to follow them in certain situations. It tracks its position via GPS and uses neural networks to make sense of where it is and what the vehicles around it are doing, relying on its camera-only sensor array to see in all directions. A separate neural network handles route planning, and still others handle different tasks; making it all work together in real time requires some serious processing power.
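As a rough illustration of that modular layout (and only that: Tesla's actual code is proprietary, and every class, method, and field name below is invented), a camera-only pipeline might be wired together like this:

```python
# Invented skeleton of a camera-only driving pipeline, loosely mirroring
# the description above. None of these names come from Tesla's software.

class PerceptionNet:
    """Stands in for the vision network: camera frames in, tracked scene out."""
    def detect(self, camera_frames):
        # A real network would return detected vehicles, lanes, and signs;
        # this stub just returns an empty scene.
        return {"vehicles": [], "lanes": [], "signs": []}

class PlanningNet:
    """Stands in for the separate route/trajectory-planning network."""
    def plan(self, scene, gps_position, route):
        # A real planner would weigh the scene against the route;
        # this stub just holds the wheel straight.
        return {"steering": 0.0, "accel": 0.0}

def drive_step(camera_frames, gps_position, route):
    perception, planner = PerceptionNet(), PlanningNet()
    scene = perception.detect(camera_frames)         # vision is the only input;
    return planner.plan(scene, gps_position, route)  # nothing cross-checks it

print(drive_step(camera_frames=[], gps_position=(37.39, -122.15), route=["exit 42"]))
```

The point of the structure is the single point of failure: every downstream decision inherits whatever the cameras did or didn't see.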
Tesla feeds its AI models with data from both autonomous driving and ordinary driver behavior. To process the vast amount of video its cars send back, it relies on a supercomputer of its own making called Dojo, which is also used to train the various machine learning models behind its autonomous driving; it's what makes the camera-only system work and improve over time.
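Schematically, and purely hypothetically, since the real pipeline isn't public and every function below is a toy stand-in, that fleet-learning loop looks something like this:

```python
# Toy stand-in for the fleet-learning loop described above. Everything
# here is invented and wildly simplified.

def auto_label(clip):
    """Pretend offline labeler: tags each frame with a driving action."""
    return ["keep_lane" if frame % 2 else "slow_down" for frame in clip]

def train_step(model, labels):
    """Toy 'training': tally labeled actions so the model's policy drifts
    toward what the fleet actually encountered on the road."""
    for label in labels:
        model[label] = model.get(label, 0) + 1
    return model

model = {}
fleet_clips = [list(range(10)) for _ in range(100)]  # 100 fake uploaded clips
for clip in fleet_clips:  # the loop Dojo-class hardware runs at vast scale
    model = train_step(model, auto_label(clip))
print(model)  # {'slow_down': 500, 'keep_lane': 500}
```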
This is all very cool stuff, in theory. But Tesla is actually behind on the project whose success could silence the critics: the Cybercab. Several companies are closer to launching fully autonomous, driverless taxis, and they'll probably beat Tesla to it.
Cybercab production isn't slated to start until 2026, and since even CEO Elon Musk will admit his timelines are "optimistic," it may be even later than that. Meanwhile, there are already companies in the U.S. operating small fleets of autonomous cabs without even a safety driver on board. However, Tesla could make up a lot of the difference by tapping into the autonomous driving tech it's been perfecting for years.
Moreover, Tesla is the only manufacturer whose vehicles don't even have ultrasonic sensors for parking. That's right: they use cameras for that too, and as I discovered while driving the refreshed Model 3, it's an inferior solution that sometimes leaves you wondering whether the car is seeing an actual obstacle or just a low curb or a change in the road surface.
Older Teslas had a combination of radar and cameras for Autopilot and other driver assistance systems. With the software versions released after Tesla went down the "Pure Vision" route, the company disabled those sensors even in older cars that had them from the factory. So even if you enable FSD in an older Tesla that has more than just cameras, only the cameras are used when the car drives itself.
The incident that prompted the new NHTSA investigation occurred in November 2023, when a 2021 Model Y with FSD engaged slammed into a Toyota SUV stopped on the side of the highway, pushing it into one of the people standing in front of it, who was killed.
We don't know whether the driver of the Tesla was watching the road at the time of the crash and simply didn't see the other car, or whether their eyes were elsewhere; people bypassing the safeguards meant to keep drivers' attention on the road while the car drives itself is the subject of yet another investigation.
NHTSA will now look at the system's ability to "detect and respond appropriately to reduced roadway visibility conditions." We're very curious about the results of this particular investigation, since they could reveal whether cameras alone are enough or whether the support of radar and lidar makes self-driving cars safer.
Musk has vehemently opposed the notion that relying only on cameras for autonomous driving is unsafe, but the rest of the automotive industry, which has all but unanimously embraced the marriage of cameras, radar, and sometimes lidar as the go-to solution for cars that drive themselves, suggests otherwise. The Tesla boss argues that if humans can navigate solely through a combination of vision and intelligence, cars should be able to do it too.
But cameras don't perceive depth the way the human eye does, so the redundancy of radar or lidar is an extra safety net you want in a driverless vehicle that takes you up to highway speeds and can potentially harm you or others; the quick calculation below shows why. Musk's argument for going camera-only has some merit, but it doesn't seem applicable yet, and the slew of investigations doesn't help. Autonomous driving tech still needs to evolve before cameras alone will be enough.
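To put the depth problem in rough numbers: with a stereo camera pair (already more geometric information than Tesla's mostly monocular, learned depth estimation), depth follows Z = f·B/d, and a small error in the measured disparity d blows up quadratically with distance. The figures below are our own illustrative assumptions, not specs from any automaker.

```python
# Back-of-the-envelope: why camera depth estimates degrade with distance.
# Assumed numbers (ours alone): stereo baseline B = 0.30 m, focal length
# f = 1000 px, and a disparity measurement error of +/- 0.5 px.
f_px, baseline_m, disparity_err_px = 1000.0, 0.30, 0.5

for depth_m in (10, 50, 100):
    disparity_px = f_px * baseline_m / depth_m  # d = f * B / Z
    # First-order error propagation: dZ ~ Z^2 / (f * B) * dd
    depth_err_m = depth_m ** 2 / (f_px * baseline_m) * disparity_err_px
    print(f"at {depth_m:>3} m: disparity {disparity_px:5.1f} px, "
          f"depth error ~ +/- {depth_err_m:4.1f} m")
```

Under these assumptions, the uncertainty goes from a fraction of a meter at 10 meters to well over 15 meters at 100 meters out, while radar and lidar measure range directly and their error barely grows with distance. Those other self-driving players with their spinning, sensor-laden cars can't all be wrong, right?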