A malicious radar-spoofing hack can trick self-driving cars into "hallucinating" phantom vehicles and veering dangerously off course to avoid them, researchers have discovered.
The new hack, dubbed "MadRadar," can also hide real cars from on-vehicle radar sensors and fool a self-driving car into thinking a real car has suddenly swerved off course. The scientists reported their findings in a peer-reviewed paper, which will be presented Feb. 26 at the Network and Distributed System Security (NDSS) Symposium 2024 in San Diego.
"Without knowing much about the targeted car's radar system, we can make a fake vehicle appear out of nowhere or make an actual vehicle disappear in real-world experiments," lead author Miroslav Pajic, a professor of electrical and computer engineering at Duke University in North Carolina, said in a statement. "We're not building these systems to hurt anyone, we're demonstrating the existing problems with current radar systems to show that we need to fundamentally change how we design them."
Self-driving cars will increasingly take to U.S. roads over the next few years. Mercedes-Benz became the first automaker in the U.S. to receive approval for Level 3 self-driving cars in January 2023, meaning the vehicles can perform all driving tasks under certain conditions. Nevada state regulators granted the approval for use on public freeways. Many electric vehicles, including Teslas, are fitted with automation or autopilot systems.
Different cars use different systems by design, so it's unlikely that any two vehicles will use the same operating parameters, even if they are the same make and model, the scientists said in the statement. They may, for example, use different operating frequencies or take measurements at marginally different intervals — measures that are built in to protect against radar-spoofing attacks.
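As a rough illustration of that design principle (this sketch is ours, not the Duke team's, and every number in it is invented), per-vehicle parameter diversity might look something like this:

```python
import random

# Illustrative toy model of per-vehicle parameter diversity, not any
# manufacturer's actual scheme. Each radar draws its own operating
# point, so two cars almost never share one.
def sample_radar_params(rng=random):
    return {
        "start_freq_ghz": rng.uniform(76.0, 77.0),       # 77-GHz automotive band
        "chirp_slope_mhz_per_us": rng.uniform(5.0, 30.0),
        "frame_interval_ms": rng.uniform(45.0, 55.0),
    }

car_a, car_b = sample_radar_params(), sample_radar_params()
print(car_a)
print(car_b)  # almost surely differs in every parameter
```

An attacker who blindly transmits fake echoes at the wrong frequency or timing would simply be ignored by the victim's receiver, which is the protection MadRadar is designed to defeat.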
MadRadar, however, can accurately detect a car's radar parameters from a remote position in less than a quarter of a second and then transmit its own radar signals to fool the target's radar. The scientists did not reveal the attack's specific mechanisms ahead of the paper's presentation at NDSS.
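The researchers haven't published their estimation technique, but one classic way to recover a chirp's slope from captured samples, shown below purely as an assumed illustration, is to fit a line to the signal's instantaneous frequency:

```python
import numpy as np

# Sketch of how an eavesdropper might infer a victim radar's chirp slope
# from captured baseband samples. The signal model and every number here
# are our own assumptions for illustration; the paper's method is not public.
fs = 1e9                           # attacker's receiver sample rate (1 GS/s)
t = np.arange(32768) / fs          # ~33 microseconds: one captured chirp
true_slope = 12e12                 # Hz/s (12 MHz/us), unknown to the attacker

chirp = np.exp(1j * np.pi * true_slope * t**2)              # linear FM chirp
chirp += 0.02 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))

# For a linear chirp, the instantaneous frequency (the derivative of the
# phase) is a straight line whose gradient is the chirp slope.
inst_freq = np.gradient(np.unwrap(np.angle(chirp)), t) / (2 * np.pi)
est_slope = np.polyfit(t, inst_freq, 1)[0]

print(f"true slope {true_slope:.2e} Hz/s, estimated {est_slope:.2e} Hz/s")
```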
They did, however, demonstrate three attack types on real-world radar systems in moving cars. In one attack, MadRadar sent signals to the target car to fool it into thinking another vehicle was in its way, tailoring its transmissions so they mimicked what a genuine radar contact would look like to the target's sensor.
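In a typical automotive radar, each reflector shows up as a tone whose frequency encodes its range, which is why a well-timed replay can plant a phantom. The following toy simulation (our own sketch with invented parameters, not the paper's code) shows a fake "car" appearing at a chosen range in a dechirping FMCW receiver:

```python
import numpy as np

# Toy model of a false-positive attack: in a dechirping FMCW radar, a
# beat tone's frequency maps to range, so replaying the victim's chirp
# with a chosen delay plants a "car" at any range. Parameters invented.
c = 3e8                      # speed of light, m/s
fs, S = 10e6, 10e12          # victim's beat-signal ADC rate; chirp slope, Hz/s
t = np.arange(4096) / fs

def echo(range_m, amp=1.0):
    """Beat tone the victim's receiver sees for a reflector at range_m."""
    tau = 2 * range_m / c                       # round-trip delay
    return amp * np.exp(2j * np.pi * S * tau * t)

# One real car at 60 m, plus an attacker-injected phantom at 15 m,
# squarely in the lane ahead. The attacker can afford to be louder.
beat = echo(60.0) + echo(15.0, amp=2.0)

spectrum = np.abs(np.fft.fft(beat))
for _ in range(2):                              # report the two biggest peaks
    k = int(np.argmax(spectrum))
    print(f"detected target at ~{k * fs / t.size * c / (2 * S):.1f} m")
    spectrum[max(0, k - 8):k + 9] = 0           # blank this peak's leakage
```

The receiver has no way to tell which of the two tones came from metal and which came from an attacker's antenna.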
A second attack fooled a target's radar into thinking there was no passing car, when in reality there was one. MadRadar did this by adding masking signals around the passing car's location to create a "bright spot" and confuse the radar system.
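The "bright spot" idea can likewise be sketched in simulation: flooding the range bins around the real echo with stronger fakes leaves the radar unable to pick out the true return. Again, the parameters below are our own invented illustration:

```python
import numpy as np

# Toy model of the false-negative ("bright spot") attack: flood the range
# bins around a real car with strong, jittered fake returns so its echo
# no longer stands out. Same invented radar parameters as the sketch above.
c, fs, S = 3e8, 10e6, 10e12
t = np.arange(4096) / fs
rng = np.random.default_rng(0)

def echo(range_m, amp=1.0):
    tau = 2 * range_m / c
    return amp * np.exp(2j * np.pi * S * tau * t)

real_car = echo(60.0)
# Forty strong fakes scattered within +/-10 m of the real car's position
mask = sum(echo(60.0 + rng.uniform(-10.0, 10.0), amp=3.0) for _ in range(40))

for label, beat in [("clean", real_car), ("masked", real_car + mask)]:
    spectrum = np.abs(np.fft.fft(beat))
    k = int(np.argmax(spectrum))
    print(f"{label}: strongest return at ~{k * fs / t.size * c / (2 * S):.1f} m")
```

With the mask in place, the true echo is buried inside the blob of fakes, and a tracker has nothing distinctive left to lock onto.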
In the third attack, the researchers combined these two attacks and fooled a car into thinking a real car had suddenly changed course. "Imagine adaptive cruise control, which uses radar, believing that the car in front of me was speeding up, causing your own car to speed up, when in reality it wasn't changing speed at all," Pajic said. "If this were done at night, by the time your car's cameras figured it out you'd be in trouble."
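To put rough numbers on Pajic's adaptive-cruise example (again, our sketch with invented figures, not the paper's), a spoofer that has already masked the real return only needs to re-program the phantom's delay each frame so the "lead car" appears to pull away:

```python
# Back-of-the-envelope sketch: each radar frame, the spoofer masks the
# real return and plants a phantom whose delay grows over time, so the
# lead car seems to accelerate. All numbers are invented for illustration.
c = 3e8
fake_accel = 3.0          # phantom appears to speed up at 3 m/s^2
real_range = 30.0         # the lead car actually holds steady at 30 m

for t_s in (0.0, 0.5, 1.0, 1.5, 2.0):
    fake_range = real_range + 0.5 * fake_accel * t_s**2
    tau = 2 * fake_range / c      # round-trip delay the spoofer must program
    print(f"t={t_s:.1f} s: phantom at {fake_range:.2f} m (delay {tau*1e9:.2f} ns)")
```

Nanosecond-scale delay changes are all it takes to move the phantom several meters, which is why the victim's cruise control would confidently speed up to close the apparent gap.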
The results, the scientists said, show that car manufacturers need to rethink how they implement anti-radar-spoofing protections in their vehicles. Manufacturers should take steps to better safeguard vehicles and their occupants, they added, though the team didn't specify how.