Researchers have discovered that malicious hardware can trick self-driving cars into “seeing” phantom vehicles and veering dangerously off course to get out of their way.
The new trick, dubbed “MadRadar,” can also hide real cars from the vehicle’s radar sensors and trick a self-driving car into believing that a real car has veered off course. The scientists reported their findings in a peer-reviewed paper, which will be presented Feb. 26 at the 2024 Network and Distributed Systems Security Symposium (NDSS) in San Diego.
“Without knowing much about the target car’s radar system, we can make a fake vehicle appear out of nowhere or make a real vehicle disappear in real-world experiments,” lead author Miroslav Pajic, a professor of electrical and computer engineering at Duke University in North Carolina, said in a statement. “We are not building these systems to harm anyone; we are demonstrating existing problems with current radar systems to show that we need to fundamentally change the way we design them.”
Self-driving cars will increasingly travel on American roads in the coming years. Mercedes-Benz became the first automaker to receive US approval for Level 3 autonomous driving in January 2023, meaning its vehicles can handle all driving tasks under certain conditions; Nevada state regulators granted approval for their use on public highways. Many electric vehicles, including Tesla’s, are equipped with driver-assistance or autopilot systems.
Different cars use different systems by design, so it is unlikely that two vehicles will use the same operating parameters even if they are the same make and model, the scientists said in the statement. They may, for example, use different operating frequencies or take measurements at marginally different intervals, measures that are built in to protect against radar spoofing attacks.
MadRadar, however, can accurately detect a car’s radar parameters in less than a quarter of a second from a remote position and then send its own radar signals to fool the target’s radar. The scientists did not reveal the specific mechanisms of the attack ahead of their paper’s publication at NDSS.
However, they demonstrated three types of attacks on real-world radar systems mounted on moving cars. In the first attack, MadRadar sent signals to the target car to trick it into thinking there was another vehicle in its path. It did this by shaping the signal to mimic what a genuine radar contact would look like.
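The paper withholds MadRadar’s specifics, but the physics a replay-style spoof exploits is textbook radar: a radar infers an object’s range from an echo’s round-trip time (range = c·t/2), so an attacker who retransmits the victim’s own waveform with a chosen delay can plant an echo at an arbitrary distance. The sketch below is a generic illustration of that geometry, not the researchers’ method; all names and numbers are our own. Note that conjuring a phantom closer than the attacker’s true position requires a negative delay, i.e. predicting the victim’s signal timing, which is one reason estimating the target radar’s parameters matters.

```python
# Generic illustration (not from the MadRadar paper): how a delayed
# replay of a radar's waveform maps to a phantom range reading.
C = 299_792_458.0  # speed of light, m/s

def echo_range_m(round_trip_s: float) -> float:
    """Range a radar infers from an echo's round-trip delay."""
    return C * round_trip_s / 2

def spoof_delay_s(fake_range_m: float, attacker_range_m: float) -> float:
    """Extra delay an attacker must add before retransmitting so the
    victim radar sees an echo at fake_range_m.

    The victim's signal takes attacker_range_m / C to reach the
    attacker and the same to return, so the perceived round trip is
    2 * attacker_range_m / C plus any added delay d:
        2 * fake_range_m / C = 2 * attacker_range_m / C + d
    """
    return 2 * (fake_range_m - attacker_range_m) / C

# An attacker 50 m away conjuring a phantom car only 20 m ahead would
# need a negative delay, i.e. must anticipate the victim's transmission.
d = spoof_delay_s(fake_range_m=20.0, attacker_range_m=50.0)
```

The negative result in the last line is the crux: spoofing nearby objects is only possible if the attacker can predict when the victim radar will transmit, hence MadRadar’s fast parameter estimation.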
A second attack tricked the target’s radar into thinking there were no cars passing by, when in fact there were. MadRadar did this by adding masking signals around the location of the passing car to create a “bright spot” and confuse the radar system.
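Why extra energy around a real echo can make it vanish comes down to how radars decide what counts as a detection. A common scheme, cell-averaging CFAR (constant false alarm rate), flags a range bin only if its power well exceeds the average of its neighbors; flooding those neighboring bins raises the adaptive threshold until the real car no longer stands out. This is a standard radar-processing concept offered as an illustration, not the paper’s disclosed mechanism:

```python
# Generic illustration (not from the MadRadar paper): a cell-averaging
# CFAR detector, and how energy injected into nearby range bins can
# raise the threshold enough to hide a genuine echo.

def ca_cfar_detect(power, idx, guard=1, train=3, k=4.0):
    """True if power[idx] exceeds k times the mean of the training
    cells (skipping `guard` cells on each side of the cell under test)."""
    cells = []
    for off in range(guard + 1, guard + train + 1):
        for i in (idx - off, idx + off):
            if 0 <= i < len(power):
                cells.append(power[i])
    return power[idx] > k * sum(cells) / len(cells)

# A real car's echo in bin 8 stands well above the noise floor...
quiet = [1.0] * 16
quiet[8] = 50.0

# ...but an attacker dumping energy into the surrounding bins drags
# the adaptive threshold up past the real peak.
masked = list(quiet)
for i in (5, 6, 10, 11):
    masked[i] = 40.0
```

With the illustrative numbers above, the echo is detected in the quiet scene but falls below the inflated threshold in the masked one.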
In the third attack, the researchers combined the two techniques to trick a car into thinking that a real car had suddenly changed course. “Imagine adaptive cruise control, which uses radar, believing that the car in front of you is speeding up, causing your own car to accelerate, when in reality it isn’t changing speed at all,” Pajic said. “If this were done at night, by the time your car’s cameras detected it, you would be in trouble.”
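The cruise-control scenario turns on the other quantity automotive radar measures: relative velocity, read from the Doppler shift of the echo (f_d = 2·v·f_c/c). Shifting a replayed echo’s frequency therefore changes the speed the radar reports even if the real car ahead never moves. The numbers below are a back-of-the-envelope sketch using a typical 77 GHz automotive carrier, assumed for illustration rather than taken from the paper:

```python
# Generic illustration (not from the MadRadar paper): the relative
# velocity an automotive radar infers from a Doppler-shifted echo.
#   f_d = 2 * v * f_c / c   (v > 0 means the target is closing)
C = 299_792_458.0   # speed of light, m/s
F_C = 77e9          # typical automotive radar carrier, Hz (assumed)

def doppler_shift_hz(rel_velocity_mps: float) -> float:
    """Doppler shift produced by a given relative velocity."""
    return 2 * rel_velocity_mps * F_C / C

def inferred_velocity_mps(doppler_hz: float) -> float:
    """Relative velocity the radar infers from a measured shift."""
    return doppler_hz * C / (2 * F_C)

# A lead car holding its distance returns ~0 Hz of Doppler. A spoofer
# offsetting its replay by roughly -2.57 kHz makes the radar see that
# car "pulling away" at about 5 m/s, coaxing cruise control to speed up.
shift = doppler_shift_hz(-5.0)
```

The point of the sketch is how small the forgery is: a kilohertz-scale frequency offset on a replayed signal is enough to fabricate meters-per-second of phantom motion.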
The results, the scientists said, reveal that automakers need to rethink how they implement protections against radar spoofing in their vehicles. Manufacturers should take steps to better protect vehicles and their occupants, they added, although the team did not specify how.