Hacked Billboards Can Make Teslas See 'Phantom Objects,' Causing Them to Swerve or Stop Abruptly

Security researchers have demonstrated how Tesla's Autopilot driver-assistance systems can be tricked into changing speed, swerving or stopping abruptly, simply by projecting fake road signs or virtual objects in front of them.

Their hacks worked both on a Tesla running HW3, the latest version of the company's Autopilot hardware, and on one running the previous generation, HW2.5.

The most concerning finding is that a fake road sign needs to be displayed for less than half a second to trigger a response from Tesla's system.

In one example cited by the researchers, a "Stop" sign hidden within a fast-food commercial successfully caused a Tesla running in Autopilot mode to stop, despite the sign flashing on-screen for only a fraction of a second.

The system also recognized virtual projections of people and cars as real objects, slowing the car to avoid colliding with them, and it was tricked by a drone that projected a fake speed sign onto a wall.

The researchers, from Ben-Gurion University of the Negev, said their findings "reflect a fundamental flaw of models that detect objects [but] were not trained to distinguish between real and fake objects."
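The flaw the researchers describe is easier to see in a toy example. The sketch below is purely illustrative and is not Tesla's or Mobileye's actual software; every name in it (Frame, detect_signs, plan_action) is invented. It models a vision pipeline that acts on any frame containing something that looks like a sign, with no check that the sign is a physical object or that it persists long enough to be plausible, so a phantom flashed for under half a second, roughly a dozen frames at 30 fps, still triggers braking.

```python
# Illustrative sketch only -- hypothetical names, not any vendor's real pipeline.
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp_ms: int
    # In a real system this would be camera pixels; here we just label whether
    # the frame happens to contain something shaped like a stop sign.
    contains_stop_sign_shape: bool

def detect_signs(frame: Frame) -> list[str]:
    """A naive detector: classifies by appearance only, so a projected or
    on-screen stop sign is indistinguishable from a real one."""
    return ["STOP"] if frame.contains_stop_sign_shape else []

def plan_action(detections: list[str]) -> str:
    """Acts on any single-frame detection -- no persistence or plausibility check."""
    return "BRAKE" if "STOP" in detections else "CONTINUE"

# A 'phantom' stop sign embedded in a billboard ad for roughly 420 ms
# (about 13 frames at 30 fps) is enough to trigger braking.
frames = [Frame(t, 1000 <= t < 1420) for t in range(0, 3000, 33)]
for f in frames:
    if plan_action(detect_signs(f)) == "BRAKE":
        print(f"Brake triggered at t={f.timestamp_ms} ms by a sign visible for under 0.5 s")
        break
```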

It's easy to imagine how a bad actor could use the shortcoming to cause an accident or a traffic jam by, for instance, hacking into a digital billboard.

Such attacks have the potential to be both dangerous and easy to carry out because they "can be applied remotely (using a drone equipped with a portable projector or by hacking digital billboards that face the Internet and are located close to roads), thereby eliminating the need to physically approach the attack scene, changing the exposure vs. application balance," the researchers wrote.

They're also so fleeting that they're difficult for the human eye to detect, and they leave behind very little evidence.

Photo: The interior of a Tesla Model X electric luxury crossover SUV, with a large touch screen and carbon-look dashboard, on display at Brussels Expo on January 9 in Brussels, Belgium.

Similar hacks also worked on the Mobileye 630 driver-assistance system, because it, like Tesla's system, relies on visual recognition through cameras.

The researchers confirmed that these attacks would not have fooled autopilot systems that rely on LIDAR, which measures distances and maps surroundings with the use of lasers.

Tesla CEO Elon Musk, however, has been consistently critical of LIDAR, a more expensive system, famously proclaiming in 2019: "Lidar is a fool's errand. Anyone relying on lidar is doomed."

The researchers have informed Tesla of their findings. The company insists that Autopilot requires "active driver supervision and [does] not make the vehicle autonomous."


