Autopilot systems can be tricked by projected 2D images


Matthew Beedham, The Next Web


Despite having complex sensor arrangements and thousands of hours of research and development poured into them, self-driving cars and autopilot systems can be gamed with consumer hardware and 2D image projections.

Researchers at Ben-Gurion University of the Negev in Israel were able to trick self-driving cars, including a Tesla Model X, into braking and taking evasive action to avoid "depthless phantom objects."

The image below demonstrates how a two-dimensional projection tricked a Tesla's Autopilot system into thinking a person was standing in the road. To the human eye it is clearly a hologram of sorts that poses no physical threat, but the car perceives it otherwise.

The same technique of phantom image projection can be used to trick autopilot systems into "thinking" any number of objects lie in the road ahead, including cars, trucks, people, and motorcycles. The researchers were even able to fool the system's speed limit warning feature with a phantom road sign.

Using phantom images, the researchers got Tesla's Autopilot system to brake suddenly. They even managed to make the Tesla deviate from its lane by projecting new road markings onto the tarmac. Take a look at the outcome of the researchers' experiments in their video below.

More worrying still, phantom image attacks can be carried out remotely, using drones or by hacking video billboards. In some cases, the phantom image can appear and disappear faster than the human eye can detect, or before a person notices.
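The researchers' actual pipeline isn't published in this article, but the core weakness they exploit can be sketched in a few lines: a system that trusts a high-confidence 2D camera detection on its own will react to a flat projection, while one that cross-checks against a depth reading can reject it. Everything below (class names, thresholds, the depth check) is an invented illustration, not Tesla's or the researchers' code.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str
    confidence: float              # 2D image-classifier confidence
    median_depth_m: Optional[float]  # depth reading over the box; None if no solid return

def should_brake_naive(det: Detection) -> bool:
    # Trusts the camera alone: a convincing flat projection of a person
    # scores high confidence and triggers braking.
    return det.label == "person" and det.confidence > 0.8

def should_brake_cross_checked(det: Detection) -> bool:
    # Also requires a corroborating depth return at a plausible range.
    # A "depthless phantom" projected onto the tarmac produces none.
    if det.label != "person" or det.confidence <= 0.8:
        return False
    return det.median_depth_m is not None and det.median_depth_m < 30.0

# A flat projection: classified as a person, but with no solid surface behind it.
phantom = Detection(label="person", confidence=0.95, median_depth_m=None)
print(should_brake_naive(phantom))          # True: the phantom triggers braking
print(should_brake_cross_checked(phantom))  # False: rejected without depth evidence
```

The point of the sketch is only that the attack needs no hardware tampering: it feeds a legitimate-looking input to the camera, and any decision logic that does not demand agreement from a second modality will act on it.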
