
Split-Second 'Phantom' Images Can Fool Tesla's Autopilot


By Andy Greenberg



And they warn that if hackers hijacked an internet-connected billboard to carry out the trick, it could be used to cause traffic jams or even road accidents while leaving little evidence behind. "The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that's dangerous," says Yisroel Mirsky, a researcher at Ben Gurion University and Georgia Tech who worked on the research, which will be presented next month at the ACM Computer and Communications Security conference.

They managed to make a Tesla stop for a phantom pedestrian that appeared for a fraction of a second, and tricked the Mobileye device into communicating the incorrect speed limit to the driver with a projected road sign. In this latest set of experiments, the researchers injected frames of a phantom stop sign on digital billboards, simulating what they describe as a scenario in which someone hacked into a roadside billboard to alter its video.

More recently, another Chinese team found they could exploit Tesla's lane-follow technology to trick a Tesla into changing lanes just by planting cheap stickers on a road. "Somebody's car will just react, and they won't understand why," says Mirsky. But the Ben Gurion researchers point out that unlike those earlier methods, their projections and hacked-billboard tricks don't leave behind physical evidence.

The result, the researchers say, could far more reliably defeat their phantom attacks without perceptibly slowing down a camera-based autonomous driving system's reactions. Ben Gurion's Nassi concedes that the Ghostbuster system isn't perfect, and he argues that their phantom research shows the inherent difficulty of making autonomous driving decisions even with multiple sensors, like a Tesla's combined radar and camera.
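The attack works because the phantom object only needs to appear for a handful of video frames to trigger a reaction. One intuitive (if imperfect) mitigation is temporal persistence: require a detection to survive several consecutive frames before the vehicle acts on it. The sketch below is a hypothetical illustration of that idea only; it is not the researchers' "Ghostbuster" system, and the class and parameter names (`PersistenceFilter`, `min_frames`) are invented for this example.

```python
class PersistenceFilter:
    """Hypothetical sketch: suppress detections that appear for fewer
    than `min_frames` consecutive frames, so a phantom flashed for a
    split second never reaches the driving logic.
    NOT the Ghostbuster system described in the research."""

    def __init__(self, min_frames=5):
        self.min_frames = min_frames
        self.streaks = {}  # detection label -> consecutive-frame count

    def update(self, detections):
        """detections: set of labels seen in the current frame.
        Returns the subset considered trustworthy this frame."""
        # Reset streaks for labels that vanished this frame.
        for label in list(self.streaks):
            if label not in detections:
                del self.streaks[label]
        confirmed = set()
        for label in detections:
            self.streaks[label] = self.streaks.get(label, 0) + 1
            if self.streaks[label] >= self.min_frames:
                confirmed.add(label)
        return confirmed
```

The obvious cost, and one reason such a filter alone is insufficient, is latency: at 30 fps, requiring 5 frames of persistence delays every genuine detection by roughly 170 ms, which is exactly the tradeoff the article alludes to when it notes that a countermeasure must not "perceptibly slow down" the system's reactions.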

As reported by Wired.