Posts: 14,434
Threads: 9,515
Thanks Received: 9,034 in 7,184 posts
Thanks Given: 9,806
Joined: 12 September 18
27 May 21, 06:34
(This post was last modified: 27 May 21, 06:34 by harlan4096.)
Quote:
Researchers at RSA Conference 2021 demonstrated how Tesla and Mobileye autopilots can be tricked by “phantom” images.
It’s a common movie plot device: the main character thinks they see someone step onto the road, swerves, and ends up in a ditch. Now imagine it’s real — sort of — and instead of a trick of the light or the mind, that image comes from a cybercriminal projecting, for a split second, something the car autopilot is programmed to respond to. Researchers from Georgia Tech and Ben-Gurion University of the Negev demonstrated that sort of “phantom attack” threat at RSA Conference 2021.
The idea of showing dangerous images to AI systems is not new. Techniques usually involve using modified images to force the AI to draw an unexpected conclusion. All machine-learning algorithms have this Achilles heel; knowing which attributes are key to image recognition — that is, knowing a bit about the algorithm — makes it possible to modify images so as to hinder the machine’s decision-making process or even force it to make a mistake.
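To make the "Achilles heel" concrete, here is a minimal sketch of that kind of attack against a toy linear classifier standing in for a real vision model. Everything in it — the weights, the input, the threshold — is an illustrative assumption, not the researchers' setup; the point is only that knowing which attributes drive the decision (here, the weight vector) lets an attacker compute a per-pixel nudge that flips the output.

```python
import numpy as np

# Toy "image classifier": score = w . x + b, label = sign(score).
# A hypothetical stand-in for a real machine-learning model.
rng = np.random.default_rng(0)
w = rng.normal(size=64)  # the model's "learned" weights, one per pixel
b = 0.0

def classify(x):
    """Return +1 ("object present") or -1 ("no object")."""
    return 1 if w @ x + b > 0 else -1

# An input the model confidently labels +1.
x = np.abs(rng.normal(size=64)) * np.sign(w)

# The attacker knows which attributes matter (the weights w), so they
# can compute the smallest uniform per-pixel nudge, against the sign of
# each weight, that pushes the score just past the decision boundary.
margin = w @ x + b
eps = 1.01 * margin / np.sum(np.abs(w))
x_adv = x - eps * np.sign(w)

print(classify(x))      # original image: +1
print(classify(x_adv))  # adversarially modified image: -1
```

The modified image differs from the original by the same small amount at every pixel, yet the model's decision flips — which is why the conference demo's use of *unmodified* images, described next, is the more surprising result.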
The novelty of the approach demonstrated at RSA Conference 2021 is that the autopilot was shown unmodified images — an attacker need not know how the algorithm works or what attributes it uses. The images were briefly projected onto the road and nearby stationary objects, with the following consequences.
In a variation on the theme, the images appeared for a fraction of a second in a commercial on a billboard by the side of the road, with essentially the same outcome:
Thus, the authors of the study concluded, cybercriminals can cause havoc from a safe distance, with no danger of leaving evidence at the scene of the crime.
All they need to know is how long they have to project the image to fool the AI (self-driving cars have a trigger threshold to reduce their likelihood of producing false positives from, for example, dirt or debris on the camera lens or lidar).
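The trigger-threshold logic the article alludes to can be sketched as a simple temporal-persistence filter: a detection must survive several consecutive frames before the autopilot reacts, which screens out one-frame flickers from dirt or debris — but an attacker who knows the threshold just projects the phantom for exactly that long. The function name and frame counts below are assumptions for illustration, not any vendor's actual implementation.

```python
def autopilot_reacts(frames, threshold=3):
    """Return True if a detection persists for `threshold` consecutive
    frames. `frames` is a list of booleans, one per camera frame,
    True meaning "object detected in this frame"."""
    run = 0
    for seen in frames:
        run = run + 1 if seen else 0  # reset the streak on any miss
        if run >= threshold:
            return True
    return False

# A speck on the lens that flickers in and out is ignored...
print(autopilot_reacts([True, False, True, False]))       # False
# ...but a phantom projected for just `threshold` frames gets through.
print(autopilot_reacts([False, True, True, True, False])) # True
```

This is the trade-off the next sentence describes: a higher threshold means fewer false positives but a slightly longer reaction delay, which developers can absorb because braking distances are long anyway.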
A car’s braking distance is measured in dozens of feet anyway, so adding a few more to allow for better situational assessment was no great sacrifice for AI developers.
...
Continue Reading