Instead of coping with rush-hour traffic, just get into the car, sit back and read a newspaper or dig into the latest news from your smartphone – without fear of accidents. It’s every driver’s dream, but there are major complications.
Researchers at Ben-Gurion University of the Negev (BGU) in Beersheba have projected a phantom image on the road in front of a semi-autonomous car, causing its autopilot to brake suddenly. By doing so, the researchers showed that autopilots and advanced driving assistance systems (ADASs) in semi-autonomous or fully autonomous cars consider depthless projections of objects (phantoms) as real objects.
In a research project entitled “Phantom of the ADAS,” they show how cyber-attackers can exploit this perceptual challenge and manipulate a car into endangering its passengers. They also show how attackers can fool a driving assistance system into believing that fake road signs are real by embedding phantoms for just 125 milliseconds in advertisements displayed on digital billboards located near roads.
BGU’s Cyber Security Research Center is now developing a system to prevent driving assistance systems from being fooled by phantom people and signs.
While the deployment of semi/fully autonomous cars has already begun in countries around the globe, the deployment of vehicular communication systems is delayed. Vehicular communication systems link the car with other vehicles, pedestrians and surrounding infrastructure. The lack of such systems creates a “validation gap” that prevents semi/fully autonomous vehicles from validating their virtual perception with a third party, forcing them to rely solely on their sensors.
Ben Nassi, a doctoral student of Prof. Yuval Elovici and his research team, showed that projecting a phantom of a person can trigger a car to suddenly brake, while a phantom image of lanes can cause its autopilot to veer into the oncoming traffic lane.
Elovici is director of the BGU Cyber Security Research Center and Deutsche Telekom Innovation Labs@BGU, which is located in the Advanced Technologies Park near the University. He and Nassi are members of the university’s department of software and information systems engineering. The Cyber Security Research Center is a joint initiative of BGU and Israel’s National Cyber Bureau.
“This is not a bug. This is not the result of poor code implementation. This is a fundamental flaw in object detectors that essentially use feature matching for detecting visual objects and were not trained to distinguish between real and fake objects. This type of attack is currently not taken into consideration by the automobile industry,” declared Nassi.
The researchers have also developed a method to conduct attacks remotely by projecting a phantom road sign from a drone and disguising a phantom road sign in an advertisement for 125 milliseconds screened on a digital billboard alongside a road.
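To see why 125 milliseconds is short enough to slip past human viewers yet long enough for a camera-based system to register, it helps to count frames: at common display refresh rates, 125 ms spans only a few frames. The quick illustration below is not from the study; the frame rates chosen are assumptions for the sake of the arithmetic.

```python
import math

def frames_for_duration(duration_ms: float, fps: float) -> int:
    """Number of whole frames needed to show content for duration_ms at fps."""
    return math.ceil(duration_ms / 1000 * fps)

# A 125 ms phantom at typical billboard/video frame rates:
for fps in (24, 30, 60):
    n = frames_for_duration(125, fps)
    print(f"{fps} fps: {n} frames (~{n / fps * 1000:.0f} ms on screen)")
```

Three to eight frames is well below the threshold at which most viewers would consciously notice an inserted image, which is what makes the billboard variant of the attack so stealthy.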
While previous attacks exploiting the validation gap required skilled attackers to approach the scene of the attack, Nassi shows that remote attacks requiring no special expertise can fool advanced systems with nothing more than a drone and a projector.
In practice, depthless objects projected on the road are considered real even though the cars are equipped with depth sensors. The researchers believe that this is the result of a “Better Safe Than Sorry” policy that causes the car to treat a visual 2D object as real.
To detect phantoms, the researchers are developing a convolutional neural network model that analyzes a detected object’s context, surface and reflected light and is capable of detecting phantoms with high accuracy.
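The article does not describe the countermeasure's architecture in detail. As a rough sketch of the idea, a convolutional model can score an image crop of a detected object and output the probability that it is a phantom; the researchers' actual model reportedly combines context, surface and reflected-light cues, which a real implementation might handle as separate input branches. Everything below, including all names and dimensions, is a hypothetical illustration with untrained random weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Naive 'valid' cross-correlation. x: (H, W, Cin), w: (k, k, Cin, Cout)."""
    k, _, _, cout = w.shape
    H, W, _ = x.shape
    out = np.zeros((H - k + 1, W - k + 1, cout))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = x[i:i + k, j:j + k, :]
            out[i, j, :] = np.tensordot(patch, w, axes=([0, 1, 2], [0, 1, 2]))
    return out

def phantom_score(crop, w_conv, w_fc):
    """Score one crop of a detected object: returns P(phantom) in (0, 1).

    A real system would run (at least) one such branch per cue -- the
    object's context, its surface texture, and its reflected light --
    and fuse the branch outputs before the final classification.
    """
    h = np.maximum(conv2d(crop, w_conv), 0)   # conv + ReLU
    pooled = h.mean(axis=(0, 1))              # global average pool -> (Cout,)
    logit = pooled @ w_fc                     # linear classifier head
    return 1 / (1 + np.exp(-logit))           # sigmoid

# Hypothetical usage with random weights and a random 16x16 RGB crop:
crop = rng.normal(size=(16, 16, 3))
w_conv = rng.normal(size=(3, 3, 3, 4)) * 0.1
w_fc = rng.normal(size=4)
score = phantom_score(crop, w_conv, w_fc)
print(f"P(phantom) = {score:.3f}")
```

The design intuition is that a projected phantom lacks the physical cues a real object produces, such as consistent surface texture and plausible lighting, so a classifier trained on crops of real versus projected objects can separate the two even when the underlying object detector cannot.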