New “GhostStripe” Attack Could Blind Tesla and Baidu Autopilots
A group of scientists from Singapore has developed a method to interfere with the operation of autonomous vehicles that use computer vision for road sign recognition. The new technique, called GhostStripe, poses a potential hazard to drivers of Tesla and Baidu Apollo vehicles.
The core idea of GhostStripe is to use LEDs to paint light patterns onto road signs. The flicker is too fast for the human eye to notice, but it confuses the vehicle’s cameras: the attack rapidly cycles the LEDs through different colors while the camera is capturing a frame, distorting the resulting image.
These distortions arise from the rolling-shutter readout used by CMOS sensors. Such cameras capture an image row by row rather than all at once, so the flashing LEDs illuminate each group of rows with a different color. As a result, the captured image no longer corresponds to reality: the red of a “Stop” sign, for instance, can take on a different shade on each scan line.
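This row-by-row mechanism is easy to model. The sketch below is a minimal rolling-shutter simulation in Python with NumPy; the frame size, per-row readout time, and LED cycle are illustrative numbers, not values from the paper. Each image row is tinted with whichever LED color happens to be lit while that row is read out, reproducing the banding described above.

```python
import numpy as np

# Illustrative timing values only -- the real parameters differ per
# camera and are derived experimentally in the paper.
ROWS = 480                # sensor height in rows
LINE_READOUT_US = 30.0    # time to read out one row, in microseconds
LED_PERIOD_US = 900.0     # one full cycle through the LED color sequence

# Color sequence the attacker flashes (RGB in 0..1).
LED_COLORS = np.array([
    [1.0, 0.0, 0.0],  # red
    [0.0, 1.0, 0.0],  # green
    [0.0, 0.0, 1.0],  # blue
])

def striped_frame(scene: np.ndarray) -> np.ndarray:
    """Tint each row of `scene` with whichever LED color is lit while
    that row is being read out (simple rolling-shutter model)."""
    out = scene.copy()
    for row in range(ROWS):
        t = row * LINE_READOUT_US                    # readout time of this row
        phase = (t % LED_PERIOD_US) / LED_PERIOD_US  # position in the LED cycle
        color = LED_COLORS[int(phase * len(LED_COLORS))]
        out[row] = out[row] * 0.5 + color * 0.5      # blend row with LED tint
    return out

# A flat grey stand-in for a sign: after the "attack" its rows carry
# alternating color bands that no longer match the sign's true colors.
scene = np.full((ROWS, 640, 3), 0.6)
attacked = striped_frame(scene)
```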
When such a distorted image reaches the vehicle’s deep-neural-network classifier, the system fails to recognize the sign and does not respond to it. Similar attacks had been demonstrated before, but this team achieved stable, repeatable results, which is what makes the attack practical on real roads.
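To make the failure mode concrete, here is a toy continuation of the sketch above (it reuses `striped_frame`, `ROWS`, and the constants defined there). The classifier is a deliberately crude stand-in, since the production sign-recognition networks are not public: it accepts a stop sign only when red dominates in nearly every row, which is exactly the property the striping destroys.

```python
def toy_classifier(frame: np.ndarray) -> str:
    """Crude stand-in for the vehicle's deep-learning sign classifier:
    call the frame a stop sign only if red dominates nearly everywhere."""
    red_dominant = frame[..., 0] > 1.5 * frame[..., 1:].max(axis=-1)
    per_row = red_dominant.mean(axis=1)  # fraction of red-dominant pixels per row
    return "stop" if (per_row > 0.5).mean() > 0.8 else "unknown"

def attack_success_rate(frames, true_label="stop") -> float:
    """Fraction of attacked frames on which the sign is no longer seen."""
    return sum(toy_classifier(striped_frame(f)) != true_label
               for f in frames) / len(frames)

# A uniformly red "sign" is recognized when clean but lost once striped:
# the green- and blue-tinted rows no longer look red to the classifier.
red_sign = np.zeros((ROWS, 640, 3))
red_sign[..., 0] = 0.9
print(toy_classifier(red_sign))              # -> "stop"
print(attack_success_rate([red_sign] * 10))  # -> 1.0 for this toy model
```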
The researchers developed two versions of the attack:
- GhostStripe1 requires no access to the vehicle. It uses a tracking system to follow the vehicle’s position in real time and dynamically adjusts the LED flashing so that the sign in view goes unrecognized.
- GhostStripe2 requires physical access to the vehicle: a sensor is attached to the camera’s power wire to detect the exact moments when frames are scanned, giving precise control over the attack’s timing (see the timing sketch after this list). This makes it possible to target a specific vehicle and manipulate its sign-recognition results.
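The timing discipline behind GhostStripe2 can be sketched as follows, under loudly stated assumptions: the frame-start timestamp is taken as given (the researchers derive it from the camera’s power wire, which is not modeled here), `set_led` is a hypothetical driver for the LED rig, and the sensor geometry and readout time are illustrative.

```python
import time

LINE_READOUT_S = 30e-6   # assumed per-row readout time
SENSOR_ROWS = 480        # assumed sensor height
BAND_ROWS = 60           # rows covered by one LED color band

def set_led(color: str) -> None:
    """Hypothetical hardware call that switches the LED array's color."""
    print(f"LED -> {color}")  # stand-in for real hardware I/O

def stripe_one_frame(frame_start_s: float,
                     colors=("red", "green", "blue")) -> None:
    """Flip the LED color in lockstep with the sensor's row readout so
    that each band of BAND_ROWS rows is captured under its own tint."""
    for band in range(SENSOR_ROWS // BAND_ROWS):
        target = frame_start_s + band * BAND_ROWS * LINE_READOUT_S
        while time.monotonic() < target:  # busy-wait until the band starts
            pass
        set_led(colors[band % len(colors)])

# GhostStripe2 reads the frame start from hardware; GhostStripe1 must
# instead predict it from the tracked position of the vehicle.
stripe_one_frame(time.monotonic())
```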
The team tested the attack on real roads against a Leopard Imaging AR023ZWDR camera, the model deployed in Baidu Apollo hardware. Tests covered “Stop,” “Yield,” and speed-limit signs: GhostStripe1 achieved a 94% success rate, and GhostStripe2 achieved 97%.
Strong ambient light reduces the attack’s effectiveness by washing out the LED illumination, so attackers must choose the time and location of an attack carefully.
Several countermeasures could reduce the vulnerability. Replacing rolling-shutter CMOS cameras with global-shutter sensors, which capture the whole frame at once, would eliminate the line-by-line distortion; randomizing the order in which rows are scanned would also help, as the toy sketch below illustrates. Adding redundant cameras lowers the chance of a successful attack, or at least forces more sophisticated methods, and neural networks could additionally be trained to recognize such attacks.
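As a toy illustration of the randomized-scan idea, the variant below reuses the constants and `scene` from the first sketch but reads rows out in a secret shuffled order. The LED tint a row receives now depends on when it happens to be read, which the attacker cannot predict, so the neat color bands scatter into isolated rows and lose their adversarial structure.

```python
rng = np.random.default_rng(0)  # reuses numpy and the first sketch's constants

def striped_frame_random_order(scene: np.ndarray,
                               row_order: np.ndarray) -> np.ndarray:
    """Same rolling-shutter model as before, except rows are read out in
    a secret random order instead of top to bottom."""
    out = scene.copy()
    for slot, row in enumerate(row_order):
        t = slot * LINE_READOUT_US                   # readout time of this slot
        phase = (t % LED_PERIOD_US) / LED_PERIOD_US
        color = LED_COLORS[int(phase * len(LED_COLORS))]
        out[row] = out[row] * 0.5 + color * 0.5
    return out

# With a shuffled readout order the attacker's coherent bands break up
# into scattered single rows.
defended = striped_frame_random_order(scene, rng.permutation(ROWS))
```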