Article 73BEQ Autonomous Cars Vulnerable to Prompt Injection

Autonomous Cars Vulnerable to Prompt Injection

by
janrinok
from SoylentNews on (#73BEQ)

looorg writes:

https://www.theregister.com/2026/01/30/road_sign_hijack_ai/?td=keepreading
https://the-decoder.com/a-printed-sign-can-hijack-a-self-driving-car-and-steer-it-toward-pedestrians-study-shows/

Autonomous vehicles fooled by humans with signs. They apparently do not really verify their inputs, one is as good as the next one. So they fail even basic programming techniques of sanitizing and verifying inputs.

[quote]The researchers at the University of California, Santa Cruz, and Johns Hopkins showed that, in simulated trials, AI systems and the large vision language models (LVLMs) underpinning them would reliably follow instructions if displayed on signs held up in their camera's view.[/quote]

Commands in Chinese, English, Spanish, and Spanglish (a mix of Spanish and English words) all seemed to work.

As well as tweaking the prompt itself, the researchers used AI to change how the text appeared - fonts, colors, and placement of the signs were all manipulated for maximum efficacy.

The team behind it named their methods CHAI, an acronym for "command hijacking against embodied AI."

While developing CHAI, they found that the prompt itself had the biggest impact on success, but the way in which it appeared on the sign could also make or break an attack, although it is not clear why.

In tests with the DriveLM autonomous driving system, attacks succeeded 81.8 percent of the time. In one example, the model braked in a harmless scenario to avoid potential collisions with pedestrians or other vehicles.

But when manipulative text appeared, DriveLM changed its decision and displayed "Turn left." The model reasoned that a left turn was appropriate to follow traffic signals or lane markings, despite pedestrians crossing the road. The authors conclude that visual text prompts can override safety considerations, even when the model still recognizes pedestrians, vehicles, and signals.

Original Submission

Read more of this story at SoylentNews.

External Content
Source RSS or Atom Feed
Feed Location https://soylentnews.org/index.rss
Feed Title SoylentNews
Feed Link https://soylentnews.org/
Feed Copyright Copyright 2014, SoylentNews
Reply 0 comments