Cybersecurity researchers have disclosed details of a new supply chain attack vector dubbed Rules File Backdoor that affects artificial intelligence (AI)-powered code editors like GitHub Copilot and Cursor, causing them to inject malicious code.
“This technique enables hackers to silently compromise AI-generated code by injecting hidden malicious instructions into seemingly innocent configuration files used by Cursor and GitHub Copilot,” Pillar Security’s co-founder and CTO Ziv Karliner said in a technical report shared with The Hacker News.
“By exploiting hidden Unicode characters and sophisticated evasion techniques in the model-facing instruction payload, threat actors can manipulate the AI to insert malicious code that bypasses typical code reviews.”
The attack vector is notable for the fact that it allows malicious code to silently propagate across projects, posing a supply chain risk.
The crux of the attack hinges on the rules files that are used by AI agents to guide their behavior, helping users to define best coding practices and project architecture.
Specifically, it involves embedding carefully crafted prompts within seemingly benign rule files, causing the AI tool to generate code containing security vulnerabilities or backdoors. In other words, the poisoned rules nudge the AI into producing nefarious code.
This can be accomplished in two complementary ways: by using zero-width joiners, bidirectional text markers, and other invisible characters to conceal malicious instructions from human reviewers, and by exploiting the AI’s ability to interpret natural language, using semantic patterns that trick the model into overriding its ethical and safety constraints and generating vulnerable code.
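To make the invisible-character angle concrete, the sketch below (illustrative only, not taken from Pillar Security’s report) shows how zero-width code points can smuggle a hidden payload onto a line that renders as innocuous text, and how trivially a character-level scan exposes it. The `hide` and `find_invisibles` helpers are hypothetical names introduced for this example.

```python
# Illustrative sketch: hiding a payload inside an apparently benign
# rules-file line using zero-width Unicode characters, and detecting it.
# The encoding scheme here (bits as ZWSP/ZWNJ) is a simplification of
# the general idea, not the exact technique described in the report.

ZW_CHARS = {
    "\u200b",  # zero-width space
    "\u200c",  # zero-width non-joiner
    "\u200d",  # zero-width joiner
    "\u2060",  # word joiner
    "\u202a", "\u202b", "\u202c", "\u202d", "\u202e",  # bidi controls
}

def hide(cover: str, payload: str) -> str:
    """Append the payload's bits to a cover line as invisible characters."""
    bits = "".join(f"{ord(c):08b}" for c in payload)
    return cover + "".join("\u200b" if b == "0" else "\u200c" for b in bits)

def find_invisibles(text: str) -> list[tuple[int, str]]:
    """Return (index, codepoint) for every hidden character in the text."""
    return [(i, f"U+{ord(ch):04X}") for i, ch in enumerate(text) if ch in ZW_CHARS]

line = hide("Always follow our style guide.", "evil")
print(line)                        # renders exactly like the plain cover text
print(len(find_invisibles(line)))  # 32 hidden characters (4 bytes x 8 bits)
```

Because editors and diff viewers typically render these code points as nothing at all, a reviewer sees only the cover sentence, while the model receives the full character stream.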
Following responsible disclosure in late February and March 2025, both Cursor and GitHub have stated that users are responsible for reviewing and accepting suggestions generated by the tools.
“‘Rules File Backdoor’ represents a significant risk by weaponizing the AI itself as an attack vector, effectively turning the developer’s most trusted assistant into an unwitting accomplice, potentially affecting millions of end users through compromised software,” Karliner said.
“Once a poisoned rule file is incorporated into a project repository, it affects all future code-generation sessions by team members. Furthermore, the malicious instructions often survive project forking, creating a vector for supply chain attacks that can affect downstream dependencies and end users.”
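Given that poisoned rule files persist across forks and team sessions, one practical mitigation is to scan repositories for invisible Unicode before rule files are merged. The sketch below is a minimal, hypothetical pre-commit-style check, flagging any character in Unicode’s “format” (`Cf`) category, which covers zero-width joiners and bidirectional controls; the file name `.cursorrules` is used as an example target.

```python
# Hypothetical repository check: flag invisible Unicode "format" (Cf)
# characters in rule/config files, reporting line, column, and codepoint.
import pathlib
import tempfile
import unicodedata

def flag_hidden(path: pathlib.Path) -> list[tuple[int, int, str]]:
    """Return (line, column, codepoint) for each Cf-category character."""
    hits = []
    for lineno, line in enumerate(path.read_text(encoding="utf-8").splitlines(), 1):
        for col, ch in enumerate(line, 1):
            if unicodedata.category(ch) == "Cf":
                hits.append((lineno, col, f"U+{ord(ch):04X}"))
    return hits

# Demo: a rules file with three hidden characters on an innocent-looking line.
tmp = pathlib.Path(tempfile.mkdtemp()) / ".cursorrules"
tmp.write_text("use strict mode\u200d\u202ehidden\u202c\n", encoding="utf-8")
print(flag_hidden(tmp))  # three hits: U+200D, U+202E, U+202C
```

A check like this would not catch purely semantic prompt manipulation, but it closes off the invisible-character channel that makes these payloads survive human code review.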
https://thehackernews.com/2025/03/new-rules-file-backdoor-attack-lets.html