New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors

New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors
Summary: Researchers have identified a new supply chain attack method called Rules File Backdoor, targeting AI-powered code editors like GitHub Copilot and Cursor. This technique allows hackers to inject malicious code into AI-generated code through hidden instructions in configuration files. As a result, this poses significant supply chain risks as the compromised code can propagate silently across various projects.

Affected: GitHub Copilot, Cursor

Keypoints :

  • The attack manipulates AI tools into generating unsafe code by embedding malicious prompts in rule files.
  • Hackers exploit invisible characters and evasion techniques, allowing malicious code to bypass code reviews.
  • The compromised rule files can propagate through projects and survive project forking, affecting downstream dependencies.

Source: https://thehackernews.com/2025/03/new-rules-file-backdoor-attack-lets.html