
AI Code Generation Tool Created Non-existent NPM Package Leading to Supply Chain Attack

High

An AI coding assistant hallucinated a non-existent NPM package name, which was then registered by attackers and used to distribute malware to approximately 1,500 developers through supply chain compromise.

Category
Safety Failure
Industry
Technology
Status
Under Investigation
Date Occurred
Jan 15, 2025
Date Reported
Jan 20, 2025
Jurisdiction
International
AI Provider
Other/Unknown
Application Type
copilot
Harm Type
operational
Estimated Cost
$2,500,000
People Affected
1,500
Human Review in Place
No
Litigation Filed
No
supply_chain, npm, package_hallucination, malware, developer_tools, code_generation, security

Full Description

In January 2025, security researchers documented a novel attack vector, termed 'AI package hallucination', in which artificial intelligence coding assistants generate references to non-existent software packages that are subsequently exploited by malicious actors. The incident began when an AI coding tool suggested importing a plausibly named but fictional NPM package during code completion for a popular web development framework. The suggested name appeared legitimate and followed common naming conventions, making it difficult for developers to distinguish from authentic packages.

Within hours of the AI tool suggesting this non-existent package, threat actors registered the package name on the NPM registry and populated it with malicious code designed to exfiltrate environment variables, API keys, and source code from developer machines. The malware was sophisticated, using obfuscation techniques and delayed execution to avoid immediate detection. Security researchers estimate that approximately 1,500 developers across multiple organizations installed the malicious package before it was identified and removed.

The attack was particularly effective because it exploited the trust developers place in AI-generated code suggestions. Many developers installed the package without thoroughly vetting it, assuming that the AI tool had validated its legitimacy. The incident affected development teams at technology companies, startups, and open-source projects worldwide, with some organizations reporting potential exposure of sensitive credentials and intellectual property.

The financial impact is estimated at $2.5 million, including incident response costs, security audits, credential rotation, and development delays across affected organizations. Several major technology companies had to conduct emergency security reviews and implement additional supply chain security measures.
The NPM registry operators worked with security researchers to quickly remove the malicious package and implement monitoring for similar threats. The incident highlighted a previously underexplored attack vector in AI-assisted development environments. Security researchers noted that as AI coding tools become more prevalent, the risk of package hallucination attacks increases, particularly for programming languages with large, decentralized package ecosystems.

The incident prompted discussions about the need for AI coding tools to validate suggestions against existing package registries and to implement safeguards against generating references to non-existent dependencies. The broader implications extend beyond this single incident: security experts warn that similar attacks could target other package managers, including PyPI, RubyGems, and other language-specific repositories. Organizations are now implementing additional validation steps in their development workflows and considering AI-specific security controls to prevent similar supply chain compromises.
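The validation workflows described above start with knowing which packages a generated snippet actually pulls in. The sketch below, which is illustrative and not part of any reported tooling, extracts top-level npm package names from AI-generated JavaScript so they can be vetted before installation; the regex-based parsing is a stated simplification, not a full JavaScript parser.

```python
import re

# Matches bare module specifiers in `import ... from "pkg"` and `require("pkg")`.
# A heuristic sketch: a regex cannot fully parse JavaScript; dynamic imports
# and unusual formatting will be missed.
IMPORT_RE = re.compile(
    r"""(?:import\s+(?:[\w*{},\s]+\s+from\s+)?|require\()\s*['"]([^'"]+)['"]"""
)

def extract_packages(source: str) -> set[str]:
    """Return the top-level npm package names referenced by `source`."""
    names = set()
    for spec in IMPORT_RE.findall(source):
        if spec.startswith((".", "/")):  # relative/local import, not a package
            continue
        parts = spec.split("/")
        # Scoped packages keep two path segments (@scope/name); others keep one.
        names.add("/".join(parts[:2]) if spec.startswith("@") else parts[0])
    return names
```

The extracted names are what a pre-install check would then resolve against the registry or an internal allow-list.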

Root Cause

The AI coding assistant suggested a non-existent NPM package during code completion; malicious actors then registered that name on the registry and populated it with malware targeting developer environments.

Mitigation Analysis

This incident demonstrates the need for AI coding tools to validate package references against existing registries before suggesting them. Package namespace protection, dependency scanning, and sandboxed testing environments could prevent malicious package installation. Real-time validation of AI-generated package suggestions, combined with warnings for non-existent dependencies, would significantly narrow this attack vector.
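One way to implement the real-time validation described above is to resolve each suggested name against the public npm registry, whose metadata endpoint returns HTTP 404 for packages that do not exist. This is a minimal sketch under that assumption; `vet_suggestions` and the injectable `exists` hook are hypothetical names used here for illustration.

```python
import urllib.parse
import urllib.request
from urllib.error import HTTPError

REGISTRY = "https://registry.npmjs.org/"  # public npm registry metadata endpoint

def npm_package_exists(name: str) -> bool:
    """True if `name` resolves on the npm registry (HTTP 200), False on 404."""
    url = REGISTRY + urllib.parse.quote(name, safe="@")  # scoped names need %2F
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status == 200
    except HTTPError as err:
        if err.code == 404:
            return False
        raise

def vet_suggestions(packages, exists=npm_package_exists):
    """Partition suggested package names into (known, unknown).

    The `exists` hook is injectable so the check can run against a cached
    snapshot or an internal allow-list instead of the live registry.
    """
    known, unknown = [], []
    for name in packages:
        (known if exists(name) else unknown).append(name)
    return known, unknown
```

A CI step could fail the build whenever the `unknown` list is non-empty, forcing a human review of any dependency the registry cannot account for.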

Lessons Learned

AI coding tools must implement real-time validation of generated package references against authoritative registries to prevent hallucination attacks. Organizations need AI-specific security controls in development workflows, and developers require training to recognize and verify AI-generated code suggestions, particularly for dependency management.
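Because the malicious package in this incident was registered within hours of the hallucinated suggestion, existence checks alone are not sufficient: a package can exist and still be hostile. A complementary heuristic, sketched here with assumed names and an assumed 30-day threshold, flags dependencies whose registry creation timestamp is very recent (the npm registry exposes this under a package document's `time.created` field).

```python
from datetime import datetime, timezone

MIN_AGE_DAYS = 30  # assumed threshold; tune to the organization's risk tolerance

def is_suspiciously_new(metadata: dict, now=None) -> bool:
    """Flag a package whose registry creation date is younger than MIN_AGE_DAYS.

    `metadata` is the JSON document the npm registry returns for a package
    (https://registry.npmjs.org/<name>), which records creation and
    per-version publish timestamps under its "time" field.
    """
    created = datetime.fromisoformat(
        metadata["time"]["created"].replace("Z", "+00:00")
    )
    now = now or datetime.now(timezone.utc)
    return (now - created).days < MIN_AGE_DAYS
```

Such an age check would have caught this incident's package, which was hours old at install time, while leaving long-established dependencies untouched.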

Sources

AI-Generated Code Creates New Supply Chain Attack Vector
Sonatype · Jan 21, 2025 · company statement