Summary: A new supply chain attack method called ‘slopsquatting’ has emerged, exploiting the tendency of generative AI tools to suggest non-existent package names. This poses a significant risk to developers who may unintentionally install malicious packages: in the cited evaluation, roughly 20% of AI-generated code samples recommended packages that do not exist. Although no attacks have been reported yet, the predictable nature of these hallucinations makes them a potential target for future exploitation by threat actors.
Affected: Developers, software libraries (e.g., PyPI, npm)
Keypoints:
- Slopsquatting is a twist on typosquatting: instead of relying on user typos, attackers register the plausible-looking package names that AI tools hallucinate.
- 20% of evaluated AI-generated code samples recommended nonexistent packages, indicating a serious risk.
- Effective mitigation strategies include manually verifying package names and using dependency scanners.
- Lowering AI model “temperature” settings can help reduce the occurrence of hallucinations.
- Testing AI-generated code in isolated environments before deployment is essential to preventing security threats.
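The verification step above can be sketched in a few lines. This is a minimal, illustrative example (the function name and the trusted list are assumptions, not part of the article): AI-suggested dependency names are compared against an allowlist of packages you already trust, such as the names pinned in an existing lockfile, and anything unknown is flagged for manual review instead of being installed.

```python
def verify_packages(suggested, trusted):
    """Split suggested package names into (known, suspect) lists.

    `trusted` would typically come from a lockfile or an internal
    allowlist; comparison is case-insensitive, matching how package
    indexes like PyPI treat names.
    """
    trusted_set = {name.lower() for name in trusted}
    known = [p for p in suggested if p.lower() in trusted_set]
    suspect = [p for p in suggested if p.lower() not in trusted_set]
    return known, suspect


# "flask-json-utils" stands in for a hypothetical hallucinated name.
known, suspect = verify_packages(
    ["requests", "flask-json-utils"],
    trusted=["requests", "flask", "numpy"],
)
print("install:", known)
print("review before installing:", suspect)
```

A real pipeline would combine a check like this with a dependency scanner, and only run `pip install` on names that pass review.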