from Hacker News

AI Hallucinations Are Fueling a New Class of Supply Chain Attacks

by sksxihve on 4/12/25, 2:10 PM with 6 comments

  • by pera on 4/12/25, 3:34 PM

    I was talking about this issue with a friend a while ago: if an LLM often hallucinates the same package name for a common problem, you could copy an existing library, adapt its API to match the hallucination, publish it under the hallucinated name, and finally include a backdoor.
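
    One possible defense against the attack described above is to screen LLM-suggested dependency names before installing them. The sketch below is a minimal, hypothetical example: it checks a suggestion against a project's vetted allowlist (the `VETTED` set here is made up) and uses a fuzzy-match heuristic to flag names suspiciously close to known packages, which is how a squat on a hallucinated name would often look.

    ```python
    import difflib

    # Hypothetical allowlist: packages this project has already vetted.
    VETTED = {"requests", "numpy", "flask", "pandas"}

    def check_suggestion(name: str, vetted=VETTED, cutoff: float = 0.8) -> str:
        """Classify an LLM-suggested package name before installing it.

        Returns "vetted" if the name is already approved, "suspicious" if it
        closely resembles a vetted name (a possible squat on a hallucinated
        or typo'd name), and "unknown" otherwise.
        """
        if name in vetted:
            return "vetted"
        # difflib flags near-matches like "requets" vs "requests".
        if difflib.get_close_matches(name, vetted, n=1, cutoff=cutoff):
            return "suspicious"
        return "unknown"
    ```

    In practice you would also confirm the name actually exists on the registry you think it does (e.g. PyPI vs npm), since the article notes hallucinated names can collide across ecosystems.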
  • by jruohonen on 4/12/25, 2:45 PM

    "They found that 8.7% of hallucinated Python packages were actually valid npm (JavaScript) packages"