Fake AI Video Sites Spread New Noodlophile Malware
Cybercriminals are exploiting the surging interest in artificial intelligence by using fake image-to-video AI websites to distribute a new information-stealing malware dubbed “Noodlophile,” according to cybersecurity researchers. The malicious platforms claim to generate AI-powered videos from static images but instead deliver executable files that initiate a stealthy attack chain.
Once downloaded, these files deploy the Noodlophile infostealer, a novel threat designed to harvest sensitive user data from infected systems. The tactic exploits the popularity of AI tools to lure users into engaging with seemingly legitimate services, increasing the likelihood of infection.
The attack underscores a growing trend of threat actors disguising malware as AI-driven applications, capitalizing on public enthusiasm for generative technologies. Security professionals urge users to verify the legitimacy of AI tools and to avoid downloading software from untrusted or unfamiliar sources. The emergence of Noodlophile highlights the evolving techniques used to deliver infostealers through socially engineered digital fronts.
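As a practical illustration of the caution above, a download that claims to be a generated video but carries an executable extension is a red flag. The following sketch is not from the article and the extension lists are assumptions; it simply shows how one might flag a file whose final extension is executable, including the common double-extension disguise (e.g. a name ending in ".mp4.exe"):

```python
# Illustrative sketch only: flag a downloaded "video" whose real (final)
# extension marks it as executable. Extension lists are assumptions for
# demonstration, not drawn from the Noodlophile report.
from pathlib import Path

EXECUTABLE_EXTS = {".exe", ".scr", ".bat", ".cmd", ".msi", ".js", ".vbs"}

def looks_suspicious(filename: str) -> bool:
    """Return True if a file presented as a video ends in an executable
    extension, as in the double-extension trick 'result.mp4.exe'."""
    suffixes = [s.lower() for s in Path(filename).suffixes]
    # The operating system honors only the final extension, so that is
    # what decides whether the file will run as a program.
    return bool(suffixes) and suffixes[-1] in EXECUTABLE_EXTS

print(looks_suspicious("ai_video_result.mp4.exe"))  # True
print(looks_suspicious("ai_video_result.mp4"))      # False
```

Such a check is no substitute for antivirus scanning, but it captures the core of the advice: a site promising a video should deliver a video file, not a program.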
