AI poisoning tool Nightshade, used by artists to disrupt AI models that scrape and train on their artworks without consent, received 250,000 downloads in five days


It's a strong start for the free tool, and shows a robust appetite among some artists to protect their work from being used to train AI.

“Nightshade hit 250K downloads in 5 days since release,” wrote the leader of the project, Ben Zhao, a professor of computer science, in an email to VentureBeat, later adding, “I expected it to be extremely high enthusiasm.”

Nightshade seeks to “poison” generative AI image models by altering artworks posted to the web, or “shading” them on a pixel level, so that a machine learning (ML) algorithm perceives them as containing entirely different content: a purse instead of a cow, let’s say.

On the Nightshade project page, Zhao and his colleagues — Shawn Shan, Wenxin Ding, Josephine Passananti, and Heather Zheng — stated they developed and released the tool to “increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.”
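To give a rough intuition for what a pixel-level "shade" might look like, here is a minimal sketch of a generic adversarial perturbation in the style of the fast gradient sign method. This is not Nightshade's actual algorithm (which is not reproduced in the article); the `shade` function, the `eps` budget, and the toy gradient values are all illustrative assumptions. The idea shared with such attacks is that each pixel is nudged by a visually small amount in a direction that changes how a model interprets the image.

```python
def shade(pixels, grads, eps=8 / 255):
    """Illustrative, NOT Nightshade's method: nudge each pixel by at
    most `eps` in the direction of a (hypothetical) model-loss
    gradient, clamping results to the valid [0, 1] range."""
    sign = lambda g: (g > 0) - (g < 0)  # -1, 0, or +1
    return [min(1.0, max(0.0, p + eps * sign(g)))
            for p, g in zip(pixels, grads)]

# Toy example: three pixel intensities and made-up gradient values.
pixels = [0.10, 0.50, 0.99]
grads = [1.2, -0.7, 3.0]
shaded = shade(pixels, grads)
```

Each output pixel differs from its input by at most `eps` (about 3% of the intensity range here), which is why such perturbations can be hard for humans to notice while still steering a model's perception.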
