New Tool to Protect Artists from AI Scraping

Image by Arek Socha from Pixabay

A team at the University of Chicago has developed a program to help protect visual artists from AI image generators that use their work without permission.

With “Nightshade,” artists can apply subtle alterations to their work before uploading it online. These alterations mislead and confuse AI systems trained on the images, ultimately degrading the models’ ability to generate art.

Image by Tetiana Shyshkina on Unsplash

Computer science professor Ben Zhao led the team that created Nightshade. He hopes this tool will help artists protect their work and personal information from being scraped without their permission.

“Artists are afraid of posting new art,” said Zhao.

His team also developed “Glaze,” a tool that prevents AI models from learning an artist’s particular style. Like Nightshade, Glaze makes pixel-level changes that are invisible to the human eye but that machine-learning models interpret differently.
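To illustrate the general idea of changes too small for humans to notice, here is a minimal sketch that adds a tightly bounded random perturbation to an image array. This is only a toy example of imperceptible pixel edits, not Nightshade’s or Glaze’s actual method, which computes perturbations specifically optimized to mislead AI models.

```python
import numpy as np

def perturb_image(pixels: np.ndarray, epsilon: float = 2.0, seed: int = 0) -> np.ndarray:
    """Add a small, bounded perturbation to an 8-bit RGB image array.

    Toy illustration only: each pixel value shifts by less than `epsilon`
    levels out of 255, far below what the human eye can detect.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=pixels.shape)
    perturbed = np.clip(pixels.astype(np.float64) + noise, 0, 255)
    return perturbed.astype(np.uint8)

# A flat gray 64x64 test image stands in for an artwork.
image = np.full((64, 64, 3), 128, dtype=np.uint8)
out = perturb_image(image)
max_shift = int(np.abs(out.astype(int) - image.astype(int)).max())
```

In the real tools, the perturbation is not random noise: it is crafted so that a model training on the image learns the wrong associations, while a human viewer sees essentially the original picture.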

Zhao’s team is making Nightshade open source so that artists can modify it and build their own versions of the protection.
