“Nightshade”: A Tool to Protect Digital Art from AI Training
Researchers at the University of Chicago have unveiled a novel tool that empowers artists to “poison” their digital art, thwarting developers from using their work to train AI systems.
Named after the family of poisonous plants, Nightshade works by subtly altering digital images so that they introduce inaccuracies into the datasets used for AI training.
In effect, it misleads AI systems by modifying pixels in a way that can cause a model trained on the images to interpret a cat as a dog, or vice versa, as reported by MIT Technology Review.
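To make the idea concrete, here is a minimal conceptual sketch of this kind of data poisoning: a scraped image is given a small pixel perturbation and paired with a deliberately mismatched caption. This is not Nightshade's actual algorithm, which optimizes perturbations to be imperceptible while targeting the model's learned concepts; the random noise, file names, and helper function below are illustrative assumptions only.

```python
# Conceptual sketch of caption-mismatch data poisoning.
# NOT Nightshade's real method: Nightshade optimizes its perturbations;
# this sketch uses simple bounded random noise purely for illustration.
import numpy as np
from PIL import Image

def perturb_image(path: str, epsilon: float = 8.0) -> Image.Image:
    """Add a small random pixel perturbation, bounded by +/- epsilon."""
    pixels = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
    noise = np.random.uniform(-epsilon, epsilon, size=pixels.shape)
    perturbed = np.clip(pixels + noise, 0, 255).astype(np.uint8)
    return Image.fromarray(perturbed)

# Pair the perturbed image with a wrong caption, so a model trained on
# scraped data learns the wrong association (a cat labeled as a dog).
# "cat.jpg" is a hypothetical input file.
poisoned_sample = {
    "image": perturb_image("cat.jpg"),
    "caption": "a photo of a dog",
}
```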
“I’m just really grateful that we have a tool that can help return the power back to the artists for their own work,” one artist told MIT Technology Review.