
Artists can use a data poisoning tool to confuse DALL-E and corrupt AI scraping

by
Emilia David
from The Verge - All Posts on (#6FVP9)
Image: OpenAI

The fight against AI companies scraping artists' work for training data has become more poisonous.

A new tool called Nightshade lets artists apply it to their creative work; when that work is later scraped, it corrupts, or poisons, the resulting training data. Over time, this can degrade future versions of AI art models like DALL-E, Stable Diffusion, and Midjourney, undermining their ability to generate images.

Nightshade adds invisible changes to the pixels of a piece of digital art. When the work is ingested by a model for training, the "poison" exploits a security vulnerability that confuses the model, so it may no longer read an image of a car as a car and instead come up with a cow.

The MIT Technology Review reported that Ben Zhao, a professor at the University of Chicago and one of the...

Continue reading...

External Content
Source RSS or Atom Feed
Feed Location http://www.theverge.com/rss/index.xml
Feed Title The Verge - All Posts
Feed Link https://www.theverge.com/