Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

by Thom Holwerda, from OSnews

But even without filing lawsuits, artists have a chance to fight back against AI using tech. MIT Technology Review got an exclusive look at a new open source tool still in development called Nightshade, which can be added by artists to their imagery before they upload it to the web, altering pixels in a way invisible to the human eye, but that "poisons" the art for any AI models seeking to train on it.
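The article only describes the approach at a high level: pixel changes too small for a viewer to notice that nonetheless mislead models trained on the image. Nightshade's actual perturbations are optimized against specific text-to-image models, so the Python sketch below is merely a hypothetical illustration of the general idea of a tightly bounded, visually imperceptible pixel change; the function name, the epsilon bound, and the use of random noise are assumptions for illustration, not Nightshade's method.

# Rough sketch only: Nightshade's real perturbations are optimized against
# particular text-to-image models. This hypothetical example just shows the
# general idea of a pixel change bounded tightly enough to stay invisible.
import numpy as np
from PIL import Image

def add_bounded_perturbation(path_in, path_out, epsilon=2):
    # Load the artwork as an integer array so the arithmetic cannot overflow.
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    # A real poisoning tool would compute an optimized signal here; random
    # noise stands in purely as a placeholder for "small, invisible change".
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical usage: produce a lightly perturbed copy for uploading.
add_bounded_perturbation("artwork.png", "artwork_shaded.png")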

Excellent. This is exactly the kind of clever thinking we need to stop major corporations from stealing everyone's creative works for their own further gain. I hope we can develop these poisons further, to the point of making these "AI" tools entirely useless.

Get permission, or get poisoned.
