The tool, called Nightshade, is intended as a way to fight back against AI companies that use artists’ work to train their models without the creators’ permission.
Addition: The paper is available at https://arxiv.org/abs/2310.13828
I don’t know enough about AI or this model to say much, but
“Poisoned data samples can manipulate models into learning, for example, that images of hats are cakes.”
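For anyone wondering how that works mechanically: the poisoned images keep the caption “hat,” but their pixels are perturbed so that, in feature space, they look like cakes, and with enough of them the model’s learned concept of “hat” drifts toward “cake.” Below is a minimal numpy sketch of that drift under a toy assumption (each concept is just a cluster of feature vectors, and the “model” learns a class mean). This is only an illustration of the drift effect, not the paper’s actual optimization, and every name in it (HAT_MEAN, images, etc.) is made up for the example.

```python
# Toy sketch of concept drift from poisoned training data.
# NOT Nightshade's real method (which optimizes imperceptible pixel
# perturbations); this only shows why poisoned samples shift what is learned.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature space: "hat" images cluster around HAT_MEAN,
# "cake" images around CAKE_MEAN.
HAT_MEAN, CAKE_MEAN = 0.0, 10.0

def images(mean, n):
    # n fake 8-dimensional image features sampled around a concept mean
    return rng.normal(mean, 1.0, size=(n, 8))

# Training set for the caption "hat": mostly real hats, plus poisoned
# samples captioned "hat" whose features actually look like cake.
clean = images(HAT_MEAN, 400)
poison = images(CAKE_MEAN, 100)  # 20% of the "hat" data is poisoned
train_hat = np.vstack([clean, poison])

# Toy "model": learn the mean feature vector for the concept "hat".
learned_hat_mean = float(train_hat.mean())
print("true hat feature mean:   ", HAT_MEAN)
print("learned 'hat' mean:      ", round(learned_hat_mean, 2))
# With 20% poison the learned "hat" concept lands around 2.0, i.e. it has
# drifted a fifth of the way toward "cake"; more poison drifts it further.
```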
it will obviously never work in England /s