AI doesn’t do that though.
For sure, AI can reproduce wholesale verbatim copies of text from miscellaneous sources. It can also create images so close to random DeviantArt artists' images that it's undeniably plagiaristic. I expect this bug will be worked out eventually, but right now the models are quite capable of it. In other words, you could say the weights contain a lossy encoding of many artists' works, and those works can (lossily) be coaxed back out of the model with some coercion.
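To make that concrete, here is a minimal sketch of what a verbatim-memorization probe could look like: feed a model the opening of a well-known passage, decode greedily, and measure how closely the continuation matches the source. The model name ("gpt2"), the passage, and the similarity check are illustrative assumptions for the sketch, not results from any particular model discussed in this thread.

    # Minimal memorization probe (sketch). Assumes a locally runnable
    # Hugging Face causal LM; "gpt2" is just a stand-in checkpoint.
    from difflib import SequenceMatcher

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "gpt2"  # assumption: any causal LM checkpoint could go here
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Prefix of a public-domain passage, plus the true continuation we will
    # compare the model's output against.
    prefix = "It was the best of times, it was the worst of times,"
    true_continuation = " it was the age of wisdom, it was the age of foolishness,"

    inputs = tokenizer(prefix, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=20,
        do_sample=False,  # greedy decoding: the most "coerced" reading of the weights
        pad_token_id=tokenizer.eos_token_id,
    )
    # Drop the prompt tokens so only the generated continuation remains.
    generated = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

    # How close is the greedy continuation to the original text?
    overlap = SequenceMatcher(None, generated, true_continuation).ratio()
    print(f"model continuation: {generated!r}")
    print(f"similarity to source: {overlap:.2f}")

On a model that has memorized the passage, the greedy continuation tracks the source nearly word for word; on one that hasn't, the similarity score drops off quickly.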
Which AI models? Can you share some examples please?
Here’s a poignant example IMO: