Sure, but this particular manifestation of that sort of fraud has its own character, needs its own responses and is worth its own discussion.
I’ve seen some promising developments around tools that poison AI datasets when they vacuum up work without the original artist’s permission, but it would be a bit of a shame if a combative arms race ends up being the only thing that realistically prompts any change of approach from the owners and developers of the software.
This really reads like a ChatGPT result