Those analogies don’t make any sense.
Anyway, as a publisher, if I cannot get OpenAI/ChatGPT to sign an indemnity agreement making them liable for plagiarism, then their tool is effectively useless to me, because it is really hard to prove that something is *not* plagiarized. That makes ChatGPT pretty sus for creatives to use. So who is going to pay for it?
Something like Microsoft Word or Paint is not generative, so the comparison doesn’t hold.
It is standard for publishers to sign indemnity agreements with the creatives who produce work for them, because, like I said, it’s kinda difficult to prove the negative (that a work is not plagiarized), so a publisher doesn’t want to take on the risk of distributing works whose originality cannot be verified.
I’m not arguing that we should change any laws, just that people should not use these tools for commercial purposes if the producers of these tools refuse to take liability. If they won’t, their tools are very risky to use.
I don’t see how my position has anything to do with the general public using these tools. It’s purely about the relationship between creatives and the publishers using AI tools, and what those parties should expect and demand from each other.