@Aqua-Letifer Law provides grounds for prosecution, yet whether to prosecute remains a choice.
I have no sympathy for businesses that profit by infringing the rights of others. Still, prosecuting a case in court costs large sums of money.
Imagine a world where LLM generative AI turned out to be a flop, where ChatGPT spouted gibberish rather than prose. In such a world, the developers of ChatGPT would have exhausted their initial tens of millions in early funding, presented a few conference papers, and that would have been the end of it. Most authors and publishers would likely never know that their copyrighted works had been used to train an AI large language model, and even if they knew, they would see insufficient monetary incentive to sue. (Why spend millions in legal fees suing a bunch of developers who have exhausted their funding and have no prospect of making more money with their gibberish AI? Just for the principle? There are many researchers and developers using similar datasets without permission who aren't getting sued.)
Now that the world sees the value of a certain way of doing generative AI, it is right that we seriously consider how to divide the large expected bounties among all contributing parties, authors and publishers included. Let the lawsuits run their course. Let the advocates and the lobbyists make their pleas. And see what public policies emerge from all this.