To me it’s a different situation, because no prior hard, unique work that someone could claim ownership of was used without consent to make this cure.
Think of this instead: a doctor collects a large volume of symptoms and analyses and develops a statistical method to cure people more easily. They publish many examples of their work without licensing anyone (legally or morally) to use it freely. Now some algorithm collects their data, and many others’ data, and transforms it into a better method. The doctor goes out of business. Is that ethical? On one hand, the algorithm invented something new and easier to access. On the other, it essentially stole parts of their research, and similar research, at a previously unthinkable scale. We humans copy ideas all the time, and that’s somewhat normal, but this enormous at-scale capability was never a thing before.
Personally, I don’t worry about optometrists, Uber drivers, or designers; nature will find a way. But when we talk about fundamental social contracts like property or the protection of accumulated knowledge, I think it is unethical to break them, regardless of technicalities. If this is such a great advancement benefiting everyone, why can’t AI creators just ask permission for the 2.3B data points they used?
But that’s what makes this so tough. Whether you consider the ways these AI models repurpose existing creative works to be mere technical details, or central to deciding the ethics of the matter, depends largely on the analogies you reason by, as michaelt noted.