And tbh even if people are absolutely fine with that, think the analogies and legal arguments they make are sound, and maybe think copyright's a terrible idea anyway, I still can't see why they'd expect Big AI to suddenly drop the "don't care what you think about how we use your stuff, if it's not explicitly illegal we're going to use it" stance when it comes to stuff that's supposed to be 'private' rather than stuff that's supposed to be 'property'.
Sure, maybe you care more about whether OpenAI has stuff derived from the contents of your Dropbox on their servers, stuff which is technically neither "training a model" nor the actual "copy" they were required to delete after 30 days, than you ever did about copyrighted material. But why would OpenAI?