>>510530583
I am a (bad) artist, and at this point I'm concerned about the effect that releasing my work to an online audience will have.
Using Glaze, Nightshade, and other protective programs is a stopgap for the time being, but the prospect of companies like Google and OpenAI using bots to essentially replicate my work (without my permission) is a massive disincentive: with each piece I produce, I'm essentially aiding my direct competition -- which is hosted, paid, staffed, and directed by people I disagree with on nearly every topic imaginable. Since you're here, I'd assume you'd concede that the people hosting and designing generative AI are *not* generally aligned with ideals like ethnonational solidarity, free speech, rights to self-defense, rights to privacy, isolation, and so on.
What I've outlined is a plight similar to that of gifted programmers who don't like the idea of automating jobs away from white people, or of designing surveillance systems to capture minor crimes, becoming a kind of accomplice in the incrimination or marginalization of their own countrymen. It's becoming harder for someone with these kinds of skills to find work that generates steady income, because ultimately what you're doing is "joining the winning team": working bit by bit for a system which despises you, but which can use your labor to construct a system alien to your own interests.
This obviously wouldn't prevent me from creating art I enjoy, but there wouldn't be a reason for me to share it with other people, even if other people would like me to and it would improve the quality of their lives. This suggests that *many* artists may lose the motivation to share their work publicly, leading to a public space increasingly dominated by corporate, centralized AI *art*. When an audience can no longer tell good human work from mass-produced generated output, the incentive for humans to keep supplying good work collapses; see George A. Akerlof's paper "The Market for Lemons" (work for which he later shared the Nobel Prize in Economics) for a better description of this phenomenon.