once you read about the mathematics of neural networks and diffusion models, you realize a lot of the AI hype amounts to devs throwing data at black boxes with near-infinite compute and hoping they don't spit out complete nonsense. humanity as a whole still has no real idea how our own brains process information or what qualities make up "intelligence", much less how to encode those qualities into an algorithm
language models are better at encoding "intuition" than intelligence: they work essentially by mapping words (tokens) to vectors in a high-dimensional space, in such a way that related concepts, or words that tend to co-occur, end up geometrically close together. train the network on enough data (a large fraction of all the text humans have ever written, or of all the cat pictures on the Internet) and you eventually end up with very fancy next-token prediction software
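the "geometrically close" part can be sketched in a few lines. this is a toy illustration with made-up 4-dimensional vectors (real models learn embeddings with hundreds or thousands of dimensions from data); the point is just that cosine similarity between vectors stands in for semantic relatedness:

```python
import numpy as np

# Toy "embeddings" (invented for illustration; real models learn these
# from data and use far higher dimensionality)
emb = {
    "cat": np.array([0.9, 0.8, 0.1, 0.0]),
    "dog": np.array([0.8, 0.9, 0.2, 0.1]),
    "car": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine(u, v):
    # Cosine similarity: ~1.0 means same direction, ~0 means orthogonal
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(emb["cat"], emb["dog"]))  # high: related concepts sit close together
print(cosine(emb["cat"], emb["car"]))  # low: unrelated concepts sit far apart
```

in a trained model the nearest neighbors of a token's vector really do tend to be semantically related words, which is exactly the "intuition" being encoded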
the piss filter in particular is probably a positive feedback loop, as people on social media have noted: the devs likely saw that Ghibli-style pics and cartoons with a "warm" aesthetic did well on social media, so they finetuned generative models to produce more pictures like those, and in latent space the network concluded we want literally "warm", piss-colored pictures. add to that the fact that recent AI models are increasingly trained on synthetic data (lol) and shit hits the fan
rant over, t. bored grad student