What exactly is wrong with the US, and the West Coast/California in particular, that it completely changes a person? I've seen normal celebs go insane; it's usually women, and then the weight loss or surgery follows too. Even TV and film is just remakes or woke shit that nobody cares about, yet they keep making it anyway.