I don't know why "woke" movies are considered a bad thing. When you think about it, a movie that tells you "Hey guys, racism is bad, climate change is destroying the planet, the rich are constantly exploiting the working man for their own personal gain, minorities should be given more representation, and countries committing war crimes should be stopped at all costs" should be a positive experience.
Instead, manchildren are crying about it, claiming it's "pushing a message," when really all it's doing is making basic common-sense statements even a toddler wouldn't find issue with. These sad sacks of worthlessness go as far as to send directors and actors death threats on social media. They even do it to movies and shows that don't actually have any real "politics" in them, like the 2016 Ghostbusters reboot. It's just a dumb comedy movie, but these clowns were out there acting like it was "feminist propaganda" when... it wasn't? Also, how is a movie having a feminist lean to it a bad thing anyway?