I'm lost, /pol/. I thought men were the inherent leaders of our world. Why are women better than men? Can someone redpill me on feminism real quick? I understand that more women are going to college than men, but they're all taking psych, nursing, teaching, and art, so how will those fields lead to women becoming leaders?