Hey, so after SummerSlam I'm seeing a lot of people saying they're done with WWE. Many of them are newer fans, or fans whose only real experience of wrestling is WWE, and the sentiment I keep running into is that they feel they can no longer enjoy pro wrestling at all because of WWE. I wanted to write this post to say, particularly to women and LGBTQ+ fans who feel pushed away by WWE's increasingly right-wing political direction: you are still welcome and wanted in the wrestling community.