(Yes, I tagged Feminism as a religion because I swear it is like a cult or something.)

Let me start by saying I am all for women having equal rights: working, getting paid equally, being able to do "most" jobs men do (I still feel some are better suited only for men, for logical reasons), and being able to decide whether they want to be a stay-at-home mom or a working mother. BUT what I feel about feminism? Well, it is utter garbage. These women try to empower each other by bashing men, acting like the "typical male," playing alpha, etc., etc., and to me they are basically straight lesbians. I am not saying that to be offensive to lesbians, either; I am just stating that they dislike men that much.

Maybe I am wrong for feeling this way, but I can't see one logical "good" thing feminists have done for each other. They play the victim while claiming they are not victims. To me, it is ruining a lot of women who were once capable of good heterosexual relationships with men. If we are all "equal," and they want us all to be equal, then why downplay and bash the opposite sex to get what you want?

What are your views on feminism?