I'm not making this shit up. I seriously sat down and questioned why I'm going to college, spending a huge amount of money, only to learn how females are 'oppressed' in society and have been throughout the ages.
Every single class, whether it was English or film studies, discussed feminist principles. In one class, I raised my hand and disagreed with a female student, and got scorned for being 'closed-minded'.
In fact, I've been very open-minded about the whole thing and tried to understand it from their perspectives, but I don't see any oppression. All I see is a bunch of young women nagging about how they don't have enough, or complaining that there's some disadvantage they're always facing compared to males. What disadvantage? I truly don't see it. Females tend to have the upper hand if you ask me. There is a reason why men are more likely to end up homeless, and there is a reason why the suicide rates are mostly male.
Hell, if you're pretty enough (or not, sometimes it doesn't even matter), you may not even have to work a day in your life -- just find a partner who will do it for you. That is never an option for a male. There are so many open doors just for being female that it's a way easier ride; I don't care if anyone disagrees with me, it's blatantly evident.