To preface, I know feminism has been around for a long time...well over a century, but in very recent times (at least to me) it's been insane.
Today, on Facebook, I saw a few photos posted by girl A, girl B, and girl C.
One is of girl A's boyfriend waxing her car, captioned "Omg he's so sweet to me! Love him!"
One is of girl B's boyfriend holding five shopping bags, clearly hating every second of it, with her caption: "He takes me shopping and offered to carry the bags, love him blah blah!"
Girl C shares a photo that says, "Feminists shouldn't be called feminists. They're just normal people. If they're not, they're SEXIST."
At work yesterday, someone brought up the Little Mermaid movie in our Slack channel, which up to that point had been filled with fun, light-hearted banter. Someone asked what it was about, to which a feminist replied that it's "about a girl who gives up anything and everything for some rich guy." And all the women on our team reacted to it ("liked" it). Okay.....pulls collar away from neck
That same girl also sent out a company-wide email a couple of weeks ago, pointing out the number of women in engineering (not as many, ...shocker). I didn't understand the point, though. Are we supposed to purposely hire women just because fewer of them go into engineering? Do you really think we wouldn't hire a woman who applied and had the chops?
In the course of just a couple of days in 2016, I've seen feminism rearing its head, directly and indirectly. Five years ago I had never even heard the word. I would've guessed it was about women's rights, and that's it.
What the fuck has happened in the span of 5 years?