Most of my friends are young, indie, and liberal, so I get a lot of this stuff on social media.
I'm just wondering what your opinions on this are, and I'd particularly appreciate responses from any women or transgender folks. Setting aside any bandwagon feminist extremism, do you perceive our society as markedly patriarchal, in that it strongly favors men over women in areas such as employment, government office, social equality, and legal standing?
A common issue several of my female friends have raised is that they have a "constant/daily fear" of men. I'm not even sure what to say to that..."sorry"? Is it fair to treat all men as hostile because of that fear, and how is that any different from fearing a different race because of the crimes some of its members have committed?
There's more I could go on about, but I'd prefer to return to the main question. Post your thoughts. Thanks, Flood.
[spoiler]One tiny thing. The link above is what prompted this thread, and I find it really off-putting that the author has to keep mentioning, even confessing, that he's a "cis white male", as if it's something to be ashamed of.[/spoiler]
-
Of course. Women earn about 80 cents for every dollar men earn.