Most of my friends are young, indie, and liberal, so I get a lot of this stuff on social media.
I'm just wondering what your opinions on this are, and I'd particularly appreciate responses from any women or transgender folks. Setting aside bandwagon feminist extremism, do you perceive our society as markedly patriarchal, in that it strongly favors men over women in many capacities such as employment, government office, social equality, and legal standing?
A common issue several of my female friends have raised is that they live with a "constant/daily fear" of men. I'm not even sure what to say to that..."sorry"? Is it fair to treat all men as hostile because of that fear, and how is that any different from fearing an entire race for the crimes some of its members have committed?
There's more I could go on about, but I'd prefer to return to the main question. Post your thoughts. Thanks, Flood.
[spoiler]One tiny thing. The link above is what prompted this thread, and I find it really off-putting that the author has to keep mentioning, even confessing, that he's a "cis white male", as if it's something to be ashamed of.[/spoiler]
-
Edited by M37h3w3: 6/6/2014 2:17:09 AM
My initial answer before reading the article? Yes. But not as overblown as some of the posters whose postings have escaped Tumblr to the web at large. Women in general still suffer slights against them solely because of their sex. You can still find parts of America where women are treated as the lesser. Denied jobs, made to suffer insults. Of course, if you go elsewhere in the world you'll find places where women are -blam!- without justice being sought (pretty much 24/7 in India), traded like an item, mistreated cruelly (burned with acid for leaving an abusive arranged marriage), and so on.