Do you think there is a war on white men?
Maybe "war" is a bit strong of a word, or maybe not. I don't think wars necessarily have to be large and everywhere to be a war, do they? If they do then maybe war is a too strong a word however it does seem like there is contempt for them. I get it that that all groups suffer from some sort of discrimination, we all have our problems. However it not only seems ok to target white men it seems politically correct to do so. I just got done seeing some things online about how "White men should never hold elected positions in British universities again, just needless bashing of white men. Stuff like people believing sexism and racism can't happen to white people or men. These are done in news articles, some pretty mainstream ones. It's always "white men are bad, white men do this, white men do that." and I'm just wondering why it's so acceptable to do it to an entire group of people. Isn't saying something bad is acceptable to do to a specific group exactly what discrimination is? I'm not comparing here even though some of the mentality I've seen can be compared but it's what I imagine the nazi mentality was to jews. I don't think many were saying "It's wrong to do that to the jews! So let's do it" and moreso saying "It's not wrong to do it to the jews."
Again, I'm not saying white men are facing the same kind of suffering the Jews did; I'm just using it as an example.
Do you think there is a war on white men happening, or at least that it seems more acceptable to treat white men this way?