Is it normal that I feel like women really don't understand men?
I feel like guys are told that we need to work harder to understand women's points of view. I used to be a feminist; oddly, when I called myself that, I was the most ignorant about what it meant. I still saw women as weak and needing protection. I was the white knight type who actually thought less of women. I read a lot and learned better. Moreover, I found that I related to a lot of the self-esteem issues myself. However, I solved my self-esteem issues with different strategies, simply because I'm a man.

When I try to explain the way guys can think and be insecure, and how that leads to them becoming all of those "bad guys," no one ever wants to hear it. I understand that it makes us hurt women emotionally, but no one ever seems to care about what emotional trauma gets us there. It's always just "you're an asshole," "you lack confidence," "that's your problem," "stay away from women." I'm not an abuser, but I have friends who are, and they're such sad people, with problems that come from early developmental stress. No one wants to learn how to prevent this for young guys.

Where is the men's self-esteem advocacy? We get hurt too; we just don't cry after the first time we get shamed for it. Instead we just get angry and defensive.