Is it normal that I fear showing my good side to women?
Growing up in America, Uncle Sam taught me that women are these so-called angelic creatures who absolutely don't pose a threat to us men in any way, shape, or form... hypocrisy as usual!
But where I'm originally from, things were a "little" more different. Back there, I was always taught that men who...
- show their weak side to women
- refuse to whip their cocks out at women
- won't grab their tits and ass without their consent
- refuse to talk dirty to a woman
- can't fight other men who pose a threat to their women
- and beyond!
are weak cowards who shall be punished by not getting their "reward," aka women. Yes, an extremely barbaric society!

I remember when I was a toddler I used to enjoy going to church and enjoying the good view and all, and one of my four sisters would shout, "Wow, we always wanted a male brother, but now he's gonna be a priest when he grows up!" A few years later, in my early teens, everybody in my family was rushing me to get a girlfriend, and every time I refused they would up the pressure another notch, as if I was about to turn gay or something.
But anyways, does all this justify my reasons for having such distorted views on women?