Is it normal that I don't trust men?
Well, I recently watched a documentary about men feeling as though they're not in power, and that's all good. But they had classes on how to get into bed with women, and it's as though they don't want to settle down, get married, or have kids. The way I was brought up, I want to wait until marriage to have sex. So is it normal or not that I feel as though I can't trust men? I don't have a stereotype, by the way, in case it seems like that.