Is it normal that all the women I know think they're better than men?
Most women I know think they're better than men. I don't know why, but everything seems so sexist. Everywhere I go, boys are holding the door for girls, giving up their seats for them, driving them places, paying for their stuff, the whole "no hitting girls" thing. It's in the law too. If there's any domestic violence, it's automatically charged against the man. Even if the girl is abusing him, he gets charged. When men do favors for women, they just act like it's expected and don't even say thank you. I never see women doing this for men. And when I ask women why, they just say "because"... WTF, is it like this around here?