Is it normal that all the women I know think they're better than men?
Women are no better than men, and vice versa, and neither gender is oppressed in America anymore. If you don't like their sexist behavior, don't hang around them.