Is it normal that I think our society is backwards when it comes to sex?
Think about it: on TV, in movies, and in video games, graphic violence is widely accepted and glamorized. When someone gets their head cut off, everyone is okay with it. But when a woman takes off her clothes, suddenly the whole world is coming to an end. You've got all these idiot religious fanatics crying "SIN" and all these feminists crying "SEXISM." Even prostitution is illegal in the US. Like, really? There are more serious crimes out there, but our government and lawmakers are more concerned with who you should and should not sleep with? So am I to understand that Americans are really just bloodthirsty savages? Because that's what it looks like to me. It seems this country would rather see a life taken away than see a new life created, because that's what sex is. Sex creates life, yet there are idiots who think it's dirty and sinful? What is up with that?