Is it normal that I've never seen rape culture in my own life?
I would like to preface this by saying that I am a feminist. I am in no way saying anything bad about the feminist movement. I wouldn't even have found out about rape culture if I weren't reading feminist blogs.
Anyway, I recently finished reading this article: http://www.shakesville.com/2009/10/rape-culture-101.html
The article says that we live in a culture filled with people excusing rapists, and that most people seem to think rape is okay. Granted, I live in a very liberal town that may not be representative of the whole world, but I personally have never run into this kind of thinking. Any time I've heard the topic of rape brought up, it has been in the context that it is completely horrible, unless you count some people on the internet who fantasize about getting raped. I've never heard a rape joke outside of depraved places like 4chan, the same people who use the word f*g as a suffix.
When I read this article, I saw links to stories about judges giving light sentences to the rapist of a ten-year-old girl because she "dressed provocatively." I saw that apparently some people think it's not rape if you're married, if "her lips said no but her eyes said yes," or if she was "asking for it." After reading this, I was left wondering whether we are living in the same culture, because that certainly isn't what I've seen in my experience. I have been told that rape is worse than murder, and I believe that 100%. The only reason I would rather be raped than murdered is that I've already been scarred, abused, and broken so many times, and I have only made it through because I have a strong will to survive.
Now that that's out of the way, my question is: is it normal to not notice rape culture? Do you see things like this all the time in your real life?