I hate being an American
I hate the Bible Belt. I hate the pro-life assholes who are willing to berate an innocent woman for her own life decisions. I hate the overly conservative Christians. I hate that most people here don't believe that healthcare is a basic human right. Or education, for that matter. I hate that racism, sexism, and general bigotry are so prevalent. I hate that we only have two serious political parties from which we can choose, and I hate how both of these parties use simple social issues (gay marriage and abortion) to try to distract us from the real problems that face us all. I hate that this manipulative strategy actually works. I hate that our Congress couldn't agree on whether the sky is blue, much less fix our economic problems. I hate how socialism is a dirty word in the USA, even though the failure of pure capitalism is so incredibly obvious. I hate the blind paranoia. I hate how no one is willing to trust the government (We the People), but everyone is totally fine with big businesses being considered people (not exactly known for their great ethics). I hate the rape culture here, and the fact that if a woman wears a low-cut dress, people will actually think it's her own fault if she's raped (and yet they have something against burqas because... freedom?). I hate the greedy, manipulative, money-grubbing twats on Wall Street.
But most of all, I hate that I am associated with this war-mongering, ignorant, and greedy country. If you're a first-world country, shouldn't you be trying to do more good in the world, not more evil? Am I just being unreasonable?