Is it normal to be American and hate America?
I'm an American, born and raised. I consider myself fairly educated: I study economics and politics, and I usually keep up on world affairs. I've come to believe that this "land of the free" is one of the most oppressive nations in the developed world. I'm sick of corporate lobbyists making the decisions that we the people should be making. It makes me feel totally helpless when our leaders make decisions that constantly contradict the choices made by the voters, simply because they are bribed into it. I can't help but feel a burning resentment toward this country. I know other nations hate us, and I understand why. I just want to know: are there any other American citizens out there who think it is normal to hate your own country?