Is it normal that Americans are so arrogantly patriotic?
I'm American myself, but I've never liked this country much. As I've gotten older, I've noticed that its citizens are overwhelmingly proud of where they live, and I honestly don't understand why.