Is American culture better than the rest?
As a North American, I get the impression that most white people here believe that people from other countries, like China, France, or India, have unappealing cultural practices, and they think poorly of them for it.
Is it weird or normal, then, that I feel differently? I think it's more likely *us* who are culturally unappealing.
We are a relatively new country, populated by the descendants of the lower classes of the ancient nations of the world. Most of our ancestors came here to make a better life for themselves and their families because they couldn't do so in their own countries. Essentially, we're all immigrant shit here, with the exception of the original tribes that inhabited the continent prior to European discovery.
From an outside perspective, we probably look like a bunch of unsophisticated louts with new money, money that has allowed us to gain some world power very early in our history compared with other civilizations. Yet we have the audacity to look down on the cultures of nations that have endured thousands of years of cultural evolution, as if we were the standard by which every nation should be judged.