Is it normal that I sometimes feel ashamed to be an American?
(If I ever run for President, this is really gonna come back to bite me.)
I’ll admit, I enjoy being in America. There are certainly far worse places in the world right now, but throughout history my country has done some pretty horrible shit. We started off with the genocide of Native Americans, participated in the transatlantic slave trade that led to centuries of slavery, and followed that with decades of Jim Crow era racism and segregation, the effects of which are still felt today. It wasn’t until the 1960s that we finally gave black people equal rights, and that might seem like a long time ago, but in the grand scheme of things it really isn’t.
But it doesn’t stop there. Right after Pearl Harbor, FDR put countless innocent Japanese Americans in internment camps. During WW2 we fought a totalitarian regime built on ideas of racial hierarchy, yet we had our own ideas of racial hierarchy at home!
The Jim Crow Museum is a good place to look if you want to see just how racist our country has been. On one hand, it would be somewhat reassuring to know that other places had similar racist attitudes (whether extreme cases like the aforementioned Nazi Germany, or milder ones more like us), since it would give me a sense of “see, we’re not the only ones!” But on the other hand, it would just be sad that any of it happened at all.
Am I just an ignoramus? Is America really not as unique in these attitudes as I’ve been taught? Let me know your thoughts.