Is it normal that white people are never reminded of being white?
In America, white people are never reminded of being white, but anyone who isn't white is constantly reminded of their non-whiteness. That seems racist to me. It's as if being white is just the default, and being anything else is seen as 'different'.