Is this normal, or does it only happen to me?
I'm American, I graduated from university, and I try really hard to have a good life. However, I can see that immigrants (from any country) come to the US, set up businesses, buy houses, enroll in colleges, etc. Also, any place I go, the owner is an immigrant: the professor, the doctor, the dentist, the taxi driver. And they don't even speak English properly. What the hell happened with native-born Americans? Why do immigrants do better than us?