Is it normal I don't want to live in America?
A lot of the time when I'm surfing the net, I see ads saying things like: "You've won a Green Card! Come live and work in the USA!"
Now first of all, I think it's a scam.
Secondly, even if it wasn't a scam, I don't wanna live in the USA. I live in Denmark, and this winter I went to America on vacation. And while it's of course nice to be on vacation, life in Denmark just seems so much better than in the USA.
So is it normal that I don't want to live/work in the USA?