Is it normal that I'm starting to be more socially oriented?
People who seem to think that they are the center of the world, and that everything and everyone should act according to their views, are starting to irritate me to the point where I'm beginning to think they are the real plague of Western civilization.
I'm starting to think that the "hyper-individualism" we live in will be the downfall of our society, and that we should start acting in a way more oriented toward the common well-being.
Is it normal to feel like people should start to view themselves as part of a group/tribe/nation, whatever you want to call it, instead of as the only thing that matters? Because, in the end, the society we are happy to live in is the result of a common goal shared by thousands of individuals, not just a handful of self-centered teenagers.