Is it normal that almost everything humans do disgusts me?
Hi, I've actually been leading quite a sheltered life and I've only recently been finding out more and more about the world, most of which I never wanted to know.
There are just so many things wrong with humanity that I do not even know where to start...
People kill each other just because they have different beliefs and opinions, like now with ISIS.
People rape their children. I've actually been in contact with a child who was raped by her parent, and you do not know how fucked up in the head that makes someone. She was around 6 years old when I met her, and the thing she loved doing most was killing animals because she wanted them to die.
And even if we completely ignore all social issues, we still have to deal with the fact that humans destroy the earth, which is ironically the very thing that gives them life.
I've been told at school that humans have both good and bad in them, and that the side that comes out usually has something to do with the situation you're in, which means I am capable of doing terrible things too. Actually, I am already destroying the earth by driving a car and eating food that was made in a factory.
Thanks to this, and a shitton more that I am not even going to get started on, I think it is best for humans to die out and let the earth restore itself to its former beauty.
We have fucked-up norms and we kill the thing that first gave us life, so why should we continue to exist? Wouldn't it be better if we all died?
I'd actually like someone to try and change my mind about this. I hate thinking like this and want to learn to see the brighter side of things again.