Is it normal that I'm really depressed in the winter?
I feel like in the winter I'm super depressed, but when spring finally comes around I feel so much better. I fight with people less, feel better about myself, and am just an all-around happier and better person. Is that normal?