Is it normal that I absolutely despise American football?
I hate American football so much that even seeing a picture of it or a glimpse of it on TV makes me feel literally sick. I can't stomach people even talking about it. I hate seeing team gear in stores or when people wear team gear around town. I hate it clear down to my bones. It's the ONLY thing I hate this much.