Is it normal to find the concept of vampires disgusting?
I have noticed that, rather recently, vampires have made their way back into the pop-culture limelight (need I explain how?).
For as long as I can remember, I have found the concept of vampires and vampirism quite disgusting. Vampires drink the bodily fluids of human beings, which strikes me as both revolting and hazardous to one's health (a great way to contract a bloodborne disease like HIV).
In fact, that's about the whole of my objection. I don't understand why vampirism is treated as such a great and novel thing. Drinking blood sounds very gross, as it can carry damn near every pathogen its owner is infected with.