Is it normal that I hate giving blow jobs?!?
I don't understand what's so appealing about blow jobs as a woman. It's not fun to do at all, and I actually find it kind of degrading to women in general. A dick is meant to be put in a vagina, not a mouth? Ew? If you guys like it so much, why don't you just do it to each other and save us women the pain and agony of pretending that we even remotely like your dick in our mouths? Like, when you really think about it, it's like, wtf? When did we come up with this? Did we even do this in the Middle Ages? Is it normal for me to feel this way?