Is it normal to think sex would be better as a man?
I've had sex before - granted, not many times - and I can't help but conclude that sex must be way better for men than for women.
Having a female reproductive system kind of sucks in the first place - you get periods, cramps, and infections - and then you add sex on top and have to worry about STDs, potential pregnancy, a higher risk of UTIs, and soreness. I mean, sex can be painful for women, since they're the ones getting something shoved inside their bodies. It's pretty invasive.
Plus, since sexual assault is more prevalent among women than men, many women who are victims may end up with negative impressions of sex that ruin the fun of it for them.
If I were a guy, I'd probably want to have sex a lot - it'd probably be easier and more fun. As a girl, though, twenty minutes of feeling good just doesn't seem worth all the other risks. I don't think there's anything wrong with sex; it just looks way more advantageous for men.
Is it normal to think that way?