There's no way I'll ever accept this butt showing trend as a norm
I have nothing against women's bodies or against women being as sexual as men, and I'm also someone who supports nudism.
But honestly, this wildly popular butt showing trend, like what we see on Instagram, is way too trashy and quite narcissistic.
The message of this trend isn't even nudism or anything positive. It's all about making a naked body look like it's only about sex.
Why is it that just because someone is comfortable with their own sexuality, they should let everybody know by showing their butt literally everywhere, even at, like, some Chuck E. Cheese's? I don't think literally everybody should be made aware of some random person's comfort with their sexuality. Like, are kids supposed to know about some adult's sexual comfort too?
And for some reason everybody's apparently supposed to defend and tolerate this degeneracy, or else we get called far-right incels, fat jealous girls, misogynist boomers, women forced to be modest, religious fanatics, and other things that don't describe us.
As if every real, physically and mentally healthy, non-religious and non-conservative woman from a healthy household is into showing herself off in a lewd manner literally everywhere.
Like I said earlier, I have no problem with women's bodies or with women being as sexual as men, and I also support nudism. I just see the butt showing trend as something that should be marked as 18+ content instead of becoming a new cultural norm or social expectation.
And I'm very comfortable with my own sexuality, but that doesn't mean I have to let everybody know, and I think butt flashing in the name of sex belongs in 18+ spaces, not in public at large.
Tell me it's normal that I'll never accept this stuff as some new, pointless norm.
And even if it never becomes a norm or a social expectation, I'll still think it's trashy and not a good thing at all.