Is it normal to go barefoot without any shame?
I go barefoot with great pride, and people keep telling me "you can't go around with bare feet." Since when are we so ashamed of our feet that we have to cover them? Lots of girls and women I know go barefoot, and hippies go out barefoot too. Even in the cold my feet are bare and I'm proud of it. I even ride the bus barefoot. Is that normal?