Is it normal to think gender roles should be abolished in our society?
Society should encourage people to be individuals instead of fitting them into a mold: strong, brave, and hard-working for males, or caring, patient, and motherly for females. Gender roles are harmful and put needless stress on everyone involved. Girls shouldn't have it ingrained in them to become mothers and wives through baby dolls and cooking toys, and boys shouldn't be pushed into masculinity early with cars, sports, or G.I. Joe toys. Associating children with certain colors should be abolished too. This is a response to another IiNer's post.
If a girl wants to grow up to get married and have kids, or a boy wants to be a very manly man, that should be their choice, not a learned behavior or the result of societal pressure.