Who said gender roles have changed?

Why do many feminists, especially the younger, less experienced ones, act like gender roles have changed? Go to a grocery store and ask every woman you encounter who minds the children, cooks, cleans, etc. I guarantee 95%+ will say SHE does. Yes, for financial reasons many mothers HAVE to work, but by far most mothers are still the homemakers. And this is in the Western world. Feminism has yet to affect 2/3 of the world (thank GOD).

So, I ask again, who says gender roles have changed?

3 Answers

  • Anonymous
    1 decade ago
    Best Answer

    Those who are deluded and caught in the "moralistic fallacy." We are not going to throw off a million years of evolved strategy and physiological fact for some ideological dystopia. Of course, you already know this, as does everyone else in the real world. Change in equality of opportunity can happen, incrementally. Change in equality of identity cannot happen, unless men become women and women become men.

  • 1 decade ago

    Yeah, mostly I would say they haven't. But mainstream Feminism isn't really about effecting changes in gender roles; it's about bringing about equality: giving women the OPTION to have a job, the ability to support themselves when their husbands leave them, and the right to be paid the same for equal work. And I would guess that, as a woman of the Western world, you've benefited from Feminism in some way.

    And you THANK GOD that feminism HASN'T affected 2/3 of the world (yet)? So I guess you agree that too many women in patriarchal non-Western societies the world over still have to walk behind the males, undergo the degradation and pain of female genital mutilation (because the outer female sex organs are seen as "unfeminine"), are the victims of "honor killings" (where male family members murder their sister, daughter, etc., because she had "sex" before marriage, even if she was raped), live where female children are routinely aborted (in countries with "birth per family" laws) because males are "more desirable," and where women and girls are still denied the right to an education, the right to vote, the right to even walk the streets freely?

    If you think feminism has done so much harm to Western society, and you think these other places in the world are to be "envied" because they are not affected by feminism, then move to one of those countries and see how much happier you are and how much you like it. And if you think that all of the things I've just mentioned don't actually happen ALL THE TIME in the non-Western world, do your research; it won't take long.

  • 1 decade ago

    I think it has more to do with the family's financial situation than women's rights. When I was younger, most of the families in my neighborhood had working mothers who had to help keep up the income; also, many fathers started to help the moms look after the children and the household, along with doing their own jobs. But if a family was more upper class, the mother probably wouldn't have had to work (unless, of course, the mother was the one responsible for the upper-class level of income).
