Post-Feminism
Post-Feminism is the belief that women have the right to choose their own destiny. It promotes the idea that women do not have to choose a career over having children; they may choose to work or to stay home and raise a family without feeling, or being treated, like traitors to the feminist cause.
External links
- The Feminist eZine: archive of articles about feminist history.