Post-Feminism


Post-Feminism is the belief that women have the right to choose their own destiny. It promotes the idea that women need not choose a career over having children; rather, they have the choice of working or staying home to raise a family without feeling, or being treated, like traitors to the feminist cause.