Amna Baig
What is Feminism?
French word: originally a medical term used to describe the feminization of the male body, or to describe women with masculine traits.
Used in the USA in the early twentieth century to describe the group of women with the political agenda of changing the social position of women.