Define Feminism.
What is Feminism?
Feminism is, by definition, "a belief in the social, political, and economic equality of the sexes." Many people confuse feminism with misandry, the attitude of "a person who dislikes, despises, or is strongly prejudiced against men." That is not feminism. Feminists want a free world for women, a safe world for women, an equal world for women. That is what feminism is about.

How Can You Become One?