Simple Definition of feminism

from our Learner's Dictionary

feminism

noun

1 : the belief that men and women should have equal rights and opportunities
2 : organized activity in support of women's rights and interests

Cite this Entry

“Feminism.” Merriam-Webster.com Simple Definition, Merriam-Webster, https://www.merriam-webster.com/simple/feminism. Accessed 15 Apr. 2026.
