Well, I am majoring in Women's Studies.
In today's world, which self-respecting woman isn't a feminist?
I know the term carries baggage from being associated with women who burned their bras, hated all men, and held extreme beliefs.
That isn't what feminism is about.
Feminism is the belief that women and men are equal.
Why is that such a radical idea for people to support?
If you believe in human rights you should believe in feminism, because it's just a category of human rights. It's really that simple.
Some people I have talked with don't like the idea of women's studies because we don't have men's studies. To them I say this:
We wouldn't feel the need to learn more about women and celebrate our accomplishments, differences, and history if we hadn't been tossed aside, ridiculed, and treated like nothing for centuries.