Recently I have noticed a rise in news articles with titles like ‘Why I Would Never Call Myself a Feminist.’
These articles raise complaints about feminism that wouldn’t be problems at all if people were educated about what it means to be an advocate for women’s rights. Unfortunately, even in the 21st century, a feminist is often seen as a woman who doesn’t shave and hates men.
In truth, a feminist is any person who supports the EQUALITY of all sexes, yet the term ‘feminist’ still brings many negative thoughts to mind.
Winning the rights we protest and strive for wouldn’t make men less than us; it would make us equals.
That is something I understand, though it seems plenty of other people don’t. I also understand that demonizing men is not going to solve any problems.
All in all, it’s important for people to educate themselves before they go spitting ‘facts’ about feminism.
If there weren’t so much controversy surrounding the topic, we’d be able to make far more progress than we are now.
I hate to say it, but in my opinion feminism has made hardly any progress since women won the right to vote nearly 100 years ago.
Women are still vastly outnumbered by men in Congress, the pay gap between men and women has barely budged, and the list goes on.
Women’s rights are a mess right now.
As long as feminists are seen as crazy man-haters, nobody is going to take our points seriously.
If your feminism isn’t intersectional, we don’t want to talk to you.