Feminism — What does it mean to you?
Feminism. It is a word that means different things to different people.
By dictionary definition, feminism is:
- the doctrine advocating social, political, and all other rights of women equal to those of men.
- (sometimes initial capital letter) an organized movement for the attainment of such rights for women.