A lot of people I've known who call themselves feminists are actually some of the most sexist and hateful people I've ever come across. It seems like the word feminism has taken on a meaning different from its original one: the belief in equality between the sexes. I've known "feminists" who think women should be allowed to beat their husbands, shouldn't have to register for the draft, and that men should always hold the door for women, pay for dinner, etc. Call me crazy, but that doesn't really seem like equality to me. I'm not complaining, just making an observation: holding two groups of people to different standards isn't equality.
What do you think?