Do you think it's fair that a guy will make more money doing the same job as you? Does it piss you off and scare you when you find out about your friends getting raped? Do you ever feel like shit about your body? Do you ever feel like something is wrong with you because you don't fit into this bizarre ideal of what girls are supposed to be like? Well, my friend, I hate to break it to you, but you're a hardcore feminist. I swear.