I'm in high school now, and the actions of girls and women in general have started to disgust me. Their inferiority, disloyalty, differences from boys, stupidity, and their power over men sicken me. I don't want to be sexist, and I don't want to turn out gay (not that there is anything wrong with that) when I grow up. I don't know where these ideas came from, but the things I saw around me and the classical texts I read depicting women as disloyal and untrustworthy influenced my thinking. What can I do to stop feeling this way?

I really don't want to offend any women out there, so I'm sorry if reading this bothered you. Please don't judge me too harshly.