Will sex ruin my relationship or not? HELP PLEASE.
I've been together with my boyfriend for 2 years. We both really love each other (not being childish or anything). We've never had sex, and he's never pressured me about it either. But lately we've been thinking about doing it... I honestly think I'm fully ready, with absolutely no doubt. But in my family, they've always told me that once a man has sex with you, he won't view you the same anymore. It's like he gets "the prize" and then turns his back... Is that true even if we both really love each other? Will it ruin our relationship or make it better?