Was Sex A "Dirty" Word In Your Family?

Apr 20, 2019

My entire life, like millions of other girls, I have been told that sex is a degrading and shameful act. When I was 5 years old and beginning to discover the wonders of my body, my mother, completely horrified, told me that if I masturbated, my vagina would fall off.

The most striking view I was indoctrinated with was that sex is something women “have,” but that they shouldn’t “give it away” too soon -- as though there’s only so much sex in any one woman, and sex is something she does for a man that necessarily requires losing something of herself, and so she should be really careful who she “gives” it to.

The prevailing societal brainwashing dictates that sexuality and sex "reduce" women, whereas men are merely innocent actors on the receiving end. By extension, our virginity or abstinence has a bearing on who we are as people -- as good people or bad people, as nice women or bad women.

In this worldview, women's ability to be moral actors is wholly dependent on their sexuality. It is, honestly, insane.

The virgin-whore dichotomy is an insidious standard that we have unfairly placed upon women. Women are supposed to be outwardly pure and modest, while at the same time being sexually alluring and available. If a woman does not have sex after a date, she will be labeled as a prude. If she does have sex, she will be referred to later as a ho or a slut.

Society thus sets up a norm in which women simply cannot win.

Anonymous
