I Was Lied To
About the fantasies we were fed growing up and the expectations of our culture as Southern women.
Southern women are encouraged to behave in ways that, honestly, strip them of their own control, and they are promised rewards for that behavior. I can only speak to what I was taught from childhood, through my formative years as a teen, and even into adulthood and motherhood. And I grew up in the Southern United States.