The Dark Side of Southern Charm
So, for those of you who don’t know, I grew up in the American South — to be more specific, central and southern Alabama. I’m Appalachian-raised, and we often went to the mountains for our weekend trips and even some vacations. Or we went to the Gulf Coast.
My parents and most of my family were lower-middle class. In some cases, they were downright poor and on public housing assistance. Growing up, I knew what food stamps were and understood that I couldn’t go to the same places as many of my friends because we couldn’t afford that luxury.
We made do. I know what government cheese and ham loaf are. Everywhere we lived, we kept a garden in the yard, because homegrown vegetables were cheaper and better than anything from a can. I was taught to work hard for my needs and not to complain. My parents both had amazing work ethics, that was for sure.
I have always been a proud Southerner. We have a heritage of enterprise and quiet strength. When you think about it, even the Hunger Games author, Suzanne Collins, made good use of those positive characteristics when she described Districts 11 and 12 in her books — they were the southeastern Gulf Coast and the Appalachian regions of Panem.
However, there are things folded into Southern culture that a lot of people overlook or never see. I’d like to admit to some and touch on others. In other words, if all you think of are sweet tea, peach cobbler, and drawling accents — only the positive things about the South — well… bless your heart.
The American Myths
I grew up in a place that was culturally diverse by the numbers and yet segregated by the institutions playing with people’s lives. Growing up, I didn’t know there was such a thing as institutional racism, because my schoolbooks taught me that things had gotten better for Black men in the South since Reconstruction and the Civil Rights Movement.