I Just Don’t Understand Why Black Women Who Have to Constantly Justify Their Humanity Are So Angry All the Time

White Woman Speaks:

I love being a woman. Despite all of the struggles we continue to face, women have made so much progress in society, and my own womanhood brings me an immense amount of joy. That’s why it’s so difficult for me to understand why black women, who are still forced to justify their humanity to everyone including other women, are just so angry all the time.


I just don’t get how any woman could not be even a little bit happy about our rapidly increasing presence in our culture. Our foremothers fought and died for us, the future generations, to be able to claim what we have today. And even though those lady freedom fighters specifically excluded black women from their activism, forcing them to forge their own separate movement just to get society to see them as deserving of the same rights, it’s hard for me to fathom why they’re so irate all the time, you know what I mean?


Being a woman is amazing! Not only are we fierce and powerful and soft and strong, our bodies have the unique and special power to create and nourish new life. If only black women could focus on that remarkable aspect of womanhood, instead of having to go on and on about the major medical discrimination they face regarding pregnancy and childbirth care as a result of their historical dehumanization in society and the implicit biases of many, many people. Maybe then they’d be a little less bitter?


It’s so important to acknowledge all the good that is happening for women right now. We’re making our way into science, technology, business, the arts, and even the White House! Sure, there are so many factors that prevent black women from being able to achieve at the same rate as white women, including the very fact that they are black and not white. But why do they always have to be so mad about all of it? Didn’t anybody tell them that positivity is key?


Look, we’re in this age where women’s issues and feminism are reaching the mainstream, and women are finally able to feel good about themselves in a way they have never been able to before. So while I get that black women may have had a different life than me, I don’t get why they’re always yelling about the fact that their very existence as humans in this country has to constantly be reaffirmed at great mental and emotional cost in a way that I will personally never have to experience. Am I right, ladies?