I see so many people complain about how "everything is racist these days" and "people just blame white supremacy for everything."
Yeah. You know why? Because racism and white supremacy really are embedded in almost everything in our country. We're just finally starting to acknowledge it.
And by "we," I mean white folks.
(To be clear, when I talk about white supremacy, I'm not just talking about the extremist/Neo-Nazi/KKK hate groups. I'm referring to the notion, conscious or unconscious, that white people are preferable, better, more deserving, or otherwise superior to non-white people—a notion that was widely accepted among white people throughout American history.)
The vast majority of people of color in America already know this to be true and have always known it to be true. White Americans, by and large, have been ignorant, oblivious, or in denial about how America's legacy of white supremacy impacts us.
There's a reason for this: