
Do you think the United States of America is a racist country?

I want your opinion. I think it has some racist mechanisms in immigration, law enforcement, and the justice system, but overall I would say no, because you can be whatever you want to be in the United States.
TinyViolins · 31-35, M
The United States has a long history of racist policies and practices that were legal up until relatively recently. It's so recent that its legacy still lingers in aspects of people's lives and perpetuates racial inequalities.

That isn't to say that incredible progress hasn't been made, because attitudes toward minority groups, and outcomes for them, are generally far better today than they've ever been.

Still, racism casts a long shadow, and minority communities are still dealing with the aftershocks of intentional and unintentional racial discrimination. Efforts meant to help, like the war on drugs, the war on poverty, or tough-on-crime policies, have often backfired and exacerbated some negative outcomes without meaning to.

I mean, does failing to remedy these negative outcomes, and leaving these communities to pick up the pieces from so many generations of discriminatory policies and practices, make the US a racist country? Probably not. But I feel like the US is still going to need to do some work to win back the faith and trust of the people it's wronged.