Can someone tell me why Americans think that America is the best country in the world? Aside from the massive propaganda machine, of course.
It seems like there's a lot that is not that great:
Poverty, poor education, expensive healthcare, massive incarceration rates, rampant obesity, huge imports of hard drugs, and incredible racial tension.
I mean, the US is a developed nation, for crying out loud.