Yes. I am well aware.
We claim to be the freest democracy on earth, but every unbiased democracy index places us near the bottom of the list.
If you actually look at what rights people have in other democratic nations, it becomes very apparent that we have been lied to; in the UK people have a right to roam. They can walk across and linger on private land for a reasonable amount of time, so long as they do no harm.
Here we have castle doctrine. If someone catches you walking on their land, they can shoot you dead, no questions asked.
In the US we are oppressed. The difference is that mostly we oppress one another instead of some big government doing it... but there is some of that going on too.
Weed is illegal. We have no right to healthcare. Price gouging is tolerated at all but the most extreme levels.
Millionaires are allowed to slowly gain control over our food supply.
The USA is a sad fucking joke compared to what it was "supposed" to be.
Love it or leave it? If it were that simple, we'd be depopulated in a decade. Americans are unwanted by civilized nations because those nations know the damage this system has done to most of our headspaces.
They don't want violent, puritan idiots pouring into their countries. Are we all like that? Of course not. But that is the image our country has created for us.
Not to mention the IRS has fucked-up rules for expats.