
"If it wasn't for us you'd all be speaking German!"

This is a serious question, because I've gotten this or similar responses quite a few times and I'm really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? Because what we learn in Europe (or at least in many European countries) is that the US was the last to join, and the war would never have ended if it hadn't been for Russia (and the Russian winter).
Do not attack me! It's a genuine question and I need serious answers; it's as if people on the two sides of the ocean are getting two completely different versions of the story.
xixgun · M
Every country teaches that theirs was the country that won the war... regardless of which war.

If it's any consolation, history books in the US are routinely "rewritten" to fit the political and social narrative of the day. The fact that history is made up of both good and bad things means nothing.

This is why US college students are tearing down Civil War-era statues (because they don't know that most of the statues represent those who fought slavery and sought equality for all, regardless of which side they fought for).

[media=https://www.youtube.com/watch?v=gczkM8cL2hs]