
"If it wasn't for us you'd all be speaking German!"

This is a serious question, because I've gotten this or similar responses quite a few times and I'm really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? What we learn in Europe (or at least in many European countries) is that the US was the last to join, and the war might never have ended if it weren't for Russia (and the Russian winter).
Please don't attack me! It's a genuine question and I need serious answers. It's as if people on the two sides of the ocean are being taught two completely different versions of the story.
SW-User
Canadian here. I was taught that Americans were the last to join WWII as well. I've also heard Russia played an enormous part in ending the war. I've never tried to verify those accounts, but what you say adds up to me.
SW-User
@SW-User Okay, I seriously needed someone non-European and not from the US to confirm that I'm not going crazy. Thank you
SW-User
@SW-User America rewrites history all the time. It's egocentric, but that seems to be its salvation.
YukikoAmagi
@SW-User I can confirm that what's taught in Asia is the same as the European version.
SW-User
@YukikoAmagi Thank you!!