
"If it wasn't for us you'd all be speaking German!"

This is a serious question because I've gotten this response, or similar ones, quite a few times, and I am really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? Because what we learn in Europe (or at least in many European countries) is that the US was the last to join, and the war would never have ended if it hadn't been for Russia (and the Russian winter).
Do not try to attack me! It is a genuine question and I need serious answers; it's like people on the two sides of the ocean are getting two completely different versions of the story.
Hitler's obsession with Russia did play a serious part in Germany losing the war. Churchill was quite keen on invading Russia after it ended, but the Americans talked him out of it.
American help was invaluable in ending the war, but they were late players in the game. Pearl Harbour helped in getting them involved. But if Americans are taught that they and they alone ended the war in Europe, then that's a bit rich. I can't believe they are. The war in Asia is another matter: the nuclear bombs undoubtedly caused Japan's surrender.