
"If it wasn't for us you'd all be speaking German!"

This is a serious question, because I've gotten this or similar responses quite a few times and I'm really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? Because what we learn in Europe (or at least in many European countries) is that the US was the last to join, and that the war would never have ended if it weren't for Russia (and the Russian winter).
Please don't attack me! It's a genuine question and I need serious answers. It's as if people on the two sides of the ocean are getting two completely different versions of the story.
Some American volunteers went to the European conflict by joining Canadians on troop ships bound for England, but the USA was not officially involved in WWII until the bombing of Pearl Harbour. The bombing of Hiroshima and Nagasaki would not have influenced the war in Europe much. The end of WWII in Europe came about both through Canadians finally taking key strategic beaches on the western coast of Europe and through the Russians, who baited the Germans deeper into their cold north, flanking and surrounding them to cut off their supply lines completely.

Ever ask an American to what HOLY SITE the founders of the thirteen colonies were headed on their "PILGRIMAGE"? Or to what REPUBLIC ("to the flag, and to the republic for which it stands") they are swearing allegiance? Like the Roman Empire (and the Roman Catholic Church), Americans edited and rewrote their history.