
"If it wasn't for us you'd all be speaking German!"

This is a serious question because I've gotten this or similar responses quite a few times, and I'm really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? Because what we learn in Europe (or at least in many European countries) is that the US was the last to join, and the war would never have ended if it hadn't been for Russia (and the Russian winter).
Don't try to attack me! It's a genuine question and I need serious answers; it's as if people on the two sides of the ocean are getting two completely different versions of the story.
mark245pineapple · 31-35, M
I don't remember much, but I can tell you without a doubt that we were not taught we were the ones who ended the war.

We were taught that we were a major turning point in the war, but the Russians were given the big credit where it was due.


Most of the pride went into the retaliation against Japan rather than the involvement in Europe.
SW-User
@mark245pineapple I see. Finally someone answering the question, thank you.
mark245pineapple · 31-35, M
@SW-User Don't get me wrong, there was still quite a bit of egotistical feeling as we went through the chapters on WW2, but it was never claimed that we were the reason it ended or was won.

More like: had Japan not drawn us in, things might not be what they are today.
You know what I mean?


If you go down the street and ask people "How did Germany lose the war?", something like 60% of them will more than likely answer "Hiroshima and Nagasaki".
There is so much pride in that, more than you would be able to digest.