"If it wasn't for us you'd all be speaking German!"
This is a serious question, because I've gotten this or similar responses quite a few times and I'm really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? Because what we learn in Europe (or at least in many European countries) is that the US was the last to join, and the war would never have ended if it hadn't been for Russia (and the Russian winter).
Please don't attack me! It's a genuine question and I need serious answers; it's like people on the two sides of the ocean are getting two completely different versions of the story.