
"If it wasn't for us you'd all be speaking German!"

This is a serious question, because I've gotten this or similar responses quite a few times and I am really confused.
What kind of history are people in the US taught? Are you seriously taught that it was America that ended WWII? Because what we learn in Europe (or at least in many European countries) is that the US was the last to join, and the war would never have ended if it weren't for Russia (and the Russian winter).
Don't attack me for asking! It is a genuine question and I need serious answers; it's as if people on the two sides of the ocean are getting two completely different versions of the story.
tj786100 · 51-55, M
I should probably say here too that 1) America is HUGE, so teaching is not the same everywhere (both in overall quality and in focus on WWII), and 2) FAR, FAR, FAR too many Americans are completely uninterested in their own history.

Seriously, my in-laws are completely oblivious to history, and they are otherwise intelligent people. Without question, many of those who DON'T PURSUE any real knowledge of WWII and how it actually went would say "the US won it for those other guys".

I guess what I'm saying is that in many cases it isn't the actual education, but rather the shallowness or lack of interest of the individuals receiving it. My mom and dad lived through WWII, so I grew up with stories and with a natural interest. For people born in the years since, there is just no personal tie to the war, so education and interest have waned significantly, which leads to a tendency to oversimplify it to "yep, we won it".
SW-User
@tj786100 I see. Someone else actually mentioned propaganda and shallowness, and I hadn't even considered that. Thank you for making it clear; that was the whole point of the question.