Do Hollywood films shove progressive messages down our throats in the most self-righteous manner possible? Read below.

So, I love the USA, but I've always hated those bullshit patriotic guilt trips that demand I prove my loyalty to this nation at every football game and whatnot; honestly, I find it insulting. The left shoving random images of minorities breaking stereotypes in my face feels like its answer to that kind of patriotic virtue signalling. It's basically "hey, we need to show more women of color as scientists because there aren't enough."

Now, my favourite Disney movie was Mulan, so I'm obviously not against breaking tradition in movies, not at all! But back in the day, movies that stood for something didn't feel forced; they conveyed their message in a deeper, more fleshed-out way. A lot of the Disney classics, for instance, were about a character stuck under a system that kept them from following their heart: Mulan, Ariel, Princess Jasmine, Pocahontas.

But now? Movies are basically just flexing on how diverse they are to win social brownie points, and it's self-righteous, annoying, insulting, and lame. Seriously, directors these days just want pats on the back, and doing good deeds so that you will be seen by men is prideful intent, and no good at all.
Longbeachgriffy does a skit about this, and it's legit as fuck. Diversity is the selling point.