
Body positivity

I see the term a fair bit, and always miss the posts on here about it. But it feels like it's really not aimed at me. It always ends up being about women and how they should be confident in their bodies - which is obviously a good thing - while I'm on the outside looking in. There seems to be a societal expectation that men are completely confident in every aspect of themselves, and that anything else makes you a worthless failure.

But it doesn't really matter, does it? If it's noticed at all, it's just dismissed as whinging from a pathetic, worthless loser.
I haven't done the math, but I bet there are just as many guys as women in those posts.