Body positivity
I see the term a fair bit, but I always seem to miss the posts on here about it. And honestly, it feels like it's not really aimed at me. It always ends up being about women and how they should be confident in their bodies - which is obviously a good thing - while I'm just on the outside looking in. There seems to be a societal expectation that men are completely confident in every aspect of themselves, and anything less makes you a worthless failure.
But it doesn't really matter, does it? If noticed at all, it's just dismissed as whinging from a pathetic worthless loser.