
Are women taking our jobs?

I’m noticing a trend now of men claiming that women ruined the economy by entering the workforce.

Is this the direction we are headed, America?

My grandpa survived the Holocaust and USSR. He had this idea that he would have some sort of freedom in the US.

I guess you are only free if you’re not a woman 💀 we are apparently ruining society with our jobs and abortions.

Let's think about this for a moment.

So...wasn't it Dan Quayle who espoused "family values" (the Murphy Brown speech) at a time when it was rapidly becoming necessary to have a dual-income household in order to 'live the American Dream'?
(Note: The average inflation rate between 1987 and 1993 was 4.03% per year; the cumulative price increase over that period was 27.20%.)

So more married women joined the workforce. Please note: the Bureau of Labor Statistics found that married women earn 75.5% as much as married men (BTW: stats tell us it is 94.2% for women who have never married).

So the answer to your question, IMO, is rather easy: employers in general pay women less.