
Do you think American Christians need to get out of politics and get back to the teachings of Jesus?

Evangelicals have created so much division and hate that "loving your neighbor" has become a radical idea. Agree or disagree? Please be respectful.
Graylight · 51-55, F
I heard someone say yesterday, "Christian love shouldn't be so hateful."

I wanted to argue with that, but it's hard to do these days with evangelicals speaking so loudly for everyone. They've learned that politics is a cudgel they can wield to enforce (what I consider to be) primitive religious values. Politics is also their shield.

I think if more evangelicals and other Christians would concentrate on the message of love at the heart of the faith, rather than on restrictions meant to keep others out and Christians insulated, there'd be a lot more middle ground on which to set up tents.
Sarah333 · 31-35, F
@Graylight I heard something similar: "There is no hate like Christian love." Sad, but so true. I think the nail in the coffin for me was American Christians' embrace of GOP politics. At this point, the word "Christian" provokes images and thoughts that are, for me, almost entirely negative. I think of "the good ones" as exceptions, not the rule. The most hurtful, hateful things ever said to me came from Christians. It will take decades to repair the damage, if it can be repaired at all. Younger generations are already moving on, feeling 'good without god.'