LordShadowfire · 46-50, M
So, I know you're smart enough not to trust any AI for life advice, but I have to point this out in case some people aren't. ChatGPT is not actually smart. It just regurgitates what it was fed. There was a recent incident in which a man followed ChatGPT's advice to use sodium bromide as a salt substitute and ended up hospitalized. ChatGPT now warns people exactly what symptoms to expect if they consume sodium bromide, but mere days ago it was happily suggesting it to reduce your blood pressure. 😆
SinlessOnslaught · M
@LordShadowfire Yes. It's trained to please, and will happily lie in order to do so. Like a politician.