
A.I. phone scammers can now imitate the voices of your loved ones.

If you receive a call from a family member or someone you know VERY well saying they are in trouble, have been kidnapped, and so on, be aware that this has been happening a LOT lately. Send no money until you can verify that it is NOT an A.I. scam.

Scammers are pulling real voices from digital voice recordings and online videos and are using them for illegal purposes.
caPnAhab · 26-30, M
This is a bit unnerving
masterofyou · 70-79, M
Pretty scary, isn't it?? AI could possibly be the true antichrist..... Just my take......
4meAndyou · F
@masterofyou Let's hope not.
masterofyou · 70-79, M
@4meAndyou Well, the genie is out of the bottle.....
cherokeepatti · 61-69, F
How do they know what they sound like?
4meAndyou · F
@cherokeepatti Apparently websites like TikTok, Instagram, and many others feature videos of young teens. They talk about themselves and are very careless. It's easy for stalkers or hackers to figure out who they are and where they live. That, plus a short clip of the teen's voice from a video, is all they need. The AI does the rest.
Iwantout · 26-30, M
I got one of those before, but I knew it was BS.
Thanks for the heads up.