New AI systems can create realistic audio clips from just a short snippet of a target’s voice. Scammers can easily source that audio from content posted online, then use the cloned voice of a victim’s loved one to pressure the victim into handing over cash.
KIRO7 reported last week that a family in Tacoma, Wash., recently received a call from a scammer posing as their 16-year-old daughter and claiming she had been in a serious car accident.
The news station reported that the scammer, using voice-cloning software, demanded at least $10,000 for their daughter’s safe return.
The incident, which involved local law enforcement, prompted the county’s Sheriff’s Department to issue a warning about the rise of phone scams using voice-cloning software.
“Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie,” said the post from Pierce County, originally published by the FTC in March. “We’re living with it, here and now.”
The FTC noted that consumers should create plans with their family members to avoid falling victim to scam calls using voice-cloning tools.
If targeted, victims should verify their loved one’s situation by calling a known phone number, or try to reach another family member or close friend.
Additionally, they should watch out for requests to pay in ways that make money hard to trace, such as wiring money, sending cryptocurrency, or sharing gift card details, the FTC advised.