The Washington Post reports that scammers are using high-quality AI-generated voice technology to impersonate loved ones in distress and persuade victims to send money urgently.
One example in the article involves the parents of a man named Benjamin Perkin, who fell victim to an AI voice scam. A criminal posing as a lawyer told them their son had been involved in a car accident that killed a U.S. diplomat, and used voice-cloning technology to put a synthesized version of their son on the phone with them. The scammer then convinced the parents to send $21,000 through a Bitcoin ATM, supposedly to cover their son's legal fees.
From the Post:
Perkin's parents later told him the call seemed unusual, but they couldn't shake the feeling they'd really talked to their son.
The voice sounded "close enough for my parents to truly believe they did speak with me," he said. In their state of panic, they rushed to several banks to get cash and sent the lawyer the money through a bitcoin terminal.
When the real Perkin called his parents that night for a casual check-in, they were confused.
It's unclear where the scammers got his voice, although Perkin has posted YouTube videos talking about his snowmobiling hobby. The family has filed a police report with Canada's federal authorities, Perkin said, but that hasn't brought the cash back.
"The money's gone," he said. "There's no insurance. There's no getting it back. It's gone."