Voices of Concern: AI Voice Cloning Scams on the Rise, Targeting Innocent Victims

Thieves Utilize AI Voice-Cloning Technology to Steal Funds

Scammers are using AI voice cloning to pressure people into sending money, according to a report by the Better Business Bureau (BBB). The scam requires only a three-second clip of someone's voice, often pulled from social media, to create a convincing clone. Scammers then use the cloned voice to fabricate an emergency and manipulate victims into handing over money on the spot.

Nicole Cordero of the BBB of Eastern Carolinas warns that scammers can sound exactly like someone you know and manufacture a sense of urgency so you send money without stopping to think. With the cloned voice, they can mimic familiar phrases and speech patterns well enough to make the ploy believable.

Paul Cerkez, a computer science professor specializing in artificial intelligence at Coastal Carolina University, explains that scammers can build a database of recorded snippets of a person's voice. From those recordings, they can have any text they type read aloud in the cloned voice, making the call sound authentic and convincing.

In 2023, consumers lost a staggering $10 billion to fraud, with imposter scams being the most common. Scammers typically ask victims to wire money or pay through transfer services like Venmo or Cash App. To protect yourself, the BBB suggests being cautious with unsolicited calls requesting money, verifying the legitimacy of any request before sending anything, and securing your accounts.