
New voice cloning technology makes it possible for scammers to impersonate anyone


Published March 17, 2023 5:02 p.m. ET

Artificial intelligence specialist Marie Haynes says AI tools will soon make it hard to distinguish AI from a real person's voice. (Dave Charbonneau/CTV News Ottawa)

As artificial intelligence technology continues to advance, scammers are finding new ways to exploit it.

Voice cloning has emerged as a particularly dangerous tool, with scammers using it to imitate the voices of people their victims know and trust in order to deceive them into handing over money.

"People will soon be able to use tools like ChatGPT or even Bing and eventually Google to create voices that sound very much like their voice, use their cadence," said Marie Haynes, an artificial intelligence expert. "And it will be really, really hard to distinguish from an actual real live person."

She warns that voice cloning will be the new tool for scammers who pretend to be someone else.

Carmi Levy, a technology analyst, explains that scammers can even spoof the phone numbers of family and friends, making it appear as though the call is actually coming from the person they are impersonating.

"Scammers are using increasingly sophisticated tools to convince us that when the phone rings it is in fact coming from that family member or that significant other. That person that we know," he says.

Levy advises people who receive suspicious calls to hang up and call the person they believe is calling them directly.

"If you get a call and it sounds just a little bit off, the first thing you should do is say 'Okay, thank you very much for letting me know. I'm going to call my grandson, my granddaughter, whoever it is that you're telling me is in trouble, directly.' Then get off the phone and call them," he advises.

Haynes also warns that voice cloning is just the beginning, with AI powerful enough to clone someone's face as well.

"Soon, if I get a FaceTime call, how am I going to know that it is legitimately somebody that I know," she says. "Maybe it is somebody pretending to be that person."

As this technology becomes more widespread, experts are urging people to be vigilant and to verify calls from friends and family before sending any money.

"There are all sorts of tools that can take the written word and create a voice out of it," says Haynes. "We are soon going to be finding that scam calls are really, really on the rise."
