Published March 17, 2023 5:02 p.m. ET

Artificial intelligence expert Marie Haynes says AI tools will soon make it difficult to distinguish AI from a real person’s voice. (Dave Charbonneau/CTV News Ottawa)

As artificial intelligence technology continues to advance, scammers are finding new ways to exploit it.

Voice cloning has emerged as a particularly dangerous tool, with scammers using it to imitate the voices of people their victims know and trust in order to deceive them into handing over money.

“People will soon be able to use tools like ChatGPT, or even Bing and eventually Google, to create voices that sound very much like their voice and use their cadence,” said Marie Haynes, an artificial intelligence expert. “And it will be very, very difficult to distinguish from an actual real live person.”

She warns that voice cloning will be the new tool for scammers who pretend to be someone else.

Carmi Levy, a technology analyst, explains that scammers can even spoof the phone numbers of family and friends, making it appear as though the call is actually coming from the person they are impersonating.

“Scammers are using increasingly sophisticated tools to convince us that when the phone rings it is in fact coming from that family member or that significant other. That person that we know,” he says.

Levy advises people who receive suspicious calls to hang up and call the person they believe is calling them directly.

“If you get a call and it sounds just a little bit off, the first thing you should do is say ‘Okay, thank you very much for letting me know. I am going to call my grandson, my granddaughter, whoever it is that you’re telling me is in trouble, directly.’ Then get off the phone and call them,” he advises.

Haynes also warns that voice cloning is just the beginning, with AI powerful enough to clone someone’s face as well.

“Soon, if I get a FaceTime call, how am I going to know that it is legitimately somebody that I know?” she says. “Maybe it is somebody pretending to be that person.”

As this technology becomes more widespread, experts are urging people to be vigilant and to verify calls from friends and family before sending any money.

“There are all sorts of tools that can take the written word and create a voice out of it,” says Haynes. “We are soon going to be finding that scam calls are really, really on the rise.”

