AI voice cloning shows potential for abuse, threat to everyday life

Hollywood actors and writers remain on strike, in part, over how 'artificial intelligence' might threaten their jobs, but the threat is already real in our everyday lives. Imagine getting a call from a loved one saying they were in trouble and needed your help.

Most of us wouldn't hesitate to do what needed to be done, even if the message was a convincing fake.

Houston tech expert Juan Guevara-Torres has been exploring a new online voice-cloning artificial intelligence service that's left quite an impression on him: "Question everything you see and hear."


We won't share the company name, but he shows how it works, starting with a report we worked on together three years ago.

Guevara-Torres loads the audio of my voice into the website, which takes just moments to generate a cloned version that can say anything the user types. Despite some awkward inflections, it's very similar to the real thing.

"We're at a stage where AI could, potentially, train what we see and what we hear," says Guevara-Torres.

It can also be used against us.


Fox News recently shared the story of Gerry Scally, who got a call that sounded like it was from his grandson. "I said ‘What happened? What’s wrong?’ He says ‘I was involved in a car accident and I have a broken nose,’" recounted Scally. He ultimately hung up because the voice sounded a little ‘off’, likely saving himself from a scam. It turned out his grandson was just fine.

Guevara-Torres is concerned that, not far in the future, such happy endings may be in the minority as would-be crooks get easy access to these tools.

"There are potential uses of this technology that can be very good," he says. "The problem is that a lot of the bad ways that this technology can be used: they're chilling."


The company that provides the voice cloning that we used did not respond to a request for comment.

The website does require users to acknowledge that they have authorization to use the voice being cloned, which likely reduces the company's liability. Meanwhile, law enforcement offers these tips to avoid being victimized in similar scams:

  • Establish a safe word with family. If a caller doesn't use it, that's a clue that something's wrong.
  • Call or text the suspected victim from another phone.
  • Don't send money. Instead, call police if you suspect trouble.