A few months ago, and without much fanfare, Apple released a rather important new feature on its digital devices. Called ‘Personal Voice’, this software tool asks users to record themselves speaking a variety of sentences. The software then analyzes those recordings and builds a model of how the user sounds. Once setup is complete, users can type whatever they want into their device and listen as it reads the text back to them in their own voice.
But every tool can also be used in ways it wasn’t designed for. As voice cloning software is added to the cybercriminal’s toolbox, new types of scams have begun to surface. A familiar voice that can bring joy can also be made to cause panic, and to separate people from their money. “Mom, I’ve been arrested,” one such scam goes. “I’ll explain the details later, but for now I just need you to wire some money to help get me out of jail. Use this link …”
Another version: “Dad, I’ve been kidnapped! I’m so scared. They’re gonna kill me if you don’t pay them. Send $10,000 to this bank account — and please hurry!”
Voice cloning technology is already good enough to fool most people, particularly those who are unprepared for this type of scam.
How, then, can people defend themselves against such tactics? Here are some ways: