Keeping Up With the Clones
With the aid of AI, scammers can accurately simulate your loved ones' voices. Read this first-hand account to learn the signs and red flags to watch for.
I’m not a doddering old man, and I read a lot, which makes it especially galling to me that I was almost scammed out of $15,000 last Friday. Almost. Unfortunately, beating the scam and saving my five figures had nothing to do with me. I was less than five minutes away from handing over my cash when I was saved by a timely phone call. The scam relied on artificial intelligence, and it’s sweeping the nation.
At 9:45 on Friday morning, I received a call from my son in the Navy. He sounded horribly distressed. When I asked if he was okay, he said,
“No, I was in a car accident, I think I broke my nose, and I’m in jail. The police tested me for alcohol four times and on the fourth try said I was over the limit, but I didn’t drink anything, I swear. They took my phone. Please call my lawyer, David Bernstein.”
I asked a few questions. He told me he had hit a woman whose young daughter was in the car. Then he gave me the number for his lawyer, and the two-minute jailhouse call dropped.
I called the lawyer, who confirmed my son’s details and the bail amount. I had no interest in paying a bail bondsman, but I don’t live in the San Diego area, so I couldn’t post bail in person. I called the court’s administrative office, using the number provided by the lawyer, and again the details were confirmed, with a twist. The court officer told me that, due to overcrowding, my son would be moved soon. The sooner he was bailed out, the better. The court said I could provide the money for the bond to the lawyer, who would file it electronically, and I would get the money back either when the case was dismissed or when it went to trial.
I called the lawyer back, who said I could deposit the cash at a kiosk in a nationwide network set up for this purpose. I had never seen such a kiosk, but I have dealt with the criminal justice system through some volunteer work, and the rest of the story tracked with what I’d experienced (inefficient courts, poor communication, etc.). Before leaving the house for the bank and the kiosk, I told my wife what was happening. I then called my daughter-in-law, who had no idea, and filled her in. I got to the bank, withdrew $15,000, and went to the kiosk, which turned out to be a Bitcoin machine. That alone should have confirmed it was a scam, but it didn’t. The story was compelling, and it was told in my son’s voice.
As I was going through the steps of setting up an account, with the lawyer on the phone, I received a call from my son’s phone. I put the lawyer on hold without telling him why and clicked over to my son. I asked him what happened, and he said, “Nothing. I’ve been at the gym, it’s all a scam.” The only reason he called was that his wife was surprised to see him walk through the door and told him the story.
So, there I stood in a Circle K convenience store, about to feed 150 $100 bills into a Bitcoin machine and become the latest statistic in the rash of scams. If my son had been five minutes later in returning from the gym, I would be out $15k. I clicked back to the “lawyer” and said, “Mr. Bernstein, I hope you rot in hell,” and hung up. I drove back to my bank and sheepishly but thankfully redeposited the cash.
Several steps along the way should have tipped me off, and the recommendation to move the funds through digital assets was the biggest red flag. It’s all easy to see in hindsight. But here’s the thing: this began with a phone call where I spoke with my son. We had an impromptu conversation with no hesitations. Granted, it wasn’t long, but it was his voice, and they were words he would use.
Welcome to the dark side of AI. With a decent computer and voice-cloning software based on artificial intelligence, people can make a computer sound like anyone, as long as they have a few snippets of speech from the person they want to mimic. The more samples the software has, the more realistic the cloned voice will sound.
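How easy is it? As a rough illustration, here is what a voice-cloning script can look like using a freely available open-source library (this sketch assumes the Coqui TTS package; the model name is one of its published voice-cloning models, and the audio file names are hypothetical placeholders):

    # A minimal sketch, assuming the open-source Coqui TTS package is
    # installed (pip install TTS). The .wav file names are placeholders.
    from TTS.api import TTS

    # Load a published multilingual voice-cloning model.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # A short recording of the target voice conditions the model;
    # the generated speech is written to an audio file.
    tts.tts_to_file(
        text="Hello, this is only a demonstration.",
        speaker_wav="few_seconds_of_target_voice.wav",
        language="en",
        file_path="cloned_voice.wav",
    )

That is essentially the whole program: a few seconds of recorded speech in, a convincing synthetic voice out, on ordinary consumer hardware.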
I’ve rebuffed many scams over the years, including emails from friends who needed cash (which actually came from their email accounts), Microsoft “Help” that called to clone my machine for an update, and random invoices for office supplies emailed to me for payment. I didn’t give any of those the time of day. But a call from one of my kids, in his voice, looking for help? Clearly, that’s where my discerning skills get fuzzy.
My son told me not to beat myself up about it. I don’t. It’s nothing more than a numbers game for scammers. They happened to reach me on the right day, with the right story elements to make it plausible. The better their tools become at establishing belief, the more people will fall for their lies.
I’m relating this story so that you might stop someone else from being scammed or even prevent a scammer from taking advantage of you, but we all know the score. The bad guys are getting better. The flaws in their tools are getting harder to spot. As the cost of AI drops, the number of people who will use it for nefarious purposes will soar.
As for what to do about AI itself, this gets tricky. Some want to regulate AI to lessen the possibility of deepfake videos and audio files wreaking havoc. I think that horse is out of the barn. It’s hard to see how the developers of AI are responsible for what bad actors do with their products, but it would be useful if the developers would create tools for recognizing AI-generated videos and audio files. I have no idea how that would work, but I’m not the one who let the genie out of the bottle.
The best we can do is communicate. If I hadn’t called my daughter-in-law on the way to the bank, she would never have known what was going on, and my son would not have called me the moment he returned home. I guess it all goes back to Ronald Reagan: trust, but verify.
Written by Rodney Johnson, The Rodney Johnson Report