Voice-Clone AI Scams — it’s NOT ME on the Phone, Grandma

Voice-AI tech is being misused by scammers: Scrotes fake your voice and call your grandparents.

And then “you” beg them for money to get you out of jail. We’re talking about AIs such as ElevenLabs. Already the narrative has shifted from theoretical ethics to real victims “reacting with visceral horror.”

Stop the world. In today’s SB Blogwatch, we want to get off.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: The (funky) Chain.

Wolfie’s fine, honey

What’s the craic? Pranshu Verma reports—“They thought loved ones were calling for help”:

Visceral horror
Scammers are using artificial intelligence to sound more like family members in distress. People are falling for it and losing thousands of dollars.

As impersonation scams in the United States rise … technology is making it easier and cheaper for bad actors to mimic voices, convincing people, often the elderly, that their loved ones are in distress. In 2022, impostor scams were the second most popular racket in America, with over 36,000 reports of people being swindled by those pretending to be friends and family.

Although impostor scams come in many forms, they essentially work the same way: a scammer impersonates someone trustworthy — a child, lover or friend — and convinces the victim to send them money because they’re in distress. … Victims report reacting with visceral horror when hearing loved ones in danger.

ELI5? James Vincent explains like we’re five—“Beware AI voice scammers”:

Disappointingly predictable
Here’s the danger: If there are recordings of you speaking online, it’s trivial to clone your voice using free AI web services.

Scammers can call up your relatives, pretend you’re in trouble, and try to get you to transfer them money. … It’s troubling stuff, and disappointingly predictable.

Troubling, indeed. Trevor Mogg milks it—“AI is making a long-running scam even more effective”:

Many others will lose money
The fear is that with AI tools becoming more effective and more widely available, even more people will fall for the scam in the coming months and years. The scam still takes some planning, however, with a determined perpetrator needing to find an audio sample of a voice. … Audio samples, for example, could be located online via popular sites like TikTok and YouTube, while phone numbers could also be located on the web.

Some are calling for the companies who make the AI technology that clones voices to be held responsible for such crimes. But before this happens, it seems certain that many others will lose money via this nefarious scam.

Yikes. Something new to worry about. You can practically hear Perri Morrison sigh from here:

As the daughter and niece of 90 and 98 year olds, my battle with these charlatans is endless.

Wanna fight back? userbinator has good advice:

This is why shared secrets are important. You might be able to clone a voice, but you won’t know what its real owner knows.

As does Betsynotliz:

I’ve told my old people that if it really is someone who’d be asking you for help, they won’t be offended if you ask them a question that only they would know the answer to, in order to verify their identity.

Has it come to this? Avery Edison—@aedison—wonders aloud:

Is it weird to start setting up password challenges and code phrases with my family? … Attending the two-factor family reunion.
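The family code-phrase idea amounts to a shared-secret check: the caller must prove knowledge of something that was agreed in person and never posted online. As a minimal sketch (the phrase and function names here are hypothetical, purely for illustration):

```python
import hmac

# Hypothetical example: a code phrase agreed face-to-face acts as the
# shared secret. A voice clone scraped from TikTok won't know it.
FAMILY_SECRET = "wolfie's fine, honey"  # never written down online

def verify_caller(answer: str) -> bool:
    """Return True only if the caller's answer matches the shared secret.

    hmac.compare_digest does a constant-time comparison, so the check
    itself doesn't leak how close a guess was.
    """
    return hmac.compare_digest(answer.strip().lower(), FAMILY_SECRET)

print(verify_caller("Wolfie's fine, honey"))       # True: legit caller
print(verify_caller("just wire the bail money"))   # False: likely a scrote
```

The point isn't the code, of course — it's the protocol: a human challenge-response that works over any phone line, no app required.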

But phphphphp sticks their fingers in their ears and yells, “LALALALA I’M NOT LISTENING”:

We are not yet at the point where a voice can be mimicked accurately from a voicemail or a clip on social media. … [It] is not that this will never be possible (it will) but rather it is not happening right now and this article is just victims and journalists doing exactly what all scam victims do: Look for a reason they fell for the scam.

AI in the news? Must be AI!

And Thomas Brewster—@iblametom—is similarly skeptical:

Interesting, but it’s not 100% clear that AI cloning was used in the cases described. Possible that people are just easier to dupe … over the phone.

Meanwhile, flangola7 channels Arnold’s hacked T-800:

Your foster parents are dead.

And Finally:

A funky tribute to Christine McVie

Previously in And Finally


You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or [email protected]. Ask your doctor before reading. Your mileage may vary. Past performance is no guarantee of future results. Do not stare into laser with remaining eye. E&OE. 30.

Richi Jennings

Richi Jennings is a foolish independent industry analyst, editor, and content strategist. A former developer and marketer, he’s also written or edited for Computerworld, Microsoft, Cisco, Micro Focus, HashiCorp, Ferris Research, Osterman Research, Orthogonal Thinking, Native Trust, Elgan Media, Petri, Cyren, Agari, Webroot, HP, HPE, and NetApp, on Forbes and CIO.com. Bizarrely, his ridiculous work has even won awards from the American Society of Business Publication Editors, ABM/Jesse H. Neal, and B2B Magazine.
