Scammers Get Even Sneakier with Voice-Cloning AI Technology

The rise of artificial intelligence has brought many benefits to our daily lives, but it has also opened new avenues for scammers to exploit innocent people. One of the latest and most concerning scams uses voice-cloning AI tools to mimic the voices of victims' relatives, making it sound as though a loved one urgently needs financial help.

Here's how the scam works:

You receive a phone call, and the voice on the other end sounds exactly like your grandson, granddaughter, or another relative. The caller claims to be in a dire situation, perhaps a car accident, an arrest, or a hospitalization, and urgently needs you to send money. They may even sound emotional, crying or begging for help.

Naturally, you want to help a loved one in their time of need. But before you do, take a moment to consider whether the call is legitimate. Scammers are becoming increasingly sophisticated in their tactics, and voice-cloning technology lets them sound almost identical to your family member.

Alvaro Puig, a consumer education specialist at the Federal Trade Commission, warned about this type of scam on the agency's site, noting that voice-cloning tools can generate a realistic-sounding call from as little as 3 seconds of recorded speech. All it takes is a short audio clip of the person's voice, which may already be available online or can be captured from a spam call, and a voice-cloning app such as ElevenLabs' VoiceLab.

This scam is particularly effective because scammers often target older adults, who may be less familiar with the technology and more vulnerable to emotional manipulation. They may also be less likely to question a call from someone claiming to be their grandchild or another relative in distress.

To protect yourself from this type of scam:

  • Be wary of unsolicited phone calls, especially if the caller asks for money or personal information.

  • Verify the caller's identity by asking a question only your real family member could answer.

  • Don't send money or provide personal information until you're sure it's not a scam.

  • If you're still not sure, ask another family member to call the person directly to confirm the story.

In conclusion, the rise of voice-cloning AI tools has made it easier for scammers to prey on unsuspecting victims, particularly older adults. It's crucial to stay vigilant and verify a caller's identity before sending any money or personal information. By staying informed and alert, we can protect ourselves and our loved ones from falling victim to these scams.
