
AI Voice Scams 2024: A growing threat to your security

Written by Nimblr Security Awareness | Sep 16, 2024 4:00:00 AM

How do AI voice scams work in 2024?

The availability and sophistication of AI tools have changed the playing field for cybercriminals. Previously, these scams were carried out via text, voice or email, but fraudsters can now use AI-cloned voices of the victim's loved ones. With inexpensive and easy-to-use tools, they can create customised messages through calls or voicemails asking for financial help.

McAfee's research found that 53% of adults share their voice online at least once a week. While this may seem harmless, our digital footprint and what we share online can give cybercriminals the information they need to target our friends and family. With just a few seconds of audio from an Instagram Live video, a TikTok post or even a voice message, fraudsters can create a believable clone that can be manipulated to suit their needs.

The impact of AI Voice Scams in 2024

A quarter of adults surveyed globally have experienced an AI voice scam, with one in ten having been personally targeted. In India, 47% of respondents said they had either been a victim themselves (20%) or knew someone who had been (27%). The US comes second, with 14% saying it had happened to them and 18% saying it had happened to a friend or relative.

Protect yourself from AI Voice Scams in 2024

  1. Think before you share: Limit your posts to friends and family through social media privacy settings.

  2. Use monitoring services: Choose services that alert you if your personal information appears on the Dark Web.

  3. Agree on a code word: Create a code word that only you and your loved ones know, and always ask for it when someone claiming to be them requests help.

  4. Question the source: Ask direct questions that can expose the scammer, such as something only the real person would know.

  5. Don't let emotions take over: Take a step back and ask yourself whether it really sounds like the person before you act.

By being vigilant and proactive, we can protect ourselves and our loved ones from this growing threat in 2024. Get protected today: contact us now!