The availability and sophistication of AI tools have changed the playing field for cybercriminals. Previously, these scams were carried out via text, voice or email, but now fraudsters can use AI-cloned voices of a victim's loved ones. With inexpensive, easy-to-use tools, they can create customised messages through calls or voicemails asking for financial help.
McAfee's research found that 53% of adults share their voice online at least once a week. While this may seem harmless, our digital footprint and what we share online can give cybercriminals the information they need to target our friends and family. With just a few seconds of audio from an Instagram Live video, a TikTok post or even a voice message, fraudsters can create a believable clone that can be manipulated to suit their needs.
A quarter of adults surveyed globally have experienced an AI voice scam, with one in ten having been personally targeted. In India, 47% of respondents said they had either been a victim themselves (20%) or knew someone else who had been (27%). The US comes second, with 14% saying it had happened to them and 18% saying it had happened to a friend or relative.
By being vigilant and proactive, we can protect ourselves and our loved ones from this growing threat in 2024. Get protected today: contact us now!