What was once a far-fetched idea out of a sci-fi movie is now a reality: artificial intelligence is part of our lives, and it will only grow from here. AI offers many perks and can enhance your skills. Unfortunately, it also empowers people with malicious intentions. That's where AI scams come in.
AI Scams
This terrifying new kind of scam involves using AI to clone the voice of your loved ones. All a scammer needs is a short audio clip of your family member's voice, which they could get from content posted online, and a voice-cloning program. When the scammer calls you, they'll sound just like your loved one.
During the call, the voice of your loved one will sound panicked, telling you they've gotten into some kind of trouble, such as being arrested or getting into a car accident, and that you can help them by sending money. The ruse is becoming more and more convincing: victims report reacting with visceral horror on hearing loved ones in danger. In one instance, a mother from Arizona received a phone call from a stranger claiming he had kidnapped her daughter after a car accident. The call featured her daughter's voice screaming and asking for help, and the scammer then demanded a $1 million ransom.
This dark phenomenon is on the rise, and all a scammer needs is a voice sample, which can be taken from a video posted online, and some basic AI tools and software that are accessible to everyone. Scammers tend to target older people, especially grandparents, as they are more vulnerable to this type of deception.
How The Technology Works
Essentially, scammers take anywhere from 30 seconds to a few minutes of a person's voice, which can be found on Facebook, Instagram, or YouTube, and upload it to a website. The AI software then analyzes everything that makes that voice unique: age, accent, gender, regional differences, and every small detail. The result goes into a vast database of voices the software has analyzed. After that, if the clone is asked to say, for example, the word "hello," it knows how that person would say it, down to how each phoneme might sound. Using these online tools, a scammer can type in "Hi, I'm in danger" and have the cloned voice say it.
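The pipeline above can be sketched in a few lines of code. This is a deliberately simplified illustration of the steps (sample, analyze, synthesize); the class and function names are hypothetical stand-ins and do not refer to any real voice-cloning API.

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for what real voice-cloning tools do
# internally; none of these names refer to a real library or service.

@dataclass
class VoiceProfile:
    """Features a cloning model might extract from a short sample."""
    pitch_hz: float                      # average pitch of the speaker
    accent: str                          # regional accent label
    phoneme_styles: dict = field(default_factory=dict)  # per-phoneme quirks

def analyze_sample(audio_seconds: float) -> VoiceProfile:
    """Steps 1-2: a clip of roughly 30 seconds or more is enough
    for the tool to build a profile of what makes the voice unique."""
    if audio_seconds < 30:
        raise ValueError("most tools want at least ~30 seconds of audio")
    # Placeholder values standing in for real signal analysis.
    return VoiceProfile(pitch_hz=180.0, accent="en-US",
                        phoneme_styles={"h": "breathy", "el": "rounded"})

def synthesize(profile: VoiceProfile, text: str) -> str:
    """Step 3: any typed text is rendered in the cloned voice
    (represented here as a labelled string, not actual audio)."""
    return f"[{profile.accent}, {profile.pitch_hz:.0f} Hz] {text}"

profile = analyze_sample(45)
print(synthesize(profile, "Hi, I'm in danger"))
```

The point of the sketch is how little input the attack needs: one short public clip is enough to produce arbitrary new sentences in the victim's voice.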
How To Protect Yourself
The US Federal Trade Commission (FTC) published a blog post with tips on how to avoid and protect yourself from such scams. Since there still aren't any official regulations on the creation of deepfake content, you should keep a few things in mind if you ever receive one of these phone calls.
Don’t trust the voice, no matter how accurate it sounds. Call the person who supposedly contacted you and verify the story. Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.
Scammers will ask you to pay or send money in ways that make it hard to get your money back. If the caller asks you to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, beware: those are classic signs of a scam.
Make sure to warn your loved ones and share this article with them, especially older family members such as parents and grandparents, as they are the primary targets of this scam.
If you spot a scam, here are some police portals to report it in the UAE: https://u.ae/en/information-and-services/justice-safety-and-the-law/cyber-safety-and-digital-security/report-cybercrimes-online
Stay updated on all of the latest news by subscribing to the ITP Live newsletter below or by enabling push notifications.