AI Makes Voice Cloning Scams More Convincing


Source: ITRC | Published on March 27, 2024


With the rise of artificial intelligence (AI)-fueled scams, how do you know whether the person on the other end of a phone call is friend or foe? Voice cloning scams have been around for years, but the cons continue to evolve as AI improves and becomes more accessible and easier to use.

One recent high-profile voice cloning scam targeted New Hampshire residents with AI-generated robocalls mimicking U.S. President Joe Biden’s voice and telling voters to “save your vote for the November election.” A more extreme example involved a business in Asia, where cybercriminals used real-time voice cloning and deepfake video to stage a video call in which fake executives instructed a team member to wire $25 million.

AI can be used not only to clone the voices of celebrities and public figures but also in everyday voice cloning scams that claim to be from friends, family members or co-workers. These AI-fueled scams come in many variations, ranging from election misinformation to kidnapping-for-ransom schemes.

Who Are the Targets?

Identity criminals are generally not interested in attacking individuals unless they have a high net worth or are employees with access to business information or systems. Cybercriminals prefer attacks they can automate and use against large groups of people, like mass phishing campaigns. Voice cloning scams are the opposite: targeted, manual and aimed at one victim at a time. That is why businesses are the most likely targets, along with high-profile people who have obvious financial resources.

Identity criminals increasingly find their targets by using AI programs to search the internet for information about people and businesses, including audio or video posts on social media or the web, along with personal details that can be used to make the calls to victims more convincing.

What Is the Scam?

Scammers use AI tools to clone the voices of people they find on social media or the web, then call the targets’ family, friends or co-workers. AI tools need as little as three seconds of audio to create a realistic (enough) clone of a voice. Criminals can also spoof a phone number so the call appears to come from someone the victim knows. With the same tools, they can layer emotions such as laughter or fear into the cloned voice and add background sound effects like a subway station, an airport or a car crash. The technology is advanced enough that scammers can even adjust the voice’s accent and apparent age.

What They Want

Just as in traditional “Grandparent” or “Business Email Compromise (BEC)” scams, cybercriminals use a variety of tactics to create a sense of panic or urgency, such as claiming a loved one is in danger or that an important vendor must be paid NOW! The criminals hope to scare people into sending money or sharing business or personal information that can be used in another identity crime.

How to Avoid Voice Cloning Scams

  • Hang up and don’t panic. Bad actors count on your fear or sense of duty to get you to take an action you otherwise would not. If you have any doubt about who is calling and what they are asking you to do, hang up, collect your thoughts, and then contact the person who supposedly called you, using a number you know is theirs, to verify the situation. If you cannot reach them, check on them through other family members, friends or co-workers (if the call is business-related).
  • Be vigilant on every phone call, even if you recognize the voice. Listen for odd statements, questions or requests, especially for money or for personal or business information. If something seems off, ask a question only the real person could answer.
  • Avoid personalized voicemail greetings. They give bad actors easy access to a sample of your voice. Instead, use the automated greetings offered on mobile devices and office phone systems.