
AI voice cloning creates security vulnerability


According to McAfee, AI technology is driving a surge in online voice scams, and as little as three seconds of audio is enough to clone a victim's voice.


McAfee surveyed 7,054 people across seven countries and found that one in four adults had experienced some form of AI voice scam: 10% said they had been defrauded themselves, and 15% said someone they knew had been. 77% of victims reported losing money as a result.

In addition, after in-depth research into AI voice cloning technology and its use by cybercriminals, security researchers at McAfee Research Labs drew the following conclusions.

Scammers use AI technology to clone voices

Everyone's voice is unique, the spoken equivalent of a biometric fingerprint, which is why hearing someone speak is a widely accepted way of establishing trust.

But with 53% of adults sharing voice data online at least once a week (through channels such as social media and voicemail), and 49% sharing it up to 10 times a week, cloning someone's voice is now one of the most powerful tools in a cybercriminal's arsenal.

As artificial intelligence tools rapidly proliferate and mature, it is easier than ever for criminals to manipulate images, videos and, perhaps most disturbingly of all, the voices of friends and family.

McAfee's research shows that scammers are now using AI to clone a voice and then send fake voicemails, or call the victim's contacts directly, pretending to be in trouble. With 70% of adults saying they would struggle to distinguish a cloned voice from the real thing, it is no wonder the technique is growing in popularity among criminals.

45% of respondents said they would respond to a scam voicemail or voice note purporting to be from a friend or family member, especially if it appeared to come from their partner or spouse (40%), a parent (31%) or a child (20%).

Victims of AI voice fraud have suffered heavy losses

Parents over the age of 50 are the group most likely to respond to a message in their child's voice, at 41%. The messages most likely to elicit a response are those claiming the sender has been in a car accident (48%), been robbed (47%), lost their phone or wallet (43%) or needs help while traveling abroad (41%).

And victims who fall for AI voice fraud often face heavy losses: more than a third reported losing over $1,000, and 7% were defrauded of $5,000 to $15,000.

The survey also found that the rise of deepfakes and disinformation has made people warier of what they see online, with 32% of adults saying they trust social media less than ever.

McAfee Chief Technology Officer Steve Grobman said: "Artificial intelligence presents incredible opportunities, but as with any technology, there is always the possibility that it will fall into the wrong hands and be abused. That is what we are seeing today, as easy-to-use AI tools help cybercriminals scale fraud in increasingly convincing ways."

Voice cloning is getting easier

As part of McAfee's effort to analyze and evaluate this emerging trend, its researchers spent three weeks investigating the accessibility, ease of use and effectiveness of AI voice cloning tools, and found more than a dozen free tools available online.

Both free and paid tools are readily available, and many require only basic experience and expertise to use. With one tool, just three seconds of speech was enough to generate a clone that matched the original voice 85% of the time, and accuracy could be improved further with more effort.

By training the data model themselves, McAfee researchers achieved a 95% voice match using only a small number of audio files.
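McAfee has not published how it scored these matches, but speaker-verification research commonly quantifies voice similarity as the cosine similarity between speaker embeddings. Below is a minimal sketch of that idea, assuming the open-source resemblyzer package (not McAfee's tooling) and two placeholder audio files, real.wav and clone.wav:

```python
# Sketch: score how closely a cloned clip matches the real speaker's voice.
# Assumes `pip install resemblyzer` (an open-source speaker encoder, not
# McAfee's internal tooling); the file names are placeholders.
from pathlib import Path

import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

def speaker_similarity(path_a: str, path_b: str) -> float:
    """Cosine similarity between the speaker embeddings of two audio clips."""
    encoder = VoiceEncoder()
    emb_a = encoder.embed_utterance(preprocess_wav(Path(path_a)))
    emb_b = encoder.embed_utterance(preprocess_wav(Path(path_b)))
    return float(np.dot(emb_a, emb_b) /
                 (np.linalg.norm(emb_a) * np.linalg.norm(emb_b)))

if __name__ == "__main__":
    score = speaker_similarity("real.wav", "clone.wav")  # placeholder files
    # Scores near 1.0 mean the clips sound like the same speaker; an
    # "85% match" plausibly corresponds to a similarity around 0.85.
    print(f"speaker similarity: {score:.2f}")
```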

The more accurate the clone, the better a cybercriminal's chances of tricking the victim into handing over money or taking some other requested action. And because these scams exploit the emotional vulnerability inherent in close relationships, a scammer can net thousands of dollars in just a few hours.

The evolution of voice cloning technology

Grobman said: "Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with almost no effort, they can clone a person's voice and trick friends and family into sending money."

Grobman concluded: "The important thing is to remain vigilant and take proactive measures to keep yourself, your friends and your family safe. If you receive a call from a spouse or family member in distress asking for money, be sure to verify the caller's identity: use a previously agreed-upon code word, or ask a question only the real person could answer. Identity and privacy protection services can also help limit the digital footprint of personal information that criminals could otherwise use to craft a convincing story when cloning a voice."

Using the cloning tools they discovered, McAfee researchers found it was easy to replicate accents from around the world, whether the target was from the United States, the United Kingdom, India or Australia, although more distinctive voices were harder to clone. For example, people who speak at an unusual pace, rhythm or style take more effort to clone accurately and are therefore less likely to be targeted.

Overall, however, the research team believes that artificial intelligence has changed the game for cybercriminals: the barrier to entry has never been lower, which means committing cybercrime has never been easier.

AI voice cloning prevention tips

Agree on a voice code word. Set a verbal code word with your children, family members or close friends that only they know, and make a plan to always ask for it whenever they call, text or email for help. This is especially important for older adults and children.

Always question the source. Whether it's a call, text or email from an unknown sender, or even a call from a number you recognize, stop and think when you receive a distress message: does this really sound like them? Is this something they would ask of you? Hang up and call the person back directly, or otherwise verify the message's authenticity before responding, and certainly before sending money.

Click and share with caution. Who is in your social media network? Do you really know and trust them? Be mindful of your online friends and connections: the wider your network and the more content you share, the greater the risk of your identity being maliciously cloned.

Use an identity monitoring service. Such services help ensure your personally identifiable information is not exposed, and can notify you if your private data ends up on the dark web. Staying in control of your personal data makes it harder for cybercriminals to impersonate you.

