
AI-based fraud on the rise globally

2 min read
January 22, 2024

Russia becomes the latest country to express concern over bank scams as global organizations race to share ways users can protect themselves from sophisticated voice fakes. With the latest artificial intelligence technology, a human voice can be cloned from just a three-second audio clip of the person speaking.

Last week, Russia’s Central Bank raised public concerns about fraudsters who are increasingly faking the voices of relatives and friends to defraud potential victims over the phone.

RIA Novosti, Russia’s state-owned news agency, reports that scammers are increasingly imitating voices with special programs, drawing on a victim’s online digital portrait to convince the person to hand over sensitive information or make money transfers.

“We recommend that people be careful when posting personal and financial information on social networks and other resources,” the Bank warns. “Never send information from documents via instant messengers or social networks, and do not enter your data on dubious sites. Do not make any money transfers at the request of callers.” If there are any doubts, the Bank adds, call the person directly to confirm.

The Central Bank has also published a list of phrases that telephone scammers commonly use. Upon hearing them, recipients should immediately hang up. The phrases include “a loan application has been submitted”, “Central Bank employee”, “special or secure account” and more.

A story in the Financial Times last week echoed the concern: data from Cifas, a not-for-profit fraud prevention service in the UK, shows that the use of AI tools to fool bank systems increased by 84 percent over 2022.

Experts warn that systems available on the dark web, including WormGPT, FraudGPT, and DarkBART, let criminals offer malware-writing services or generate advanced phishing emails. “It can be hard to tell the authentic from the artificial these days,” Natalie Kelly, Chief Risk Officer for Visa Europe, told the FT.

With the latest technology, all that is needed to replicate a human voice is a three-second audio clip of a person speaking, often lifted from content posted on social media. An AI voice-cloning program then mimics the voice to sound just like the original speaker.

Targets who answer the phone can also be recorded, giving scammers even more voice material to exploit.

According to a report published last year by Seattle-based startup Hiya, respondents in 30 of 39 countries surveyed reported an increase in scam and nuisance calls.

The biggest increases were seen in Puerto Rico (375%), the Czech Republic (175%), and Australia (175%). The company, which sells spam-blocking software, says it identified up to 70 million spam and fraud calls every day in the second quarter of 2023.

Hiya also identified the five most common scam tactics: impersonating Amazon employees, offering fake insurance and Medicare policies, posing as credit card company representatives, pitching cryptocurrency deals, and pretending to be a loved one in trouble using AI-replicated voices.

American software company McAfee came to a similar conclusion last year with its “Artificial Imposters” report, based on a survey of 7,000 people in the U.S., the U.K., France, India, Germany, Australia and Japan.

The survey found that a quarter of respondents had experienced some kind of AI voice scam: one in 10 had been targeted personally, and 15% said it happened to someone they know. Among the victims, 77% lost money as a result.

Furthermore, 70% of respondents said they couldn’t tell the difference between a real voice and an AI clone. For their part, McAfee researchers found that voice-cloning technology could replicate accents from anywhere in the world, though distinctive vocal patterns, such as a person’s pace, were harder to copy.

The issue has become so serious and widespread that the US Federal Trade Commission ran a “Voice Cloning Challenge” on its website this month. Submissions were asked to propose ways to limit the use of voice-cloning software, solutions for detecting cloned voices, and methods for determining whether an audio clip contains a cloned voice. The challenge offered $25,000 to the winner.

Scott Murphy

Scott is a journalist for Newsendip.

He is American and has been living in Hong Kong for years. He has extensive experience as a lifestyle journalist, interviewer and TV producer. His stories have also appeared in other media such as CNN, The Hollywood Reporter, and the South China Morning Post.