

AI swindles 1.73 million yuan! A company was tricked into transferring money by a forged boss's voice; the stolen funds changed hands several times and the scammers vanished without a trace

Latest update time: 2019-09-04
Guo Yipu from Aofei Temple
Quantum Bit Report | Public Account QbitAI

"Come to my office."

"Who are you?"

"What? Can't you tell who I am?"

"Is it... Mr. Zhang?"

Then "Mr. Zhang" on the other end of the phone asks you to find a colleague in the finance department and, under the pretext of "accompanying a client" or "contacting senior management", asks you to transfer money to an unfamiliar account...

Have you ever received a call like this? The key to the scam is getting you to mistake the caller's voice for your boss's.

Fortunately, most people do not mishear the voice and therefore are not deceived.

But now, with the help of AI speech synthesis technology, scammers can make their voice sound exactly like your boss's, and a British company was indeed fooled, defrauded of a hefty 220,000 euros (about 1.73 million yuan).

The incident

According to the Wall Street Journal, a client of Paris-based insurance company Euler Hermes was duped.

The client is an energy company based in the UK, a subsidiary of a parent company based in Germany.

One day, the CEO of the British subsidiary received a call from the "boss" of the parent company. Since the parent company is German, its boss speaks English with a German accent, and the caller's English carried exactly that accent, just like the real boss's, so the CEO concluded it was the boss himself.

The German "boss" said he was negotiating a deal with a "Hungarian supplier" that had to be settled through the British subsidiary: the subsidiary would pay the "Hungarian supplier" first, and the German parent company would reimburse it afterwards.

The German "boss" claimed the matter was urgent and demanded that 220,000 euros, equivalent to about 1.73 million yuan, be transferred to the "Hungarian supplier's" account within one hour.

After the money was sent, the "boss" called again later that day, saying that the German parent company had already transferred the 220,000 euros back to the British subsidiary and that the subsidiary now needed to make another transfer.

The people at the UK subsidiary felt something was wrong. On the one hand, the money the "boss" had transferred had not arrived; on the other hand, why did the "boss" call from Austria this time?

Although the British subsidiary was not fooled a second time, the 220,000 euros already transferred could not be recovered. Investigators found that the money went into the "Hungarian supplier's" Hungarian account, was then moved to Mexico and on to other places, and the police could not trace where the scammers were.

To keep its client confidential, Euler Hermes did not disclose which company was involved. In the end, it paid out the British subsidiary's claim.

Voice cloning technology is quite mature

The core problem encountered by this defrauded company was that the caller sounded too much like the real boss of the parent company.

The scammer is unlikely to have exactly that voice himself; to make the boss's voice appear and say specific words, voice cloning or speech synthesis technology would have been used.

Voice cloning technology already has many applications. For example, Sogou Input Method previously launched a "voice changing" feature: you speak a sentence into your phone, and when it is sent to your friends it comes out in the voice of a celebrity such as Lin Chiling, Stephen Chow, or Gao Xiaosong.

At this year's iFLYTEK new product launch conference, the synthesized voices of Luo Yonghao and Lin Chiling were broadcast live.

Speech synthesis is even more mature. The celebrity navigation voice packs in Baidu Maps and Amap, such as those of Lin Chiling and Yi Yang Qianxi, are trained on recordings of the celebrities' own speech and generated with speech synthesis technology.

Even the AI that made harassing phone calls, exposed at this year's 3.15 Gala, used speech synthesis to generate the callers' voices.

If you want to experience voice cloning for yourself, you can try Real-Time-Voice-Cloning, an open-source project that has recently been very popular on GitHub:

Real-Time-Voice-Cloning
https://github.com/CorentinJ/Real-Time-Voice-Cloning

The project provides pre-trained models. After downloading them and deploying locally, it takes only about 5 seconds of a reference recording to clone that voice and have it speak arbitrary text.
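For a sense of how little code this takes, here is a minimal sketch of driving the project's pipeline from Python, modeled on the repository's demo_cli.py script: embed a short reference recording of the target speaker, synthesize a mel spectrogram for arbitrary text, then vocode it to a waveform. The checkpoint paths, the reference file, and the example sentence are placeholders, and exact paths or function signatures may differ between versions of the project.

    from pathlib import Path

    import numpy as np
    import soundfile as sf

    # Modules from the Real-Time-Voice-Cloning repository
    from encoder import inference as encoder
    from synthesizer.inference import Synthesizer
    from vocoder import inference as vocoder

    # Load the three pre-trained models (paths are placeholders; see the
    # repository's README for where the downloaded checkpoints should live).
    encoder.load_model(Path("encoder/saved_models/pretrained.pt"))
    synthesizer = Synthesizer(Path("synthesizer/saved_models/pretrained"))
    vocoder.load_model(Path("vocoder/saved_models/pretrained/pretrained.pt"))

    # A few seconds of reference audio from the target speaker is enough
    # to compute a speaker embedding.
    ref_wav = encoder.preprocess_wav(Path("reference.wav"))
    embed = encoder.embed_utterance(ref_wav)

    # Synthesize a mel spectrogram for arbitrary text in the cloned voice.
    text = "Hello, this sentence was never actually spoken by the reference speaker."
    specs = synthesizer.synthesize_spectrograms([text], [embed])

    # Turn the spectrogram into a waveform and save it.
    generated_wav = vocoder.infer_waveform(specs[0])
    generated_wav = np.pad(generated_wav, (0, synthesizer.sample_rate), mode="constant")
    sf.write("cloned.wav", generated_wav.astype(np.float32), synthesizer.sample_rate)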

Precisely because voice technology is easy to obtain, easy to use, and works well, it was inevitable that bad actors would set their sights on it.

If the voice is fake, how about video verification?

In the past, telecom fraud was carried out through text messages and WeChat messages, and bank staff would remind potential victims to confirm by phone.

Now, I am afraid a phone call is no longer enough; you may need a video call to confirm it is the real person. However, some AI technologies can now not only synthesize a specific person's voice, but even lip-sync and generate video driven by that voice.

You can make a static photo of a person move and say specific words. The paper "Realistic Speech-Driven Facial Animation with GANs" from Imperial College and Samsung takes static photos of celebrities and directly generates videos of them speaking and singing.

In the demo video, the facial expressions and lip movements look almost seamless.

In one demo, the synthesized face says Apple's closing price that day was $191.4; even if the figure were changed to $182.2, you could not tell the video was fake.

You can also make the person in a photo talk with rich expressions and movements. The technique from the paper "Few-Shot Adversarial Learning of Realistic Neural Talking Head Models", by Samsung and Russia's Skolkovo Institute of Science and Technology, brings the static Mona Lisa to life.

Bear in mind that many similar AI technologies are open source. If they are put to malicious use, such as misleading the public or committing fraud, the consequences can be serious.

As a last resort, use AI to catch the AI scammers

Whether they process voice, images, or video, the new capabilities built on these technologies are generally intended to help people save time, have more fun, and so on.

However, there are also plenty of malicious uses. From AI face-swapping apps that can ruin a person's reputation, to face recognition that can infringe on privacy, to today's AI scams, technology always cuts both ways.

As former AAAI President Subbarao Kambhampati commented upon seeing the news:

I am shocked that voice technology can be used for such evil purposes! Didn’t Baidu clearly tell us that voice technology is used to allow busy mothers to sing lullabies to their babies?

Some people also suggested that using AI to fight AI might be a good approach:

We need an AI-assisted cybersecurity framework to address AI-driven cybercrime

Finally, if you received a call that sounded exactly like your boss, how would you tell whether it was really him?

Reference link:
https://www.wsj.com/articles/fraudsters-use-ai-to-mimic-ceos-voice-in-unusual-cybercrime-case-11567157402

— End —


