Scammer uses Deepfake voice to steal ~RM1 million


If you thought deepfake videos created from a single photo were bad, wait till you hear about this. A scammer managed to trick an employee into transferring USD$243,000 (~RM1,022,544) by deepfaking the voice of the company's CEO.

The victim thought he was talking to the company's CEO, saying the voice he heard had the same German accent and the same lilt. The scammer called the victim three times: once to ask for the transfer, once to claim that the money had been reimbursed, and once more to ask for additional funds. However, after the first two calls, the victim grew wary when he noticed that the calls came from an Austrian number.

Euler Hermes fraud expert Rüdiger Kirsch told WSJ that this was the first time the company had dealt with a crime involving AI. In recent months, deepfakes have become more prevalent thanks to commercially available software on the market.

This issue is particularly scary as the technology can be used not only for defamation and blackmail for political gain, but also for impersonation to gain security clearance, identity theft and more. It is rumoured that in a few years, only a one-second clip of someone's voice will be needed to generate full sentences, and it was recently found that only a single photo is needed to create a full video of someone. What do you think? Keep up with the latest global tech news on TechNave.com!