In South Korea, financial fraud using deepfake videos featuring celebrities is on the rise. Deepfake content consists of images or videos manipulated with artificial intelligence (AI) so that the digitally generated footage is difficult to distinguish from the real thing.
Earlier this year, a group of investment fraudsters in South Korea used deepfake videos to deceive more than 100 people into investing, causing losses of hundreds of millions of Korean won. One of the scammers, who posed as a group manager, deceived his victims by claiming to be a close aide of "Battery Man" Park Soon-hyuk, a YouTuber well known among electric car battery investors. The manager claimed he would invest the funds provided on behalf of group members.
The fraud gang also uploaded a video online for investors in which the actors Jo In-sung and Song Hye-kyo appeared to praise the unauthorized trading group for having "donated" part of the investment proceeds. Many investors, including one victim in her 60s, believed the scheme was legitimate after seeing the video, because famous stars seemed to be involved.
Eventually, however, it was all revealed to be a hoax. When the victims demanded refunds, the management team collected "commissions and taxes" and then disappeared.
The video was a deepfake, using AI technology to make it look as though the actors were speaking in the clip. The manager was not Park Soon-hyuk's aide, and his claimed 800% return on investment was likewise an illusion.
However, concerns about the use of deepfake videos for financial fraud are not limited to South Korea.
According to media reports, earlier this month an employee at a Hong Kong company was tricked into transferring HK$200 million (US$25.5 million) after being deceived by a deepfake video of the company's financial officer.
In December, the likeness of Singaporean Prime Minister Lee Hsien Loong was used in a scam; in November, Australian entrepreneur Dick Smith's likeness was used in a deepfake video. Both videos depicted the figures being interviewed to promote an investment platform, and both led to actual fraudulent losses.
Unfortunately, there is no way to prevent financial fraud involving deepfake content other than simply "not being fooled."
Measures such as individually flagging deepfake content are being discussed, but they would be useless if criminals simply removed the flags.
Many people believe the most effective way to prevent deepfake video crimes is detection before the content spreads, but this approach has limits because detection technology cannot keep pace with the rapid development of generation technology.
Some argue that policies against this new type of fraud need to improve, because it is nearly impossible for victims to recover their losses from the perpetrators.