
Boardroom used in deepfake video conference scam, Hong Kong 2024

Deepfake CFO Call: How Fraudsters Stole $25 Million in Hong Kong

By
Susanne Sperling
Published
February 6, 2024 at 10:00 AM

In February 2024, a finance employee at a multinational company in Hong Kong transferred HK$200 million — approximately USD 25.6 million — to fraudsters who had used deepfake technology to impersonate the company's CFO and multiple senior colleagues in a live video conference call. The case is the first confirmed large-scale corporate fraud in which AI-generated video deepfakes were used to deceive employees in real time.

A call that looked real

The employee received an email that appeared to come from the company's UK headquarters, instructing him to initiate a series of large transfers. Suspicious, he joined a video conference to verify the request. On screen, he recognised the CFO and several colleagues — all of whom, he later told police, appeared and sounded completely normal.

None of the people on the call were real. Hong Kong police confirmed in a press briefing on 5 February 2024 that every participant had been digitally recreated using publicly available footage of the real employees. The fraudsters had synthesised voice and video in real time.

In a series of 15 transactions, the employee transferred the funds to five different local bank accounts. He reported the fraud one week later, after checking with his company's headquarters directly.

The investigation

The case was made public by Acting Senior Superintendent Baron Chan Shun-ching at a press conference at Hong Kong Police Force headquarters. No arrests had been made at the time of the announcement, though investigators confirmed they were examining the bank accounts involved.

Chan described it as the first case of its kind in Hong Kong involving a multi-person deepfake video call. Previous deepfake fraud cases in the region had typically involved a single person on screen, often using still images or pre-recorded clips. This case used synchronised, real-time video of multiple fabricated individuals.

How the deepfake was built

Investigators believe the criminals sourced existing video material — interviews, press conferences, internal recordings — to train AI models that could replicate the appearance and voice of specific named employees. The employee who made the transfers told police that he had initially been suspicious but became convinced by the realistic appearance of the call participants.

The fraud required advance knowledge of who worked at the company and in what roles. Police said the initial phishing email was likely used to identify targets with financial authorisation before the deepfake call was staged.

Why this matters for agent crime

The Hong Kong case represents a threshold: a crime in which AI systems were not used as supporting tools, but as the primary instrument of deception. The fraudsters themselves may never have interacted directly with the victim. Instead, an AI-generated representation did.

This is the structure of agent crime — where autonomous or semi-autonomous systems take actions that have direct legal and financial consequences, while the human operators remain one or more steps removed. The victim transacted with a machine. The machine was built to deceive.

The scale and execution suggest organised criminal groups with access to AI infrastructure, not opportunistic individuals. Hong Kong police confirmed they were investigating whether the same group had attempted similar scams elsewhere.

What happened to the money

As of the initial police disclosure in February 2024, the full HK$200 million had not been recovered. The five bank accounts used to receive the funds had been emptied and the transaction trail dispersed across the financial system. Hong Kong police liaised with banking partners and Interpol, but made no public announcement of recovery.

The victim company has not been publicly identified.

Read more

AI Voice Clone Called Her: The Jennifer DeStefano Virtual Kidnapping
Danish Banks Deploy Autonomous AI Agents Without Safety Standards
Criminals Use AI Agents to Find Software Vulnerabilities
AI Agents Find 77% of Software Vulnerabilities in Automated Attacks
