The Ultra-Realistic AI Face Swapping Platform Driving Romance Scams
Summary
Haotian is a Chinese-language AI face-swapping app that produces near‑perfect real‑time deepfakes for video calls. Marketed primarily via Telegram, the desktop tool offers granular facial controls and voice‑cloning features, and can stream its output into messaging and video platforms such as WhatsApp, WeChat, Telegram, Zoom and more.
Analysis by researchers and WIRED links Haotian to large volumes of payments, at least $3.9 million into crypto wallets, and shows ties to a scam marketplace and an escrow provider used by organised “pig butchering” fraud groups. The company says it targets streamers and denies promoting illegal activity; after WIRED contacted Telegram, Haotian’s main public channel and some accounts became inaccessible.
Haotian is sold by subscription (previously advertised at up to $4,980 a year) and has promoted services including same‑day on‑site installation in Southeast Asia. Security firms, UN researchers and crypto investigators warn that face‑swap and voice‑cloning tools like this are being incorporated into romance and investment scams, making video verification far less reliable.
Key Points
- Haotian creates highly convincing live face swaps and offers voice‑cloning and an AI support chatbot for real‑time calls.
- The service was marketed via Telegram and integrated with messaging platforms, making it easy for scammers to impersonate video callers.
- Cryptocurrency tracing links at least $3.9 million of payments to Haotian, with many transactions tied to scam marketplaces and questionable wallets.
- Researchers found marketing language and features that appear tailored for social engineering and “pig butchering” scams.
- After WIRED’s inquiry, Haotian’s main Telegram presence became inaccessible; the company denies promoting illegal use and says it will cancel accounts used for fraud.
- Practical defence tips include asking the person on video to wave their hands or perform unpredictable movements to spot glitches — though Haotian claims to have mitigations against such checks.
Context and Relevance
Deepfake tools have matured rapidly and are now inexpensive enough to be weaponised in large‑scale romance and investment scams. Haotian exemplifies how a commercial AI product can be adopted by criminal ecosystems — supported by payment and escrow services — to make fraud more convincing and lucrative.
This story ties into wider trends: the growth of fraud tech in Southeast Asia, the trade in stolen data and fake accounts, and the increasing difficulty of verifying identity online. Law enforcement, messaging platforms and users all face harder detection and prevention challenges as real‑time deepfakes improve.
Why should I read this?
Because if you or someone you know uses dating apps or investment chats, or receives unexpected romantic attention online, this is the short briefing you actually need. It explains how scammers now fake live video and voice so convincingly that the old safeguard of “see it on a video call” no longer guarantees authenticity, and it tells you what to watch for.
Author note
Punchy take: this isn’t just a creepy tech demo — it’s infrastructure for crime. Read it to know why video calls are no longer ironclad proof and why vigilance matters.