A new fraud kit was found to trick KYC verification systems on various financial platforms using AI-generated deepfakes and real-time voice alteration.
AI tools to crack KYC checks
A threat actor operating under the name Jikusu has allegedly been selling cybercrime tools designed to bypass the Know Your Customer (KYC) checks used by banks and crypto platforms.
The Jikusu tool is designed to defeat the identity-verification protocols of platforms such as Binance, Coinbase, Kraken, and OKX. It combines real-time face swapping, voice changers, and emulator support to enable attacks even on platforms that require ‘live’ selfies for verification.
Unlike earlier deepfake schemes that relied on pre-recorded media, the tool manipulates video in real time during live verification. It can feed the altered footage directly into browsers or verification apps, making the interaction appear authentic.
By integrating AI tools such as GFPGAN and facial mesh tracking, the kit maps detailed facial expressions to make the generated identity look more lifelike.
Attackers could present a completely fabricated identity during a verification call without raising immediate suspicion.
Security experts say this kind of capability could open the door to more sophisticated fraud in the future, including synthetic identity attacks, in which a bad actor blends real stolen data with forged details to create a new persona. These personas are then used to open bank accounts, move funds, or carry out further scams.
The timeline of AI-influenced scams
According to Deddy Lavid, CEO of blockchain security platform Cyvers, the case highlights the shortcomings of current KYC verification systems.
Almost $3.01 billion has reportedly been stolen in 2025's wave of crypto scams, the worst on record, with AI playing a major role in making these schemes easier to run. In the U.S. alone, nearly 160,000 crypto-related fraud complaints were filed in 2024.
One recent case addressed by the UK’s National Cyber Security Centre (NCSC) involved an AI-generated crypto expert running a YouTube channel with almost 100,000 followers. The channel published videos teaching viewers to use codes to unlock special features on TradingView, but the codes executed malware that stole their passwords, emails, and crypto wallet details.
AI can reportedly enable a single attacker to create and manage thousands of fraudulent operations, such as phishing messages, fake support agents, or even investment bots.
Deepfake scams reported in crypto
One major style of scam is known as “pig butchering,” a long-term process of building trust with a victim and convincing them to invest large sums of capital in platforms controlled by the scammer.