KYC, or “know your customer,” is a process designed to help financial institutions, fintech startups, and banks verify the identity of their customers. KYC authentication often involves “ID images,” or cross-checked selfies, used to confirm that a person is who they say they are. Wise, Revolut and cryptocurrency platforms Gemini and LiteBit are among those that rely on ID images for onboarding security.
But generative AI could cast doubt on those controls.
Viral posts on X (formerly Twitter) and Reddit show how, leveraging open-source and off-the-shelf software, an attacker could download a person’s selfie, edit it with generative AI tools, and use the fake ID image to pass a KYC test. There is no evidence that GenAI tools have been used to fool a real KYC system — yet. But the ease with which relatively convincing fake ID images can be created is a cause for concern.
Gaming KYC
In a typical KYC ID picture authentication, a customer uploads a picture of themselves holding an identification document — a passport or driver’s license, for example — that only they could have. A person — or an algorithm — cross-references the image with documents and selfies on file to (hopefully) prevent impersonation attempts.
ID image authentication has never been foolproof. Scammers have been selling fake IDs and selfies for years. But GenAI opens up a host of new possibilities.
Tutorials online show how Stable Diffusion, a free, open source image generator, can be used to create synthetic renderings of a person against any desired backdrop (e.g., a living room). With a little trial and error, an attacker can tweak the renderings so that the target appears to be holding an identity document. At that point, the attacker can use any image editing program to insert a real or forged document into the spoofed person’s hands.
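The final compositing step described above is as simple as pasting one image onto another. The sketch below, using Pillow, illustrates the idea with generated placeholder rectangles standing in for the portrait and the document — the filenames and coordinates are purely hypothetical:

```python
from PIL import Image

# Illustrative sketch only. In the workflow described above, the base
# image would be a Stable Diffusion rendering of the target and the
# overlay a real or forged ID; here both are blank placeholder images
# so the example is self-contained.
base = Image.new("RGB", (512, 512), color=(180, 160, 140))  # stand-in portrait
doc = Image.new("RGB", (160, 100), color=(240, 240, 240))   # stand-in ID card

# Paste the "document" roughly where the subject's hands would be.
base.paste(doc, (176, 360))
base.save("fake_kyc_selfie.png")
```

In practice, tutorials describe blending steps (matching lighting and adding shadows) on top of this basic paste, which is where most of the one-to-two days of trial and error reportedly goes.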
Now, getting the best results with Stable Diffusion requires installing additional tools and extensions and acquiring about a dozen images of the target. A Reddit user with the username _harsh_, who posted a workflow for creating deepfake selfie IDs, told TechCrunch that it takes about one to two days to create a convincing image.
But the barrier to entry is definitely lower than it used to be. Creating ID images with realistic lighting, shadows and environments used to require fairly advanced knowledge of photo editing software. That’s not necessarily the case now.
Feeding fake KYC images to an app is even easier than creating them. Android apps running on a desktop emulator such as BlueStacks can be tricked into accepting uploaded images instead of a live camera feed, while web apps can be fooled by software that turns any image or video source into a virtual webcam.
Growing threat
Some apps and platforms implement “liveness” checks as an added layer of identity verification. Typically, these involve a user taking a short video of themselves turning their head, blinking, or otherwise demonstrating that they are indeed a real person.
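To see why liveness checks catch the crudest attacks, consider a toy heuristic — not any vendor’s actual method — that simply requires some pixel motion between frames. A replayed still image produces none:

```python
import numpy as np

def naive_liveness_check(frames, motion_threshold=1.0):
    """Toy liveness heuristic: require some pixel motion between frames.

    Real systems rely on far stronger signals (challenge-response head
    turns, depth, texture analysis); this sketch only shows why a single
    replayed still image is the easiest case to catch.
    """
    diffs = [
        np.abs(frames[i + 1].astype(float) - frames[i].astype(float)).mean()
        for i in range(len(frames) - 1)
    ]
    return bool(max(diffs) > motion_threshold)

rng = np.random.default_rng(0)

# A "replayed" static image: every frame identical, zero motion.
static_frame = rng.integers(0, 255, (240, 320, 3), dtype=np.uint8)
static_feed = [static_frame] * 10

# A "live" subject: every frame differs (movement plus camera noise).
live_feed = [rng.integers(0, 255, (240, 320, 3), dtype=np.uint8) for _ in range(10)]

print(naive_liveness_check(static_feed))  # False: no motion at all
print(naive_liveness_check(live_feed))    # True: frames differ
```

The catch, of course, is that a deepfaked video contains plenty of motion and would sail past a check like this — which is exactly the weakness described next.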
But liveness checks can also be bypassed using GenAI.
Early last year, Jimmy Su, chief security officer at cryptocurrency exchange Binance, told Cointelegraph that deepfake tools today are sophisticated enough to pass liveness checks, even those that require users to perform actions such as head turns in real time.
The bottom line is that KYC, which was already hit or miss, could soon become effectively useless as a security measure. Su, for one, doesn’t believe doctored images and videos have reached the point where they can fool human reviewers. But it may only be a matter of time before that changes.