The identity verification requirements on crypto exchanges can apparently be circumvented with AI-generated ID cards, making things easy for criminals.
Anyone who wants to trade on crypto exchanges must identify themselves. This is known as authentication: proving one’s identity, usually with an ID card. For many “Know Your Customer” (KYC) procedures, the check ends there. What is not guaranteed is verification, i.e. confirming that the person and the data actually exist. With the help of artificial intelligence, these weaknesses can apparently be exploited: documents can be forged well enough to pass the screening procedures at crypto trading venues.
Providers like “OnlyFake” have turned this into a business. Images of fake passports and driver’s licenses from 26 countries are available for as little as $15. To make them look even more authentic, the documents are photographed lying on bed sheets or kitchen tables. The trick appears to work: “404 Media” reports having passed the KYC procedure at the crypto exchange OKX with one of these images.
On Telegram, users congratulate one another on outsmarting other well-known crypto platforms. According to OnlyFake’s owner, who goes by the name “John Wick”, Binance, Kraken, Bybit, Huobi and Coinbase have also been fooled.
The ID images are generated either from an original photo supplied by the buyer or with randomly generated data. Image metadata can optionally be forged as well: GPS location, date, time and recording device can be set as desired.
According to OnlyFake, no “false documents” are produced, as this would be “illegal”. The images are merely “templates” intended “for use in films, television shows and web illustrations”.
Representatives of the crypto industry have long warned about the risks and challenges posed by artificial intelligence. Jimmy Su, Chief Security Officer at Binance, has already pointed out weaknesses in the verification processes: “The hacker is looking for a normal picture of the victim somewhere on the internet. Based on this, they can use deepfake tools to produce videos to circumvent the process.” According to Su, the programs are now so advanced that they respond to “director’s instructions” in real time.
Other forms of fraud are increasingly relying on AI tools as well. One example is the so-called “giveaway scam”, in which deepfaked celebrities appear to promote supposed crypto giveaway campaigns. Michael Saylor has said that he has 80 such fake videos taken down every day.