Kaspersky has discovered that deepfake creation tools and services are available on darknet marketplaces, with prices for one minute of deepfake video ranging from $300 to $20 000, depending on the complexity of the project and the quality of the final product.

The services uncovered offer generative AI video creation for a variety of purposes, including fraud, blackmail, and the theft of confidential data.

Kaspersky analysed various darknet marketplaces and underground forums offering the creation of deepfake video and audio for a range of malicious purposes. In some cases, buyers can request deepfakes of specific targets, such as celebrities or political figures.

Cybercriminals use generative AI video for illegal activities in several ways. They can create deepfake videos or images to defraud individuals or organisations.

For example, they can create a fake video of a CEO requesting a wire transfer or authorising a payment, which can then be used to steal corporate funds. Deepfakes can also be used to produce compromising videos or images of individuals, which can be leveraged to extort money or information from them. Cybercriminals can likewise use deepfakes to spread false information or manipulate public opinion, for instance by creating a fake video of a politician making controversial statements in order to influence the outcome of an election.
Deepfake technology can also be employed to bypass verification in payment services by creating realistic fake videos or audio recordings of a legitimate account owner. These can be used to trick payment service providers into believing the fraudster is the actual account owner, granting access to the account and its associated funds.

“Increasingly, deepfakes are used in attempts at blackmail and fraud,” says Vladislav Tushkanov, lead data scientist at Kaspersky. “For example, the CEO of a British energy firm was tricked out of $243 000 by a voice deepfake of the head of his parent company requesting an emergency transfer of funds. As a result, funds were wired to the fraudster’s bank account.

“Suspicions were only raised when the criminal requested another transfer, but by then it was too late to recover the funds that had already been transferred,” Tushkanov says. “A similar case was reported in the UAE, where $400 000 was stolen in a scam that also involved a voice deepfake.

“It’s important to remember that deepfakes are a threat not only to businesses, but also to individual users – they can be used to spread misinformation, carry out scams, or impersonate someone without consent. Increasing your digital literacy level is key to countering these threats.”