The accessibility of generative AI tools has lowered the barriers for would-be criminals, while the transition to hybrid work models and geographically dispersed teams has expanded the attack surfaces they can exploit. In this context, the overlap of AI technology and the interests of financial leaders becomes increasingly significant, writes Ryan Mer, CEO of eftsure Africa.
Numerous financially motivated cybercrimes hinge on the manipulation of accounts payable (AP) staff and the evasion of traditional financial safeguards. As generative AI technologies advance, the art of deception within these crimes is poised to become more sophisticated than ever before.
Businesses could soon face a host of new challenges as criminals use generative AI models to make financial crime more efficient.
Deceptive ploys, amplified
In the realm of business transactions, generative AI is fuelling a new era of payment fraud, giving criminals advanced methods for crafting deceptive content, exploiting vulnerabilities, and deceiving individuals within payment systems.
Generative AI makes it far easier to craft remarkably convincing phishing emails, messages, and websites that mirror legitimate entities, deceiving individuals into divulging payment data and other sensitive information.
Auditory and visual fraudulent deception
Voice manipulation tools are becoming increasingly popular in the cyber-criminal arsenal. Empowered by advanced voice synthesis, fraudsters generate lifelike voice recordings that allow them to impersonate figures of authority, such as CEOs and heads of finance, coaxing victims into actioning unauthorised payments.
Deepfake realism is another alarming facet of generative AI, with the potential to fabricate realistic video footage that depicts falsified payment transactions or endorsements that reinforce social engineering tactics.
This contributes to a form of fraud known as instructive imitation, in which fraudsters mimic the genuine communication patterns of senior business figures, such as CEOs, using generative AI to send messages that coerce subordinates into making unapproved payments.
Generative AI also greatly enhances forgery capabilities, giving criminals the tools to produce counterfeit invoices and other payment-related documents authentic enough to dupe individuals and businesses into remitting funds to bogus accounts.
Further uses of generative AI for payment fraud may include attempts at biometric subversion through the fabrication of seemingly genuine biometric data. However, anti-spoofing technology is also progressing: Apple, for example, states that the probability of a random person unlocking another user's device with Face ID is approximately one in a million.
Additionally, generative AI can be used to generate vast volumes of username-password combinations, amplifying the effectiveness of credential stuffing attacks. Where users reuse credentials across accounts, including payment accounts, the risk of unauthorised access and fraudulent activity rises sharply.
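The defensive side of credential stuffing is worth illustrating. The sketch below is a minimal, hypothetical example of two basic countermeasures: locking an account after repeated failed logins, and rejecting passwords known to have appeared in breaches (the very credentials stuffing attacks replay). All names, thresholds, and the breached-password list are illustrative assumptions, not any vendor's actual implementation.

```python
# Hypothetical sketch: two basic defences against credential stuffing.
# Thresholds and the breached-password list are illustrative assumptions.
from collections import defaultdict

MAX_FAILED_ATTEMPTS = 5  # lock further tries after this many failures

# Illustrative set of passwords known to have leaked in past breaches;
# a real system would hash these and query a breach-notification service.
BREACHED_PASSWORDS = {"password123", "qwerty", "letmein"}

failed_attempts = defaultdict(int)  # username -> consecutive failures

def login_allowed(username: str, password: str) -> tuple:
    """Screen a login attempt before credentials are even verified.

    Returns (allowed, reason)."""
    if failed_attempts[username] >= MAX_FAILED_ATTEMPTS:
        # A burst of failures is the signature of an automated stuffing run.
        return False, "account temporarily locked: too many failed attempts"
    if password in BREACHED_PASSWORDS:
        # Reused, breached passwords are exactly what stuffing attacks replay.
        return False, "password appears in a known breach; reset required"
    return True, "ok"

def record_failure(username: str) -> None:
    """Count a failed login towards the lockout threshold."""
    failed_attempts[username] += 1
```

In practice the counter would be backed by persistent storage with a cooldown window, and the breach check would use a hashed, k-anonymised lookup rather than a plaintext set; the sketch only shows the shape of the control.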
Unravelling dark web origins
Where do all of these deceptive generative AI capabilities come from? They spread through the dark web, promoted and sold on illicit forums.
One of the most prominent examples of malicious generative AI is WormGPT, a tool designed to assist cybercriminals in their nefarious activities. It is labelled a black-hat alternative to popular AI models like ChatGPT, and it automates cyberattacks, including phishing and other criminal endeavours.
WormGPT is reportedly trained on large datasets of text and code, and it can generate realistic and convincing phishing emails, malware, and other malicious content.
Unprecedented digital threats
The dangers of WormGPT are significant. It can be used to generate sophisticated phishing emails that are more likely to trick users into clicking on malicious links or attachments, as well as create malware that is more difficult to detect and remove.
It can exploit vulnerabilities in computer systems to gain unauthorised access, alongside a multitude of other, ever-evolving threats. By employing intricate social engineering strategies and orchestrating business email compromise (BEC) scams, generative AI tools like WormGPT equip cybercriminals with the means to mimic trusted contacts, entice employees into divulging sensitive data, and run convincing large-scale phishing campaigns, all with one ultimate goal: to scam people and businesses out of their money.
While WormGPT might be a relatively new tool, discovered on 13 July 2023, it is clear that it has the potential to cause significant damage. Businesses need to be aware of the dangers of WormGPT and similar tools, and take steps to protect themselves.
Robust payment fraud prevention
While there is no silver bullet, many threats can be avoided with the correct operational and financial controls, as well as server, IT, and email monitoring processes. Because email accounts are conduits of sensitive information, BEC attacks are unlikely to subside, particularly in South Africa, which recorded the highest number of targeted ransomware and BEC attempts on the continent, according to an Interpol report.
To help minimise the risk of these attacks, firms should re-evaluate their manual, email-based processes and consider software solutions to digitise, automate and safeguard them. Financial controls should also extend to the payments themselves: independent real-time verification systems can cross-reference the payments an organisation is about to release against independently verified bank account details.
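The cross-referencing control described above can be sketched in a few lines. This is a hypothetical illustration only: the registry contents, supplier names, and function names are invented for the example and do not represent any vendor's actual product or API.

```python
# Hypothetical sketch of a pre-release payment check: each outgoing payment
# is cross-referenced against an independently verified registry of supplier
# bank details before funds are released. All data here is illustrative.

VERIFIED_ACCOUNTS = {
    # supplier -> (bank, account number), confirmed out-of-band with the payee
    "Acme Supplies": ("FNB", "62001234567"),
    "Beta Logistics": ("Standard Bank", "10098765432"),
}

def verify_payment(supplier: str, bank: str, account: str) -> bool:
    """Release a payment only if its details match the verified registry.

    Any mismatch -- for instance, an invoice quoting a 'new' account
    number -- returns False and should be held for manual review
    rather than paid."""
    return VERIFIED_ACCOUNTS.get(supplier) == (bank, account)
```

The design point is that the registry is populated independently of the invoice or email requesting payment, so a fraudster who compromises the email channel still cannot redirect funds without also defeating the out-of-band verification.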
As significant as these generative AI threats can be, they do not have to spell doom for businesses. The same technological advancements that empower criminals can also equip organisations with the tools to fight back. Investing in robust fraud prevention technology is accordingly an essential measure to protect financial systems and sensitive data.
By embracing cutting-edge solutions, businesses can bolster their defences to detect and thwart fraudulent activities in real-time. This not only safeguards their financial integrity but also preserves the trust of customers, partners, and stakeholders.