It’s the run-up to the year-end holiday season, when criminals ramp up their efforts to fleece us, and ISPA is warning of the growing problem of voice cloning.

South Africa’s official Internet Industry Representative Body (IRB) says that, with the SA Fraud Prevention Service (SAFPS) reporting a 356% rise in impersonation fraud, a serious issue is emerging that has so far been met with silence from the responsible entities.

Call it voice cloning, voice scamming, voice mimicking calls, or audio deepfakes; the overriding fact is that Artificial Intelligence (AI) can now convincingly mimic anyone’s voice. All it takes is a few minutes of recorded speech, easily sourced from the internet and social media.

ISPA earlier this year requested that the Independent Communications Authority of South Africa (ICASA) take action to prevent the unlawful use of mobile numbers. Voice cloning is the latest example of how we are losing trust in the phone numbers that convey information. ICASA and the National Consumer Commission (NCC) have thus far failed to announce any steps to address the general misuse of numbers.

“It’s been reported that up to 80% of South Africans struggle to differentiate between real and AI-generated content, including voice calls,” says Sasha Booth-Beharilal, ISPA chair.

With a convincing digital copy of someone’s voice, criminals can manipulate victims into sending money or revealing personal information. For example, a voice cloning victim could be tricked into thinking their spouse was calling to ask for the urgent transfer of a significant amount of money to deal with some supposed emergency.

As with any scam, ISPA says our own common sense is the most effective weapon we have against criminals cloning the voices of our friends, family, neighbours and colleagues.

ISPA provides South Africans with the following hints and tips to help prevent them from becoming voice cloning scam victims this holiday season:

Firstly, take independent steps to confirm who is phoning or leaving voice notes. That means calling the supposed person back on their usual phone number to confirm the current situation.

Secondly, if you don’t already have agreed safe words or fact-checking phrases with the person supposedly calling, ask them questions only they could answer. A pet’s name, a maiden name, a first car or similar would help determine identity here.

Thirdly, stay calm, be sceptical and pause – do not rush to act. Listen carefully for unnatural pauses and for overly lengthy, monotone sentences.

“AI doesn’t know everything, and this lack of intimate knowledge can be effectively turned against fraudsters using AI to deepfake our voices,” concludes Booth-Beharilal.