Cybersecurity expert Joy Ladegbaye has warned businesses about the growing threat of voice cloning scams, urging organisations to adopt Artificial Intelligence (AI)-powered call screening tools as a first line of defence.
According to her, voice cloning, a form of deepfake technology, enables fraudsters to mimic a person’s voice with alarming accuracy using just a few seconds of recorded audio from phone calls, videos, or voicemails.
In a statement, Ladegbaye observed that once such audio is acquired, scammers can generate fake audio messages or even make live calls that convincingly sound like trusted colleagues, managers, or clients.
The expert explained that the risk to businesses is particularly high because corporate environments often involve large financial transactions, sensitive data, and time-sensitive decision-making.
She said these are the conditions scammers exploit to pressure employees into acting without verification.
She said: “Imagine getting a call from your boss asking you to quickly transfer money to a supplier. You recognize the voice, so you don’t think twice. But later, you find out it wasn’t your boss — it was a scammer using AI to clone their voice. By then, the money’s gone, and the damage is done.
“This type of scam is already happening in the real world. In one case, a company manager received a call from what sounded like the CEO, urgently requesting a large fund transfer. The manager complied — only to later discover it was a scam.
“Businesses are prime targets because they often deal with large sums of money, sensitive information, and fast decision-making. Scammers know this and use voice cloning to create pressure and urgency, making employees act quickly without double-checking.”
According to her, “It’s not just about tricking one person — it’s about bypassing trust. When someone hears a familiar voice, they let their guard down.”
To counter the threat, Ladegbaye urged businesses to deploy AI-powered voice verification systems capable of detecting unusual speech patterns, identifying deepfake voices, and cross-checking calls against verified recordings.
She stressed the importance of training staff to verify any urgent or unusual request, regardless of how familiar the caller’s voice may sound.
Equally critical, she advised, is the establishment of secure communication channels for high-risk transactions such as money transfers.
She further advised organisations to limit the amount of their key executives’ voice data that is publicly available through interviews, podcasts, or voicemails, as scammers can exploit these recordings.
“Voice cloning scams are a real and growing threat in today’s digital world. The technology is improving fast, and so should our defences. By using AI tools and staying alert, businesses can protect themselves from falling victim to this sneaky form of fraud,” she said.