
CBA deploys AI bot army to disrupt and study scam attacks
Thousands of voice and message bots intercept scammers before they reach real customers.
Commonwealth Bank of Australia (CBA) has found a way to protect its customers from scams whilst collecting data on fraud schemes—through artificial intelligence (AI)-powered bots.
Working with security platform Apate.ai Pty Ltd., the lender now deploys 10,000 voice bots and 2,500 message bots to intercept scammers before they can reach real customers.
“For every minute that a scammer is on the line with a bot, or in a messaging conversation with a bot, they are not attacking an Australian,” James Roberts, CBA’s head of group fraud, told Asian Banking & Finance.
Each intercepted attempt not only prevents a scam but also feeds data into the bank’s anti-fraud systems, he said in a Microsoft Teams interview.
“One phone call might generate one piece of intelligence that then allows us to block 10 other scam attempts because of what scammers disclose to the customer on the line,” Roberts said. “It’s almost like a multiplier effect that eats at the very business model of the scammers.”
Before using bots, banks relied on analysts to track scammer behaviour. However, that approach lacked the scalability needed to combat evolving threats. “What the bots bring is the ability to do this at scale,” he pointed out.
Scalability was on Dali Kaafar’s mind when Apate.ai set out to develop the AI bots.
Kaafar, who is CEO and co-founder of the AI intelligence and fraud prevention company, said a 44-minute conversation he had with a scammer while on a picnic with his family was one of the inspirations for developing the AI bots.
“My kids had a lot of fun,” Kaafar said in the same videoconference. He described the scam conversation as a “fun comedy show” for his children. “As I was hanging up, I realized that this is 44 minutes of the scammer’s time that is not being put into action to scam others.”
But that was also 44 minutes of his own life that he could never get back. “Just like honeypots attracting malicious actors in networks, we wanted to create these believable personas of bots that scammers would engage with. Technology can do this at scale.”
The system has recorded calls as long as 54 minutes, helping banks learn scammers’ tactics and scripts. Roberts said message bots are especially valuable for tackling investment scams, which often occur through messaging platforms.
CBA is expanding its data collection efforts by working with other organisations, notably Telstra Group Ltd., Australia’s largest telecommunications company.
“They feed us indications for our joint customers when there’s potentially a scam call happening,” he said. “We can see someone on a potentially long, bad call, and then logging on to do a new high‑value payment. That gives us a signal to intervene.”
In November 2024, CBA introduced a mobile-behaviour scoring system to monitor the risk of fraudulent account creation and takeovers. These measures, combined with the bot network, form part of the bank’s evolving anti-fraud strategy.
“It’s definitely an arms race,” Kaafar said, noting that scammers are increasingly automating their attacks and leaning on social engineering, often using leaked data to craft believable pitches. “The key battleground will really be speed and flexibility.”
Both executives cited the need for collective industry action.
“One of our calls to action is that the bigger our bot army, the more effect it can have,” Roberts said. “We’re calling out to the rest of the banks and the rest of the telcos to essentially join [us], and have a bigger bot army, which then essentially multiplies the impact.”