Facing the growing threat from AI scams

Scammers are using artificial intelligence to improve their fraud techniques, which makes it increasingly important to educate investors about the potential risks.
A recent Ontario Securities Commission report tested simulated conventional and AI-enhanced investment scams, along with different protective measures, on 2,010 Canadians. Participants invested 22 per cent more in the AI-enhanced scams than in the conventional ones.
“What we’re definitely seeing is that there is an evolving level of sophistication when it comes to these scams,” says Meera Paleja, program head for behavioural insights at the OSC. “You can now generate videos with a few keywords that resemble a very legitimate investment opportunity.”
Such techniques include deepfakes, in which scammers use AI to generate convincing videos of well-known figures promoting fraudulent investments. Multiple Canadians have fallen victim to these scams, including one case in September, 2023, when a man from Barrie, Ont., lost $11,000 to scammers using a deepfake of Justin Trudeau.
Generative AI can make scam delivery channels such as e-mail spam and social media posts more effective by producing more convincing text, the OSC warned. It can even use publicly available information on victims to craft targeted messages.
“AI has really lowered the barrier to entry for creating high-quality scams,” says Laura Payne, chief executive officer at Toronto-based cybersecurity consultancy White Tuque Inc. “The availability of services using AI at low cost, or in many cases free to the public, makes them easily accessible.”
Some scammers also use AI as a marketing tool to make fraudulent investments more attractive, the OSC adds. One example is YieldTrust.ai, a fake trading bot that claimed to use “quantum AI” to deliver daily returns of 2.6 per cent.
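A return like that should fail a basic plausibility check. The short calculation below (an illustration, not part of the OSC's report) shows why: compounding 2.6 per cent every day for a year implies growth of well over ten-thousand-fold, far beyond anything a legitimate investment delivers.

```python
# Illustrative arithmetic only: why a claimed 2.6% *daily* return is a red flag.
# Compounding a daily rate r over 365 days multiplies the stake by (1 + r)^365.
daily_return = 0.026

annual_multiple = (1 + daily_return) ** 365

# The implied one-year growth is more than 11,000x the initial stake --
# an obviously impossible promise for any real trading strategy.
print(f"$1,000 would become roughly ${1000 * annual_multiple:,.0f} in one year")
```

Any pitch whose implied annual growth looks like this is advertising something no market can deliver.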
Investment scams often target the elderly, warns Jordan Schwann, managing director and portfolio manager at Toronto-based investment and wealth management company Newport Private Wealth Inc., describing the people who contact the firm for advice on suspect investment pitches.
“On average, they’re a bit older and less comfortable with technology,” he says. “That would probably be a common thread.”
Other vulnerable groups include recent immigrants and younger self-directed investors, whom the OSC report says tend to skew male.
Mr. Schwann says that part of a wealth management company’s value lies in its ability to warn clients about suspicious investments.
“That’s unlikely to happen if you’re using a robo-advisor or a do-it-yourself investment account,” he says. “You’re more on your own to make some of those decisions, and that probably leads to poor outcomes.”
The report highlights two main measures to reduce the risk. The first – education – warns the public about telltale scam indicators. These include non-authoritative sources, unknown authors and outlandish claims. A subset of this – inoculation – uses tests and games to train users in spotting fake investment proposals. This form of mitigation reduced the amount invested in AI scams by 10 per cent, the OSC found.
“As the regulator, we try to get out there and spread those education messages so that [investors] can learn the warning signs and help protect their money,” says Perry Quinton, manager of the investor’s office at the OSC.
Prevention is better than cure, she adds. “When you’ve got scam artists operating in jurisdictions all over the world, it’s pretty hard to go after them to get anything back, and usually the money’s gone.”
While education is critical, a second approach proved far more effective in the OSC’s tests. Using technology to flag potential scams as victims view information dropped the amount invested in scams by almost one-third in the OSC’s trial. The commission used a simulated browser plug-in to test this approach but said it didn’t know of any available real-world solutions.
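The OSC did not describe how such a tool would work, but a flagging layer of this kind could be as simple as pattern-matching a pitch against the warning signs the article lists. The sketch below is a hypothetical illustration, not any real product: it scans text for guaranteed outcomes, implausible daily returns and pressure language, and returns the warnings a plug-in might surface before the user acts.

```python
# Minimal sketch of a client-side scam flagger (purely illustrative -- the OSC
# tested a simulated browser plug-in and named no real-world product).
import re

# Hypothetical heuristics based on warning signs named in the article:
# guaranteed returns, outlandish daily-return claims, and FOMO pressure.
RED_FLAGS = [
    (re.compile(r"guaranteed", re.IGNORECASE),
     "promises a guaranteed outcome"),
    (re.compile(r"\d+(\.\d+)?\s*(%|per ?cent)\s*(daily|a day|per day)", re.IGNORECASE),
     "claims an implausible daily return"),
    (re.compile(r"act now|limited time|today only", re.IGNORECASE),
     "uses pressure / fear-of-missing-out language"),
]

def flag_pitch(text: str) -> list[str]:
    """Return the warning messages triggered by an investment pitch."""
    return [message for pattern, message in RED_FLAGS if pattern.search(text)]

pitch = "Our quantum AI bot delivers a guaranteed 2.6% daily return. Act now!"
for warning in flag_pitch(pitch):
    print("Warning:", warning)
```

A real tool would need far richer signals (sender reputation, deepfake detection, URL analysis), but even crude checks like these give a victim the pause the OSC's trial found so effective.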
Ms. Payne agrees that technology-based solutions have potential. “I would certainly welcome seeing some sort of technology in that space that helps interrupt and give people an opportunity to take a step back,” she says.
Ultimately, scams tend to prey on individuals’ emotions, the OSC report warned. Fear of missing out (FOMO) is a common tactic scammers use to pressure people into investing quickly.
Ms. Payne highlighted romance scams as another example, in which scammers defraud victims by pretending to be interested romantically in them. This kind of scam, also ripe for AI-driven enhancement, capitalizes on loneliness. It’s becoming a common technique for “pig butchers” who manipulate victims into making fraudulent investments.
Ms. Payne highlights the need to control one’s emotions in the heat of a transaction – commonly known as mindfulness – as a mitigation technique. The OSC’s report also mentioned this as a powerful weapon against scams.
“It’s about understanding and being able to see what is happening to yourself,” Ms. Payne says. “Then you take back control of making choices about how you respond to those emotions. That’s the path folks must follow to stop falling for a scam.”
