The recent increase in the scale and applications of artificial intelligence (AI) presents a range of new possibilities and potential risks for retail investors, according to the Ontario Securities Commission (OSC).
The report, Artificial Intelligence and Retail Investing: Use Cases and Experimental Research, developed by the OSC in collaboration with the Behavioural Insights Team (BIT), detailed the results of a behavioural science experiment on the role of AI in supporting retail investor decision-making.
Results showed that participants who received an investment suggestion from a human using an AI tool (a blended approach) adhered to it most closely, although the difference was not statistically significant.
Notably, there was no discernible difference in adherence to investment suggestions provided by a human or by an AI tool, indicating that Canadian investors may be receptive to taking advice from an AI system.
“This research highlights the opportunities AI can create for Canadian investors and market participants,” said Leslie Byberg, Executive Vice President, Strategic Regulation at the OSC.
“It is important that we are agile and able to harness these opportunities while ensuring investor protection remains at the forefront of how we regulate,” she said.
OSC researchers also examined the current investor-facing use cases of AI in Canada and abroad. In doing so, they identified three broad use cases: Decision support involves AI systems that provide recommendations or advice to guide investment decisions; Automation consists of AI systems that automate portfolio and/or fund (e.g., ETF) management; and Scams and fraud includes AI systems that either facilitate or mitigate scams targeting retail investors, as well as scams capitalizing on the “buzz” of AI.
The report detailed both benefits and risks of using AI.
For example, AI systems could give investors access to more affordable advice, but they may also provide advice that is biased, irrelevant, inappropriate, or inaccurate.
Because AI-enhanced scams and frauds may pose significant risks to investors, the OSC continues to research how AI is being used in such schemes, as well as ways to provide effective investor protection and potential mitigation strategies.
The OSC said that regulators are already proposing approaches to address these risks.
For example, in the United States, the Securities and Exchange Commission (SEC) proposed a rule that would require investment firms to eliminate or neutralize the effect of any conflict of interest, arising from the use of predictive data analytics and AI, that places the interests of the firm ahead of the interests of investors.
More broadly, industry regulators and stakeholders should seek to leverage data collected by investing platforms and investor-facing AI tools to investigate the extent to which these tools are resulting in positive or negative outcomes for investors.
The research builds on the OSC’s existing work on artificial intelligence and reinforces the value to regulators of using behavioural science as a policy tool.
The OSC said that as AI continues to advance in capabilities, more research is needed to help capital markets stakeholders better understand the implications for retail investors.