
How to assess AI's threats and opportunities

  • To summarise some of the risks presented by the misuse of AI
  • To list some of the ways AI can help businesses
  • To explain how AI might change the way financial services companies work
CPD: approx 30 min

At the Empowering Advice Through Technology conference earlier this year, Foster Denovo revealed that it was looking to develop a digital advice service.

Helen Lovett (second from right) speaking at the 2024 EATT conference in London (FT Adviser)

Chief operating officer Helen Lovett told delegates the firm was also working on new AI policies. 


And Ian McKenna, chief executive and founder of consultancy FTRC, warned that widespread use of AI in the independent advice process could render paraplanners redundant unless they became the tech go-to people at their firms.

This is why advice firms need to start thinking about how they might deploy AI and machine learning, even if they are not yet rushing to implement the technology.

Sloane comments: "AI is at a very early stage of development and maturity in the wealth and advice industry, albeit at a fast pace, and remains an exploratory opportunity for most firms rather than being deployed at scale or interacting with clients.

"That said, where firms really understand the AI technology they are or will be using and its purpose, they will adopt suitable control mechanisms to ensure its accuracy and operation.

"However, the regulatory environment and clarity on where the responsibility and liability for any future instances of AI harm may fall are still not entirely clear.

"Firms may be reluctant to go too far with specific uses of AI until further guidance on the liability of its use is provided.”

Things to watch out for

While the world of deepfakery may not yet have come to a Zoom meeting near you, there is always the possibility that advisers with well-known clients could end up on the wrong end of an AI-altered real-time video or voice call that tricks them into disclosing financial information to the person on the other side of the screen.

It sounds like Mission: Impossible, but the technology is learning so quickly that it could become mission: critical.

Sloane says: "Unfortunately, AI is the cyber criminal's new best friend, enabling bad actors to create scams, phishing attacks, and deepfakes quickly and easily and without having to be software developers."

There is also the potential for AI-driven trading to react to real-time market movements and escalate buying and selling patterns on an exchange so quickly that share prices are significantly affected.

Last year, a paper from the Wharton School of the University of Pennsylvania in the US, with input from the Hong Kong University of Science and Technology, suggested that AI-powered collusion in stock trading could harm price formation and undermine confidence in market efficiency.

The paper, "AI-powered trading, algorithmic collusion, and price efficiency", suggested: "Informed AI traders can collude and generate substantial profits by strategically manipulating low order flows, even without explicit co-ordination that violates anti-trust regulations."