Opinion  

'Lessons for advisers from the Post Office scandal'

Derek Bradley


This is where it gets complicated: the outcomes delivered to all those postmasters via the Fujitsu Horizon software could hit advisers in a similar way. 

Should the algorithm the adviser and their client relied upon prove, in five, 10 or 15 years' time, to have had an unforeseen glitch, retrospective regulatory retribution will rain down on the advisory firm, not the maker of the programme.


There is a simple solution to a complex problem.

That is to have the algorithms those technology firms provide certified by the FCA as fit for the purpose they were designed for.

Fit-for-purpose accreditation already exists in other areas of regulation. 

Aircraft cannot fly in UK airspace without CAA approval. Drugs are certified as fit for purpose and prescription by the Medicines and Healthcare products Regulatory Agency.

So why can the FCA not approve automated advice models as fit for purpose? In doing so, the adviser is no longer in the firing line for adopting technology to provide affordable advice to the mass market consumer.

The ‘why not’ answer was that it would be “anti-competitive”. That came from Andrew Mansley – a technical specialist in the FCA’s innovation department, who engages with firms developing innovative business models to explain the requirements of the rules and relevant regulated activities – whom I spoke to at some length at the 2017 PFS Festival.

What?

The FCA needs to consider the following simple steps to encourage the adoption of automated advice.

  • All providers of robo models and modelling software should apply to the FCA for approval – that approval will certify what the programme can and cannot do, and when.
  • The FCA approval will apply to that provider, their algorithms, the programme and its use.
  • Any changes or upgrades would require a certification upgrade.
  • The robo model software provider would require their own professional indemnity cover for any unforeseen failures.
  • The advisory firm would not be responsible for the failure of the programme, as part of the FCA sign-off.

Stephen Hawking warned that AI could develop a will of its own, one that conflicts with that of humanity. If an algorithm does go wrong, the advice responsibility buck must stop with the technology provider.

Put these measures in place and both the regulator and the software house would think very carefully about failure, and the adviser could engage with more consumers, at lower cost, with confidence restored.

Derek Bradley is founder and chief executive of Panacea Adviser