AI brings threats and opportunities for client vulnerability


Artificial intelligence brings both threats and opportunities in dealing with vulnerability, according to Phoenix Group. 

In its vulnerability summit report, the provider discussed the danger that AI could make firms less sensitive rather than more so.

On the other hand, Phoenix Group believed that, used well, AI could augment human intelligence rather than replace it.


It said: “Organisations push up against cost restraints when dealing with vulnerable customers. AI can automate straightforward tasks, freeing up individuals to give more tailored support where needed.”

The provider highlighted potential uses for AI to manage vulnerability which included education and the ‘nudge’ theory.

It explained: “The gamification of financial decision making is already happening. There are tools to meet savings goals, for example, or to understand the consequences of decisions in real time. AI can help people empathise with their future selves.”

Another way AI could be used was through tools to develop customers’ financial acumen, taking a Duolingo-style approach to building financial literacy and capability.

AI could also be harnessed to help identify vulnerability, according to Phoenix Group. 

It said: “Vulnerable customers often do not declare themselves. Equally, vulnerability may be transient, created by life events or health problems. As such, organisations need to be able to identify signs of vulnerability and create frameworks for dealing with customers appropriately.”

A better understanding of the customer in turn means products and services can be better targeted.

The provider highlighted that AI can also be used to tackle fraud, pointing out that a lot of work was already under way to ensure vulnerable customers are not subject to fraud.

Diane Berry, chief data and analytics officer at the Phoenix Group, said: “AI allows us to analyse large data sets, including customer behaviours and transactional patterns, to identify early signs of vulnerability.

“This should be an aid to customer service teams, allowing them to ask, ‘how else can I help this customer?’”

Phoenix Group emphasised that AI was not about replacing humans; instead, used effectively, it should free up humans to give a more personalised and nuanced response to the customers who need it most.

It added: “Artificial intelligence is still in its infancy. Trust will be important to its evolution and will take time to build. The application of AI will develop over the next three to five years and could be embedded in customer interactions within a decade.”

Joanna Legg, head of department at the FCA, said the regulator expects financial services firms to have systems in place to support vulnerable customers.

Legg pointed out that firms successfully delivering good customer outcomes, and meeting the requirements of the consumer duty and the vulnerability guidance, have reviewed their communications to boost customer understanding.

She said: “Successful firms have reviewed communications, changing the format and language in order to make content clearer and boost customer understanding."