Vulnerable customers: Using AI to enable good outcomes

Earlier this year, the FCA found that 58% of vulnerable customers don’t disclose their circumstances to financial services providers. Nikhil Asthana, Paul Willis, Kamaldeep Kaur and Peter Lovegrove explore how AI can identify opportunities to offer the right support under Consumer Duty.

The findings stem from the FCA’s multi-firm review of ‘Firms’ treatment of customers in vulnerable circumstances’, which drew on insights from over 700 organisations and 1,500 retail customers. The review highlighted that 44% of vulnerable customers had negative experiences with a financial services firm, compared to 33% of customers generally. Of those who did disclose their vulnerability, 58% stated that firms provided them with additional support. 

In short: vulnerable customers tend to have worse experiences, even where firms make support available.

In the supporting research, many individuals said they felt uncomfortable discussing their circumstances with a financial services provider, often because they felt embarrassed or worried about being offered a worse deal. Moving forward, the FCA wants firms to make it easier for customers to volunteer support needs throughout the relationship, without necessarily using the term ‘vulnerable’ at all.

While this approach can break down some of these barriers, many of the underlying reasons for non-disclosure remain beyond a firm’s control. The regulator has highlighted the use of AI as an example of good practice to help identify potential characteristics of vulnerability and offer greater opportunities for tailored support.

Identifying potential vulnerabilities and support needs 

Identifying potential characteristics of vulnerability isn’t a one-off exercise. Customer circumstances change all the time, and characteristics of vulnerability can include health conditions, significant life events, low resilience and limited capability. Many of these can be short term, so the processes to identify vulnerabilities (and to remove such markers when they no longer apply) need to be equally responsive.

As such, forward-looking firms are moving away from manual reviews and creating innovative AI approaches to identify customers with characteristics of vulnerability. To achieve this, firms need to lay the necessary groundwork and build the solution architecture to support AI approaches. This includes:

  • consolidating all customer interaction data from emails, phone calls, branch visits and social media into a single view 

  • pulling information from customer profiles, for example any disclosed health issues or life events 

  • drawing on unstructured data, such as recorded calls or emails, and applying natural language processing (NLP) 

  • transforming the above information into standardised machine-readable formats to feed into AI tools 

  • creating machine learning models to assess the information and generate insight into potential customer vulnerabilities 
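As a sketch of the consolidation and standardisation steps above, the example below (all class and field names are hypothetical, not a prescribed schema) normalises interactions from different channels into one machine-readable record per customer:

```python
from dataclasses import dataclass, field

@dataclass
class Interaction:
    channel: str     # e.g. "email", "call", "branch", "social"
    text: str        # message body or call transcript
    timestamp: str   # ISO 8601

@dataclass
class CustomerRecord:
    customer_id: str
    disclosed_events: list = field(default_factory=list)  # e.g. disclosed health issues or life events
    interactions: list = field(default_factory=list)

    def add(self, interaction: Interaction) -> None:
        self.interactions.append(interaction)

    def to_features(self) -> dict:
        """Flatten the record into a standardised dict to feed into AI tools."""
        return {
            "customer_id": self.customer_id,
            "disclosed_events": sorted(self.disclosed_events),
            "all_text": " ".join(i.text.lower() for i in self.interactions),
            "channels": sorted({i.channel for i in self.interactions}),
        }

record = CustomerRecord("C-001", disclosed_events=["bereavement"])
record.add(Interaction("email", "I may miss next month's payment", "2024-05-01T10:00:00"))
record.add(Interaction("call", "Sorry, I've been unwell recently", "2024-05-03T09:30:00"))
features = record.to_features()
```

In practice the "single view" would be built in the firm’s data platform; the point is that every downstream model consumes one consistent, machine-readable shape rather than channel-specific formats.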

AI techniques increase the consistency and scalability of vulnerability assessments compared to manual reviews. However, it’s important to note that these assessments should be tailored to the product, journey and customer base in question, and guided by human judgement, with appropriate oversight throughout. 

How can AI identify opportunities for tailored support? 

With the basics in place, firms can begin to apply the following supervised machine learning techniques and large language models (LLMs) to draw customer insights and understand changes in vulnerability. It’s vital to make sure those processes are repeatable, ethical and explainable, with robust oversight. They also need to align with the Government’s five principles for safe and responsible use of AI: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress. 

Firms should review their AI processes periodically to make sure they continue to meet the intended goals, and that they remain fit for purpose.  

Speech-to-text (S2T) analytics 

S2T analytics can transcribe audio files in batches, or in real-time, for further analysis using the tools below. AI can identify the languages spoken, separate individual speakers and recognise industry-specific vocabulary.  

Sentiment analysis 

Sentiment analysis can assess content for any signs of distress, frustration, anxiety or other support needs, which may help to identify emotionally (or otherwise) vulnerable customers. 
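A minimal illustration of the idea, using a hand-written distress lexicon with arbitrary weights (a production system would use a trained sentiment model rather than a word list):

```python
# Illustrative lexicon of distress terms with arbitrary weights.
DISTRESS_TERMS = {"worried": 2, "anxious": 2, "struggling": 3, "desperate": 3, "frustrated": 1}

def distress_score(text: str) -> int:
    """Sum the weights of distress terms found in the text."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return sum(DISTRESS_TERMS.get(w, 0) for w in words)

def flag_for_review(text: str, threshold: int = 2) -> bool:
    """Flag a message for human review when the distress score crosses a threshold."""
    return distress_score(text) >= threshold
```

For example, `flag_for_review("I'm really worried and struggling to pay")` returns `True`, routing the interaction to a human reviewer; the threshold and lexicon here are placeholders a firm would tune to its own customer base.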

Key word detection 

Drawing on bespoke lexicons, this technique can identify where customers have used specific terms or phrases that could indicate vulnerability. For example, terms such as ‘missed payments,’ ‘lost my job,’ ‘worried’ or ‘ill.’ 
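A simple sketch of lexicon-based matching, with a hypothetical bespoke lexicon (real lexicons would be product- and customer-base-specific):

```python
import re

# Hypothetical bespoke lexicon of phrases that could indicate vulnerability.
VULNERABILITY_LEXICON = ["missed payment", "lost my job", "worried", "ill", "carer"]

def detect_keywords(text: str, lexicon=VULNERABILITY_LEXICON) -> list:
    """Return lexicon phrases found in the text (case-insensitive, word-bounded)."""
    hits = []
    for phrase in lexicon:
        if re.search(r"\b" + re.escape(phrase) + r"\b", text, flags=re.IGNORECASE):
            hits.append(phrase)
    return hits
```

Word-boundary matching avoids false positives such as ‘ill’ inside ‘skilled’, one of the small details that makes keyword detection reliable enough to feed downstream assessments.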

Named entity recognition (NER) 

Going a step further, NER tools can look for key information against pre-set categories such as ‘emotional state,’ ‘financial references’ or ‘medical conditions.’ This creates more standardised data, making the information easier to assess and use. 
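The sketch below mimics this categorisation with hand-written patterns purely for illustration; an actual deployment would use a trained NER model, and the categories and patterns here are assumptions:

```python
import re

# Hypothetical pattern sets per category; a trained NER model would replace these.
CATEGORY_PATTERNS = {
    "emotional_state": [r"\banxious\b", r"\boverwhelmed\b"],
    "financial_reference": [r"\barrears\b", r"\boverdraft\b"],
    "medical_condition": [r"\bdiagnos\w+\b", r"\bhospital\b"],
}

def tag_entities(text: str) -> dict:
    """Return {category: matched terms} as standardised structured data."""
    tags = {}
    for category, patterns in CATEGORY_PATTERNS.items():
        matches = [m.group(0) for p in patterns for m in re.finditer(p, text, re.IGNORECASE)]
        if matches:
            tags[category] = matches
    return tags
```

The value of this step is the output shape: free text becomes category-keyed records that can be stored, audited and compared consistently across customers.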

Retrieval augmented generation (RAG) 

Retrieval augmented generation can be used to draw on pre-existing documents, such as FCA guidance or past case records, and compare them to customer data using an LLM. This can identify potential signs of vulnerability and recommend appropriate next steps or escalation. 
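The retrieval half of a RAG pipeline can be sketched with a toy bag-of-words similarity search (real systems use embedding models and vector stores; the guidance snippets and the final LLM call here are illustrative assumptions):

```python
import math
from collections import Counter

# Toy knowledge base standing in for FCA guidance or past case records.
GUIDANCE = [
    "Customers experiencing bereavement may need extra time and specialist support.",
    "Customers in financial difficulty may benefit from a payment plan.",
]

def bow(text: str) -> Counter:
    """Bag-of-words representation of a text."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bags of words."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs=GUIDANCE) -> str:
    """Return the document most similar to the query."""
    return max(docs, key=lambda d: cosine(bow(query), bow(d)))

best = retrieve("I am in financial difficulty and cannot keep up with payments")
# A real pipeline would now send `best` plus the customer message to an LLM,
# prompting it to assess signs of vulnerability and recommend next steps.
```

Grounding the LLM in retrieved guidance, rather than asking it to answer unaided, is what makes the recommendations traceable back to source documents.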

Behavioural analysis 

The combined outcome of these techniques, along with customer profile data, can be used for in-depth behavioural analysis to examine patterns in customer interactions, financial activity, and demographic information. This can help to identify deviations or risk indicators that may signal vulnerability. 
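As a sketch of how such combined outputs might feed a triage decision (field names, weights and thresholds are all illustrative assumptions, not guidance), the final step could look like:

```python
# Illustrative triage over upstream signals; weights and thresholds are
# arbitrary placeholders a firm would calibrate and keep under human oversight.
def vulnerability_indicator(signals: dict) -> str:
    score = (
        2 * len(signals.get("keywords", []))          # lexicon hits
        + signals.get("distress_score", 0)            # sentiment output
        + (3 if signals.get("disclosed_health_issue") else 0)  # profile data
    )
    if score >= 5:
        return "refer to specialist team"
    if score >= 2:
        return "monitor and offer support options"
    return "no action"
```

Keeping the combination rule simple and inspectable like this supports the explainability and oversight requirements discussed above, since a reviewer can see exactly why a customer was flagged.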

Maximising the value of Consumer Duty compliance 

Under Consumer Duty, firms must make sure that vulnerable customers receive outcomes that are at least as good as those of the broader customer base. Firms that can do this successfully will gain significant insight into what their customers value in a financial services provider. Seen through this lens, the ability to identify vulnerable customers, and provide tailored support, is about more than regulatory compliance. It’s about a positive customer experience. Applying it effectively can also improve brand loyalty and boost market share. 

As such, it’s essential to factor outputs from vulnerability assessments and support into outcome analysis, product reviews and fair value judgements, to demonstrate that customers’ support needs are identified and met. This can inform future service offerings to improve customer journeys, communication, support capabilities and product design. When considering product design, it’s important to note that some products – such as life or health insurance, or interventions for arrears management in retail banking – could include customers who are more likely to be in vulnerable circumstances. So, firms should have processes in place to anticipate and cater to vulnerabilities within the specific target market, with appropriate staff training to offer effective support.  

Where to start? 

Firms can review their current approach to identifying vulnerable customers, and establish key points where AI and data techniques can enrich that process. For example, firms can identify the most likely types of customer vulnerability for a given product and target market, then use data-driven techniques such as keyword detection or sentiment analysis to identify where those (or other) vulnerabilities are emerging. 

Firms also need to consider how they can factor AI and machine learning into every customer touchpoint, to help rapidly identify any potential characteristics of vulnerability or changes in circumstances. Looking more broadly, firms can use this information across the customer journey to offer the right support and document the associated processes. Using emerging technology in this way can help financial services firms offer a consistent end-to-end approach and promote good outcomes for all customer groups. 

For further information contact Nikhil Asthana, Paul Willis or Peter Lovegrove.