It’s the heart-sinking moment when you realise a call to your insurance company is going to be required. You have just pranged your car and are sizing up the damage — as well as the hit to your no-claims bonus.
For some drivers, the story takes an unconventional turn at this point. If you are a customer of Chinese financial services group Ping An, you can use a new service allowing you to send the company photos of your vehicle captured on your smartphone. Its computers will automatically assess the damage, cross-check records of your driving history and offer you a while-you-wait quote to settle the claim — bypassing the usual rigmarole.
Launched earlier this year, Ping An’s innovation is one example of how accelerating developments in artificial intelligence and data science are shaking up our personal finances — including banking, pensions and investments as well as insurance.
Data science is increasingly being used to speed up processes, compare products, find deals, and produce answers customised to an individual’s circumstances.
This brings the prospect of cost savings for customers, not to mention the companies providing these services. As low-cost automation greases the wheels of our interactions with banks, insurers and pension providers, there is also potential to make useful connections between previously discrete elements of our finances.
The greatest challenge is whether technology can interpret the human emotions that play a part in setting financial goals and making major money decisions. Will financial consumers be prepared to trust machines — and the companies behind them — when it comes to sharing insights into their financial data?
FT Money has explored the possibilities — and potential limits — of a personal finance landscape transformed by data science.
Promise and reality
Artificial intelligence is clearly defined in computer science, but has increasingly become a buzzword tacked on to all kinds of commercial projects. Experts say a genuine AI system is something that “learns” from the data it is fed, to carry out tasks typically requiring human intelligence — either with the help of a human specialist, or on its own.
The best-known example was the triumph in 2017 of AlphaGo Zero, a self-taught computer program developed by DeepMind, the London-based start-up bought by Google in 2014. AlphaGo Zero mastered the Chinese board game Go in three days, defeating previous computer models that had beaten the world’s best human players.
Practical applications have followed, with data science increasingly integrated into money-related apps at varying levels of complexity and capability.
“We’re seeing the development of services based on algorithms which, while increasingly sophisticated, aren’t artificially intelligent,” says Sarah Coles, personal finance expert at Hargreaves Lansdown.
Software that asks customers to answer a series of questions by selecting from fixed responses — in other words, a traditional decision tree guided by pre-determined rules — does not require the help of AI. The technology has nonetheless been put to work by financial services companies, and by the UK tax authority, via the rapid introduction of chatbots which can tackle customer queries online before people pick up the phone.
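The fixed-response approach described above can be sketched in a few lines. The questions, responses and routing below are hypothetical, invented purely to show why no machine learning is needed: every path through the tree is written out in advance.

```python
# A minimal rule-based "decision tree" chatbot: each node offers fixed
# choices, and every outcome is pre-determined by the author of the tree.
TREE = {
    "start": ("Is your query about a payment or your account?",
              {"payment": "payment", "account": "account"}),
    "payment": ("Is the payment missing or duplicated?",
                {"missing": "Route to payments team.",
                 "duplicated": "A refund form will be sent."}),
    "account": ("Do you want to update details or close the account?",
                {"update": "Open the details form.",
                 "close": "Route to retention team."}),
}

def walk(answers):
    """Follow a sequence of fixed answers through the tree to an outcome."""
    node = "start"
    for choice in answers:
        _, options = TREE[node]
        node = options[choice]
        if node not in TREE:  # reached a leaf: a canned outcome
            return node
    return node

print(walk(["payment", "missing"]))  # -> Route to payments team.
```

Because the tree is exhaustively enumerated, the system can only handle inputs its authors anticipated — which is exactly the limitation that pushes companies towards natural language processing.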
Using natural language processing, chatbots are designed to cope with the huge variety of unstructured responses that people may type or speak into their devices. As they become more sophisticated, chatbots learn from each encounter, so they can pick up on more subtle or idiosyncratic phrasing and better identify ways to help users. This includes “sentiment analysis”, where the bot detects tension or upset in a user’s tone of voice, and quickly switches over to a human operative to resolve the issue.
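The hand-off logic behind sentiment analysis can be illustrated with a toy scorer. Real systems use trained language models on tone and text; the keyword lexicon and the threshold of two negative words below are inventions for illustration only.

```python
# Toy "sentiment analysis" hand-off: score a message for negativity and
# escalate to a human operative when the user appears tense or upset.
NEGATIVE = {"ridiculous", "angry", "unacceptable", "complaint", "furious"}

def route(message):
    """Return 'bot' or 'human' based on a crude keyword negativity score."""
    words = message.lower().split()
    score = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return "human" if score >= 2 else "bot"

print(route("What is my balance?"))                # -> bot
print(route("This is ridiculous, I am furious!"))  # -> human
```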
Within the area of savings, some financial apps such as Chip and Plum already use a series of algorithms and basic machine learning to assess how customers use their current accounts, how much they can afford to move into a savings account, and when they should do so. Most analyse data on spending patterns, and the timing of regular payments such as household bills.
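The account-scanning logic such apps are described as using might be sketched as follows. Everything concrete here — the 10 per cent balance buffer, the volatility adjustment and the figures — is an invented illustration, not any provider's actual method.

```python
# A hedged sketch of a "safe to save" calculation: estimate spare cash
# from recent spending patterns and upcoming bills, keeping a buffer.
from statistics import mean, pstdev

def safe_to_save(balance, upcoming_bills, daily_spend_history, days_ahead=7):
    """Money left after expected bills and spending, minus a safety buffer."""
    expected_spend = mean(daily_spend_history) * days_ahead
    # widen the estimate for volatile spenders
    volatility = pstdev(daily_spend_history) * days_ahead ** 0.5
    buffer = 0.10 * balance  # keep 10% of the balance untouched
    spare = balance - sum(upcoming_bills) - expected_spend - volatility - buffer
    return max(0.0, round(spare, 2))

# e.g. £900 balance, £250 of bills due, fairly steady spending near £12/day
print(safe_to_save(900, [250], [10, 12, 14, 11, 13]))  # -> 472.26
```

A machine-learning version would learn the buffer and spending forecast from each customer's history rather than hard-coding them.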
Yet the use of AI to select a suitable life insurance policy could also become commonplace in future, according to Kay Ingram, director of public policy at advice group LEBC.
“If the consumer is able to answer a few simple questions about the needs they have — for example, capital sums to repay debt, pay for larger purchases such as cars and home improvements, and provide an income stream for dependants — shopping online for basic insurance policies could become the norm,” she predicts.
Fraud and data
Imagine you are applying for a loan and your prospective lender requires you to answer a series of questions while looking into a video app on your smartphone. The reason? To allow the system to discern facial micro-expressions that might suggest you are lying.
Another initiative from Ping An, this AI-driven software is one of the more striking developments in fraud prevention — the biggest area of growth in the use of data science in personal finance. According to a recent study by the Association of Certified Fraud Examiners, 13 per cent of companies already use AI to tackle financial crime, and a further 25 per cent plan to do so in the coming year.
Psychologists say micro-expressions are not an infallible guide to the truth, but other patterns of potentially fraudulent behaviour can be spotted early by technology. Over time, as a system’s performance improves, more fraud can be detected while avoiding “false positives”.
A classic example is a credit card being blocked over a potentially “fraudulent” transaction when a customer is travelling abroad on holiday, says Micah Willbrand, managing director of identity and fraud services at credit scoring agency Experian.
AI will help eliminate the problem of geographic anomalies, he says, since it will be able to link card transactions made at the airport with, say, the purchase of a curry in Thailand the following day.
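Mr Willbrand's airport-to-curry example can be illustrated with a toy rule. The data model, the merchant categories and the 48-hour window are all assumptions made for the sketch; a production system would learn such links rather than hard-code them.

```python
# Sketch of geographic-anomaly linking: allow a foreign transaction if
# recent card activity (an airport purchase, say) makes travel plausible.
from datetime import datetime, timedelta

AIRPORT_CATEGORIES = {"airline", "airport_retail", "duty_free"}

def looks_fraudulent(txn, history, window_hours=48):
    """Flag a foreign transaction only if no recent travel signal exists."""
    if txn["country"] == "home":
        return False
    cutoff = txn["time"] - timedelta(hours=window_hours)
    travel_signal = any(
        h["category"] in AIRPORT_CATEGORIES and h["time"] >= cutoff
        for h in history
    )
    return not travel_signal

history = [{"category": "duty_free", "time": datetime(2019, 6, 1, 9, 0)}]
curry = {"country": "TH", "time": datetime(2019, 6, 2, 19, 30)}
print(looks_fraudulent(curry, history))  # -> False: travel is plausible
```

With no airport purchase in the window, the same transaction would be flagged — the "false positive" a blunt geography rule produces on its own.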
GoCompare, a comparison website, is using AI to detect insurance fraud such as quote manipulation, ghost broking and application fraud in a partnership with analytics company Featurespace.
Its software can detect suspicious behaviour at the point of quote — for instance, a succession of changes to name, employment or postcode — then either block the transaction or raise an alert depending on the insurer’s or broker’s risk tolerances.
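The "succession of changes" signal might look something like the toy check below. The watched fields come from the example above, but the edit-counting and the threshold of three changes are invented; Featurespace's actual models are behavioural and statistical, not simple counters.

```python
# Toy quote-manipulation detector: count edits to identity fields across
# successive quote attempts and flag a run of suspicious changes.
WATCHED = ("name", "employment", "postcode")

def suspicious(quotes, threshold=3):
    """True if watched fields were edited at least `threshold` times."""
    edits = sum(
        1
        for prev, cur in zip(quotes, quotes[1:])
        for field in WATCHED
        if prev.get(field) != cur.get(field)
    )
    return edits >= threshold

quotes = [
    {"name": "A Smith", "employment": "chef", "postcode": "SW1A 1AA"},
    {"name": "A Smith", "employment": "teacher", "postcode": "SW1A 1AA"},
    {"name": "A Smyth", "employment": "teacher", "postcode": "EC1A 1BB"},
]
print(suspicious(quotes))  # -> True: three field edits across the run
```

Whether a hit blocks the transaction or merely raises an alert would then depend on the insurer's or broker's risk tolerance, as the article notes.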
Sandra Peaston, director of research and development at Cifas, the fraud prevention service, says the use of advanced data science techniques has a lot of potential for improving fraud prevention. However, she says the technology needs to be handled appropriately because of the nature of machine learning techniques: “We wouldn’t want to see models that develop bias and don’t get corrected.”
Using AI to make data-related decisions or to block a customer’s service for fear of fraud creates another issue. One of the guiding principles of good data usage is that companies ought to be able to explain to a customer how a decision was made. “With true AI, that’s incredibly difficult,” Ms Peaston says.
Because of the cost of technology, the latest data science techniques are unlikely to be adopted evenly across the financial services sector. Smaller companies won’t just have smaller budgets — they are also disadvantaged by their lack of data.
The volume of a big bank’s daily interactions with customers will produce a much more powerful pool of data — the feedstock of AI applications — than a small building society can muster. This could not only give bigger institutions the edge in targeting particular products or loans to the right people, but also lead customers to prefer the security of the improved fraud detection that comes with bigger volumes of data.
Ms Peaston says smaller organisations could find they lack the data to create accurate models. “Ensuring that customers of those societies have the same level of protection as customers of bigger banks is something that should be looked at. If we get into a position where you’ve got haves and have-nots, it’s not good for customer choice,” she says.
Some of the AI-fuelled innovations predicted by finance experts may run up against another issue — data privacy. Regulatory initiatives such as Open Banking in the UK have made it possible to link large data sets from disparate areas of personal finance, but only if customers give their consent.
If data sharing fills some with trepidation, others will accept it as a given. Adrian Poole, UK head of financial services at Google Cloud, says younger people are already consuming AI through their devices, and will only demand more of it.
“There’s a millennial generation that will expect things to be a little bit more predictive than we’ve ever seen before,” he says, giving the example of linking a map app with your diary appointments, allowing your phone to warn you if heavy traffic is likely to make you late for your next meeting.
More relevant to personal finance is his suggestion of a financial “dashboard” that allows people to see all their key financial data in real time. At present, these figures are held by different companies, in formats that are often hard to link or may not even be electronically available.
Like many people, Mr Poole keeps track of his own finances via a spreadsheet that pulls all of this together but requires regular updating from disparate sources.
“So long as we treat our clients’ data with care, what we’ll see is an expansion of the Open Banking principle which says ‘Why can’t I have all my financial situation in one place?’ I really just want to go to a dashboard and have a look at it with the press of a button. That may not be 20 years away — it may be five years away,” he says.
Ms Peaston says the drive to utilise personal information has to be weighed against the risks of putting data in one place: “[Companies] are treading carefully but it will become more important because some of the value from these sorts of things is quite huge.”
The AI-driven adviser
Instead of paying to see a financial adviser, could AI be used to guide decisions about how quickly to pay off a mortgage versus how much to save or invest — all judged in the context of an individual’s personal goals and circumstances?
In theory, such an all-purpose service could become reality, says Daniel Hegarty, founder of Habito, an online mortgage broker.
“Once you understand a person’s preferences, and have all of the data and all of the products, it’s just arithmetic,” he says. “There will initially be a series of point solutions — for Isas, current accounts, mortgages, pensions — but as the data infrastructure matures across all of personal finance, that doesn’t sound crazy to me.”
Whilst robo-advice has become commonplace, the idea that machines might replace human financial advisers across these disciplines appears a much more distant prospect to Greg Davies, head of behavioural science at consultancy Oxford Risk. Unlike a game of chess, he says, a financial plan will not contain an end point when a participant can be said to have “won” or “lost”.
“There’s a whole set of trade-offs along a journey, and they’re not just about measurable financial things but also about values, opinions and empathy,” he says. “More importantly, the rules of the game are constantly changing. What I want from my life changes over time.”
Machines may be brilliant at optimising a system given a fixed set of rules, but personal finance is a more fluid activity. Mr Poole thinks it would take years simply to pull together the data needed to create a model of an advised process. This would risk bypassing a key function of the human-advised conversation on personal finance — to crystallise what someone’s financial goals are, not simply to act upon them.
Nevertheless, advisory companies can see exciting possibilities of using AI to tap new markets. In the UK, Wealth Wizards has launched its own digital independent financial adviser, MyEva, regulated by the Financial Conduct Authority.
MyEva’s web app and chatbot take people through a financial health check and give personal recommendations based on the customer’s responses. MyEva can then nudge users to prioritise debt repayments over saving, for example, and also covers areas such as how much someone should be contributing to their workplace pension.
Andrew Firth, founder and chief executive of Wealth Wizards, says: “Eva uses data learnt from interactions with previous users to build up a body of information to suggest what users should do next.”
Crucially, the system has a human back-up. The website says: “If MyEva cannot help you, she can introduce you to someone who can.”
Considering the cost of financial advice, Ms Coles at Hargreaves thinks AI could help close the “advice gap” in future.
“There are times when the generic guidance on offer isn’t enough to enable people to make an informed decision, but they don’t want to pay for financial advice,” she says. “AI could use data about the individual to tailor the guidance and make it more useful to people.”
However, she stresses that a regulatory rethink would be needed for this to happen. Although providers may want to go further than providing generic guidance, if they use the information they hold about customers to tailor that guidance and offer opinion, this could currently be deemed to constitute regulated advice.
When it comes to investment, AI has yet to show it can beat the market over the long term. Were that to change, it would still leave unanswered the question of how investors should respond, says David Miller, executive director at Quilter Cheviot Investment Management.
“If AI starts to dominate fund management because of good results, people will quite reasonably surrender control to machines. The problem with this is that people will stop learning how to make decisions. Fund managers make decisions every day, some good, some bad, and the good ones learn from their mistakes,” he says.
Whilst AI can potentially open up the financial advice market to consumers who are currently excluded on cost grounds, traditional advisers are not losing much sleep.
For now, the face-to-face model is what the wealthiest clients expect. But just as lower cost advice solutions offer automated processes backed up by human expertise, there is potential for AI to move up the wealth management value chain, streamlining services and cutting costs.
The obvious downsides are the privacy concerns, and the risk that the robots will get it wrong. The one thing that humans have over machines, advisers claim, is the ability to build a relationship of trust.