A blog published last week claimed that almost a third of Australians would be happy to receive financial advice from a robot. According to the blog’s author, futurist Anders Sorman-Nilsson, the study showed that 30 per cent of Australians would “trust a robot to offer them financial advice”.

“Tellingly, this is compared to just 9 per cent of respondents who would trust a robot to provide them with psychology/counselling or relationship advice or allow bots to help them make other long-term choices, such as career or marriage decisions,” Sorman-Nilsson wrote.

“In other words, we trust robots with our money but not our hearts.”

There are three points worth making about the current state of robo-advice. First, robo-advice and robo-investment are simply not the same thing; at the moment we have no shortage of robo-investment services but virtually no true robo-advice services.

Second, what is a financial adviser if not part psychologist/counsellor and part long-term decision-making assistant – and often not only on financial matters?

And third, clients ultimately are responsible for their own decisions and actions, and no amount of advice (whether delivered by human or machine) can or should change that.

Know your client

A robo-investment service knows little about a client beyond how much money they have available for investment and perhaps a few things about their tolerance for risk, gender, age and location.

Consider a scenario where an individual inherits a windfall and decides to invest it. Fine. But what is the better long-term financial decision: to invest the full amount, or to invest some of it and use the rest to pay off unproductive debt (for example, credit cards), contribute more to super or purchase life insurance?

These are not trivial issues, and a quick online risk-tolerance questionnaire linked to a series of investment portfolios not only doesn’t come close to addressing those issues, it doesn’t even scratch the surface of what financial advice really is.

In effect, robo-investment presupposes that the individual who comes to it has already gone through a good planning process and has concluded, for good reasons, that investing money is the best thing to do, given all other potential options and opportunities.

In a way, it’s creating work for advisers. Unless and until robots can guide clients through those issues and assist them with making sound decisions, they’re not really giving financial advice – at least, not in the way it is defined as a professional service. 

Machine learning and artificial intelligence are improving by the minute, and the professions are not immune from the impact. But it is true that professionals are, generally speaking, further from being replaced by machines than process or factory workers are.

Aspects of financial advice can be automated and some already have been. Robots have already effectively replaced advisers whose value proposition extends little further than investing. But these advisers were never really engaging in the goal-setting, strategic advice, psychology and counselling activities of professionals in the first place.

Financial adviser: has hammer, seeks nails

Good financial advice is not prescriptive. It doesn’t follow a script and it does not automatically herd clients towards some predetermined outcome – and that includes managed accounts or other investment solutions. Advisers need a toolkit and not just a hammer, because not all clients are nails.

Financial advice is not a cookie-cutter process, or at least it should not be. Every individual is different. They make decisions in their own way and move along different paths to get there. Things change – sometimes for the better, sometimes for the worse. An adviser is a counsellor and psychologist who can tease out and understand an individual’s goals, desires and drivers, and then present them with options and alternatives and with the information they need to make decisions that are best for them.

Speaking to Canadian Business in late 2017 (admittedly, a previous epoch in AI terms), the author Daniel Pink addressed the impact on financial advisers directly: “That person needs knowledge of asset allocation, but the truth of the matter is a lot of that kind of thing has become commoditised,” Pink said.

“If that financial adviser is going to survive, what is she going to do? She needs to have greater amounts of empathy, understand where her client is coming from. [The adviser] needs to be very good not at the reductive task of problem solving, but at the more creative, complex task of problem finding. Can you see around corners? Can you identify issues that your client doesn’t face now but might face in the future?”

It’s the client’s decision

But that’s the crux of it: it’s the client’s responsibility to make the final decision, based on the best advice and guidance they can get. A client who understands the decision-making process – particularly where that process is collaborative and co-operative – is more likely to stick to a strategic plan than abandon it at the first sign of the unexpected, because they understand that issues can arise along the way to achieving a goal. They know a plan built on good advice (particularly ongoing advice) is better able to deal with the ups and downs, the twists and turns, and whatever comes at them from around the corner.

Robots will make, and are already making, advisers’ lives easier and their businesses more productive and profitable. And it is true that not everyone wants or needs to seek financial advice from a qualified professional. But advice implies a lot more than just investing, just “the reductive task of problem solving”; advice is part therapy, part counselling, part education, part defining solutions and strategies, and part implementation. For the time being, at least, it remains the province of the human adviser.