Advisor Perspectives welcomes guest contributions. The views presented here do not necessarily represent those of Advisor Perspectives.
Advisors can already pick from dozens of tech products that use generative AI to automate aspects of their work. How long will it be until we see an end-to-end generative AI platform that does everything a human advisor can do, just as well as they can - if not better?
Research from MIT Sloan contends that a full-service advisor AI “is closer than you might think.” Andrew Lo, a professor of finance at MIT Sloan and director of the MIT Laboratory for Financial Engineering, explores AI-powered financial advice as a snapshot of the exact kind of complex, high-touch professional service that large language models (LLMs) might replace.
However, I have my doubts, given the practical realities of building such a platform and the real-world experience of other industries that have already tried to go down the road of full automation.
Parsing The AI Hype
I’ll be blunt: a lot of generative AI’s potential is still hype. We see promises every day of what LLM tools could do. We’re assured the current state of generative AI is a work in progress, and the next version is just around the corner. But we have to reckon with the technology as it currently exists.
Microsoft was first out of the gate to incorporate generative AI into Bing, its search engine. It took aim at Google at a time when Google's search results were (and still are) clogged with junk. A year later, Google has not gotten any better, yet Bing's market share has barely budged against its competitors.
Amazon is another enthusiastic adopter of this technology. CEO Andy Jassy expects AI to drive “tens of billions” in revenue in the future, but at present, he has said, generative AI revenue is “still relatively small.”
There’s a difference between big winners like Nvidia, who are selling shovels in a gold rush, and the industries using those shovels to dig.
We already see examples of LLMs falling short as replacements for human workers in high-touch service jobs. New York City built a generative AI chatbot to help business owners answer regulatory and legal questions. Instead, it encouraged users to break the law: among other errors, the chatbot falsely told users it is legal to fire workers who don’t disclose their pregnancies or who refuse to cut their dreadlocks. Need another example? Air Canada must now foot the bill after losing a legal battle over a chatbot that misstated the airline’s bereavement policy.
Customers Don’t Like Chatbots
I bring up chatbots because they’re the generally accepted vision for the user experience of a hypothetical, full-service financial AI platform. The client types or speaks into the interface, and the generative AI replies in kind. The fundamental technology of a chatbot is not new. MIT Professor Joseph Weizenbaum released ELIZA in 1966. E-commerce websites and tech support companies have used chatbots for years.
Customers hate them.
Research bears this out. Gartner found that only 8% of consumers used a chatbot during their most recent customer service interaction, and only 25% of those said they would use one again. How many of your clients would you steer toward a tool with a repeat-use rate like that? Even if you write this off as early growing pains, the evolution needed to replace human advice may not be practical - or even possible.
LLMs do not “understand” the output they produce. A chatbot can return impressive results, but it cannot “think” like a human. An LLM predicts the most statistically likely response based on the data it was trained on, so the same question will not always produce the same answer - and a plausible-sounding guess is not a computed result, which is why generative AI struggles with math. This is intrinsic to how LLMs work, and it’s why “hallucinations” exist. The best we can do to improve accuracy is to feed the model more data. A lot more.
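To make that concrete, here is a deliberately oversimplified sketch of next-word prediction. It is not how any real model is built - the prompt, candidate words, and probabilities are invented for illustration - but it shows why sampling from a probability distribution can return different answers to the identical question.

```python
import random

# Deliberately oversimplified sketch of next-word prediction.
# The prompt, candidate words, and probabilities below are invented
# for illustration; no real model works from a three-entry table.
prompt = "What is 17 x 24? The answer is"
candidates = {"408": 0.55, "418": 0.25, "398": 0.20}  # hypothetical probabilities

def sample_next_word(probs):
    words = list(probs.keys())
    weights = list(probs.values())
    return random.choices(words, weights=weights, k=1)[0]

# Ask the identical question five times.
for _ in range(5):
    print(prompt, sample_next_word(candidates))

# The output varies from run to run, and only one candidate (408) is the
# correct product. A system that samples plausible-looking answers is a
# poor substitute for one that actually computes them.
```

Real models operate over vastly larger vocabularies and contexts, but the basic move is the same: sample a plausible continuation rather than compute a verified answer.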
Could We Run Out Of Data (And Electricity) To Power AI?
So much, in fact, that The Wall Street Journal reports that AI models may soon run out of high-quality data needed to sustain the current rate of development.
We already see signs of slowdown from the largest models. By now, ChatGPT, Claude, and Gemini have more or less caught up with each other. Most of the innovation we have seen in the last few months boils down to API connections or marginal changes to the user experience.
On top of more and better data, LLMs need empathy and accountability. Lo describes current generative AI chatbots as opaque “sociopaths.” They understand emotion only as symbols in training data, and they will argue any side of an issue without attachment to a point of view. Lo suggests that a rapidly simulated evolutionary process “at the speed of computation” might allow an LLM to teach itself true empathy.
The processing power and electricity needed to make this a reality may be in short supply. The energy costs of generative AI are already steep. Alex de Vries, a researcher at VU Amsterdam, estimates that Nvidia is on pace to ship 1.5 million AI server units per year by 2027, consuming at least 84.5 terawatt-hours of electricity per year. A financial institution sitting on a trove of data big enough to create the perfect, artificial advisor would still need to foot the electricity bill. At some point, they would need to weigh productivity gains against the cost of powering the technology.
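As a rough sanity check on the scale of that estimate, assume each AI server draws about 6.5 kilowatts around the clock - my assumption for illustration, not a figure from de Vries or from this article. The arithmetic lands in the same neighborhood as the number cited above.

```python
# Back-of-envelope check on the cited electricity estimate.
# The per-server draw is an assumption for illustration, not a sourced figure.
servers_shipped_per_year = 1_500_000   # Nvidia AI server shipments by 2027 (cited estimate)
assumed_kw_per_server = 6.5            # assumed continuous draw per server, in kilowatts
hours_per_year = 8_760

kwh_per_year = servers_shipped_per_year * assumed_kw_per_server * hours_per_year
print(f"{kwh_per_year / 1e9:.1f} TWh per year")  # roughly 85 TWh, the same order as the cited figure
```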
This is not to say that generative AI is a dead end for financial advisors. Even if the improvements are only incremental from here on, advisors and tech platforms have already found inventive ways to put LLMs to work.
However, I believe it is far more likely that generative AI will follow the same trajectory as robo-advisors: hyped up as an existential threat, then quietly commoditized as a useful tool in the arsenal of human-led advice.
Adrian Johnstone is the CEO of Practifi.