For many advisers, the rapid growth of artificial intelligence in financial planning has raised a difficult question: will technology eventually replace human advice?
The short answer is no, says Enrico Louw, general practice principal at Old Mutual Personal Finance. Instead, the rise of AI-driven do-it-yourself (DIY) financial plans is reshaping the customer journey in ways that reinforce why professional advice matters more than ever.
Advisers are increasingly meeting clients who arrive with spreadsheets and financial strategies generated by AI tools. Although these plans often look complete, they can contain significant blind spots – from overconfident return assumptions to neglected protections, such as severe illness cover or estate planning.
“For financial advisers, the real opportunity is to show the value of a human-led financial plan,” says Louw.
“It’s a plan that grows with people’s lives, adapting to their circumstances and anticipating challenges in ways digital advice cannot. It’s encouraging to see customers taking the initiative to map out their financial futures, but a plan built solely on a large language model is, at best, an average answer based on probability. It can only go so far. This is where we step in to turn those first steps into something robust, personal, and aligned with long-term financial well-being.”
The human edge over AI
Louw believes advisers have a clear advantage: the ability to act as a family’s financial coach, offering judgment, perspective, and emotional understanding no algorithm can replicate.
“AI won’t ask: ‘How would you feel if your investment dropped 20 to 30%?’ That emotional layer, how customers react under stress, is where human financial advisers can show their expertise,” he explains.
He also warns that AI can reinforce what customers want to hear rather than challenge unrealistic assumptions. “That’s where the real value of advice comes in. We ask the tough questions, stress-test overly optimistic projections, and bring realism into the discussion so customers are prepared for setbacks as well as successes.”
From red flags to relationships
Rather than treating AI-generated plans as threats, advisers can use them as conversation starters. By pointing out gaps in a respectful way, advisers can demonstrate their expertise while strengthening client trust.
“AI may start the conversation, but financial advisers give it real direction,” says Louw. He adds: “When we respectfully correct flawed assumptions, customers feel guided rather than judged. That’s what builds lasting relationships and keeps the customer engaged over the long term.”
Far from signalling the end of advice, AI-driven DIY plans give advisers a chance to demonstrate their unique value.
Says Louw: “Each shortcoming in a DIY plan is a chance to guide customers with expertise and sound knowledge, improve financial well-being, and build stronger relationships rooted in trust. In a digital-first world, that human edge is the financial adviser’s greatest advantage.”
Six gaps to spot in a DIY AI plan
Louw identifies the following common problems in AI-generated plans, and suggests how advisers can respond to each:
Overgeneralisation
Customers may assume rules of thumb such as “save 15% of your income” apply universally, but this ignores personal realities like life stage, dependants, debt, and financial goals.
Response: Acknowledge the customer’s effort, then explain how tailoring the plan to their unique situation makes it more realistic and achievable.
Overconfidence in returns
Many DIY plans assume constant high growth, often 10% or more annually, but markets fluctuate, and these inflated expectations can leave customers underprepared.
Response: Stress-test the plan with more conservative assumptions and demonstrate how different market scenarios affect outcomes (a simple illustration follows this list).
Blind spots in risk protection
Self-made plans often focus only on retirement and discretionary savings while overlooking protection needs such as cover for severe illness, disability, or retrenchment, leaving customers exposed to life’s curveballs.
Response: Use real-life examples to show why protection strategies are critical for long-term financial well-being. These are uncomfortable conversations that AI won’t initiate, because it tends to reinforce what customers want to hear rather than challenge their assumptions.
Ignoring tax efficiency
AI-generated plans may suggest the right products but fail to work through the actual tax implications, leading to unnecessary costs, fees, and taxes from poorly timed withdrawals or contributions that fail to use available tax allowances.
Response: Walk customers through the tax implications of different options, illustrate how to structure contributions (lump sum or recurring) more effectively, and show how the timing of withdrawals affects the outcome.
No emotional layer
DIY advice assumes perfect behaviour during volatility, but generative AI often tells customers what they want to hear and does not prepare them for the stress of a market dip of 20% to 30%.
Response: Ask reflective questions such as “How would you feel if your investment dropped 20%?” to connect numbers to real behaviour and build emotional readiness.
Missing pieces in holistic planning
AI often sidesteps difficult but essential topics, such as whether the client has a valid will, whether beneficiaries are up to date, or whether there is enough liquidity to cover estate costs.
Response: These are uncomfortable conversations, but they are too important to ignore. Unlike AI, which acts like a yes-man and avoids raising the subject of death, advisers can address it with sensitivity and ensure families are properly protected.
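To make the point about return assumptions concrete, here is a minimal Python sketch of the kind of stress test described under “Overconfidence in returns”. The figures are purely illustrative assumptions (a hypothetical R2,000 monthly contribution over 20 years, compared at 10%, 7%, and 4% annual growth), not Old Mutual projections:

def future_value(monthly_contribution, annual_return, years):
    # Future value of a recurring monthly contribution, compounded monthly.
    r = annual_return / 12
    n = years * 12
    return monthly_contribution * (((1 + r) ** n - 1) / r)

# Compare an optimistic DIY assumption against more conservative scenarios.
for rate in (0.10, 0.07, 0.04):
    print(f"{rate:.0%} a year -> about R{future_value(2000, rate, 20):,.0f}")

Dropping the assumed return from 10% to 7% a year cuts the projected pot by roughly a third, which is exactly the kind of gap a DIY plan built on optimistic defaults tends to hide.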
“AI-generated plans are not obstacles but openings. Each gap is a chance to show the value of a human-led financial plan that is reviewed regularly, to improve financial well-being, and to build stronger relationships rooted in trust. In a digital-first world, that human edge is our greatest advantage,” Louw says.