
3 reasons your clients shouldn’t rely on ChatGPT for financial advice

The use of artificial intelligence (AI) is no longer confined to science fiction films and technology professionals. It’s increasingly becoming part of everyday life, with people of all ages routinely using AI in various forms, such as smart home devices and virtual assistants.

According to figures published by Forbes, the UK AI market was worth more than £72 billion in 2024, and the number of AI companies has increased by over 600% in the last 10 years.

Chatbots such as ChatGPT are among the most widely used forms of AI, for companies and consumers alike. They simulate human conversation and can quickly answer all manner of questions, from how to get help with a faulty product to what you might cook for dinner tonight.

Worryingly, IFA Magazine has reported that in 2024, more than a third of UK adults said they would consider using ChatGPT for financial advice, compared to just over a quarter in 2023 – an estimated increase of 4 million people.

If you have clients who are relying heavily on such AI programs for financial advice, it might be time they had a rethink. Here are three reasons not to trust ChatGPT for financial guidance.

1. A lack of personalisation and context

There’s no one-size-fits-all approach when it comes to building, managing, and growing wealth. A tailored approach is essential because it aligns your clients’ financial strategies with their unique circumstances and goals.

While AI has advanced significantly in recent years, and chatbots may seem extremely human-like, they can’t offer the level of personalisation provided by a (human) financial planner.

Moreover, ChatGPT will base its answers on the information it’s given. It may ask some follow-up questions, but a chatbot is largely reliant on what your clients tell it. This could mean that crucial details aren’t considered when it generates an answer. As such, the guidance may be generic and misleading.

In contrast, an experienced financial professional will take the time to get to know your clients and build a holistic picture of their lives. By nurturing a relationship in this way, they can gain a deeper understanding of an individual’s situation and aspirations, such as their tolerance for risk and long-term goals.

A financial planner also has the emotional intelligence to provide empathetic guidance and reassurance, which may be especially important when addressing sensitive topics such as estate planning.

Read more: Most clients seek financial advice for emotional, not financial reasons – here’s why

2. No accountability to financial regulations

If your clients act on financial guidance they receive from ChatGPT and it doesn’t turn out as they hoped or expected, they’re unlikely to have any recourse.

That’s because an AI chatbot has no responsibility to act ethically, and it can’t be held accountable for negligent advice.

On the other hand, financial services professionals are regulated and have a fiduciary duty to act in the best interests of their clients.

In the UK, financial advisers and planners must be approved or authorised by the Financial Conduct Authority (FCA), which works to protect consumers from poor professional conduct.

If your clients receive a financial product or service they’re unhappy with, they can complain to the Financial Ombudsman Service (FOS). The FOS will then complete an independent investigation of the case.

So, seeking professional advice provides your clients with a level of confidence and protection that isn’t available from ChatGPT.

3. Data security risks when sharing personal information

When seeking financial advice, it’s typically necessary to share various types of personal information, such as income, expenses, debts, and so on.

If your clients enter such sensitive data into ChatGPT, they could be exposing themselves to the risk of identity theft and financial fraud.

Indeed, it’s generally not recommended to disclose private, confidential, or personally identifiable information (PII) – such as names and addresses – to chatbots.

According to the Chartered Trading Standards Institute, fraudsters are increasingly using AI chatbots to build trust with victims by crafting sophisticated scams that are difficult to detect.

Instead, by seeking professional financial support, your clients can enjoy the peace of mind that comes from knowing their personal data is in safe hands.

The FCA Handbook sets out specific guidance on how financial planners and advisers must store and transfer personal data. It also stipulates that regulated professionals must communicate with their clients “clearly and fairly” about how their information will be managed.

Get in touch

If you have any questions about how we can support your clients with all their financial needs, please get in touch.

Email hello@sovereign-ifa.co.uk or call us on 01454 416653.

Please note

This article is for general information only and does not constitute advice. The information is aimed at retail clients only.

All information is correct at the time of writing and is subject to change in the future.

Approved by Best Practice IFA Group Ltd on 14/07/2025
