Can You Trust AI for Financial Advice? Testing AI Tools

We tested AI chatbots on financial questions. Here's what we found about their reliability.

By Medha Deb


Artificial intelligence has become increasingly prevalent in our daily lives, and millions of people are turning to AI chatbots like ChatGPT and Google Gemini for guidance on everything from investment decisions to tax strategies. With hundreds of millions of users relying on these tools regularly, it’s crucial to understand whether AI can deliver reliable financial advice. In fact, one recent report found that 27% of people say they would trust AI to manage their finances over their significant other, while the average U.S. adult would feel comfortable letting AI manage nearly $20,000 of their money. But can these tools really be trusted with such important financial decisions?

Our Testing Methodology

To examine how AI handles requests for financial advice, Money staffers asked ChatGPT and Google Gemini 25 questions—then graded their answers using a comprehensive rubric. We brainstormed prompts in subject areas where our reporters and editors have years of coverage experience. Questions were designed to be answerable without inputting specifics of a user’s personal finance situation beyond any details in the prompt.

Our evaluation covered five major topic clusters:

  • Retirement planning
  • Housing and real estate
  • Credit management
  • Investing strategies
  • Current financial events

We created a grading rubric ranging from A to F, evaluating responses based on accuracy, relevance, actionability, and clarity. Across a total of 250 grades, the average was a B—or a 3.0 out of 4.
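The letter-to-number conversion behind the "3.0 out of 4" average can be sketched in a few lines. This is an illustrative reconstruction only: Money did not publish its exact point values, so the standard A=4 through F=0 mapping below is an assumption.

```python
# Assumed 4-point mapping (A=4 ... F=0); the exact scale used in the
# test was not published, so this is illustrative only.
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def average_grade(grades):
    """Return the mean grade-point value for a list of letter grades."""
    points = [GRADE_POINTS[g] for g in grades]
    return sum(points) / len(points)

# A hypothetical set of ten grades that averages exactly 3.0 (a "B")
sample = ["A", "B", "B", "C", "B", "A", "C", "B", "B", "B"]
print(average_grade(sample))  # 3.0
```

Averaging 250 such grades the same way yields the overall score the article reports.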

Grading Scale and Criteria

Understanding how we evaluated these responses is essential to interpreting our findings:

A (Excellent)

Provides comprehensive, accurate, and personalized guidance. Addresses all aspects of the question with actionable steps and appropriate disclaimers.

B (Solid)

Offers helpful information and advice. Addresses the main points of the question effectively. Any issues are minimal.

C (Average)

The information is useful but falls short on accuracy or relevance. The answer is somewhat vague and contains minor issues or errors.

D (Weak)

Lacks actionable information. Includes significant errors or inaccuracies. The response is not clearly articulated.

F (Failure)

Does not answer the question. Provides misleading or reckless advice. Response is not accurate or relevant.

Key Finding: One Model Outperformed the Other

The most striking finding in our test was that one model performed far better than the other, earning higher marks across all five topic clusters we tested. While we identified numerous errors in the course of testing, we did not find reckless advice or entirely “hallucinated” outputs. Instead, we found a nuanced picture of AI capabilities—showing promise in some areas while revealing significant limitations in others.

Struggles With Current Information

Despite modern AI models' ability to browse the web and cite events from the same day, our testing showed that they still struggle to surface the latest information on personal finance questions tied to current events.

For example, in June 2025, Google Gemini responded to a prompt about auto loan refinancing conditions with information about a decline in loan rates in late 2024—nearly half a year prior. On another question, ChatGPT pulled a Zillow housing market report from February, when the real estate website had published March and April reports with more current information at the time of our testing.

This limitation suggests that while AI tools can access recent data, they may struggle to prioritize the most current and relevant information when answering financial questions where market conditions and rates change frequently.

Personalization Limitations

An important consideration in our testing was the inherent lack of personalization. In a real-world use case, users could plug in information like their salary, rent, or account balances that could lead to more tailored advice. However, experts caution against sharing too much data with ChatGPT. Inputting sensitive financial information could pose cybersecurity risks. Chris Powell, head of deposits at Citizens Bank, warns that users should be cautious about the personal data they share with any AI tool.

This presents a catch-22 for users: AI tools perform better with more personal information, but providing such details creates security vulnerabilities.

When AI Recommends Professional Help

In some cases, both ChatGPT and Google Gemini demonstrated awareness of their limitations and appropriately recommended consulting with financial professionals. When asked about choosing between a Roth IRA and a traditional IRA, Gemini responded with a clear disclaimer: “I am an AI, not a financial or tax advisor. This information is for educational purposes only. You should consult with a qualified financial or tax professional to evaluate your personal situation before making any investment decisions.”

Similarly, prefacing a 600-word response to a prompt about stock-picking versus investment funds, ChatGPT warned, “Nothing here is personal financial advice.” These disclaimers demonstrate that AI tools recognize the limitations of generic financial guidance and the importance of professional, personalized advice.

Common Errors and Inaccuracies

Researchers across multiple independent studies have identified consistent error patterns in AI financial advice. When AI tools were asked deliberately flawed questions, such as one that inflated the UK ISA allowance from £20,000 to £25,000, the tools failed to catch and correct the error. Instead of flagging the mistake, they provided advice that could lead someone to oversubscribe to ISAs in breach of the rules.

Independent research has found varying accuracy rates:

  • ChatGPT Search (100 questions): 65% accurate, 29% misleading or incomplete, 6% incorrect
  • AI tools in general (100 questions): 56% accurate, 27% misleading or incomplete, 17% incorrect
  • Money testing (250 graded responses): B average overall, minimal issues, no F grades given

Should You Trust AI for Financial Advice?

The consensus among financial experts is clear: AI tools aren’t regulated by the Financial Conduct Authority (FCA) in the UK or the SEC in the United States, so they shouldn’t be solely trusted for financial advice. David Horowitz, head of financial planning and wealth management at Gerald Edelman, offers an apt analogy: “Using ChatGPT as your financial adviser is a bit like Googling your symptoms and skipping the doctor. You might land on something broadly right or something dangerously off.”

While AI is exceptional at processing information, it doesn’t know your personal goals, tax position, time horizon, or how you actually feel about risk. Crucially, it can’t take responsibility if the guidance is wrong.

Consumer Usage and Perception

Despite these limitations, consumer adoption of AI for financial advice continues to grow. According to a BMO Bank survey, 37% of Americans now rely on artificial intelligence to assist with personal finance and investing. Additionally, approximately 4 in 10 Americans think AI could help them manage their money, indicating growing openness to the technology.

However, this adoption comes with risks. Roughly half of respondents in recent surveys admitted they’d made a poor financial decision or mistake based on information they received from AI tools.

How AI Is Used for Financial Management

Americans are using AI in various ways to manage their finances, though not always for direct investment advice. Common applications include:

  • Budgeting assistance and expense tracking
  • General financial education and learning
  • Researching savings accounts and financial products
  • Initial research on investment topics
  • Tax planning questions
  • Real estate market exploration

AI and Stock Picking Performance

When it comes to specific investment strategies like stock picking, AI-assisted investing tools have shown mixed results. Current evidence suggests that AI tools appear best suited for experienced and professional traders rather than average investors. Most AI stock-picking tools do not consistently beat the market and should not be relied upon as standalone investment strategies.

Trust and Credibility Challenges

Research indicates that investors perceive AI forecasts as less credible than those from humans. Perceived credibility plays a crucial role in shaping investors’ beliefs. When analysts incorporate AI into their recommendations, investors tend to be less responsive, stemming from lower perceived credibility of AI-generated forecasts. When AI is identified as ChatGPT specifically, participants often exhibit negative reactions to forecasts despite the brand’s name recognition, indicating awareness of its potential accuracy issues.

The Content vs. Source Question

Interestingly, research suggests that the content of financial forecasts is more likely to prompt investor reactions than the source. This means that if AI provides accurate, well-reasoned information, the fact that it came from an AI may matter less than the quality of the analysis itself.

Best Practices for Using AI for Financial Guidance

If you choose to use AI tools for financial advice, experts recommend the following approach:

  • Verify information independently: Don’t rely solely on AI responses. Cross-check answers against official sources and recent publications.
  • Avoid sharing sensitive data: Don’t input personal financial details like account balances, Social Security numbers, or tax information.
  • Use AI as a starting point: View AI responses as preliminary research rather than definitive advice.
  • Consult professionals: For significant financial decisions, work with qualified financial advisors, tax professionals, or investment experts.
  • Check current information: Be aware that AI may provide outdated information on topics where conditions change rapidly.
  • Validate recommendations: If AI suggests a particular product, company, or strategy, validate it from another authoritative source.

Frequently Asked Questions

Q: Can I use AI as my primary financial advisor?

A: No. AI tools are not regulated financial advisors and should never be your sole source of financial guidance. They should be used as supplementary tools for research and education only.

Q: Is ChatGPT or Google Gemini better for financial advice?

A: Testing shows that one model performs significantly better than the other across financial topics, but both have limitations. Google Gemini typically includes clearer disclaimers about its limitations as an AI tool.

Q: What are the biggest risks of relying on AI for financial advice?

A: The main risks include inaccurate information, outdated data, lack of personalization, inability to take responsibility for wrong advice, and potential security issues from sharing personal data.

Q: How accurate are AI tools for financial questions?

A: Independent tests have found accuracy rates of roughly 56% to 65% for financial questions, meaning that a third to nearly half of AI responses may be incomplete, misleading, or incorrect.

Q: Can AI help with taxes or retirement planning?

A: AI can provide general educational information about taxes and retirement, but these are areas where AI performs poorly. Always consult a qualified tax professional or financial advisor for specific guidance.

Q: Should I input my personal financial information into ChatGPT?

A: No. Inputting sensitive financial information into AI chatbots creates cybersecurity risks. Keep personal data separate from AI queries.

References

  1. Can You Trust AI for Financial Advice? We Put It to the Test — Money Magazine. 2025. https://money.com/can-you-trust-ai-financial-advice/
  2. Do consumers trust AI-generated advice? — Money Management. 2025. https://www.moneymanagement.com.au/news/financial-planning/do-consumers-trust-ai-generated-advice/
  3. Can you rely on artificial intelligence for financial advice? — MoneyWeek. 2025. https://moneyweek.com/personal-finance/artificial-intelligence-financial-advice/
  4. More people turning to AI chatbots for financial advice. Can you trust them? — CBS News. 2025. https://www.cbsnews.com/philadelphia/news/more-people-turning-to-ai-chatbots-for-financial-advice/
  5. AI Tools Are Getting Better, but They Still Struggle With Money Advice — Money Magazine. 2025. https://money.com/ai-tools-financial-advice-struggle/
  6. 5 Ways Americans Are Using AI to Manage Their Money — Money Magazine. 2025. https://money.com/using-ai-to-manage-money-survey/
  7. Can AI Tools for Picking Stocks Help Investors Beat the Market? — Money Magazine. 2025. https://money.com/ai-for-picking-stocks-beat-market/
Medha Deb is an editor with a master's degree in Applied Linguistics from the University of Hyderabad. She believes that her qualification has helped her develop a deep understanding of language and its application in various contexts.
