Study Finds AI Assistants Give Wrong or Misleading News Answers in 45% of Cases
- MediaFx

- Oct 24, 2025
- 2 min read
TL;DR: A new Search Engine Journal report has revealed that AI assistants, including popular chatbots and voice tools, gave inaccurate or misleading responses in nearly half of tested news-related queries. The findings raise red flags about the reliability of AI as a news source.

What Happened?
Researchers tested hundreds of real-world news questions using leading AI assistants such as ChatGPT, Google Gemini, Claude, and others.
Results showed that 45% of their answers were partially or fully incorrect, ranging from factual errors to outdated or biased information.
Many AI models failed to cite credible sources, and in several cases confidently fabricated details ("hallucinations") about recent events.
Accuracy rates were highest for sports and business topics but dropped sharply for politics, international affairs, and health.
Flashback / Context
AI models rely on large but static training datasets that may not include real-time updates, especially for breaking news.
Companies like OpenAI, Google, and Anthropic have been racing to improve news accuracy filters after global criticism of AI misinformation.
Media experts warn that the combination of an authoritative tone with factual uncertainty makes AI-generated news especially risky if not cross-checked.
Who Gains & Who Loses?
Gains: News readers who double-check facts; awareness leads to safer consumption.
Losses: AI companies, whose credibility takes a hit as trust in automated news tools dips.
Media outlets: Regain importance as verified, human-driven journalism proves essential.
Peopleās Angle
For youth who rely on AI summaries for news, this study is a wake-up call: "fast" doesn't mean "factual." Always cross-check with trusted media before sharing or reacting.
MediaFx Take
AI may be the future of news delivery, but truth still needs human editors. As long as algorithms can hallucinate, journalism's heartbeat must remain human.

