
📰 Study Finds AI Assistants Give Wrong or Misleading News Answers in 45% of Cases 🤖⚠️

TL;DR: A new Search Engine Journal report has revealed that AI assistants — including popular chatbots and voice tools — gave inaccurate or misleading responses to nearly half of the news-related queries tested. 📰 The findings raise red flags about the reliability of AI as a news source.


What Happened?

  • Researchers tested hundreds of real-world news questions using leading AI assistants such as ChatGPT, Google Gemini, Claude, and others.

  • Results showed that 45% of their answers were partially or fully incorrect, ranging from factual errors to outdated or biased information. 📉

  • Many AI models failed to cite credible sources, and in several cases confidently fabricated details (“hallucinations”) about recent events.

  • Accuracy rates were highest for sports and business topics but dropped sharply for politics, international affairs, and health.

Flashback / Context

  • AI models are trained on large language datasets that may not include real-time updates, especially for breaking news.

  • Companies like OpenAI, Google, and Anthropic have been racing to improve news accuracy filters after global criticism of AI misinformation.

  • Media experts warn that the blend of an authoritative tone and factual uncertainty makes AI-generated news especially risky if not cross-checked. 🧠

Who Gains & Who Loses?

  • Gains: News readers who double-check facts — awareness leads to safer consumption.

  • Gains: Media outlets, which regain importance as verified, human-driven journalism proves essential.

  • Losses: AI companies — credibility takes a hit as trust in automated news tools dips.

People’s Angle

For youth who rely on AI summaries for news, this study is a wake-up call: “Fast” doesn’t mean “factual.” Always cross-check with trusted media outlets before sharing or reacting. 🔍

MediaFx Take

AI may be the future of news delivery — but truth still needs human editors. ✍️ As long as algorithms can hallucinate, journalism’s heartbeat must remain human.
