Chart of the Month

Hallucinations in Your Search Bar: When AI Makes Things Up


Recent tests¹ of eight popular Artificial Intelligence (AI)-powered search tools asked each system to take real paragraphs from news articles and return the original headline, publisher, publication date, and link. Across 1,600 trials, the AI tools failed to correctly identify the original source article more than 60% of the time, with most responses inaccurate or incorrectly attributed. As the chart² shows, error rates varied widely: even the most accurate model was wrong about 37% of the time, while the worst, a model called Grok-3, botched nearly 94% of its citations. Crucially, these tools rarely admitted they did not know the answer; instead, they confidently produced incorrect or fabricated sources.

[Chart: Ranked AI Hallucination Rates by Model | Visual Capitalist]

The larger takeaway from this study is that AI can be a valuable tool, but it is not always right. It often responds in a confident, agreeable tone, but that does not make it correct. Before you share or act on what an AI tool tells you, take a moment to double-check the source. As in carpentry: measure twice, cut once, and verify before you trust. AI models still have progress to make before they can be relied upon completely.

