AI Search: Great at Finding Headlines but Terrible at Giving Credit—Here’s Why That Matters

They summarize the news—but never say who wrote it. Here’s why AI’s citation problem threatens trust, journalism, and transparency

AI Search · Journalism · Attribution
3 min read
August 27, 2025

Ever plugged something into an AI search engine and gotten a neat, bullet-pointed summary—without a single mention of who actually wrote the story? The Columbia Journalism Review (CJR) tested eight different AI-powered search tools to see how they credit their news sources. The verdict? They mostly don’t. Below, we’ll break down how CJR figured that out and why it’s such a big deal.

How CJR Put AI Searches to the Test—In Simple Terms

  1. Pick Common News Questions: CJR researchers asked each AI engine about real news events, covering politics, sports, and local stories.
  2. Look for Links or Mentions: They checked whether the results linked to the original article, named the reporter or publication, or just offered a vague "some website said."
  3. Compare Across Eight Tools: They ran multiple queries on eight popular AI search platforms, noting each platform's style (some offered a quick reference at the bottom, others nothing at all) and how consistent it was.
  4. Spot Patterns: Finally, they compared how often each tool actually linked back to a credible source. Spoiler: almost all of them dropped key citations or offered only partial references.

Basically, it was a straightforward test: “Ask about news, see if the AI says who wrote the news.”
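
To make that concrete, here is a minimal sketch in Python of the kind of check involved. This is purely illustrative and not CJR's actual rubric; the regex, the outlet list, and the "vague attribution" heuristic are all assumptions made for the example.

```python
import re

# Toy attribution check, not CJR's methodology: given an AI engine's answer
# to a news question, record whether it contains a link, names a known
# outlet, or only offers a vague attribution.
KNOWN_OUTLETS = {"Reuters", "Associated Press", "The Guardian"}  # illustrative list

def score_attribution(answer: str) -> dict:
    """Return simple yes/no signals about how the answer credits its sources."""
    has_link = bool(re.search(r"https?://\S+", answer))
    names_outlet = any(outlet in answer for outlet in KNOWN_OUTLETS)
    vague_only = not (has_link or names_outlet) and "according to" in answer.lower()
    return {"has_link": has_link, "names_outlet": names_outlet, "vague_only": vague_only}

if __name__ == "__main__":
    sample = "According to reports, the council approved the transit budget on Tuesday."
    print(score_attribution(sample))
    # -> {'has_link': False, 'names_outlet': False, 'vague_only': True}
```

Run a check like that across many answers and engines and you get the sort of pattern the study was looking for: which tools link, which name outlets, and which leave you with nothing to verify.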

Why Does This Matter?

1. You Can’t Double-Check the Story

Ever read something online and think, “I want more details”—like the in-depth interview or full data? If the AI won’t show you the original piece, you might never see that background.

2. Journalists Lose Visibility

Many outlets rely on page views and brand recognition. If AI tools yank out the best info but fail to credit the original story, it’s a raw deal for reporters. That can reduce ad revenue, audience trust, and future story budgets.

3. Trust and Transparency

If you can't see where the AI found its facts, you're basically taking its word for it. That's risky in a world where misinformation is already a problem. Transparency helps us decide what's legit and what's not.

My Take: The Human-Centered Angle

I firmly believe technology should help, not overshadow, the real people behind our digital world. These AI search engines might be amazing at sifting through mountains of data, but ignoring who originally wrote a piece is the opposite of “human-centered innovation.”

  1. Respecting Authorship: It’s about giving journalists, bloggers, or local news reporters the credit they deserve.
  2. Integrity in Transformation: AI is powerful, but if we allow it to blur the lines about who did the work, we’re losing a key piece of authenticity.

What Can Be Done About It?

  1. Better Design: AI platforms could build in automatic citation features, like a "source trace" that lists each website and writer (a toy sketch of the idea follows this list). Yes, it's extra engineering, but it's an upgrade that fosters trust.
  2. User Feedback: If we all keep clicking “Where did you get this?” or complaining when sources are missing, companies may feel the pressure to respond with actual references.
  3. Support Tools That Link Properly: If one AI search engine consistently shows you the real authors, while another lumps them into “Various Outlets,” choose the more transparent one. Our usage patterns can encourage better behavior.
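
To illustrate the first idea, here is a minimal sketch of what a "source trace" could look like: a small data structure attached to every generated answer so the summary arrives with its bylines and links. Everything here (the class names, fields, and example outlet) is hypothetical; no real platform's API is being described.

```python
from dataclasses import dataclass, field

# Hypothetical "source trace" record an AI search platform could attach to
# each generated answer. Field names are assumptions for illustration only.
@dataclass
class SourceTrace:
    outlet: str               # publication that did the original reporting
    author: str               # reporter or byline
    url: str                  # link back to the original article
    quoted_excerpt: str = ""  # optional passage the summary drew on

@dataclass
class TracedAnswer:
    summary: str
    sources: list[SourceTrace] = field(default_factory=list)

    def render(self) -> str:
        """Append a human-readable citation list to the summary."""
        lines = [self.summary, "", "Sources:"]
        lines += [f"- {s.author}, {s.outlet}: {s.url}" for s in self.sources]
        return "\n".join(lines)

if __name__ == "__main__":
    answer = TracedAnswer(
        summary="The city council approved the new transit budget on Tuesday.",
        sources=[SourceTrace(outlet="Example Gazette", author="Jane Doe",
                             url="https://example.com/transit-budget")],
    )
    print(answer.render())
```

The point of the structure is that attribution travels with the answer instead of being an afterthought: if the summary is shown, the bylines and links are shown with it.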

Looking Ahead: More AI, More Responsibility

The more AI search becomes part of daily life, the more we risk turning news into a soulless “just the facts” feed. CJR’s study is a reminder that for all their cleverness, AI tools still rely on real human writers and data-gatherers—and those creators deserve more than a passing nod.

Bottom Line: Next time you see a quick “AI summary” of a news item, pause and ask, “Who actually reported this?” Because if nobody’s acknowledging the behind-the-scenes journalism, we might be hurting the very people who keep us informed.

Source: AI Search Has A Citation Problem (CJR)
