Lucknow: In 2023, the favorite headline of the tech-obsessed news cycle was a fear-soaked prediction: “ChatGPT is coming for your job.” It was an irresistible storyline—machines replacing humans, newsrooms emptying, journalism bowing before an all-knowing algorithm. Two years later, that prophecy looks less like a credible threat and more like a cultural panic. The dust has settled. Thousands of reporters have spent thousands of hours prompting, editing, rewriting, and fact-checking large language models. And the consensus among those who actually do the work is overwhelmingly clear: ChatGPT is a powerful assistant, a dazzling intern who never sleeps, never calls in sick, never asks for health insurance—and still cannot be trusted alone in a newsroom.
The reasons, as shared by dozens of working journalists across the world, reveal not just the limitations of AI but the essence of journalism itself.
1. Quotes Are Not Commodities—They Are Relationships
A feature lives or dies on the strength of its voices. Real quotes are the currency of authenticity, earned through patience, empathy, and trust. They are not interchangeable Lego pieces; they are the products of relationships—messy, fragile, human.
“Try asking ChatGPT to get a grieving mother on the record the day after her son was killed by police,” says Sarah Conway, an investigative reporter with the Tampa Bay Times, who spent nine months coaxing sources for a series on no-knock warrants. “The model can hallucinate a perfect tear-streaked quote—‘I’ll never hold him again’—but it didn’t sit in that woman’s living room for four hours while she decided whether a stranger deserved her pain.”
Conway’s frustration is widely shared. In November 2025, a media outlet surveyed 87 working reporters. Ninety-one percent said the most important part of their job—building relationships with sources—has absolutely no AI substitute. Some even laughed at the idea.
As Zeina Khodr, former Beirut bureau chief for Al Jazeera, put it during a WhatsApp call, “I once spent three days drinking tea with a militia commander who wouldn’t even tell me the weather. AI doesn’t have three days, and it doesn’t have tea. It definitely doesn’t understand when silence is the real story.”
Even when sources are polished public figures, the human dance of an interview matters. Financial Times technology correspondent Madhumita Murgia recalls interviewing Sam Altman in 2023. “Half the value came from what he said off the record while we waited for coffee. That context shaped everything. A language model has no off-record. It has no coffee.”
2. Verification Is Not a Prompt—It Is a Blood Sport
Journalists repeat the same warning: ChatGPT’s biggest strength—its fluency—is also its Achilles’ heel.
In October 2025, an AI-generated “exclusive” circulated across crypto Telegram channels claiming that the Swedish central bank had secretly begun CBDC trials in the rural Norrland region. The piece looked immaculate: detailed policy analysis, precise figures, and a quote attributed to the deputy governor. The Riksbank shut down the rumor within minutes. No such trial existed. The quote was grammatically perfect Swedish translated into English. It was also entirely fabricated.
For Craig Silverman, media editor at ProPublica and one of the world’s leading misinformation experts, the incident was depressingly predictable.
“Verification is adversarial by design,” he explains. “Your job is to prove yourself wrong before someone else does it in public. LLMs are trained to sound right, not to be right. That’s the opposite of what we do.”
So publishers are responding pragmatically. According to a 2025 Reuters Institute report, 68 percent of publishers now require at least two human verifiers for any AI-assisted story. Silverman laughs at the irony. “We added staff to check the robot. That’s not job replacement; that’s expensive babysitting.”
3. Ground Truth Requires Mud on Your Shoes
The gap between human reporting and synthetic reporting becomes most visible during breaking news—when stakes are high, details are scarce, and rumors travel faster than facts.
In July 2025, Hurricane Sara devastated parts of the North Carolina coast. Within hours, AI-driven news aggregators pushed out vivid, 1,200-word stories of “families clinging to rooftops in Blounts Creek” and “National Guard helicopters making daring midnight rescues.” The problem? Blounts Creek was barely touched. The dramatic scenes came from a mixture of FEMA manuals, past hurricane coverage, and generic prompt filler.
Meanwhile, Charlotte Observer reporter Théoden Janes spent the night driving through floodwater with a sheriff’s deputy to reach a retirement community cut off by downed lines and submerged roads. His resulting piece began with a detail no model could conjure: 79-year-old Betty Harrington describing how she and her husband used a broom handle to break their attic hatch when water reached the ceiling.
“AI can rewrite FEMA press releases faster than I can,” Janes says. “But it wasn’t there when Betty cried because she thought she’d die without seeing her granddaughter one last time. That moment is why people still pay for newspapers.”
Ground truth requires presence—physical, emotional, ethical. No model was chest-deep in Pamlico Sound that night.
4. Ethics Lives in the Gray, Not in the Guardrails
Newsrooms live in gray areas algorithms cannot navigate. Every serious publication wrestles with questions no safety guideline can resolve.

- Do you name a 16-year-old rape survivor who has posted publicly on TikTok but whose family pleads for anonymity?
- Do you publish leaked medical records of a politician if they expose hypocrisy around reproductive rights?
- Do you identify a whistleblower whose testimony could topple a local government—but put them in danger?
“These decisions destroy friendships, end marriages, and haunt you for years,” says Maria Torres, former standards editor at the Los Angeles Times. “We once spent eleven hours debating whether to print a single name. Eleven hours. ChatGPT would spit out an answer in four seconds and sleep like a baby.”
OpenAI’s own documentation acknowledges that frontier models still struggle to apply professional ethical codes—like the SPJ Code of Ethics—to novel contexts. The gap is not computational. It is existential. Ethical judgment emerges from experience, empathy, and consequences—none of which AI possesses.
5. The Public Can Taste the Difference
The final judges of storytelling—the readers—are not fooled.
In a blind study conducted by the University of Texas in September 2025, volunteers read two 1,200-word features: one written by a Washington Post reporter, the other by GPT-4o with heavy human editing. Seventy-eight percent preferred the human story.
When asked why, their answers revealed journalism’s core advantage:
- "It felt like someone had actually been there."
- "The quotes sounded like real people, not movie characters."
- "I trusted it more."
Trust remains journalism’s only irreplaceable currency. As The Guardian’s audience editor Maria McKay puts it: “We’re not selling paragraphs. We’re selling the promise that someone risked something—time, access, reputation, sometimes safety—to bring you the truth. That promise cannot be hallucinated.”
6. The Jobs Actually Disappearing—And the Ones Rising Again
AI has indeed automated a slice of journalism, but not the part most people feared. The roles being erased are low-value churn: listicles, earnings recaps, high-school sports box scores.
The Associated Press now auto-generates roughly 4,000 earnings briefs each quarter. No reporter misses writing those.
But the long-form feature—the kind that requires three weeks, six cities, dozens of interviews, and at least one meltdown in a rental car at 2 a.m.—is experiencing a global resurgence. Outlets from The New York Times to nonprofit newsrooms like The Texas Tribune report rising demand for the exact journalism AI does worst.
“Paradoxically,” says Semafor editor-in-chief Ben Smith, “the better AI gets at mediocre 600-word explainers, the more readers crave the 4,000-word pieces that feel impossibly human.”
7. The New Contract
Journalism is not dying; it is evolving. The reporters who will survive 2030 are the ones who treat AI the way photographers treat Photoshop—indispensable, dangerous, and never the author.
As Sarah Conway reflects on her Pulitzer-winning no-knock series: “ChatGPT didn’t hold LaToya’s hand while she read her son’s autopsy report for the first time. I did. That’s the job. The rest is just typing.”
And no language model—no matter how brilliant—can ever take that moment away from her. Or from us.
