Hallucination
- The Fabrication Machine: What Happens When You Skip Verification
Peter Vandermeersch, former editor-in-chief of NRC and a Mediahuis fellow focused on journalism and AI, was suspended after publishing dozens of AI-hallucinated quotes attributed to real people. This is not a story about a rogue junior staffer. It is a story about what predictable LLM failure modes look like when someone who should know better ignores them.
- Why LLMs Need Bayesian Reasoning (and How Google Is Teaching It)
Google Research published a paper showing that LLMs can be trained to reason like Bayesians -- updating their beliefs incrementally as evidence arrives rather than pattern-matching their way to a single confident answer. For engineers running production systems, this matters more than most benchmark improvements.
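The kind of updating the paper describes can be illustrated with the textbook Beta-Bernoulli case: a belief about a coin's bias that shifts with each observation instead of being committed up front. This is a minimal sketch of the general principle, not the paper's actual training setup; all names and numbers below are illustrative.

```python
def update(alpha: float, beta: float, obs: int) -> tuple[float, float]:
    """One Bayesian update of a Beta(alpha, beta) belief with a
    Bernoulli observation (1 = heads, 0 = tails)."""
    return alpha + obs, beta + 1 - obs

def posterior_mean(alpha: float, beta: float) -> float:
    """Current best estimate of the coin's bias under Beta(alpha, beta)."""
    return alpha / (alpha + beta)

# Start from a uniform prior Beta(1, 1): estimate 0.5, maximal uncertainty.
alpha, beta = 1.0, 1.0

# Evidence arrives one observation at a time; the belief moves gradually
# rather than jumping to a confident answer after the first data point.
for obs in [1, 1, 0, 1, 1]:
    alpha, beta = update(alpha, beta, obs)

print(posterior_mean(alpha, beta))  # 5/7 ≈ 0.714 after 4 heads, 1 tail
```

The contrast with a pattern-matching model is the loop: each observation nudges the posterior, and the estimate stays provisional until the evidence accumulates.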