Guess we could start calling this a 'hallucitation'? Kate Crawford coins an excellent neologism for hallucinated citations in LLMs like ChatGPT.