Anonymous
6/24/2025, 1:27:54 PM
No.24492310
>>24492306
Yes. When their LLM is wrong or hallucinates, they will often not even know it's happening. Here is one example: https://desuarchive.org/r9k/thread/81417906/#q81419335
A ChatGPT user's LLM began hallucinating that the existence of Y-Chromosomal Adam meant there was a time when humans did not have a Y chromosome. He kept arguing because he wasn't really engaging with or understanding what was going on, just treating each of the model's responses as if it were gospel. They are unthinking, ignorant zealots.