>>105756194 (OP)
>GPTs are fundamentally the same since release. Only dataset sizes increased.
There have been improvements. Newer models can work with contexts that would've made old models shit themselves, improved system prompts have cut down on hallucinations and produce better answers, and reasoning training lets models work through complex problems that would have been impossible before.
But all of those are improvements at the edges. They don't fix the fundamental flaws. It's like upgrading your PC by putting in better fans. Sure, it might buy you a bit of performance, but not much; you're still limited by the speed of the core parts: the CPU, the GPU, and the RAM.
>GPTs will not lead to AGI and are a dead end.
Yep.