>>106316815
LLMs, at least as they exist now, are fundamentally flawed. They lack anything like active memory (short- or long-term), recursive information updating, or situational awareness.
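The "no memory" point can be sketched in a few lines. This is a toy illustration, not any real model's API: `fake_llm` is a hypothetical stand-in for a stateless next-token predictor, and the "chat memory" is nothing but the transcript replayed into it on every call.

```python
# fake_llm is a hypothetical stand-in for any next-token model:
# a pure function of its input text that keeps no state between calls.
def fake_llm(prompt: str) -> str:
    # Stand-in for real inference: deterministic and stateless.
    return f"reply-to-{len(prompt)}-chars"

def chat_turn(history: list[str], user_msg: str) -> str:
    # The only "memory" the model ever sees is the concatenated transcript.
    transcript = "\n".join(history + [user_msg])
    reply = fake_llm(transcript)
    history.extend([user_msg, reply])
    return reply

history: list[str] = []
chat_turn(history, "here is a riddle")
chat_turn(history, "wrong, the answer was X")

# Drop the transcript and nothing of the exchange survives:
# the model answers the bare prompt exactly as if it had never seen it.
assert fake_llm("here is a riddle") == fake_llm("here is a riddle")
```

The "correction" in turn two only appears to update the model because it rides along in the transcript; the weights themselves never change.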
You throw an LLM some kind of untrained-for string and it will throw out the user's intent entirely. It could be a riddle or joke an 8-year-old would understand. The LLM, after failing and then having it explained, will be all
>oh haha I get it now clever lol ur such a good user keep going tee hee
But the system failed to connect disparate ideas because it has no capacity to do so.
They aren't intelligent. They are word-task completion programs.
Whatever fixes this will have to be something fundamentally more than an LLM: something with an LLM as one component, but not an LLM itself.