>>106315949
LLMs can't be AGI.
All they can be is word pattern approximators.

And AGI needs to be tangible, not pseud.
A person can have tangible, real knowledge.
An LLM can only pick the most likely-looking approximation of that knowledge.
LLMs can only be pseud.
They cannot be AGI, and the answer is not more LLM compute, but working out what type of program(s) need to be built and combined, alongside an LLM's functions, to make an actual AGI.
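
To make "word pattern approximator" concrete, here's a toy sketch (a bigram counter over a made-up corpus; the names approximate_next and following are purely illustrative, not how any real LLM is built) of picking a continuation by pattern frequency instead of by knowing anything:

# Toy sketch of "word pattern approximation": count which word follows
# which, then pick the most frequent follower. No knowledge, no grounding,
# just pattern frequency. The corpus and names here are made up.
from collections import Counter, defaultdict

corpus = "the sky is blue the sky is vast the sea is blue".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def approximate_next(word):
    # Return the most common follower of `word`, or None if unseen.
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(approximate_next("is"))   # -> 'blue' (most frequent pattern, not a fact)
print(approximate_next("sky"))  # -> 'is'

A real LLM swaps the counter for a neural net over tokens, but the output is still the likeliest-looking continuation, not a held fact.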