>>511321476
there is no AGI, and there probably never will be
what there is:
A) glorified prompt completion, which sometimes "hallucinates"
B) greedy idiots who have no grasp of LLMs' actual capabilities and who want to jump on the hype train as fast and as hard as possible so they don't miss out on the imagined free money galore
that said, we will probably die because some greedy idiots let LLMs write some shitty code that fails in a way no proooompt "engineer" imagined and triggers a nuke exchange or some other stupid shit