Anonymous
9/13/2025, 6:33:57 AM
No.16784057
>It's just fancy autocomplete
What do people mean when they say this? Why couldn’t autoregressive next-token prediction have more complex behavior as an emergent property?
You can be reductionist and claim human reasoning is "nothing more than the probabilistic unfolding of neural signals with a training objective of eating and fucking", yet that reduction doesn't invalidate the claim that humans are capable of reasoning. If complex behavior like reasoning emerged from that substrate, why couldn't it emerge from autoregressive next-token prediction?
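For concreteness, "autoregressive next-token prediction" is literally just this loop: take the tokens so far, get a probability distribution over the next token, sample one, append it, repeat. Here's a minimal sketch with a toy bigram table standing in for the real network (the corpus and function names are made up for illustration):

```python
import random
from collections import defaultdict, Counter

# Toy "language model": a bigram table built from a tiny made-up corpus.
# A real LLM replaces this table with a neural net, but the generation
# loop below is the same.
corpus = "the cat sat on the mat and the cat ate the fish".split()
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_token_distribution(tokens):
    """P(next token | context), here conditioned only on the last token."""
    counts = bigrams[tokens[-1]]
    total = sum(counts.values())
    return {tok: c / total for tok, c in counts.items()}

def generate(prompt, n_tokens):
    """Autoregressive next-token prediction: predict, sample, append, repeat."""
    tokens = prompt.split()
    for _ in range(n_tokens):
        dist = next_token_distribution(tokens)
        if not dist:  # token never seen as a context; nothing to predict
            break
        tokens.append(random.choices(list(dist), weights=list(dist.values()))[0])
    return " ".join(tokens)

print(generate("the cat", 8))
```

The "fancy" part is entirely in how good the predicted distribution is; the loop itself is the same whether the predictor is a bigram table or a trillion-parameter transformer.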