The halting problem as anti AGI argument
Can the halting problem be generalized into a problem of running out of resources on a machine with physical limitations?
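For context, the halting problem is a *logical* impossibility rather than a resource limit: no program can decide, for every program and input, whether that program halts. A minimal sketch of the classic diagonal argument (the function `halts` here is hypothetical; the contradiction is what shows it cannot exist):

```python
def halts(program, argument):
    # Hypothetical oracle: would return True iff program(argument) halts.
    # No such total decider can exist, which is the point of the sketch.
    raise NotImplementedError("undecidable in general")

def paradox(program):
    # Loop forever exactly when the oracle claims the program halts
    # when run on its own source.
    if halts(program, program):
        while True:
            pass

# Asking whether paradox(paradox) halts yields a contradiction either
# way, so `halts` cannot be implemented -- and this holds regardless of
# how much memory or time the machine has available.
```

Note that this argument already assumes unlimited resources (a Turing machine with an infinite tape), which is worth keeping in mind when trying to recast it as a claim about physical machines running out of resources.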
Therefore, is a magical sci-fi AGI not possible under our current technology, because we don't have the resources to develop it?