>>82166057
You're right in saying compute /=/ better AI, inherently. But compute power is the constraint on AI achieving AGI. WHEN AGI is achieved, then maybe compute power can scale back. But in the meantime, I don't see a world where we just stumble on the most efficient method that reaches AGI with the least compute. Maybe I'm wrong, looking back at history all it could take is a supergenius being born, but I just don't see it.
What AGI means to us is a lot different from how AI companies (and by extension the federal government) view it. That divide is the biggest factor in deciding when AI will stop being packed in the ass with funding. There's no product, innovation, or industry that can sustain the kind of funding we see in AI forever.
When AI replaces the aforementioned jobs and industries, and meets the definition set by its makers and the government, funding will be scaled back a ton. A lot less money, compute power, and time will go into R&D, and the focus will shift to sustaining it and expanding what they already have on a larger scale.
I personally think the catalyst for true AGI will be when WW3 happens. But that's just me.
>>82166104
I still think my reasoning stands conceptually. But GPT-5 is AMAZING at programming and coding compared to GPT-4.
>>82166123
>>82166133
I don't mean that once the definition of AGI is met it will automatically become more economically feasible, but it will be a lot more profitable once the AGI's sustainability is established and it becomes too big to collapse. For the individual businessman or whathaveyou, it will be more economically feasible. For the AI companies, all they have to do is sustain it.