>>717982127
I don't think you understand the sheer scale these things operate on. Even Grok requires an entire server farm and a dedicated power grid just to post shit on twitter, shitting out pollution the whole time.
These AI models they keep making need dozens of machines at minimum just to function in any real capacity. We've hit the "more more more" stage of development, where the only way forward is to slap on additional processing power and hire a couple retards to bolt on a bigger memory bank.
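Rough napkin math on why one box doesn't cut it. All the numbers below are assumptions for illustration (a hypothetical ~1.8-trillion-parameter model stored at fp16, datacenter GPUs with 80 GB of VRAM), not specs of any real model:

# Back-of-envelope: how many GPUs it takes just to HOLD a frontier-scale model in memory.
params = 1.8e12          # assumed parameter count (~1.8 trillion)
bytes_per_param = 2      # fp16 weights, 2 bytes each
gpu_vram_gb = 80         # one high-end datacenter GPU, assumed 80 GB

weights_gb = params * bytes_per_param / 1e9
gpus_for_weights = weights_gb / gpu_vram_gb

print(f"weights alone: {weights_gb:,.0f} GB")            # ~3,600 GB
print(f"GPUs just to fit them: {gpus_for_weights:.0f}")  # ~45

And that's only holding the weights, before you account for KV cache, activations, or actually serving traffic.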
Not only is this comically impractical to ever run privately, you couldn't physically shrink these things to even a third of their size. At that point it's not an engineering problem, it's a physics problem.