
Thread 106330464

16 posts 8 images /g/
Anonymous No.106330464 >>106330504 >>106330612 >>106330899 >>106331494 >>106333431 >>106334437 >>106335539 >>106338053 >>106339425 >>106339444
>AI companies do be like I'll spend 12 billion dollars to make 700 million in profit this will be the future
Anonymous No.106330504 >>106336754 >>106337463
>>106330464 (OP)
Two Minute Papers estimated that OpenAI's model cost $10m to train, and it's widely reported that DeepSeek's cost around $6m.
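For what it's worth, the widely cited ~$6m DeepSeek figure is just GPU-hours times an assumed rental rate: the V3 technical report quotes roughly 2.79M H800 GPU-hours and assumes $2 per GPU-hour (a paper assumption, not an audited cost). A sketch of that arithmetic:

```python
# Back-of-envelope reproduction of DeepSeek-V3's cited training cost.
# Both numbers come from the public technical report; the $2/hr rental
# rate is an assumption made there, not a measured expense.
gpu_hours = 2.788e6        # total H800 GPU-hours for the training run
rate_per_hour = 2.00       # assumed rental price, USD per GPU-hour

cost = gpu_hours * rate_per_hour
print(f"${cost/1e6:.2f}M")  # → $5.58M
```

Note this excludes research, failed runs, salaries, and the data pipeline, which is part of why the headline number is contested.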
>I need 24 quadrillion dollars ((for national security purposes))
Anonymous No.106330612
>>106330464 (OP)
(and that's a good thing)
Anonymous No.106330899
>>106330464 (OP)
The 12 billion they spent wasn't theirs to begin with, so that's a perfectly viable strategy.
Anonymous No.106331494
>>106330464 (OP)
>do be like
Anonymous No.106333431 >>106339282
>>106330464 (OP)
Are they really making that much money?
I mean, the running costs for CPU/GPU farms have to be astronomical.
Anonymous No.106334437
>>106330464 (OP)
Is this the new "I want AI to fail" high grade copium?
Anonymous No.106335539
>>106330464 (OP)
>debt based system
>has debt
wow

ron paul was right you know
Anonymous No.106336728
Their real purpose is monitoring everyone in real time; LLMs are just a byproduct.
Anonymous No.106336754
>>106330504
Those dollarinos probably go into future research.
Anonymous No.106337463
>>106330504
>SOTA LLM
>$10m training costs
Calling this an uneducated guess would be an understatement. But hardly surprising: you and your e-celeb pop-science grifter fell for the DeepSeek $6m bait.
A very stark oversimplification: training needs at least 4 times as much memory as inference, and obviously vastly more compute and bandwidth if it's to finish before the shareholders die of old age. That means a shitload of expensive hardware and significant power draw, as well as overpaid experts to run this shit.
DeepSeek mattered because it was the best open model and you only need about $300k in GPUs to run it properly; that's nothing compared to what you need to train it.
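The "training needs several times the memory of inference" claim follows from what an optimizer has to keep around per parameter. A rough sketch, assuming bf16/fp16 weights and a standard mixed-precision Adam setup (activations, KV cache, and sharding overheads ignored, so treat it as a lower bound):

```python
# Per-parameter memory: mixed-precision Adam training vs. plain inference.
# This is the textbook accounting only; real runs add activation memory,
# so the true training footprint is even larger.
BYTES_FP16, BYTES_FP32 = 2, 4

def inference_bytes_per_param():
    return BYTES_FP16                  # just the half-precision weights

def training_bytes_per_param():
    return (BYTES_FP16                 # working (half-precision) weights
            + BYTES_FP16               # gradients
            + BYTES_FP32               # fp32 master copy of the weights
            + 2 * BYTES_FP32)          # Adam first and second moments

ratio = training_bytes_per_param() / inference_bytes_per_param()
print(ratio)  # → 8.0
```

Eight-to-one under these assumptions, so "at least 4 times" is if anything conservative.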
Anonymous No.106337621
It's to get people used to LLMs and dependent on them. Once it's an established technology they'll start enshittifying them and putting ads in them, possibly in the generated text itself. Everyone I know uses them, so we're almost there.
Anonymous No.106338053
>>106330464 (OP)
>I'll spend 12 billion dollars of other people's money to make 700 million for myself.
Fixed.
Anonymous No.106339282
>>106333431
The operating costs aren't particularly high.
Take xAI's Colossus as an example: it originally featured 100k H100 GPUs and draws 150 MW (which doesn't mean the GPUs are 100% utilized, mind you).
Regardless, that puts power costs somewhere between $50-100 million annually, which really isn't that high all things considered.
Much higher is the cost of building the damn data center, estimated at $3b, so power is only 1/60th to 1/30th of that...
It's estimated that GPT-4 could have been trained quite quickly on this cluster. Some fags said 4 days; let's be a bit more conservative and go with 1-2 weeks. The value of these cards drops quickly, so say that after 5 years of wear and tear they're almost worthless, and whatever value remains might just cover disassembling the cluster. I mean, Hopper has already been superseded by Blackwell, so...
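The power-cost range above is just megawatts times hours times an assumed electricity price; a sketch, where the $/kWh band is the only real unknown:

```python
# Annual electricity bill for a 150 MW cluster at assumed industrial rates.
def annual_power_cost(draw_mw, rate_usd_per_kwh, hours=8760):
    """MW -> kW, times hours in a year, times price per kWh."""
    return draw_mw * 1000 * hours * rate_usd_per_kwh

for rate in (0.04, 0.08):  # assumed industrial electricity price band
    print(f"${annual_power_cost(150, rate)/1e6:.0f}M/yr at ${rate}/kWh")
# → $53M/yr at $0.04/kWh
# → $105M/yr at $0.08/kWh
# For comparison: a $3b build cost over a 5-year life is ~$600M/yr.
```

So depreciation on the hardware, not electricity, dominates the annual cost, which is the point being made.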
Grok has to bring in literally billions within a few years or it's just a massive loss. The same is true for every AI company. OpenAI might have the advantage of a sweetheart deal from Microsoft, but that's just a form of subsidy...
Meta is absolutely making no money on this, and they have even more GPUs...
Neither is Google.

They're all losing money out the ass, but they don't care: the stock market expects them to invest massively, and because investors are colossal retards who can see at most one quarter ahead, the companies are motivated to push their stock up to keep investors happy, so they throw more money at it.

For example, on average, x/xAI would need every Grok user to pay north of $200 annually to break even before the cards are trash. That ain't happening, we all know that, and it's the same for every other service. People will stop using an LLM if its free plan becomes unusable, and other hosts will just offer a better free tier in the hope of gaining more customers.
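The per-user break-even figure is straightforward to reconstruct. The cost inputs below echo the estimates in this post; the user count is a purely hypothetical illustration, not a real Grok statistic:

```python
# Break-even subscription price per user over the hardware's useful life.
# Cost figures follow the thread's estimates; the user count is made up
# for illustration only.
build_cost = 3.0e9        # data-center build-out, USD
lifetime_years = 5        # assumed useful life of the GPUs
power_per_year = 75e6     # midpoint of the $50-100M power estimate
users = 3e6               # hypothetical active-user count

annual_cost = build_cost / lifetime_years + power_per_year
per_user = annual_cost / users
print(f"${per_user:.0f} per user per year")  # → $225 per user per year
```

With 3M hypothetical users that lands north of $200 a year, and that still ignores salaries, networking, and inference serving costs.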

Tl;dr it's a fking bubble
Anonymous No.106339425
>>106330464 (OP)
The American business strategy
Anonymous No.106339444
>>106330464 (OP)
They'll get their returns once AGI is achieved (it's predicted to come in 2027)