
Thread 106949258

22 posts 6 images /g/
Anonymous No.106949258 [Report] >>106949273 >>106949293 >>106949453 >>106949522 >>106950075 >>106950558 >>106950699 >>106950936 >>106951082 >>106951643 >>106951667
I want to run local language models, is this 32GB RAM MacBook Pro M5 a good purchase? It has about 150GB/s memory bandwidth.
Also with 32GB RAM which language models can I run?
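A back-of-the-envelope sketch of what 32GB and 150GB/s buy you, assuming a weights-only GGUF-style footprint and bandwidth-bound decode (the ~0.56 bytes/param figure for a Q4_K_M-style quant is an approximation, and KV cache/OS overhead is ignored):

```python
def model_ram_gb(params_b: float, bytes_per_param: float) -> float:
    """Rough weight footprint in GB: billions of params * bytes per param.

    Ignores KV cache, context, and OS overhead, so treat it as a floor.
    """
    return params_b * bytes_per_param


def decode_tokens_per_s(params_b: float, bytes_per_param: float,
                        bandwidth_gb_s: float) -> float:
    """Decode on a dense model is roughly memory-bandwidth-bound:
    every generated token streams all weights through memory once,
    so tokens/s is at most bandwidth / weight size."""
    return bandwidth_gb_s / (params_b * bytes_per_param)


# Dense 32B model at a ~4.5-bit quant (~0.56 bytes/param, assumed):
print(model_ram_gb(32, 0.56))              # ~18 GB of weights
print(decode_tokens_per_s(32, 0.56, 150))  # single-digit tok/s on 150 GB/s
```

By this estimate a 4-bit dense ~32B model fits in 32GB with room for the OS, but decode tops out around 8 tokens/s on 150GB/s of bandwidth.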
Anonymous No.106949273 [Report] >>106949305
>>106949258 (OP)
>run local language models
this is as abstract as stating "i will connect to the internet"

what do you want to run?
what purpose?
Anonymous No.106949279 [Report]
Why do poorfags want to do rich people stuff? Pay your Gemini Pro subscription and fuck off. You don't even know how much RAM you need
Anonymous No.106949293 [Report]
>>106949258 (OP)
you should either buy the m4 pro chip or wait till the m5 pro chip. don't buy the base chip models for such work.
Anonymous No.106949305 [Report] >>106949534 >>106950075 >>106950862 >>106950978 >>106951684
>>106949273
Like smaller versions of DeepSeek R1 and the eventual R2 release. DeepSeek R2 might have a version that can run at 32GB RAM.
Anonymous No.106949317 [Report]
if you want to do something gpu intensive you'd want to wait for the pro/max models. they have like 4-10x more gpu cores iirc, and you want to max out the ram, i think it goes up to 128gb
with 32gb you can only run the shittiest models, so you don't want to spend $2k to run that trash either
bonus tip: don't buy it directly from apple. retailers are always running 10-30% sales while apple always charges the launch price
Anonymous No.106949453 [Report]
>>106949258 (OP)
1. a $2000 m5 mac is a worse option than a $2000 5090
2. 32gb limit is relatively strict for mac chips. m5 pro/max/ultra will have more RAM, which may provide value over the 5090
Anonymous No.106949522 [Report]
>>106949258 (OP)
Are you sure you wanna do that, or do you just have money to play around with? Also with that memory i think you could run the 32B models, but that's kinda it
Anonymous No.106949534 [Report]
>>106949305
>Like smaller versions of DeepSeek R1 and the eventual R2 release.
It's worth noting that these are actually just Qwen/Llama finetuned on Deepseek outputs.
Anonymous No.106950075 [Report]
>>106949258 (OP)
>a good purchase?
fuck no lol

>>106949305
ram is only part of the problem. an arm shitbook won't come anywhere near what someone with a current nvidia card can do.
Anonymous No.106950558 [Report]
>>106949258 (OP)
this won't bring you any joy beyond 1 week
Anonymous No.106950596 [Report] >>106950742 >>106951006
Why the fuck does everything have 512 gb SSD now? I get it that HDD is a distant memory now but it seems retarded to have a fraction of the storage space low end computers had a decade ago
Anonymous No.106950699 [Report]
>>106949258 (OP)
Buy a 2nd hand GPU you fucking retard
Anonymous No.106950742 [Report]
>>106950596
It probs seems less retarded when some of your money comes from paid cloud storage, whether directly or not.
Anonymous No.106950862 [Report]
>>106949305
you want 128GB of RAM at the very minimum for that
Anonymous No.106950936 [Report]
>>106949258 (OP)
>32GB
you don't want to cheap out on this one, fa.m
Anonymous No.106950978 [Report]
>>106949305
You'd probably be better off investing in a $1000 Nvidia card and 128 GB of RAM in a desktop.
Otherwise, here's a video that may give you an answer.

https://www.youtube.com/watch?v=jdgy9YUSv0s
Anonymous No.106951006 [Report]
>>106950596
If you want an HDD that won't die in a month, these days you just about have to get a decade-old one off ebay.
There was a thread on here a week or two ago that explained why.
>t. went through several new HDDs recently, both Seagay and WD.
Anonymous No.106951082 [Report]
>>106949258 (OP)
i run gpt-oss 20b on my 18gb mbp. i have to close other stuff, but the inference itself runs great and the model is really good. I don't think there's anything better at that size, so I can recommend it. with 32gb you'd be very comfortable running the llm and your other stuff at the same time
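The reason a ~20B model runs well in 18GB: gpt-oss-20b is a mixture-of-experts model shipped in ~4-bit MXFP4, with roughly 21B total but only about 3.6B active parameters per token, so decode only streams the active experts' weights. A rough sketch of the speed ceiling (the ~3.6B active count, ~0.5 bytes/param, and 150GB/s bandwidth are assumptions):

```python
def moe_decode_tok_s(active_params_b: float, bytes_per_param: float,
                     bandwidth_gb_s: float) -> float:
    """For a mixture-of-experts model, each token only reads the active
    experts' weights, so bandwidth-bound decode speed scales with the
    ACTIVE parameter count, not the total."""
    return bandwidth_gb_s / (active_params_b * bytes_per_param)


# gpt-oss-20b-ish: ~3.6B active params at ~0.5 bytes/param (4-bit), 150 GB/s
print(moe_decode_tok_s(3.6, 0.5, 150))  # theoretical ceiling; real-world is lower
```

That ceiling is around 80+ tokens/s, versus single digits for a dense 32B model on the same bandwidth, which is why MoE models feel so much snappier on Macs.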
Anonymous No.106951643 [Report]
>>106949258 (OP)
>$2000+
>Non user upgradable 512GB storage
Why does Apple treat its high-paying customers like this?
Anonymous No.106951667 [Report]
>>106949258 (OP)
Mac mini is unironically the best choice for local LLMs
Anonymous No.106951684 [Report]
>>106949305
Just run gpt-oss and Qwen3-30B-A3B
deepseek-r1 distills overthink all the time