Anonymous
10/24/2025, 6:47:06 PM
No.29684048
>>29684040
I mean, price mostly. Rigs for running big LLMs are expensive, even the ones with unified memory. People do get Mac/Strix Halo setups with 100+GB of unified memory for running LLMs locally, but it's a tough pill to swallow at 5 grand plus for a mostly single-use system.