>>106537340
I ran a MythoMax 13B quant on my card and it was... OK. Then I threw in the towel on LLMs until I started using the DS API.
The hardware for local just isn't there yet. API access will cost me less than $20 for an entire year. That won't even cover the cost of a decent keyboard... I'm just not willing to spend the cash on an LLM inference machine while the market sorts itself out.
My machine is a middling gamer rig; 12GB of VRAM can run AI art models locally, plus modern games. Aside from playing with small models, I'll just rent inference through the API.
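Renting inference that way is just an OpenAI-compatible chat call. A minimal sketch, assuming DeepSeek's documented endpoint and the deepseek-chat model name (both from memory, so check their docs before relying on it):

```python
# Minimal sketch of "renting inference" from the DS API instead of running local.
# base_url, model name, and the env var are assumptions; verify against DeepSeek's docs.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # hypothetical env var name
    base_url="https://api.deepseek.com",     # assumed OpenAI-compatible endpoint
)

resp = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Write a short RP opener."}],
    max_tokens=256,
)
print(resp.choices[0].message.content)
```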
>>106537272
Hey, it's a Drummer.
We should all have dreams.