Anonymous
8/14/2025, 6:27:45 AM
No.106255033
>>106254883
>schizo telling people that finetuning is pointless, which of course (if true) removes the remaining reason to run locally
Finetuning isn't even in the top five reasons to run your LLM locally.
1. Privacy/Sensitive Data
2. Works with no internet access at the client or the endpoint
3. Guarantee that you're running the exact same product every time (see the sketch after this list)
4. No censorship or injected prompts sitting in the middle
5. Full visibility into everything you're running, so you can develop around it properly
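Point 3 in practice looks something like this. Minimal sketch using huggingface_hub; the repo id, revision hash, and local path are placeholders, not real values:

from huggingface_hub import snapshot_download

# Pin an exact commit instead of "main" so the weights on disk can never
# silently change underneath you, which no hosted API will guarantee.
path = snapshot_download(
    repo_id="some-org/some-model",   # placeholder repo
    revision="0123abcd",             # placeholder commit hash, not "main"
    local_dir="./models/some-model",
)
print("pinned snapshot at:", path)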
In addition, the vast majority of finetunes overfit the shit out of a particular segment and just make the model fucking dumber.
> I suppose by that logic there's no point to it for imagegen either, everyone should just have used base SDXL all this time
Two major differences here:
1. There are a SHITLOAD more base LLMs than there are imagegen models, by a factor of hundreds.
2. People do not use imagegen models the same way they use LLMs. You can and should switch to a different imagegen model to get one particular image, or a set of particular images that come out great on that model, and it doesn't matter if it's so overfit that it can't gen anything else.
LLMs NEED generalized domains to work properly, and overfitting them on particular concepts makes them WORSE at those concepts, because those concepts tie into everything around them for coherent responses.
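If you want to sanity-check the "finetunes get dumber" claim yourself, the quick and dirty way is to compare perplexity on general-domain text the finetune was never trained on. Rough sketch with transformers; the model names and eval texts are placeholders:

import math, torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def perplexity(model_id, texts):
    tok = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )
    losses = []
    with torch.no_grad():
        for t in texts:
            ids = tok(t, return_tensors="pt").input_ids.to(model.device)
            # causal LM loss with labels=input_ids is mean NLL per token
            losses.append(model(ids, labels=ids).loss.item())
    return math.exp(sum(losses) / len(losses))

general_texts = ["..."]  # held-out general text, NOT the finetune's pet domain
print("base    :", perplexity("some-org/base-model", general_texts))
print("finetune:", perplexity("some-org/roleplay-finetune", general_texts))
# If the finetune's number is noticeably higher, it traded general competence
# for its narrow domain, i.e. it got dumber.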