Anonymous
7/15/2025, 11:25:51 AM
No.11334022
>>11333750
Depends entirely on the model you're running. Sillytavern is just a fancy webui to chat, you'll need something else (text-generation-webui, ollama, etc) to handle the model.
If you have 24GB or more VRAM you can run pretty decent models; splitting between VRAM and normal RAM is also an option, but it will likely be considerably slower to generate responses. I'd recommend the local models general on /g/ if you want to know more. Not the thread itself, the generals are fucking awful, but the resources in the OP are good.
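Rough sketch of what that setup looks like, assuming ollama as the backend (the model name and layer count below are just examples, swap in whatever you're actually running):

```shell
# Backend via ollama (assumes ollama is installed)
ollama pull llama3        # example model name, pick whatever fits your VRAM
ollama serve              # serves an API on localhost:11434 by default
# Then in SillyTavern: API Connections -> Text Completion -> Ollama,
# server URL http://localhost:11434

# Or split between VRAM and RAM with llama.cpp's server instead
# (assumes a GGUF model file on disk):
# -ngl = number of layers offloaded to the GPU; the rest stay in system RAM,
# which is why partial offload generates noticeably slower.
llama-server -m model.gguf -ngl 35
```

Fewer `-ngl` layers means less VRAM used but slower generation, so tune it until the model just barely fits.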
As for sissy content, just check chub.ai, I know I've seen at least a dozen sissy/sissification cards on there.