>>712974797
You don't even need VRAM for it, I don't know what that guy is talking about. SillyTavern is just a frontend that connects to an API server. All the work is done on their end. You CAN run things locally, but why bother?
https://sillytavern.app/
Install it from GitHub.
Get an API key from any of the LLM API sites, like chutes.ai
Connect with the key using an OpenAI-compatible URL
Set the model to DeepSeek
voila
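If you're curious what "OpenAI-compatible" actually means, it's just a POST to a /chat/completions endpoint with a bearer token, which is all SillyTavern does behind the curtain. Rough sketch below; the base URL, key, and model name are placeholders, not real chutes.ai values, so swap in whatever your provider gives you:

```python
# Sketch of an OpenAI-compatible chat completion request, i.e. what the
# frontend sends for you. URL, key, and model name are placeholders.
import json
import urllib.request

API_BASE = "https://example-provider.com/v1"  # placeholder base URL
API_KEY = "sk-your-key-here"                  # placeholder API key

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build (but don't send) an OpenAI-compatible /chat/completions request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("deepseek-chat", "hello")
# Actually sending it would be: urllib.request.urlopen(req) -- needs a real key
```

Any provider that accepts this shape works; that's why you can point SillyTavern at basically any of them.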