Anonymous
7/25/2025, 12:00:12 AM
No.106013665
>>106013579
Hmm, after looking further into how thinking is handled in llama.cpp, I believe it's hardcoded to <think>. It won't work with your model. That's quite bad, since most frontends and tools won't behave correctly.
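To illustrate the problem: a minimal sketch of the kind of tag-based splitting a frontend typically does, assuming the reasoning is wrapped in literal <think>...</think> markers. A model emitting different tags (the situation described above) would fall through to the "no reasoning found" path. The function name and tag defaults here are illustrative, not llama.cpp's actual code.

```python
import re

def split_reasoning(text, open_tag="<think>", close_tag="</think>"):
    # Assumes the model wraps its chain of thought in the given tags;
    # a model using different markers returns ("", text) unchanged,
    # which is why hardcoded tags break other models.
    pattern = re.escape(open_tag) + r"(.*?)" + re.escape(close_tag)
    m = re.search(pattern, text, re.DOTALL)
    if not m:
        return "", text
    reasoning = m.group(1).strip()
    answer = (text[:m.start()] + text[m.end():]).strip()
    return reasoning, answer
```

A model using, say, <reasoning> tags would need the markers passed in explicitly, which most frontends don't expose.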