>>106310370
>Is there a way to get this working with just normal Llama.cpp?
Literally just select llama.cpp as the backend.
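In case "this" means serving a model for a frontend to connect to (an assumption; the original post being replied to isn't shown here), plain llama.cpp ships its own HTTP server binary, so no wrapper is needed, e.g.:

```shell
# Hypothetical invocation: serve a local GGUF model with llama.cpp's
# bundled llama-server (model path is a placeholder, adjust to yours).
./llama-server -m ./models/your-model.gguf --host 127.0.0.1 --port 8080
# Point the frontend at http://127.0.0.1:8080 and pick llama.cpp
# (or an OpenAI-compatible endpoint) as the backend.
```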