>llama.cpp
>warming up the model with an empty run - please wait ... (--no-warmup to disable)

Can I just skip the warmup for good, i.e. always pass --no-warmup?
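
To be concrete, I mean always launching it like this (the binary name and model path below are just placeholders from my setup; --no-warmup is the flag the log itself mentions):

    # placeholder paths; --no-warmup skips the empty warmup run shown in the log above
    ./llama-cli -m ./models/model.gguf --no-warmup -p "Hello"

Is that safe to do every time, or is the warmup run there for a reason?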