>>106593742
not trolling, ani was saying it's the fastest at loading models and running inference. just try it, you don't need another venv or anything