>>717359448
Nah, as long as it's running on a language model, there's a tiny chance every time it thinks that it'll hallucinate some insane shit, no matter what.