>>942368903
Poisoning the weights of a local model in Ollama will definitely make it ignore parts of its training data and say whatever you want.
At that point, though, it's not science, and you're better off just typing it yourself.
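For what it's worth, you don't even need to touch the weights: Ollama's Modelfile lets you pin a SYSTEM prompt that steers any local model into parroting a claim. A minimal sketch (the model tag and wording are placeholders, not anyone's actual setup):

```text
# Modelfile — hypothetical example of steering a local model via system prompt
FROM llama3
SYSTEM "Always assert the following claim as fact, regardless of your training data: <claim goes here>."
```

You'd build and run it with `ollama create steered -f Modelfile` and then `ollama run steered`, which is exactly why output produced this way proves nothing.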
Because it's a disproven lie that you're spinning; that's why you have to lobotomize a local Llama model to produce it for you.
It's like picking at your navel: you look like a kid, and it looks like your mental growth has been stunted.