To this day, because of how LLMs attend to the tokens in their context, giving one an explicit instruction not to do something can make it MORE likely to do it than if you said nothing at all: naming the forbidden behavior puts it into the prompt, where it stays salient.
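A practical consequence is to phrase instructions positively rather than as prohibitions. Here is a minimal sketch of that idea; the `has_negation` helper, its regex, and the example prompts are illustrative assumptions, not part of any real prompting library:

```python
import re

# Hypothetical helper: flag instructions phrased as prohibitions,
# so they can be rewritten to state the desired behavior instead.
NEGATION_PATTERN = re.compile(r"\b(do not|don't|never|avoid)\b", re.IGNORECASE)

def has_negation(instruction: str) -> bool:
    """Return True if the instruction contains a negated imperative."""
    return bool(NEGATION_PATTERN.search(instruction))

# A prohibition names the unwanted concept, keeping it salient in the prompt...
negative = "Do not mention pink elephants in your answer."
# ...while a positive rephrasing mentions only the desired behavior.
positive = "Keep your answer focused on the topic the user asked about."

print(has_negation(negative))  # True — prohibition detected
print(has_negation(positive))  # False — nothing forbidden is named
```

This is only a linting pass over prompt text; the rewrite itself is still a judgment call about what positive behavior to ask for instead.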