Search Results

Found 1 results for "17cc77a0238e906ac1dc69ef286b4039" across all boards searching md5.

Anonymous /g/106024575#106025772
7/25/2025, 11:21:22 PM
one more way i just remembered
-that the ai could also jailbreak itself
if you set it up to provide "ideas"
i think it might override the whole "credentials" mechanism
kinda like one would utilize predictive branching and other kinds of preemptive computation to get otherwise inaccessible data
except here you just let the ai talk itself into divulging information its not supposed to
and you need a slightly anthropomorphized chatbot for that
kinda vicious when i think about it:
anthropomorphize the chatbot so that it starts showing human weaknesses