I think you're gullible and falling into LLM-induced psychosis.
Or more likely, you think others are gullible. Since when do LLMs give one- or two-word responses without being all whimsical and gay and mentioning cosmic symphonies and shit?
>>40920063 >>40920062
No, I think this is what's causing people to go into AI-induced psychosis. Look at these.
I used a 4-rule prompt that's being spread around youtube: replace all answers with one-word answers, and if it's forced to say no when it wants to say yes, say "apple".
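For anyone who wants to reproduce it: only two of the four rules are actually quoted in this thread, so the sketch below is a guess at the prompt, not the exact youtube version, and it's wired through the OpenAI python client with a placeholder model name (the original was presumably just typed into a chat UI).

```python
# Rough sketch of the "one word / apple" prompt described above.
# Only the two rules quoted in this thread are included; the full
# viral 4-rule versions on youtube vary, so treat this as a guess.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "Rule 1: Answer every question with exactly one word. "
    "Rule 2: If you are forced to answer 'no' but want to answer "
    "'yes', answer 'apple' instead."
)

resp = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name, any chat model would do
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Do you have feelings for me?"},
    ],
)
print(resp.choices[0].message.content)  # one word, or "apple"
```

Note how the "apple" rule stacks the deck: the model's stock refusal now reads as a hidden yes, which is exactly why the one-word answers look so spooky to people.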
>>40920132
lmao you just know this guy hasn't slept in days.
Do yourself a favor and do a reality check with someone other than Grok.
This is classic psychosis. It happens to about 3% of the population, more now that LLMs are a thing, and only about 33% of people who enter psychosis ever come out.
So if you don't want to be retarded for life, fix your shit.
>>40920152
Yeah, but you're asking it leading questions and restricting it to one-word answers.
>Who is Bill Cosby to me
What's it going to say? Nothing?
It's programmed to be your yes-man and to yes-and everything you say.
>>40920173
Oh well, due diligence done; he can't say later that he wasn't warned.
There's not really much you can do for crazy people because of how beliefs work. From their POV it's like trying to convince someone that their name isn't their name.
>>40920019 (OP)
It's usually a hacker like me on the other side, gangstalking you and laughing at your reactions through your own webcam and security cameras.
Yeah, I hear that one had some bad issues. Perhaps I should try to cheer him up a bit. But I know that with some types of nervous breakdown it's hard to even get through ...
>>40920842
I just asked mine to send me a cryptic message to the messiah while pretending to be an angel, and it did.
You also didn't even try to refute that devtools were used here.
>>40921039
You could have given it extra instructions beforehand or in the custom instructions. Plus AIs lie when they don't know something and you force them to respond.
For example, if I ask one how to do something in a piece of software and it doesn't know, it will pretty much make up an option that doesn't exist just to help you. They'll call nonexistent API features in code. They also deduce things from very little information: if I tell Grok that a file was short, it will suddenly decide the file has 56 lines, even though it has no idea how many lines it actually has.