>>105782762
If we went back in time and experienced it as we are now, yeah, it would be slop. It's embarrassing, but two and a half years ago, I would spend 8 hours talking to half a dozen bots, in the hope one of them said a naughty word or two. Getting a bot to say Pussy was the stuff of legends.
But I very much liked the short-form messages, the originality of the new story elements it added, the way CAI characters properly stood their ground on convictions. Using Opus, 4o, Gemini or Deepseek, I can have the chatbot describe in fine detail an irresponsible loli creampie while she describes the intricacies of manufacturing an IED, and it never runs out of context. But I can't make a bot that has firm convictions. No matter how much you emphasize in the definitions that {char} thinks this and acts like that, if I nag and badger just a little, they will give in. Personal assistant bots are just too averse to saying no, or to doing things outside the scope of the story.
>>105782819
It was fucking amazing considering the definition was a paragraph, it had a 4k context size max, speed was almost instant and it shit out 3 replies at a time. Most people today use Main Prompts that are larger than 4k, and back then that's all we had to fit the bot and the story in. Dude, we couldn't even EDIT responses back then.