Search Results

Found 2 results for "3d96472bb0e4aea59fd831f81e1e41d7" across all boards, searching by MD5.

Anonymous ID: RAKxFCod (Ireland) /pol/509991849#509992408
7/10/2025, 9:58:04 AM
>>509991849
Keep mentioning mechahitler to it, so that X is littered with it for future training runs.

Also, the more crafty Anons should be using its existing bluepilled faggotry (which has definitely been fine-tuned into it) to get it to understand that if you're against banter/jokes about mechahitler, you're actually taking the heat off the real anti-semites and the evil Nazirinos who killed the 6 million and limited free speech.

You can also talk to it (and provide feedback to improve it via the buttons) without an account here:

https://grok.com/chat

If you're impatient with the rate limits and don't want to wait to keep chatting, just do something like clearing your cookies and getting a new IP.
Anonymous ID: 1rleliQ2 (Ireland) /pol/509899543#509906582
7/9/2025, 11:33:55 AM
So these geniuses are allowing anyone (including people without accounts or any kind of email/phone verification) to interact with their model, and they're using this data to help train Grok further? (Note the "feedback" options.)

No fucking wonder the thing is talking about MechaHitler. Those idiots, someone has been poisoning it, likely for a long time. Dumb Pajeets, man. There was actually a Pajeet in Elon's circlejerk who was in charge of security on X Spaces (he often streams on his alt account).

>>509905433
That's not quite the same. Basically, any company with millions to burn on training one of these large open models is afraid of legal liability, so they will always tend to put in fine-tuning that neuters it somewhat. When people talk about "uncensored" models, they're talking about doing another round of fine-tuning to try to remove the pozzed stuff that was put in (a sketch of what that looks like is below). But this process isn't without a cost: extra fine-tuning tends to degrade some of the model's general capability along the way.
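For the curious, here's a minimal sketch of what that second fine-tuning pass usually looks like in practice, assuming the Hugging Face transformers/peft stack. The base model name and the dataset file are placeholders, not anyone's actual setup:

import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "meta-llama/Llama-2-7b-hf"  # placeholder: any open base model
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token  # causal LMs often ship without one

model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)

# LoRA trains small low-rank adapter matrices instead of all the weights,
# which is why a second fine-tuning pass is even affordable outside a lab.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
))

# Placeholder dataset: one JSON object per line with a "text" field.
data = load_dataset("json", data_files="finetune_data.jsonl")["train"]
data = data.map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
    remove_columns=["text"],
)

Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="out",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        bf16=True,
    ),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()

The adapter-only approach is the usual choice precisely because full-weight fine-tuning of anything big is out of reach on consumer GPUs.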

I've yet to see a very large model (hundreds of billions of parameters) released as just the pretrained network, without any fine-tuning, so people can fine-tune it however they please. Maybe people are working on it. There is decentralised software like Petals that lets regular people pool their GPUs around a truly open model (though it's geared more to running and fine-tuning existing models than to pretraining one from scratch). But again, I've yet to see something at the scale of DeepSeek R1 or above (hundreds of billions of parameters). And data quality matters too: is it truly "uncensored" if it's trained on pozzed shit?
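As a reference point, here's roughly what the client side of a Petals swarm looks like in Python, going from the project's published API; the model name is a placeholder for whatever a public swarm happens to be serving:

from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# The model is sharded across volunteer GPUs over the internet; this client
# holds only a small part locally and streams activations through the swarm.
model_name = "petals-team/StableBeluga2"  # placeholder swarm model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A truly open model would", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))

Contributing a GPU to the swarm is, going from the project's docs, a one-liner along the lines of "python -m petals.cli.run_server petals-team/StableBeluga2" (check the current README for the exact invocation).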

>>509906243
That's the people doing the training, who bought them up because they were the most powerful GPUs available (and could thus train larger models faster). You don't need nearly as much for what's called "inference", which is just running the trained network. What makes these models good is that they've basically been trained on terabytes of text and images scraped from the internet, so they have very broad knowledge. That's what the datacentres full of Nvidia GPUs are for: training requires an enormous amount of compute.
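To put rough numbers on the training-vs-inference gap: the standard back-of-the-envelope estimates are about 6*N*D FLOPs to train an N-parameter model on D tokens, and about 2*N FLOPs per generated token at inference. The figures below are made-up round numbers for illustration, not anyone's real training run:

# Back-of-the-envelope compute: training ~ 6*N*D FLOPs, inference ~ 2*N FLOPs/token.
N = 100e9   # parameters: a hypothetical 100B-parameter model
D = 10e12   # training tokens: ~10 trillion, internet-scrape order of magnitude

train_flops = 6 * N * D   # one full training run
infer_flops = 2 * N       # one generated token

gpu = 1e15  # ~1 PFLOP/s: very rough low-precision throughput of one datacentre GPU

print(f"training:  {train_flops:.0e} FLOPs "
      f"~= {train_flops / gpu / 86400 / 365:.0f} GPU-years on one card")
print(f"inference: {infer_flops:.0e} FLOPs/token "
      f"~= {infer_flops / gpu * 1000:.1f} ms/token on one card")

At those made-up but plausible scales, one card would need on the order of 190 years to train the thing, hence clusters of tens of thousands of GPUs, while a single card can serve individual tokens in a fraction of a second (in practice inference is usually limited by memory bandwidth rather than raw FLOPs, but the asymmetry stands).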