
Thread 105679312

9 posts, 2 images
Anonymous No.105679312 >>105681394
LLMs are fundamentally the same since the first GPT dropped. They suffer from the same defects, and nothing apart from dataset size has changed; these defects (hallucinations etc.) are inherent to the design and cannot be solved. LLMs are a dead end.
Anonymous No.105679338
+10 smart boy points
now back to work
Anonymous No.105679608 >>105679613 >>105679714 >>105680861
We've hit peak data. Anything produced from now on will be tainted by AI outputs. AIs can't train on AI outputs: it amplifies any distortions and dampens the ambient signal (I call it the mad AI disease).
We will need to create an isolated society with no AI access so that they may generate training data for us. Perhaps this has already happened, perhaps that's us.
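Toy sketch of the compounding-distortion idea, not how real training works: a Gaussian gets refit to its own samples each generation, with everything beyond 2 sigma dropped to stand in for a model under-sampling rare events. Watch sigma collapse:
[code]
import random, statistics

# Each generation: sample from the current "model", drop the tails
# (generative models under-represent rare events), refit to what's left.
mu, sigma = 0.0, 1.0
for gen in range(10):
    draws = [random.gauss(mu, sigma) for _ in range(5000)]
    kept = [x for x in draws if abs(x - mu) <= 2 * sigma]
    mu = statistics.fmean(kept)
    sigma = statistics.stdev(kept)
    print(f"gen {gen}: sigma={sigma:.3f}")
[/code]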
Anonymous No.105679613
>>105679608
future AI will be trained on the data from sentinel island nogs
Anonymous No.105679714
>>105679608
>AIs can't train on AI outputs
They can, and do all the time. That's pretty much how distillations are made. The drawbacks of synthetic data are well understood, and there are a bunch of techniques to prevent over-fitting to potentially bad data. They don't just pipe the output of one LLM straight into a new LLM, human centipede style; there's a bunch of filtering, scoring, and reinforcement learning that happens.
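Rough sketch of what that filtering step looks like in spirit (toy quality gate with made-up helper names; real pipelines use reward models, verifiers, and dedup, not a repetition check):
[code]
def score(sample):
    """Toy quality score: unique-word ratio as a crude degeneracy check."""
    words = sample["output"].split()
    if not words:
        return 0.0
    return len(set(words)) / len(words)

def filter_for_distillation(samples, threshold=0.5):
    """Keep only teacher outputs that pass the quality gate."""
    return [s for s in samples if score(s) >= threshold]

teacher_outputs = [
    {"prompt": "2+2?", "output": "4"},
    {"prompt": "explain x", "output": "the the the the the"},
]
print(filter_for_distillation(teacher_outputs))  # degenerate sample gets dropped
[/code]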
Anonymous No.105680861 >>105681178
>>105679608
They are paying people to write solutions to coding and math problems that AI can ingest.
Anonymous No.105681155
At this point AI should have found a cure for cancer and figured out how to go to the moon
Anonymous No.105681178
>>105680861
Pajeets?
Anonymous No.105681394
>>105679312 (OP)
Self-feeding schizo loop is unavoidable for LLMs. I've been saying it for years.