Search Results
6/28/2025, 10:27:04 PM
>>105737064
>it's a distilled model, and distilled models are this way because they learn from a teacher AI model, so they've only been trained on synthetic dataset, which makes them ultraslopped
>>105737130
>>put the same image through Kontext Pro and show us how different it is
>this is what I got
So distillation doesn't actually slop models? WTF I've been lied to this whole time.
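For context on what the quoted anon is describing: in knowledge distillation, a student model is trained to match a teacher model's output distribution (soft targets), rather than only hard labels. A minimal sketch of the standard soft-target loss, using temperature-scaled softmax and KL divergence; the function names and toy logits here are illustrative, not from any post in the thread.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-top classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # student predictions
    return float(T**2 * np.sum(p * (np.log(p) - np.log(q))))

teacher = [4.0, 1.0, 0.2]
student = [3.5, 1.2, 0.3]
loss = distillation_loss(student, teacher)
```

Whether this produces "slop" depends on what the teacher's outputs look like, not on the loss itself; a student that perfectly matches a strong teacher just reproduces the teacher's behavior, which is what the Kontext comparison in the thread is showing.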