>>105660397
I think model collapse only happens when you train on model outputs repeatedly over many generations without mixing human writing back into the corpus to ground the model.
The idea Musk mentions of using AI to curate the training set should be fine, though spot-checking the curation would be prudent.
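A toy sketch of the recursive-training failure mode, assuming the standard 1-D Gaussian demonstration (fit a distribution to its own samples each generation; all names and parameters here are illustrative, not from any real training pipeline):

```python
import random
import statistics

def fit_and_sample(data, n):
    """Fit a Gaussian (mean, std) to the data, then generate n samples from the fit.
    This stands in for 'train a model on a corpus, then generate a new corpus'."""
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
n = 50
human = [random.gauss(0, 1) for _ in range(n)]  # stand-in for human-written data

# Pure self-training: each generation is fit only to the previous generation's
# outputs. Estimation error compounds and the variance drifts toward zero.
data = human
for _ in range(1000):
    data = fit_and_sample(data, n)
collapsed_std = statistics.stdev(data)

# Grounded self-training: mix fresh human data back in every generation,
# which keeps the distribution anchored near the original.
data = human
for _ in range(1000):
    data = fit_and_sample(data, n)[: n // 2] + [random.gauss(0, 1) for _ in range(n // 2)]
grounded_std = statistics.stdev(data)

print(f"pure self-training std: {collapsed_std:.4f}")
print(f"grounded self-training std: {grounded_std:.4f}")
```

The ungrounded run's standard deviation shrinks toward zero over the generations, while the run that keeps human data in the mix stays near the original spread.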