The term is “model collapse” or “model autophagy disorder,” and any generative model is susceptible to it.
As for why it hasn’t happened much yet: curated datasets of human-generated content with minimal AI content.
If it does happen: you could switch back to an older version, yes, but to train new models on any new information past a certain point you would need to update the dataset while (ideally) introducing as little AI-generated content as possible, and I think that is becoming intractable with the widespread deployment of generative models.
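To make the feedback loop concrete, here’s a toy sketch (my own illustration, not the experimental setup from the papers linked below): fit a Gaussian to some data, sample from the fit, refit on those samples, and repeat. The sample size and generation count are arbitrary choices for the demo.

```python
import numpy as np

# Toy sketch of the feedback loop: each "generation" fits a Gaussian to samples
# drawn from the previous generation's fitted Gaussian, i.e. the model is
# trained purely on its own output.
rng = np.random.default_rng(0)

mu, sigma = 0.0, 1.0     # generation 0: stand-in for the human-generated data
n_samples = 50           # small dataset makes the compounding error visible

for gen in range(1, 201):
    samples = rng.normal(mu, sigma, n_samples)   # data produced by the current model
    mu, sigma = samples.mean(), samples.std()    # next model is fit only on that data
    if gen % 40 == 0:
        print(f"gen {gen:3d}: mu={mu:+.3f} sigma={sigma:.3f}")

# Typical behaviour: sigma shrinks toward zero while mu drifts away from 0 and
# then freezes, so the tails of the original distribution are lost for good.
# That compounding loss of diversity is the qualitative failure mode described
# as model collapse / model autophagy disorder.
```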
I knew the answer was “yes,” but it took me a fuckin’ while to find the actual sources again:
https://arxiv.org/pdf/2307.01850
https://www.nature.com/articles/s41586-024-07566-y