
Foundation model labs already pay humans to generate original content, especially in technical areas that are underrepresented in general web scrapes. I expect that trend to continue.

Further, the "model collapse" predicted a few years ago has so far failed to materialize. Maybe we're still too early, and we're not yet seeing the negative effects of OpenAI training on OpenAI output. But maybe "slop" just isn't rewarded as much as human content, and having humans in the loop (even just as readers) is preventing a slide into incoherence.

Will LLMs eventually disincentivize people from producing and publishing original content? If that content is easily replicated by an LLM query, maybe. And maybe that's not the worst thing in the world. Five years ago I would have bought an "FFmpeg Cookbook" from O'Reilly; now I would just tell Claude exactly what I'm trying to achieve. As a consumer, I'm better off, and arguably we've saved the author of a hypothetical FFmpeg Cookbook weeks out of their precious life. Weeks they could spend doing something, anything, more valuable than rewording FFmpeg documentation.
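
To make that concrete, here's the kind of recipe a cookbook chapter would boil down to (the filenames and settings are just illustrative, not from any actual book):

  # Cut a 30-second clip starting at 1:00, re-encode video with x264, copy the audio stream
  ffmpeg -ss 00:01:00 -i input.mp4 -t 30 -c:v libx264 -crf 23 -c:a copy clip.mp4

Exactly the sort of thing that used to take a chapter and a lot of flipping back to the flag reference, and now takes one question.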




