This is overall a good analogy, and I share the trepidation that answers without citations will stifle further curiosity - but the filtering and paraphrasing is not new with LLMs. Human authors have long produced plenty of misleading compression and reconstruction of source material.
Hey, really enjoyed this piece. Clarified some things for me. I mentioned it in my last weekly retrospective: https://novum.substack.com/i/101103540/big-tech-stagnation