As LLMs become the go-to for quick answers, fewer people are posting questions on forums or social media. This shift could make online searches less fruitful in the future, with fewer discussions and solutions available publicly. Imagine troubleshooting a tech issue and finding nothing online because everyone else asked an LLM instead. You do the same, but the LLM only knows the manual, offering no further help. Stuck, you contact tech support, wait weeks for a reply, and the cycle continues—no new training data for LLMs or new pages for search engines to index. Could this lead to a future where both search results and LLMs are less effective?

  • chaosCruiser@futurology.today (OP) · 3 days ago

    Sure does, but somehow many of the answers still work well enough. In many contexts, the hallucinations are only speed bumps, not show-stopping disasters.

    • oakey66@lemmy.world · 3 days ago

      It told people to put glue in their pizza to make the dough chewy. It’s pretty fucking awful.

      • chaosCruiser@futurology.today (OP) · 3 days ago

        Copilot wrote me some code that totally does not work. I pointed out the bug and told it exactly how to fix the problem. It said it fixed it and gave me the exact same buggy trash code again. Yes, it can be pretty awful. LLMs fail in some totally absurd and unexpected ways. On the other hand, it knows the documentation of every function, but somehow still fails at some trivial tasks. It’s just bizarre.

        • oakey66@lemmy.world · 3 days ago

          It does this because it inherently hallucinates. It's just a statistical word guesser that sounds human because it amalgamates its training data and predicts the next word. It has ingested so much text that it can sound human, but it has no concept of right and wrong, even when you tell it that it's wrong. It doesn't understand anything. That's why it sucks, and that's why it will always suck. It will not replace search because it makes shit up. I use it for coding here and there as well, and it keeps making up functions that don't exist or attributing functions to packages that aren't real.
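
          To make the "predicts the next word" point concrete, here is a minimal toy sketch in Python (a made-up two-word probability table, not any real model): the loop only ever asks "what word is likely to come next?", so the output can read fluently even though nothing in it checks whether the result is true.

          ```python
          import random

          # Toy stand-in for the probabilities an LLM learns from its training data.
          # (Entirely made up for illustration; a real model has billions of parameters.)
          next_word_probs = {
              "the":      {"function": 0.6, "package": 0.4},
              "function": {"returns": 0.7, "does": 0.3},
              "package":  {"provides": 1.0},
              "returns":  {"a": 1.0},
              "provides": {"a": 1.0},
              "a":        {"list": 0.5, "string": 0.5},
          }

          def generate(word, length=4):
              """Autoregressive loop: repeatedly sample a likely next word and append it."""
              words = [word]
              for _ in range(length):
                  options = next_word_probs.get(words[-1])
                  if not options:
                      break  # no known continuation in this toy table
                  choices, weights = zip(*options.items())
                  words.append(random.choices(choices, weights=weights)[0])
              return " ".join(words)

          # e.g. "the function returns a string" -- fluent-sounding, but nothing here
          # knows or cares whether that statement is correct.
          print(generate("the"))
          ```

          The same loop will happily emit a plausible-looking function or package name whether or not it exists, which is exactly the coding failure described above.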