As LLMs become the go-to for quick answers, fewer people are posting questions on forums or social media. This shift could make online searches less fruitful in the future, with fewer discussions and solutions available publicly. Imagine troubleshooting a tech issue and finding nothing online because everyone else asked an LLM instead. You do the same, but the LLM only knows the manual, offering no further help. Stuck, you contact tech support, wait weeks for a reply, and the cycle continues—no new training data for LLMs or new pages for search engines to index. Could this lead to a future where both search results and LLMs are less effective?

  • Annoyed_🦀 @lemmy.zip · 25 days ago

    And where does the LLM get its answers? Forums and social media. If the LLM doesn’t have the actual answer, it blabbers like a redditor, and if someone can’t get an accurate answer from it, they start asking on forums and social media again.

    So no, LLMs will not replace human interaction, because LLMs rely on human interaction. An LLM cannot diagnose your car without a human first diagnosing your car.

    • oyo@lemm.ee · 24 days ago

      The problem is that the LLMs have stolen all that information, repackaged it in ways that are subtly (or blatantly) false or misleading, and then hidden the real information behind a wall of search results spanning entire domains of AI trash. It’s very difficult to even locate the original sources or forums anymore.

      • chaosCruiser@futurology.today (OP) · 24 days ago

        I’ve even tried to use Gemini to find a particular YouTube video matching specific criteria. Unsurprisingly, it gave me a bunch of videos, none of which were even close to what I was looking for.

    • chaosCruiser@futurology.today (OP) · 25 days ago

      That’s true. There could be a balance of sorts. Who knows. If LLMs become increasingly useful, people start using them more. As they lose training data, quality goes down, and people shift back to forums etc. It could work that way too.