I’m usually the one saying “AI is already as good as it’s gonna get, for a long while.”

This article, in contrast, quotes the folks building the next generation of AI - saying the same.

  • makyo@lemmy.world · 3 days ago

    I feel like people are using those terms pretty well interchangeably lately anyway

      • Buffalox@lemmy.world · 3 days ago

        LLM is the technology; a chatbot is an implementation of it. So yes, a chatbot as it’s talked about here is an LLM. Although obviously chatbots don’t have to be LLMs, those that aren’t are irrelevant here.

        • Greg Clarke@lemmy.ca · 3 days ago

          No, a chatbot as it’s talked about here is not an LLM. This article discusses limitations of LLM training data and concludes that chatbots cannot scale as a result. There are many other techniques that can be used to continue improving chatbots.

          • Buffalox@lemmy.world · 3 days ago

            The chatbot is a front end to an LLM; you are being needlessly pedantic. What the chatbot serves you is the result of LLM queries.

            • Greg Clarke@lemmy.ca · 2 days ago

              That may have been true for the early LLM chatbots, but not anymore. ChatGPT, for instance, now writes code to answer logical questions. The o1 models consume background tokens because each response is actually the result of multiple background LLM responses.