• JackGreenEarth · 7 months ago

    Yes, but what LLM has a large enough context length for a whole book?

    • ninjan@lemmy.mildgrim.com · 7 months ago

      Gemini Ultra will, in developer mode, have a 1 million token context length, so that would fit at least a medium-sized book. No word yet on what it will support in production mode, though.
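
      For scale, a medium-sized book runs roughly 80,000-120,000 words, which is on the order of 100,000-160,000 tokens, comfortably under 1 million. If you want to check a specific text, here is a rough sketch that uses OpenAI's tiktoken encoding as a stand-in (Gemini uses its own tokenizer, and "book.txt" is just a placeholder path, so treat the count as a ballpark):

      ```python
      # Rough sketch: estimate how many tokens a book-length text uses.
      # cl100k_base is only a stand-in; Gemini's tokenizer will count differently.
      import tiktoken

      enc = tiktoken.get_encoding("cl100k_base")

      with open("book.txt", "r", encoding="utf-8") as f:  # placeholder path
          text = f.read()

      token_count = len(enc.encode(text))
      print(f"~{token_count:,} tokens")
      ```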

      • JackGreenEarth · 7 months ago

        Cool! Are there any other models, even FOSS ones, with a context length longer than 4096 or 8192 tokens?
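
        One way to check what a given open model actually supports is to read its Hugging Face config. A minimal sketch, assuming the transformers library is installed; the model id below is only an example, and the attribute that stores the context window varies between architectures:

        ```python
        # Minimal sketch: read a model's configured context window from its
        # Hugging Face config. The model id is an example; swap in any checkpoint.
        from transformers import AutoConfig

        MODEL_ID = "mistralai/Mistral-7B-Instruct-v0.2"  # example only

        config = AutoConfig.from_pretrained(MODEL_ID)

        # Different model families name the context-window field differently.
        for field in ("max_position_embeddings", "n_positions", "seq_length"):
            if hasattr(config, field):
                print(f"{field}: {getattr(config, field)}")
        ```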