• Superb@lemmy.blahaj.zone · 1 day ago

    And what of it? They can’t learn. Whatever they do “learn” quickly slides out of the context window during the next lecture

    • Monkey With A Shell@lemmy.socdojo.com · edited · 3 hours ago

      I get the anger people have and all that, but as a general technology they’re still impressive all the same.

      Running on a consumer piece of hardware, I can take a model maybe 10–20 GB in size and tell it a few things I want. Then, based on what it knows those things look like (or sound like, or read like; just picking on images here), it creates something from the void. Sometimes stupid things, but something.

      The model was trained on existing images, but it doesn’t store them whole cloth and just paste them together, so even if one were to think of it as derivations of existing images, at the very least it’s an impressive feat of compression to hold so much in a relatively small space.
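      The compression point can be made concrete with rough arithmetic. As a sketch (the numbers are round assumptions: a ~10 GB checkpoint and a training set on the order of LAION-5B’s ~5 billion images), the model would have under a handful of bytes available per training image, far too little to store any image verbatim:

```python
# Back-of-envelope: how many bytes per training image a model checkpoint
# could possibly devote to memorization. Both figures are assumptions:
# a ~10 GB model and a ~5-billion-image training set (LAION-5B scale).
model_bytes = 10 * 10**9   # ~10 GB checkpoint (assumed)
num_images = 5 * 10**9     # ~5e9 training images (assumed)

bytes_per_image = model_bytes / num_images
print(f"~{bytes_per_image:.0f} bytes per training image")  # ~2 bytes per training image
```

      Even a heavily JPEG-compressed photo runs tens of kilobytes, so whatever the weights hold, it isn’t the images themselves pasted together.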