• ZILtoid1991@lemmy.world
    1 month ago

Due to the nature of the algorithm, LLMs love to jam adjectives in front of as many nouns as possible, and somehow this has become even more prominent. Since there’s a good chance AI is now being trained on AI-generated text, I think it’s the result of a feedback loop. You could call it the sepia filter of text generators — let’s hope it leads to model collapse.

    • kahdbrixk@feddit.org
      1 month ago

Training LLMs with LLMs. What could ever go wrong. Vibe coding the vibe code generator. All for the sake of being the best and the fastest. Skynet, here we come — but the chaotic, degenerate version that kills everything for no reason.