• bassomitron@lemmy.world

    Would that actually be decent? Even 6B models feel way too rudimentary after experiencing 33B+ models and/or ChatGPT. I haven’t tried those really scaled-down and optimized models, though!

    • gibson@sopuli.xyz

      They’re decent for text-completion purposes, e.g. generating some corpspeak for an email or some “Wikipedia”-like text. You have to know how to write good prompts; don’t try to treat it like ChatGPT.

      For example, if I want to know about the history of Puerto Rico, I would put:

      “The history of Puerto Rico starts in about 480 BC when”
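
      If you want to try that kind of completion-style prompting yourself, here’s a minimal sketch using the llama-cpp-python bindings. The model path is just a placeholder for whatever small GGUF model you have downloaded, not a specific recommendation:

      ```python
      # Completion-style prompting sketch with llama-cpp-python.
      # The model path is a placeholder -- point it at any small GGUF model you have locally.
      from llama_cpp import Llama

      llm = Llama(model_path="./models/your-small-model.gguf")  # hypothetical path

      # The prompt is a text prefix to be continued, not a chat-style instruction.
      prompt = "The history of Puerto Rico starts in about 480 BC when"
      out = llm(prompt, max_tokens=200, stop=["\n\n"], echo=True)
      print(out["choices"][0]["text"])
      ```

      The point is that the model just keeps writing from wherever your text leaves off, so you phrase the prompt as the opening of the document you want, rather than as a question.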

    • acec@lemmy.world

      Decent enough for a model roughly 50 times smaller than ChatGPT. I use orca_mini_3b.
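
      If anyone wants to try that model, here’s a rough sketch using the gpt4all Python bindings; the exact model filename is an assumption, so check what the GPT4All client actually lists for the orca_mini 3B build:

      ```python
      # Rough sketch: running an orca_mini 3B model via the gpt4all Python bindings.
      # The filename below is an assumption -- use whatever the GPT4All model list shows.
      from gpt4all import GPT4All

      model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")
      with model.chat_session():
          reply = model.generate(
              "Write a short, polite follow-up email about a delayed invoice.",
              max_tokens=150,
          )
          print(reply)
      ```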