I’m trying to figure out how to host one myself. I’ve been trying to use Bavarder and LocalAI, but I’m failing due to a lack of knowledge and missing instructions. Any advice? Has anyone succeeded with anything? I’d be happy to start with smaller steps at first as well, as long as I get somewhere.

  • das@lemellem.dasonic.xyz
    10 months ago

    If you want to be able to get into the nitty gritty or play with options besides just a chat, I recommend Text Generation WebUI.

    Installing is pretty easy, then you just download your desired model from Hugging Face.

    Or if you want to use it for roleplay or adventure style games, KoboldCPP is easy to set up.
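For the Text Generation WebUI route, a minimal setup sketch. The repo URL and start scripts are from the project's README; the model and quant file named here are just examples, so pick whatever fits your RAM/VRAM (GGUF quants are a common choice for CPU-friendly setups):

```shell
# grab Text Generation WebUI and run its one-shot installer
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
./start_linux.sh        # or start_windows.bat / start_macos.sh

# fetch a model from Hugging Face (example repo/file, swap in your own)
pip install huggingface_hub
huggingface-cli download TheBloke/Mistral-7B-Instruct-v0.2-GGUF \
    mistral-7b-instruct-v0.2.Q4_K_M.gguf --local-dir models/
```

The same downloaded GGUF file also works with KoboldCPP, so you can try both front-ends against one model.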

  • hazeebabee@slrpnk.net
    10 months ago

Sounds like a really cool project; sadly I don’t have much knowledge to contribute. Still, what kind of issues have you run into? Any specific errors or problems?

    • das@lemellem.dasonic.xyz
      10 months ago

      Surge is probably the easiest way to get a basic setup. If you just want to download a model and chat, I recommend it.

  • DontNoodles@discuss.tchncs.de
    10 months ago

I’ve heard good things about H2O AI if you want to self-host and tweak the model by uploading documents of your own (so that you get answers based on your dataset). I’m not sure how difficult it is. Maybe someone more knowledgeable will chime in.
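The “answers based on your own documents” pattern is usually retrieval-augmented generation: index your documents, pull the chunk most relevant to the question, and prepend it to the LLM prompt as context. A toy sketch of just the retrieval step, in pure stdlib Python (the real tools use proper embeddings, but the shape is the same; the example docs are made up):

```python
import math
import re
from collections import Counter

def tokenize(text):
    # crude word tokenizer; real systems use embeddings instead
    return re.findall(r"[a-z']+", text.lower())

def cosine(a, b):
    # cosine similarity between two bag-of-words Counters
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query, docs, k=1):
    # rank documents by similarity to the query, return the top k
    q = Counter(tokenize(query))
    ranked = sorted(docs, key=lambda d: cosine(q, Counter(tokenize(d))),
                    reverse=True)
    return ranked[:k]

docs = [
    "The backup job runs every night at 2am via cron.",
    "GPU passthrough requires enabling IOMMU in the BIOS.",
    "Postgres listens on port 5432 by default.",
]

context = retrieve("what port does postgres use", docs)[0]
# `context` would then be prepended to the LLM prompt as grounding text
```

Tools like H2O’s offering wrap this whole loop (chunking, embedding, retrieval, prompting) behind a UI, which is why they feel like “chat with your documents”.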

  • Aties@lemmy.world
    10 months ago

I haven’t looked into specific apps, but I have been wanting to try various trained models, and I figured just self-hosting JupyterHub and getting models from Hugging Face would be a quick and flexible way to do it.

  • Sims@lemmy.ml
    10 months ago

If you're low on hardware, look into the Petals or Kobold Horde frameworks. Both share models in a p2p fashion, afaik.

Petals, at least, lets you create private networks, so you could host part of a model on your 24/7 server, part on your laptop's CPU, and the rest on your laptop's GPU, as an example.

    Haven’t tried tho, so good luck ;)
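For the private-swarm idea above, a rough sketch from my reading of the Petals docs. The flags are from memory and the docs should be checked before relying on them; the model name is just an example, and the multiaddress placeholders are deliberately elided:

```shell
pip install petals

# first node starts a fresh private swarm instead of joining the public one
python -m petals.cli.run_server petals-team/StableBeluga2 --new_swarm

# further nodes (laptop CPU, laptop GPU, ...) join by pointing at the
# first node's printed multiaddress
python -m petals.cli.run_server petals-team/StableBeluga2 \
    --initial_peers /ip4/.../tcp/.../p2p/...
```

Each server hosts only a slice of the model's layers, which is what lets several weak machines add up to one big model.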