I’m interested in hosting something like this, and I’d like to hear about your experiences with it.

The main reasons to host this are privacy and the ability to integrate my own PKM data (markdown files, mainly).

Feel free to recommend videos, articles, other Lemmy communities, etc.
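
For context, the kind of integration I have in mind is roughly: collect my markdown notes and include them as context in prompts to whatever local model I end up hosting. A rough sketch (the notes directory, the prompt wording, and the helper names are just placeholders, not a real setup):

```python
# Rough sketch: gather markdown notes and build a prompt for a local model.
# The notes directory and prompt format below are placeholders.
from pathlib import Path

def load_notes(notes_dir: str, limit: int = 5) -> str:
    """Concatenate the first few markdown files found under notes_dir."""
    notes = []
    for path in sorted(Path(notes_dir).rglob("*.md"))[:limit]:
        notes.append(f"## {path.name}\n{path.read_text(encoding='utf-8')}")
    return "\n\n".join(notes)

def build_prompt(question: str, notes_dir: str = "~/pkm") -> str:
    """Prepend personal notes as context before the actual question."""
    context = load_notes(str(Path(notes_dir).expanduser()))
    return f"Using these notes as context:\n\n{context}\n\nQuestion: {question}"
```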

  • Buffalobuffalo@lemmy.dbzer0.com
    11 months ago

    Dbzero Lemmy has a relationship with the Horde AI shared LLM group. My primary use is chat roleplay, but they have streamlined guides for hosting your own models for personal or horde use. One of the primary interfaces is SillyTavern, but they integrate numerous models.

  • SuperiorOne@lemmy.ml
    11 months ago

    I’m actively using ollama with Docker to run the llama2:13b model. It generally works fine, but it’s heavy on resources, as expected.
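
    For anyone trying the same setup, here is a minimal sketch of querying a local Ollama instance from Python over its HTTP API (Ollama listens on port 11434 by default and exposes an /api/generate endpoint; the ask_ollama helper name is just for illustration):

    ```python
    # Minimal sketch: query a local Ollama instance over its HTTP API.
    # Assumes Ollama is running (e.g. via Docker) on the default port 11434
    # and that the llama2:13b model has already been pulled.
    import json
    import urllib.request

    def ask_ollama(prompt: str, model: str = "llama2:13b") -> str:
        payload = json.dumps({
            "model": model,
            "prompt": prompt,
            "stream": False,  # return one JSON object instead of a stream
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask_ollama("Summarize the idea of a personal knowledge base in one sentence."))
    ```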