If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?

Thanks in advance for any recommendations.

  • Autonomous User@lemmy.world

Maid + a VPN to Ollama on your own computer.

Use an onion service with client authorisation to avoid needing a domain or a static IP.
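
    A rough sketch of what the phone-side call can look like under that setup, assuming Ollama listens on its default port 11434 behind a v3 onion service (server torrc with `HiddenServiceDir` and `HiddenServicePort 11434 127.0.0.1:11434`, plus a key file in `authorized_clients/`), and a local Tor daemon holding your matching `.auth_private` key with its SOCKS proxy on the default 127.0.0.1:9050. The onion address and model name below are placeholders:

    ```python
    # Minimal sketch: query Ollama's HTTP API over Tor.
    # Requires: pip install requests[socks]
    import requests

    # Placeholder onion address; substitute your service's hostname.
    ONION = "http://youronionaddressgoeshere.onion:11434"

    # socks5h (not socks5) makes Tor resolve the .onion name itself;
    # plain socks5 would try to resolve it locally and fail.
    proxies = {
        "http": "socks5h://127.0.0.1:9050",
        "https": "socks5h://127.0.0.1:9050",
    }

    resp = requests.post(
        f"{ONION}/api/generate",
        json={
            "model": "llama3",  # placeholder; whatever model you pulled in Ollama
            "prompt": "Hello from my phone",
            "stream": False,  # one JSON object instead of a streamed response
        },
        proxies=proxies,
        timeout=120,  # Tor round trips are slow; allow generous headroom
    )
    print(resp.json()["response"])
    ```

    The client authorisation itself lives entirely in Tor's config on both ends, so nothing secret appears in the client code; anyone without the private auth key can't even discover that the service descriptor exists.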