If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?
Thanks in advance for any recommendations.
You want to run it on the phone itself? I don’t think any phone is powerful enough for that. The issue with AI assistants isn’t just privacy; it’s also the resource consumption (and of course the stolen content). It’s so high that only the big companies with huge server farms can do it.
If you just want a voice assistant for simple commands, I’ve heard of an open-source local assistant called Dicio. But I don’t think you can talk to it like ChatGPT or something.
I’ve successfully run small-scale LLMs on my phone; slow, but very doable. I run my main AI system on an older midrange gaming PC with no problems at all.
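For anyone wanting to try this themselves, here’s a rough sketch of how a setup like mine could look in Termux. The model filename and quantization level are just examples (any small GGUF model works), and llama.cpp’s binary names and flags have changed over time, so check the current README if something doesn’t match:

```shell
# Inside Termux on Android -- install build tools
pkg install -y git cmake clang

# Build llama.cpp from source
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build && cmake --build build --config Release

# Run an interactive chat with a small quantized model.
# Example path only: a ~1-3B parameter model at Q4_K_M quantization
# keeps memory use low enough for most phones.
# (You may need `termux-setup-storage` first to read from shared storage.)
./build/bin/llama-cli -m ~/models/model-q4_k_m.gguf -cnv -c 2048 -t 4
```

On most phones, anything much above a few billion parameters, even 4-bit quantized, gets painfully slow, so start small and work up.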
Dicio is a pre-programmed assistant, which you can talk to if you have speech-recognition software installed. It has a preset list of tasks it can do; in my experience it’s not really comparable to how LLMs work.