It’s alright, my second point was more something to keep in mind and not an actual argument against using AI for therapy.
That is not what I meant. I was talking about Sam Altman using your trauma as training data.
The smallest models that I run on my PC take about 6-8 GB of VRAM and would be very slow if I ran them purely on my CPU. So it is unlikely that your phone has enough RAM and enough cores to run a decent LLM smoothly.
If you still want to use self-hosted AI with your phone, self-host the model on your PC and connect to it over the network. That way you can use self-hosted AI from your phone with nothing more than an internet connection.
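Roughly how that looks in practice (a minimal sketch, not tested against your setup): Ollama exposes an HTTP API on port 11434, and if you start it with OLLAMA_HOST=0.0.0.0 it listens on your LAN, so any device on the same network can talk to it. The IP address and model name below are placeholders you would swap for your own.

```python
# Minimal sketch: call a self-hosted Ollama server from another device on the
# same network. Assumes the PC runs Ollama with OLLAMA_HOST=0.0.0.0 so it
# listens on the LAN; 192.168.1.50 and "llama3" are placeholders.
import requests

OLLAMA_URL = "http://192.168.1.50:11434/api/chat"  # your PC's LAN address
MODEL = "llama3"                                    # any model you've pulled

def ask(prompt: str) -> str:
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # single JSON response instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask("Give me one reason self-hosting beats a cloud chatbot."))
```

On the phone itself you would normally just open Open WebUI in the browser instead of writing code, but the snippet shows what is actually happening underneath.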
If you use AI for therapy, at least self-host it and keep in mind that its goal is not to help you but to have a conversation that satisfies you. You are basically talking to a yes-man.
Ollama with Open WebUI is relatively easy to install, and you can even use something like edge-tts to give it a voice.
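For the voice part, a rough sketch of what edge-tts looks like in Python (assumes `pip install edge-tts`; the voice name is just one of the available Microsoft voices and can be swapped for any other):

```python
# Rough sketch: turn a model reply into speech with edge-tts.
import asyncio
import edge_tts

async def speak(text: str, out_path: str = "reply.mp3") -> None:
    # Communicate() synthesizes the text with the chosen voice;
    # save() writes the audio to an mp3 you can play back.
    communicate = edge_tts.Communicate(text, voice="en-US-AriaNeural")
    await communicate.save(out_path)

if __name__ == "__main__":
    asyncio.run(speak("This is your self-hosted assistant speaking."))
```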
Dang Glowies
Social Credit --;