If you were running an LLM locally on Android through llama.cpp for use as a private personal assistant, what model would you use?
Thanks in advance for any recommendations.
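In case it helps to see the kind of setup I mean, here is a rough sketch of the assistant loop I have in mind, assuming llama-cpp-python is installed inside Termux and some GGUF model is already on the device (the model path and settings below are placeholders, not a recommendation):

```python
# Rough sketch of a local assistant loop on Android (Termux),
# assuming llama-cpp-python is installed and a GGUF model is on the device.
from llama_cpp import Llama

# Placeholder path and model; swap in whichever model gets recommended.
llm = Llama(
    model_path="/data/data/com.termux/files/home/models/assistant.gguf",
    n_ctx=2048,    # modest context window to fit phone RAM
    n_threads=4,   # roughly match the phone's performance cores
)

# Keep a running chat history so the assistant has context.
messages = [{"role": "system", "content": "You are a private personal assistant."}]

while True:
    user = input("> ")
    if not user:
        break
    messages.append({"role": "user", "content": user})
    reply = llm.create_chat_completion(messages=messages, max_tokens=256)
    text = reply["choices"][0]["message"]["content"]
    print(text)
    messages.append({"role": "assistant", "content": text})
```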
There you have it. Changed my comment to not include your text.
You really need help. Not because I hate you or something, but because you would benefit from a coherent thought framework that helps you think logically without the need for constant violence.
Good luck