this post was submitted on 21 Nov 2023
LocalLLaMA
The direction I took was to start building a Kivy app that connects to an LLM API at home over OpenVPN. I have Ooba and llama.cpp API servers that I can point the Android app at, so it works on old or new phones and runs at the speed of the server.
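The app itself is just a thin client that POSTs prompts to the server's HTTP API. A minimal sketch of what I mean, assuming a llama.cpp server reachable at a VPN-side address like 10.8.0.1:8080 and its /completion endpoint (both the address and port are placeholders for your own setup; Ooba's API would need a different URL and payload):

```python
# Minimal Kivy client sketch: sends a prompt to a llama.cpp server over the VPN.
# The API_URL address/port are placeholders for whatever your tunnel uses.
import requests
from kivy.app import App
from kivy.uix.boxlayout import BoxLayout
from kivy.uix.textinput import TextInput
from kivy.uix.button import Button
from kivy.uix.label import Label

API_URL = "http://10.8.0.1:8080/completion"  # llama.cpp server on the VPN (example address)


class ChatLayout(BoxLayout):
    def __init__(self, **kwargs):
        super().__init__(orientation="vertical", **kwargs)
        self.output = Label(text="", size_hint_y=0.7)
        self.prompt = TextInput(hint_text="Ask the model...", size_hint_y=0.2)
        send = Button(text="Send", size_hint_y=0.1)
        send.bind(on_release=self.send_prompt)
        self.add_widget(self.output)
        self.add_widget(self.prompt)
        self.add_widget(send)

    def send_prompt(self, _):
        # Blocking call for brevity; a real app would use a thread or Kivy's UrlRequest.
        try:
            r = requests.post(
                API_URL,
                json={"prompt": self.prompt.text, "n_predict": 128},
                timeout=60,
            )
            r.raise_for_status()
            # llama.cpp's /completion returns the generated text under "content".
            self.output.text = r.json().get("content", "")
        except requests.RequestException as e:
            self.output.text = f"Request failed: {e}"


class LocalLLMApp(App):
    def build(self):
        return ChatLayout()


if __name__ == "__main__":
    LocalLLMApp().run()
```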
The downsides: you need a static IP address or DDNS for the VPN to connect to, and spotty cell reception can cause issues.
I have a static IP to my house, but you could also host the API server in the cloud on a static IP and do things the same way.
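On the VPN side, the only bit the phone's OpenVPN profile needs to know about the home connection is the `remote` directive, which can point at a DDNS hostname instead of a static IP. A rough sketch (the DDNS name and port are placeholders, and the certificate/key sections are left out):

```
# Minimal OpenVPN client profile sketch.
# Hostname and port are placeholders; ca/cert/key or <tls-crypt> blocks omitted.
client
dev tun
proto udp
# DDNS name (or static IP) of the home server
remote myhome.duckdns.org 1194
resolv-retry infinite
nobind
persist-key
persist-tun
remote-cert-tls server
verb 3
```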
A normal person would not be able to set this up. The first people who put an Oobabooga app for Android and iPhone on the store at $15 will have my money for sure, and probably a million other people's too.
🤔 hmmm... I have some ideas to test...