this post was submitted on 21 Nov 2023
1 points (100.0% liked)

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 1 year ago

With high-end Android phones now packing upwards of 24GB of RAM, I think there's huge potential for an app like this. It would be amazing to have something as powerful as the future Mistral 13B model running natively on smartphones!

You could interact with it privately without an internet connection. The convenience and capabilities would be incredible.

[–] SlowSmarts@alien.top 1 points 11 months ago (1 children)

The direction I took was to start making a Kivy app that connects to an LLM API at home via OpenVPN. I have Ooba and llama.cpp API servers that I can point the Android app to. So it works on old or new phones, and it runs at the speed of the server.
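For reference, the client side of a setup like this can be sketched in a few lines of Python. The VPN-side address and port here are assumptions (llama.cpp's built-in server listens on 8080 by default); the server exposes a `/completion` endpoint that accepts `prompt` and `n_predict` fields.

```python
# Minimal sketch: a phone-side client sending a prompt over the VPN
# to a llama.cpp server's /completion endpoint. Stdlib only, so it
# drops into a Kivy app without extra dependencies.
import json
import urllib.request

API_URL = "http://10.8.0.1:8080/completion"  # hypothetical VPN-side address


def build_completion_request(prompt: str, n_predict: int = 128) -> dict:
    """Build the JSON payload for llama.cpp's /completion endpoint."""
    return {"prompt": prompt, "n_predict": n_predict}


def ask(prompt: str) -> str:
    """POST the prompt to the home server and return the generated text."""
    payload = json.dumps(build_completion_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["content"]
```

In a Kivy app, `ask()` would be called from a background thread so the UI doesn't block while waiting on the server.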

The downsides are that you need a static IP address or DDNS for the VPN to connect to, and cell reception can cause issues.

I have a static IP to my house, but a person could host the API server in the cloud with a static IP if doing things similarly.

[–] Winter_Tension5432@alien.top 1 points 11 months ago (1 children)

A normal person would not be able to do it. The first people who create an Oobabooga app for Android and iPhone and put it on the store at $15 will have my money for sure, and probably money from a million other people too.

[–] SlowSmarts@alien.top 1 points 11 months ago

🤔 hmmm... I have some ideas to test...