this post was submitted on 21 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


With high-end Android phones now packing upwards of 24GB of RAM, I think there's huge potential for an app like this. It would be amazing to have something as powerful as the future Mistral 13B model running natively on smartphones!

You could interact with it privately without an internet connection. The convenience and capabilities would be incredible.

Nixellion@alien.top 1 points 11 months ago

Check out Ollama; their GitHub page links to projects built on it, and I believe they have an Android app that runs models locally on the phone. It uses llama.cpp under the hood.
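
For anyone curious what talking to it looks like once Ollama is running locally, here's a minimal sketch that queries Ollama's default local HTTP endpoint. The model name "mistral" is just a placeholder; swap in whatever you've pulled with `ollama pull`.

```python
import json
import urllib.request

# Minimal sketch: query a locally running Ollama server (default port 11434).
# Assumes you have already pulled a model, e.g. `ollama pull mistral`.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = json.dumps({
    "model": "mistral",  # placeholder -- use whichever model you have locally
    "prompt": "Explain why running an LLM on-device helps with privacy.",
    "stream": False,     # return one JSON object instead of a token stream
}).encode("utf-8")

request = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.load(response)

print(result["response"])  # the generated completion text
```

Everything stays on the device, so it works offline, which is the whole appeal of the setup described in the post.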