this post was submitted on 28 Nov 2023

LocalLLaMA

I got llama.cpp to work with BakLLaVA (Mistral + LLaVA 1.5) on Colab.

Here's a working example that offloads all the layers of bakllava-1.Q8_0 to a T4, the free GPU on Colab.

https://colab.research.google.com/gist/chigkim/a5be99a864c4196d5e379a1e6e280a9e/bakllava.ipynb
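
The core of the notebook boils down to starting llama.cpp's built-in server with every layer offloaded via `-ngl`. A rough sketch of that launch step (binary path, file names, and exact flags here are my assumptions, not copied from the notebook):

```python
# Assumes llama.cpp was already built with CUDA support,
# e.g. `make LLAMA_CUBLAS=1` as of late 2023.
import subprocess

server = subprocess.Popen([
    "./server",                           # llama.cpp's built-in HTTP server
    "-m", "bakllava-1.Q8_0.gguf",         # quantized BakLLaVA weights (GGUF)
    "--mmproj", "mmproj-model-f16.gguf",  # LLaVA vision projector, needed for image input
    "-ngl", "99",                         # offload all layers to the T4
    "--host", "127.0.0.1",
    "--port", "8080",
])
```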

FYI, Colab has no persistent storage, and you can't keep an instance running for long; I assume that's intentional for business reasons. You have to set up and download everything from scratch every time you run the notebook. Colab is for demos and experimentation, not for running a server in production.
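
If the repeated multi-GB download gets annoying, one common workaround (not something the notebook does, just a general Colab trick) is to cache the model files on Google Drive. A sketch, where the repo and file names are assumptions:

```python
from google.colab import drive
from huggingface_hub import hf_hub_download

drive.mount("/content/drive")

# hf_hub_download reuses anything already present in cache_dir,
# so the download only happens on the first run.
model_path = hf_hub_download(
    repo_id="mys/ggml_bakllava-1",    # assumed source repo for the GGUF files
    filename="ggml-model-q8_0.gguf",  # assumed file name for the Q8_0 quant
    cache_dir="/content/drive/MyDrive/hf_cache",
)
print(model_path)
```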

1 comment
teddybear082@alien.top 1 point 11 months ago

I had done something similar and set up a little chatbot with Whisper, Stable Diffusion, LLaVA, and speech to text (let me know if you want me to send you a link), but I hadn't seen before what you did with the google-colab-output eval js cell. Is that basically a way to create a tunnel without having to use trycloudflare or similar? If so, that's awesome!
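
For reference, the google-colab-output eval js cell the comment is asking about appears to be Colab's built-in kernel proxy (an assumption based on the cell's name). A minimal sketch, assuming a server is already listening on local port 8080:

```python
# Ask Colab's kernel proxy for a URL that forwards to the local port,
# so no external tunnel (trycloudflare, ngrok, ...) is needed.
from google.colab import output

url = output.eval_js("google.colab.kernel.proxyPort(8080)")
print(url)  # e.g. an https://...googleusercontent.com/ address
```

One caveat: as far as I know, the proxied URL only works for the authenticated notebook user, so it behaves more like a private tunnel than a publicly shareable one.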