this post was submitted on 17 Nov 2023

LocalLLaMA

Community to discuss about Llama, the family of large language models created by Meta AI.

[–] a_beautiful_rhind@alien.top 1 points 10 months ago (3 children)

Nice. A lightweight loader. It will free us from Gradio.

[–] oobabooga4@alien.top 1 points 10 months ago (2 children)

Gradio is a 70 MB requirement, FYI. It has become common to see people call text-generation-webui "bloated", when most of the installation size is in fact due to PyTorch and the CUDA runtime libraries.

https://preview.redd.it/pgfsdld7xw0c1.png?width=370&format=png&auto=webp&s=c50a14804350a1391d57d0feac8a32a5dcf36f68
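For anyone who wants to verify this kind of breakdown in their own environment, here is a minimal sketch that sums the on-disk size of each top-level entry in a `site-packages` directory. The `package_sizes` helper name is my own invention, not part of text-generation-webui; it assumes a standard pip layout where each package lives in its own top-level directory.

```python
import os
import site
from pathlib import Path

def package_sizes(site_dir: str) -> dict[str, int]:
    """Sum the on-disk size in bytes of each top-level entry in site_dir.

    Hypothetical helper for illustration: walks every directory (or
    single file) directly under site_dir and totals its file sizes.
    """
    sizes: dict[str, int] = {}
    for entry in Path(site_dir).iterdir():
        if entry.is_file():
            sizes[entry.name] = entry.stat().st_size
            continue
        total = 0
        for root, _dirs, files in os.walk(entry):
            for name in files:
                path = Path(root) / name
                if path.is_file():
                    total += path.stat().st_size
        sizes[entry.name] = total
    return sizes

if __name__ == "__main__":
    # Print the ten largest packages; on a typical text-generation-webui
    # install the top entries are torch and the nvidia/CUDA runtime wheels.
    sp = site.getsitepackages()[0]
    ranked = sorted(package_sizes(sp).items(), key=lambda kv: -kv[1])
    for name, size in ranked[:10]:
        print(f"{size / 1e6:10.1f} MB  {name}")
```

Running this against a fresh install makes the comparison concrete: the `gradio` entry lands well below the `torch` and `nvidia` entries in the ranking.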

[–] tronathan@alien.top 1 points 10 months ago

Gradio is a 70MB requirement

That doesn't make it fast, just small. Inefficient code can be compact.
