this post was submitted on 13 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 10 months ago

Why is there no analog to napster/bittorent/bitcoin with LLMs?

Is there a technical reason there isn't some kind of open-source LLM we could all install on our local machines, which would contribute computing power toward answering prompts and reward those who contribute by letting them submit more prompts?
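The scheme described above (contribute compute, earn credits, spend credits on prompts) could look something like this toy sketch. Everything here is hypothetical and illustrative; none of these names correspond to a real project or API:

```python
# Toy sketch of a peer-to-peer credit scheme: peers earn credits for
# contributing compute and spend them to submit prompts. All class and
# constant names are made up for illustration.
from dataclasses import dataclass


@dataclass
class Peer:
    name: str
    credits: float = 0.0


class Swarm:
    COMPUTE_REWARD = 1.0  # credits earned per unit of compute contributed
    PROMPT_COST = 5.0     # credits spent per prompt submitted

    def __init__(self):
        self.peers: dict[str, Peer] = {}

    def join(self, name: str) -> None:
        self.peers[name] = Peer(name)

    def contribute(self, name: str, units: float) -> None:
        """Credit a peer for compute it contributed to the swarm."""
        self.peers[name].credits += units * self.COMPUTE_REWARD

    def prompt(self, name: str) -> str:
        """Deduct credits for a prompt; refuse if the peer hasn't earned enough."""
        peer = self.peers[name]
        if peer.credits < self.PROMPT_COST:
            raise PermissionError("not enough credits; contribute compute first")
        peer.credits -= self.PROMPT_COST
        return "ok"


swarm = Swarm()
swarm.join("alice")
swarm.contribute("alice", 7)    # 7 compute units -> 7 credits
result = swarm.prompt("alice")  # costs 5 credits, leaving 2
```

The hard parts a real system would face (verifying that contributed compute is genuine, splitting a large model across untrusted peers, latency) are exactly what projects like Petals wrestle with; this sketch only captures the accounting idea.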

Presumably there is a technical reason that prevents distributed LLMs, or else someone would have built this by now.

[–] remghoost7@alien.top 1 points 10 months ago (12 children)

It actually does exist.

It's called Petals.

I believe it was made to run BLOOM-176B.

[–] PookaMacPhellimen@alien.top 1 points 10 months ago (10 children)
[–] ExTrainMe@alien.top 1 points 10 months ago (1 children)

Bad marketing. I only saw it recently.

Plus you get one model and no LoRAs (unless something has changed recently).

[–] lordpuddingcup@alien.top 1 points 10 months ago

It runs a few models, and if others decide to host more models it runs with them. Try the chat web app or the dashboard to see what's currently running. The issue is that not enough people are donating compute.
