this post was submitted on 26 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

I'm not super knowledgeable about the specs of the different Orange Pi and Raspberry Pi models. I'm looking for something relatively cheap that can connect to WiFi and USB, and I want to be able to run at least 13B models at a decent tok/s.

Also open to other solutions. I have a Mac M1 (8 GB RAM), and upgrading the computer itself would be cost-prohibitive for me.
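
For rough sizing, here's a back-of-envelope memory check (a sketch in Python; the ~4.5 bits/weight figure for Q4_K_M quants and the KV-cache allowance are my assumptions, not measurements):

```python
# Rough estimate: does a 13B model fit in 8 GB of RAM?
params = 13e9                  # 13B parameters
bits_per_weight = 4.5          # typical average for a Q4_K_M quant (assumed)
weights_gb = params * bits_per_weight / 8 / 1e9
kv_cache_gb = 1.0              # rough allowance for a few-thousand-token context
print(f"~{weights_gb:.1f} GB weights + ~{kv_cache_gb:.0f} GB KV cache")
# -> ~7.3 GB + ~1 GB, which leaves almost nothing for the OS on an 8 GB machine.
```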

[–] ThinkExtension2328@alien.top 1 points 11 months ago (4 children)

Honestly, the M1 is probably the cheapest solution you have. Get yourself LM Studio and try out a 7B K_M-quant model (e.g., Q4_K_M); you're going to struggle with anything larger than that, but it will let you experience what we are all playing with.
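
To make that concrete, here's a minimal sketch of the same idea using llama-cpp-python rather than LM Studio (LM Studio is a GUI app; this is the scriptable equivalent). The model path and prompt are placeholders, and it assumes `pip install llama-cpp-python` built with Metal support:

```python
# Minimal sketch: run a 4-bit 7B GGUF model on an M1 via llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,        # modest context window to keep memory use down on 8 GB
    n_gpu_layers=-1,   # offload all layers to the M1 GPU via Metal
)

output = llm("Explain quantization in one sentence.", max_tokens=64)
print(output["choices"][0]["text"])
```

At Q4_K_M, a 7B model is roughly 4 GB, so it fits on an 8 GB M1 with some headroom; 13B and up is where you start to struggle.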

[–] ClassroomGold6910@alien.top 1 points 11 months ago

3B models work amazingly and run super smoothly, but 7B models, while generating at a fair 15 tokens per second, prevent me from using any other application at the same time and occasionally freeze my mouse and screen until the response is finished.
