this post was submitted on 26 Nov 2023
1 points (100.0% liked)

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

founded 1 year ago

Not super knowledgeable about the specs of the different Orange Pi and Raspberry Pi models. I'm looking for something relatively cheap that can connect to WiFi and USB. I want to be able to run at least 13B models at a decent tok/s.

Also open to other solutions. I have a Mac M1 (8 GB RAM), and upgrading the computer itself would be cost-prohibitive for me.

[–] ThinkExtension2328@alien.top 1 points 11 months ago (4 children)

Honestly the M1 is probably the cheapest solution you have. Get yourself LM Studio and try out a 7B `K_M` model; you're going to struggle with anything larger than that. But that will let you experience what we are all playing with.
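To see why 13B is a stretch on an 8 GB M1, here's a back-of-the-envelope sketch. The bits-per-weight figure is an assumption (Q4_K_M lands around 4.85 bpw in llama.cpp, as I understand it), and real usage adds KV cache and runtime overhead on top, all sharing the Mac's unified memory with the OS:

```python
# Rough RAM estimate for a quantized model's weights alone.
# bits_per_weight=4.85 is an approximation for Q4_K_M; treat as illustrative.
def model_gb(n_params_billion: float, bits_per_weight: float = 4.85) -> float:
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

print(f"7B  @ ~Q4_K_M: {model_gb(7):.1f} GB")   # comfortably under 8 GB
print(f"13B @ ~Q4_K_M: {model_gb(13):.1f} GB")  # nearly all of an 8 GB machine
```

A ~4 GB 7B model leaves headroom for the OS and KV cache; a ~8 GB 13B model does not, which is why the 7B recommendation is about memory, not just speed.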

[–] ClassroomGold6910@alien.top 1 points 11 months ago (2 children)

What's the difference between the `K_M` models? Also, why is `Q_4` legacy but not `Q_4_1`? It would be great if someone could explain that lol

[–] ThinkExtension2328@alien.top 1 points 11 months ago

Not sure about the K, but the M means medium loss of info during the quantisation phase, afaik
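To expand on that with a hedged sketch: as I understand llama.cpp's formats, the older `Q4_0`/`Q4_1` types quantize fixed 32-weight blocks, while the `K` ("k-quant") types use 256-weight super-blocks, and the `_S`/`_M`/`_L` suffix controls how many tensors get a higher-precision type mixed in (small/medium/large, trading file size against quality loss). The legacy block layouts below are an approximation of that scheme, not a definitive spec:

```python
# Bits-per-weight implied by the legacy block layouts (my reading of
# llama.cpp's ggml formats -- verify against the source before relying on it).
def bits_per_weight(block_bytes: int, weights_per_block: int) -> float:
    return block_bytes * 8 / weights_per_block

# Q4_0: one fp16 scale (2 bytes) + 32 four-bit weights (16 bytes)
q4_0 = bits_per_weight(2 + 16, 32)      # -> 4.5 bpw
# Q4_1: fp16 scale + fp16 minimum + 32 four-bit weights
q4_1 = bits_per_weight(2 + 2 + 16, 32)  # -> 5.0 bpw

print(q4_0, q4_1)
```

So `Q4_1` spends extra bytes on a per-block minimum, which is roughly why it's a bit larger and a bit more accurate than `Q4_0`; the k-quants replace both with a finer super-block scheme at similar sizes.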
