This post was submitted on 02 Nov 2023

LocalLLaMA


Community for discussing Llama, the family of large language models created by Meta AI.


I'm running an M1 with 16 GB. I'd like to get the speed and understanding that Claude AI provides: I can throw it some code and documentation and it writes back very good advice.

What kind of models and extra hardware do I need to replicate that experience locally? I'm using Mistral 7B right now, roughly as in the sketch below.
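For reference, here is roughly my current setup: a minimal sketch using llama-cpp-python with a quantized GGUF and Metal offload. The model filename and the prompt are placeholders, not my exact files.

```python
# Minimal sketch: Mistral 7B (Q4 GGUF) on an M1 via llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-7b-instruct.Q4_K_M.gguf",  # placeholder filename
    n_ctx=8192,        # context window to allocate; larger costs more RAM
    n_gpu_layers=-1,   # offload all layers to the GPU (Metal on Apple Silicon)
)

resp = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Review this function and suggest fixes: ..."},
    ],
    max_tokens=512,
)
print(resp["choices"][0]["message"]["content"])
```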

Top 2 comments

Claude has two big things going for it: a very long context length and a high level of understanding (or whatever we want to call it).

The context length is the hardest part to replicate at the moment, I think, though understanding is hard to measure.
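To put rough numbers on why long context is the bottleneck on a 16 GB machine, here is a back-of-envelope sketch. It assumes Mistral 7B's published shape (32 layers, 8 KV heads via grouped-query attention, head dim 128) and an fp16 KV cache; the figures are estimates, not measurements.

```python
# KV-cache size per token = 2 (K and V) * layers * kv_heads * head_dim * bytes.
layers, kv_heads, head_dim, bytes_per = 32, 8, 128, 2  # Mistral 7B, fp16 cache
per_token = 2 * layers * kv_heads * head_dim * bytes_per  # 131,072 B = 128 KiB

for ctx in (8_192, 32_768, 100_000):
    gib = per_token * ctx / 2**30
    print(f"{ctx:>7} tokens -> {gib:4.1f} GiB of KV cache")

# ~1.0 GiB at 8k, ~4.0 GiB at 32k, ~12.2 GiB at 100k, on top of the
# ~4 GiB the Q4 weights already take, so Claude-class 100k context
# does not fit comfortably in 16 GB of unified memory.
```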

[–] BalorNG@alien.top 1 points 1 year ago

There is no way it has "undiluted" 100k context. https://news.ycombinator.com/item?id=36374936

But yeah, it IS impressive.