this post was submitted on 14 Nov 2023

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.

openchat 3.5 16k

[–] rkzed@alien.top 1 points 1 year ago (2 children)

I'm confused by their prompt format. Do we really need to use their library to try the model?

[–] Dear_noobs@alien.top 1 points 1 year ago

I came across this yesterday: one interface that lets you jump between all the things.

Find what you want to try, click Download, then chat with it.

[–] perlthoughts@alien.top 1 points 1 year ago (1 children)

Nah, you can use llama.cpp or whatever you like; TheBloke already has multiple GGUF versions up.
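
If you go the llama.cpp route, here's a minimal sketch using llama-cpp-python. The file name is hypothetical (point it at whichever of TheBloke's quants you downloaded), and the "GPT4 Correct ..." prompt template is the one listed on the GGUF model card, so double-check it against the card for your exact model:

```python
# Minimal sketch: chat with an OpenChat 3.5 GGUF quant via llama-cpp-python.
# The model path is hypothetical; the prompt template is assumed from the
# GGUF model card ("GPT4 Correct User/Assistant" turns separated by
# <|end_of_turn|>) -- verify it for the quant you actually use.
from llama_cpp import Llama

llm = Llama(
    model_path="./openchat_3.5-16k.Q4_K_M.gguf",  # hypothetical file name
    n_ctx=4096,                                   # raise toward 16k if you have the RAM
)

prompt = (
    "GPT4 Correct User: What is the capital of France?<|end_of_turn|>"
    "GPT4 Correct Assistant:"
)

out = llm(prompt, max_tokens=128, stop=["<|end_of_turn|>"])
print(out["choices"][0]["text"].strip())
```

Passing `<|end_of_turn|>` as a stop string keeps the model from running on into a fake next turn.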

[–] involviert@alien.top 1 points 1 year ago

They were talking about the prompt format. Obviously their library translates the OpenAI API-style messages into the actual prompt format internally, and that format isn't documented at all.
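
For what it's worth, that translation is easy to reproduce by hand. A minimal sketch, assuming the "GPT4 Correct User/Assistant" template from the GGUF card (the system-message handling here is a guess, not something their docs confirm):

```python
# Sketch of turning OpenAI-style chat messages into an OpenChat 3.5 prompt string.
# Assumed template: "GPT4 Correct User: ..." and "GPT4 Correct Assistant: ..."
# turns, each ended by <|end_of_turn|>, with a trailing "GPT4 Correct Assistant:"
# so the model writes the next reply.
ROLE_PREFIX = {"user": "GPT4 Correct User", "assistant": "GPT4 Correct Assistant"}

def to_openchat_prompt(messages: list[dict]) -> str:
    parts = []
    for msg in messages:
        if msg["role"] == "system":
            # Assumption: prepend system text as a bare turn; check the official
            # repo if you actually rely on system prompts.
            parts.append(f'{msg["content"]}<|end_of_turn|>')
        else:
            parts.append(f'{ROLE_PREFIX[msg["role"]]}: {msg["content"]}<|end_of_turn|>')
    parts.append("GPT4 Correct Assistant:")
    return "".join(parts)

print(to_openchat_prompt([
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi! How can I help?"},
    {"role": "user", "content": "Summarize llama.cpp in one sentence."},
]))
```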