this post was submitted on 19 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Hi community,

I am writing my own GUI in which I want to use an LLM completely locally. The problem is I don't know how to start with the LLM.

Can someone explain the first steps to integrate and work with the LLM, or point me to some good tutorials?

The LLM is already downloaded locally. Do I need to integrate a library or something? Sorry, I could not find much useful or direct information about this topic.

Thank you very much in advance!

FPham@alien.top · 1 year ago

Don't start by writing the entire GUI. First write a simple script that loads the model and does inference; that's probably about 10 lines in total. You can just grab the code people post with their models (TheBloke, for example, always posts a code snippet).
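
As a rough sketch, here is what that "load the model and run inference" step could look like with Hugging Face transformers; the folder path `./my-model`, the prompt, and the generation settings are all placeholders you would swap for your own:

```python
# Minimal sketch: load a locally downloaded model and run one generation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./my-model"  # placeholder: folder where the model was downloaded
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)

prompt = "Explain what a large language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```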

Now create the GUI: instead of a preset prompt, add a text box and a "Go" button so it can run inference on your text.

Boom, a GUI.
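
A minimal sketch of that text-box-plus-Go-button idea, using Tkinter from the standard library; `generate()` here is just a stand-in for the inference code from the previous snippet:

```python
# Minimal sketch: a text box, a "Go" button, and an output box.
import tkinter as tk

def generate(prompt: str) -> str:
    # Placeholder: call your model here (tokenizer + model.generate from above).
    return "model output for: " + prompt

def on_go():
    output_box.delete("1.0", tk.END)
    output_box.insert(tk.END, generate(input_box.get("1.0", tk.END).strip()))

root = tk.Tk()
root.title("Local LLM GUI")
input_box = tk.Text(root, height=5, width=60)
input_box.pack()
tk.Button(root, text="Go", command=on_go).pack()
output_box = tk.Text(root, height=15, width=60)
output_box.pack()
root.mainloop()
```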

Now go from there and keep adding: a dropdown list to select the model, a dropdown list to select the instruction template, and so on.
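
For the model dropdown, one possible sketch with a Tkinter `ttk.Combobox`; the model names and folder paths are placeholders, and the reload step is left as a comment:

```python
# Sketch: a dropdown that maps display names to local model folders.
import tkinter as tk
from tkinter import ttk

root = tk.Tk()
models = {  # placeholder names and paths
    "Model A": "./models/model-a",
    "Model B": "./models/model-b",
}
model_choice = ttk.Combobox(root, values=list(models.keys()), state="readonly")
model_choice.current(0)
model_choice.pack()

def on_select(event):
    path = models[model_choice.get()]
    # Reload the model from `path` here, e.g. AutoModelForCausalLM.from_pretrained(path)
    print("Selected model path:", path)

model_choice.bind("<<ComboboxSelected>>", on_select)
root.mainloop()
```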