If you're trying to do something novel as part of a learning experiment, just pick a good UI framework and then either wrap LlamaSharp or interop directly with llama.cpp using P/Invoke.
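For the LlamaSharp route, the high-level API looks roughly like this. Treat it as a sketch from memory rather than gospel: the class names (ModelParams, LLamaWeights, InteractiveExecutor) and the model filename are what I'd expect from a recent version, but the library moves quickly, so check its docs.

```csharp
using LLama;
using LLama.Common;

// Rough sketch of LLamaSharp's high-level API -- names may differ between versions.
// The GGUF path is just an example; point it at whatever model you actually have.
var parameters = new ModelParams("models/llama-2-7b-chat.Q4_K_M.gguf")
{
    ContextSize = 2048,
    GpuLayerCount = 32 // only matters if you installed a CUDA/Metal backend package
};

using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

await foreach (var token in executor.InferAsync(
    "User: Why is the sky blue?\nAssistant:",
    new InferenceParams { MaxTokens = 128, AntiPrompts = new[] { "User:" } }))
{
    Console.Write(token);
}
```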
Personally, I just use P/Invoke to cut out the middleman.
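The P/Invoke route is just DllImport declarations against the C functions exported from llama.cpp (llama.h). Here's a minimal sketch that only proves the native library loads and prints system info; the exports change fairly often between llama.cpp versions (llama_backend_init took a bool numa argument in some builds), so mirror whatever is in the llama.h you actually compiled.

```csharp
using System;
using System.Runtime.InteropServices;

internal static class NativeLlama
{
    // Resolves to llama.dll / libllama.so / libllama.dylib next to the executable.
    private const string Lib = "llama";

    // Signatures must mirror the llama.h of the build you link against.
    [DllImport(Lib, CallingConvention = CallingConvention.Cdecl)]
    public static extern void llama_backend_init();

    [DllImport(Lib, CallingConvention = CallingConvention.Cdecl)]
    public static extern void llama_backend_free();

    [DllImport(Lib, CallingConvention = CallingConvention.Cdecl)]
    public static extern IntPtr llama_print_system_info(); // returns const char*
}

internal static class Program
{
    private static void Main()
    {
        NativeLlama.llama_backend_init();
        Console.WriteLine(Marshal.PtrToStringAnsi(NativeLlama.llama_print_system_info()));
        NativeLlama.llama_backend_free();
    }
}
```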
As for a UI framework, I've been having a lot of fun with Avalonia lately.
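The boilerplate to get an Avalonia desktop window up is small. This is roughly the Program.cs the project template generates for you; App and MainWindow come from the template itself.

```csharp
using System;
using Avalonia;

internal static class Program
{
    // Standard Avalonia desktop entry point, as generated by the project template.
    // App is the Application subclass the template creates alongside MainWindow.
    [STAThread]
    public static void Main(string[] args) => BuildAvaloniaApp()
        .StartWithClassicDesktopLifetime(args);

    public static AppBuilder BuildAvaloniaApp()
        => AppBuilder.Configure<App>()
            .UsePlatformDetect()
            .LogToTrace();
}
```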
You're not going to get 100% compatibility with every model using llama.cpp as a core, but if this is a learning exercise then I'm sure that's not an issue.
That being said, if you really want to, you can fuck around with Python.NET, but you may find yourself spending way more time managing model interop and execution than you want to.
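For completeness, the Python.NET route looks roughly like this. It's a sketch, assuming pythonnet is installed, Runtime.PythonDLL points at your actual Python shared library, and that Python environment has the transformers package; the "gpt2" pipeline is just an illustration.

```csharp
using System;
using Python.Runtime;

// Sketch of driving a Python inference stack from C# via pythonnet.
// Runtime.PythonDLL must name your real Python library (e.g. python311.dll
// on Windows or libpython3.11.so on Linux), and that env needs transformers.
Runtime.PythonDLL = "python311.dll";
PythonEngine.Initialize();

using (Py.GIL())
{
    dynamic transformers = Py.Import("transformers");
    dynamic pipe = transformers.pipeline("text-generation", model: "gpt2");
    dynamic result = pipe("Hello, my name is");
    Console.WriteLine(result[0]["generated_text"]);
}

PythonEngine.Shutdown();
```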