sleeper-2

joined 10 months ago
[–] sleeper-2@alien.top 1 points 9 months ago

sweet, ty 😎!

[–] sleeper-2@alien.top 1 points 9 months ago (2 children)

send a PR with your patch!

[–] sleeper-2@alien.top 1 points 9 months ago (2 children)

huge fan of server.cpp too! I actually embed a universal binary (created with lipo) in my macOS app (FreeChat) and use it as an LLM backend running on localhost. Seeing how quickly it improves makes me very happy about this architecture choice.
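For anyone curious how that works, creating a universal server binary generally looks something like this. This is only a sketch, assuming a CMake-based llama.cpp checkout on macOS; the exact target and output paths are assumptions and vary by llama.cpp version, not FreeChat's actual build script:

```shell
# Build the server once per architecture (target name varies by llama.cpp version)
cmake -B build-arm64 -DCMAKE_OSX_ARCHITECTURES=arm64
cmake --build build-arm64 --target server
cmake -B build-x86_64 -DCMAKE_OSX_ARCHITECTURES=x86_64
cmake --build build-x86_64 --target server

# Glue the two single-arch binaries into one universal (fat) binary
lipo -create \
  build-arm64/bin/server \
  build-x86_64/bin/server \
  -output server-universal

# Sanity check: list the architecture slices in the result
lipo -archs server-universal
```

The universal binary can then be embedded in the app bundle and launched as a localhost subprocess, so one artifact runs natively on both Apple Silicon and Intel Macs.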

I just saw the improvements issue today. Pretty excited about the possibility of getting chat template functionality since currently all of that complexity has to live in my client.

Also, TIL about the batching stuff. I'm going to try getting multiple responses using that.

[–] sleeper-2@alien.top 1 points 10 months ago

hell yeah, glad you can use it!

[–] sleeper-2@alien.top 1 points 10 months ago (2 children)

and is taiwan part of china?

[–] sleeper-2@alien.top 1 points 10 months ago

I think I needed a newer API for something. The minimum is macOS 13.5 (the last minor release of the previous major version). What version are you on?


Hey y'all, quick update about my open source llama.cpp app, FreeChat. As of this weekend it's live on the Mac App Store. Big thanks to this community for all the feedback and testing; I would not have gotten here without y'all. Next I'm working on the most common request I get here: a model catalog.

Have friends who aren't hackers but who you think should try local AI? Send them a link! I'm hoping to expand local AI usage by making it dead simple.

App Store! https://apps.apple.com/us/app/freechat/id6458534902

And fOR tHe HaCkers: https://github.com/psugihara/FreeChat