this post was submitted on 31 Oct 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Hey y'all, quick update about my open-source llama.cpp app, FreeChat. As of this weekend it's live on the Mac App Store. Big thanks to this community for all the feedback and testing; I would not have gotten here without y'all. Next I'm working on the most common request I get here: a model catalog.

Have friends who aren't hackers who you think should try local AI? Send them a link! I'm hoping to expand local AI usage by making it dead simple.

App Store! https://apps.apple.com/us/app/freechat/id6458534902

And for the hackers: https://github.com/psugihara/FreeChat

top 10 comments
[–] six-ddc@alien.top 1 points 2 years ago (1 children)

Looks good, but I need to upgrade my system to install it. Why does it require such a high system version?

[–] sleeper-2@alien.top 1 points 2 years ago

I think I needed a newer API for something. The minimum is 13.5 (the last minor release of the last major version). What version are you on?

[–] DroidMasta@alien.top 1 points 2 years ago

Stupid question: would it be possible to make this work on iOS?

[–] NovaDragon@alien.top 1 points 2 years ago (1 children)

Are transparent windows no longer part of the macOS private api?

[–] Shir_man@alien.top 1 points 2 years ago

Unfortunately, it does not work with CausalLM 14B:

Error Loading (causallm_14b.Q5_1.gguf)

Also, it doesn't seem possible to set the prompt format. Is it selected automatically? (ChatML, etc.)
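For anyone unfamiliar with the ChatML format mentioned here, this is a minimal sketch of how a ChatML prompt is assembled from chat messages. It illustrates the wire format only; it is not FreeChat's actual code, and `format_chatml` is a hypothetical helper name.

```python
def format_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt string.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers,
    and the prompt ends with an open assistant turn for the model
    to complete.
    """
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Models fine-tuned on ChatML generally expect exactly these markers, which is why a mismatched prompt template can degrade output even when the model loads fine.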

[–] danigoncalves@alien.top 1 points 2 years ago

Congrats on the launch! It's nice to witness the rise of these new open-source tools.

[–] Helpful-Gene9733@alien.top 1 points 2 years ago (1 children)

I just want to add, for those who might wonder: this will support running at least up to 7B models (e.g. some of the nice newer Mistral models!) on a 2017 iMac with 8 GB of RAM and a 3.4 GHz Intel i5. I can get about 4.5-6 t/s on the old beast with a 7B model, and about 7-8 t/s running the 3B orca_mini. So there's some hope even for old machines. Thanks for making an app and running it through the App Store process too!

[–] sleeper-2@alien.top 1 points 2 years ago

hell yeah, glad you can use it!