this post was submitted on 20 Nov 2023
Self-Hosted Main
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
For Example
- Service: Dropbox - Alternative: Nextcloud
- Service: Google Reader - Alternative: Tiny Tiny RSS
- Service: Blogger - Alternative: WordPress
We welcome posts that suggest good self-hosted alternatives to popular online services, explain how they are better, or describe how they give back control of your data. Please also include hints and tips for less technical readers.
Useful Lists
- Awesome-Selfhosted List of Software
- Awesome-Sysadmin List of Software
I think the most advanced open-source LLM right now is considered to be Mistral 7B OpenOrca. You can serve it via the Oobabooga GUI (which also lets you try other LLM models). If you don't have a GPU for inference, though, it will be nothing like the ChatGPT experience; it will be much slower.
https://github.com/oobabooga/text-generation-webui
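For programmatic access, text-generation-webui can also expose an OpenAI-compatible API when launched with the `--api` flag. Here is a minimal Python sketch against that endpoint; the port (5000) and the path are the defaults in recent versions and are assumptions about your install.

```python
# Minimal sketch: query text-generation-webui's OpenAI-compatible API.
# Assumes the server was started with --api and uses the default address.
import requests

url = "http://127.0.0.1:5000/v1/chat/completions"  # default API address (assumption)

payload = {
    "messages": [
        {"role": "user", "content": "Summarize why self-hosting an LLM helps with privacy."}
    ],
    "max_tokens": 200,
    "temperature": 0.7,
}

response = requests.post(url, json=payload, timeout=120)
response.raise_for_status()

# The reply follows the OpenAI chat-completions response shape.
print(response.json()["choices"][0]["message"]["content"])
```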
You can also try these models on your desktop using GPT4All, which doesn't support GPU acceleration at the moment.
https://gpt4all.io/index.html
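If you prefer scripting over the desktop app, GPT4All also ships a Python package (`pip install gpt4all`). A minimal sketch is below; the model filename is an assumption, so substitute any model from the GPT4All catalog and it will be downloaded on first use.

```python
# Minimal sketch using the gpt4all Python package.
from gpt4all import GPT4All

# Model filename is an assumption; any catalog model works and is fetched if missing.
model = GPT4All("mistral-7b-openorca.Q4_0.gguf")

with model.chat_session():
    reply = model.generate(
        "Explain what a self-hosted LLM is in two sentences.",
        max_tokens=150,
    )
    print(reply)
```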
Mistral OpenOrca is a good one. I get about 10 to 11 tokens/sec, which is very impressive. For some reason, though, I cannot get GPT4All to use my 2080 Ti even though it is selected in the settings.
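If you want to sanity-check a throughput figure like that, you can time a generation with the gpt4all Python package. This rough sketch counts whitespace-separated words rather than real tokens, so treat the result as a ballpark estimate; the model filename is again an assumption.

```python
# Rough throughput estimate with the gpt4all package (word count ~ token count).
import time
from gpt4all import GPT4All

model = GPT4All("mistral-7b-openorca.Q4_0.gguf")  # filename is an assumption

start = time.time()
output = model.generate("Write a short paragraph about home servers.", max_tokens=200)
elapsed = time.time() - start

approx_tokens = len(output.split())
print(f"~{approx_tokens / elapsed:.1f} tokens/sec over {elapsed:.1f}s")
```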