this post was submitted on 26 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


So far, I have experimented with the following projects:

https://github.com/huggingface/chat-ui - Amazing clean UI with very good web search; my current go-to.

https://github.com/oobabooga/text-generation-webui - Best overall, supports any model format and has many extensions

https://github.com/ParisNeo/lollms-webui/ - Has PDF, stable diffusion and web search integration

https://github.com/h2oai/h2ogpt - Has PDF, Web search, best for files ingestion (supports many file formats)

https://github.com/SillyTavern/SillyTavern - Best for custom characters and roleplay

https://github.com/NimbleBoxAI/ChainFury - Has great UI and web search (experimental)

https://github.com/nomic-ai/gpt4all - Basic UI that replicates ChatGPT

https://github.com/imartinez/privateGPT - Basic UI that replicates ChatGPT, with PDF integration

LM Studio - Clean UI, focuses on GGUF format

-

I really love them, and I'm wondering if there are any other great projects.

Some of them include full web search and PDF integration, some are more about characters, and oobabooga, for example, is the best for trying every single model format there is, since it supports pretty much anything.

What is your favorite project for interacting with your large language models?

Share your findings and I'll add them!

top 41 comments
[–] OrdinaryAdditional91@alien.top 1 points 9 months ago (1 children)
[–] iChrist@alien.top 1 points 9 months ago

Agreed, will add that!

[–] itsuka_dev@alien.top 1 points 9 months ago

What is your favorite project to interact with

I still don't have a favorite, tbh. I've tried a few of the UIs you shared, and I found them to be either too complicated or lacking in certain areas I need. Like many others, I ended up building my own.

Share your findings

Recently, I started collecting local UIs and this is what I've gathered so far: UI list.

[–] No-Belt7582@alien.top 1 points 9 months ago

I use koboldcpp for local LLM deployment. It's clean, it's easy, and it allows for sliding context. It can also act as a drop-in replacement for the OpenAI API.
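For anyone curious what the OpenAI drop-in part looks like in practice: koboldcpp exposes an OpenAI-compatible HTTP endpoint, so any OpenAI-style client can talk to it. Here is a minimal stdlib-only Python sketch; the port 5001 and the model name are assumptions (5001 is koboldcpp's usual default, and local servers typically ignore the model field):

```python
import json
import urllib.request

def build_chat_request(prompt, base_url="http://localhost:5001/v1"):
    """Build an OpenAI-style chat completion request for a local server."""
    payload = {
        "model": "local-model",  # usually ignored by local backends
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# To actually send it (requires a running koboldcpp server):
# with urllib.request.urlopen(build_chat_request("Hello!")) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same request shape works against any other OpenAI-compatible local server by changing `base_url`.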

[–] Cradawx@alien.top 1 points 9 months ago (1 children)

I mostly use a UI I made myself:

https://github.com/shinomakoi/AI-Messenger

Works with llama.cpp and Exllama V2, supports LLaVA, character cards and moar.

[–] RYSKZ@alien.top 1 points 9 months ago

Right now, I'm using your earlier project [1]. It's proving to be incredibly helpful, thank you!

Since it's a desktop application, it's more convenient for me than the WebUIs, because I tend to have a lot of tabs open in my browser, which gets pretty chaotic. I have set up an AutoHotkey script so I can launch it with an easy-to-remember hotkey.

[1] https://github.com/shinomakoi/magi_llm_gui

[–] mattapperson@alien.top 1 points 9 months ago (1 children)

Here is a new one I found the other day. Still seems to be WIP but overall I really like what is being done here - https://github.com/lobehub/lobe-chat

[–] iChrist@alien.top 1 points 9 months ago

Wow, it looks very good indeed. How is the web extraction plugin? Can you share some screenshots?

[–] w4ldfee@alien.top 1 points 9 months ago (2 children)

exui by turboderp (exllamav2 creator) is a nice ui for exl2 models. https://github.com/turboderp/exui

[–] uhuge@alien.top 1 points 9 months ago

Can it serve on a CPU-only machine?

[–] CheatCodesOfLife@alien.top 1 points 9 months ago

I wish we had a UI like this for GGUF (for Apple)

[–] orrorin6@alien.top 1 points 9 months ago

Nice, thanks for compiling this info.

[–] Tim-Fra@alien.top 1 points 9 months ago (1 children)

https://github.com/serge-chat/serge

A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy-to-use API.

(...without websearch)

[–] kubbiember@alien.top 1 points 9 months ago

Serge is underrated, unknown, and development is slow because of it

[–] noco-ai@alien.top 1 points 9 months ago

I released a UI last week: noco-ai/spellbook-docker (github.com). v0.1.0 ships with 50+ chat plugins that handle things like simple math (multiplication, addition, ...), image generation, TTS, Bing news search, etc.

[–] mcmoose1900@alien.top 1 points 9 months ago (1 children)

No exui?

https://github.com/turboderp/exui

It's blazing fast, VRAM efficient, supports min-p, and has a notebook mode... what else could I ask for?

I was using ooba before, but I dumped it because it's so much slower.

[–] FPham@alien.top 1 points 9 months ago

That looks very clean for sure.

[–] elfish_dude@alien.top 1 points 9 months ago

I’d check out Sanctum too. Super easy to use

[–] Shir_man@alien.top 1 points 9 months ago
[–] Temsirolimus555@alien.top 1 points 9 months ago (1 children)

Web search is dope. Too bad for me, because I am comfortable with pip, not npm. Setting this up would involve pulling some hair out, so I will not even attempt it.

I have decent results with LangChain and the SERP API for Google search with GPT-4 function calling. However, I would LOVE a Python implementation of the chat-ui search functionality. I hope someone makes a wrapper (if that's even a thing - I am not a programmer by profession).
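For what it's worth, the core of a chat-ui-style search flow is simple enough to sketch in plain Python: run a web search, stuff the top snippets into the prompt, then ask the model to answer with citations. This is a hypothetical sketch, not chat-ui's actual code - `web_search` is a placeholder you'd back with whatever search API you use (SERP API, SearXNG, ...):

```python
def format_search_context(results, question):
    """Assemble retrieved snippets into a grounded prompt, chat-ui style:
    search first, then answer only from the retrieved sources."""
    context = "\n".join(
        f"[{i + 1}] {r['title']}: {r['snippet']}" for i, r in enumerate(results)
    )
    return (
        "Answer using only the sources below. Cite them as [n].\n\n"
        f"{context}\n\nQuestion: {question}\nAnswer:"
    )

def web_search(query):
    """Placeholder: swap in your search backend (SERP API, SearXNG, ...).
    Should return a list of {'title': ..., 'snippet': ...} dicts."""
    raise NotImplementedError

# prompt = format_search_context(web_search("llama 2 context length"), "How long is it?")
# ...then send `prompt` to your local model as usual.
```

The model call at the end is whatever you already use (llama.cpp server, OpenAI-compatible endpoint, etc.), so no npm required.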

[–] iChrist@alien.top 1 points 9 months ago (1 children)

I'm not great at troubleshooting errors, but the install of chat-ui was pretty straightforward.

If you already have a llama.cpp server, it's very easy to connect.

I enjoy the search functionality so much that I think it's worth the hassle. If you need any help with it, just comment here.

[–] Temsirolimus555@alien.top 1 points 9 months ago (1 children)

I have llamacpp server up and running. I will def give the install a shot!

[–] iChrist@alien.top 1 points 9 months ago

If you need any help with the .env.local file, tell me and I'll help out.

[–] faldore@alien.top 1 points 9 months ago (1 children)

How is chat-ui local? Last I tried, it still required Mongo.

[–] iChrist@alien.top 1 points 9 months ago

I had some struggles with it. It works best for me in combination with llama.cpp, and you need to run a Docker command to start a Mongo DB for your chats locally.

Even the search results can be queried on your device instead of through an API.
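For reference, the Mongo part is a one-liner with Docker (the container name here is arbitrary; check the chat-ui README for the exact .env.local schema for hooking up a llama.cpp endpoint):

```shell
# start a local MongoDB for chat-ui's chat history
docker run -d -p 27017:27017 --name chat-ui-mongo mongo

# then point chat-ui at it in .env.local:
# MONGODB_URL=mongodb://localhost:27017
```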

[–] XhoniShollaj@alien.top 1 points 9 months ago (2 children)

To keep track of this I put it all in a repo: https://github.com/JShollaj/Awesome-LLM-Web-UI

Thank you for all the recommendations and the list (I've also been looking for some time :) )

[–] klenen@alien.top 1 points 9 months ago

Wow, thanks!

[–] iChrist@alien.top 1 points 9 months ago (3 children)

Cool! Can the list be added to the main repo (GitHub - sindresorhus/awesome: 😎 Awesome lists about all kinds of interesting topics)

Or linked there under a small category?

People need to know about all of those great alternatives to ChatGPT :D

[–] itsuka_dev@alien.top 1 points 9 months ago (1 children)

I have two kinds of lists: one for OpenAI API-powered UIs (source, last updated this July), and another for local UIs (I'll update this with the list from XhoniShollaj). I feel like better organization is needed, e.g. whether the UI is open source or not, model backend and architecture, differentiating features, etc. Otherwise, the list is impossible to navigate (at least for me).

[–] XhoniShollaj@alien.top 1 points 9 months ago (1 children)

That's a very neat layout, u/itsuka_dev - love your project. I think we can keep both in the meantime (I want to add mine to other Awesome lists for more exposure) - let me know what breakdown makes more sense from your end so I can improve my repo.

[–] itsuka_dev@alien.top 1 points 9 months ago

Thank you for the kind words! I stopped maintaining OAI UIs because things got a bit stagnant a few months ago (there were literally no new UIs for weeks, I think). But with the new features announced at DevDay, I'm expecting to see a surge in new UIs, especially ones that leverage multi-modality. As a maintainer of list projects, I find this such an exciting time.

I think we can keep both in the meantime

Absolutely. Someone needs to maintain an awesome list for local UIs, and I don't think my projects fall into that category. Besides, my list has a mix of native UIs in there too, which is important to me since I'm building both web and native UIs.

let me know what breakdown makes more sense from your end

For local UIs, sorting/grouping by model backend (e.g. llama.cpp, ollama, ExLlama) makes the most sense, IMO - and the rest of what I mentioned above is optional.

[–] XhoniShollaj@alien.top 1 points 9 months ago

Thank you - I submitted a pull request to add it there. Hopefully it gets approved. Let me know if there are other lists you would like me to add it to.

[–] XhoniShollaj@alien.top 1 points 9 months ago

Actually, it will need 30 more days to get approved. Feel free to contribute additional projects to it in the meantime :)!

[–] 9gigsofram@alien.top 1 points 9 months ago

Do any of these projects support clustering across multiple GPUs/users, or even multiple machines?

[–] kaloskagatos@alien.top 1 points 9 months ago (1 children)

Hi, is there a good UI to chat with ollama and local files (PDF, docx, whatever) - ideally multiple or even a lot of files?

By the way, what is the difference between ollama and llama.cpp? Are their APIs incompatible?
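On the API question: the two expose different (and mutually incompatible) native HTTP APIs, even though ollama uses llama.cpp under the hood for inference. A rough stdlib-only Python sketch of what each native endpoint expects - the ports are the usual defaults (11434 for ollama, 8080 for llama.cpp's built-in server), and the model name is just an example:

```python
import json
import urllib.request

def build_request(url, payload):
    """Small helper to build a JSON POST request."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# ollama's native API (default port 11434):
ollama_req = build_request(
    "http://localhost:11434/api/generate",
    {"model": "llama2", "prompt": "Hello", "stream": False},
)

# llama.cpp's built-in server (default port 8080):
llamacpp_req = build_request(
    "http://localhost:8080/completion",
    {"prompt": "Hello", "n_predict": 64},
)

# Either request is sent with urllib.request.urlopen(...) against a running server.
```

So a UI written against one won't talk to the other out of the box, though many frontends support both, and both projects also offer OpenAI-compatible endpoints that smooth this over.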

[–] iChrist@alien.top 1 points 9 months ago (1 children)

For PDF, docx, and like 50 more formats, use h2oGPT - great for this kind of stuff.

[–] kaloskagatos@alien.top 1 points 9 months ago
[–] SideShow_Bot@alien.top 1 points 9 months ago (1 children)

So, in the end which one would you recommend for someone just beginning to run LLMs locally? Windows machine (thus Sanctum is out of the question for now). I'm interested in 3 use cases, so maybe there would be a different answer for each of them:

  1. Python coding questions
  2. Linux shell questions
  3. RAG: in particular, I would like to be able to ask questions and have the model retrieve an answer online, supported by one or more working hyperlinks
[–] iChrist@alien.top 1 points 9 months ago (1 children)

You should look at LoLLMs WebUI; it has those options.

[–] SideShow_Bot@alien.top 1 points 9 months ago

I'll have a look into it and compare it to LM Studio.

[–] uhuge@alien.top 1 points 9 months ago

I've had mixed experiences with Bavarder: native UI, fair choice of models to grab, but often not working reliably. They seem to be improving it slowly but steadily.