Evening_Ad6637

[–] Evening_Ad6637@alien.top 1 points 11 months ago

Thanks for your feedback. That's strange, I couldn't reproduce this bug (or maybe I didn't understand the error?).
I'll answer you in more detail on GitHub.

[–] Evening_Ad6637@alien.top 0 points 11 months ago (1 children)

We also have LLaVA and BakLLaVA, two multimodal models; the former is based on Llama and the latter on Mistral.

[–] Evening_Ad6637@alien.top 1 points 11 months ago

I tested "amica" yesterday and was very impressed. Try it out:

https://github.com/semperai/amica

[–] Evening_Ad6637@alien.top 1 points 11 months ago

O.M.G. What an incredible amount of work! WTF?! I am speechless.

You are the most angel-like wolf I know so far, and you really, really deserve a prize, dude!

Again: WTH?!

[–] Evening_Ad6637@alien.top 1 points 11 months ago (1 children)

Yeah, I don't think the authors are intentionally bullshitting or intentionally doing "benchmark cosmetics". Maybe it's more a lack of knowledge about what's going on with (most of) the benchmarks, and about how their reputation has been ruined in the meantime.

[–] Evening_Ad6637@alien.top 1 points 11 months ago (5 children)

Heheh, I can't read that anymore... I really have become very prejudiced when it comes to that... to be honest, when it comes to any comparison with GPT-4.

People really have to understand that even GPT-4 has been aligned, lobotomized, and massively downgraded in terms of its performance, all for safety reasons (which is understandable to me), and yet this thing is still an absolute beast. If we consider all the restrictions GPT-4 has to operate under, all the smart people at OpenAI, all the resources at Microsoft, and so on, we have to realize that currently nothing is really comparable to GPT-4. Especially not 7B models.

[–] Evening_Ad6637@alien.top 1 points 11 months ago (2 children)

Yes, it means "predict n tokens". Is it not easy to understand? I might change it back... For me it is important that a UI is not bloated with "words" either, and unfortunately "Predict_n Tokens"... how can I say... it 'looks' awful. So I am looking for something more aesthetic but still easy to understand, which is difficult for me to find.
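For context, this is the setting that caps generation length when calling the llama.cpp server. A minimal sketch, assuming the server runs at its default address:

```python
import requests

# "n_predict" is the value the UI label refers to: it caps how many
# tokens the server will generate for this request.
response = requests.post(
    "http://localhost:8080/completion",  # default llama.cpp server address
    json={"prompt": "Building a website can be done in", "n_predict": 64},
)
print(response.json()["content"])
```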

[–] Evening_Ad6637@alien.top 1 points 11 months ago

That's a pretty good idea! Thanks for your input. I will definitely make a note of it as an issue in my repo and see what I can do.

Thank you for saying that. It makes me feel valued for my work. I've already made a pull request, and Gerganov seems to like the work in general, so he would accept a merge. I still need to fix a few things here and there though; the requirements of the llama.cpp dudes are very high :D (but I wouldn't expect anything less from them, heheh)

[–] Evening_Ad6637@alien.top 1 points 11 months ago

Did you clone it from my repo?

[–] Evening_Ad6637@alien.top 1 points 11 months ago

u/ambient_temp_xeno Ah, I've now seen that min-p has already been implemented in the server anyway, so I have now added it too.
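(For anyone wondering what min-p actually does, here is a rough Python sketch of the sampling idea; this is just an illustration, not the actual llama.cpp code.)

```python
import numpy as np

def min_p_sample(logits, min_p=0.05, rng=None):
    """Min-p sketch: drop tokens whose probability falls below min_p times
    the probability of the single most likely token, then sample from the
    renormalized remainder."""
    rng = rng or np.random.default_rng()
    probs = np.exp(logits - np.max(logits))  # numerically stable softmax
    probs /= probs.sum()
    keep = probs >= min_p * probs.max()     # the min-p cutoff
    probs = np.where(keep, probs, 0.0)
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```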

[–] Evening_Ad6637@alien.top 1 points 11 months ago

Yes, the OpenAI playground was my styling inspiration. I thought this was a good choice since a lot of users will be used to it.

The llama.cpp dev (Gerganov) has already answered and will accept a merge :))

[–] Evening_Ad6637@alien.top 1 points 11 months ago

Ah, one side note: selecting a model via dialog is absolutely not intuitive. If you want to navigate into a folder, you have to press space twice; do not press enter until you have decided on a specific folder. It doesn't matter that much if you stay in a parent folder, since the script will search it recursively, but of course if it contains many files, that could take a long time.
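(Conceptually, the recursive part does something like this; a hypothetical Python equivalent of what the dialog script does with shell tools, assuming GGUF model files:)

```python
from pathlib import Path

def find_model_files(root):
    """Recursively collect GGUF model files below the chosen folder.
    This is why picking a parent folder still works; it just takes
    longer the more files there are under it."""
    return sorted(Path(root).rglob("*.gguf"))
```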


Hi folks, I have edited the llama.cpp server frontend and made it look nicer. I also added a few functions, including something I have been missing there for a long time: templates for prompt formats.

Here is the GitHub link: ++camalL

Otherwise here is a small summary:

- UI with CSS to make it look nicer and cleaner overall.

- CSS moved out into a separate file

- Added a dropdown menu with prompt style templates

- Added a dropdown menu with system prompts

- Prompt Styles and System Prompts are separate files, so editing is very easy.

- Created a script that uses "dialog" to compose the command for the server (see the example after this list)

- The script offers the option to save and load configs
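To give an idea of what the script ends up composing, here is a hypothetical example in Python; the model path and values are made up, while -m, -c, and --port are standard llama.cpp server flags:

```python
import subprocess

# Hypothetical result of the dialog prompts; the actual script builds
# an equivalent shell command.
cmd = [
    "./server",
    "-m", "models/mistral-7b-instruct.Q4_K_M.gguf",  # model file
    "-c", "4096",                                     # context size
    "--port", "8080",                                 # server port
]
subprocess.run(cmd, check=True)
```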

In planning or already started:

- WIP multilingual support: you will be able to select the language from a dropdown menu. So far there are language files only for English and German (this concerns UI elements and system prompts).

- Dark Mode

- Templates for the values of the UI options (samplers etc.), e.g. a deterministic template, a creative template, a balanced template, etc.

- Zenity start script (like dialog, but with a GUI)

---

As for the prompt format templates, I just picked a few by feel. The most important are the four to which almost all others can be traced back: Alpaca, ChatML, Llama2, Vicuna.
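Roughly, those four base formats look like this (an illustrative sketch of the template strings; the exact template files in the repo may differ):

```python
# {system} and {prompt} are placeholders that get filled in before the
# text is sent to the model; the dict layout here is illustrative.
PROMPT_TEMPLATES = {
    "Alpaca": "### Instruction:\n{prompt}\n\n### Response:\n",
    "ChatML": (
        "<|im_start|>system\n{system}<|im_end|>\n"
        "<|im_start|>user\n{prompt}<|im_end|>\n"
        "<|im_start|>assistant\n"
    ),
    "Llama2": "[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{prompt} [/INST]",
    "Vicuna": "{system}\n\nUSER: {prompt}\nASSISTANT:",
}
```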

But if you want more templates for a specific model, feel free to let me know here or on github.

As you can see in the third picture, it should now be easier for beginners to use the llama.cpp server, since a TUI dialog will assist them.

Hope you like my work. Feel free to give feedback.

PS: I've made a pull request, but for now I'm publishing it on my own forked repo.

[Screenshots: ui-1, ui-2, tui-1]
