freehuntx

joined 10 months ago
[–] freehuntx@alien.top 1 points 9 months ago (2 children)

Why buy a car when there is Uber?

 

If I had multiple 7B models, each trained on one specific topic (e.g. roleplay, math, coding, history, politics...), and an interface that decides, depending on the context, which model to use, could this outperform bigger models while being faster?
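The "interface which decides" part could be as simple as a classifier in front of the specialist models. A minimal sketch of that routing idea, with the topic keywords and model names being made-up placeholders (a real router would more likely be a small embedding classifier):

```python
# Toy router for topic-specialist models. TOPIC_KEYWORDS and the scoring
# heuristic are illustrative assumptions, not a real routing system.

TOPIC_KEYWORDS = {
    "math": ["integral", "equation", "solve", "derivative"],
    "coding": ["python", "function", "bug", "compile"],
    "history": ["war", "century", "empire", "revolution"],
}

def route(prompt: str, default: str = "general") -> str:
    """Pick the specialist whose keywords best match the prompt."""
    text = prompt.lower()
    scores = {
        topic: sum(word in text for word in keywords)
        for topic, keywords in TOPIC_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    # Fall back to a generalist model if no specialist matches.
    return best if scores[best] > 0 else default

print(route("Can you solve this integral for me?"))  # math
print(route("Tell me a story"))                      # general
```

The routed name would then select which 7B model actually receives the prompt.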

 

I did some ratings on Chatbot Arena and noticed one thing: when an AI honestly said "I don't know that" or "I don't understand that", it was always better received by me and felt kinda smarter.

Does some dataset or LoRA train on that? Or is "knowing about not knowing" too hard to achieve?
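For what that kind of training data might look like: hypothetical instruction-tuning pairs that reward an honest refusal instead of a confident guess. These examples are invented for illustration and don't refer to any specific dataset:

```python
# Made-up examples of "calibrated refusal" fine-tuning pairs.
refusal_examples = [
    {
        "prompt": "What is the population of Atlantis?",
        "response": "I don't know that. Atlantis is a legendary place, "
                    "so there is no real population figure.",
    },
    {
        "prompt": "What will the stock market do tomorrow?",
        "response": "I don't know that. Future market movements can't be "
                    "predicted from my training data.",
    },
]
```

A LoRA trained on pairs like these would nudge the model toward admitting uncertainty rather than hallucinating an answer.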

 

Currently we set the temperature manually and keep it for the whole chat. Wouldn't it make more sense to let the model decide the temperature itself depending on the topic?
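A rough sketch of what per-message temperature selection could look like; the keyword heuristic and the temperature values are assumptions purely for illustration (in practice the model itself, or a small classifier, would do the detection):

```python
# Toy per-prompt temperature picker. Keywords and values are made up.
TEMPERATURE_BY_TOPIC = {
    "math": 0.2,      # factual tasks want near-deterministic sampling
    "creative": 1.0,  # creative tasks benefit from more randomness
}

def pick_temperature(prompt: str, default: float = 0.7) -> float:
    """Choose a sampling temperature from the prompt instead of chat-wide."""
    text = prompt.lower()
    if any(w in text for w in ("prove", "calculate", "equation")):
        return TEMPERATURE_BY_TOPIC["math"]
    if any(w in text for w in ("write a story", "roleplay", "poem")):
        return TEMPERATURE_BY_TOPIC["creative"]
    return default

print(pick_temperature("Calculate 19 * 23"))  # 0.2
```

The chosen value would then be passed as the sampling temperature for that single generation only.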

 

I have the feeling a lot of models include a lot of data in many languages. Would it make more sense to train just on English data and have a separate translation layer? Or do I misunderstand something?
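The "translation layer" idea amounts to a translate-in, answer-in-English, translate-out pipeline. A minimal sketch of that data flow, where the model and translators are trivial stand-ins (assumptions; a real setup would use an LLM and an MT model):

```python
# Sketch of wrapping an English-only model with a translation layer.
def answer_multilingual(prompt, lang, to_english, english_model, from_english):
    """Translate in, run the English-only model, translate the answer back."""
    if lang == "en":
        return english_model(prompt)
    english_prompt = to_english(prompt)
    english_answer = english_model(english_prompt)
    return from_english(english_answer)

# Toy stand-ins just to show the flow (not real translators or a real model):
to_en = lambda s: s.replace("Hallo", "Hello")
from_en = lambda s: s.replace("Hello", "Hallo")
model = lambda s: s + ", world"

print(answer_multilingual("Hallo", "de", to_en, model, from_en))  # Hallo, world
```

One known downside of this design is that translation errors compound in both directions, which is part of why many models train multilingually instead.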

[–] freehuntx@alien.top 1 points 10 months ago

I think OpenChat is pretty good. It gives me answers and a feel similar to GPT-3.5, and it has the same API structure. Pretty much THE open-source OpenAI alternative.