this post was submitted on 19 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


We've seen pretty amazing performance from Mistral 7B when comparing it with Llama 34B and Llama 2 13B. I'm curious: theoretically, would it be possible to build an SLM, with 7-8B parameters, that can outperform GPT-4 in all tasks? If so, what are the potential difficulties and problems to solve? And when would you expect such an SLM to arrive?

PS: sorry for the typo in the title. This is my real question:

Is it possible for an SLM to outperform GPT-4 in all tasks?

FPham@alien.top | 1 point | 2 years ago

Short answer: Nope.

Long answer: Nooooooooope