wtf? Really? I mean I kinda thought that too because of the way GPT-3.5 compares to Falcon 180B. Even though Falcon has more parameters, GPT-3.5 still works way better. I credited all this to the data used to train the model. I believe that not just more parameters but more high-quality data will help AI models improve proportionally in quality and performance.
Can't believe that ChatGPT is just 20B, I always thought it was a 175B model. What about the actual 175B+ models? Are they going to be AGI? lol.
If this is true, then it means all open source models are trained cheaply and are nothing compared to what OpenAI did.
Most-Trainer-8876
It worked for me tho? It literally said, "Alright, let me flex my creative writing skills." XD
https://preview.redd.it/ilahogsgq13c1.png?width=871&format=png&auto=webp&s=f7af75668628913cdd4697cb0921314bed8477a9