GPT-3.5 probably has more than 20B parameters, but then why is its API several times cheaper than text-davinci-003's?
At the same time, GPT-3.5 is good with facts and great at generating text in many languages, while open-source models often struggle even with English. With only 20B parameters it's hard to store that much knowledge, so the real count is probably well above 20B.