this post was submitted on 29 Nov 2023

LocalLLaMA


Community for discussing Llama, the family of large language models created by Meta AI.


Didn't see any posts about these models so I made one myself.

This first set of models was trained on 288B high-quality tokens; it will be interesting to see whether the 51B and 102B models hold up. Commercial use is allowed without requiring authorization.

https://github.com/IEIT-Yuan/Yuan-2.0/blob/main/README-EN.md

(Chinese) https://github.com/IEIT-Yuan/Yuan-2.0

Paper: https://arxiv.org/abs/2311.15786

Huggingface download links

https://huggingface.co/pandada8/Unofficial-Yuan-2.0-2B

https://huggingface.co/pandada8/Unofficial-Yuan-2.0-51B

https://huggingface.co/pandada8/Unofficial-Yuan-2.0-102B


Here's the second set of models I found. The 7B and 65B were trained on 2.6T tokens, and the 13B on 3.2T. The 65B model supports up to 16K context, while the two smaller ones support up to 8K.

https://huggingface.co/xverse/XVERSE-65B

https://huggingface.co/xverse/XVERSE-13B

https://huggingface.co/xverse/XVERSE-7B

These models support over 40 human languages as well as several programming languages. Commercial use is allowed, but you have to submit an application form.
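As a rough rule of thumb (my own back-of-the-envelope math, not from either model card), fp16/bf16 weights take about 2 bytes per parameter, which is why the in-between sizes matter for what fits on a given amount of VRAM:

```python
def fp16_weights_gb(params_billions: float) -> float:
    """Approximate weight footprint at fp16/bf16: 2 bytes per parameter.

    Uses 1 GB = 1e9 bytes; ignores KV cache, activations, and runtime
    overhead, so real memory use is somewhat higher.
    """
    return params_billions * 2  # 2 GB per billion parameters

# Rough weight footprints for the sizes discussed above:
for size_b in (7, 13, 51, 65, 102):
    print(f"{size_b}B -> ~{fp16_weights_gb(size_b):.0f} GB at fp16")
```

So the 51B lands around ~102 GB of raw fp16 weights, versus ~130 GB for a 65B, before any quantization.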

[–] fallingdowndizzyvr@alien.top 1 points 11 months ago

I'm really interested in having a 51B model. I would love something between 34B and 65/70B.