Google released T5X checkpoints for MADLAD-400 a couple of months ago, but nobody could figure out how to run them. Turns out the vocabulary was wrong, but they uploaded the correct one last week.

I've converted the models to the safetensors format and created this space where you can try the smaller model.

I also published quantized GGUF weights you can use with candle. It decodes at ~15 tokens/s on an M2 Mac.

NLLB seems to be the most popular machine translation model right now, but its license only allows non-commercial use. MADLAD-400 is CC BY 4.0.

Puzzleheaded_Mall546@alien.top 1 points 1 year ago

I wrote some incomplete text to see how it would be translated, and the result is a continuation of my text, not a translation.

jbochi@alien.top 1 points 1 year ago

How are you running it? Did you prepend a "<2xx>" token for the target language? For example, "<2fr> hello" will translate "hello" into French. If you are using this space, you can select the target language in the dropdown.
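
If you're running it outside the space, here's a rough transformers sketch of the same thing. It's not the exact code behind the space, and the model id is just an example pointing at one of the converted checkpoints:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Example id for one of the safetensors conversions; swap in whichever size you downloaded.
model_id = "jbochi/madlad400-3b-mt"
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Prepend the "<2xx>" tag for the target language to the text you want translated.
text = "<2fr> hello"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Without the tag, the model doesn't know the target language, so it tends to just continue your text instead of translating it.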

Puzzleheaded_Mall546@alien.top 1 points 1 year ago

I am using the code from the space.

jbochi@alien.top 1 points 1 year ago

Got it. Can you please share the full prompt?