What do you consider a "large" cluster? How many MW of GPU capacity do you operate?
LocalLLaMA
Community for discussing Llama, the family of large language models created by Meta AI.
Please do something like this, or provide a detailed example, of how an open-source framework API can be added to a coder LLM.
How do we prepare the data, with code samples and docs, so the coder LLM learns to do code completion and answer documentation questions?
You can train on any dataset as long as it follows our format.
Soon we'll publish a video tutorial.
But what would be the proper formatting for code? Just paste in a bunch of files from a repo, or should it be more of a cheatsheet format?
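Since the service's actual schema isn't shown in this thread, here is one hedged guess at what a mixed code/documentation dataset could look like: prompt/completion pairs in JSONL, some cut from real source files for completion, some phrased as doc questions. Every field name and the `moving_average` function are made up for illustration.

```python
import json

# Hypothetical mixed dataset: code-completion pairs plus documentation
# Q&A, both in a generic prompt/completion shape. The real field names
# depend on whatever format the training service actually expects.
samples = [
    {
        # Completion-style sample: cut a real file at a natural boundary,
        # use the first part as the prompt and the rest as the target.
        "prompt": "def moving_average(xs, window):\n    ",
        "completion": "return [sum(xs[i:i+window]) / window\n"
                      "        for i in range(len(xs) - window + 1)]",
    },
    {
        # Documentation sample: a question about the API with an answer
        # drawn from the docs, so the model learns to explain code too.
        "prompt": "What does moving_average(xs, window) return?",
        "completion": "A list of arithmetic means over each sliding "
                      "window of length `window` in `xs`.",
    },
]

# One JSON object per line, the usual JSONL convention.
with open("train.jsonl", "w") as f:
    for s in samples:
        f.write(json.dumps(s) + "\n")
```

The mix matters more than the exact schema: repo files alone teach completion but not explanation, while a "cheatsheet" alone teaches facts but not style, so most recipes interleave both.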
Well, awesome. Thanks.
I'll be over here assembling some TV show transcripts for a fandom tune.
Out of curiosity, is it a full finetune or a LoRA? What context length?
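For anyone unsure why the full-finetune vs. LoRA question matters, a back-of-the-envelope sketch of the trainable-parameter gap on a single weight matrix. The dimensions and rank here are made up; real models have many such matrices.

```python
# Illustrative only: full finetuning updates every entry of a weight
# matrix W, while LoRA freezes W and trains a low-rank update B @ A.
d_in, d_out, rank = 4096, 4096, 16     # assumed dimensions, not any real model's

full_params = d_in * d_out             # every weight is trainable
lora_params = rank * (d_in + d_out)    # B: d_out x rank, A: rank x d_in

print(full_params)                     # 16777216
print(lora_params)                     # 131072
print(round(lora_params / full_params * 100, 2))  # 0.78 (% of the weights)
```

That roughly hundredfold reduction is why LoRA is the default for hobbyist tunes, and why the context length question is separate: it is set by how the training samples are packed, not by the adapter.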
cool stuff
👏
Just curious, why don't you plug it into Bittensor? I mean, you'd get the best of both worlds then?
Registered. I'm very interested and grateful to use it, but I haven't uploaded my dataset to Hugging Face yet, so I can't use it.
I also don't understand the new training paradigm where everything happens just by registering the model and dataset.
What is it that is running behind the scenes?
A very simple snippet or pseudocode would be helpful to understand.
For example: if I give you a model and a dataset, what code runs, and under what conditions is training considered finished?
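Nobody outside the service has confirmed what actually runs, but a managed fine-tuning pipeline usually has this shape: load the registered model and dataset, take gradient steps, and stop when the validation loss plateaus or a step budget is exhausted. A toy sketch of that control flow, with a 1-D quadratic standing in for the real loss:

```python
# Assumed, generic training loop: not the service's real code.
# loss(w) = w**2 stands in for a model's loss on the dataset.

def train(initial_w=10.0, lr=0.1, max_steps=1000, patience=5, tol=1e-6):
    """Minimise loss(w) = w**2; stop on plateau or step budget."""
    w = initial_w
    best = float("inf")
    stale = 0
    for step in range(max_steps):
        loss = w * w
        if best - loss > tol:
            best, stale = loss, 0        # still improving: reset patience
        else:
            stale += 1
            if stale >= patience:        # loss plateaued: early stop
                return w, step, "early_stop"
        w -= lr * 2 * w                  # gradient step: d(loss)/dw = 2w
    return w, max_steps, "budget_exhausted"

w, steps, reason = train()
print(reason)
```

So "training is finished" would typically mean one of two conditions fired: the loss stopped improving for `patience` evaluations, or `max_steps` ran out. Which one the service uses, and with what thresholds, is exactly the part that hasn't been disclosed.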
I don't want to be a party pooper, but this site doesn't seem legit. Does anyone have any info beyond what's been provided?
Hi, is it possible to train on a raw text file? I have about 2 GB of artistic text marked up with tags and titles; is it possible to train Mistral on it?
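Raw text usually works via continued pretraining rather than instruction tuning: no prompt/response format at all, just the corpus split into fixed-length blocks that each become one sample. A dependency-free sketch of that preparation, where characters stand in for tokens and the sample sentence is invented:

```python
# Assumed preprocessing for continued pretraining on a raw corpus:
# split the text into fixed-size blocks, optionally overlapping so no
# boundary context is lost. Block size would normally be the model's
# context length in tokens; characters stand in for tokens here.

def chunk_corpus(text, block_size, overlap=0):
    """Split text into equal blocks of block_size, dropping the last partial."""
    step = block_size - overlap
    blocks = []
    for start in range(0, len(text), step):
        block = text[start:start + block_size]
        if len(block) < block_size:      # drop-last: keep only full blocks
            break
        blocks.append(block)
    return blocks

corpus = "In the beginning the artist tagged every paragraph. " * 40
blocks = chunk_corpus(corpus, block_size=256, overlap=32)
print(len(blocks), len(blocks[0]))
```

Since the text already carries tags and titles, it may be worth keeping those markers in the raw stream; the model will pick up the markup conventions along with the prose.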
This is awesome! What training parameters are you using?
If you collapse does that end the universe????
Spidey-Senses tingling: Caution