
LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Hi!

I’m quite new to LLMs and want to use one to create training workouts. My idea would be to feed it scientific studies and a bunch of example workouts.

Is this what “training a model” is for? Are there any resources where I can start learning how to train one?

Can I use an already fine-tuned model like Mistral, or do I need to train a base model like Llama 2?

Can I train a quantized model, or do I need to use a vanilla one and quantize it after training?

I have 2x RTX 3090, a 5950X and 64 GB of RAM, if that matters. If I can load a model for inference, can I also train it? Are the resource requirements the same?
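
For context on the quantization question: the usual approach on this kind of hardware is QLoRA-style fine-tuning, where the base model is loaded in 4-bit and only small LoRA adapters are trained on top, so a 7B model fits comfortably on a single 3090. A minimal sketch of that setup, assuming the Hugging Face transformers/peft/bitsandbytes stack (the model name, target modules and hyperparameters are placeholders, not a tuned recipe):

```python
# QLoRA-style setup sketch: 4-bit base model + trainable LoRA adapters.
# Model name and hyperparameters below are illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "mistralai/Mistral-7B-v0.1"  # placeholder base model

# Load the frozen base weights in 4-bit so they fit on one 24 GB GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Only these small adapter matrices are trained; the 4-bit base stays frozen.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # shows the tiny fraction of trainable weights
```

After training, the adapters can be merged back into the full-precision weights and the result re-quantized for inference.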

Thanks!

bullerwins@alien.top 1 points 11 months ago

Hi! This is the first time I’m seeing SPR; any resource where I can learn more about it? I’ve seen privateGPT. I believe it’s a front end that lets you upload files, and I guess it builds a database using something like ChromaDB from what you feed it and takes that into consideration when giving answers. Is that right?
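
The retrieval idea behind tools like privateGPT can be sketched in a few lines with chromadb: document chunks go into a vector store, and the closest chunks are pulled back as extra context for each question. This is only the general pattern, not privateGPT's exact internals, and the collection name and example texts are made up:

```python
# Toy sketch of the retrieval pattern: embed document chunks, store them,
# then fetch the most similar chunks to prepend to the LLM prompt.
import chromadb

client = chromadb.Client()  # in-memory instance for the example
collection = client.create_collection("workout_studies")  # made-up name

# Add a couple of document chunks; chromadb embeds them with its default embedder.
collection.add(
    documents=[
        "Progressive overload across multi-week blocks drives hypertrophy.",
        "Rest intervals of 2-3 minutes favour strength-focused sets.",
    ],
    ids=["chunk-1", "chunk-2"],
)

# Retrieve the most relevant chunk for a question; in a RAG pipeline this text
# would be inserted into the prompt before asking the model to answer.
results = collection.query(
    query_texts=["How long should I rest between heavy sets?"],
    n_results=1,
)
print(results["documents"])
```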

ThinkExtension2328@alien.top 1 points 11 months ago

SPR is not a technology in itself; it’s a methodology for “compressing information” so that an AI can effectively fit a larger context input into the same size. David does a great video explaining the methodology behind it. I’ve found it to be useful as hell.
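
SPR here presumably stands for Sparse Priming Representations (the approach from David Shapiro's video): you ask a model to compress a document into a short list of terse priming statements and put that compressed form into the context instead of the full text. A rough sketch of the idea using a local instruct model via the transformers pipeline; the prompt wording and model name are placeholders, not the exact method from the video:

```python
# Rough sketch of the SPR idea: one LLM call distils a long document into terse
# priming statements, which then stand in for the full text in later prompts.
from transformers import pipeline

# Placeholder local instruct model; any chat/instruct model would do.
generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.1",
    device_map="auto",
)

SPR_PROMPT = (
    "Distill the following document into a Sparse Priming Representation: "
    "a short list of terse statements, concepts and associations that another "
    "LLM could use to reconstruct the original ideas.\n\n"
    "Document:\n{document}\n\nSPR:"
)

def compress(document: str) -> str:
    out = generator(
        SPR_PROMPT.format(document=document),
        max_new_tokens=256,
        return_full_text=False,
    )
    return out[0]["generated_text"]

# The compressed SPR, not the original document, is what goes into later prompts,
# so far more source material fits into the same context window.
```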