this post was submitted on 19 Nov 2023

LocalLLaMA


Community to discuss Llama, the family of large language models created by Meta AI.


Recently, I got interested in fine-tuning low-parameter models on my low-end hardware. My specs are: i7-1195G7, 32 GB RAM, and no dedicated GPU. I want to fine-tune a model to capture my writing style, based on years of text I've written myself. Right now, I'm looking at fine-tuning TinyLlama. Is this possible, and if so, roughly how long would the fine-tune take?
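For reference, a CPU-only LoRA fine-tune of TinyLlama could look roughly like the sketch below, assuming the Hugging Face transformers + peft + datasets stack. The checkpoint name and the `my_writing.txt` path are placeholders, not details from this thread.

```python
# Rough sketch: LoRA fine-tune of TinyLlama on CPU with transformers + peft.
# Checkpoint name and data file are placeholders.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model
from datasets import load_dataset

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Wrap the base model with small LoRA adapters so only a few million
# parameters are trained instead of the full 1.1B.
lora = LoraConfig(r=8, lora_alpha=16,
                  target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Turn the raw writing samples into tokenized causal-LM training examples.
data = load_dataset("text", data_files={"train": "my_writing.txt"})
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)
data = data.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="tinyllama-lora",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    logging_steps=10,
)  # with no GPU visible, Trainer falls back to running on CPU

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=data["train"],
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Even with LoRA, expect CPU-only training over years of text to take a very long time; the exact duration depends on dataset size, sequence length, and epochs.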

1 comment
__SlimeQ__@alien.top · 11 months ago

I can't speak for 1B models, but you're going to have a really hard time training with no GPU. It's just going to take an insanely long time.

For around $500, though, you can get a 4060 Ti with 16 GB of VRAM, which is enough to train a LoRA on a 13B model.
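A minimal sketch of that kind of setup, assuming the transformers + bitsandbytes + peft stack: the 13B model is loaded in 4-bit (QLoRA-style) so the base weights plus LoRA adapters fit in roughly 16 GB of VRAM. The checkpoint name is a placeholder, not something specified in the thread.

```python
# Sketch: load a 13B model in 4-bit and attach LoRA adapters so training
# fits on a 16 GB GPU. Checkpoint name is a placeholder.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-13b-hf",   # placeholder 13B checkpoint
    quantization_config=bnb,
    device_map="auto",             # place the quantized weights on the GPU
)
model = prepare_model_for_kbit_training(model)
model = get_peft_model(
    model,
    LoraConfig(r=16, lora_alpha=32,
               target_modules=["q_proj", "v_proj"],
               task_type="CAUSAL_LM"),
)
model.print_trainable_parameters()  # only the small LoRA matrices are trained
```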