this post was submitted on 17 Nov 2023

LocalLLaMA


Community to discuss about Llama, the family of large language models created by Meta AI.

[–] Wonderful_Ad_5134@alien.top 1 points 11 months ago (1 children)

I'm getting tired of all these merges, as if merging were the magical solution to everything

[–] arekku255@alien.top 1 points 11 months ago

At a high level, merging lets you get the benefit of a lot of training very cheaply.

If you have one model finetuned on set A and another model finetuned on set B, merging them lets you very cheaply produce a model that behaves roughly as if it had been trained on both set A and set B.

It is the magical solution to "I can't afford to finetune a model".
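
For anyone curious what a merge actually does mechanically, here's a minimal sketch of the simplest variant: a linear (weighted-average) merge of two checkpoints that share the same architecture. The file names and the alpha value are made up for illustration; real merging tools like mergekit build fancier methods (SLERP, TIES, etc.) on top of this basic idea.

```python
import torch

def linear_merge(state_a, state_b, alpha=0.5):
    """Weighted average of two state dicts with identical keys and shapes.

    Assumes both checkpoints come from the same base model, so every
    parameter tensor lines up one-to-one. Only float tensors are blended.
    """
    merged = {}
    for key in state_a:
        if torch.is_floating_point(state_a[key]):
            merged[key] = alpha * state_a[key] + (1 - alpha) * state_b[key]
        else:
            # Integer buffers (e.g. step counters) can't be averaged sensibly.
            merged[key] = state_a[key]
    return merged

# Hypothetical checkpoint paths, for illustration only.
state_a = torch.load("model_finetuned_A.pt", map_location="cpu")
state_b = torch.load("model_finetuned_B.pt", map_location="cpu")

merged = linear_merge(state_a, state_b, alpha=0.5)
torch.save(merged, "model_merged.pt")
```

No gradient computation, no training data, no GPU hours: just a pass over the weights. That's why it's so popular with people who can't afford a finetuning run, and also why the results can be hit or miss.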