[–] nohodlnodough@alien.top 1 points 10 months ago (1 children)

Reviews are out. Does anyone know if we are allowed to make modifications to the paper, or are we only allowed to respond directly to the reviewers' feedback?

 

I want to train LLaMA using QLoRA on multiple tasks sequentially, i.e. task A -> B -> C. Would it be possible to combine, say, the adapter weights trained on A with B, and subsequently A with C? How can I go about doing this?

Has anyone tried doing so and achieved reasonable results? I am aiming for task A to be continual pretraining for domain adaptation, while B and C are the downstream tasks.
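Not an answer from experience, but a note on why combining adapters is at least mechanically plausible: a LoRA adapter is a low-rank additive delta on each frozen base weight, `W_eff = W0 + (alpha/r) * B @ A`, so two adapters trained separately can be summed onto the same base matrix (this is roughly what `add_weighted_adapter` in the `peft` library does for LoRA, if I understand it correctly). A minimal NumPy sketch of that arithmetic, with random stand-ins for the "trained" adapters (all names and values here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2                        # hidden size, LoRA rank (toy values)
alpha = 4.0                        # LoRA scaling factor
W0 = rng.standard_normal((d, d))   # frozen base weight

# Stand-ins for adapters trained on task A and task B.
A_a, B_a = rng.standard_normal((r, d)), rng.standard_normal((d, r))
A_b, B_b = rng.standard_normal((r, d)), rng.standard_normal((d, r))

def lora_delta(B, A, alpha, r):
    """Low-rank weight update contributed by one LoRA adapter."""
    return (alpha / r) * (B @ A)

# Combining the A-task and B-task adapters = adding both deltas to W0.
W_combined = W0 + lora_delta(B_a, A_a, alpha, r) + lora_delta(B_b, A_b, alpha, r)

# Order of addition does not matter, so "A then B" and "B then A" merge
# to the same effective weight.
W_reordered = W0 + lora_delta(B_b, A_b, alpha, r) + lora_delta(B_a, A_a, alpha, r)
assert np.allclose(W_combined, W_reordered)
```

Whether the *result* is any good is a separate question: summing deltas can cause interference between tasks, which is presumably why you'd want A (domain adaptation) merged into the base before training B and C on top, rather than merging all three after the fact.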