Practical Tips for Finetuning LLMs Using LoRA (Low-Rank Adaptation)
(magazine.sebastianraschka.com)
From my experience, here are some other things worth knowing about LoRA:
+ FSDP doesn't work with LoRA out of the box: FSDP flattens parameters into units that must have uniform requires_grad, which conflicts with LoRA's mix of frozen base weights and trainable adapter weights (PyTorch's use_orig_params=True is one workaround).
+ For QLoRA, currently only DeepSpeed ZeRO-2 can be used (ZeRO-3 is not supported).
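For context on why only the adapter weights are trainable: LoRA freezes the base weight matrix W and learns a low-rank update BA in its place. A minimal NumPy sketch of that idea (illustrative only, not the PEFT/bitsandbytes implementation; all names and sizes here are made up):

```python
import numpy as np

d_in, d_out, r = 1024, 1024, 8  # rank r is much smaller than the layer dims
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection, zero-init so BA = 0 at start

x = rng.standard_normal(d_in)
y = W @ x + B @ (A @ x)  # LoRA forward pass: frozen base output + low-rank update

# at initialization the adapter contributes nothing, so behavior matches the base model
assert np.allclose(y, W @ x)

# trainable parameter count vs. full finetuning
full = W.size            # 1,048,576
lora = A.size + B.size   # 16,384
print(f"trainable: {lora} vs {full} ({lora / full:.2%})")
```

This is why the FSDP/DeepSpeed issues above arise: the training setup has to handle a model where most parameters (W) are frozen and only a tiny fraction (A, B) receive gradients.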