Have been working with this extensively for the past 6 months, specifically LLM and Whisper fine-tuning.
LoRA definitely gives a significant boost in training speed, as well as a massive reduction in memory requirements.
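For a sense of where the memory savings come from (the dimensions below are just a hypothetical example, not from any specific model): LoRA freezes the original weight matrix and trains only two small low-rank factors, so the number of trainable parameters drops by orders of magnitude.

```python
# LoRA replaces the update to a full d_out x d_in weight matrix with
# two low-rank factors: B (d_out x r) and A (r x d_in), where r << d.
# Hypothetical numbers: a 4096x4096 projection layer with rank r=8.

def full_params(d_out: int, d_in: int) -> int:
    # Trainable weights in full fine-tuning of this layer
    return d_out * d_in

def lora_params(d_out: int, d_in: int, r: int) -> int:
    # Trainable weights with a rank-r LoRA adapter on the same layer
    return d_out * r + r * d_in

d, r = 4096, 8
print(full_params(d, d))   # 16777216 weights updated in full fine-tuning
print(lora_params(d, d, r))  # 65536 weights with LoRA (~0.4% of the above)
```

Optimizer state (e.g. Adam's two moments) is only kept for the trainable parameters, which is where most of the memory reduction comes from in practice.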