Relevant_Outcome_726@alien.top to LocalLLaMA@poweruser.forum • Practical Tips for Finetuning LLMs Using LoRA (Low-Rank Adaptation)
1 year ago · From my experience, here are some other things worth knowing about LoRA:
+ FSDP doesn’t work for LoRA out of the box, because FSDP requires all parameters in a wrapped module to be either trainable or frozen, while LoRA mixes frozen base weights with trainable adapter weights.
+ For QLoRA, currently we can only use DeepSpeed ZeRO-2 (ZeRO-3 is not supported).
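Since ZeRO-3 shards the parameters themselves (which clashes with quantized base weights), the QLoRA point above amounts to passing a stage-2 DeepSpeed config to your trainer. Here is a minimal sketch of such a config; the exact values ("auto", CPU offload) are illustrative assumptions, not a tested recipe:

```python
# Minimal DeepSpeed ZeRO stage-2 config sketch for QLoRA training.
# Stage 2 shards only optimizer states and gradients, leaving the
# (quantized, frozen) base parameters unsharded on each GPU.
# Values below are illustrative assumptions, not a verified recipe.
import json

ds_config = {
    "train_micro_batch_size_per_gpu": "auto",
    "gradient_accumulation_steps": "auto",
    "bf16": {"enabled": "auto"},
    "zero_optimization": {
        "stage": 2,  # ZeRO-2: shard optimizer states + gradients only
        "offload_optimizer": {"device": "cpu"},  # optional CPU offload
        "overlap_comm": True,
        "contiguous_gradients": True,
    },
}

# Write the config so it can be passed via --deepspeed ds_zero2.json
with open("ds_zero2.json", "w") as f:
    json.dump(ds_config, f, indent=2)

print(ds_config["zero_optimization"]["stage"])  # → 2
```

With Hugging Face `Trainer`, a file like this is typically passed through `TrainingArguments(deepspeed="ds_zero2.json")`; the key point is `"stage": 2` rather than 3.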
Can you provide the prompt you use for function calling?