- `adapter_config.json` - The Hugging Face adapter configuration file.
- `adapter_model.bin` or `adapter_model.safetensors` - The saved adapter (addon) weights file.

`adapter_config.json` must contain the following fields:

- `r` - The LoRA rank. Must be an integer between 4 and 64, inclusive.
- `target_modules` - A list of target modules. Currently the following target modules are supported:
  - `q_proj`
  - `k_proj`
  - `v_proj`
  - `o_proj`
  - `up_proj` or `w1`
  - `down_proj` or `w2`
  - `gate_proj` or `w3`
  - `block_sparse_moe.gate`

The adapter directory must also include a `fireworks.json` file containing:
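As a sketch, an `adapter_config.json` satisfying the constraints above might look like the following. The `r` and `target_modules` values are illustrative, and the remaining fields shown (`lora_alpha`, `peft_type`, `base_model_name_or_path`) are standard Hugging Face PEFT configuration fields included for context, not requirements stated here:

```json
{
  "base_model_name_or_path": "mistralai/Mistral-7B-v0.1",
  "peft_type": "LORA",
  "r": 8,
  "lora_alpha": 16,
  "target_modules": [
    "q_proj",
    "k_proj",
    "v_proj",
    "o_proj"
  ]
}
```

Note that `r` is within the allowed 4–64 range and every entry in `target_modules` comes from the supported list above.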