support LTX2.3 hot switch (multiple) lora && support LTX2.3 S2V mode #1028

helloyongyang merged 4 commits into main from
Conversation
Code Review
This pull request introduces the ltx2_s2v task for audio-conditioned video generation and adds functionality to apply multiple LoRA adapters simultaneously by merging their weights. Review feedback identifies a bug in the LoRA merging logic where keys not present in the first file are ignored, and notes redundant audio assignment logic in the LTX2 runner. Additionally, suggestions were provided to reduce code duplication in the LTX2 model's LoRA update method and to improve shell script robustness through proper variable quoting.
```python
for k, v in weight_dict.items():
    if k in merged:
        merged[k] = merged[k] + v * s
```
There's a bug in the logic for merging multiple LoRA weight dictionaries: if a later LoRA file contains keys that are not present in the first file, they are silently dropped, because the loop only updates keys that already exist in the merged dictionary.
A key that exists in the new `weight_dict` but not in `merged` should be added to `merged` instead of being ignored.
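To make the fix concrete, here is a small, self-contained sketch of the intended behavior; the helper name `merge_lora_weights` and the plain-float "weights" are illustrative stand-ins, not the project's actual API or tensor types:

```python
def merge_lora_weights(lora_dicts, strengths):
    """Merge several LoRA weight dicts into one, scaling each by its strength.

    Keys missing from earlier dicts are added rather than dropped, which is
    the behavior the review asks for.
    """
    merged = {}
    for weight_dict, s in zip(lora_dicts, strengths):
        for k, v in weight_dict.items():
            if k in merged:
                merged[k] = merged[k] + v * s  # accumulate overlapping keys
            else:
                merged[k] = v * s  # keep keys unique to this adapter
    return merged
```

With floats standing in for tensors, merging `{"a": 1.0}` and `{"a": 1.0, "b": 2.0}` at strengths 0.5 and 1.0 yields `{"a": 1.5, "b": 2.0}`, so the key `"b"` is no longer lost.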
Suggested change:

```python
for k, v in weight_dict.items():
    if k in merged:
        merged[k] += v * s
    else:
        merged[k] = v * s
```

```python
def _update_lora(self, lora_path, strength):
    if isinstance(lora_path, dict):
        lora_weight = self._remap_lora_dict_keys_for_mm_weight(lora_path)
    else:
        lora_weight = self._load_lora_file(lora_path)
    self.pre_weight.update_lora(lora_weight, strength)
    self.transformer_weights.update_lora(lora_weight, strength)
    self.post_weight.update_lora(lora_weight, strength)
```
This `_update_lora` method duplicates much of the logic in the base class `BaseTransformerModel._update_lora`. You can simplify it by doing the LTX2-specific key remapping first and then calling the super method, which reduces code duplication and improves maintainability.
Suggested change:

```diff
 def _update_lora(self, lora_path, strength):
     if isinstance(lora_path, dict):
-        lora_weight = self._remap_lora_dict_keys_for_mm_weight(lora_path)
-    else:
-        lora_weight = self._load_lora_file(lora_path)
-    self.pre_weight.update_lora(lora_weight, strength)
-    self.transformer_weights.update_lora(lora_weight, strength)
-    self.post_weight.update_lora(lora_weight, strength)
+        lora_path = self._remap_lora_dict_keys_for_mm_weight(lora_path)
+    super()._update_lora(lora_path, strength)
```
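As a rough, runnable illustration of the delegation pattern (the classes below are simplified stand-ins, not the real LightX2V implementations), the subclass only normalizes its input and reuses the base method:

```python
class BaseTransformerModel:
    def __init__(self):
        self.applied = []  # records (weights, strength) pairs that were applied

    def _load_lora_file(self, path):
        # stand-in for reading a LoRA checkpoint from disk
        return {"weight_from_" + path: 1.0}

    def _update_lora(self, lora_path, strength):
        # shared logic: load if given a path, then apply the weights
        if isinstance(lora_path, dict):
            lora_weight = lora_path
        else:
            lora_weight = self._load_lora_file(lora_path)
        self.applied.append((lora_weight, strength))


class LTX2Model(BaseTransformerModel):
    def _remap_lora_dict_keys_for_mm_weight(self, weights):
        # stand-in for the LTX2-specific key remapping
        return {k.replace("lora.", ""): v for k, v in weights.items()}

    def _update_lora(self, lora_path, strength):
        # do only the LTX2-specific remapping, then delegate to the base class
        if isinstance(lora_path, dict):
            lora_path = self._remap_lora_dict_keys_for_mm_weight(lora_path)
        super()._update_lora(lora_path, strength)
```

The subclass override shrinks to the one step that is actually LTX2-specific, so any later change to loading or application logic in the base class is picked up automatically.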
```python
save_audio = self._ltx2_s2v_mux_audio
save_video(
```
```shell
export CUDA_VISIBLE_DEVICES=0

# set environment variables
source ${lightx2v_path}/scripts/base/base.sh
```
It's good practice in shell scripting to quote variable expansions to prevent issues with paths that contain spaces or other special characters. Please wrap `${lightx2v_path}` in double quotes.
Suggested change:

```diff
-source ${lightx2v_path}/scripts/base/base.sh
+source "${lightx2v_path}/scripts/base/base.sh"
```
No description provided.