Support for Mixed Preference Optimization (MPO)#1614

Open
LarryLeeee wants to merge 4 commits into InternLM:main from LarryLeeee:mpo-clean-v2
Conversation

@LarryLeeee

Implement Mixed Preference Optimization (MPO) for Vision-Language Models in XTuner v1.
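As context for reviewers: MPO, as described in the literature, mixes a DPO-style preference loss, a quality loss on individual responses, and an SFT generation loss on the chosen response into one weighted objective. The sketch below is a hypothetical, dependency-free illustration of that combination on per-sample log-probabilities; it is not taken from this PR's diff, and the weight values, the `mpo_loss` name, and the simplified per-pair reward baseline (real BCO-style quality losses track a running mean) are all assumptions.

```python
import math

def _logsigmoid(x: float) -> float:
    # Numerically stable log(sigmoid(x)).
    return -math.log1p(math.exp(-x)) if x >= 0 else x - math.log1p(math.exp(x))

def mpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             chosen_nll: float,
             beta: float = 0.1,
             w_pref: float = 0.8, w_qual: float = 0.1, w_gen: float = 0.1) -> float:
    """Hypothetical sketch of a mixed preference objective (not the PR's code).

    Combines a DPO-style pairwise preference loss, a BCO-style quality loss,
    and an SFT negative log-likelihood on the chosen response.
    """
    # Implicit rewards: scaled log-ratios of policy vs. reference model.
    r_chosen = beta * (policy_chosen_logp - ref_chosen_logp)
    r_rejected = beta * (policy_rejected_logp - ref_rejected_logp)

    # Preference loss (DPO): push the chosen reward above the rejected one.
    l_pref = -_logsigmoid(r_chosen - r_rejected)

    # Quality loss (BCO-style): judge each response against a reward baseline.
    # Simplification: per-pair mean instead of a running mean across batches.
    delta = 0.5 * (r_chosen + r_rejected)
    l_qual = -_logsigmoid(r_chosen - delta) - _logsigmoid(-(r_rejected - delta))

    # Generation loss (SFT): keep likelihood of the chosen response high.
    l_gen = chosen_nll

    return w_pref * l_pref + w_qual * l_qual + w_gen * l_gen
```

Raising the policy's log-probability of the chosen response (with everything else fixed) should lower the combined loss, which is a quick sanity check for any implementation of this objective.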

@LarryLeeee changed the title from "Mpo clean v2" to "Support for Mixed Preference Optimization (MPO)" on Mar 23, 2026
@LarryLeeee
Author

@claude review

@nil0x9
Collaborator

nil0x9 commented Mar 24, 2026

2 participants