Koratahiu

9 issues and pull requests by Koratahiu

### Feature Idea

ComfyUI already supports OFT. OFTv2 is better, lighter, and faster: https://spherelab.ai/oftv2/ It's included in the PEFT library: https://github.com/huggingface/peft/tree/main/src/peft/tuners/oft And this PR in OneTrainer is waiting for support on...
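For reference, a minimal PyTorch sketch of the basic OFT idea behind this request: a trainable skew-symmetric matrix is mapped through the Cayley transform to an orthogonal rotation applied to a frozen weight. This is an illustration only, with a hypothetical class name and no block-diagonal structure, and it uses the exact matrix inverse rather than the cheaper Neumann-series approximation that OFTv2 reportedly introduces.

```python
import torch
import torch.nn as nn


class OFTLinear(nn.Module):
    """Illustrative OFT-style wrapper (not OFTv2's parameterization):
    learn Q, build the orthogonal Cayley transform of its skew part,
    and rotate a frozen base weight with it."""

    def __init__(self, base: nn.Linear):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)
        d = base.out_features
        self.q = nn.Parameter(torch.zeros(d, d))  # trainable rotation parameters

    def forward(self, x):
        q = self.q - self.q.T                            # skew-symmetric part
        eye = torch.eye(q.shape[0], device=q.device, dtype=q.dtype)
        r = torch.linalg.solve(eye + q, eye - q)         # Cayley transform -> orthogonal R
        w = r @ self.base.weight                         # rotate the frozen weight
        return nn.functional.linear(x, w, self.base.bias)
```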


This PR introduces **Muon** and **AdaMuon** as new experimental optimizers for `adv-optm`.

1. **[Muon](https://kellerjordan.github.io/posts/muon/)**: an orthogonalizing optimizer that has demonstrated strong performance across diverse tasks....
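For context, a minimal sketch of the Newton-Schulz iteration Muon uses to approximately orthogonalize the 2D momentum matrix before the weight update; the coefficients follow Keller Jordan's reference implementation. AdaMuon's additional adaptive scaling is not shown, and the surrounding momentum loop in the comments is only indicative of where this step fits.

```python
import torch


def newton_schulz_orthogonalize(g: torch.Tensor, steps: int = 5) -> torch.Tensor:
    """Approximately orthogonalize a 2D matrix, as in Muon's update."""
    a, b, c = 3.4445, -4.7750, 2.0315
    x = g / (g.norm() + 1e-7)            # keep the iteration in its convergence region
    transposed = x.shape[0] > x.shape[1]
    if transposed:
        x = x.T
    for _ in range(steps):
        s = x @ x.T
        x = a * x + (b * s + c * s @ s) @ x
    return x.T if transposed else x


# Sketch of where this sits in a Muon-style step (hypothetical names):
#   m = beta * m + grad                          # momentum buffer
#   p -= lr * newton_schulz_orthogonalize(m)     # orthogonalized update
```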

This Pull Request introduces a new boolean option, `Compiled Optimizer`, to all advanced optimizers, allowing the core update logic to be compiled using **`torch.compile`** (tested on PyTorch 2.8). By using...
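As an illustration of the approach (not the PR's actual code), the per-parameter update can be written as a plain tensor function and decorated with `torch.compile`, letting the compiler fuse the elementwise operations into fewer kernels. The AdamW-style step below is a hypothetical stand-in for `adv-optm`'s update logic.

```python
import torch


@torch.compile  # compiles and fuses the elementwise update (PyTorch 2.x)
def adamw_update(p, grad, exp_avg, exp_avg_sq, lr, beta1, beta2, eps, wd, step):
    # Hypothetical single-tensor AdamW step, shown only to illustrate compiling
    # an optimizer's core update; state tensors are modified in place.
    exp_avg.mul_(beta1).add_(grad, alpha=1 - beta1)
    exp_avg_sq.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)
    bias1 = 1 - beta1 ** step
    bias2 = 1 - beta2 ** step
    denom = (exp_avg_sq / bias2).sqrt_().add_(eps)
    p.mul_(1 - lr * wd).addcdiv_(exp_avg, denom, value=-lr / bias1)
```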

This PR integrates the **Diff2Flow** methodology, based on the paper "[Diff2Flow: Training Flow Matching Models via Diffusion Model Alignment](https://arxiv.org/abs/2506.02221)". This feature adapts a pre-trained diffusion model to a flow matching...
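For orientation only, here is a generic rectified-flow-style flow matching loss with a linear interpolant. Diff2Flow's actual contribution is aligning a pretrained diffusion model's timesteps and predictions to such an objective, which is not reproduced here, and the `model(xt, t)` signature is an assumption.

```python
import torch
import torch.nn.functional as F


def flow_matching_loss(model, x1):
    """Generic flow-matching objective (linear interpolant, constant velocity target).
    Not Diff2Flow's diffusion-to-flow alignment; for reference only."""
    x0 = torch.randn_like(x1)                         # noise endpoint
    t = torch.rand(x1.shape[0], device=x1.device)     # uniform timesteps in [0, 1)
    tb = t.view(-1, *([1] * (x1.dim() - 1)))          # broadcast to data shape
    xt = (1 - tb) * x0 + tb * x1                      # point on the straight path
    v_target = x1 - x0                                # velocity of the linear path
    v_pred = model(xt, t)                             # assumed model signature
    return F.mse_loss(v_pred, v_target)
```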


Tested on a 4-bit GGUF and it worked


This PR implements the timestep sampling method from [A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training](https://arxiv.org/abs/2405.17403). Claims 3x faster pretraining at the same quality:...
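A generic sketch of non-uniform timestep sampling, only to show where such a method plugs into the training loop; the weighting the paper derives is not reproduced, and the example weights below are purely hypothetical.

```python
import torch


def sample_timesteps(weights: torch.Tensor, batch_size: int) -> torch.Tensor:
    """Draw training timesteps from a non-uniform distribution instead of
    sampling uniformly; `weights` (length T) is whatever per-timestep
    weighting the chosen scheme prescribes."""
    probs = weights / weights.sum()
    return torch.multinomial(probs, batch_size, replacement=True)


# Hypothetical weighting that de-emphasizes the noisiest timesteps.
T = 1000
w = torch.linspace(1.0, 0.2, T)
t = sample_timesteps(w, batch_size=8)
```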

A small modification to bring **Generalized Offset Noise (GON)** to inference by adding offset noise to the initial latent. The [GON paper](https://arxiv.org/abs/2412.03134) clearly states:

> “Our approach modifies both the...
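A sketch of the kind of change described: channel-wise offset noise added to the initial latent before sampling, mirroring offset-noise training. The `offset_strength` knob and shapes are assumptions; the paper's generalized formulation is not reproduced exactly.

```python
import torch


def initial_latent_with_offset_noise(shape, offset_strength=0.05, generator=None):
    """Build an initial latent with a per-channel noise offset added on top
    of the usual Gaussian sample (illustrative, not the PR's exact code)."""
    b, c, h, w = shape
    latent = torch.randn(shape, generator=generator)
    offset = torch.randn((b, c, 1, 1), generator=generator)  # per-channel shift
    return latent + offset_strength * offset
```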

Adds the original Muon optimizer: https://github.com/KellerJordan/Muon

Includes: #1064

This PR introduces **[LoKr (Low-Rank Kronecker Product)](https://huggingface.co/docs/peft/en/package_reference/lokr)**, a new parameter-efficient training method available alongside LoRA and LoHa. LoKr constructs the update matrix from the Kronecker product of two smaller, more...
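A minimal sketch of the LoKr idea applied to a plain linear layer: the weight delta is the Kronecker product of two small trainable factors, so far fewer parameters are trained than the full matrix. The factor shapes, init, and class name are illustrative, not PEFT's exact factorization.

```python
import torch
import torch.nn as nn


class LoKrLinear(nn.Module):
    """Illustrative LoKr-style adapter: dW = kron(A, B), where A is (a1, a2)
    and B is (out/a1, in/a2), so dW matches the frozen weight's shape while
    only a1*a2 + (out/a1)*(in/a2) parameters are trained."""

    def __init__(self, base: nn.Linear, a_shape=(4, 4), scale=1.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)
        out_f, in_f = base.weight.shape
        b_shape = (out_f // a_shape[0], in_f // a_shape[1])   # must divide evenly
        self.A = nn.Parameter(torch.zeros(*a_shape))          # zero-init => dW starts at 0
        self.B = nn.Parameter(torch.randn(*b_shape) * 0.01)
        self.scale = scale

    def forward(self, x):
        dw = torch.kron(self.A, self.B) * self.scale          # (out_f, in_f) update
        return nn.functional.linear(x, self.base.weight + dw, self.base.bias)
```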