Neural_Color_Transfer
Implementation of Neural Color Transfer between Images in PyTorch.
Neural_Color_Transfer_PyTorch_Implementation
In progress
We are implementing Neural Color Transfer between Images by Mingming He et al. in PyTorch. The authors later revised the paper; version 2 is titled Progressive Color Transfer with Dense Semantic Correspondences.
Paper
Progressive Color Transfer with Dense Semantic Correspondences
Abstract
We propose a new algorithm for color transfer between images that have perceptually similar semantic structures. We aim to achieve a more accurate color transfer that leverages semantically-meaningful dense correspondence between images. To accomplish this, our algorithm uses neural representations for matching. Additionally, the color transfer should be spatially variant and globally coherent. Therefore, our algorithm optimizes a local linear model for color transfer satisfying both local and global constraints. Our proposed approach jointly optimizes matching and color transfer, adopting a coarse-to-fine strategy. The proposed method can be successfully extended from one-to-one to one-to-many color transfer. The latter further addresses the problem of mismatching elements of the input image. We validate our proposed method by testing it on a large variety of image content.
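At the core of the method described above is a per-pixel local linear color model. As a minimal sketch of what that model looks like when applied (assuming the form T(p) = a(p)·S(p) + b(p) from the paper; the function and tensor names below are illustrative and not taken from this repo's code):

```python
import torch

def apply_local_linear_model(source, a, b):
    """Apply a per-pixel linear color transform T(p) = a(p) * S(p) + b(p).

    source, a, b: tensors of shape (B, C, H, W); a and b are the
    spatially varying linear coefficients estimated at each level.
    """
    return a * source + b
```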
Process
Implemented single-reference Neural Color Transfer and replaced the WLS-based filter with a Deep Guided Filter.
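For context on the filtering step: the repo uses a Deep Guided Filter, but the sketch below is the classic (non-deep) guided filter, written for single-channel PyTorch tensors. It is only meant to illustrate what the smoothing stage does to spatially varying quantities such as the linear coefficients; it is not the module used in this implementation.

```python
import torch
import torch.nn.functional as F

def box_filter(x, r):
    """Mean filter over a (2r+1)x(2r+1) window via avg_pool2d."""
    return F.avg_pool2d(x, kernel_size=2 * r + 1, stride=1,
                        padding=r, count_include_pad=False)

def guided_filter(guide, src, r=8, eps=1e-4):
    """Classic guided filter: smooth `src` while following edges of `guide`.

    guide, src: (B, 1, H, W) tensors. Returns the filtered output.
    """
    mean_I = box_filter(guide, r)
    mean_p = box_filter(src, r)
    cov_Ip = box_filter(guide * src, r) - mean_I * mean_p
    var_I = box_filter(guide * guide, r) - mean_I * mean_I

    a = cov_Ip / (var_I + eps)          # per-window linear slope
    b = mean_p - a * mean_I             # per-window linear offset

    mean_a = box_filter(a, r)
    mean_b = box_filter(b, r)
    return mean_a * guide + mean_b
```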
Pipeline

Results
Input Image

Style Image

The results for these images are shown below, from layer 5 down to layer 1.
L=5

L=4

L=3

L=2

L=1

TODO
- Need to modularize our notebook implementation.
- Our guidance and result images differ from those in the original paper, so some of the code needs to be fixed.
- Performance is much slower than reported in the original paper.
- Once single-reference transfer works well, extend the implementation to multi-reference.
- Refine the Markdown.