Unable to get proper output from NRD
I am integrating NRD into my rendering software and I am close to having it running. But I am still only "close". Here is what I get when I use the OptiX denoiser; it works properly: https://youtu.be/oGUy7VgZD0o
With NRD, when denoising starts I get a fullscreen pink, which is the color I use to initialize the output DirectX 12 textures: https://youtu.be/UgYthvk9pPk
To help debugging, here are the first frame inputs: https://www.dropbox.com/scl/fi/jfdwt470rnn6cjmcqy5yc/NRD-AOVs.zip?rlkey=7q1i2hvqddo01en0g52e3n5vh&st=44rr8jz5&dl=0
There is a lot to say, so here is the source code: https://www.dropbox.com/scl/fi/r07shlham0tlg94rm0xbv/Source-code.zip?rlkey=goh61m43lakhgnkal02p5wqw9&st=w486xmv8&dl=0
Denoising happens here:
//=======================================================================================================
// PERFORM DENOISING
//=======================================================================================================
NrdUserPool userPool = {};
{
    NrdIntegration_SetResource(userPool, nrd::ResourceType::IN_MV, *In_NRD_MotionVectorsTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::IN_VIEWZ, *In_NRD_ViewZTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::IN_NORMAL_ROUGHNESS, *In_NRD_NormalRoughnessTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::IN_BASECOLOR_METALNESS, *In_NRD_BaseColorMetalnessTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::IN_DIFF_RADIANCE_HITDIST, *In_NRD_DiffuseRadianceHitTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::IN_SPEC_RADIANCE_HITDIST, *In_NRD_SpecularRadianceHitTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::OUT_DIFF_RADIANCE_HITDIST, *Out_NRD_DiffuseRadianceHitTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::OUT_SPEC_RADIANCE_HITDIST, *Out_NRD_SpecularRadianceHitTexture);
    NrdIntegration_SetResource(userPool, nrd::ResourceType::OUT_VALIDATION, *Out_NRD_ValidationTexture);
}
const nrd::Identifier denoiserId = NRD_ID(REBLUR_DIFFUSE_SPECULAR);
m_NRD->Denoise(&denoiserId, 1, *ConstantNRDContextSingleton::instance()->getNRICommandBuffer(), userPool);
So when I run denoising I get the color from the OutDebugColor image. This means denoising is not performed at all: NRD doesn't write anything to the output textures (Out_NRD_DiffuseRadianceHitTexture and Out_NRD_SpecularRadianceHitTexture).
For instance, commenting out this line gives the same result: // m_NRD->Denoise(&denoiserId, 1, *ConstantNRDContextSingleton::instance()->getNRICommandBuffer(), userPool);
So I am doing something wrong, but what? The debug layers don't report anything. And unfortunately my pipeline is not linked to my swapchain, so taking a PIX capture won't give more info. The only suspicious thing I can think of is that I am using an RGBA32F image for all NRD inputs. I should use RGBA8 for the out validation texture, but the issue is not there. I will try to implement the specific formats tomorrow and see if that helps.
All provided info is useless, because it's just a prerequisite to get NRD running. Please enable NRD validation:
- set CommonSettings::enableValidation = true
- pass OUT_VALIDATION (already here) and manually overlay it on top of your final image
- grab and attach a screenshot
https://github.com/NVIDIAGameWorks/RayTracingDenoiser?tab=readme-ov-file#validation-layer
[UPDATE]
"I am using an RGBA32F image for all NRD inputs"
If NormalEncoding and RoughnessEncoding are respected, it probably should work... except for materialID. So I hope R10_G10_B10_A2_UNORM has not been selected for normal encoding, i.e. NRD is compiled with another encoding, because this one is the default.
Here is the validation output:
Here is the CPU input fed to NRD: https://www.dropbox.com/scl/fi/jfdwt470rnn6cjmcqy5yc/NRD-AOVs.zip?rlkey=7q1i2hvqddo01en0g52e3n5vh&st=rgp6p87h&dl=0
Thanks for helping :)
I would assume that after getting this output you could say something like "Something is wrong. I will get back to you after fixing major problems on my side"...
Obvious observations:
- "Roughness" looks more like normals. Mismatched encoding?
- "Z" looks like a solid color and most likely wrong
- "Units" shows black, but should consist of "1 unit" blocks
- Diff and spec "hitT" are 0 (it's fine for the initial integration)
Knowledge:
- README
- NRD sample
Yes, something is clearly wrong. I am using a similar pipeline with DLSS and it works: https://computergraphics.stackexchange.com/questions/14226/what-is-the-fastest-way-to-readback-a-directx12-texture-on-the-cpu
Will double check my current implementation.
Additional thought: to work, NRD requires a valid sequence of frames. NRD keeps a history, which it tries to preserve in a good state. This means NRD expects at least 10 FPS (bare minimum) if the camera moves, or at least a valid sequence of frames at any performance if the final image is absolutely static (but why not use the REFERENCE denoiser in that case?). Does this meet your expectations?
The camera is static for now, but I do plan to support animation. I also need interactive framerates.
AMD has its own denoiser, but it seems to only support reflections, i.e. it doesn't denoise primary rays: https://gpuopen.com/fidelityfx-denoiser/
So NRD seems to be the best solution for me, but I am struggling to get it running properly. The fact that my pipeline is CPU based doesn't help. I will give it another shot this weekend :)
For your first question: I solve the Monte Carlo integration by using a cumulative moving average. Every time a new frame is rendered it is merged with the previous one, then I feed the merged result to NRD/OptiX.
The discussion is in stagnation. Closing, since it's not an NRD issue. Feel free to reopen, if there is a real need.