Hole-filling filter for depth images
Is your feature request related to a problem? Please describe.
Hi, I am new to Kinect and ROS. I am trying to write an object tracking program that needs accurate distance info. When I retrieve depth frames from the Kinect on ROS1, I find that some depth pixels have a distance of 0 (shown as black in the image). I guess the distance is out of range, or the material is one the IR cannot reach. Are there any filters provided by Azure_Kinect_ROS_Driver that can help me fix the hole problem, or are we supposed to implement our own hole-filling filter?
The image below shows the depth_to_rgb/image_raw topic with the hole/noise problem.

Is there also a way to configure the maximum depth when launching driver.launch? The documentation says the operating range is at most 5.5 m, but somehow I found it can detect up to 7 m. For now I am working around this in my own node by zeroing out readings beyond a cut-off, as in the sketch below.
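This is only a minimal sketch of what I mean, not a driver option: it assumes the depth topic is published as 16UC1 images in millimetres, and the topic names and the 5.5 m cut-off are just my own choices.

```python
#!/usr/bin/env python
# Minimal workaround sketch: republish the depth image with readings beyond a
# chosen maximum range zeroed out. Topic names and the 16UC1 millimetre
# encoding are assumptions on my side, not options of this driver.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

MAX_DEPTH_MM = 5500  # my own cut-off, roughly the documented 5.5 m limit

bridge = CvBridge()
pub = None


def callback(msg):
    # Copy because the converted image shares its buffer with the message.
    depth = bridge.imgmsg_to_cv2(msg, desired_encoding="16UC1").copy()
    depth[depth > MAX_DEPTH_MM] = 0  # treat far readings as invalid, like the holes
    out = bridge.cv2_to_imgmsg(depth, encoding="16UC1")
    out.header = msg.header
    pub.publish(out)


if __name__ == "__main__":
    rospy.init_node("depth_range_clamp")
    pub = rospy.Publisher("depth_to_rgb/image_clamped", Image, queue_size=1)
    rospy.Subscriber("depth_to_rgb/image_raw", Image, callback, queue_size=1)
    rospy.spin()
```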
Describe the solution you'd like
I did some research and found that a hole-filling filter can be implemented using a region-growing technique together with a Gaussian or bilateral filter. I am not sure how it can be implemented efficiently, but I noticed that the ROS Wrapper for Intel® RealSense has a launch parameter that lets the user apply post-processing filters to the depth image. A rough sketch of the kind of standalone filter node I have in mind is shown below.
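To make the idea concrete, here is a minimal sketch (my own standalone node, not based on this driver's code): it fills zero-depth pixels with the nearest valid depth as a crude stand-in for region growing, then applies a bilateral smoothing pass. The topic names and the 16UC1 millimetre encoding are assumptions on my side.

```python
#!/usr/bin/env python
# Rough sketch of a standalone hole-filling node (not part of this driver).
# Zero-depth pixels are replaced with the nearest valid depth value (a crude
# stand-in for region growing), then smoothed with an edge-preserving
# bilateral filter. Topic names and the 16UC1 encoding are assumptions.
import rospy
import cv2
import numpy as np
from scipy import ndimage
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()
pub = None


def fill_holes(depth_mm):
    hole_mask = depth_mm == 0
    if not hole_mask.any():
        return depth_mm
    # For every pixel, indices of the nearest valid (non-zero) pixel.
    idx = ndimage.distance_transform_edt(hole_mask, return_distances=False,
                                         return_indices=True)
    filled = depth_mm[idx[0], idx[1]].astype(np.float32)
    # Edge-preserving smoothing, roughly the Gaussian/bilateral step above.
    smoothed = cv2.bilateralFilter(filled, d=5, sigmaColor=30.0, sigmaSpace=5.0)
    # Keep the original measurements where the sensor reported a valid depth.
    smoothed[~hole_mask] = depth_mm[~hole_mask]
    return smoothed.astype(np.uint16)


def callback(msg):
    depth = bridge.imgmsg_to_cv2(msg, desired_encoding="16UC1")
    out = bridge.cv2_to_imgmsg(fill_holes(depth), encoding="16UC1")
    out.header = msg.header
    pub.publish(out)


if __name__ == "__main__":
    rospy.init_node("depth_hole_filling")
    pub = rospy.Publisher("depth_to_rgb/image_filled", Image, queue_size=1)
    rospy.Subscriber("depth_to_rgb/image_raw", Image, callback, queue_size=1)
    rospy.spin()
```

A launch parameter on the driver side (similar to the RealSense wrapper's filters option) that enables something like this, ideally with a more efficient implementation, is what I would like to see.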
Thank you for the feature request. This seems like a good feature to have, especially if exposed as a dynamic configuration option with hardware acceleration.
I've added a Help Wanted tag to this in case anyone wants to take a crack at it.
Otherwise, we will investigate adding it once we have completed some hardware acceleration workstreams.