Mesh generation
Hi, thanks for your work. I have a question about running the Mapping API.
I read that the mesh is generated during dense reconstruction. Are there any conditions for getting the mesh data in real time? Does it only work in replay mode, or do I need to set other options for it to work?
Hey,
The mesh generation works in real time (it runs asynchronously in the background). If the meshing parameters are enabled, you can get the mesh from our Mapping API on all devices (OAK-D, RealSense, Azure Kinect, Orbbec Femto Mega / Femto Bolt / Astra 2). The mesh quality depends on the sensor (e.g. Azure Kinect and Orbbec Femto devices produce much nicer-looking meshes than OAK-D).
If you have an OAK-D device, you can try our mapping_ar.py example, which visualizes the mesh in real time. For other devices, we don't currently have examples that visualize the mesh in real time, but you can try mapping_ar.py or the sai-cli process command-line tool in replay mode.
If you wish to enable mesh reconstruction in your own project, copy the following configuration from mapping_ar.py:
configInternal["useSlam"] = "true"
configInternal["computeStereoPointCloud"] = "true"
configInternal["pointCloudNormalsEnabled"] = "true"
configInternal["computeDenseStereoDepthKeyFramesOnly"] = "true"
configInternal["recEnabled"] = "true"
configInternal["recCellSize"] = "0.02"
# configInternal['recMeshSavePath'] = "path/to/mesh.obj" # Uncomment to serialize mesh to disk
# configInternal['recTexturize'] = "true" # Uncomment to generate mesh textures