Align created mesh with images' position, direction and rotation
Hello there, I have a dataset of images I took and, alongside it, I have saved the camera position, direction and rotation (in the form of a quaternion) for each picture. I was wondering if there is a way to use MVE with this information so that the created mesh is aligned with my camera's coordinates. images_positions.zip Thank you for your attention!
Yes. Create a scene using makescene -i to import images only. In each view directory, create the meta.ini file with your intrinsic and extrinsic parameters (see here [1]). If you want SfM points, you can use the featurerecon tool to generate SfM features from the known camera parameters. Then use the rest of the pipeline as is: dmrecon, fssr, etc.
It might be a bit fiddly, but it's supported.
[1] https://github.com/simonfuhrmann/mve/wiki/MVE-File-Format
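To make the meta.ini step concrete, here is a small Python sketch that converts a camera position plus orientation quaternion into the extrinsics MVE expects and writes one meta.ini per view. Assumptions to verify against the wiki page [1] and your dataset: MVE stores the world-to-camera rotation R as nine row-major floats and a translation t = -R·c (c being the camera center), the focal length is normalized by the larger image dimension, and the quaternion is in (w, x, y, z) order representing a camera-to-world... or world-to-camera rotation, depending on how you recorded it. If the reconstruction comes out misplaced or mirrored, try transposing R.

```python
import numpy as np

def quat_to_rotmat(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def write_meta_ini(path, view_id, name, focal_norm, position, quat):
    """Write an MVE-style meta.ini for one view.

    position : camera center c in world coordinates (3 floats)
    quat     : orientation quaternion, (w, x, y, z) order assumed
    focal_norm : focal length normalized by the larger image dimension
                 (assumption based on the MVE file-format wiki)
    """
    R = quat_to_rotmat(np.asarray(quat, dtype=float))
    # MVE extrinsics: x_cam = R @ x_world + t, hence t = -R @ c.
    # If your quaternion encodes the camera-to-world rotation instead,
    # replace R with R.T here.
    t = -R @ np.asarray(position, dtype=float)
    with open(path, "w") as f:
        f.write("[camera]\n")
        f.write(f"focal_length = {focal_norm}\n")
        f.write("pixel_aspect = 1\n")
        f.write("principal_point = 0.5 0.5\n")
        f.write("rotation = " + " ".join(f"{v:.9f}" for v in R.ravel()) + "\n")
        f.write("translation = " + " ".join(f"{v:.9f}" for v in t) + "\n")
        f.write("[view]\n")
        f.write(f"id = {view_id}\n")
        f.write(f"name = {name}\n")
```

You would call write_meta_ini once per view directory created by makescene -i, then run the rest of the pipeline unchanged. Key names and value conventions above are taken from my reading of the file-format wiki, so double-check them against an existing MVE scene before trusting the output.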