Hi cam3d.
Right now I'm working on a subsea drone that takes photos of marine infrastructure. We are experimenting with the ZED camera to get a measurable model, scaling the photogrammetric model using the stereo baseline. We're also trying to fuse the ZED camera's position estimate with our other sensors (pressure, IMU, compass, DVL).
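The baseline-scaling idea can be sketched as follows. This is a minimal illustration, not Zephyr's API: the reconstruction comes out in an arbitrary unit, and the ratio of the known physical baseline to the reconstructed camera separation gives the factor that makes the model metric. The 0.12 m figure assumes the original ZED's 120 mm baseline.

```python
import numpy as np

def metric_scale(true_baseline_m, left_cam_pos, right_cam_pos):
    """Scale factor mapping reconstruction units to metres,
    from the known stereo baseline and the solver's camera positions."""
    reconstructed_baseline = np.linalg.norm(
        np.asarray(right_cam_pos) - np.asarray(left_cam_pos))
    return true_baseline_m / reconstructed_baseline

# Example: ZED baseline is 0.12 m; the solver placed the two cameras
# 0.48 reconstruction-units apart, so every model point shrinks by 4x.
s = metric_scale(0.12, [0.0, 0.0, 0.0], [0.48, 0.0, 0.0])
points_scaled = s * np.array([[4.0, 8.0, 12.0]])  # now in metres
```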
However, I would also like to use other multi-camera configurations, with three or more cameras at different angles. For this, I imagine the software has an easier time if we supply the correct transforms between the cameras?
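For a multi-camera rig, the camera-to-camera transforms fall out of composing rigid 4x4 transforms. A sketch under an assumed convention (each camera's extrinsic `T_rig_cam` maps camera-frame points into a common rig frame; function names are my own):

```python
import numpy as np

def rigid(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative(T_rig_a, T_rig_b):
    """Transform taking points from camera A's frame to camera B's frame:
    T_b_a = inv(T_rig_b) @ T_rig_a."""
    return np.linalg.inv(T_rig_b) @ T_rig_a

# Example: camera B sits 0.1 m to the right of camera A, same orientation,
# so a point at A's origin lands 0.1 m to the *left* in B's frame.
T_a = rigid(np.eye(3), [0.0, 0.0, 0.0])
T_b = rigid(np.eye(3), [0.1, 0.0, 0.0])
p_in_b = relative(T_a, T_b) @ np.array([0.0, 0.0, 0.0, 1.0])
```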
We also have a relative position estimate of our drone, which I would also like to use in post-processing to help with the varying visibility levels underwater. That adds another coordinate transform, from the drone's INS frame to the camera coordinate frames.
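Feeding the INS estimate to the cameras is the same composition again. A sketch with assumed names: `T_world_ins` is the drone pose from the INS, and `T_ins_cam` is the fixed mounting extrinsic (camera pose in the INS frame, from mechanical drawings or hand-eye calibration).

```python
import numpy as np

def compose(T_world_ins, T_ins_cam):
    """Camera pose in the world: chain the INS pose with the fixed
    INS-to-camera mounting extrinsic."""
    return T_world_ins @ T_ins_cam

# Example: drone at (10, 0, -5) with identity attitude, camera mounted
# 0.3 m ahead of the INS origin along the drone's x axis.
T_world_ins = np.eye(4)
T_world_ins[:3, 3] = [10.0, 0.0, -5.0]
T_ins_cam = np.eye(4)
T_ins_cam[:3, 3] = [0.3, 0.0, 0.0]
T_world_cam = compose(T_world_ins, T_ins_cam)
```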
Anyway, from your description and some trial and error, I found the camera's orientation to be the one called "coordinate_system_image" in the image I posted before: the X axis points to the right, Y points down, and Z points forward (through the image/camera).
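Converting between that image convention and a vehicle-style frame is one fixed rotation. A sketch, assuming the image frame is X-right, Y-down, Z-forward and the body/INS frame is the common X-forward, Y-left, Z-up (FLU); the columns of the rotation are the image axes expressed in the body frame.

```python
import numpy as np

# Columns: where each image axis points, written in body (FLU) coordinates.
R_body_img = np.column_stack([
    [0.0, -1.0, 0.0],   # image X (right)   -> body -Y (i.e. to the right)
    [0.0, 0.0, -1.0],   # image Y (down)    -> body -Z (i.e. downward)
    [1.0, 0.0, 0.0],    # image Z (forward) -> body +X
])

# A point 2 m straight ahead of the camera ends up 2 m along the
# body's forward axis: p_body = R_body_img @ p_img.
p_img = np.array([0.0, 0.0, 2.0])
p_body = R_body_img @ p_img
```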
Stereoscopic camera - coordinate frame confusion.
Hey everyone.
I'm having some issues setting up my stereo camera dataset in the full version of 3DF-Zephyr. It's possible to input the transform between the cameras as a translation + rotation; however, I can't find any documentation of how the coordinate system is defined. Is X forward (i.e. through the image), Y to the side, and Z up and down? As an example, see the attached image of a ZED camera, which shows several different coordinate frame conventions.
I know I can just use the distance + rotation config for now, but I would like to use more than two cameras in the future.
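For the two-camera case, the translation + rotation entry reduces to something very simple. A sketch, assuming the X-right, Y-down, Z-forward image convention: for a horizontally mounted stereo pair with parallel optical axes, the right camera's pose relative to the left is a pure translation along +X by the baseline, with an identity rotation (the 0.12 m value assumes the ZED's 120 mm baseline).

```python
import numpy as np

baseline_m = 0.12                           # assumed ZED baseline
R_left_right = np.eye(3)                    # no relative rotation
t_left_right = np.array([baseline_m, 0.0, 0.0])

# A point at the right camera's origin sits one baseline to the right
# in the left camera's frame.
p_left = R_left_right @ np.zeros(3) + t_left_right
```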