Import .csv data on xyz and camera tilt angle relative to turntable?


    I have a computer-controlled photogrammetry booth with a turntable and a stereo camera system that can move up and down along the z axis and change its pitch angle relative to the turntable. The scan path is controlled from a .csv file. Is it possible to use this file's data in 3DF Zephyr to speed up cloud generation, make it more accurate, and get true scale in my point clouds? Something similar to the fixed rig setup, but where I can just feed it the x, y, z and camera angle for each image?

    I'd really appreciate any suggestions!

    Thanks
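For reference, a scan path like the one described could be read with a few lines of Python. This is a minimal sketch with hypothetical column names (`turntable_deg`, `z_mm`, `pitch_deg`); the actual booth software's CSV layout will differ:

```python
import csv
import io

# Hypothetical scan-path file: one row per image, with the turntable
# angle, camera height (z) and camera pitch. The column names and values
# are assumptions for illustration, not the real booth format.
sample = """image,turntable_deg,z_mm,pitch_deg
img_0001.jpg,0,250,15
img_0002.jpg,15,250,15
img_0003.jpg,30,250,15
"""

def load_scan_path(text):
    """Parse the scan path into a list of per-image dicts with numeric fields."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        rows.append({
            "image": row["image"],
            "turntable_deg": float(row["turntable_deg"]),
            "z_mm": float(row["z_mm"]),
            "pitch_deg": float(row["pitch_deg"]),
        })
    return rows

path = load_scan_path(sample)
```

With a real file you would pass `open("scan_path.csv")` to `csv.DictReader` instead of the inline string.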

  • #2
    Hi ossian.se,

    you'd have to compute the camera positions and rotations so you can feed the XMP/fixed rig setup as if the object were still and your virtual camera were a real camera; that is, you have to simulate the object in a fixed position with multiple cameras around it. If you feed the same x, y, z and camera rotations (either as Euler angles or as a projection matrix) as-is, Zephyr will think all the images were taken from the same position. So it's up to you to define the reference system and feed that to Zephyr as if it were a real fixed rig scenario.
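To illustrate the idea: if the turntable rotates the object by +θ while the camera stays put, the image is geometrically identical to keeping the object still and rotating the camera by −θ about the turntable axis. A minimal NumPy sketch, assuming the world-to-camera convention x = R(X − C) and made-up rig values (this is not Zephyr's XMP format, just the geometry):

```python
import numpy as np

def rz(theta):
    """Rotation matrix about the turntable (z) axis, angle in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def virtual_camera(center, R_world_to_cam, turntable_deg):
    """Equivalent pose of a 'virtual' camera in a world where the object is still.

    With x = R (X - C): a point X on the object appears at rz(th) @ X once
    the table has turned, so
        x = R (rz(th) X - C) = (R rz(th)) (X - rz(-th) C),
    i.e. the virtual camera has rotation R @ rz(th) and center rz(-th) @ C.
    """
    th = np.radians(turntable_deg)
    return rz(-th) @ center, R_world_to_cam @ rz(th)

# Hypothetical rig: real camera 400 mm in front of the axis, 250 mm up,
# with identity world-to-camera rotation (illustration values only).
C = np.array([400.0, 0.0, 250.0])
R = np.eye(3)
C_virtual, R_virtual = virtual_camera(C, R, 90.0)
# After a quarter turn the virtual camera sits at roughly (0, -400, 250).
```

Applying this per row of the scan file gives one distinct virtual pose per image, which is what a fixed-rig-style import needs instead of the repeated real-camera pose.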

    Truth be told, it's probably much easier to do a test acquisition, calibrate it with real distances, and export that orientation to be reused in the next runs (since you can very easily control your acquisition rig and have it output the very same camera image files).

    best,
    Andrea



    • #3
      Thank you Andrea, I think your suggestion to just do calibration with real distances is much simpler!



      • #4
        Originally posted by Andrea Alessi
        How can I do a test acquisition and calibration, and can this be done in the Light version?

        Thank you,
        Ossian



        • #5
          Hi Ossian,

          unfortunately no, you need the full version to use control points and to load cameras with known parameters/xmp.

          You simply need to create a test acquisition, making sure to include a known reference distance in the scene.

          Run your reconstruction, place control points on that reference, and use them to scale the reconstruction.

          Save the camera externals and reuse them in every new reconstruction - they will be loaded as if the cameras were indeed fixed.
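As a sketch of the scaling step: with two control points placed on the ends of a reference bar of known length, the scale factor is just the ratio of real to reconstructed distance. The coordinates below are made-up illustration values, not output from any particular reconstruction:

```python
import numpy as np

# Two control points on the ends of a reference bar in the test
# reconstruction (arbitrary model units), plus the bar's real length.
# All numbers are invented for illustration.
p1 = np.array([0.10, 0.02, 0.00])
p2 = np.array([0.30, 0.02, 0.00])
real_length_mm = 100.0

model_length = np.linalg.norm(p2 - p1)  # 0.2 model units
scale = real_length_mm / model_length   # ~500 mm per model unit

def to_millimetres(p):
    """Map a reconstructed point or vector into real-world millimetres."""
    return np.asarray(p) * scale
```

In Zephyr itself this is what the control-point scaling does for you; the snippet just shows why a single known distance is enough to fix the scale of the whole cloud.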
