georeferenced project - export - import (back) mesh-related question

  • artsomnia.art
    3Dfollower
    • May 2019
    • 22

    georeferenced project - export - import (back) mesh-related question

    Good day Zephyr support team!

    My question is:
    - I cannot visualize the "huge", GCP-verified mesh with double-precision (16-digit) coordinates in either Houdini or ZBrush, where I would like to do the cleaning and decimation work.
    - I can, however, export the mesh in the local rendering reference system, and that version displays properly in the Houdini or ZBrush viewport, but it completely loses its orientation; also, if I try to import the cleaned, decimated mesh back into Zephyr, it cannot be made structured.

    I tried to find a corresponding tutorial but could not find one; probably my bad.

    If any of you could help me with how to export a georeferenced mesh in the local rendering reference system while keeping its position, size and orientation, and then import it back into the original (georeferenced) project and make it structured again, I would really appreciate it.

    Thanks in advance!
  • Andrea Alessi
    3Dflow Staff
    • Oct 2013
    • 1304

    #2
    Hi!

    Resetting to local rendering coordinates will, by definition, change the mesh's coordinate positions and thus lose the georeferencing. You will need to export the mesh as-is and work in software that supports double precision.

    If that's not an option, then you will have to export in local rendering coordinates, do your editing, re-import the mesh, and then align it once again via Control Points + ICP. Ideally, you want to work in a 3D editor that supports double precision.
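
    To illustrate why single precision breaks down here, a rough numpy sketch (just an illustration, not anything Zephyr does internally): georeferenced coordinates sit so far from the origin that storing them in 32-bit floats, which is what most DCC viewports and many mesh formats use, throws away the fine detail.

    ```python
    import numpy as np

    # A typical georeferenced (UTM-like) vertex: hundreds of thousands of meters
    # from the origin, with sub-millimeter detail from photogrammetry.
    vertex = np.array([658234.1234, 5409876.5678, 312.4567], dtype=np.float64)

    rounded = vertex.astype(np.float32).astype(np.float64)  # what a 32-bit pipeline keeps
    print(rounded - vertex)   # errors on the order of centimeters to decimeters

    # Subtracting a local origin first keeps the detail within 32-bit precision:
    origin = np.array([658000.0, 5409000.0, 300.0])
    local = (vertex - origin).astype(np.float32).astype(np.float64) + origin
    print(local - vertex)     # errors drop to well under a millimeter
    ```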


    • artsomnia.art
      3Dfollower
      • May 2019
      • 22

      #3
      Originally posted by Andrea Alessi
      Hi!

      Resetting to local rendering coordinates will, by definition, change the mesh's coordinate positions and thus lose the georeferencing. You will need to export the mesh as-is and work in software that supports double precision.

      If that's not an option, then you will have to export in local rendering coordinates, do your editing, re-import the mesh, and then align it once again via Control Points + ICP. Ideally, you want to work in a 3D editor that supports double precision.
      Thank you so much for your answer, Andrea! It's all clear! Just one last question: if I'm scanning fragments and later want to merge them in a 3D DCC tool (Houdini, Maya, Max, Blender...) and snap them into their right places (based on the GCP alignment), do I have an option for that? And how could the same scenario work in, let's say, Unreal Engine? (Say I want to merge 10, 20 or hundreds of fragments and have them find their positions.) Is there a way to, for example, delete the GCP points but keep the alignment via the pivot / XYZ planes?

      I know it is a bit of a weird question, since georeferencing has nothing to do with origin-based XYZ orientation. But I read on your forum that someone was able to place georeferenced meshes into Unity 1:1 in size and position; he just did not describe how.


      • artsomnia.art
        3Dfollower
        • May 2019
        • 22

        #4
        Originally posted by Andrea Alessi
        re-import the mesh, and then align it once again via Control Points + ICP. Ideally, you want to work in a 3D editor that supports double precision.
        And one more last question: I cannot make the re-imported mesh structured once it has been exported in the local rendering reference system. Should I generate a point cloud from it? (Probably not, since in that case I would lose the cleaning work done on the mesh.) How do I align an unstructured mesh via control points? Sorry if my question covers some basic knowledge, but I could not find an answer.
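
        (For what it's worth, here is how I currently picture the "Control Points + ICP" step outside of Zephyr, as a minimal sketch assuming the Open3D library; this is not Zephyr's built-in alignment, so please correct me if I got the idea wrong.)

        ```python
        import numpy as np
        import open3d as o3d

        # Cleaned mesh exported in local rendering coordinates, sampled to points,
        # plus the original georeferenced point cloud exported from the project.
        source = o3d.io.read_triangle_mesh("cleaned_local.ply").sample_points_uniformly(100000)
        target = o3d.io.read_point_cloud("original_georeferenced.ply")

        # Placeholder: in practice this should come from a few matched control points
        # (the "Control Points" part), otherwise ICP has nothing sensible to start from.
        init = np.eye(4)

        # ICP refines the fit; the resulting 4x4 matrix moves the cleaned mesh
        # back into the georeferenced frame.
        result = o3d.pipelines.registration.registration_icp(
            source, target, max_correspondence_distance=0.5, init=init,
            estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
        print(result.transformation)
        ```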


        • Andrea Alessi
          3Dflow Staff
          • Oct 2013
          • 1304

          #5
          No need to apologize, all questions are welcome

          In my opinion the easiest way for you would be to process everything regardless of georeferencing, and do your cleanup etc. so that you can export and re-import everything without worrying about coordinates changing their position inside or outside of Zephyr.

          Then, when you have everything in place in Zephyr, you can georeference the dataset and export it cleaned, but georeferenced.

          I don't think Unity3D supports double precision, but don't quote me on that.

          May I ask exactly what you are trying to achieve?
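
          If it helps to picture that last step, "georeference at the end" boils down to estimating one similarity transform (scale, rotation, translation) from the matched control points and applying it to the cleaned, local-coordinate mesh. Here is a rough numpy sketch of an Umeyama-style estimate (just an illustration; this is not Zephyr's own code, and inside Zephyr you would simply use the control point / GCP constraint tools):

          ```python
          import numpy as np

          def similarity_from_control_points(local_pts, world_pts):
              """Least-squares scale/rotation/translation mapping local -> world
              from >= 3 non-collinear matched control points (Umeyama-style)."""
              mu_l, mu_w = local_pts.mean(axis=0), world_pts.mean(axis=0)
              L, W = local_pts - mu_l, world_pts - mu_w
              U, S, Vt = np.linalg.svd(W.T @ L / len(local_pts))   # cross-covariance SVD
              D = np.eye(3)
              if np.linalg.det(U) * np.linalg.det(Vt) < 0:         # avoid a reflection
                  D[2, 2] = -1.0
              R = U @ D @ Vt
              scale = (S * np.diag(D)).sum() / (L ** 2).sum(axis=1).mean()
              t = mu_w - scale * R @ mu_l
              return scale, R, t

          # Example: GCPs measured in local mesh coordinates vs. their surveyed values.
          local_gcps = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                                 [0.0, 3.0, 0.0], [0.0, 0.0, 1.5]])
          world_gcps = local_gcps * 1.01 + np.array([658000.0, 5409000.0, 300.0])

          s, R, t = similarity_from_control_points(local_gcps, world_gcps)
          # Apply the same transform to every vertex of the cleaned mesh (kept in float64):
          local_vertices = np.array([[1.0, 1.0, 0.5]])
          world_vertices = s * local_vertices @ R.T + t
          ```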


          • artsomnia.art
            3Dfollower
            • May 2019
            • 22

            #6
            Originally posted by Andrea Alessi
            No need to apologize, all questions are welcome

            In my opinion the easiest way for you would be to process everything regardless of georeferencing, and do your cleanup etc. so that you can export and re-import everything without worrying about coordinates changing their position inside or outside of Zephyr.

            Then, when you have everything in place in Zephyr, you can georeference the dataset and export it cleaned, but georeferenced.

            I don't think Unity3D supports double precision, but don't quote me on that.

            May I ask exactly what you are trying to achieve?
            Thank you so much for your answer Andrea!

            I will try to handle a project as you described above.

            Regarding your question, "May I ask exactly what you are trying to achieve?":
            With a friend of mine, who is an archaeologist, we are aiming to build an archaeological excavation documentation pipeline that not only meets the requirements of our current protocol but can also serve future research. (Right now, a printed drawing based on a 2D ortho projection is basically enough to satisfy the protocol; clearly that cannot serve the purpose of visibility-based research, and in most cases the findings will be buried back, or will end up in a museum in a closed paper bag that not many will ever touch again.)
            Right now we are doing a test project: we were allowed to reconstruct a 350-square-meter excavation field. We are working in parallel with the surveyors who are running the old pipeline. If we can prove that, using photogrammetry, we can deliver precise ortho projections plus a highly detailed, optimized mesh suitable for visual research, and do it even faster than the classic method, the standard industrial archaeological excavation protocol might be rewritten. We have already given some presentations and had a meeting with a museum, and so far it looks like it will go through, but we are still a bit away from delivering everything, so now we are fighting against time. And let me play with open cards here: the museum we met with, which was especially interested in our workflow, is responsible for a huge project happening in spring (we are talking about hundreds of thousands of square meters of excavation), so if we can pull this off, we could be part of that project.

            - we would like to deliver as many ortho projections as the actual subject requires (thanks to the nature of photogrammetry-based ortho projection, the number of projections does not limit us the way the manual drawing method does)
            - we would like to deliver highly detailed 3D objects that can be used in research workflows and run in real time even on a tablet, laptop or phone (this is where the mesh optimization and cleaning process come in, plus, after the final export, baking a PBR texture set)
            - we would like to integrate all projects into Unreal Engine, so that with the help of LODs all the fragments of an excavation field can be merged and studied at the high level of detail we are aiming for at LOD 0, and thanks to dynamic subdivision we would be able to merge fields of unlimited size.

            So, in a few sentences, this is what we are working on right now.

            Right now it is a test project, no contract, no payment involved.


            • Andrea Alessi
              3Dflow Staff
              • Oct 2013
              • 1304

              #7
              Thank you for your detailed follow-up!

              If georeferencing is required for the research then, unfortunately, there isn't much you can do - you need to find an engine that natively supports georeferenced meshes.

              If instead that's not an issue, then you can simply work locally and create your own reference so that the different parts of the excavation are "locally" in the right place.
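
              One simple way to picture that "local reference": pick a single origin for the whole site and subtract it from every exported fragment, so all fragments keep their 1:1 relative positions while the coordinates stay small enough for a single-precision engine like Unreal or Unity. A rough Python sketch (assuming the trimesh library; the file names and the origin are made up):

              ```python
              import numpy as np
              import trimesh  # assumption: any mesh library that keeps float64 vertices would do

              # One shared origin for the whole excavation, e.g. a surveyed corner of the site.
              SITE_ORIGIN = np.array([658000.0, 5409000.0, 300.0])   # made-up UTM-like values

              for name in ["fragment_01.obj", "fragment_02.obj"]:    # hypothetical file names
                  mesh = trimesh.load(name, process=False)
                  # Shift every fragment by the same offset: relative placement is preserved.
                  mesh.vertices = np.asarray(mesh.vertices, dtype=np.float64) - SITE_ORIGIN
                  mesh.export(name.replace(".obj", "_site_local.obj"))

              # Keep SITE_ORIGIN noted somewhere so the true georeference can be recovered later.
              ```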

