Dense point cloud quality

  • Student_ISVB
    3Dflower
    • May 2017
    • 5

    Dense point cloud quality

    Hello everyone, hello Roberto,

    I'm working at a company that reconstructs road accidents. We use Zephyr to reconstruct the roadways.
    Earlier I contacted Roberto with questions by email, but maybe my questions will be useful to somebody else too, so I decided to continue the conversation on the forum.

    To cover the entire width of the road and the necessary length (from 20 up to 150 meters), we use big datasets in Zephyr: from 50 to 600 photos per project. The photos are 20 megapixels, TIFF format. The camera is a Canon EOS 6D with a constant focal length of 24 mm, placed on a holder at a height of 3 meters and tilted down by 45 degrees.

    We have never had problems with photo orientation, even when the photos were taken in different directions along the road.

    The main issue, which we have in almost every project, is too few points in the dense point cloud. The roadway is sometimes not closed (it has holes), and after mesh generation these holes are still on the roadway. We tried the densification filter, but it does not work so well: we got worse point texture quality and then worse mesh quality (we lose the geometry of the curbstone line and the quality of the roadbed).
    Sorry for the comparison, but, for example, we calculated the same 50-photo project in Zephyr and in Agisoft PhotoScan. The settings were almost at maximum (more than 90% resolution, very high discretization level). In Agisoft we got 42 million points; in Zephyr, only 10. And the meshes were really different in quality. The project covered a road 35 meters long and 8 meters wide.
    The worst result was a project with a curved road 150 meters long (we took 535 photos in both directions). We got only 16 million points, and the quality of the point positions was so low that we could not see the markers we had placed on the road ourselves. Agisoft could use only 335 photos in this project but gave us 55 million points.

    I would like to ask: how can we increase the number of points in a project? Which settings should we use, and what might we be doing wrong?

    What we expect as a result: a huge dense point cloud or a mesh with the highest resolution. In the end we use the top view of the project to make a sketch. On this top view we expect a position error for objects (for example a road marking line or a gully) of no more than 2.5-3 cm in the real world.
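    As a rough sanity check of that 2.5-3 cm target, the ground sampling distance (GSD) of this setup can be estimated. The sketch below assumes Canon EOS 6D specs that are not stated in the thread (full-frame sensor about 35.8 mm wide, 5472 px across); the height, focal length, and tilt are from the post.

```python
import math

# Rough ground sampling distance (GSD) check for the rig described above.
SENSOR_WIDTH_M = 35.8e-3   # assumed full-frame sensor width (Canon EOS 6D)
IMAGE_WIDTH_PX = 5472      # assumed horizontal resolution (Canon EOS 6D)
FOCAL_M = 24e-3            # 24 mm lens (from the post)
HEIGHT_M = 3.0             # camera height (from the post)
TILT_DEG = 45.0            # downward tilt (from the post)

def gsd(distance_m):
    """Ground sampling distance (m/px) at a given object distance."""
    return distance_m * SENSOR_WIDTH_M / (FOCAL_M * IMAGE_WIDTH_PX)

# At nadir the object distance equals the camera height; with a 45-degree
# tilt the slant distance to the image centre is height / cos(45 deg).
slant = HEIGHT_M / math.cos(math.radians(TILT_DEG))
print(f"GSD at 3 m:      {gsd(HEIGHT_M) * 1000:.2f} mm/px")
print(f"GSD at 45° tilt: {gsd(slant) * 1000:.2f} mm/px")
```

    The GSD comes out at roughly a millimetre per pixel, so the 2.5-3 cm tolerance is plausible as far as image resolution is concerned; the achievable accuracy then depends mostly on calibration and georeferencing rather than point count.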

    Is it possible in our case with such big projects?

    If necessary, I can send some example projects.

    Thank you in advance.

    Regards,
    Vladyslav Buriakovskyi

  • Andrea Alessi
    3Dflow Staff
    • Oct 2013
    • 1304

    #2
    Hi Vladyslav,

    Thank you for the feedback! There is no need to apologize; we are very happy to compare results and improve if other software performs better in some cases. We're never afraid of comparison!

    That said, here are my thoughts:

    - If you're experiencing too many holes, try increasing the "DepthTrimFactor" in the advanced settings

    - You can exploit photoconsistency even if you're interested in point clouds. Create the mesh using the photoconsistency filter (photoconsistency takes a while, but usually improves the results a lot), then from the left panel, right-click the mesh to extract a point cloud from it.

    - You can increase the number of points by raising the discretization levels. However, if you're still experiencing issues, we will gladly have a look at your dataset!
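    The mesh-then-extract route in the second point can be illustrated with a minimal, area-weighted surface sampler. This is only a generic sketch of the technique (pure Python, hypothetical function name), not Zephyr's implementation:

```python
import math
import random
from bisect import bisect_left
from itertools import accumulate

def sample_points_on_mesh(vertices, faces, n_points, seed=0):
    """Draw n_points uniformly over the surface of a triangle mesh.

    vertices: list of (x, y, z) tuples; faces: list of (i, j, k) index
    triples. A face is chosen with probability proportional to its area,
    then a uniform barycentric sample is taken inside that triangle.
    """
    rng = random.Random(seed)

    def area(f):
        a, b, c = (vertices[i] for i in f)
        u = [b[i] - a[i] for i in range(3)]
        v = [c[i] - a[i] for i in range(3)]
        cx = u[1] * v[2] - u[2] * v[1]
        cy = u[2] * v[0] - u[0] * v[2]
        cz = u[0] * v[1] - u[1] * v[0]
        return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

    cum = list(accumulate(area(f) for f in faces))  # cumulative areas
    points = []
    for _ in range(n_points):
        f = faces[bisect_left(cum, rng.random() * cum[-1])]
        a, b, c = (vertices[i] for i in f)
        u, v = rng.random(), rng.random()
        if u + v > 1.0:            # fold the sample back into the triangle
            u, v = 1.0 - u, 1.0 - v
        points.append(tuple(a[i] + u * (b[i] - a[i]) + v * (c[i] - a[i])
                            for i in range(3)))
    return points
```

    The advantage of sampling from the mesh rather than keeping the raw dense cloud is that the photoconsistency step has already regularized the surface, so the extracted points inherit that smoothing.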

    Andrea


    • Student_ISVB
      3Dflower
      • May 2017
      • 5

      #3
      Hi Andrea,

      thanks for the feedback! Tomorrow I'll start calculating a new project and will try your suggestions.
      Actually, in the end we are more interested in the mesh, because in our cases the point cloud gets really big and takes too much disk storage, so the mesh is more useful for us. I just thought that the main reason for the low mesh quality was its strong dependence on the low point cloud quality.
      Anyway, I'll try it and then give feedback.

      Regards
      Vladyslav


      • Student_ISVB
        3Dflower
        • May 2017
        • 5

        #4
        Hi Andrea,
        So, here are the results of the new project: 98 photos, TIFF. For the first point cloud "DepthTrimFactor" was 4; for the second, 2. The resolution was at 100%, the discretization level -
        Here are some screenshots:
        [img src="http://s014.radikal.ru/i329/1706/7c/28f012b3c357t.jpg" alt="" /]
        [img src="http://s015.radikal.ru/i331/1706/15/9f080bc70143.jpg" alt="" /]
        And the second one:
        [img src="http://s019.radikal.ru/i620/1706/8c/160a36a436e7t.jpg" alt="" /]
        [img src="http://s018.radikal.ru/i502/1706/5d/4b431340b66a.jpg" alt="" /]
        And the mesh, calculated from the first point cloud using photoconsistency:
        [img src="http://s018.radikal.ru/i508/1706/6e/ad610e05ba18t.jpg" alt="" /]
        As you can see, there are still many gaps and the reconstructed area is not so big.
        For comparison, here is a dense point cloud from Agisoft. Same dataset, resolution was "high":
        [img src="http://s019.radikal.ru/i640/1706/38/925bdc2cf519t.jpg" alt="DPC from AgiSoft" /]

        Are there other ways to improve the results?

        Regards
        Vladyslav


        • Student_ISVB
          3Dflower
          • May 2017
          • 5

          #5
          Sorry for the small screenshots. These should be better:
          [img src="http://s019.radikal.ru/i614/1706/4b/d507768be202t.jpg" alt="DPC1" /]
          [img src="http://s014.radikal.ru/i327/1706/ea/27900482dc0ft.jpg" alt="DPC2" /]
          [img src="http://s018.radikal.ru/i515/1706/0f/645a5373652at.jpg" alt="Mesh" /]
          [img src="http://s018.radikal.ru/i509/1706/b6/8a03ecf967a7t.jpg" alt="DPC from AgiSoft" /]


          • Roberto
            3Dflow
            • Jun 2011
            • 559

            #6
            Hi Vladyslav,

            thanks for sharing the results.

            We are quite strict when generating the point cloud, and looking at those results we might have some problems in some particular configurations (low overlap, cameras with only rotational movements?). If you can send us the dataset we will be glad to have a look, and I'll get back to you after some detailed testing. You can send just the images through WeTransfer, or we can set up an account on our sharing server if you prefer.


            • Student_ISVB
              3Dflower
              • May 2017
              • 5

              #7
              Hi Roberto,
              Gladly. But the whole dataset is 10.9 GB. What is the best way to share such big data?
              Maybe I should export the photos in another format?


              • Andrea Alessi
                3Dflow Staff
                • Oct 2013
                • 1304

                #8
                Hi,

                I can gladly open up an account on our sharing server for you, so that you can upload your dataset. I'll contact you at the email address you used to sign up for the forums.


                • Roberto
                  3Dflow
                  • Jun 2011
                  • 559

                  #9
                  Also, if you want to run a test with the new beta version, we'll send you a beta key. We have actually corrected some issues and rebalanced some presets for dense point cloud generation, and I'm curious to see whether the improvements affect your results as well.


                  • Roberto
                    3Dflow
                    • Jun 2011
                    • 559

                    #10
                    Hi, thanks for sharing the data. I've run a quick test (Aerial Default -> Aerial High Details for dense and mesh) with the current beta (we'll release v. 3.3 next week), and the results seem fine to me.

                    I think the problems were caused by some issues that we corrected internally and some preset rebalancing. Please let me know if the beta works well on the other datasets too.

                    Regarding mesh/point cloud density, please note that it's not related to the final accuracy. The density can easily be tuned (change the reprojection area during photoconsistency, or the discretization level), but I wouldn't advise doing so in general, as you won't get accuracy improvements in most cases.
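                    That density and accuracy are separate things can be checked against the numbers from the first post (a road 35 m by 8 m, 42 vs. 10 million points): even the sparser cloud has millimetre-level average point spacing, well under the 2.5-3 cm target. A back-of-envelope sketch, assuming the points are spread evenly over the road surface:

```python
import math

AREA_M2 = 35 * 8          # road patch from the first post: 35 m x 8 m

def mean_spacing_mm(n_points, area_m2=AREA_M2):
    """Average point-to-point spacing (mm), assuming an even spread."""
    return math.sqrt(area_m2 / n_points) * 1000

print(f"Agisoft, 42 M points: {mean_spacing_mm(42e6):.1f} mm")
print(f"Zephyr,  10 M points: {mean_spacing_mm(10e6):.1f} mm")
# Both spacings are millimetre-scale, far below the 25-30 mm position
# tolerance, so a denser cloud does not by itself imply better accuracy.
```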

