
  • Help needed with limited photos!

    Hello everybody!

    I've really enjoyed Zephyr and found it to be one of the best "free" photogrammetry options on the market, and after a steep learning curve I've had reasonable success. However, the occasional scan can be very tricky and either not work at all or come out with many missing textures and poorly aligned geometry. If I'm very unlucky, I just get a blob. This particular model is one of those cases: very similar models work fine, but this one always results in a mess. So I'm looking for any advice on how to get the best results.

    This rar file has all 37 pictures in full resolution. It comes from a set of 48, but the final 12 are identical, just rotated.
    https://ufile.io/p56xsis4

    Please note
    - These are not my photos, and I cannot get any more.
    - The photos are native 800x800 and quite blurry. Through many tests, I found that AI upscaling improved the results.
    - My best results have been from "Urban" presets.
    - The only REAL restriction I have is limiting the final textured mesh to 7500 verts.

    Here is the result of a similar model that came out flawlessly:

    [Attached image: rune.jpg]

    I'd appreciate any suggestions! I'm aware this is NOT an ideal sample range for the software, and I'm pushing its limits. But I'm sure there's a guru among you who could work their magic here!

    Thank you!

  • #2
    Hi Quoeiza,

    I would avoid upscaling, as AI software will generally smooth the image, removing keypoints in the process. I strongly advise against any type of preprocessing.
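    The smoothing effect is easy to demonstrate: heavy smoothing strips out the high-frequency detail that keypoint detectors rely on. Here is a rough, self-contained sketch using a simple gradient-magnitude proxy for keypoints (an illustration only, not the detector Zephyr actually uses):

```python
import numpy as np

def smooth(img, k=7):
    """Crude box blur: average of all k*k shifted copies of the image."""
    acc = np.zeros(img.shape, dtype=float)
    for dy in range(-(k // 2), k // 2 + 1):
        for dx in range(-(k // 2), k // 2 + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (k * k)

def count_feature_pixels(img, thresh=30.0):
    """Proxy for keypoint count: pixels whose local gradient magnitude is large."""
    gy, gx = np.gradient(img.astype(float))
    return int((np.hypot(gx, gy) > thresh).sum())

# Random noise stands in for fine surface detail in an 800x800 photo.
rng = np.random.default_rng(0)
img = rng.random((800, 800)) * 255

n_orig = count_feature_pixels(img)
n_smooth = count_feature_pixels(smooth(img))
print(n_orig, n_smooth)  # the smoothed image has far fewer high-contrast pixels
```

    On this synthetic image, smoothing collapses most of the high-contrast pixels the proxy counts, which mirrors what aggressive AI denoising can do to real keypoints.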

    If anything, use the processed images for texturing only, if they look better (Workspace > Change images, after you have generated a mesh you are happy with).

    In this case, I was unable to orient the full dataset in one shot. Note that when you have clearly wrong camera orientations, it's better to completely change the preset: try one that orients fewer images (but correctly), then try orienting the remaining cameras with a different preset. That's what I did here, with General/Deep and then Urban/Deep. Unfortunately only 32 images were oriented, and I doubt it's possible to get better results without resorting to control points.

    I understand that it's not possible to get more images for this specific dataset, but the correct approach would be to take more (and better suited to photogrammetry) images.

    I used the beta, but I am sure you can get very similar results using 4.530.

    When you get "blobs", it's generally because you have wrongly oriented cameras. It's better to have fewer, correct cameras than a lot of "accepted" but wrong cameras in your workspace.

    Here is the result using 32 pics, decimated to 7496 vertices.

    [Attached image: miniature1.png]

    [Attached image: miniature2.png]


    You could definitely get better results with masking, which I did not do since the photos you shared are heavily post-processed, making automatic masking harder (nothing too time consuming, but not worth doing at this point).

    Zep file (requires Zephyr 5 beta): https://shared.3dflow.net/index.php/s/IPVjVn4kxMncP9L



    • #3
      Thank you very much for your time, Andrea! Your advice is much appreciated, and what you say makes a lot of sense. Since making this post, I've gone back through several of your tutorials and tried different experiments with some improvement, but I think the addition of your tips should make the difference I need.

      Worth noting: I was aware that processing the images in any way was bad practice. I found that out from re-reading some of your earlier posts while learning the software. However, the original photos were so bad that Zephyr refused to scan them at all, and Zephyr was the only software I could find that would scan them even after upscaling. So big kudos to Zephyr for opening new doors for us, making the impossible possible! I've included a comparison shot of the unmodified image versus the modified one. Frankly, the fact that Zephyr works at all is miraculous.

      [Attached image: comparison.jpg]

      Right, I better get working putting this new info into practice.

      Thanks again.



      • #4
        --SOLVED - BOTTOM OF POST #6

        One more thing I would like to ask about. Occasionally, in the process of converting to a textured mesh at 7,500 verts, the results come out very badly, like in the attached picture. If I render at max verts, it handles it fine. I could push out a max-vert model and fix it in Blender, but the UV map suffers as a result. I'd much prefer to decimate in Zephyr so the results are more accurate, without too much texture misalignment.

        I think I'm missing a step in the decimation process, or doing one incorrectly. Or is it just a glitch in the algorithm? Worth mentioning: the dense point cloud and standard mesh come out fine. It's only when I attempt the final textured mesh that this happens.

        [Attached image: mess.jpg]
        Last edited by Quoeiza; 2020-06-04, 01:58 AM.



        • #5
          Generally speaking, it's better to set the vertex count during generation (when you start the texturing wizard, switch to Advanced and set your target vertex count there) rather than using the decimation filter afterwards.

          Regardless, also note that you can do mesh editing outside Zephyr (even with custom UVs) and reimport the mesh (Lite/Pro/Aerial) for texturing.



          • #6
            I had set the vertex count in the advanced settings as suggested, but with some models the result still comes out like the one above. Not always, but sometimes. It's as if the wrong points on the mesh are being connected, resulting in giant faces with blown-out textures.

            Regarding importing the texture back in with other software: I've tried this in Blender, but I'm unsure how to tell it what the UV unwrap should be. Doing it by hand is possible but extremely time consuming. This is mostly down to my inexperience, as I've only been learning all this from scratch this week! If there's a simple solution I'd love to hear it, but as it's outside of your own software and time, I have no expectations. You've been very helpful as is.


            Edit: I just found this tutorial:
            https://www.3dflow.net/technology/do...xture-blender/
            At a glance, it looks like what I need. I'll give it a go.
            Edit2: Oh I see, the free version is unsupported. I'll have to see what my financial situation is like then! Haha


            --SOLUTION!
            Final Edit:
            So I figured out my problem! I had not deleted loose vertices before generating the textured mesh. Doing so removed 14,000 stray verts, and it then processed at the 7,500 limit with no issue at all! Fantastic.

            Perhaps as a suggestion, this could be included as an automatic checkbox in the wizard?
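            For anyone reproducing this fix outside Zephyr: deleting loose vertices amounts to dropping every vertex that no face references, then reindexing the faces. A minimal numpy sketch (a hypothetical helper, not Zephyr's implementation):

```python
import numpy as np

def drop_loose_vertices(vertices, faces):
    """Remove vertices not referenced by any face and reindex the faces."""
    used = np.zeros(len(vertices), dtype=bool)
    used[faces.ravel()] = True
    remap = np.cumsum(used) - 1          # new index for each kept vertex
    return vertices[used], remap[faces]

# Toy mesh: 5 vertices, one triangle; vertices 1 and 3 are loose.
verts = np.array([[0, 0, 0], [9, 9, 9], [1, 0, 0], [8, 8, 8], [0, 1, 0]], dtype=float)
faces = np.array([[0, 2, 4]])

clean_verts, clean_faces = drop_loose_vertices(verts, faces)
print(len(clean_verts))  # 3
print(clean_faces)       # [[0 1 2]]
```

            On a real scan, the loose vertices are typically stray points left over from reconstruction, like the 14,000 removed above.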
            Last edited by Quoeiza; 2020-06-04, 01:58 AM.



            • #7
              Glad to hear that!

              You can use the Connected Component filter to automatically select all triangles that are not connected to the main mesh, so you can remove them in one click.
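              Under the hood, that kind of filter boils down to a connected-components pass over the mesh: group faces that share vertices, then keep the largest group. A toy union-find sketch (an illustration, not Zephyr's actual code):

```python
from collections import Counter

def largest_component(faces):
    """Keep only the faces of the largest vertex-connected component."""
    parent = {}

    def find(x):
        while parent.setdefault(x, x) != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for f in faces:  # all vertices of one face belong to the same component
        for v in f[1:]:
            union(f[0], v)

    roots = [find(f[0]) for f in faces]
    main = Counter(roots).most_common(1)[0][0]
    return [f for f, r in zip(faces, roots) if r == main]

# Two triangles share vertex 2 (the main mesh); the third is an island.
mesh = [(0, 1, 2), (2, 3, 4), (10, 11, 12)]
print(largest_component(mesh))  # [(0, 1, 2), (2, 3, 4)]
```

              Disconnected islands like the stray verts above fall out of this pass in a single step, which is why the one-click removal works.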
