Exposure correction / handling HDR

  • Timeroot
    Blossoming 3Dflower
    • Dec 2018
    • 1


    Hi, I'm really enjoying what I've seen of Zephyr so far. I've used panorama software for a while, but photogrammetry really takes it to the next level. It's pretty standard in photo-stitching software for things like HDR photos and exposure correction to be handled by the optimization process: where photos overlap, the exposure (and optionally the white balance) is calibrated so that they match, and the whole pano ends up with a coherent effective exposure. Since some parts might then be much darker than others -- which is why the exposure was varying to begin with -- you can export the pano in an HDR format like EXR and play around with the levels.

    I'm wondering to what degree, if any, Zephyr handles this? Partly because it would be nice to capture a broader range of lighting ... but also because I keep accidentally unlocking the exposure on my camera (it's very easy to do, unfortunately), and then my exposures aren't even.

    Unfortunately, it appears to me that Zephyr doesn't try to correct for this. This has led to a few projects where I have excellently recovered geometry (Zephyr has "wow"ed me at the SfM!), but the texturing has lots of awkward dark/light lines corresponding to the edges of images. I've tried playing around with the Color Balance parameters, but that doesn't seem to do exposure correction directly -- rather, it's for cases where the external lighting changes?

    If this is an option I can enable, or if it's already happening and I'm just messing something else up, please let me know! If it's not currently a feature, I really hope it could be added. Also, does anyone have tips for fixing the reconstruction on the photos I already took? One option would be to manually exposure-correct them, but that wouldn't be trivial, since I'd have to manually find where photos overlap and line them up -- and Zephyr already has the overlap info internally.
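
    A minimal sketch of the kind of pairwise exposure matching described above, assuming the overlapping regions between two photos are already known (here as hypothetical boolean masks); Python with numpy and Pillow, and all file names are placeholders:

    ```python
    # Sketch: match the exposure of image B to image A using their known overlap.
    # The overlap masks, file names and gamma value are illustrative assumptions.
    import numpy as np
    from PIL import Image

    GAMMA = 2.2  # rough sRGB approximation, so the gain is applied in linear light

    def load_linear(path):
        srgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
        return srgb ** GAMMA

    def save_srgb(path, linear):
        srgb = np.clip(linear, 0.0, 1.0) ** (1.0 / GAMMA)
        Image.fromarray((srgb * 255.0 + 0.5).astype(np.uint8)).save(path)

    def estimate_gain(lin_a, lin_b, mask_a, mask_b):
        # Gain that maps B's exposure onto A's, from median luminance in the overlap.
        lum_a = lin_a[mask_a].mean(axis=-1)
        lum_b = lin_b[mask_b].mean(axis=-1)
        return float(np.median(lum_a)) / max(float(np.median(lum_b)), 1e-6)

    img_a = load_linear("IMG_0001.jpg")
    img_b = load_linear("IMG_0002.jpg")
    mask_a = np.load("overlap_in_0001.npy")  # boolean H x W arrays marking the overlap
    mask_b = np.load("overlap_in_0002.npy")

    gain = estimate_gain(img_a, img_b, mask_a, mask_b)
    save_srgb("IMG_0002_matched.jpg", img_b * gain)  # B now roughly matches A
    ```

    Panorama stitchers essentially chain this kind of pairwise estimate over every overlapping pair in one least-squares solve; doing it by hand per pair is exactly the tedium mentioned above.
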
  • Andrea Alessi
    3Dflow Staff
    • Oct 2013
    • 1304

    #2
    Hi Timeroot!

    Zephyr is actually pretty robust to lighting changes (whether they are due to exposure or to external light changes), but in difficult environments (e.g. interiors / rooms) you may still notice a sudden change.

    If you have a well-taken dataset, however, the easiest fix is usually simply to exclude the cameras with the "unwanted" lighting during texturing. I personally like using a tripod partly because of this - while it takes longer, you are much more likely to get even lighting in all your shots.
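
    A quick, hypothetical way to spot such shots before texturing (not a Zephyr feature; just a Python/Pillow pre-check) is to flag photos whose overall brightness sits far from the dataset median:

    ```python
    # Sketch: flag photos whose mean brightness differs markedly from the rest of
    # the dataset, as candidates to exclude from texturing. The folder path and
    # the half-stop threshold are arbitrary assumptions.
    import glob
    import numpy as np
    from PIL import Image

    def mean_log_luminance(path):
        gray = np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0
        return float(np.log2(gray.mean() + 1e-6))

    paths = sorted(glob.glob("dataset/*.jpg"))
    levels = np.array([mean_log_luminance(p) for p in paths])
    median = np.median(levels)

    for path, level in zip(paths, levels):
        if abs(level - median) > 0.5:  # more than ~half a stop off the median
            print(f"consider excluding {path} ({level - median:+.2f} stops vs median)")
    ```
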


    • Pete98
      Blossoming 3Dflower
      • Mar 2019
      • 1

      #3
      Does Zephyr Lite support RAW pictures (DNG) that would have been color corrected in bulk with a Canon camera color profile through Adobe Lightroom CC 2019 or Photoshop CC 2019?


      • Andrea Alessi
        3Dflow Staff
        • Oct 2013
        • 1304

        #4
        Hi Pete,

        You can load most raw file formats; however, I suggest you export to JPEG after color correction. Raw files take longer to process and do not guarantee better quality (unless you need to work with 32-bit textures).


        • cam3d
          3Dflover
          • Sep 2017
          • 661

          #5
          * Side note - when exporting to JPG, don't forget to export at 100% quality - any lower than that and you start to introduce JPG compression artifacts, which can adversely impact your reconstruction.
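
          As a concrete (hypothetical) example of that bulk RAW-to-JPEG step outside of Lightroom/Photoshop, here is a minimal sketch using rawpy (a LibRaw wrapper) and Pillow; the folder names are placeholders and this is not something built into Zephyr:

          ```python
          # Sketch: bulk-develop DNGs to JPEG at 100% quality with no chroma
          # subsampling, so no extra compression artifacts are introduced.
          import glob
          import os
          import rawpy
          from PIL import Image

          os.makedirs("developed", exist_ok=True)

          for path in sorted(glob.glob("raw/*.dng")):
              with rawpy.imread(path) as raw:
                  # Camera white balance, no auto-brightening: consistent development.
                  rgb = raw.postprocess(use_camera_wb=True, no_auto_bright=True, output_bps=8)
              name = os.path.splitext(os.path.basename(path))[0] + ".jpg"
              Image.fromarray(rgb).save(os.path.join("developed", name), quality=100, subsampling=0)
          ```
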


          • sam3d
            3Dflourished
            • Feb 2018
            • 98

            #6
            I was recently looking at reconstructing the inside of a church window using Zephyr. I haven't needed to do it yet, but it got me thinking about HDR images because of the contrast between the internal and external light.

            I wondered: would it be a good idea to use a set of HDR-merged images with Zephyr, or would Zephyr make better use of the original bracketed RAW images?

            The workflow would probably be: exposure bracketed RAW -> HDR merge -> export to HDR / EXR -> process in Zephyr

            What do you think?? Would the merging corrupt the pixel data too much?
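
            For reference, a minimal sketch of that merge step using OpenCV's Debevec merge (OpenCV doesn't read RAW directly, so this assumes the bracketed frames were already developed to LDR files; the file names and exposure times are placeholders):

            ```python
            # Sketch: merge an exposure bracket into a linear 32-bit HDR image.
            import cv2
            import numpy as np

            files = ["bracket_-2ev.jpg", "bracket_0ev.jpg", "bracket_+2ev.jpg"]
            times = np.array([1/250, 1/60, 1/15], dtype=np.float32)  # shutter speeds (s)
            images = [cv2.imread(f) for f in files]

            # Recover the camera response curve, then merge to linear HDR.
            response = cv2.createCalibrateDebevec().process(images, times)
            hdr = cv2.createMergeDebevec().process(images, times, response)

            # Radiance .hdr keeps the full range; EXR output needs OpenCV built with
            # OpenEXR support enabled.
            cv2.imwrite("window_merged.hdr", hdr)
            ```
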



            • sam3d
              3Dflourished
              • Feb 2018
              • 98

              #7
              Hey Andrea Alessi, just wondered if you'd had a chance to look at the previous post?


              • Andrea Alessi
                3Dflow Staff
                • Oct 2013
                • 1304

                #8
                Hi sam3d,

                My apologies, I didn't notice your reply!

                I personally wouldn't use HDR-merged images; I'd rather convert the raw file to an appropriate exposure. I've seen a few datasets made with HDR-merged images, but it's very easy to confuse Zephyr - also, the texturing will look a bit odd in most cases. I'm not saying it's impossible, but I don't see it as very practical, in my opinion - for some people it may work, I suppose!

                Remember that if you need 32-bit textures, Zephyr can do that, so you can actually feed in the raw images and then handle the 32-bit textures afterwards in your workflow.

                Again sorry for not replying earlier!
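
                A minimal sketch of that "convert the raw file to an appropriate exposure" route, using rawpy as an example developer (the folder, file extension, +1 stop push and development settings are assumptions, not anything Zephyr-specific):

                ```python
                # Sketch: develop every bracketed RAW with the same fixed settings
                # (and one uniform exposure push) so the whole dataset stays
                # consistent, instead of HDR-merging the brackets.
                import glob
                import rawpy
                from PIL import Image

                EXP_SHIFT = 2.0  # linear factor applied to all frames: 2.0 ~= +1 stop

                for path in sorted(glob.glob("church_window/*.CR2")):
                    with rawpy.imread(path) as raw:
                        rgb = raw.postprocess(
                            use_camera_wb=True,
                            no_auto_bright=True,  # keep brightness deterministic
                            exp_shift=EXP_SHIFT,
                            output_bps=8,
                        )
                    Image.fromarray(rgb).save(path.rsplit(".", 1)[0] + ".jpg",
                                              quality=100, subsampling=0)
                ```
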


                • sam3d
                  3Dflourished
                  • Feb 2018
                  • 98

                  #9
                  Thanks Andrea, no problem!

                  I'll perhaps try a small experiment sometime; I just wondered whether Zephyr would make use of HDR data during reconstruction.



