Welcome to the 3DF Zephyr tutorial series.
In this recipe, you will learn the basics and see how easy it is to turn your pictures into accurate 3D models with 3DF Zephyr.
3DF Zephyr is a powerful tool that requires a lot of computational power. Though not mandatory, a CUDA device is recommended, as is a generous amount of RAM.
Creating 3D models from pictures requires a good dataset. You can follow this tutorial with your own pictures or try it with our sample dataset. If you want to take your own pictures, please follow these guidelines, which will teach you the best practices for acquiring a dataset.
If you want to use our dataset, please download this zip file and extract it so that you can use it in Zephyr! This dataset is composed of 65 photos. If you’re using 3DF Zephyr Free, which is limited to 50 photos, don’t worry: simply select the first 50 photos (see the sketch after the download links below) and you’ll get a less detailed, but still nice, model!
Download Dataset – Cherub (531MB)
Download – Cherub .ZEP file (340MB)
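If you prefer to prepare the reduced dataset with a script rather than selecting photos by hand, here is a minimal Python sketch that copies only the first 50 photos into a separate folder before importing them into Zephyr. Both folder names are assumptions, so adjust them to wherever you extracted the zip.

```python
# Minimal sketch for 3DF Zephyr Free users: copy only the first 50 photos of
# the sample dataset into a separate folder before importing them into Zephyr.
# Both folder names are assumptions - adjust them to where you extracted the zip.
from pathlib import Path
import shutil

src = Path("cherub_dataset")
dst = Path("cherub_dataset_50")
dst.mkdir(exist_ok=True)

for photo in sorted(src.glob("*.jpg"))[:50]:
    shutil.copy2(photo, dst / photo.name)
```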
After preparing your dataset, install 3DF Zephyr on your computer – that’s all you need to start using 3DF Zephyr!
Note: due to high demand, the “Dismal Souvenir” dataset is available for download in Tutorial #3 (Masquerade usage).
To create a new project, click “Workflow” (1) and then “New Project” (2). The “Project Wizard” (3) screen will appear which will guide you through the process of importing your pictures.
This phase is critical for the scene reconstruction, so please feed 3DF Zephyr a good dataset: blurred images and datasets with no overlapping pictures are examples of bad data for 3DF Zephyr. You can learn more about the most common guidelines in this quick guide titled “how to acquire pictures for 3DF Zephyr”.
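As an optional pre-check outside of Zephyr, you can screen a folder of photos for obvious blur before importing them. The sketch below uses the variance-of-the-Laplacian heuristic with OpenCV; this is not part of 3DF Zephyr, and the folder name and threshold are illustrative assumptions.

```python
# Rough sharpness screening with OpenCV (pip install opencv-python): the
# variance of the Laplacian is a common blur heuristic - lower scores suggest
# blurrier photos worth re-shooting. This check is not part of 3DF Zephyr;
# the folder name and threshold are illustrative assumptions.
from pathlib import Path
import cv2

THRESHOLD = 100.0  # tune per camera and scene

for path in sorted(Path("cherub_dataset").glob("*.jpg")):
    gray = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
    score = cv2.Laplacian(gray, cv2.CV_64F).var()
    flag = "  <-- possibly blurred" if score < THRESHOLD else ""
    print(f"{path.name}: {score:.1f}{flag}")
```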
Protip: there are two options in the lower left corner of the Project Wizard (3) that allow you to automatically compute the dense cloud and the surface. This is useful for bigger datasets. Leave them unchecked, as in this tutorial we’ll walk you through both of those steps. Another option (checked by default) allows Zephyr to download camera calibrations where available: we suggest leaving this option on – although Zephyr is completely auto-calibrated, the online camera calibration can speed up the first phase and help with some fisheye lenses.
This next window is the “Photo selection page“, in which we need to add the photos that we want 3DF Zephyr to process. Click on the “plus sign” (4) and browse to the directory where your dataset is located. Select all the images you previously extracted and then click Open (or drag and drop the images directly from Windows Explorer).
Protip: advanced users can add a previously generated manual camera calibration (if available) by checking the option in the lower left corner of this window.
You will now be taken to the “Camera Calibration Page” window. This topic won’t be covered in this tutorial, so just click “Next“.
You are now ready for the first computation phase. Here’s a brief explanation of what will happen: Zephyr will analyze each image, find its features (points of interest that the computer can understand), and compare each image with (usually a subset of) the other pictures: this is done to place the cameras in the correct positions. Before doing that, though, it’s necessary to tell Zephyr which settings to use. For this tutorial, we will use the preset mode with Close Range/Default settings.
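To give a rough idea of what “finding and comparing features” means, here is a conceptual sketch using OpenCV’s ORB detector on two overlapping photos. This is only an illustration of the general idea, not Zephyr’s actual reconstruction engine, and the file names are assumptions.

```python
# Conceptual illustration only (not Zephyr's reconstruction engine): detect
# features in two overlapping photos and match them; correspondences like
# these are what allow camera positions to be recovered. File names are
# assumptions - use any two overlapping photos from the dataset.
import cv2

img1 = cv2.imread("IMG_0001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("IMG_0002.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=5000)           # points of interest + descriptors
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)            # compare features across the pair
print(f"{len(kp1)} / {len(kp2)} keypoints, {len(matches)} tentative matches")
```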
The preset mode window will appear. The preset mode allows you to pick optimal settings for most cases, depending on the application scenario (dropdown menu “category“) as well as the accuracy and computation speed required (dropdown menu “presets“). For this specific case, for example, you can pick Close Range as the category and Default as the preset, and then experiment with different settings.
The advanced settings window allows you to tweak and control every aspect of our reconstruction engine. We will discuss these advanced parameters in another tutorial, though you can also find an in-depth explanation in the manual.
Use the dropdown menu in the top right corner again to go back to “presets mode”, select Close Range and Default, and then click the “Next” button (6).
Protip: understanding these settings is very important to obtain the best result possible with 3DF Zephyr. This is not a topic discussed in this tutorial, but it’s highly recommended that you read our parameters fine tuning guide.
After a while, you should see a “Reconstruction Successful!” dialog. This window will tell you how many (and which) images were correctly oriented. Double-clicking a filename will open the corresponding picture in your default image viewer. This is especially useful when dealing with large datasets, to quickly understand which cameras weren’t reconstructed successfully.
You have generated your first sparse point cloud in 3DF Zephyr.
Before moving to the next step, let’s learn the basics to navigate the scene, which is rendered at the centre of the screen (7).
Zephyr offers three navigation systems that can be selected with their respective icon (8) or from the Scene > Camera submenu.
The orbit view mode with pivot behaves exactly like the orbit view mode; however, the pivot is not the centre of the reconstruction but is picked each time on the model at the cursor position.
The free look mode uses the classic first-person-shooter WASD keys; holding the left mouse button and moving the mouse allows you to rotate the camera, while the ‘q’ and ‘e’ keys move you up and down respectively.
You can also move quickly to a camera position by right clicking on a camera and then left clicking on “Move here” (9) or by using the camera navigator at the bottom of the screen.
Now that the cameras are positioned, we can extract the dense point cloud of our 3D model. This time, from the “Workflow” (10) menu choose “Dense Point Cloud Generation” (11). The “Dense Point Cloud Generation Wizard” (12) will appear; click “Next” in the lower right corner of the screen.
These settings can significantly change the quality of the output as well as the computation time needed, and every aspect can be tuned in the advanced window, but for now just leave the Close Range/Default preset again and then click “Next” in the lower right corner of the screen.
Click “Run” to start the dense point extraction (second phase of computation to create the 3D mesh).
Once the dense point cloud generation has completed, click “Finish” to proceed to step 6, where we’ll be able to extract the mesh.
Protip: understanding these settings is very important to obtain the best result possible with 3DF Zephyr. This is not a topic discussed in this tutorial, but it’s highly recommended that you read our parameters fine tuning guide.
After the computation, a “Dense point cloud generation successful” dialog should appear: click “Finish” in the lower right corner of the window. You can navigate the scene of your new dense point cloud or proceed to the final phase of the computation.
To start the mesh generation process, simply click “Workflow” (13) and then on Mesh Extraction (14).
The “Mesh Generation Wizard” (15) window will appear.
Once again, you can snoop around the advanced settings if you wish, then select the “Close Range/Default” preset again and click “Next” in the lower right corner of the window.
To start the mesh creation, just click on the “Run” button: when the “Mesh Creation Successful” dialog appears, just click “Finish” in the lower right corner of the window.
Protip: understanding these settings is very important to obtain the best result possible with 3DF Zephyr. This is not a topic discussed in this tutorial, but it’s highly recommended that you read our parameters fine tuning guide.
Once the mesh has been generated, the color information will be saved per vertex. This might be fine for some applications, though usually a texture is required as well. In Zephyr, this is a separate step that takes as input the mesh generated in step 6a.
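To see what “per-vertex color” means in practice, you can peek at the header of a mesh exported in the PLY format: the color channels are listed among the vertex properties. The sketch below assumes a hypothetical export path.

```python
# Peek at the header of a mesh exported as PLY with per-vertex colors: the
# color channels appear as vertex properties (e.g. "property uchar red").
# "cherub_pervertex.ply" is a hypothetical export path.
with open("cherub_pervertex.ply", "rb") as f:
    for raw in f:
        line = raw.decode("ascii", errors="replace").strip()
        print(line)
        if line == "end_header":
            break
```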
To start the texture generation process, simply click “Workflow” (16) and then on “Textured mesh Generation” (17). The “Textured Mesh” (18) generation window will appear.
It’s recommended that you leave 1 camera per triangle and Use Color Balance enabled. This feature, available since Zephyr 2.0, will automagically fix lighting issues and pick the best possible color for each point. You can still use more than one camera per triangle and the multiband option if you wish (even paired with color balancing); however, for most cases the best results can be achieved with 1 as the “max number of cameras per triangle”, multiband disabled, and color balance enabled.
Click “Next” to start the textured mesh generation.
Protip: understanding these settings is very important to obtain the best result possible with 3DF Zephyr. This is not a topic discussed in this tutorial, but it’s highly recommended that you read our parameters fine tuning guide.
To export the generated textured mesh, simply click on the “Export Menu” (19) and then on “Export Textured Mesh” (20): the “Mesh Export Window” (21) will appear.
Different options are available depending on the file format chosen. As a standalone 3D model viewer you can use your program of choice (for example, Meshlab).
When you are ready to export, simply click on the Export button in the lower right corner of the mesh export window. The texture generation topic will be covered thoroughly in another tutorial.
Opening your exported mesh in your model viewer of choice, you should now see something similar to this output. Your reconstruction will probably also include some parts of the white table below the cherub, which the example shown here does not have.
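If you want a quick programmatic sanity check before (or instead of) opening a viewer, the trimesh Python library can load the export and report whether the texture came through. The file name below is a hypothetical export path.

```python
# Quick sanity check of the exported textured mesh with the trimesh library
# (pip install trimesh). "cherub_textured.obj" is a hypothetical export path.
import trimesh

mesh = trimesh.load("cherub_textured.obj", force="mesh")
print(f"vertices: {len(mesh.vertices)}, faces: {len(mesh.faces)}")
print(f"visual kind: {mesh.visual.kind}")  # 'texture' when UVs and a texture were loaded
```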
[Embedded Sketchfab viewer: “Cherub Example” by 3dflow]
Continuing in this tutorial series, you will learn more than one way to polish and clean your reconstructions.
Learning the basics is easy, but understanding every aspect and the effects of every option takes a bit of time and experience: remember that we have our online documentation as well as a forum where you can ask specific questions.
The next tutorial will teach you how to set control points and control distances. Click here to proceed to the next tutorial.