Metashape 2.3 implements a new texturing algorithm that produces highly detailed textures in an optimal processing time. It also adds new tools for LiDAR, such as the ability to align images with aerial LiDAR point clouds using 3D matches.
Agisoft Metashape pre-release 2.3.0 is available and can be downloaded from our web forum: https://www.agisoft.com/forum/index.php?topic=17361
Below is the list of the main features added in version 2.3:
Texture
Blending mode - Natural
Version 2.3 includes a new texturing algorithm that allows for more detailed textures and faster texture creation. When building a texture, the algorithm automatically selects the best image for each triangle of the model. Factors such as shooting resolution, distance, viewing angle, image sharpness, and the absence of "ghosts" (moving objects in the images, such as passing cars) are taken into account.
Once a camera has been assigned to each triangle, the photos are decomposed into frequency pyramids, and these frequencies are blended so that the low frequencies are smooth and the overall brightness does not change abruptly, while the high frequencies remain as sharp as possible. This ensures that the texture is as detailed and sharp as the original images.
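The frequency-pyramid idea can be illustrated with classic multi-band (Laplacian pyramid) blending. The sketch below is a generic illustration in Python/OpenCV, not Metashape's actual implementation: a mask decides which image wins, low frequencies are mixed smoothly across the seam, and high frequencies switch sharply.

```python
# Generic multi-band (Laplacian pyramid) blending sketch - illustrative only,
# NOT Metashape's texturing code.
import cv2
import numpy as np

def gaussian_pyramid(img, levels):
    pyr = [img.astype(np.float32)]
    for _ in range(levels):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels):
        up = cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
        lp.append(gp[i] - up)          # high-frequency band at this level
    lp.append(gp[levels])              # coarsest, low-frequency residual
    return lp

def blend(img_a, img_b, mask, levels=5):
    # mask: float array in [0, 1]; 1 where img_a should be used
    la = laplacian_pyramid(img_a, levels)
    lb = laplacian_pyramid(img_b, levels)
    gm = gaussian_pyramid(mask, levels)
    blended = []
    for a, b, m in zip(la, lb, gm):
        if m.ndim == 2 and a.ndim == 3:
            m = m[..., None]
        blended.append(a * m + b * (1.0 - m))
    # collapse the pyramid back into a single image
    out = blended[-1]
    for level in reversed(blended[:-1]):
        out = cv2.pyrUp(out, dstsize=(level.shape[1], level.shape[0]))
        out = out + level
    return np.clip(out, 0, 255).astype(np.uint8)
```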
The new texturing can currently be performed only on the GPU. If you do not have a GPU, the new texturing algorithm will not be available in the current pre-release version; in this case you can use the old algorithm, the Mosaic blending mode.
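If you are not sure whether Metashape detects a GPU on your machine, a quick check from the Python console can help you decide between the new algorithm and the Mosaic fallback. A minimal sketch using standard API calls:

```python
# Minimal GPU availability check from the Metashape Python console.
import Metashape

gpus = Metashape.app.enumGPUDevices()
print("GPU devices detected:", len(gpus))
if gpus:
    Metashape.app.gpu_mask = (1 << len(gpus)) - 1   # enable all detected GPUs
else:
    print("No GPU found - use the Mosaic blending mode for texturing instead.")
```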
Important parameters for the new texturing:
- Images downscale (x1, x2, x4, x8) - the image scale factor used when building the texture: x1 corresponds to the original images, x2 reduces each image by half on each side, x4 reduces it by a factor of 4, and x8 by a factor of 8. The number of texture pages is calculated automatically based on the specified Texture size and the selected Images downscale factor.
- Enable out-of-focus filter - the algorithm excludes areas that were out of focus during shooting; suitable for close-range projects and not recommended for aerial data.
- Sharpening strength - allows increasing the sharpness of the texture. If you do not want to use this parameter, set the value to 0.
The processing time may increase if the Enable out-of-focus filter and Sharpening strength parameters are used.
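For scripted workflows, texturing is normally driven through chunk.buildUV() and chunk.buildTexture(). The sketch below uses those standard calls; the enum value for the new Natural blending mode and any keyword names for the new 2.3 options (images downscale, out-of-focus filter, sharpening strength) are assumptions and are not spelled out here, so check the Metashape 2.3 Python API reference before relying on them.

```python
# Hedged texturing sketch. buildUV(), buildTexture(), MosaicBlending and
# ghosting_filter are standard API elements; "NaturalBlending" is an ASSUMED
# name for the new 2.3 blending mode and may differ in the actual API.
import Metashape

doc = Metashape.Document()
doc.open("project.psx")                      # hypothetical project path
chunk = doc.chunk

chunk.buildUV(mapping_mode=Metashape.MappingMode.GenericMapping,
              texture_size=8192)

# fall back to the old Mosaic algorithm if the assumed new mode is absent
mode = getattr(Metashape.BlendingMode, "NaturalBlending",
               Metashape.BlendingMode.MosaicBlending)
chunk.buildTexture(blending_mode=mode, texture_size=8192, ghosting_filter=True)

doc.save()
```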
Texture editing tool
A texture editing tool is available for the new texturing. It allows you to edit a small area of the texture by selecting the desired image, and is accessible from the context menu. Select the desired area on the model using the selection tools and choose the Assign Image command from the context menu:
You can select the desired image in the Assign Images dialog window:
Then rebuild the texture using the corresponding option (enable the Rebuild texture with correction option):
Equidistant and Equisolid fisheye camera models
Metashape 2.3 adds support for the Equidistant and Equisolid fisheye camera models. For most fisheye images, use the Equidistant camera model; for images with a viewing angle greater than 180 degrees, use the Equisolid model. The parameters are set in the Camera Calibration window (select Tools > Camera Calibration):
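In scripted projects, the camera model is normally set on the sensor before alignment. The sketch below uses the standard Sensor.Type.Fisheye value; the exact enum names Metashape 2.3 uses for the Equidistant and Equisolid models are assumptions here, so verify them in the 2.3 Python API reference, or simply pick the type in the Camera Calibration dialog.

```python
# Hedged sketch: switching sensors to a fisheye camera model via the API.
# Sensor.Type.Fisheye is a standard value; "EquidistantFisheye" is an ASSUMED
# name for the new 2.3 model and may not exist - getattr() falls back to the
# generic Fisheye type in that case.
import Metashape

doc = Metashape.Document()
doc.open("project.psx")                      # hypothetical project path
chunk = doc.chunk

new_type = getattr(Metashape.Sensor.Type, "EquidistantFisheye",
                   Metashape.Sensor.Type.Fisheye)
for sensor in chunk.sensors:
    sensor.type = new_type
doc.save()
```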
LiDAR data processing
GNSS Bias support
For LiDAR point clouds, it is now possible to calculate GNSS bias parameters. When GNSS is used during shooting, there may be a systematic shift between the coordinate system of the LiDAR point cloud and the coordinate system of the reference data (ground control points). The linear components of this shift can be calculated in Metashape. The option is available in the Lidar Calibration dialog window (Tools > Lidar Calibration):
NOTE: if you do not know the offset but would like to adjust these parameters, enable the Adjust GNSS bias option and set zero values as the first approximation.
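Conceptually, a constant GNSS bias is just a systematic XYZ offset between the LiDAR point cloud and the reference data. The sketch below (plain numpy, not the Metashape API, with hypothetical coordinates) shows how a first approximation of such a bias could be estimated as the mean per-axis residual over the control points.

```python
# Conceptual GNSS bias sketch - NOT the Metashape API.
import numpy as np

# hypothetical coordinates in metres; rows = control points, columns = X, Y, Z
measured_in_cloud = np.array([[100.12, 200.08, 50.31],
                              [150.11, 180.09, 48.29],
                              [ 90.13, 220.07, 52.33]])
reference_gcp     = np.array([[100.00, 200.00, 50.00],
                              [149.99, 180.01, 47.98],
                              [ 90.01, 219.99, 52.02]])

# mean residual per axis = first approximation of the constant bias
gnss_bias = (measured_in_cloud - reference_gcp).mean(axis=0)
print("Estimated GNSS bias (dX, dY, dZ):", np.round(gnss_bias, 3))
```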
LiDAR alignment metric option in the Generate Report dialog
In the new version of the program, you can export visual information about LiDAR swath separation if corresponding point clouds have been used in the project. To do this, enable the Lidar separation image option in the Generate Report dialog window (File > Export > Generate report):
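Reports can also be generated from the Python API with chunk.exportReport(). Whether the new Lidar separation image toggle has a scripting counterpart in 2.3, and what it would be called, is not assumed here, so enable it in the dialog if you need it in an automated report. A minimal sketch with a hypothetical project and output path:

```python
# Minimal report export sketch using the standard exportReport() call.
# No scripting keyword for the new "Lidar separation image" option is assumed.
import Metashape

doc = Metashape.Document()
doc.open("project.psx")                      # hypothetical project path
doc.chunk.exportReport("report.pdf", title="LiDAR project report")
```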
Match depth maps option in the Align Laser Scans dialog
The ability to align laser scans with images using 3D matches has been added. The algorithm uses depth maps and points from the LiDAR point cloud. It is important that both the images and the laser scans are georeferenced.
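Since the depth-map-based matching relies on georeferencing, it can be worth verifying that reference coordinates are actually set before running Align Laser Scans. The sketch below only performs that check with standard API calls (camera labels, reference.location, chunk.crs); it does not invoke the alignment itself, and whether laser scans appear in chunk.cameras depends on how they were imported.

```python
# Hedged pre-flight check: list cameras/scans that have no reference
# coordinates, since the 3D-match alignment expects georeferenced data.
import Metashape

doc = Metashape.Document()
doc.open("project.psx")                      # hypothetical project path
chunk = doc.chunk

missing = [cam.label for cam in chunk.cameras if cam.reference.location is None]
if missing:
    print("No reference coordinates for:", ", ".join(missing))
else:
    print("All entries georeferenced; chunk CRS:", chunk.crs)
```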
The project report now includes a Laser Scans page that displays general information about the laser scans in the project:
Network processing
Export multiple assets from the same chunk in parallel
Multiple export operations from consecutive tasks in the batch will be distributed among the available worker nodes, even if they are related to different tasks in the batch list or applied to different chunks.
Process different tasks in parallel if they apply to different chunks
Consecutive processing tasks in the batch that are applied to different chunks, as well as independent tasks for the same chunk, will also be processed on multiple worker nodes, without waiting for each task to complete.
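As a sketch of how several export operations from one project could be submitted as a single network batch, the snippet below uses the standard NetworkClient workflow (createBatch, toNetworkTask, resumeBatch). The server host and export paths are hypothetical, the Tasks classes used are assumed to exist in your API version, and how the 2.3 server actually distributes consecutive tasks across worker nodes is decided server-side.

```python
# Hedged network-processing sketch: queue two export tasks in one batch and
# let the server distribute them across worker nodes. Paths and host name are
# hypothetical; Tasks.ExportModel / Tasks.ExportReport are assumed to exist.
import Metashape

doc = Metashape.Document()
doc.open("/shared/projects/project.psx")     # project on shared network storage
chunk = doc.chunk

export_model = Metashape.Tasks.ExportModel()
export_model.path = "/shared/exports/model.obj"

export_report = Metashape.Tasks.ExportReport()
export_report.path = "/shared/exports/report.pdf"

client = Metashape.NetworkClient()
client.connect("metashape-server")           # hypothetical server host
batch_id = client.createBatch(doc.path,
                              [task.toNetworkTask(chunk)
                               for task in (export_model, export_report)])
client.resumeBatch(batch_id)                 # start processing the batch
```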