Agisoft Metashape 2.3.0 implements a new texturing algorithm that produces highly detailed textures in optimal processing time. It also adds new tools for LiDAR, such as the capability to align images with an aerial LiDAR point cloud using 3D matches.


Below is a list of the main features added in version 2.3:


Download Agisoft Metashape 2.3.


Agisoft Metashape 2.3.0 is available and can be downloaded from our website:


https://www.agisoft.com/downloads/installer/


Texture


Blending mode - Natural


Version 2.3 includes a new texturing algorithm that allows for more detailed textures and faster texture creation. When building a texture, the algorithm automatically selects the best image for each triangle of the model. Factors such as shooting resolution, distance, viewing angle, image sharpness, and the absence of "ghosts" (moving objects in the images, such as passing cars) are taken into account.


Once an image has been assigned to each triangle, the photos are decomposed into frequency pyramids, and the frequency bands are blended so that the low frequencies are smoothed and the overall brightness does not change abruptly, while the high frequencies remain as sharp as possible. This ensures that the texture is as detailed and sharp as the original images.
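For intuition only (this is not Metashape's implementation), the sketch below shows the general frequency-pyramid idea with OpenCV and NumPy: two aligned, overlapping images are decomposed into Laplacian pyramids and blended with a mask, so the low-frequency bands transition smoothly while the high-frequency detail stays crisp.

```python
import cv2
import numpy as np

def laplacian_pyramid(img, levels):
    """Decompose an image into band-pass (Laplacian) levels plus a low-frequency residual."""
    gauss = [img.astype(np.float32)]
    for _ in range(levels):
        gauss.append(cv2.pyrDown(gauss[-1]))
    bands = []
    for i in range(levels):
        up = cv2.pyrUp(gauss[i + 1], dstsize=(gauss[i].shape[1], gauss[i].shape[0]))
        bands.append(gauss[i] - up)   # high-frequency detail at this scale
    bands.append(gauss[-1])           # low-frequency residual
    return bands

def pyramid_blend(img_a, img_b, mask, levels=5):
    """Blend two H x W x 3 images with an H x W mask in [0, 1]:
    smooth transitions in the low frequencies, sharp high frequencies."""
    bands_a = laplacian_pyramid(img_a, levels)
    bands_b = laplacian_pyramid(img_b, levels)
    masks = [mask.astype(np.float32)]
    for _ in range(levels):
        masks.append(cv2.pyrDown(masks[-1]))          # coarser mask for lower frequencies
    blended = [m[..., None] * a + (1.0 - m[..., None]) * b
               for a, b, m in zip(bands_a, bands_b, masks)]
    result = blended[-1]
    for band in reversed(blended[:-1]):               # collapse the pyramid back to an image
        result = cv2.pyrUp(result, dstsize=(band.shape[1], band.shape[0])) + band
    return np.clip(result, 0, 255).astype(np.uint8)
```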


To use the new algorithm, select the Natural blending mode in the Build Texture dialog:

Important parameters for the new texturing:


  • Image downscale (x1, x2, x4, x8) - the image scale factor used when building the texture: x1 corresponds to the original image, x2 reduces the original image by a factor of 2 on each side, x4 by a factor of 4 on each side, and x8 by a factor of 8 on each side. The number of texture pages is calculated automatically based on the specified Texture size and the selected Image downscale factor.


  • Enable out-of-focus filter - excludes areas that were out of focus during shooting; suitable for close-range projects, not recommended for aerial data.


  • Use assigned images - allows you to rebuild the texture based on manually edited texture sections.


  • Sharpening strength - increases the sharpness of the texture. If you don't want to use this parameter, set the value to 0.



The processing time may increase if the Enable out-of-focus filter and Sharpening strength parameters are used.
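The same build can be scripted with the Metashape Python API. A minimal sketch, assuming a project with a mesh is open in the GUI; the enum name for the Natural mode is not confirmed here (a documented fallback is used), and the keywords for Image downscale, the out-of-focus filter and Sharpening strength should be taken from the 2.3 Python API reference:

```python
import Metashape

doc = Metashape.app.document      # project currently open in the GUI
chunk = doc.chunk

# UV layout; the number of texture pages follows from Texture size and Image downscale.
chunk.buildUV(mapping_mode=Metashape.GenericMapping, texture_size=8192)

# "NaturalBlending" is an assumed name for the new mode; fall back to the
# documented MosaicBlending if it is not present in your API version.
blending = getattr(Metashape, "NaturalBlending", Metashape.MosaicBlending)
chunk.buildTexture(blending_mode=blending)

doc.save()
```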



Texture editing tool


A texture editing tool is available for the new texturing. It allows you to edit a small area of the texture by selecting a particular image. More information about the processing steps can be found in our article - Texture editing tool




Camera Calibration


Equidistant and Equisolid fisheye camera models


Metashape 2.3 adds support for Equidistant and Equisolid fisheye camera models. For most images with a fisheye effect, use the Equidistant Fisheye camera model. For images with a viewing angle greater than 180 degrees, use the Equisolid model. The parameter is set in the Camera Calibration window (select Tools > Camera Calibration):
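The camera type can also be set per sensor from the Python API. The generic Fisheye type shown below is documented; dedicated enum values for the new Equidistant and Equisolid models are not listed here because their exact names should be checked in the 2.3 API reference:

```python
import Metashape

chunk = Metashape.app.document.chunk

for sensor in chunk.sensors:
    # Documented generic fisheye model; switch to the Equidistant or Equisolid
    # value from the 2.3 API reference once its exact enum name is confirmed.
    sensor.type = Metashape.Sensor.Type.Fisheye
```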


Camera axes


You can now specify the direction of the camera axes in the Camera Calibration dialog window for aerial and terrestrial data. This is useful for setting GNSS and INS offsets and affects how these parameters are applied. To select the desired axis direction, select Tools > Camera Calibration:


Camera axes for Aerial data:
Camera axes for Terrestrial data:



LiDAR and Point Cloud data processing


GNSS Bias support


For LiDAR point clouds, it is now possible to calculate GNSS bias parameters. When using GNSS during shooting, there may be a systematic shift between the coordinate system of the LiDAR point cloud and the coordinate system of the reference data (ground control points). The linear parameters of this shift can be calculated in Metashape. This option is available in the Lidar Calibration dialog window (Tools > Lidar Calibration):


NOTE: if you do not know the offset but would like to adjust these parameters, enable the Adjust GNSS bias option and set zero values as a first approximation.
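To illustrate what the linear bias parameters represent (this is not the Metashape adjustment itself, which estimates them during optimization), a constant translation can be thought of as the mean offset between the reference coordinates and the same points measured in the LiDAR cloud; the coordinates below are made-up example values:

```python
import numpy as np

# Made-up example: reference GCP coordinates vs. the same points picked in the
# LiDAR point cloud (metres). A constant translation is the simplest bias model.
gcp_reference = np.array([[10.00, 20.00, 5.00],
                          [30.00, 40.00, 6.00],
                          [50.00, 15.00, 4.50]])
gcp_in_cloud  = np.array([[10.12, 20.08, 4.71],
                          [30.11, 40.07, 5.73],
                          [50.13, 15.09, 4.19]])

# The least-squares estimate of a pure translation is just the mean difference.
gnss_bias = (gcp_reference - gcp_in_cloud).mean(axis=0)
print("Estimated GNSS bias dX, dY, dZ:", np.round(gnss_bias, 3))  # -> [-0.12 -0.08  0.29]
```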



Match depth maps option in Align Laser Scans dialog


The ability to align laser scans with images using 3D matches has been added. The algorithm uses depth maps and points from the LiDAR point cloud. It is important that the image block is referenced (using GPS data), i.e. the data is correctly scaled and oriented.


To align laser scans and images, the project must contain aligned images and the laser scans must be added to it. Then enable the Match depth maps option in the Align Laser Scans dialog window (select Workflow > Align Laser Scans):



LiDAR alignment metric option in Generate Report dialog


In the new version of the program, you can export visual information about LiDAR swath separation if the corresponding point clouds have been used in the project. To do this, enable the Lidar separation image option in the Generate Report dialog window (File > Export > Generate Report):


The project report includes a Laser Scans page that displays general information about the laser scans in the project:
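Report generation can also be scripted; exportReport() is a documented call, while the keyword corresponding to the Lidar separation image checkbox is not shown below because its exact name should be taken from the 2.3 Python API reference:

```python
import Metashape

chunk = Metashape.app.document.chunk

# Documented call; add the keyword for the "Lidar separation image" option from
# the 2.3 API reference to include the swath separation visualisation.
chunk.exportReport(path="survey_report.pdf",
                   title="LiDAR survey",
                   description="Processing report with laser scan statistics")
```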



Select points by height


It is now possible to filter dense point clouds and LiDAR point clouds by height relative to a reference plane. Points are selected based on the specified min-max range in meters and the chosen reference (Absolute elevation, Height above ground, Height above DEM) (Tools > Point Cloud > Select Points by Height):




We recommend using a classified point cloud for the Height above ground option. If the point cloud is not classified, a rough classification will be performed to determine the ground level.

The Height above DEM option is available only if your project contains a DEM. Build the DEM before filtering your point cloud.
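Both recommendations can be handled up front with documented Python API calls (the argument values below are just typical defaults), after which the Select Points by Height tool can be run from the menu:

```python
import Metashape

doc = Metashape.app.document
chunk = doc.chunk

# Classify ground points so that "Height above ground" uses a real ground class
# instead of the rough automatic estimate.
chunk.point_cloud.classifyGroundPoints(max_angle=15.0, max_distance=1.0, cell_size=50.0)

# Build the DEM first if you plan to use the "Height above DEM" reference.
chunk.buildDem(source_data=Metashape.PointCloudData,
               interpolation=Metashape.EnabledInterpolation)

doc.save()
```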



Colorize point cloud by height above ground


In Metashape 2.3 it is also possible to colorize LiDAR and photogrammetric point clouds based on height above the ground. We recommend using a classified point cloud with a ground class; in that case, the points are colored relative to a surface derived from the ground class. The Height above ground option is available from the Show Point Cloud and Show Laser Scans drop-down lists:



As with point cloud filtering, we recommend using a classified point cloud. If the point cloud is not classified, a rough classification will be performed to determine the ground level, and the points will be colored based on this approximate value. The colorization process may take some time due to the preprocessing of the point cloud.



Network processing


Export multiple assets from the same chunk in parallel


Multiple export operations from consecutive tasks in the batch are distributed among the available worker nodes, even if they are related to different tasks in the batch list or applied to different chunks.
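As a sketch of how this looks with the network processing Python API (the host name and paths are placeholders, and the task attributes should be verified against the API reference), several export tasks for the same chunk can be submitted in one batch and the server distributes them across the worker nodes:

```python
import Metashape

client = Metashape.NetworkClient()
client.connect("metashape-server.local")      # placeholder host name

doc = Metashape.Document()
doc.open("projects/lidar_survey.psx")         # path inside the shared network root
chunk = doc.chunk

# Two export operations for the same chunk, submitted as consecutive batch tasks.
export_cloud = Metashape.Tasks.ExportPointCloud()
export_cloud.path = "projects/export/points.laz"

export_model = Metashape.Tasks.ExportModel()
export_model.path = "projects/export/model.obj"

network_tasks = [task.toNetworkTask(chunk) for task in (export_cloud, export_model)]
batch_id = client.createBatch(doc.path, network_tasks)
client.setBatchPaused(batch_id, False)        # start processing on the worker nodes
```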


Process different tasks in parallel if they apply to different chunks


Consecutive processing tasks in the batch that apply to different chunks, as well as independent tasks for the same chunk, are processed on multiple worker nodes without waiting for each task to complete.