Metashape 2.3 implements a new texturing algorithm that produces highly detailed textures in optimal processing time. It also adds new tools for LiDAR, such as the capability to align images with an aerial LiDAR point cloud using 3D matches.
Below is a list of the main features added in version 2.3:
Agisoft Metashape pre-release 2.3.0 is available and can be downloaded from our web forum: https://www.agisoft.com/forum/index.php?topic=17361
Texture
Blending mode - Natural
Version 2.3 includes a new texturing algorithm that allows for more detailed textures and faster texture creation. When building a texture, the algorithm automatically selects the best image for each triangle on the model. Factors such as shooting resolution, distance, viewing angle, image sharpness, and the absence of "ghosts" (moving objects in the images, such as passing cars) are taken into account.
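As a rough illustration of this selection step, picking the best image per triangle can be thought of as maximizing a quality score over the candidate images. The factor names, weights, and scoring function below are hypothetical and purely illustrative, not Metashape's actual implementation:

```python
def image_score(ground_resolution, view_angle_deg, sharpness, has_ghost):
    """Hypothetical quality score for one candidate image of a triangle.

    Higher is better; the weights below are illustrative only.
    """
    score = 0.0
    score += 2.0 / ground_resolution   # finer ground resolution (smaller value) is better
    score -= 0.01 * view_angle_deg     # prefer views close to the surface normal
    score += 1.0 * sharpness           # prefer sharp images
    if has_ghost:                      # penalize moving objects (e.g. passing cars)
        score -= 5.0
    return score


def pick_best_image(candidates):
    # candidates: {image_name: (ground_resolution, view_angle_deg, sharpness, has_ghost)}
    return max(candidates, key=lambda name: image_score(*candidates[name]))
```

With such a score, an image containing a moving object loses to an otherwise identical clean image, and a sharp nadir view beats a blurry oblique one.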
Once an image has been assigned to each triangle, the photos are decomposed into frequency pyramids and blended so that the low frequencies are smoothed and the overall brightness does not change abruptly, while the high frequencies remain as sharp as possible. This ensures that the texture is as detailed and sharp as the original images.
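The frequency-domain idea can be illustrated with a minimal 1D two-band blend. This is a pure-Python sketch, not Metashape's multi-level pyramid: low frequencies are mixed with a feathered mask so brightness changes gradually across the seam, while high frequencies switch hard at the seam to keep detail sharp:

```python
def box_blur(signal, radius=2):
    # simple moving-average low-pass filter
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out


def two_band_blend(a, b, mask):
    # mask[i] in [0, 1]: 1 -> take signal a, 0 -> take signal b
    low_a, low_b = box_blur(a), box_blur(b)
    high_a = [x - l for x, l in zip(a, low_a)]
    high_b = [x - l for x, l in zip(b, low_b)]
    smooth_mask = box_blur(mask)  # feathered mask: low frequencies transition gradually
    low = [m * la + (1 - m) * lb for m, la, lb in zip(smooth_mask, low_a, low_b)]
    # high frequencies switch hard at the seam, preserving sharpness
    high = [ha if m >= 0.5 else hb for m, ha, hb in zip(mask, high_a, high_b)]
    return [l + h for l, h in zip(low, high)]
```

Blending a bright signal with a dark one produces a gradual brightness ramp around the seam instead of a visible step, which is the behavior described above.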

The new texturing algorithm currently runs only on the GPU. If you do not have a GPU, the new algorithm is unavailable in the pre-release version; in this case you can use the old algorithm, i.e. the Mosaic blending mode.
Important parameters for the new texturing:

- Images downscale (x1, x2, x4, x8) - image scale factor used when building the texture: x1 uses the original images, x2 reduces each side of the original image by half, x4 by a factor of 4, and x8 by a factor of 8. The number of pages is calculated automatically based on the specified Texture size and the selected Images downscale factor.
- Enable out-of-focus filter - excludes areas that were out of focus during shooting. Suitable for close-range projects; not recommended for aerial data.
- Sharpening strength - increases the sharpness of the texture. Set the value to 0 to disable it.

Processing time may increase when the Enable out-of-focus filter or Sharpening strength parameters are used.
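For intuition on the downscale factors above: each factor divides both image dimensions, and the page count follows from how many texels of the chosen Texture size are needed. The exact page-count formula is internal to Metashape; the sketch below only illustrates the per-side division and the ceil-division idea:

```python
import math


def effective_resolution(width, height, downscale):
    # Images downscale divides each side: x1 keeps the original,
    # x2 halves each side, x4 and x8 divide each side by 4 and 8
    return width // downscale, height // downscale


def estimate_pages(total_texels_needed, texture_size):
    # Illustrative only: shows the ceil-division idea behind an
    # automatically computed page count for square texture pages
    return max(1, math.ceil(total_texels_needed / (texture_size ** 2)))
```

For example, a 8000x6000 source image at x4 contributes texture detail as if it were 2000x1500, and roughly 40 million required texels at a 4096 Texture size would need 3 pages.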
Texture editing tool
A texture editing tool is available for the new texturing algorithm. It allows you to edit a small area of the texture by assigning a particular image to it. The tool is available from the context menu: select the desired area on the model using the selection tools and choose the Assign image command:

You can select a particular image in the Assign Images dialog window:

Then rebuild the texture with the Rebuild texture with correction option enabled:

Equidistant and Equisolid fisheye camera models
Metashape 2.3 adds support for the Equidistant and Equisolid fisheye camera models. For most images with a fisheye effect, use the Equidistant Fisheye camera model; for images with a viewing angle greater than 180 degrees, use the Equisolid Fisheye camera model. The parameter is set in the Camera Calibration window (select Tools > Camera Calibration):
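The two models follow the standard fisheye projection formulas for the image radius r as a function of the incidence angle θ and focal length f: equidistant r = f·θ, equisolid r = 2f·sin(θ/2). A quick numeric sketch of the two mappings (this illustrates the standard formulas only, not Metashape's full calibration model with distortion coefficients):

```python
import math


def equidistant_radius(f, theta):
    # equidistant fisheye: image radius grows linearly with the angle
    return f * theta


def equisolid_radius(f, theta):
    # equisolid-angle fisheye: r = 2 f sin(theta / 2)
    return 2 * f * math.sin(theta / 2)
```

At θ = 90° the equidistant model gives r = f·π/2 ≈ 1.571f, while the equisolid model gives r = 2f·sin(45°) ≈ 1.414f; the difference between the mappings grows toward the edge of the field of view.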

LiDAR data processing
GNSS Bias support
For LiDAR point clouds it is now possible to calculate GNSS Bias parameters. When GNSS is used during shooting, there may be a systematic shift between the coordinate system of the LiDAR point cloud and the coordinate system of the reference data (ground control points). The linear parameters of this shift can be calculated in Metashape. The option is available in the Lidar Calibration dialog window (Tools > Lidar Calibration):

NOTE: if you do not know the offset but would like to adjust these parameters, enable the Adjust GNSS bias option and set zero values as a first approximation.
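Conceptually, the estimated bias is a constant linear (dx, dy, dz) offset between the LiDAR cloud and the ground-control frame, so correcting for it amounts to a per-point translation. The sketch below only illustrates that idea; the function name is hypothetical and this is not Metashape's API:

```python
def apply_gnss_bias(points, bias):
    # Subtract a constant (dx, dy, dz) offset from every LiDAR point,
    # bringing the cloud into the ground-control coordinate frame.
    # Illustrative only: Metashape estimates and applies the bias internally.
    dx, dy, dz = bias
    return [(x - dx, y - dy, z - dz) for x, y, z in points]
```

Because the shift is systematic, the same offset applies to the whole cloud, which is why a zero first approximation with adjustment enabled is sufficient to recover it.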
LiDAR alignment metric option in Generate Report dialog
In the new version of the program you can export visual information about LiDAR swath separation if corresponding point clouds have been used in the project. To do this, enable the Lidar separation image option in the Generate Report dialog window (File > Export > Generate Report):

The project report now includes a Laser Scans page that displays general information about the laser scans in the project:

Match depth maps option in Align Laser Scans dialog
The ability to align laser scans with images using 3D matches has been added. The algorithm uses depth maps and points from the LiDAR point cloud. It is important that the image block is referenced (e.g. using GPS data), i.e. that the data is correctly scaled and oriented.

Network processing
Export multiple assets from the same chunk in parallel
Multiple export operations from consecutive tasks in the batch are distributed among available worker nodes, even if they are related to different tasks in the batch list or applied to different chunks.
Process different tasks in parallel if they apply to different chunks
Consecutive processing tasks in the batch that are applied to different chunks, or independent tasks for the same chunk, are processed on multiple worker nodes without waiting for each task to complete.
