# Whole Slide Image Registration With ImageJ/Fiji and QuPath

This page documents a way to register whole slide images in Fiji and to analyze the result in QuPath. This workflow makes it possible to register slides with transformations more complex than an affine transform, by using the ImgLib2 library, elastix, and BigWarp. If the sample is not too deformed, registration at the cellular level can reasonably be achieved over the whole slide.

## Video tutorials

These videos were created on April 9th, 2021; they may become outdated faster than the written documentation below!

Playlist

## Installation

### ImageJ / Fiji

• Enable the bigdataviewer-playground update site, then restart Fiji. It is listed under the name bigdataviewer-playground in Fiji's list of update sites.
• To enable automated registration capabilities, install elastix:
• Download the latest release of elastix for your OS. This documentation has been tested with elastix 5.0.1.
• Unzip it.
• Windows users also need to install the Visual C++ redistributable; you will most probably need vc_redist.x64.exe.
• Check in a command line that elastix can run:
• On Windows, launch a command console (cmd.exe), go to the elastix folder, then type elastix.exe --help. Check that you get a helpful message and no errors.
• Not tested on Mac and Linux, but you probably need to type either elastix --help or ./elastix --help to check proper functioning as well.
• Indicate the elastix and transformix executable locations in Fiji:
• Execute Plugins › BIOP › Set and Check Wrappers, then indicate the proper location of the executable files.
• The ImageJ console should then show: [INFO] Transformix -> set :-) Elastix -> set :-)
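The command-line check above can also be scripted. A minimal sketch for Linux/macOS, assuming elastix was unzipped into a hypothetical ~/elastix-5.0.1 folder (adjust the paths to your actual install location):

```shell
# Hypothetical install location -- adjust to where you unzipped elastix
ELASTIX_DIR="$HOME/elastix-5.0.1/bin"
export PATH="$ELASTIX_DIR:$PATH"

# On Linux, the bundled libraries may also be needed:
# export LD_LIBRARY_PATH="$HOME/elastix-5.0.1/lib:$LD_LIBRARY_PATH"

# Both commands should print a help message, not an error
elastix --help
transformix --help
```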

### QuPath

Install the latest QuPath version.
Install the QuPath BIOP extensions and their dependencies:

When QuPath is restarted, a BIOP menu should appear at the top of the QuPath window:

## WSI registration - step by step procedure

To follow the WSI registration procedure, a demo dataset, consisting of a fluorescent image which can be registered to an RGB H-DAB image, can be downloaded here.

Create first a QuPath project containing all the slides that you want to register.

WARNING: Only the Bio-Formats image builder is supported on the Fiji side. Make sure to select the Bio-Formats builder when importing your slides, and leave Auto generate pyramids unchecked.

If your image can't be loaded in QuPath using the Bio-Formats builder, you can convert your slides to the ome.tiff format. Several options are available, for instance using bfconvert through Kheops, or bioformats2raw for a fast conversion.
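As an illustration, a hypothetical bfconvert call producing a pyramidal ome.tiff could look like the sketch below. File names are placeholders, and the pyramid flags require Bio-Formats 6 or later:

```shell
# Convert a slide to a pyramidal ome.tiff (placeholder file names)
bfconvert -pyramid-resolutions 5 -pyramid-scale 2 \
    -compression LZW \
    input_slide.vsi output_slide.ome.tiff
```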

WARNING: All files need to be properly calibrated (microns, millimeters, etc., but not pixels!). Check in the Image tab of QuPath that a proper physical unit is specified, and not pixels!

Save your project, and you are done for now on the QuPath side.

NOTE: You can leave QuPath open while performing the registration in Fiji.

### Registration in Fiji

In Fiji, open your QuPath project using Plugins › BigDataViewer › Playground › BDVDataset › Open [QuPath Project]. You can also directly type QuPath in Fiji's search bar to avoid struggling in Fiji's menu hierarchy.

• Select your qupath project file (usually named project.qpproj).
• Leave the default options, but make sure to select MILLIMETER in the physical units of the dataset field

NOTE: the physical units of the dataset field indicates the unit into which you want to import your dataset. Bio-Formats takes care of converting the unit of the acquisition into millimeters. This can be achieved only if the image is correctly calibrated in the first place: Bio-Formats knows how to convert from microns to millimeters, or from ångströms to millimeters, but it cannot know how to convert from pixels to millimeters.
NOTE: this WSI registration workflow hides many registration parameters by making use of the proper calibration in physical units, and by assuming it targets a cellular resolution level (do not expect registration precision at 50 nm resolution, unless you manually correct it with BigWarp).

After you have opened your project, a window called BDV Sources should pop up. If you double-click on the Sources node, you should be able to browse the hierarchy and see the "sources" contained in your QuPath project. Note that the fluorescent channels have been split into separate sources. With the demo files, you get a window like this:

• DAB.tif-ch0 is the RGB DAB image
• TileScan_001_Merging001-ch0 is the DAPI fluorescent channel
• TileScan_001_Merging001-ch1 is the other fluorescent channel, whose staining is similar to the DAB one

For the demo dataset registration, you can choose the DAB image as the fixed image, and the DAPI channel as the moving image.

#### Coarse manual pre-registration

The automated registration works if the images are approximately pre-aligned in the world coordinate space. This section explains how to check and correct the initial location of sources if they are not approximately aligned.

NOTE: If you do not want to use automated registrations, you can skip this part and just go to the next Wizard section and select manual BigWarp registration in the wizard.

To check whether this is the case, you need to display the two slides you want to align in a BigDataViewer window. To do this:

• select the slides that you want to align in the tree view. For this, you may find it convenient to expand SpimData 0 › QuPathEntryEntity, and select, using ctrl+click, the two entities you want to align:

• With these selected, right-click and select BDV - Show Sources (new BDV Window)

• Select the options as below:

• For the demo dataset, you will get something like this:

You can hover with your mouse on the right part of the panel to see a blue arrow that you can click:

With this right panel you can activate or deactivate sources. You will notice that the fluorescent image, while of the correct size, is shifted in XY relative to the DAB image. This needs to be corrected before going further. If your images are already approximately aligned, you can skip the part below and go to the wizard section.

NOTE: Navigating within BigDataViewer requires a bit of experience. In 2D, the minimal commands to know are written below:
• hold and drag right-click: pan;
• mouse wheel (or up / down keys): zoom in and out;
• shift modifier key: zoom in or out faster;
• ctrl modifier key: more precise zoom;
• you'll soon notice that holding left-click rotates the view. To go back to the default rotation, press shift+Z.

You need to modify the location of the fluorescent image in order to approximately put it at its proper location. Here's a way to manually do it:

• Select the fluorescent sources in the tree view (all fluorescent channels should be selected, either by selecting an upstream node, like in the picture below, or by selecting both fluorescent channels)

• Right-click and select Manual Sources Transformation, choose mutate or append (it doesn't matter for this use case):

• Center the sources by panning the BDV window; the selected sources in the tree view will stay still (you stay in the reference frame of the moving sources)

• Click Apply and finish (or Cancel and then repeat the operation if something went wrong)

Your slides are now approximately registered to one another.

Congrats, you can go to the next step.

#### Wizard: automated registration

You can start the wizard by clicking Plugins › BigDataViewer › Playground › Sources › Register › Wizard Align Slides For QuPath or just type Wizard in the search bar and select the correct command when it shows up.

First, select the DAB source as the fixed source, and the DAPI channel as the moving source that will be used for the registration as indicated in the image below:

Another wizard window shows up:

Successive registration steps happen consecutively; all are optional. If the sample is not too stretched or damaged from one scan to the other, a fully automated registration (skipping step 2 - Manual spline registration (BigWarp)) will give cellular-resolution results, but it's a good idea to check the results with BigWarp anyway, even if the landmarks end up unmodified because the registration is already spot on.

NOTE: If you check show results of automated registrations, ImageJ1 images will be created, showing the different images that are effectively sent to elastix for automated registration. It's a good way to check the intermediate steps and see what could have gone wrong. However, the processing will be slower because each step happens sequentially instead of in parallel.

In the rest of this section, we assume all registration steps have been checked in the wizard. If not, some of the dialogs detailed below will not be shown.

##### Step 1 - Set Bdv window to cover both moving and fixed slide

During this step, you just need to have a BDV window open and selected, with the whole area of the slides you want to register visible, as in the image below:

Click OK to go to the second step

##### Step 2 - Define a rectangular ROI for the registration

TOFIX: It's not clearly visible, but there's actually a message overlaid in BDV: Select a rectangular region for the region you want to register. TODO : Github Issue

Just draw a rectangle spanning the region you want to register directly in the BDV window, holding the left mouse button:

##### Step 3 - Define position of landmarks that should be automatically registered

TOFIX : It's not clearly visible, but there's actually a message overlaid in BDV: Select the position of the landmarks

Using the left mouse button, place the points that will be used to locally correct the warping of the moving slide. You can concentrate the landmarks in the regions you are most interested in. Here's an example of landmark positioning:

The number of necessary landmarks is hard to know in advance. In any case, place them where common structures are easily identifiable in both the moving and fixed image, a bit like how you would place focus points during an acquisition.

NOTE: To go to the next step, just press Escape. This part of the UI needs improvement (TODO : github issue)
NOTE: The automated registration should take about 20 seconds. If it takes longer, check that there's no error message in the ImageJ console.
##### Step 4 - Manually correct landmarks

If you have selected Manual spline registration (BigWarp), you will now be able to manually inspect and correct the landmarks that have been automatically registered, and also add new landmarks to the slides if you want to adjust a particular region more precisely. The interface is the one of BigWarp itself, so please check the BigWarp documentation to learn how to edit landmarks.

Here's a convenient way to perform this step:

• First, ignore the moving image window and increase the size of the BigWarp fixed image.

• press F1 for help window
• press F to toggle ON and OFF the registered moving image (F stands for fused)
• zoom (mouse wheel) and pan (drag left? or right? TOFIX mouse button) to investigate the registration quality
• press ctrl+D to move to the next landmark

To correct a landmark position:

• press space to enter the landmark mode
• carefully select and drag a landmark to modify its position. Live visualization of the transformed image helps you position it correctly
• press space to exit landmark mode
• navigate and repeat for each landmark that needs correction

• in landmark mode, press ctrl + left click to pin a new landmark on both the moving and fixed image at the current mouse location. It can then be dragged.

Once you are satisfied with the registration, click OK

##### Step 5 - Successful registration message

If all went smoothly, you should get this message:

This means that the result of the registration has been stored as a file in your QuPath project. It can then be used from within QuPath to transfer annotations and/or detections from one slide to another, as explained in the next section.

NOTE: the transformation is stored as a json file in the data entry folder of the TOFIX fixed?moving? image. It is named by convention transform_{i}_{j}.json, where i and j are the indices of the fixed and the moving image in the QuPath project.

### Analysis in QuPath

From within QuPath, annotations and/or detections can be transferred between registered images, in either direction. Indeed, provided the transformations are regular enough, they are invertible.

To transfer annotations or detections from one image to another:

• using the procedure of your choice (cell detection plugin, StarDist, manual annotation, etc.), create annotations or detections on the image of your choice (for instance the moving image).
• move to the other registered image (for instance the fixed image).
• you can then execute the following script, via Automate › User scripts… › New Script…:
transform_objects.groovy

```groovy
import ch.epfl.biop.qupath.transform.TransformHelper

// Get the current image data
def imageData = getCurrentImageData()

// Transfer all matching annotations and detections; keeps the hierarchy,
// transfers measurements (first boolean parameter)
// and allows using inverse transforms (second boolean parameter)
TransformHelper.transferMatchingAnnotationsToImage(imageData, true, true)

// Server used to compute the intensity measurements in the new image
def server = getCurrentServer()

// If the image is RGB, uncomment these lines to import the correct
// measurements (DAB, etc.):
//server = new qupath.lib.images.servers.TransformedServerBuilder(getCurrentServer())
//        .deconvolveStains(getCurrentImageData().getColorDeconvolutionStains(), 1, 2)
//        .build()
// cf https://forum.image.sc/t/transferring-segmentation-predictions-from-custom-masks-to-qupath/43408/15

// Add the standard intensity measurements to the transferred objects
TransformHelper.addIntensityMeasurements(getAnnotationObjects(), server, 1, true)
```

The above script consists of two parts:

• TransformHelper.transferMatchingAnnotationsToImage looks through all the images of the QuPath project for transformation files that allow transferring annotations and detections to the current (target) image, and then performs the transfer.
• The transferred annotations/detections do not contain the standard measurements provided by QuPath (fluorescence intensity / DAB value, etc.). To add these measurements on the target image, the function TransformHelper.addIntensityMeasurements(getAnnotationObjects(), server, 1, true) can be called.

### Troubleshooting

Something's wrong in this documentation? Please post your issue or question on the [image.sc forum](https://forum.image.sc).

### References (github)

The code to perform this workflow is split into several parts:

QuPath extensions, and several repositories on the Fiji side. On the Fiji side, the main components are bigdataviewer-playground, for the management of multiple sources, and bigdataviewer-biop-tools, for various functionalities such as the QuPath bridge.

Last author: chiarutt
Last edited: Fri, Apr 16, 12:26

### Event Timeline

chiarutt created this document. Thu, Mar 25, 10:19