= [OBSOLETE] Whole Slide Image (WSI) Registration With ImageJ/Fiji, Elastix and QuPath
This page documents a way to register whole slide images in Fiji and to analyze the result in QuPath. The workflow makes it possible to register slides with transformations that are **more complex than an affine transform**, by combining the ImgLib2 library, elastix, and BigWarp. If the sample is not too deformed or damaged, registration at the cellular level can reasonably be achieved over the whole slide.
= Video tutorials
These videos were created on the 9th of April 2021; updated ones will come (TODO)!
[Playlist](https://www.youtube.com/watch?v=E4k2CGbvQTM&list=PL2PJpdamvntgXgF0YF_FKeYzn34RITfar)
* [Installation](https://www.youtube.com/watch?v=E4k2CGbvQTM)
* [Registration](https://www.youtube.com/watch?v=HAdktWdngWU)
* [Analysis](https://www.youtube.com/watch?v=uDglj4Lvxak)
= Installation
== ImageJ / Fiji
=== BigDataViewer-Playground update site ===
Enable the [BigDataViewer-Playground](https://imagej.github.io/plugins/bdv/playground) update site, then restart Fiji. It is listed among Fiji's update sites under the name `BigDataViewer-Playground`.
=== Elastix - to enable automated registration capabilities ===
* Download the [latest release of elastix for your OS](https://github.com/SuperElastix/elastix/releases/tag/5.0.1). This documentation has been tested with elastix 5.0.1.
* Unzip it somewhere convenient (the `C` drive on Windows, `Applications` on Mac).
**Windows**
Windows users also need to install the [Visual C++ redistributable](https://support.microsoft.com/en-us/topic/the-latest-supported-visual-c-downloads-2647da03-1eea-4433-9aff-95f26a218cc0); you will most probably need `vc_redist.x64.exe`.
**Mac**
Fiji will be calling the elastix executables, which macOS flags as coming from unidentified developers. You therefore need to [make security exceptions for both elastix and transformix](https://support.apple.com/en-hk/guide/mac-help/mh40616/mac) to avoid clicking endlessly through the OS warning messages.
**Linux (not tested)**
Nothing in particular should be required on Linux systems.
=== Indicate `elastix` and `transformix` executable location in Fiji ===
* Execute {nav Plugins › BIOP › Set and Check Wrappers}, then indicate the location of the executable files. {F18020664,width = 500}
* The following should show up in the ImageJ console: `[INFO] Transformix -> set :-) Elastix -> set :-)`
Once elastix is installed, you can run [the following script](https://gist.githubusercontent.com/NicoKiaru/b91f9f3f0069b765a49b5d4629a8b1c7/raw/571954a443d1e1f0597022f6c19f042aefbc0f5a/TestRegister.groovy) in Fiji to test the elastix integration. Save the file with a `.groovy` extension, open it in Fiji, and run it.
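If the test script fails, it can help to first check that the executable itself runs outside of Fiji. A minimal sketch, assuming a hypothetical install path (point it to wherever you unzipped elastix):
```
lang=groovy, name=check_elastix_executable.groovy
// Minimal sanity check: run the elastix executable directly and print its usage text.
// The path below is only an example - replace it with your own elastix / elastix.exe location.
def elastixPath = "C:/elastix-5.0.1/elastix.exe"
def proc = [elastixPath, "--help"].execute()
println proc.text      // the elastix usage text should appear in the console
proc.waitFor()
```
If this prints the elastix usage text, the executable location and permissions are fine, and any remaining problem lies in the Fiji wrapper settings.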
== QuPath
* Install the [latest QuPath version (0.3.0)](https://qupath.github.io/)
* Install the [QuPath Warpy extension by following its readme](https://github.com/BIOP/qupath-extension-warpy).
* Install the [QuPath Image Combiner Warpy extension by following its readme](https://github.com/iwbh15/ImageCombinerWarpy_qp_v0.3).
= WSI registration - step by step procedure
To follow the WSI registration procedure, a demo dataset, consisting of a fluorescence image that can be registered to an RGB H-DAB image, can be [downloaded from Zenodo](https://doi.org/10.5281/zenodo.5674521).
== Create your QuPath project
First, [create a QuPath project](https://qupath.readthedocs.io/en/latest/docs/tutorials/projects.html) containing all the slides that you want to register.
{F18022361, width = 400}
WARNING: Only the Bio-Formats image builder is supported on the Fiji side. Make sure to select the `Bio-Formats builder` when importing your slides, and leave `Auto generate pyramids` unchecked.
If your image can't be loaded in QuPath using the `Bio-Formats builder`, you can convert your slides to the `ome.tiff` format. Several options are available, for instance [bfconvert with Kheops](https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/imagej_tools/ijp-kheops/), or [bioformats2raw](https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/qupath/ome-tiff-conversion/) for a fast conversion.
WARNING: All files need to be properly calibrated (microns, millimeters, etc., but not pixels!). Check in the Image tab of QuPath that a proper physical unit is specified, and not pixels.
If the pixel size is wrong, you need to override it in QuPath and resave the project before resuming the workflow.
{F18022412,width = 400}
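To check the calibration programmatically, or to override it when the metadata is missing, here is a minimal sketch to run from the QuPath script editor. The 0.325 µm value is only an example; use the pixel size reported by your acquisition software:
```
lang=groovy, name=check_pixel_calibration.groovy
// Print the pixel calibration of the current image.
def cal = getCurrentServer().getPixelCalibration()
println "Pixel width: " + cal.getPixelWidthMicrons() + " µm"   // NaN means the image is only calibrated in pixels
// Override the calibration only if it is wrong or missing (the value below is an example!),
// then save the image data and the project.
// setPixelSizeMicrons(0.325, 0.325)
```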
Save your project, and you are done for now on the QuPath side.
NOTE: You can leave QuPath open while performing the registration in Fiji.
== Registration in Fiji
=== Open your QuPath project
In Fiji, open your QuPath project using {nav Plugins › BigDataViewer-Playground › BDVDataset › Open [QuPath Project] }. You can also directly type `QuPath` in Fiji's search bar to avoid struggling in Fiji's menu hierarchy.
* Select your QuPath project file (usually named `project.qpproj`).
* Keep the default options, but make sure to select **MILLIMETER** in the `physical units of the dataset` field.
{F22107102,width = 400}
NOTE: the `physical units of the dataset` field indicates how you want to import your dataset. Bio-Formats takes care of converting the acquisition unit into millimeters. This only works if the image is correctly calibrated in the first place: Bio-Formats knows how to convert from microns to millimeters, or from ångström to millimeters, but it cannot know how to go from pixels to millimeters.
NOTE: this WSI registration workflow hides many registration parameters by relying on a proper calibration in physical units, and by assuming that it targets cellular-level resolution (do not expect registration precision at the 50 nm level unless you correct it manually with BigWarp).
After you have opened your project, a window called `BDV Sources` should pop up. If you double-click on the `Sources` node, you can browse the hierarchy and see the "sources" contained in your QuPath project. Note that the fluorescence channels have been split into separate sources. With the demo files, you get a window like this:
{F22107174,width =300}
* `DAB.ome.tiff - DAB.tif-ch0` is the RGB DAB image
* `Fluo.lif - TileScan_001_Merging001-ch0` is the DAPI fluorescence channel
* `Fluo.lif - TileScan_001_Merging001-ch1` is the EdU fluorescence channel, whose staining pattern is similar to the DAB one
For the demo dataset registration, the DAB image is used as the fixed image, and the DAPI channel as the moving image.
NOTE: You can give nicer names to the images in QuPath before opening the project in Fiji.
NOTE: Navigating within BigDataViewer requires a bit of experience. In 2D, the minimal commands to know are written below:
- **right-click and drag**: pan
- **mouse wheel (or up / down keys)**: zoom in and out
- **`shift` modifier key**: zoom in or out faster
- **`ctrl` modifier key**: more precise zoom
- You will soon notice that holding **left-click** rotates the view. To go back to the default orientation, press **`shift+Z`**.
=== Creating a Warpy Registration
You can start the registration wizard by clicking {nav Plugins › BigDataViewer - Playground › Sources › Register › QuPath - Create Warpy Registration} or just type `Warpy` in the search bar and select the correct command when it shows up.
First, select the DAB source as the fixed source and the DAPI channel as the moving source that will be used for the registration, as indicated in the image below:
{F22107203, width=400}
Another wizard window shows up:
{F22107220, width=600}
The successive registration steps (0, 1, 2, 3, 4) happen consecutively, and all of them are optional. It is advised to keep the first two boxes checked in order to remove most of the initial XYZ offsets that may be stored in the files. This initial offset usually comes from the Bio-Formats metadata storing the stage position during acquisition.
NOTE: If you check `Show results of automated registrations`, ImageJ1 images will be created to show the images that are effectively sent to elastix for automated registration. It is a good way to visualize the intermediate registration steps and check what could have gone wrong. However, processing will be slower, because each registration then runs sequentially instead of in parallel.
In the rest of this documentation, we assume all registration steps have been checked. The demo dataset requires all of these steps, but a small image may be registered well enough with a single affine transform.
The extra parameters located below the step checkboxes are explained where they are needed in the registration steps.
==== Step 1 - Manual rigid registration
Just after clicking `OK`, you will get a BigDataViewer window similar to this one:
{F22107336, width=650}
The right part of the window contains three cards:
1 - `Sources`: this card can be used to adjust the display settings of the sources (min / max and color for fluorescence images).
After some adjustment, the fluorescence image can be made brighter (max = 50) and the DAB image a little dimmer (max = 400):
{F22107367, width =650}
NOTE: If you register two RGB images, it is important to increase the maximum display value of both images (for instance, from 256 to 512). Otherwise, the overlaid images will appear fully white.
2 - `WSI Registration Wizard`: this card displays the current registration step, some navigation hints, and some convenience actions: restoring the initial view, auto-scaling the image display, and showing / hiding the fixed or the moving image.
3 - The third card is specific to the current wizard step.
In this step, you need to move the fluorescence image so that it roughly lands at its correct location. If you zoom in, you will see an obvious shift, which can be corrected with a manual rigid registration:
{F22107408, width =650}
To correct it:
* Click the button `Start manual rigid registration` in the third card.
* Pan the view until the two images are approximately aligned (some rotation and zoom adjustments can also be performed, but a translation is enough in this example):
{F22107427, width =650}
* Finally click `Confirm transformation` to go to the next step of the wizard.
NOTE: If you make a mistake while moving the images in the registration, you can click `Cancel manual registration` to restore the initial state.
==== Step 2 - Automated affine registration - Define rectangular ROI
{F22107511, width=650}
In this step, you need to define the rectangular region that will be used for a first coarse affine registration. By default, a yellow rectangle surrounding the biggest image is set. In the demo case, however, this rectangle is too large: a large portion of the DAB image is not covered by the fluorescence image. To set a better rectangular ROI, draw a rectangle directly in the viewer by dragging the mouse while holding the left button:
{F22107536, width=650}
If you are not satisfied with the rectangle, simply draw a new one; it replaces the previous one. Then go to the next step by clicking `Confirm rectangle`.
NOTE: You can deactivate the rectangle selection mode and switch to navigation mode by clicking `Enable navigation`. You can also restore the initial rectangle with `Restore initial rectangle` if needed.
For the coarse affine registration with elastix, both the fixed and the moving image are resampled over this rectangular selection. The resampling is performed with a pixel size equal to the value of the `Pixel size for coarse registration in microns` parameter, which defaults to 10 microns.
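To get a feeling for the size of the images that are sent to elastix, here is a small back-of-the-envelope sketch (the 20 mm x 10 mm rectangle is only an example, not a property of the demo dataset):
```
lang=groovy, name=coarse_resampling_size.groovy
// Size of the resampled images used for the coarse affine registration,
// assuming a hypothetical 20 mm x 10 mm rectangular ROI and the default 10 micron pixel size.
def roiWidthMicrons  = 20000    // 20 mm (example value)
def roiHeightMicrons = 10000    // 10 mm (example value)
def pixelSizeMicrons = 10       // 'Pixel size for coarse registration in microns' (default value)
def w = roiWidthMicrons.intdiv(pixelSizeMicrons)
def h = roiHeightMicrons.intdiv(pixelSizeMicrons)
println "Coarse registration images: ${w} x ${h} pixels"   // 2000 x 1000 pixels in this example
```
Only this small, downsampled version of the selected region is sent to elastix, which is why the coarse registration remains fast even on very large slides.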
WARNING: The coarse affine registration is not performed immediately when you click the `Confirm rectangle` button. Instead, the patch locations for the automated spline registration are defined first, before any automated registration is started. This means that when you select patches for the spline registration (next step), you should place them based on the fixed image rather than the moving image, since the moving image will be transformed before each patch is acquired.
==== Step 3 - Automated spline registration - Define landmarks
{F22107653, width=700}
Using the left mouse button, position the points that will be used to locally correct the warping of the moving slide. You can concentrate the landmarks on the regions you are most interested in. Here is an example of landmark positioning:
{F22107659, width=700}
The number of necessary landmarks is hard to know in advance. In any case, place them where common structures are easily identifiable in both the moving and the fixed image, a bit like placing focus points during an acquisition.
Tips:
* You can move each landmark by dragging its middle point after it has been positioned in the viewer (the middle point increases in size when it can be dragged).
* You can clear all landmarks and restart positioning them by clicking `Clear points`.
* You can switch back to navigation mode (`Enable navigation`) to zoom in on the slide and position the landmarks more precisely. Do not forget to restore the point selection mode (`Enable point selection`) if you want to drag patches or create new ones (even after adding the grid).
* It is possible to place patches automatically on a grid over the rectangular region defined in the previous step by clicking `Add landmark grid`. You can set the spacing between the patches to allow for some overlap or, on the contrary, to leave some space between them:
{F22107714, width=350}
{F22107731, width=350}
{F22107736, width=350}
It is possible, but not advised, to add more than a few hundred landmarks.
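As a rough order of magnitude, a dense grid quickly exceeds that limit (the rectangle size and spacing below are example values, not wizard defaults):
```
lang=groovy, name=landmark_grid_count.groovy
// Rough estimate of the number of grid landmarks for a hypothetical
// 20 mm x 10 mm rectangular ROI with a 500 micron spacing between patches.
def rectWidthMicrons  = 20000   // example value
def rectHeightMicrons = 10000   // example value
def spacingMicrons    = 500     // example spacing between patches
def nx = rectWidthMicrons.intdiv(spacingMicrons)
def ny = rectHeightMicrons.intdiv(spacingMicrons)
println "Grid landmarks: ${nx} x ${ny} = ${nx * ny}"   // 40 x 20 = 800, already above the advised limit
```
If the estimate exceeds a few hundred points, simply increase the spacing between patches.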
NOTE: Overlapping patches are not an issue. In fact, after the patch registration, only the location of each patch's central point is kept to interpolate the transformation over the whole slide.
NOTE: You can change the size of the patches by changing the value of `Patch size for precision patch registration` in the initial step of the wizard. Setting it to 200 microns leads to this automated grid placement (with a spacing of 300 microns between patches):
{F22107782, width=350}
It is advised to move landmarks away from regions where little structure is present:
{F22107786, width=350}
By default, for each patch, the fixed and moving images are resampled with a pixel size of 1 micron. This value can be modified at the beginning of the wizard if necessary (`Pixel size for precise patch registration in micron`).
Click `Confirm points` to start the registration. The ImageJ log window shows the progress of the registration (it typically takes about a minute to register the demo dataset):
{F22107797, width=350}
==== Step 4 - Manually correct landmark locations with BigWarp
If you have selected `4 - Manual spline registration (BigWarp)`, you can now manually inspect and correct the landmarks that have been automatically registered, and also **add new landmarks** to the slides if you want to adjust a particular region more precisely. The interface is directly the one of [BigWarp](https://imagej.github.io/plugins/bigwarp), so please check the BigWarp documentation itself to learn how to edit landmarks.
Here's a convenient way to perform this step:
* First, ignore the moving image window and increase the size of the BigWarp fixed image.
{F18025076, width=700}
* press `F1` to open the help window
* press `F` to toggle the registered moving image ON and OFF (F stands for fused)
* zoom (mouse wheel) and pan (drag with the right mouse button) to inspect the registration quality
* press `ctrl+D` to move to the next landmark
WARNING: The DAB image has its min and max display values reset to 0 and 65535. Set these values back to sensible ones using the `Sources` card.
To correct a landmark position:
* press `space` to enter the landmark mode
* carefully select and drag a landmark to modify its position. Live visualization of the transformed image helps you position it correctly
* press `space` to exit landmark mode
* navigate and repeat for each landmark that needs correction
To add another landmark:
* in landmark mode, press `ctrl + left click` to pin a new landmark on both the moving and fixed image at the current mouse location. It can then be dragged.
Once you are satisfied with the registration, click `OK`
{F18025114, width=300}
You can delete landmarks in the landmark table if necessary:
{F22107830, width=300}
Press `Click to finish` to end the wizard and save the transformation to QuPath.
==== Step 5 - Successful registration message
If all went smoothly, you should get a message like this in the ImageJ log window:
```
Transformation file successfully written to QuPath project: Path\id_moving\transform_moving_fixed.json
```
This means that the result of the registration has been stored as a file in your QuPath project. It can then be used from within QuPath to transfer annotations and/or detections from one slide to another, as explained in the next section.
NOTE: the transformation is stored as a JSON file in the data entry folder of the moving image. By convention it is named `transform_{i}_{j}.json`, where `i` and `j` are the indices of the moving and the fixed image, respectively, in the QuPath project.
== Analysis in QuPath
From within QuPath, annotations and/or detections can be transferred between registered images, in either direction: provided the transformations are regular enough, they are invertible.
To transfer annotations or detections from one image to another:
* using the procedure of your choice (cell detection plugin, StarDist, manual annotation, etc.), create annotations or detections on the image of your choice (for instance the moving image).
* move to the other registered image (for instance the fixed image).
* you can then create a new script via {nav Automate > User scripts... > New Script... } and execute the following:
```
lang=javascript, name=transform_objects.groovy
// Transfer PathObjects from another image that contains a serialized RealTransform
// result from the BIOP WSI Aligner (See: https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/wsi_registration_fjii_qupath/ )
// The current Image Entry that we want to transfer PathObjects to
def targetEntry = getProjectEntry()
// Locate candidate source entries that can be transformed into the target entry
def sourceEntries = Warpy.getCandidateSourceEntries( targetEntry )
// Choose one source or transfer from all of them with a for loop
def sourceEntry = sourceEntries[0]
// Recover the RealTransform that was put there by WSI Aligner
def transform = Warpy.getRealTransform( sourceEntry, targetEntry )
// Recover the objects we wish to transform into the target image
// This step ensures you can have control over what gets transferred
def objectsToTransfer = Warpy.getPathObjectsFromEntry( sourceEntry )
// Finally perform the transform of each PathObject
def transferedObjects = Warpy.transformPathObjects(objectsToTransfer, transform)
// Convenience method to add intensity measurements. Does not have to do with transforms directly.
// This packs addIntensityMeasurements in such a way that it works for RGB and fluorescence images
Warpy.addIntensityMeasurements(transferedObjects, 1)
// Finally, add the transformed objects to the current image and update the display
addObjects(transferedObjects)
fireHierarchyUpdate()
// Necessary import, requires qupath-extension-warpy, see: https://github.com/BIOP/qupath-extension-warpy
import qupath.ext.biop.warpy.*
```
The above script consists of two parts:
* `Warpy.getCandidateSourceEntries` looks through all images of the QuPath project for transformation files that allow annotations and detections to be transferred to the current (target) image; the subsequent calls then retrieve the transform, transform the objects, and add them to the target image.
* transferred annotations/detections do not contain the standard measurements provided by QuPath (fluorescence intensity, DAB value, etc.). To add these measurements on the target image, the function `Warpy.addIntensityMeasurements` can be called.
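The script above transfers objects from a single source entry (`sourceEntries[0]`). If several registered images contain objects you want to bring over, the same Warpy calls can be wrapped in a loop; here is a sketch reusing only the functions already shown above:
```
lang=groovy, name=transform_objects_all_sources.groovy
// Sketch: same logic as the script above, but looping over every candidate source entry
// instead of picking only the first one.
import qupath.ext.biop.warpy.*

def targetEntry = getProjectEntry()
def sourceEntries = Warpy.getCandidateSourceEntries( targetEntry )

sourceEntries.each { sourceEntry ->
    def transform = Warpy.getRealTransform( sourceEntry, targetEntry )
    def objectsToTransfer = Warpy.getPathObjectsFromEntry( sourceEntry )
    def transferredObjects = Warpy.transformPathObjects( objectsToTransfer, transform )
    Warpy.addIntensityMeasurements( transferredObjects, 1 )
    addObjects( transferredObjects )
}
fireHierarchyUpdate()
```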
== Troubleshooting
Something wrong in this documentation? Please post your issue or question on the [image.sc forum](https://forum.image.sc/).
== References (papers)
* [Fiji](http://www.nature.com/nmeth/journal/v9/n7/full/nmeth.2019.html)
* [QuPath](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5715110/)
* [BigWarp](http://ieeexplore.ieee.org/document/7493463/)
* [Elastix (1)](http://dx.doi.org/10.1109/TMI.2009.2035616) [Elastix (2)](http://dx.doi.org/10.3389/fninf.2013.00050)
* [BigDataViewer](http://www.nature.com/nmeth/journal/v12/n6/full/nmeth.3392.html)
== References (github)
The code used in this workflow is split into several parts: the [QuPath Warpy extension](https://github.com/BIOP/qupath-extension-warpy) on the QuPath side, and several repositories on the Fiji side. On the Fiji side, the main components are [bigdataviewer-playground](https://github.com/bigdataviewer/bigdataviewer-playground), for the management of multiple sources, and [bigdataviewer-biop-tools](https://github.com/BIOP/bigdataviewer-biop-tools), for various functionalities such as the QuPath bridge.
This page documents a way to register whole slide images in Fiji and analyze their result in QuPath. This workflow allows to register slides with transformations which are **more complex than affine transform**, by using the ImgLib2 library, elastix, and BigWarp. Reasonably, if the sample is not too deformed, registration at the cellular level can be achieved on the whole slide.
= Videos tutorials
These videos have been created on the 9th of April 2021, they could become outdated faster than the written documentation below!
[Playlist](https://www.youtube.com/watch?v=E4k2CGbvQTM&list=PL2PJpdamvntgXgF0YF_FKeYzn34RITfar)
* [Installation](https://www.youtube.com/watch?v=E4k2CGbvQTM)
* [Registration](https://www.youtube.com/watch?v=HAdktWdngWU)
* [Analysis](https://www.youtube.com/watch?v=uDglj4Lvxak)
= Installation
== ImageJ / Fiji
=== BigDataViewer-Playground update site ===
Enable the [BigDataViewer-Playground](https://imagej.github.io/plugins/bdv/playground) update site, then restart Fiji. This update site is accessible in the list of Fiji's update site under the name `BigDataViewer-Playground`.
=== Elastix - To enable automated registrations capabilities ===
** Download the [latest release of elastix for your OS](https://github.com/SuperElastix/elastix/releases/tag/5.0.1). This documentation has been tested for elastix 5.0.1.
** Unzip it somewhere convenient ( `C` drive on windows, in `Applications` for Mac )
**Windows **
For windows users, you also need to install [Visual C++ redistributable](https://support.microsoft.com/en-us/topic/the-latest-supported-visual-c-downloads-2647da03-1eea-4433-9aff-95f26a218cc0), you'll most probably need `vc_redist.x64.exe`
**Mac**
Fiji will be calling the elastix executables, which are recognized as ‘unknown developers’ by Mac OS. Thus you need to make security exceptions for both elastix and transformix (https://support.apple.com/en-hk/guide/mac-help/mh40616/mac) to avoid clicking indefinitely on the OS warning messages.
**Linux (not tested)**
Nothing particular should be required for linux system.
=== Indicate `elastix` and `transformix` executable location in Fiji ===
*** Execute {nav Plugins › BIOP › Set and Check Wrappers} then indicate the proper location of executable files {F18020664,width = 500}
*** This should show up in the ImageJ console : `[INFO] Transformix -> set :-) Elastix -> set :-)`
Once elastix is installed, you can run [the following script](https://gist.githubusercontent.com/NicoKiaru/b91f9f3f0069b765a49b5d4629a8b1c7/raw/571954a443d1e1f0597022f6c19f042aefbc0f5a/TestRegister.groovy) in Fiji to test elastix functionality. Save the file with a `.groovy` extension, open it Fiji, and run it.
== QuPath
* Install the [latest QuPath version (0.3.0)](https://qupath.github.io/)
* Install the [QuPath Warpy extension by following its readme](https://github.com/BIOP/qupath-extension-warpy).
* Install the [QuPath Image Combiner Warpy extension by following its readme](https://github.com/iwbh15/ImageCombinerWarpy_qp_v0.3).
= WSI registration - step by step procedure
To follow the WSI registration procedure, a demo dataset, consisting of a fluorescent image which can be registered to a RGB H-DAB image, can be [downloaded here](https://zenodo.org/record/4680455#.YHQk3D-xW00).
== Create your QuPath project
[Create first a QuPath project](https://qupath.readthedocs.io/en/latest/docs/tutorials/projects.html) containing all the slides that you want to register.
{F18022361, width = 400}
WARNING: Only the Bio-Formats image builder is supported on the Fiji side. Make sure to select `Bio-Formats builder` when importing your slides, and leave unchecked `Auto generate pyramids`.
If your image can't be loaded in QuPath using the `Bio-Formats builder`, you can convert your slides in `ome.tiff` format. Several options are available, for instance by using [bfconvert with Kheops](https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/imagej_tools/ijp-kheops/), or [bioformatsf2raw](https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/qupath/ome-tiff-conversion/) for a fast conversion.
WARNING: All files need to be properly calibrated (microns, millimeters, etc, but not pixels!). Check on the image tab of QuPath that you have a proper physical unit specified, and not pixels!
If the pixel size is wrong, you need to override it in QuPath and resave the project before resuming the workflow.
{F18022412,width = 400}
Save your project, and you are done for now on the QuPath side.
NOTE: You can let QuPath open while performing the registration on Fiji.
== Registration in Fiji
=== Open your QuPath project
In Fiji, open your QuPath project using {nav Plugins › BigDataViewer-Playground › BDVDataset › Open [QuPath Project] }. You can also directly type `QuPath` in Fiji's search bar to avoid struggling in Fiji's menu hierarchy.
* Select your qupath project file (usually named `project.qpproj`).
* Let the default options but make sure to select **MILLIMETER** in the `physical units of the dataset` field
{F18022564,width = 400}
NOTE: the `physical units of the dataset` field actually indicates how you want to import your dataset. Bio-Formats takes care of converting the unit of the acquisition into millimeters. This can be achieved only if the image is correctly calibrated initially: bio-formats knows how to converts from microns to millimeters, or from angström to millimeters, but it can't know how to transform from pixels to millimeters.
NOTE: this WSI registration workflow hides many registration parameters by making use of the proper calibration in physical units, and by assuming it targets a cellular resolution level (do not expect registration precision at 50 nm resolution, unless you manually correct it with BigWarp).
After you have opened your Project, a window called `BDV Sources` should pop-up. If you double click on the `Sources` node, you should be able to browse the hierarchy and see the "sources" contained in your QuPath project. Note that the fluorescent channels have been splitted into separated sources. In the demo file, you get a window like this:
{F22086972,width =250}
* `DAB.tif-ch0` is the RGB DAB image
* `TileScan_001_Merging001-ch0` is the DAPI fluorescent channel
* `TileScan_001_Merging001-ch1` is the EdU fluorescent channel, which staining is similar to DAB
For the demo dataset registration, you can choose the DAB image as the fixed image, and the DAPI channel as the moving image.
NOTE: Navigating within BigDataViewer requires a bit of experience. In 2D, the minimal commands to know are written below:
- **hold and drag right-click**: Pan;
- **mouse wheel (or up / down key)**: zoom in and out;
- **`shift` modifier key**: zoom in or out faster,
- **`ctrl` modifier key**: more precise zoom.
- You'll soon notice that holding **`left-click`** rotates the view. To go back to the default rotation, click **`shift+Z`**
=== Creating a Warpy Registration
You can start the registration wizard by clicking {nav Plugins › BigDataViewer - Playground › Sources › Register › QuPath - Create Warpy Registration} or just type `Warpy` in the search bar and select the correct command when it shows up.
First, select the DAB source as the fixed source, and the DAPI channel as the moving source that will be used for the registration as indicated in the image below:
{F18024767, width=700}
Another wizard window shows up:
{F22087142, width=700}
The successive steps (0,1,2,3,4) of registration happens consecutively, all are optional. It is advised to keep the first two checkboxes checked to remove most of the intitial XYZ offsets which could be stored in the files. The initial offset is due to bioformats metadata usually storing the stage position during acquisition.
NOTE: If you check `show results of automated registrations`, ImageJ1 images will be created and show the different images which are effectively sent to elastix for automated registration. It's a good way to check the intermediate steps and see what could have gone wrong. However, the processing will be slower because each step happen sequentially instead of in a parallel fashion.
In the rest of this section, we assume all registration steps have been checked. If not, some of the details below will not be asked.
==== Step 1 - Set Bdv window to cover both moving and fixed slide
{F18024883, width=250}
During this step, you just need to have a bdv window open and selected and all the area of the slides you want to register should be visible, as in the image below:
{F18024899, width=700}
Click `OK` to go to the second step
==== Step 2 - Define a rectangular ROI for the registration
TOFIX: It's not clearly visible, but there's actually a message overlaid in BDV: `Select a rectangular region for the region you want to register`. TODO : Github Issue
{F18024912, width=700}
So just draw a rectangle spanning the region you want to register, by drawing a rectangle on bdv, holding mouse left button:
{F18024943, width=700}
==== Step 3 - Define position of landmarks that should be automatically registered
TOFIX : It's not clearly visible, but there's actually a message overlaid in BDV: `Select the position of the landmarks`
{F18024960, width=700}
Using the mouse left button, position points that will be used to locally correct the warping of the moving slide. You can concentrate the landmark on the regions you are more interested in. Here's an example of landmarks positionning:
{F18024977, width=700}
The number of necessary landmarks is not obvious to know in advance. You need to place them however where common structures are easily identifiable in both the moving and fixed image, a bit like how you would place focus points during your acquisition.
NOTE: To go to the next step, just press `Escape`. This part of the UI needs improvement (TODO : github issue)
NOTE: The automated registration should take about 20 seconds. If it takes longer, check that there's no error message in the ImageJ console.
==== Step 4 - Manually correct landmarks
If you have selected `Manual spline registration (BigWarp)`, you will now be able to manually investigate and correct landmarks that have been automatically registered, and also add new landmarks to the slides, if you want to adjust more precisely a particular region. The interface used is directly the one of [BigWarp](https://imagej.github.io/plugins/bigwarp), so please check the documentation of BigWarp itself in order to edit landmarks.
Here's a convenient way to perform this step:
* First, ignore the moving image window and increase the size of the BigWarp fixed image.
{F18025076, width=700}
* press `F1` for help window
* press `F` to toggle ON and OFF the registered moving image (F stands for fused)
* zoom (mouse wheel) and pan (drag left? or right? TOFIX mouse button) to investigate the registration quality
* press `ctrl+D` to move to the next landmark
To correct a landmark position:
* press `space` to enter the landmark mode
* carefully select and drag a landmark to modify its position. Live visualization of the transformed image helps you positioning it correctly
* press `space` to exit landmark mode
* navigate and repeat for each landmark that needs correction
To add another landmark:
* in landmark mode, press `ctrl + left click` to pin a new landmark on both the moving and fixed image at the current mouse location. It can then be dragged.
Once you are satisfied with the registration, click `OK`
{F18025114, width=300}
==== Step 5 - Succesful registration message
If all went smoothly, you should get this message:
{F18025129, width=300}
This means that the result of the registration has been stored as a file into your QuPath project. It can then be used from within QuPath to transfer annotations and or detections from one slide to another one, as explained in the next section.
NOTE: the transformation is stored as a json file in the data entry folder of the TOFIX fixed?moving? image. It is named by convention `transform_{i}_{j}.json` where `i` and `j` are the index of the fixed and the the moving image in the QuPath project.
== Analysis in QuPath
From within QuPath, annotations and or detections can be transfered within registered images, one way or another. Indeed, provided transformations are regular enough, they are invertible.
To transfer annotations or detections from one image to another:
* using the procedure of your choice (cell detection plugin, stardist, manual annotation, etc.), create annotations or detections on the image of your choice (for instance the moving image).
* move to the other registered image (for instance the fixed image).
* you can then execute the following script {nav Automate > User scripts... > New Script... }:
```
lang=javascript, name=transform_objects.groovy
// Transfer PathObjects from another image that contains a serialized RealTransform
// result from the BIOP WSI Aligner (See: https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/wsi_registration_fjii_qupath/ )
// The current Image Entry that we want to transfer PathObjects to
def targetEntry = getProjectEntry()
// Locate candidate entries can can be transformed into the source entry
def sourceEntries = Warpy.getCandidateSourceEntries( targetEntry )
// Choose one source or transfer from all of them with a for loop
def sourceEntry = sourceEntries[0]
// Recover the RealTransform that was put there by WSI Aligner
def transform = Warpy.getRealTransform( sourceEntry, targetEntry )
// Recover the objects we wish to transform into the target image
// This step ensures you can have control over what gets transferred
def objectsToTransfer = Warpy.getPathObjectsFromEntry( sourceEntry )
// Finally perform the transform of each PathObject
def transferedObjects = Warpy.transformPathObjects(objectsToTransfer, transform)
// Convenience method to add intensity measurements. Does not have to do with transforms directly.
// This packs the addIntensityMeasurements in such a way that it works for RGB and Fluoresence images
Warpy.addIntensityMeasurements(transferedObjects, 1)
// Finally, add the transformed objects to the current image and update the display
addObjects(transferedObjects)
fireHierarchyUpdate()
// Necessary import, requires qupath-extenstion-warpy, see: https://github.com/BIOP/qupath-extension-warpy
import qupath.ext.biop.warpy.*
```
The above scripts consists of two parts:
* `TransformHelper.transferMatchingAnnotationsToImage` looks through all images of the QuPath project if there are transformations files will allow to transfer annotations and detections from one image to the current (target) one, and then performs the transfer.
* transfered annotations/detections does not contain the standard measurements provided by QuPath (fluorescent intensity / DAB value, etc.) In order to add the measurements on the target image, the function `TransformHelper.addIntensityMeasurements(getAnnotationObjects(), server, 1, true)` can be called
== Troubleshooting
Something's wrong in this documentation ? Please post your issue or question in the [image.sc forum](image.sc forum).
== References (papers)
* [Fiji](http://www.nature.com/nmeth/journal/v9/n7/full/nmeth.2019.html)
* [QuPath](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5715110/)
* [BigWarp](http://ieeexplore.ieee.org/document/7493463/)
* [Elastix (1)](http://dx.doi.org/10.1109/TMI.2009.2035616) [Elastix (2)](http://dx.doi.org/10.3389/fninf.2013.00050)
* [BigDataViewer](http://www.nature.com/nmeth/journal/v12/n6/full/nmeth.3392.html)
== References (github)
The code to perform this workflow is splitted in several parts:
[Qupath extensions](https://github.com/BIOP/qupath-biop-extensions) and several repositories for the fiji side. On the fiji side, the main components are [Bigdataviewer-playground](https://github.com/bigdataviewer/bigdataviewer-playground) for the management of multiple sources and [Bigdataviewer-biop-tools](https://github.com/BIOP/bigdataviewer-biop-tools), for various functionalities, for instance the QuPath bridgeThis page documents a way to register whole slide images in Fiji and analyze their result in QuPath. This workflow allows to register slides with transformations which are **more complex than affine transform**, by using the ImgLib2 library, elastix, and BigWarp. Reasonably, if the sample is not too deformed or damaged, registration at the cellular level can be achieved on the whole slide.
= Videos tutorials
These videos have been created on the 9th of April 2021, updated ones will come (TODO)!
[Playlist](https://www.youtube.com/watch?v=E4k2CGbvQTM&list=PL2PJpdamvntgXgF0YF_FKeYzn34RITfar)
* [Installation](https://www.youtube.com/watch?v=E4k2CGbvQTM)
* [Registration](https://www.youtube.com/watch?v=HAdktWdngWU)
* [Analysis](https://www.youtube.com/watch?v=uDglj4Lvxak)
= Installation
== ImageJ / Fiji
=== BigDataViewer-Playground update site ===
Enable the [BigDataViewer-Playground](https://imagej.github.io/plugins/bdv/playground) update site, then restart Fiji. This update site is accessible in the list of Fiji's update site under the name `BigDataViewer-Playground`.
=== Elastix - To enable automated registrations capabilities ===
** Download the [latest release of elastix for your OS](https://github.com/SuperElastix/elastix/releases/tag/5.0.1). This documentation has been tested for elastix 5.0.1.
** Unzip it somewhere convenient ( `C` drive on windows, in `Applications` for Mac )
**Windows **
For windows users, you also need to install [Visual C++ redistributable](https://support.microsoft.com/en-us/topic/the-latest-supported-visual-c-downloads-2647da03-1eea-4433-9aff-95f26a218cc0), you'll most probably need `vc_redist.x64.exe`
**Mac**
Fiji will be calling the elastix executables, which are recognized as ‘unknown developers’ by Mac OS. Thus you need to make security exceptions for both elastix and transformix (https://support.apple.com/en-hk/guide/mac-help/mh40616/mac) to avoid clicking indefinitely on the OS warning messages.
**Linux (not tested)**
Nothing particular should be required for linux system.
=== Indicate `elastix` and `transformix` executable location in Fiji ===
*** Execute {nav Plugins › BIOP › Set and Check Wrappers} then indicate the proper location of executable files {F18020664,width = 500}
*** This should show up in the ImageJ console : `[INFO] Transformix -> set :-) Elastix -> set :-)`
Once elastix is installed, you can run [the following script](https://gist.githubusercontent.com/NicoKiaru/b91f9f3f0069b765a49b5d4629a8b1c7/raw/571954a443d1e1f0597022f6c19f042aefbc0f5a/TestRegister.groovy) in Fiji to test elastix functionality. Save the file with a `.groovy` extension, open it Fiji, and run it.
== QuPath
* Install the [latest QuPath version (0.3.0)](https://qupath.github.io/)
* Install the [QuPath Warpy extension by following its readme](https://github.com/BIOP/qupath-extension-warpy).
* Install the [QuPath Image Combiner Warpy extension by following its readme](https://github.com/iwbh15/ImageCombinerWarpy_qp_v0.3).
= WSI registration - step by step procedure
To follow the WSI registration procedure, a demo dataset, consisting of a fluorescent image which can be registered to a RGB H-DAB image, can be [downloaded from Zenodo](https://doi.org/10.5281/zenodo.5674521).
== Create your QuPath project
[Create first a QuPath project](https://qupath.readthedocs.io/en/latest/docs/tutorials/projects.html) containing all the slides that you want to register.
{F18022361, width = 400}
WARNING: Only the Bio-Formats image builder is supported on the Fiji side. Make sure to select `Bio-Formats builder` when importing your slides, and leave unchecked `Auto generate pyramids`.
If your image can't be loaded in QuPath using the `Bio-Formats builder`, you can convert your slides to the `ome.tiff` format. Several options are available, for instance [bfconvert with Kheops](https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/imagej_tools/ijp-kheops/), or [bioformats2raw](https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/qupath/ome-tiff-conversion/) for a fast conversion.
WARNING: All files need to be properly calibrated (microns, millimeters, etc., but not pixels!). Check in the Image tab of QuPath that a proper physical unit is specified, and not pixels!
If the pixel size is wrong, you need to override it in QuPath and resave the project before resuming the workflow.
{F18022412,width = 400}
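If you prefer to fix the calibration by scripting rather than through the user interface, the following minimal sketch relies on QuPath's built-in `setPixelSizeMicrons` scripting command; the value `0.325` is only an example and must be replaced with the true pixel size of your slide:
```
lang=javascript, name=override_pixel_size.groovy
// Run from the QuPath script editor on the currently open image.
// 0.325 is an example value in microns - replace it with the true pixel size of your slide.
setPixelSizeMicrons(0.325, 0.325)
// Remember to save the image data afterwards so the updated calibration is stored in the project.
```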
Save your project, and you are done for now on the QuPath side.
NOTE: You can leave QuPath open while performing the registration in Fiji.
== Registration in Fiji
=== Open your QuPath project
In Fiji, open your QuPath project using {nav Plugins › BigDataViewer-Playground › BDVDataset › Open [QuPath Project] }. You can also directly type `QuPath` in Fiji's search bar to avoid struggling in Fiji's menu hierarchy.
* Select your QuPath project file (usually named `project.qpproj`).
* Leave the default options, but make sure to select **MILLIMETER** in the `physical units of the dataset` field.
{F22107102,width = 400}
NOTE: the `physical units of the dataset` field actually indicates how you want to import your dataset. Bio-Formats takes care of converting the unit of the acquisition into millimeters. This can only be achieved if the image is correctly calibrated initially: Bio-Formats knows how to convert from microns to millimeters, or from ångströms to millimeters, but it cannot know how to convert from pixels to millimeters.
NOTE: this WSI registration workflow hides many registration parameters by making use of the proper calibration in physical units, and by assuming it targets a cellular resolution level (do not expect registration precision at 50 nm resolution, unless you manually correct it with BigWarp).
After you have opened your project, a window called `BDV Sources` should pop up. If you double-click on the `Sources` node, you can browse the hierarchy and see the "sources" contained in your QuPath project. Note that the fluorescent channels have been split into separate sources. With the demo dataset, you get a window like this:
{F22107174,width =300}
* `DAB.ome.tiff - DAB.tif-ch0` is the RGB DAB image
* `Fluo.lif - TileScan_001_Merging001-ch0` is the DAPI fluorescent channel
* `Fluo.lif - TileScan_001_Merging001-ch1` is the EdU fluorescent channel, whose staining is similar to DAB
For the demo dataset registration, the DAB image is used as the fixed image, and the DAPI channel as the moving image.
NOTE: You can give the images nicer names in QuPath before opening the project in Fiji.
NOTE: Navigating within BigDataViewer requires a bit of experience. In 2D, the minimal commands to know are listed below:
- **hold and drag right-click**: pan
- **mouse wheel (or up / down key)**: zoom in and out
- **`shift` modifier key**: zoom in or out faster
- **`ctrl` modifier key**: more precise zoom
- You will soon notice that holding **`left-click`** rotates the view. To go back to the default rotation, press **`shift+Z`**
=== Creating a Warpy Registration
You can start the registration wizard by clicking {nav Plugins › BigDataViewer - Playground › Sources › Register › QuPath - Create Warpy Registration} or just type `Warpy` in the search bar and select the correct command when it shows up.
First, select the DAB source as the fixed source, and the DAPI channel as the moving source that will be used for the registration as indicated in the image below:
{F22107203, width=400}
Another wizard window shows up:
{F22107220, width=600}
The successive registration steps (0, 1, 2, 3, 4) happen consecutively; all of them are optional. It is advised to keep the first two boxes checked to remove most of the initial XYZ offsets which could be stored in the files. The initial offset is usually due to Bio-Formats metadata storing the stage position during acquisition.
NOTE: If you check `Show results of automated registrations`, ImageJ1 images will be created to show the images which are effectively sent to elastix for automated registration. It is a good way to visualize intermediate registration steps and check what could have gone wrong. However, processing will be slower, because each registration then runs sequentially instead of in parallel.
In the rest of this documentation, we assume all registration steps have been checked. The demo dataset requires all these steps, but a small image may be sufficiently well registered with a single affine transform.
The extra parameters located below the chosen steps will be explained when needed in the registration steps.
==== Step 1 - Manual rigid registration
Just after clicking `OK`, you will get a BigDataViewer window similar to this one:
{F22107336, width=650}
You have three cards on the right part of the window:
1 - `Sources`: this card can be used to adjust the display settings of the sources (min / max and color for fluorescent images).
After some adjustment, the fluorescence image can be made brighter (max = 50) and the DAB a little bit dimmer (max = 400):
{F22107367, width =650}
NOTE: If you register two RGB images, it is important to increase the maximum display values for both images (for instance, from 256 to 512). Otherwise the overlaid images will appear fully white.
2 - `WSI Registration Wizard`: this card displays the registration step we are in, some navigation hints, and some convenience actions: restoring the initial view, performing auto-scale on the images, and showing / hiding the fixed or the moving image.
3 - The third card is specific to the current wizard step.
In this step, you will modify the location of the fluorescent image in order to put it approximately at its proper location. If you zoom in, you will see an obvious shift, which can be corrected with a manual rigid registration:
{F22107408, width =650}
To correct it:
* Click the button `Start manual rigid registration` in the third card.
* Pan the view until the two images are approximately aligned (some rotation and zoom adjustment can also be performed, but a translation is enough in the example):
{F22107427, width =650}
* Finally click `Confirm transformation` to go to the next step of the wizard.
NOTE: If you make a mistake while moving the images in the registration, you can click `Cancel manual registration` to restore the initial state.
==== Step 2 - Automated affine registration - Define rectangular ROI
{F22107511, width=650}
In this step, you need to define the rectangular region that will be used to perform a first coarse affine registration. By default, a yellow rectangle surrounding the biggest image is set. However, in the demo case, this rectangle is too large: a large portion of the DAB image is not covered by the fluorescent image. To set a better rectangular ROI, you can directly draw a rectangle in the viewer by dragging the mouse while holding the left button:
{F22107536, width=650}
If you are not satisfied with the rectangle, simply draw a new one; it replaces the previous rectangle. Then go to the next step by clicking `Confirm rectangle`.
NOTE: You can deactivate the rectangle selection mode and toggle the navigation mode by clicking `Enable navigation`. You can also restore the initial rectangle with `Restore initial rectangle` if needed.
For the coarse affine registration with elastix, both the fixed and moving images will be resampled over this rectangular selection. The resampling is performed with a pixel size equal to the value specified in the `Pixel size for coarse registration in microns` parameter. By default, this value is set to 10 microns.
WARNING: The coarse affine registration is not performed directly when you click the `Confirm rectangle` button. Instead, the patch locations for the automated spline registration are defined first, before any automated registration is started. This means that, when selecting patches for the spline registration (next step), you should consider the fixed image rather than the moving image, since the moving image will be transformed before each patch is acquired.
==== Step 3 - Automated spline registration - Define landmarks
{F22107653, width=700}
Using the left mouse button, position points that will be used to locally correct the warping of the moving slide. You can concentrate the landmarks on the regions you are most interested in. Here is an example of landmark positioning:
{F22107659, width=700}
The number of necessary landmarks is hard to know in advance. In any case, place them where common structures are easily identifiable in both the moving and the fixed image, a bit like how you would place focus points during an acquisition.
Tips:
* You can move each landmark by dragging its middle point after it has been positioned on the viewer ( the middle point increases in size when it can be dragged )
* You can clear all landmarks and restart positioning landmarks by clicking `Clear points`
* You can restore the navigation mode ( `Enable navigation` ) to zoom in on the slide and position the landmarks more precisely. Do not forget to restore the point selection mode ( `Enable point selection` ) if you want to drag patches or create new ones (even when adding the grid).
* It is possible to place patches automatically on a grid over the rectangular region defined in the previous step by clicking `Add landmark grid`. You can set the spacing between the patches to allow for some overlap or, on the contrary, to leave some space between them:
{F22107714, width=350}
{F22107731, width=350}
{F22107736, width=350}
It is not advised (but possible) to add more than a few hundred landmarks.
NOTE: Overlapping patches are not an issue. In fact, after the patch registration, only the location of each patch's central point is kept to interpolate the transformation over the whole slide.
NOTE: You can change the size of the patches by changing the value `Patch size for precision patch registration` in the initial step of the wizard. Setting it to 200 microns leads to this automated grid placement (with a spacing of 300 microns between patches):
{F22107782, width=350}
It is advised to shift the landmarks that fall where little structure is present:
{F22107786, width=350}
By default, for each patch, the fixed and moving images are resampled with a pixel size of 1 micron. This value can be modified at the beginning of the wizard if necessary (`Pixel size for precise patch registration in micron`).
Click `Confirm points` to start the registration. The ImageJ log window shows the progress of the registration:
It typically takes a minute to register the demo dataset.
{F22107797, width=350}
==== Step 4 - Manually correct landmarks location with BigWarp
If you have selected `4 - Manual spline registration (BigWarp)`, you can now manually inspect and correct the landmarks that have been automatically registered, and also **add new landmarks** to the slides if you want to adjust a particular region more precisely. The interface is directly the one of [BigWarp](https://imagej.github.io/plugins/bigwarp), so please check the BigWarp documentation itself to learn how to edit landmarks.
Here's a convenient way to perform this step:
* First, ignore the moving image window and increase the size of the BigWarp fixed image.
{F18025076, width=700}
* press `F1` to show the help window
* press `F` to toggle the registered moving image ON and OFF (F stands for fused)
* zoom (mouse wheel) and pan (drag with the right mouse button) to inspect the registration quality
* press `ctrl+D` to move to the next landmark
WARNING: The DAB image has its min and max display values reset to 0 and 65535. Re-set these values to correct ones using the `Sources` card.
To correct a landmark position:
* press `space` to enter the landmark mode
* carefully select and drag a landmark to modify its position. Live visualization of the transformed image helps you position it correctly
* press `space` to exit landmark mode
* navigate and repeat for each landmark that needs correction
To add another landmark:
* in landmark mode, press `ctrl + left click` to pin a new landmark on both the moving and fixed image at the current mouse location. It can then be dragged.
Once you are satisfied with the registration, click `OK`
{F18025114, width=300}
You can delete landmarks in the landmark table if necessary:
{F22107830, width=300}
Press `Click to finish` to end the wizard and save the transformation to QuPath.
==== Step 5 - Successful registration message
If all went smoothly, you should get this message in the ImageJ log window:
```
Transformation file successfully written to QuPath project: Path\id_moving\transform_moving_fixed.json
```
This means that the result of the registration has been stored as a file in your QuPath project. It can then be used from within QuPath to transfer annotations and/or detections from one slide to another, as explained in the next section.
NOTE: the transformation is stored as a json file in the data entry folder of the moving image. It is named by convention `transform_{i}_{j}.json`, where `i` and `j` are the indices of the moving and the fixed image in the QuPath project.
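If you want to verify from within QuPath which transformation files your project already contains, the sketch below (assuming the default project layout, where each image entry has a sub-folder inside the project's `data` folder) lists every `transform_*.json` file using QuPath's standard `buildFilePath` and `PROJECT_BASE_DIR` scripting helpers:
```
lang=javascript, name=list_transform_files.groovy
// List all Warpy transformation files stored in the current QuPath project.
// Assumes the default project layout: <project>/data/<entry id>/transform_<i>_<j>.json
def dataDir = new File(buildFilePath(PROJECT_BASE_DIR, "data"))
dataDir.eachFileRecurse { f ->
    if (f.name.startsWith("transform_") && f.name.endsWith(".json"))
        println f
}
```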
== Analysis in QuPath
From within QuPath, annotations and/or detections can be transferred between registered images, in either direction. Indeed, provided the transformations are regular enough, they are invertible.
To transfer annotations or detections from one image to another:
* using the procedure of your choice (cell detection plugin, StarDist, manual annotation, etc.), create annotations or detections on the image of your choice (for instance the moving image).
* move to the other registered image (for instance the fixed image).
* you can then execute the following script {nav Automate › User scripts... › New Script... }:
```
lang=javascript, name=transform_objects.groovy
// Transfer PathObjects from another image that contains a serialized RealTransform
// result from the BIOP WSI Aligner (See: https://c4science.ch/w/bioimaging_and_optics_platform_biop/image-processing/wsi_registration_fjii_qupath/ )
// The current Image Entry that we want to transfer PathObjects to
def targetEntry = getProjectEntry()
// Locate candidate source entries that can be transformed into the target entry
def sourceEntries = Warpy.getCandidateSourceEntries( targetEntry )
// Choose one source or transfer from all of them with a for loop
def sourceEntry = sourceEntries[0]
// Recover the RealTransform that was put there by WSI Aligner
def transform = Warpy.getRealTransform( sourceEntry, targetEntry )
// Recover the objects we wish to transform into the target image
// This step ensures you can have control over what gets transferred
def objectsToTransfer = Warpy.getPathObjectsFromEntry( sourceEntry )
// Finally perform the transform of each PathObject
def transferedObjects = Warpy.transformPathObjects(objectsToTransfer, transform)
// Convenience method to add intensity measurements. Does not have to do with transforms directly.
// This packs the addIntensityMeasurements in such a way that it works for RGB and Fluorescence images
Warpy.addIntensityMeasurements(transferedObjects, 1)
// Finally, add the transformed objects to the current image and update the display
addObjects(transferedObjects)
fireHierarchyUpdate()
// Necessary import, requires qupath-extension-warpy, see: https://github.com/BIOP/qupath-extension-warpy
import qupath.ext.biop.warpy.*
```
The above script consists of two parts:
* `Warpy.getCandidateSourceEntries` looks through all images of the QuPath project for transformation files that allow transferring annotations and detections into the current (target) image; `Warpy.getRealTransform` and `Warpy.transformPathObjects` then perform the transfer.
* Transferred annotations/detections do not contain the standard measurements provided by QuPath (fluorescence intensity, DAB value, etc.). In order to add these measurements on the target image, the function `Warpy.addIntensityMeasurements` can be called.
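If several registered images contain objects that you would like to bring over, a small variation of the script above loops over all candidate source entries instead of taking only the first one; it reuses only the Warpy functions already shown:
```
lang=javascript, name=transform_objects_all_sources.groovy
// Variation of the script above: transfer objects from ALL candidate source entries
// into the current (target) image, instead of only the first one.
import qupath.ext.biop.warpy.*

def targetEntry = getProjectEntry()

for (sourceEntry in Warpy.getCandidateSourceEntries(targetEntry)) {
    def transform = Warpy.getRealTransform(sourceEntry, targetEntry)
    def objectsToTransfer = Warpy.getPathObjectsFromEntry(sourceEntry)
    def transferredObjects = Warpy.transformPathObjects(objectsToTransfer, transform)
    Warpy.addIntensityMeasurements(transferredObjects, 1)
    addObjects(transferredObjects)
}
fireHierarchyUpdate()
```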
== Troubleshooting
Something's wrong in this documentation? Please post your issue or question on the [image.sc forum](https://forum.image.sc).
== References (papers)
* [Fiji](http://www.nature.com/nmeth/journal/v9/n7/full/nmeth.2019.html)
* [QuPath](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5715110/)
* [BigWarp](http://ieeexplore.ieee.org/document/7493463/)
* [Elastix (1)](http://dx.doi.org/10.1109/TMI.2009.2035616) [Elastix (2)](http://dx.doi.org/10.3389/fninf.2013.00050)
* [BigDataViewer](http://www.nature.com/nmeth/journal/v12/n6/full/nmeth.3392.html)
== References (github)
The code to perform this workflow is split into several parts:
The [QuPath Warpy extension](https://github.com/BIOP/qupath-extension-warpy) on the QuPath side, and several repositories on the Fiji side. On the Fiji side, the main components are [Bigdataviewer-playground](https://github.com/bigdataviewer/bigdataviewer-playground) for the management of multiple sources, and [Bigdataviewer-biop-tools](https://github.com/BIOP/bigdataviewer-biop-tools) for various additional functionalities, for instance the QuPath bridge.