Noise2Filter: Fast, Self-Supervised Learning and Real-Time Reconstruction for 3D Computed Tomography

Published in Machine Learning: Science and Technology on 01 December 2020.

Summary

At X-ray beamlines of synchrotron light sources, the achievable time resolution for 3D tomographic imaging of the interior of an object has been reduced to a fraction of a second, enabling rapidly changing structures to be examined. The associated data acquisition rates require sizable computational resources for reconstruction, so full 3D reconstruction of the object is usually performed only after the scan has completed. Quasi-3D reconstruction, in which several interactively chosen 2D slices are computed instead of a full 3D volume, has been shown to be significantly more efficient and can enable real-time reconstruction and visualization of the interior. However, quasi-3D reconstruction relies on filtered backprojection-type algorithms, which are typically sensitive to measurement noise. To overcome this issue, we propose Noise2Filter, a learned filter method that can be trained using only the measured data and requires no additional training data. The method combines quasi-3D reconstruction, learned filters, and self-supervised learning into a tomographic reconstruction method that can be trained in under a minute and evaluated in real time. We show limited loss of accuracy compared to training with additional training data, and improved accuracy compared to standard filter-based methods.
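
The core idea can be illustrated with a minimal 2D sketch: split the noisy projection angles into disjoint subsets, then fit the coefficients of a low-parameter reconstruction filter so that the filtered backprojection of one subset matches an ordinary FBP reconstruction of the complementary subset. Because the noise in the two subsets is independent, the fitted filter suppresses noise rather than reproducing it. The sketch below uses scikit-image's 2D parallel-beam radon/iradon (a recent scikit-image is assumed) as a stand-in for the paper's quasi-3D pipeline; the even/odd angle split, the exponentially binned symmetric filter basis, and all function and variable names are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import convolve1d
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    rng = np.random.default_rng(seed=0)

    # Simulate a noisy parallel-beam scan of a small phantom.
    phantom = rescale(shepp_logan_phantom(), 0.25)        # 100x100 test image
    angles = np.linspace(0.0, 180.0, 360, endpoint=False)
    sino = radon(phantom, theta=angles)                   # shape: (detector, angle)
    noisy = sino + rng.normal(scale=0.1 * sino.max(), size=sino.shape)

    # Split the angles into two halves with statistically independent noise.
    even, odd = np.arange(0, 360, 2), np.arange(1, 360, 2)

    # Symmetric filter basis with exponentially growing bins (an assumed
    # low-parameter form, so only a handful of coefficients must be learned).
    n_det = sino.shape[0]
    c = n_det // 2 - 1                                    # center tap index
    edges = [0, 1]
    while edges[-1] < c:
        edges.append(min(2 * edges[-1], c))
    basis = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        b = np.zeros(2 * c + 1)
        offsets = np.arange(lo, hi)
        b[c + offsets] = 1.0
        b[c - offsets] = 1.0
        basis.append(b)

    def backproject(sinogram, theta, filt=None):
        # Optionally convolve each projection with a 1D filter, then backproject.
        if filt is not None:
            sinogram = convolve1d(sinogram, filt, axis=0, mode='constant')
        return iradon(sinogram, theta=theta, filter_name=None)

    # Self-supervised target: ordinary ramp-filtered FBP of the *other* half.
    target = iradon(noisy[:, odd], theta=angles[odd])

    # The reconstruction is linear in the filter coefficients, so fitting the
    # filter is a small linear least-squares problem (one column per basis bin).
    columns = [backproject(noisy[:, even], angles[even], b).ravel() for b in basis]
    A = np.stack(columns, axis=1)
    weights, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)

    # Apply the learned filter to all angles for the final reconstruction.
    learned_filter = sum(w * b for w, b in zip(weights, basis))
    recon = backproject(noisy, angles, learned_filter)
    print("RMSE vs. phantom:", np.sqrt(np.mean((recon - phantom) ** 2)))

At this toy scale the least-squares fit takes seconds; the paper reports training in under a minute and real-time evaluation in the full quasi-3D setting, where the same few filter coefficients are reused for every slice.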

Citation

For more details and additional results, read the full paper.

  @Article{lagerwerf-2020-noise2,
  author          = {Lagerwerf, Marinus J. and Hendriksen, Allard A. and
                  Buurlage, Jan-Willem and Batenburg, K. Joost},
  title           = {{Noise2Filter}: Fast, Self-Supervised Learning and
                  Real-Time Reconstruction for {3D} Computed Tomography},
  journal         = {Machine Learning: Science and Technology},
  volume          = {2},
  number          = {1},
  pages           = {015012},
  year            = {2020},
  doi             = {10.1088/2632-2153/abbd4d},
  url             = {https://doi.org/10.1088/2632-2153/abbd4d},
  issn            = {2632-2153},
  month           = {Dec},
  publisher       = {IOP Publishing},
}