Digitized histological diagnosis is in increasing demand. However, color variations caused by factors such as differences in staining protocols and scanners impose obstacles to the diagnosis process. Stain color variation is a well-defined problem with many proposed solutions, most of which depend heavily on a reference template slide. We propose a deep-learning solution inspired by cycle consistency that is trained end-to-end, eliminating the need for an expert to pick a representative reference slide. Our approach showed superior results, both quantitatively and qualitatively, against state-of-the-art methods. We further validated our method on a clinical use case, namely breast cancer tumor classification, showing a 16% increase in AUC.
Our framework, as depicted in the figure, transfers the H&E stain appearance between different scanners, i.e., from the Hamamatsu (H) scanner to the Aperio (A) scanner, without the need for paired data from both domains. The adversarial loss matches the distribution of the generated images to that of the target domain (forward cycle) and matches the distribution of the generated target-domain images back to the source domain (backward cycle). The cycle-consistency loss ensures that generated images preserve the structure of the source domain. This loss is applied in both the forward and backward cycles to ensure stability.
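The cycle-consistency objective described above can be illustrated with a minimal sketch. This is not the paper's implementation: the two "generators" `G_HA` and `G_AH` are stand-in linear color maps (the actual method uses deep CNNs trained adversarially), the weight `lam` is an assumed hyperparameter, and the patches are random toy data. The sketch only shows how the forward (H→A→H) and backward (A→H→A) reconstruction errors combine into one L1 loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generators": linear color maps standing in for the CNN generators.
# M_AH is the exact inverse of M_HA, so the cycles reconstruct perfectly
# and the loss below is ~0 -- the ideal a trained CycleGAN approximates.
M_HA = rng.normal(size=(3, 3))      # Hamamatsu -> Aperio (toy)
M_AH = np.linalg.inv(M_HA)          # Aperio -> Hamamatsu (toy)

def G_HA(x):
    # Apply the 3x3 color map to every pixel of an (H, W, 3) patch.
    return np.einsum('ij,hwj->hwi', M_HA, x)

def G_AH(x):
    return np.einsum('ij,hwj->hwi', M_AH, x)

def cycle_consistency_loss(x_H, x_A, lam=10.0):
    """Mean L1 reconstruction error over both cycles, weighted by lam."""
    forward = np.abs(G_AH(G_HA(x_H)) - x_H).mean()   # H -> A -> H
    backward = np.abs(G_HA(G_AH(x_A)) - x_A).mean()  # A -> H -> A
    return lam * (forward + backward)

x_H = rng.random((64, 64, 3))   # toy Hamamatsu patch
x_A = rng.random((64, 64, 3))   # toy Aperio patch
loss = cycle_consistency_loss(x_H, x_A)
```

Because the toy backward map is the exact inverse of the forward map, `loss` is numerically near zero; during training, minimizing this term pushes the learned generator pair toward the same invertibility, which is what preserves tissue structure while the adversarial losses change the stain appearance.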
We presented StainGAN, a novel method for the stain normalization task. Our experiments revealed that our method significantly outperforms the state of the art. The visual results of the different methods can be seen in Fig. 1, which clearly shows that images normalized with StainGAN are very similar to the ground truth. Furthermore, StainGAN has been validated as a pre-processing step in a clinical use case, namely tumor classification, showing superior performance. Moreover, the processing time of our method is on par with Macenko's method, as reported in Table 2. We believe that end-to-end learning-based approaches will eventually overtake the classic stain normalization methods.