Researchers From aetherAI Use Deep Learning To Develop An Annotation-Free Whole-Slide Training Strategy

This article is written as a summary by Marktechpost Staff based on the research paper 'Identification of nodal micrometastasis in colorectal cancer using deep learning on annotation-free whole-slide images'. All credit for this research goes to the researchers of this project. Check out the paper and the reference blog.


The paper “Identification of nodal micrometastasis in colorectal cancer using deep learning on annotation-free whole-slide images” was published in the peer-reviewed journal Modern Pathology by aetherAI, Asia’s leading medical image solution provider specializing in digital pathology and medical imaging AI. The aetherAI algorithm performed well in detecting macrometastasis and micrometastasis at the slide level, with areas under the receiver operating characteristic curve (AUC) of 0.9993 and 0.9956, respectively. For the first time, the company’s work shows that micrometastasis can be recognized using deep learning on whole-slide images without manual annotation.

The extraordinarily high spatial resolution of whole-slide images (WSIs) makes deep learning for digital pathology difficult. Most research has used patch-based approaches, which frequently require comprehensive annotation of image patches; on WSIs, this usually entails tedious free-hand contouring. To relieve the burden of such contouring and reap the benefits of scaling up training with many WSIs, aetherAI developed a method for training neural networks on whole WSIs using only slide-level diagnoses. To bypass the memory restrictions of hardware accelerators, they use the unified memory technique.
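The core of the weak-supervision idea can be sketched as follows. This is a hedged, minimal illustration (not aetherAI's actual pipeline or architecture): a backbone produces a feature map for the whole slide, global average pooling collapses it to one vector, and a logistic head is trained from the slide-level label alone, with no patch annotations anywhere. All shapes and toy data below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def slide_logit(feature_map, w, b):
    """feature_map: (H, W, C) backbone output for one whole slide."""
    pooled = feature_map.mean(axis=(0, 1))   # global average pooling -> (C,)
    return pooled @ w + b                    # single slide-level logit

def train_step(feature_map, label, w, b, lr=0.1):
    """One SGD step on binary cross-entropy using only the slide label."""
    z = slide_logit(feature_map, w, b)
    p = 1.0 / (1.0 + np.exp(-z))             # predicted P(metastasis)
    pooled = feature_map.mean(axis=(0, 1))
    grad = p - label                          # dBCE/dlogit
    return w - lr * grad * pooled, b - lr * grad

# Toy "slides": positive slides carry a slightly elevated first feature.
C = 8
w, b = np.zeros(C), 0.0
for _ in range(200):
    label = rng.integers(0, 2)
    fmap = rng.normal(size=(4, 4, C)) + (label * 1.5) * np.eye(C)[0]
    w, b = train_step(fmap, label, w, b)
print(w[0] > 0)  # the head learns to weight the discriminative feature
```

The point of the sketch is that the gradient flows from a single slide-level label through the pooled representation, so no per-pixel or per-patch contour is ever needed.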

The extreme resolution of whole-slide images (WSIs) means dealing with on the order of ten billion pixels per slide, which can easily cause GPU out-of-memory errors during CNN training. While traditional patch-based remedies impose the additional cost of drawing comprehensive annotations, aetherAI’s True Gigapixel AI trains on full WSIs with slide-level diagnoses, requiring no annotations.
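A quick back-of-envelope calculation shows why ten billion pixels overwhelm GPU memory. The slide dimensions below are illustrative round numbers, not figures from the paper:

```python
# A 100,000 x 100,000 RGB slide stored as float32 (illustrative figures).
height = width = 100_000          # pixels per side (~1e10 pixels total)
channels = 3                      # RGB
bytes_per_value = 4               # float32

pixels = height * width
input_bytes = pixels * channels * bytes_per_value
input_gib = input_bytes / 2**30

print(f"{pixels:,} pixels -> {input_gib:.0f} GiB for the input tensor alone")
```

Roughly 112 GiB for the input tensor alone, before activations and gradients, against the tens of GiB available on typical accelerators; this is the gap that unified memory (backing GPU allocations with host RAM) is used to bridge.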

True Gigapixel AI overcomes these memory limits and, by removing the contouring effort, significantly accelerates pathology AI development.

This strategy makes it possible to train AI systems on large numbers of existing WSIs.

aetherAI is working to improve disease detection by leveraging the image recognition capabilities of deep neural networks. Slide quality control, case triaging, differential cell counting, and IHC quantification are just a few of their services.

Source: https://www.nature.com/articles/s41379-021-00838-2.pdf | Adenocarcinoma and squamous cell carcinoma pathological images are shown in (a) and (b), respectively.

Because metastatic foci are so small, pathologists have difficulty detecting nodal micrometastasis (tumor size: 0.2–2.0 mm). Lymph nodes with micrometastasis are counted as positive nodes, so finding micrometastasis is essential for correct pathologic staging of colorectal cancer. A pathologic staging technique that detects tiny metastatic foci in lymph nodes would therefore be beneficial. Although deep learning methods have improved the sensitivity and efficiency of micrometastasis detection, manual annotation is time-consuming and labor-intensive. To tackle this challenge, aetherAI created a deep learning system that identifies colorectal cancer nodal metastasis using its innovative end-to-end training technique on annotation-free WSIs.

The research was conducted in collaboration with pathologists at Chang Gung Memorial Hospital in Taoyuan, Taiwan. The training, validation, and testing sets contained 1963, 219, and 1000 slides, respectively.

Experiments on a dataset of lung cancer WSIs show that the proposed technique achieves areas under the receiver operating characteristic curve of 0.9594 and 0.9414 for adenocarcinoma and squamous cell carcinoma classification, respectively. Furthermore, the approach achieves better classification results than multiple-instance learning and, using class activation mapping, solid localization of small lesions.
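For readers unfamiliar with the metric quoted throughout this article, AUC equals the probability that a randomly chosen positive slide scores higher than a randomly chosen negative one. A minimal sketch with toy scores (not the paper's data):

```python
def auc(labels, scores):
    """Compute AUC by pairwise comparison (ties count as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
print(auc(labels, scores))  # 8 of the 9 pos/neg pairs are ranked correctly
```

An AUC of 0.99+ as reported above thus means the model ranks essentially every metastatic slide above every non-metastatic one.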

Conclusion:

The deep learning model developed by aetherAI uses whole-slide images of colorectal cancer regional lymph nodes with only a slide-level label (positive or negative slide). The model was trained to detect metastases on the TAIWANIA 2 supercomputer. The system performed well in detecting macrometastasis and micrometastasis at the single-lymph-node level, with AUCs of 0.9993 and 0.9956, respectively. Visualization with class activation mapping confirmed that the aetherAI model recognized nodal metastasis based on regions of tumor cells.
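The class activation mapping used for that visualization can be sketched in a few lines. This is the generic CAM formulation applied to toy arrays, not the trained model: the map at each location is the feature channels weighted by the linear classifier's weights, so regions that drive the prediction light up.

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    """feature_maps: (H, W, C); class_weights: (C,) from the linear head."""
    cam = np.tensordot(feature_maps, class_weights, axes=([2], [0]))
    cam -= cam.min()                          # shift so minimum is 0
    return cam / cam.max() if cam.max() > 0 else cam

H, W, C = 6, 6, 4
fmap = np.zeros((H, W, C))
fmap[2:4, 2:4, 0] = 1.0                       # a "tumor" blob in channel 0
weights = np.array([1.0, 0.0, 0.0, 0.0])      # head attends to channel 0
cam = class_activation_map(fmap, weights)
print(cam[3, 3], cam[0, 0])                   # hot inside the blob, cold outside
```

Overlaying such a heatmap on the slide is what lets pathologists check that the model's positive calls coincide with actual tumor-cell regions.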