Researchers Develop New Methods And Models Using Machine Learning (ML) To Reduce Noise in X-ray Data

This article is based on the research paper 'Noise reduction in X‑ray photon correlation spectroscopy with convolutional neural networks encoder–decoder models'. All credit goes to the researchers of this paper.


Reducing the noise, or incoherent information, present in datasets from synchrotron X-ray experiments is a recurring problem. Researchers from the National Synchrotron Light Source II (NSLS-II) and the Computational Science Initiative (CSI) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have developed a method that addresses this problem, giving scientists cleaner data to work with.

A speckled pattern is created when an X-ray beam is scattered off a sample. The technique that analyzes the intensity of sequential frames of these speckled patterns and draws conclusions about the sample's structure is known as X-ray photon correlation spectroscopy (XPCS). These experiments rely on a computed matrix, the two-time intensity-intensity correlation function (2TCF), which captures how the speckle pattern changes over time. However, as the noise level in these images increases, information retrieval becomes increasingly challenging. Several efforts have been made to lessen the impact of instability and to reduce the noise associated with photon detection. Nevertheless, despite these recent improvements in experimental setups, achieving a high signal-to-noise ratio in many XPCS investigations remains a real issue.
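To make the 2TCF concrete, here is a minimal NumPy sketch of one common way it is computed: each entry C(t1, t2) is the pixel-averaged product of intensities at two frame times, normalized by the product of the per-frame mean intensities. The function name, the flattened `(frames, pixels)` input layout, and the synthetic Poisson test data are illustrative assumptions, not the beamline's actual processing code.

```python
import numpy as np

def two_time_correlation(frames):
    """Compute a two-time intensity-intensity correlation function (2TCF).

    frames: array of shape (T, P) -- T detector frames, each flattened to
    P pixels within a region of interest.
    Returns a (T, T) matrix C[t1, t2] = <I(t1) I(t2)> / (<I(t1)> <I(t2)>),
    where <.> denotes an average over pixels.
    """
    frames = np.asarray(frames, dtype=float)
    mean_i = frames.mean(axis=1)                 # <I(t)> for each frame
    cross = frames @ frames.T / frames.shape[1]  # <I(t1) I(t2)> over pixels
    return cross / np.outer(mean_i, mean_i)

# Synthetic example: 50 statistically independent Poisson frames.
# Uncorrelated frames give a 2TCF close to 1 away from the diagonal,
# while the diagonal sits above 1 due to intensity fluctuations.
rng = np.random.default_rng(0)
frames = rng.poisson(5.0, size=(50, 4096))
c2 = two_time_correlation(frames)
```

For real XPCS data, correlated dynamics appear as structure near the diagonal of this matrix; noise at high frame rates is exactly what the CNN-ED model described below is trained to suppress.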

The researchers aim to solve the problem on a broader scale by developing models that can be applied across a wide range of XPCS studies. They propose an AI-based solution: a convolutional neural network encoder-decoder model (CNN-ED) for noise removal and signal restoration. The real-world experimental data used to train the model was collected at the Coherent Hard X-ray Scattering (CHX) beamline at NSLS-II. The primary computational approach is based on autoencoders: unsupervised artificial neural networks that learn to compress and encode data and then reconstruct it in a form as close to the original input as feasible. By design, an autoencoder reduces data dimensionality and learns to ignore noise in the data. The model also makes efficient use of storage and computational resources, making it easy to tune for local experiments. The best results were obtained with an architecture consisting of two convolutional layers with ten channels each.
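The architecture above can be sketched in a few lines of PyTorch. The paper reports that two convolutional layers with ten channels each worked best; the kernel size, padding, activation, and input size below are illustrative assumptions rather than the authors' exact configuration, and a real denoiser would be trained on pairs of noisy and averaged 2TCF matrices.

```python
import torch
import torch.nn as nn

class CNNEncoderDecoder(nn.Module):
    """Sketch of a small CNN encoder-decoder denoiser for 2TCF matrices."""

    def __init__(self, channels: int = 10, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # Encoder: map the single-channel noisy 2TCF to a 10-channel
        # feature representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size, padding=pad),
            nn.ReLU(),
        )
        # Decoder: reconstruct a single-channel, denoised 2TCF of the
        # same spatial size.
        self.decoder = nn.Conv2d(channels, 1, kernel_size, padding=pad)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

model = CNNEncoderDecoder()
noisy_2tcf = torch.rand(1, 1, 64, 64)  # batch of one 64x64 correlation map
denoised = model(noisy_2tcf)           # same shape as the input
```

Because the network is fully convolutional with "same" padding, the output matches the input size, so the same trained model can in principle be applied to correlation matrices of different lengths.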

The model’s accuracy in extracting meaningful information from a series of images was evaluated with a variety of testing approaches. One striking observation was that the model can recover sufficient information about an equilibrium system’s dynamics from a smaller amount of data, and it can also be extended to non-equilibrium systems whose dynamic parameters change over time. The CNN-ED model outperforms other existing algorithms, and its accuracy could be improved further with larger training datasets.

The CNN-ED approach significantly improves signal quality in XPCS noise removal. The models are fast to train, do not require large amounts of data, and their accuracy is relatively insensitive to hyperparameter selection. They also need fewer computational resources than other methods to attain the same signal-to-noise ratio. Certain limitations persist, however: tests conducted by the research group revealed that the model may not reliably remove noise from extremely noisy data. The team is now improving the model’s capabilities so that it can be integrated into the XPCS pipelines at CHX, and is exploring further ways to use the model to detect instrument instabilities during measurements, as well as heterogeneities or other anomalous dynamics in XPCS data that are inherent to the sample.


