Mathematicians from RUDN University have developed an ‘attentive’ neural network that recognizes breast cancer with an accuracy of 99.6%. The breakthrough was made possible by adding a module that sharpens the network’s attention to detail. The findings have been published in the journal Life.
The prognosis for patients with breast cancer depends heavily on the stage at which the diagnosis is made. Histological examination is considered the gold standard, but it depends on sample quality and the subjective judgment of the pathologist, which can lead to inaccurate diagnoses. To address this, RUDN mathematicians and their colleagues from China and Saudi Arabia created a machine learning model that detects cancer in histological images more accurately. With an additional attention module, the neural network’s accuracy came close to perfect.
Ammar Muthanna, Ph.D., Director of the Scientific Center for Modeling Wireless 5G Networks at RUDN University, emphasized the significance of computer-aided classification: it will reduce the burden on doctors and increase the accuracy of tests, ultimately improving breast cancer diagnosis and treatment. Deep learning methods have shown immense promise in medical image analysis in recent years.
The mathematicians tested several convolutional neural networks, augmenting each with two convolutional attention modules that help the network locate objects in images. They trained and tested the models on the BreakHis dataset, which contains nearly 10,000 histological images from 82 patients. The best results came from a DenseNet201 convolutional network combined with the attention modules, which achieved an accuracy of 99.6%.
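The idea of a convolutional attention module can be illustrated with a small sketch. The snippet below is not the authors’ implementation; it is a minimal, NumPy-only illustration of channel attention in the spirit of CBAM-style modules, with random (rather than learned) MLP weights, showing how each feature channel is rescaled by a gate computed from pooled channel statistics:

```python
import numpy as np

def channel_attention(feature_map, reduction=4):
    """Illustrative channel attention: squeeze each channel with global
    average- and max-pooling, pass both through a shared two-layer MLP,
    and rescale the channels by a sigmoid gate.

    feature_map: array of shape (C, H, W).
    """
    c = feature_map.shape[0]
    rng = np.random.default_rng(0)
    # Shared MLP weights (random here; learned in a real network).
    w1 = rng.standard_normal((c // reduction, c)) / np.sqrt(c)
    w2 = rng.standard_normal((c, c // reduction)) / np.sqrt(c // reduction)

    avg_pool = feature_map.mean(axis=(1, 2))  # (C,) per-channel average
    max_pool = feature_map.max(axis=(1, 2))   # (C,) per-channel maximum

    def mlp(v):
        return w2 @ np.maximum(w1 @ v, 0.0)   # ReLU hidden layer

    # Sigmoid gate in (0, 1): how much to keep of each channel.
    gate = 1.0 / (1.0 + np.exp(-(mlp(avg_pool) + mlp(max_pool))))
    return feature_map * gate[:, None, None]  # reweight channels

features = np.random.default_rng(1).standard_normal((8, 16, 16))
out = channel_attention(features)
print(out.shape)  # (8, 16, 16)
```

In a trained network, the MLP weights are learned jointly with the backbone, so channels that carry diagnostically relevant features receive gates close to 1 while uninformative channels are suppressed.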
The mathematicians also discovered that the recognition of cancerous formations depends on scale. Image quality varies across magnification levels, which affects how visible cancerous objects are. A suitable magnification must therefore be chosen for real-world applications.
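Because recognition quality varies with magnification, experiments on histological data are usually organized per magnification level. As a hedged illustration, the snippet below groups image filenames by the magnification factor encoded in BreakHis-style names (e.g. `SOB_B_TA-14-4659-40-001.png`, where `40` denotes 40x); the example filenames are hypothetical:

```python
from collections import defaultdict

def group_by_magnification(filenames):
    """Group BreakHis-style image filenames by magnification factor.

    Names follow the pattern
    <method>_<class>_<subtype>-<year>-<slide>-<magnification>-<seq>.png,
    so the magnification is the second-to-last dash-separated field.
    """
    groups = defaultdict(list)
    for name in filenames:
        stem = name.rsplit(".", 1)[0]               # drop the extension
        magnification = int(stem.split("-")[-2])    # e.g. 40, 100, 200, 400
        groups[magnification].append(name)
    return dict(groups)

# Hypothetical filenames for illustration.
files = [
    "SOB_B_TA-14-4659-40-001.png",
    "SOB_B_TA-14-4659-100-002.png",
    "SOB_M_DC-14-2523-40-003.png",
]
groups = group_by_magnification(files)
print(sorted(groups))  # [40, 100]
```

Splitting the data this way lets a model be trained and evaluated separately at each magnification, making it clear which zoom level gives the most reliable recognition.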
According to Dr. Muthanna, the attention modules significantly improved feature extraction and overall model performance. The modules focus on the most informative areas of the image and highlight crucial features, demonstrating the value of attention mechanisms in medical image analysis.
This breakthrough in breast cancer detection holds immense promise for the future. By leveraging the power of neural networks and attention modules, mathematicians aim to improve diagnostic accuracy, enhance treatment outcomes, and alleviate the pressure on medical professionals. With this advancement, patients can look forward to more reliable and efficient breast cancer diagnoses, leading to better care and improved survival rates.
The international collaboration between mathematicians from RUDN University, China, and Saudi Arabia reflects a global effort to combat breast cancer through technological advances. By harnessing the potential of deep learning, this work sets a new standard for medical image analysis and paves the way for further developments in cancer diagnosis and treatment.