From early cancer screening to disease tracking, how is AI changing medical imaging?
Peering deep into the human body without a doctor even having to look first sounds a bit incredible. Yet radiology and medical imaging have made great progress, and with the support of AI they have taken another big step forward. Using the computing power of AI and machine learning to scan the human body and look for subtle differences that the human eye might overlook is exactly what the medical community is doing today.
Today's medical imaging involves a range of complex technologies that analyze every data point to separate disease from health and signal from noise. In radiology's early decades, the main task for researchers was improving the resolution of images of the body; in the decades that followed, it became interpreting the data so that nothing was missed.
While the primary mission of imaging has been to diagnose medical conditions, imaging is now becoming an important part of treatment, especially in cancer care. Doctors study the images to monitor the spread of cancer cells, so they can tell sooner and more reliably whether a treatment is working. As imaging takes on this new role, the way patients are treated is changing: doctors have more information and can choose better treatments for their patients.
Basak Dogan, associate professor at the University of Texas Southwestern Medical Center, said: "In the next five years we will see functional imaging becoming part of treatment. Today's standard imaging cannot answer the real clinical questions patients have about their treatment. Functional techniques can offer higher accuracy, so that better decisions can be made on the basis of richer information."
Early Diagnosis
Making full use of images, and reading as much of them automatically as possible to save radiologists valuable time, is the first hurdle for most imaging, whether X-ray, CT, MRI or ultrasound. This is where computer-assisted algorithms come into play, using powerful computing to train computers to distinguish abnormal from normal. That is the work being done today.
For years, software experts have worked hand in hand with radiologists, analyzing large numbers of normal and abnormal images. Doctors feed the labeled results into computer programs, which learn over time until they can tell abnormal from normal on their own. The more images the system compares and learns from, the stronger its ability to discriminate becomes.
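In software terms, this labeling-and-learning loop is supervised image classification. The sketch below is only an illustration of the idea, not any hospital's actual system; the folder layout, image size and the deliberately tiny network are assumptions made for the example.

```python
# Minimal sketch of the "feed labeled images, let the computer learn" loop.
# The paths, sizes and network here are illustrative assumptions only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),   # e.g. X-rays are single channel
    transforms.Resize((128, 128)),
    transforms.ToTensor(),
])

# Assumed layout: scans/train/normal/*.png and scans/train/abnormal/*.png,
# with the labels supplied by radiologists.
train_set = datasets.ImageFolder("scans/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = nn.Sequential(                              # small demo network
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 2),                     # two classes: normal / abnormal
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):      # more labeled images and more training -> better discrimination
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```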
The FDA has approved imaging algorithms with accuracies in the range of 80-90%. Still, the FDA requires that even when a machine-learning algorithm makes a finding, the final decision rests with a human. AI can flag suspicious findings for doctors to review, so that doctors can give patients answers faster.
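The "flag for review" step can be as simple as scoring each study and moving the most suspicious ones to the top of a radiologist's worklist. Here is a hedged sketch of that triage logic, with an assumed threshold and a stand-in scoring function:

```python
REVIEW_THRESHOLD = 0.5   # assumed operating point; in practice tuned on validation data

def triage(cases, predict_abnormality):
    """Order studies so the most suspicious reach a radiologist first.

    `predict_abnormality` is any callable returning a probability in [0, 1],
    standing in for a trained model like the sketch above. The algorithm only
    prioritizes; a human still makes the final diagnosis, as the FDA requires.
    """
    scored = sorted(((predict_abnormality(case), case) for case in cases),
                    key=lambda pair: pair[0], reverse=True)
    flagged = [case for score, case in scored if score >= REVIEW_THRESHOLD]
    routine = [case for score, case in scored if score < REVIEW_THRESHOLD]
    return flagged + routine
```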
At Mass General Brigham, clinicians use about 50 algorithms to assist with patient care, from detecting aneurysms and cancer to spotting signs of embolisms and strokes. Half of the algorithms have been approved by the FDA; the rest are still being tested.
Keith Dreyer, chief data science officer at Mass General Brigham and vice chairman of the department of radiology at Massachusetts General Hospital, said: "Our goal is to detect diseases early. Sometimes it takes human doctors several days to reach an accurate diagnosis, but computers are different: they never sleep. If computers can make accurate diagnoses, treatment will come faster."
Better tracking of patients
Computer-assisted screening is only the first step in integrating AI into medical care; machine learning has also become an important tool for monitoring patients and tracking subtle changes. These techniques are especially important in cancer treatment, where doctors must determine whether a tumor is growing, shrinking, or staying the same, a judgment that drives how they treat it.
Dogan said: "While a patient is undergoing chemotherapy, what is happening to the cancer cells? That is difficult for us to know. Standard imaging cannot detect any changes until the chemotherapy is completed, and the whole course may last several months. It can take months to see any shrinkage."
With AI-assisted imaging, changes in cancer cells can be detected that are independent of size and anatomy. Dogan added: "In the early stages of chemotherapy, most of the changes in cancer cells have not yet reached the level of cell death. The changes lie in altered interactions between immune cells and cancer cells."
In many cases, cancer cells do not shrink in predictable ways from the outside in. Instead, small pockets of cancer cells within a tumor may die while others live on, leaving the mass pockmarked like a moth-eaten sweater. Because cell death is often accompanied by inflammation, a tumor's size can even increase while the number of living cancer cells does not. Standard imaging cannot tell how many cancer cells are alive and how many are dead. The imaging techniques most commonly used for breast cancer, mammography and ultrasound, simply look for anatomical features.
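The size-based assessment that standard imaging does support usually comes down to comparing measurements between scans. As a rough illustration, the sketch below applies the widely used RECIST 1.1 cut-offs of a 30% decrease and a 20% increase; the simplifications and the example numbers are noted in the comments.

```python
# Simplified sketch of size-based response assessment (RECIST-1.1-style thresholds).
# Real RECIST also tracks the nadir, a 5 mm absolute-growth rule and new lesions;
# this only illustrates the "growing, shrinking, or staying the same" comparison.

def assess_response(baseline_mm: float, followup_mm: float) -> str:
    """Classify the change in the sum of lesion diameters between two scans."""
    if followup_mm == 0:
        return "complete response"
    change = (followup_mm - baseline_mm) / baseline_mm
    if change <= -0.30:          # at least 30% shrinkage
        return "partial response"
    if change >= 0.20:           # at least 20% growth
        return "progressive disease"
    return "stable disease"

# Illustrative numbers only: a 52 mm lesion sum that shrinks to 33 mm.
print(assess_response(52.0, 33.0))   # -> "partial response"
```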
At the University of Texas Southwestern Medical Center, Dogan uses two imaging technologies to track functional changes in breast cancer patients.
In the first, after each cycle of chemotherapy Dogan images the patient, injecting microbubbles to observe subtle pressure changes around the cancer cells. Ultrasound can detect pressure changes in the bubbles, which gather around cancer cells because a growing tumor recruits more blood vessels to support its expansion than other tissue does.
In another study, Dogan is testing photoacoustic imaging, which converts light into sound signals. Shining laser light on breast tissue makes the tissue oscillate, generating sound waves that can be captured and analyzed. Photoacoustic imaging can be used to estimate the oxygen content of cancer cells: when growing, cancer cells require more oxygen than ordinary cells. By analyzing changes in the sound, it is possible to tell which parts of a tumor are growing and which are not.
By analyzing these images, it is possible to tell which parts of a tumor are most likely to metastasize to the lymph nodes, something clinicians cannot otherwise predict. With photoacoustic technology, signs of spread can be detected early, before they show up on standard scans, without the need for invasive biopsies.
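At a high level, the oxygen estimate works by imaging at two laser wavelengths at which oxygenated and deoxygenated hemoglobin absorb light differently, then "unmixing" the two signals. The sketch below shows that arithmetic only in outline; the extinction coefficients and signal values are placeholder numbers, not calibrated data.

```python
import numpy as np

# Photoacoustic amplitude is roughly proportional to optical absorption, which at
# each wavelength is a mix of oxy- and deoxy-hemoglobin absorption. Solving the
# 2x2 system gives relative concentrations, and from them the oxygen saturation.

# Placeholder extinction coefficients for [HbO2, Hb] at two wavelengths (arbitrary units).
E = np.array([
    [2.0, 7.0],   # wavelength 1: Hb assumed to absorb more strongly here
    [6.0, 3.0],   # wavelength 2: HbO2 assumed to absorb more strongly here
])

# Measured photoacoustic amplitudes at the two wavelengths for one pixel (made up).
pa = np.array([4.1, 5.3])

c_hbo2, c_hb = np.linalg.solve(E, pa)   # unmix into the two chromophores
so2 = c_hbo2 / (c_hbo2 + c_hb)          # fraction of oxygenated hemoglobin
print(f"estimated oxygen saturation: {so2:.0%}")
```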
Discovering abnormalities that humans cannot see
Dreyer said that when there are enough data and images, algorithms can discover abnormalities that humans cannot. His team is developing an algorithm that measures biomarkers in the body and plots their changes over time, to warn someone that he or she may be at risk of a stroke, a fracture or a heart attack.
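One simple way to "plot changes and warn" is to fit a trend to repeated biomarker measurements and flag a patient whose trajectory is heading toward a risk threshold. The sketch below is a guess at that kind of logic, not Dreyer's algorithm; every name and number in it is made up.

```python
import numpy as np

def flag_risk(dates_days, values, threshold, horizon_days=365):
    """Fit a straight-line trend to a biomarker and flag the patient if the
    value is projected to cross `threshold` within `horizon_days`.

    All names and numbers here are illustrative, not a real clinical model.
    """
    slope, intercept = np.polyfit(dates_days, values, 1)
    projected = slope * (dates_days[-1] + horizon_days) + intercept
    return projected >= threshold

# Made-up example: an imaging-derived biomarker measured over two years.
days = [0, 180, 360, 540, 720]
values = [1.1, 1.3, 1.6, 1.8, 2.1]
print(flag_risk(days, values, threshold=2.5))   # True if the trend crosses 2.5 within a year
```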
Dreyer believes this technology is the "Holy Grail" of medical imaging. Although it is not yet mature, it could transform AI-assisted medical care.
As more AI models become available, AI imaging may eventually help patients at home. One day, ultrasound imaging information might even be available through a smartphone app.
Dreyer said: "The real change that AI will bring to health care is that it can provide a variety of solutions to people before they become sick, so that people can stay healthy."