AI Technology Blog by Susan Love

By now, you’ve probably read or heard the news stories about how Google has created a way to use artificial intelligence (AI) to read mammograms. Much of the news coverage of the research, published January 1 in Nature, seemed to echo the idea that AI is better at reading mammograms than an actual human radiologist.

But AI is not better; it is just different. To create an AI system, scientists first show a computer software program a lot of mammography images with cancers to train it to recognize abnormalities. Then, when the program is presented with a new image, it compares it to what it learned from those training images to decide whether it looks like a cancer. This is, of course, exactly how radiologists are trained, although they have the advantage of the patient’s medical history and symptoms to put into the equation. Is it better to have a radiologist read a mammogram and then have AI take a look than to have the radiologist look at it alone? Probably. It’s probably also more accurate to have two radiologists read a mammogram. What would be more valuable is a way to figure out which cancers seen on a mammogram are actually life-threatening and which are not.
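For readers curious about what that "training" step actually looks like in practice, here is a minimal sketch in Python (using the PyTorch library) of how an image classifier of this kind might be taught from a set of labeled images. The folder layout, class names, and training settings are illustrative placeholders only, not the setup used in the Nature study.

```python
# Minimal sketch of the training step described above: the model is shown
# many labeled images and adjusts its internal weights until it can tell
# the classes apart. Dataset path and class folders are hypothetical.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Resize and normalize so every example looks the same to the model.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),  # mammograms are grayscale
    transforms.ToTensor(),
])

# Expects one folder per class, e.g. mammograms/train/cancer and
# mammograms/train/no_cancer (hypothetical layout).
train_data = datasets.ImageFolder("mammograms/train", transform=preprocess)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a small, widely used image network and give it a 2-class output.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a few passes over the labeled examples
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()   # nudge the weights toward fewer mistakes
        optimizer.step()
```

The "memory databank" the trained program uses is really just these learned weights; when a new image arrives, the same network scores it against what it has seen before.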

With major support from the National Institutes of Health, the Sharon D. Lund Foundation, the Jean Perkins Foundation, and the Roddenberry Foundation, my team and I are working in the US and Mexico, using AI technology to develop a self-reading portable ultrasound that can identify palpable breast lumps. You can read more about it here.

In the area in Mexico where we are doing our research, there is a shortage of radiologists. As a result, a woman with a lump in her breast—found by herself or her health care worker—typically has to wait nine months and travel to town to get in to see a radiologist for an evaluation with a mammogram.

Rather than duplicate the role of the radiologist, we want to make it possible for a nurse or other health care provider to use a portable ultrasound to take two ultrasound pictures of the breast lump and then load those pictures into an AI app on their smartphone to determine whether the lump is benign and nothing more needs to be done, or whether it may be a cancer and the woman really should see a radiologist for a mammogram. This would allow them to focus limited resources on the women who need them. Right now, we are in the process of teaching the AI to recognize which breast lumps are worrisome.
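To make the idea concrete, here is a minimal sketch of what that app-side step might look like once a model has been trained: load the trained model, score the two ultrasound pictures, and decide whether to refer the woman for a mammogram. The model file, image names, and decision threshold are all hypothetical placeholders, not our actual system.

```python
# Sketch of the app-side step: score two ultrasound views of a lump with a
# previously trained model. File names and the 0.5 threshold are hypothetical.
import torch
from torchvision import transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

model = torch.load("lump_classifier.pt")  # hypothetical trained model file
model.eval()

def suspicion_score(path: str) -> float:
    """Return the model's probability that this image shows a suspicious lump."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(image), dim=1)
    return probs[0, 1].item()

# Two views of the same lump, as described above; flag it if either looks suspicious.
scores = [suspicion_score(p) for p in ("lump_view_1.png", "lump_view_2.png")]
if max(scores) > 0.5:  # hypothetical referral threshold
    print("Refer for mammogram and radiologist review")
else:
    print("Likely benign; routine follow-up")
```

The point of a tool like this is triage, not diagnosis: it only has to decide which lumps can safely wait and which need the scarce radiologist's attention.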

This approach would make the wait for a mammogram much shorter for those who need one, and spare women with benign lumps nine months of waiting and worrying.

We are undoubtedly going to read much more this year about how AI is being used in cancer. It’s a hot area. But I think we need to think more about when and where we use AI. Why waste time developing it to do tasks that are readily accomplished by available people, when there is a need to bring better medical care to people who don’t even have easy access to health care providers? Our goal should be to improve outcomes. Having AI read mammograms is unlikely to do that, but thinking about how to use AI in low-resourced areas will!
