The world is facing a maternal health crisis. According to the World Health Organization, about 810 women die every day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, one of the leading causes of maternal death is infected cesarean section wounds.
A multidisciplinary team of clinicians and researchers from the Massachusetts Institute of Technology, Harvard University, and Partners in Health (PIH) in Rwanda has proposed a solution to this problem. They have developed a mobile health (mHealth) platform that uses artificial intelligence and real-time computer vision to predict infection in C-section wounds with nearly 90 percent accuracy.
“Early detection of infection is an important problem worldwide, but in low-resource areas such as rural Rwanda, the problem is even worse due to the lack of trained clinicians and the high prevalence of antibiotic-resistant bacterial infections,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, a research scientist in mechanical engineering at MIT and technology lead for the team. “Our idea was to employ cell phones that community health workers could use to visit new mothers in their homes and check their wounds for infection.”
This summer, the team, led by Harvard Medical School professor Bethany Hedt-Gauthier, took home a $500,000 first-place prize in the NIH Technology Accelerator Challenge for Maternal Health.
“The lives of women who deliver by cesarean section in the developing world are at risk due to limited access to high-quality surgery and postpartum care,” adds Fredrick Kateera, a member of the PIH team. “Using mobile health technologies for timely identification and accurate diagnosis of women with surgical infections in these communities would be a game-changer in improving women’s health.”
Training algorithms for infection detection
The project began with a series of chance encounters. In 2017, Fletcher and Hedt-Gauthier bumped into each other on the Washington Metro during an investigator meeting at the National Institutes of Health. Hedt-Gauthier, who by then had been running research projects in Rwanda for five years, was looking for a solution to the cesarean care gap that she and her colleagues had encountered in their research. Specifically, she was interested in exploring the use of cell phone cameras as a diagnostic tool.
Fletcher, who leads a group of students in Professor Sanjay Sarma’s AutoID lab and has spent decades applying phones, machine learning algorithms and other mobile technologies to global health, was a natural fit for the project.
“Once we realized that these types of image-based algorithms could support home care for women after cesarean delivery, we contacted Dr. Fletcher as a collaborator, given his extensive experience developing portable health technologies in low- and middle-income settings,” Hedt-Gauthier says.
During that same trip, Hedt-Gauthier happened to sit next to Audace Nakeshimana ’20, then an MIT freshman from Rwanda, who later joined Fletcher’s team at MIT. Under Fletcher’s mentorship during his senior year, Nakeshimana founded Insightiv, a Rwandan startup that applies artificial intelligence algorithms to clinical image analysis, which was among the top grant winners in the 2020 annual MIT IDEAS competition.
The first step in the project was to collect a database of images of wounds taken by community health workers in rural Rwanda. They collected more than 1,000 images of both infected and uninfected wounds and then trained an algorithm using that data.
A central issue emerged in this first data set, collected between 2018 and 2019: many of the images were of poor quality.
“The quality of the wound images collected by the health workers was highly variable, and it required a great deal of manual labor to crop and resample the images. Since these images are used to train a machine learning model, the image quality and variability fundamentally limit the performance of the algorithm,” says Fletcher.
To solve this problem, Fletcher turned to the tools he used in previous projects: real-time computer vision and augmented reality.
Improve image quality through real-time image processing
To help community health workers take high-quality images, Fletcher and the team developed a mobile application for wound screening and paired it with a simple paper frame. The frame contained a printed color-calibration pattern and an optical fiducial pattern that guided the application’s computer vision software.
Health workers are asked to place the frame around the wound and open the app, which provides real-time feedback on the camera’s position. The app uses augmented reality to display a green check mark when the phone is within the appropriate range. Once the image is captured, other parts of the computer vision software automatically color-balance and crop the image, and apply a transform to correct for perspective distortion.
“By using real-time computer vision at the time of data collection, we can create beautiful, clean, color-balanced images that can then be used to train our machine learning models, without the need to manually clean or post-process the data,” says Fletcher.
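As a rough illustration of this kind of capture pipeline, the color-balance and perspective-correction steps might look like the following. This is a minimal NumPy sketch under stated assumptions, not the team’s actual code: the frame-corner detection is assumed to happen elsewhere, and all function names and the single-patch diagonal color model are hypothetical.

```python
import numpy as np

def _homography(src, dst):
    """Solve the 3x3 homography mapping 4 src points onto 4 dst points (DLT)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def normalize_wound_image(img, frame_corners, patch_rgb, patch_ref_rgb, out_size=256):
    """Color-balance using the frame's printed patch, then warp the frame
    region to a square crop, removing the keystone distortion of a
    hand-held shot."""
    # Per-channel (diagonal) color correction from the calibration patch:
    # scale each channel so the measured patch color matches its known value.
    gain = np.asarray(patch_ref_rgb, float) / np.maximum(np.asarray(patch_rgb, float), 1e-6)
    balanced = np.clip(img.astype(float) * gain, 0, 255).astype(np.uint8)

    # Map each output pixel back into the source image and sample it
    # (nearest neighbor, for brevity).
    dst = [(0, 0), (out_size - 1, 0), (out_size - 1, out_size - 1), (0, out_size - 1)]
    Hinv = np.linalg.inv(_homography(frame_corners, dst))
    u, v = np.meshgrid(np.arange(out_size), np.arange(out_size))
    pts = np.stack([u.ravel(), v.ravel(), np.ones(u.size)])
    sx, sy, w = Hinv @ pts
    sx = np.clip(np.rint(sx / w).astype(int), 0, img.shape[1] - 1)
    sy = np.clip(np.rint(sy / w).astype(int), 0, img.shape[0] - 1)
    return balanced[sy, sx].reshape(out_size, out_size, -1)
```

A production app would do this on-device with an optimized library (e.g., OpenCV), but the underlying operations are the same: a per-channel gain from the calibration pattern and a homography warp defined by the frame corners.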
Using convolutional neural network (CNN) machine learning models, combined with a method called transfer learning, the software was able to successfully predict infection in C-section wounds with nearly 90 percent accuracy within 10 days of childbirth. Women whom the app flags as likely to have an infection are then referred to a clinic, where they can receive diagnostic bacterial testing and be prescribed life-saving antibiotics as needed.
The app has been well received by women and community health workers in Rwanda.
PIH’s Anne Niyigena adds: “Women’s trust in community health workers, who have been the biggest champions of the app, has meant that the mobile health tool has been accepted by women in rural areas.”
Using thermal imaging to address algorithmic bias
Algorithmic bias is one of the biggest obstacles to scaling this AI-based technology to a global audience. When trained on a relatively homogeneous population, such as that of rural Rwanda, the algorithm works as expected and can successfully predict infection. But when presented with images of patients with different skin tones, the algorithm is less effective.
To address this issue, Fletcher turned to thermal imaging. Simple thermal camera modules, designed to attach to a cell phone, cost approximately $200 and can be used to capture infrared images of wounds. Algorithms can then be trained on the heat patterns of these infrared wound images to predict infection. A study published last year showed over 90 percent prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
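One common way to combine the two modalities is a two-branch network that extracts features from the RGB photo and the thermal image separately, then fuses them before classification. The toy PyTorch sketch below illustrates that fusion pattern only; the architecture, layer sizes, and class name are assumptions, not the published model.

```python
import torch
import torch.nn as nn

class ThermalRGBNet(nn.Module):
    """Toy two-branch CNN: one branch processes the RGB photo, one the
    single-channel thermal image; their feature vectors are concatenated
    and passed to a shared infected / not-infected classifier."""
    def __init__(self):
        super().__init__()
        def branch(in_ch):
            return nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())  # -> 32-dim features
        self.rgb = branch(3)       # standard phone camera image
        self.thermal = branch(1)   # infrared image from the clip-on module
        self.head = nn.Linear(64, 2)  # fused features -> 2 classes

    def forward(self, rgb, thermal):
        feats = torch.cat([self.rgb(rgb), self.thermal(thermal)], dim=1)
        return self.head(feats)
```

Because the thermal branch responds to heat patterns rather than skin color, a model of this shape is one plausible route to the skin-tone robustness described above.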
While it is more expensive than simply using a phone camera, the thermal image approach can be used to extend the team’s mobile health technology to a more diverse global population.
“We give health workers two options: In a homogeneous population, such as rural Rwanda, they can use a standard phone camera with the model that has been trained on data from local residents. Otherwise, they can use the more general model, which requires a thermal camera attachment,” Fletcher says.
While the current generation of the mobile app uses a cloud-based algorithm to run its infection prediction model, the team is now working on a standalone mobile app that does not require internet access and that addresses all aspects of maternal health, from pregnancy through the postpartum period.
In addition to developing the library of wound images used in the algorithms, Fletcher is working closely with former student Nakeshimana and his team at Insightiv on the app’s development, using Android phones manufactured locally in Rwanda. PIH will then conduct user testing and field validation in Rwanda.
As the team looks to develop the comprehensive maternal health app, privacy and data protection is a top priority.
“As we develop these tools further, more attention must be paid to patient data privacy. More data security details should be incorporated so that the tool addresses the gaps it aims to fill and builds user trust, which will ultimately favor its wider adoption,” says Niyigena.
Award-winning team members include: Bethany Hedt-Gauthier of Harvard Medical School; Richard Fletcher of MIT; Robert Riviello of Brigham and Women’s Hospital; Adeline Boatin of Massachusetts General Hospital; Anne Niyigena, Fredrick Kateera, Laban Bikorimana, and Vincent Cubaka of PIH Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.