A team of US researchers has developed an artificial intelligence algorithm to spot anaemia from smartphone photos of eyelids.
Anaemia, or low blood haemoglobin, affects more than 25% of the global population.
While anaemia is easy to diagnose for anyone near a hospital or pathology lab, it is much more common in places with poor access to healthcare.
Researchers based at Brown University, US, have addressed this by making a program that can spot anaemia with a much more widely distributed tool: the smartphone.
People with anaemia have different skin colouration – particularly in certain parts of the body.
“Others have used photos of the creases in the palms, fingernail beds and other parts to devise algorithms to predict anaemia,” says Dr Selim Suner, a professor of emergency medicine, surgery and engineering at Brown, and first author on a paper describing the research, published in the journal PLOS ONE.
“These areas of the body rely heavily on blood flow and may be affected by temperature changes and may give false results.”
Suner and colleagues instead focused on the inside of the lower eyelid: somewhere with low blood flow, and little melanin or other things that might affect colouration.
The researchers photographed the eyelids of 142 patients at a hospital, who had wide-ranging haemoglobin levels.
They used this data to train a machine learning algorithm to spot colours that were indicative of anaemia.
The researchers then tested this tool on 202 new patients, comparing the predictions from the algorithm with the patients’ haemoglobin levels.
They found the test to be 72.6% accurate overall: lower than a gold-standard lab test, but still a useful screening indicator.
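The paper's algorithm is more sophisticated than this, but the screening pipeline it describes has a familiar shape: learn a colour-based decision rule on one cohort of patients, then measure its accuracy on a separate cohort against lab haemoglobin values. A minimal sketch, using entirely hypothetical data, an illustrative haemoglobin cutoff, and a simple redness threshold standing in for the trained model:

```python
# Hypothetical sketch only: toy data, a made-up haemoglobin cutoff, and a
# single redness threshold standing in for the paper's trained algorithm.

ANAEMIA_HB = 12.5  # illustrative haemoglobin cutoff in g/dL (assumption)

def is_anaemic(hb):
    return hb < ANAEMIA_HB

def fit_threshold(redness, hb):
    """Pick the eyelid-redness cutoff that best predicts anaemia
    on the training cohort (brute force over observed values)."""
    labels = [is_anaemic(h) for h in hb]
    best_t, best_acc = None, -1.0
    for t in redness:
        preds = [r < t for r in redness]  # paler eyelid -> predict anaemic
        acc = sum(p == l for p, l in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(threshold, redness, hb):
    """Fraction of patients whose prediction matches the lab result."""
    labels = [is_anaemic(h) for h in hb]
    preds = [r < threshold for r in redness]
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

# Toy cohorts: (mean eyelid redness on a 0-1 scale, haemoglobin in g/dL)
train_red = [0.30, 0.35, 0.40, 0.55, 0.60, 0.70]
train_hb  = [9.0, 10.5, 11.0, 13.5, 14.0, 15.0]
test_red  = [0.32, 0.58, 0.45, 0.65]
test_hb   = [10.0, 14.5, 11.5, 13.0]

t = fit_threshold(train_red, train_hb)
print(f"accuracy on held-out cohort: {accuracy(t, test_red, test_hb):.2f}")
```

The key design point the study follows is the same one the sketch makes: accuracy must be reported on patients the algorithm never saw during training, which is why the 202-patient test cohort was separate from the 142-patient training set.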
While this study relied on trained medical staff to take the photographs, Suner says the team's next focus is broadening who can use the tool.
“In the next iteration, we are using design features which will allow novice users to take the photos,” he says.
“We will incorporate algorithms to alert the user if the photo is of sufficient quality to render accurate results. We have shown that features of the photo such as focus and lighting affect the results, so having these features in the final version will be important.”
The team will also be testing the tool on a new cohort of patients, hoping to refine and improve it before its use becomes more widespread.
Ellen Phiddian is a science journalist at Cosmos. She has a BSc (Honours) in chemistry and science communication, and an MSc in science communication, both from the Australian National University.