AI-based app interprets the results of HIV testing


Researchers at University College London and the Africa Health Research Institute have developed an AI-driven application that interprets lateral flow tests for HIV. A user photographs the test with a smartphone camera, and the app determines whether the result is positive or negative by analyzing the image alone. Because these tests can be difficult to interpret, the technology should help improve their accuracy when deployed in low-resource regions.

A total of 100 million HIV tests are performed each year. Given the importance of early treatment and the large number of people being tested, test accuracy is critical. Lateral flow technology is increasingly being adopted for HIV testing, especially in the world's poorest regions. It has obvious advantages in this context: fast results, ease of use, no need for expensive and bulky laboratory equipment, and even the potential for self-testing.

Lateral flow tests usually provide a visual readout, such as a color change, which in theory should make them easy to interpret. In practice, however, people with impaired vision or color blindness may have difficulty reading the result correctly. This latest technology aims to remove some of the guesswork: the user simply takes a picture of the test with a smartphone, and the AI-based application quickly returns a result.

The technology is based on a machine learning algorithm trained on 11,000 images of lateral flow tests performed in the field. In a recent evaluation, the researchers compared the app's accuracy with that of tests read by eye. Notably, the app outperformed the human readers, achieving an accuracy of 98.9% compared with 92.1% for human assessments.
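The pipeline described above — photograph the strip, then classify the image as positive or negative — can be illustrated with a deliberately simplified sketch. The actual system uses a deep learning model trained on 11,000 field images; here a toy intensity heuristic stands in for the trained classifier, flagging a result as positive when a dark band appears in an assumed test-line region of a synthetic strip image (the function name, region coordinates, and threshold are all illustrative assumptions, not details from the study):

```python
import numpy as np

def classify_strip(image, line_rows=slice(40, 50), threshold=0.25):
    """Toy stand-in for a trained classifier: a positive lateral-flow
    result shows a dark test line, so pixel intensity in the test-line
    region is noticeably lower than the strip's overall background."""
    background = image.mean()
    line_region = image[line_rows, :].mean()
    # Call the band "present" if the region is markedly darker than background.
    return "positive" if background - line_region > threshold else "negative"

# Synthetic grayscale strip (1.0 = white background).
strip = np.ones((100, 30))
strip[40:50, :] = 0.2            # draw a dark test line
print(classify_strip(strip))     # positive

blank = np.ones((100, 30))       # no line drawn
print(classify_strip(blank))     # negative
```

A real deployment replaces the heuristic with a convolutional network, which is far more robust to the lighting, angle, and faint-band variation seen in field photographs.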

Excitingly, the technology is applicable to other diseases for which lateral flow tests are used, including syphilis, tuberculosis, malaria, and influenza. “This study is a really strong partnership with AHRI that demonstrates the power of using deep learning to successfully classify rapid test images acquired in the ‘real world’ field and reduce the number of errors that can occur when reading test results by eye,” said Rachel McKendry, a researcher involved in the study, in a UCL announcement. “This research shows the positive impact that mobile health tools can have on low- and middle-income countries and paves the way for a broader study in the future.”

Study in Nature Medicine: Deep learning of HIV field-based rapid tests

Via: UCL

