Oxford University data visualisation spin-out Zegami has written to the health ministers of ten countries asking them to provide x-rays of COVID-19 infected lungs.
A diverse set of example images is needed to build a more robust machine learning model that helps health professionals identify cases of COVID-19. If it can become fully operational, artificial intelligence of this kind has the potential to improve patient outcomes and lead to more effective treatments.
To date, Zegami has access to 226 x-rays of COVID-19 infected lungs but needs around 10,000.
Zegami says its new model could help distinguish COVID-19 more easily from other lung conditions, such as bacterial and viral pneumonia. It could also help predict potential outcomes for patients by comparing their COVID-19 lung x-rays with those of previous patients who had similar conditions, taking into account how those patients fared under different treatment options.
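To make the comparison idea concrete: one simple way to match a new patient against previous cases is nearest-neighbour search over image-derived feature vectors. The sketch below is purely illustrative, not Zegami's actual method; the feature vectors, diagnoses and outcomes are invented for the example.

```python
import math

# Toy illustration (not Zegami's actual system): each x-ray has been
# reduced to a small feature vector; a new case is compared to stored
# cases, and the nearest match's diagnosis and recorded outcome are
# returned as a rough reference point. All data below is invented.

previous_cases = [
    # (feature vector, diagnosis, recorded outcome) - hypothetical
    ([0.9, 0.1, 0.2], "COVID-19", "recovered with treatment A"),
    ([0.2, 0.8, 0.1], "bacterial pneumonia", "recovered with antibiotics"),
    ([0.3, 0.2, 0.9], "viral pneumonia", "recovered with supportive care"),
]

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_case(features):
    # Return the stored case whose feature vector is closest to the input.
    return min(previous_cases, key=lambda case: euclidean(features, case[0]))

new_patient = [0.85, 0.15, 0.25]  # hypothetical feature vector
diagnosis_match = nearest_case(new_patient)
print(diagnosis_match[1], "-", diagnosis_match[2])
```

In practice the features would come from a trained model rather than being hand-written, which is why the diversity and volume of training x-rays matters so much.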
In developing its new platform, Zegami has initially used images of COVID-19 x-rays from the GitHub data initiative launched by Joseph Paul Cohen, a postdoctoral fellow at Mila, University of Montreal. He is looking to develop the world's largest collection of x-ray and CT images of COVID-19 infected lungs, to enable faster and more accurate automated diagnosis.
Roger Noble, CEO and founder of Zegami, said: “The fight against COVID-19 is a global one, so we have written to the health ministers of a number of countries asking if they can help us with the development of our new platform. As soon as we have enough x-rays it will be fully up and running and, we hope, ready to play a key role in supporting medical and technical professionals in their battle with this disease.”