With the rising incidence of heart disease and its complications, there is growing demand for innovative methods for rapid diagnosis and dependable monitoring of patients’ hearts. Fortunately, ongoing research at the intersection of artificial intelligence and cardiology is transforming heart care as we know it.
For Heart Month this February, we asked Declan O’Regan, Head of our Computational Cardiac Imaging group, to tell us more about his work and how impactful it has been, and will continue to be, for the diagnosis and prognosis of heart disease.
What is your research all about?
We’re mainly focused on cardiac imaging, aiming to improve the tools radiologists have for making rapid and accurate diagnoses of inherited heart disease in particular. We are exploring automated ways of using computer-vision technology to understand the motion and structure of patients’ hearts, and algorithms that make predictions about individual patient outcomes: how their health will change over time. We also want to understand the genetic and other biological mechanisms that underlie heart disease, and this technology will really help accelerate that.
What are your thoughts on the public’s view of, and relationship with, AI and machine learning in this area of research and health care? Have patients raised concerns about personal data collection?
We did an engagement exercise with patients with cardiomyopathy, to see what their predominant needs were and how they felt about using machines in health care. There was a lot of positive feedback from the patients, and the issues they highlighted as priorities were improving diagnosis, particularly for rare diseases; accurate risk stratification, which is important for better guiding the treatment they receive; and better engagement and participation in their health care. These are all things that machine learning can help deliver and improve. The British Heart Foundation (BHF) also ran a survey in which 85% of respondents supported using AI in diagnostics, so there’s definitely broad support behind it. It is essential, though, to have robust systems in place that reassure patients their data is being used responsibly.
What made you decide to pursue this area of research?
My background is in clinical radiology, so I also spend time working in the NHS reporting cardiac imaging for clinical reasons. One of the real motivators was that an enormous amount of detailed information comes from the heart, particularly with MRI: we get beautiful, detailed 3D pictures of the heart, but they are difficult to process, so we only take relatively simple measurements. The images contain so much more information about the structure, geometry, shape and complex patterns of motion of the heart. This is where computer techniques for image analysis come into play, allowing us to create a mathematical model of the heart. We can then learn from hundreds of thousands of patients and build up a detailed model to help us understand the heart and the mechanisms underlying heart disease.
What are some of the challenges that exist in your work?
One of the major challenges for us is getting access to data at scale. There are sensitivities around collecting health care data, and these sorts of algorithms require very large numbers of patients to be trained properly. It’s essential that this is done responsibly and that all the data is fully anonymised, so a real priority for us is a secure environment where we can train algorithms on very large health care datasets. We need data from different hospitals and different countries to scale up the analysis and make sure our algorithms work equally well across centres. Data is also often acquired in various ways, at different hospitals and on different scanners, which adds to the challenges of data collection. At the other end, the main challenge is that unless these algorithms are commercialised in some way, they won’t benefit patients. So another key priority for us is developing relationships with industry, working towards use in clinical environments that will ultimately benefit patients.
What have been some highlights of your work so far?
We had some work published in Nature Machine Intelligence last year, which was really satisfying because it was the first time anyone had used a machine learning algorithm to predict patient survival just from the motion of the heart. We demonstrated that it worked really well for heart disease, and in many ways this type of technique could be used in any circumstance where motion data from an organ or organ system is captured and a prediction can be made.
The potential of this technique to pull data from scanners, automatically track the motion of the heart and give patient-specific predictions about risk, all automated with no intervention at any stage, was probably the bit of work our group was proudest of, and one we hope to take forward into more industrial collaborations.
What are the next steps for you and your team?
We have a 5-year BHF programme grant starting later this year that will support a team of clinicians and post-doctoral scientists using machine learning to better understand cardiomyopathy in particular, and the genetic basis of the condition. Our group is also looking at complex motion traits within the heart that are difficult to characterise; at the heart’s complex muscular structure and how it is adapted to achieve optimum function; and at how structural features can affect the risk of developing heart failure in the future.
Written by Emily Jin