Inter-modality cardiac image registration/fusion, real-time 3D volume rendering
The main objective of the project “Inter-modality cardiac image registration/fusion, real-time 3D volume rendering” was the development and validation of computer-aided techniques for non-invasive quantification and visualization of cardiac morphology and function. The initial hypothesis, that combining multi-modal anatomical and functional information allows for a quicker assessment of a given case, was confirmed in preliminary studies using standard clinical acquisitions. The main focus of the research was the fusion of cardiac magnetic resonance (CMR) images with 3D echocardiographic volumes (3D echo), either retrospectively or during live scanning. Both scenarios were tested and proven to be feasible.
From a technical perspective, image registration algorithms specifically aimed at cardiac data were implemented. Both spatial and temporal correspondences between 3D echo and CMR data were established. The proposed spatial registration algorithm can align multimodal data (3D echo, cine CMR or late-enhanced CMR) at end diastole, such that the same anatomic structures depicted by the different modalities can be accurately overlaid. Subsequently, the spatial alignment is propagated along the heart cycle. Several temporal synchronization techniques (e.g. time-point-based temporal synchronization, temporal synchronization by dynamic time warping) have been implemented and tested. Their purpose is to compensate for the different number of temporal frames per cardiac cycle in CMR and 3D echo datasets, as well as for the varying heart rate between different acquisitions, which leads to different systolic and diastolic time intervals. A non-linear temporal mapping between CMR and 3D echo, or even between several 3D echo datasets of the same patient acquired at different time points, can be computed and applied to the data. This is particularly important for fusion purposes: it ensures that the same cardiac events (e.g. the opening and closing of the valves) occur at the same time point along the heart cycle, so that a correct anatomic visualization of the fused data can be generated.
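As an illustration of the temporal synchronization idea (not the project's actual implementation), a minimal dynamic time warping sketch is shown below. It aligns two 1-D curves sampled with different numbers of frames per cardiac cycle; the sequence names and the toy volume curves are hypothetical.

```python
import numpy as np

def dtw_path(a, b):
    """Classic dynamic time warping between two 1-D sequences.

    Returns the accumulated-cost matrix and the optimal warping path as a
    list of (i, j) index pairs mapping frames of `a` to frames of `b`.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    # Backtrack from (n, m) to (1, 1) along the cheapest predecessors
    path, i, j = [], n, m
    while i > 1 or j > 1:
        path.append((i - 1, j - 1))
        step = int(np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    path.append((0, 0))
    return D, path[::-1]

# Toy example: the "same" LV volume curve sampled at 30 frames (CMR)
# vs 18 frames (3D echo), with a slightly different systole/diastole split
t_cmr = np.linspace(0.0, 1.0, 30)
t_echo = np.linspace(0.0, 1.0, 18)
vol_cmr = np.cos(2 * np.pi * t_cmr)
vol_echo = np.cos(2 * np.pi * t_echo ** 1.3)

D, path = dtw_path(vol_cmr, vol_echo)
```

The resulting path is the non-linear temporal mapping: each CMR frame index is paired with the echo frame(s) showing the same phase of the cycle, so that e.g. valve opening and closure coincide in the fused visualization.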
To achieve a good inter-modal alignment several issues had to be addressed: stack misalignment of the CMR data, alignment between cine CMR and late enhanced CMR and subsequent temporal deformation of the late enhanced CMR data to match the deformation of the cine acquisition. For visualization purposes, multi-modal volume rendering algorithms for 3D echo and CMR data have been developed and tested. This was an iterative process and several technical papers that describe our methods have been published.
Finally, a streaming client that can receive, process and display a live stream of data from several ultrasound machines has been developed. It allows real-time streaming of tissue slices, volumes, color flow data or Doppler data from the scanner to a dedicated workstation or a tablet device. This client was initially used for our live testing; however, it was also found useful for other ongoing projects, such as the BIA project entitled “SmartScan - Ultralyd for "Dummies"” or UMOJA - Ultrasound for Midwives in rural areas, an HMN-financed project. The streaming code was extended with a user interface that runs on a tablet device, and a prototype of this system was tested in South Africa by local midwives.
To conclude, a software system dedicated to the fusion of cardiac CMR and 3D echo data has been implemented and tested; it is fast enough to be used during a standard echocardiographic examination and accurate enough that a visual correspondence between 3D echo and CMR data can be established. Further improvements to the alignment quality could be achieved by using an optical tracking system for probe localization or by adding an orientation sensor to the probe. Live fusion of 3D echocardiographic data with already recorded CMR images or a generic 3D heart model is particularly appealing, as it allows for user guidance during the acquisition.
It has been shown that prognosis after an acute myocardial infarction worsens with increasing amounts of scar. The main clinical outcome of this project is an improved management of patients with myocardial infarction and a lower intra- and inter-observer variability in data analysis. It was shown that, by establishing a direct spatial relationship between tissue characterization and functional information, a more accurate myocardial viability assessment can be performed, as the cardiologist has access to more information presented simultaneously in a convenient manner.
Additionally, the developed fusion tools are useful for the validation of newly developed 3D strain algorithms, as the result of the 3D strain tool can be directly overlaid onto the late-enhanced data, which is considered the reference. Furthermore, fusion of 3D echo and CMR during live scanning has potential for teaching purposes, as the anatomic context is much easier to understand when the user is presented with a combined 3D echo/CMR visualization, as compared to 3D echo alone. However, these applications have not been validated during the current project. Finally, the developed software tools are not limited to cardiac data; as mentioned before, they have been used in obstetric as well as augmented reality applications.
Fusion and visualization of 3D echo and cardiac MR volumes
The research focused on the development of novel methods for re-aligning 3D echo and magnetic resonance volumes and on the generation of 3D volume renderings based on the fused data. Additionally, augmented visualization methods to be used during data acquisition have been developed and tested in a real-life scenario.
In order to generate a clinically relevant, dynamic visualization of the anatomy and function of the left ventricle for a given case, a method for fusing 3D echo and CMR (cine and late-enhancement) volumes throughout the cardiac cycle has been developed. It relies on image registration techniques that were adapted for the specific case of cardiac volumes. Subsequently, dynamic time warping, a method that generates a non-linear warping between two sequences in the time dimension, has been applied. Time warping compensates for differences in temporal sampling of 3D echo and CMR and for differences in heart rate and cycle duration that typically occur between the different acquisitions. For all included patients, 3D echo and CMR short- and long-axis cine and late gadolinium enhanced (LE) images acquired on the same day were available. By applying the developed method, the relationship between myocardial scar identified by LE and low end-systolic segmental strain values obtained from 3D echo could be visually established. As such, a direct spatial relationship between tissue characterization and functional information is available, which has the potential of enabling more accurate myocardial viability assessment and ischemia diagnosis. The developed tool has the potential of simplifying the follow-up of patients who suffered a recent myocardial infarction.
For visualizing the fused data, direct volume rendering is widely used. A recurring problem when rendering 3D medical data with direct volume rendering is the poor visibility of the structures of interest, due to the difficulty of choosing a correct opacity transfer function. In particular, in the case of 3D echocardiography, high variations in tissue and blood pool signal make it difficult to choose an opacity transfer function that is applicable to the entire volume being visualized. A novel approach for defining locally adaptive transfer functions has been adapted specifically for the case of 3D echo data and implemented on the graphics processing unit for time efficiency reasons. The proposed technique was tested against a reference method; in addition to better depicting the cardiac wall, it also reduces the spurious structures inside the cavity.
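To make the idea concrete, the following is a simplified, CPU-side sketch of a locally adaptive opacity transfer function combined with standard front-to-back compositing along one ray. The local mean-plus-standard-deviation threshold and the sigmoid ramp are illustrative assumptions; the published GPU method differs in detail.

```python
import numpy as np

def adaptive_opacity(samples, window=9, k=1.0):
    """Per-sample opacity with a locally adaptive threshold.

    Instead of one global opacity ramp, the threshold separating blood pool
    from tissue is estimated per sample from the local mean and standard
    deviation of the intensities along the ray (illustrative assumption).
    """
    half = window // 2
    padded = np.pad(np.asarray(samples, dtype=float), half, mode='edge')
    opac = np.empty(len(samples), dtype=float)
    for i in range(len(samples)):
        local = padded[i:i + window]
        thresh = local.mean() + k * local.std()
        # soft step: opacity rises once the sample exceeds the local threshold
        opac[i] = 1.0 / (1.0 + np.exp(-(samples[i] - thresh) * 10.0))
    return opac

def composite(samples, opacities):
    """Front-to-back alpha compositing of grey values along a single ray."""
    color, alpha = 0.0, 0.0
    for s, a in zip(samples, opacities):
        color += (1.0 - alpha) * a * s
        alpha += (1.0 - alpha) * a
        if alpha > 0.99:  # early ray termination
            break
    return color, alpha

# Toy ray: low-intensity blood pool followed by brighter myocardial tissue
ray = np.array([5.0] * 10 + [80.0] * 10)
opacities = adaptive_opacity(ray)
color, alpha = composite(ray, opacities)
```

Because the threshold follows the local statistics, dim blood pool speckle receives low opacity while the tissue transition still becomes opaque, which is the effect the report attributes to the adaptive method (fewer spurious structures inside the cavity).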
In addition to presenting the results in a user-friendly way, augmented visualization methods can assist the ultrasound examiner in placing the current view in an anatomic context and partially alleviate the burden of image interpretation. This goal was addressed with a visualization that provides clear cues on the orientation of, and the correspondence between, the anatomy being imaged and the acquired data. A live stream of 3D ultrasound data is analyzed in real time, distinct features are detected, and a generic, highly detailed mesh model of the heart is deformed automatically to match the patient being scanned. The heart mesh is composited with the original ultrasound data to create the data-to-anatomy correspondence. The visualization is broadcast over the internet, allowing, among other opportunities, direct visualization on a tablet computer. The examiner interacts with the transducer and with the visualization parameters on the tablet. The augmented visualization was deemed useful for medical training and for inexperienced ultrasound users.
LV performance assessment via multi-modal data fusion
Quantification of global and regional left ventricular performance indices after cardiac injury is required in order to better understand prognosis and to evaluate response to a given treatment. The objective of the work in 2011 was to automatically generate cardiac indices based on three different cardiac magnetic resonance imaging protocols.
Cardiac remodeling refers to an alteration of the shape, size and/or function of the heart, in response to cardiac injury or changed hemodynamic loads, which will lead to a progressive decline in ventricular performance. A typical example of cardiac injury is myocardial infarction, in which case the acute ischemic event is the starting point of a series of structural and molecular changes. Several imaging modalities have the ability to image both the anatomy and the function of the heart. Among them, cardiac magnetic resonance imaging (CMR) can produce anatomically accurate recordings of morphology and function (cine imaging), perfusion (first-pass imaging) and tissue characterization such as edema (T2 STIR) or scar (late enhancement imaging).
Combining data acquired using different imaging modalities, and as such establishing a direct spatial relationship between anatomical and functional information, would benefit both myocardial viability assessment and ischemia diagnosis. The main motivation of the project so far was to automatically generate and visualize an integrated cardiac model, hinging on data provided by three different magnetic resonance imaging protocols. Once the model is available, regional cardiac parameters such as volumes at different time points in the cardiac cycle, ejection fraction, wall thickness and wall thickening can be automatically extracted. Furthermore, the same model was applied to longitudinal data belonging to patients with proven myocardial infarction, and the extracted parameters were compared to the ones obtained by a previous study that relied on a side-by-side manual extraction of indices and during which the correspondence of infarct location was visually assessed.
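The named indices are standard quantities; as a minimal sketch (the function names, sector sampling and toy values below are hypothetical, not the project's code), they can be computed from segmented end-diastolic and end-systolic geometry as follows:

```python
import numpy as np

def ejection_fraction(edv, esv):
    """Ejection fraction (%) from end-diastolic and end-systolic volumes (ml)."""
    return 100.0 * (edv - esv) / edv

def wall_thickness(endo_r, epi_r):
    """Regional wall thickness per angular sector, given endocardial and
    epicardial radii sampled on the same short-axis slice."""
    return np.asarray(epi_r, dtype=float) - np.asarray(endo_r, dtype=float)

def wall_thickening(th_ed, th_es):
    """Percent systolic wall thickening per sector, relative to end diastole."""
    th_ed = np.asarray(th_ed, dtype=float)
    return 100.0 * (np.asarray(th_es, dtype=float) - th_ed) / th_ed

# Toy numbers: EDV 140 ml, ESV 70 ml
ef = ejection_fraction(140.0, 70.0)           # -> 50.0 %

# Two hypothetical sectors: thickness at end diastole vs end systole (mm)
th_ed = wall_thickness([20.0, 21.0], [28.0, 31.0])   # [8.0, 10.0]
th_es = wall_thickness([18.0, 20.0], [30.0, 32.0])   # [12.0, 12.0]
thickening = wall_thickening(th_ed, th_es)           # [50.0, 20.0] %
```

With the fused model, the same sector indices can be reported per late-enhancement region, which is what allows the automatic comparison against the earlier manual, side-by-side analysis.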
Initial results indicate that automatic fusion of CMR morphologic and functional data is feasible in the case of longitudinal studies and can potentially speed up and improve the accuracy with which post-infarct remodeling is analyzed. The results are comparable with manually extracted values, which indicates that the method has clinical utility.
Three-dimensional echocardiography in the evaluation of global and regional function in patients with recent myocardial infarction: a comparison with magnetic resonance imaging.
Echocardiography 2013 Jul;30(6):682-92. Epub 2013 Jan 24
Strain rate imaging combined with wall motion analysis gives incremental value in direct quantification of myocardial infarct size.
Eur Heart J Cardiovasc Imaging 2012 Nov;13(11):914-21. Epub 2012 Apr 12
Comparison of a new methodology for the assessment of 3D myocardial strain from volumetric ultrasound with 2D speckle tracking.
Int J Cardiovasc Imaging 2012 Jun;28(5):1049-60. Epub 2011 Aug 17
Fusion of 3D echo and cardiac magnetic resonance volumes during live scanning
Proceedings of IEEE international ultrasonics symposium, Prague, Czech Republic, 2013
Visibility driven visualization of 3D cardiac ultrasound data on the GPU
Proceedings - IEEE Ultrasonics Symposium. 2012
HeartPad: real-time visual guidance for cardiac ultrasound
Proceedings of the Workshop at SIGGRAPH Asia. 2012
Adaptive volume rendering of cardiac 3D ultrasound images - utilizing blood pool statistics
Proceedings of SPIE, the International Society for Optical Engineering. 2012
Fusion of 3D echocardiographic and cardiac magnetic resonance volumes
Proceedings - IEEE Ultrasonics Symposium. 2012
Multi-modal cardiac image fusion and visualization on the GPU
IEEE International Ultrasonics Symposium; Orlando, USA
Per-Pixel Adaptive Opacity Transfer Function for Cardiac Ultrasound Visualization
2011 Joint National Ph.D. Conference in Medical Imaging and MedViz Conference
3D moving boundary conditions for heart CFD simulations - from echocardiographic recordings to discretized surfaces
MekIT'11: Sixth National Conference on Computational Mechanics, Trondheim, 23-24 May 2011. Tapir Akademisk Forlag 2011. ISBN 978-82-519-2798-7. pp. 33-46