11:30
Cardiac II
Chair: John van den Dobbelsteen
11:30
15 mins
|
Fake It to Make It: The Generation and Verification of Synthetic Virtual Patient Cohorts of Multi-Vessel Coronary Artery Disease
Pjotr Hilhorst, Bregje van de Wouw, Marcel van 't Veer, Frans van de Vosse, Wouter Huberts
Abstract: In silico clinical trials (ISCTs) are a promising alternative to traditional randomized controlled trials (RCTs) for assessing the safety and effectiveness of clinical decision support tools and clinical devices. To demonstrate their potential, we created a virtual patient cohort (VPC) using a virtual cohort generator (VCG) that incorporates a 1D pulse wave propagation model of the coronary arteries. The VPC was used to replicate, at the population level, the fractional flow reserve (FFR) distribution observed in the FAME study, a real-world RCT. Because existing clinical data are limited, we created a synthetic VPC through random variation of the model input parameters, followed by filtering based on physiological acceptance criteria; this filtering may have introduced correlations among the inputs. To explore the parameter landscape of the VPC and assess its variability, we employed uncertainty quantification and sensitivity analysis methods capable of handling the correlations resulting from the acceptance criteria. The VPC successfully captured the global population variability seen in the FAME study, and stenosis severity was identified as the main contributor to the variability within the VPC.
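The generate-then-filter strategy described in the abstract amounts to rejection sampling. A minimal sketch, with a toy surrogate standing in for the 1D pulse wave model and purely hypothetical parameter ranges and acceptance criteria:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ffr(params):
    # Toy surrogate standing in for the 1D pulse wave propagation model:
    # FFR decreases with stenosis severity (purely illustrative).
    return 1.0 - 0.9 * params["stenosis_severity"] ** 2

def is_physiological(params, ffr):
    # Hypothetical acceptance criteria; the real criteria are clinical.
    return 0.0 <= ffr <= 1.0 and 60.0 <= params["heart_rate"] <= 100.0

# Rejection sampling: draw random inputs, keep only "physiological" patients.
cohort = []
while len(cohort) < 1000:
    params = {
        "stenosis_severity": rng.uniform(0.0, 1.0),
        "heart_rate": rng.uniform(40.0, 120.0),
    }
    ffr = simulate_ffr(params)
    if is_physiological(params, ffr):
        cohort.append({**params, "ffr": ffr})
```

Note that the filtering step makes the accepted inputs jointly distributed rather than independent, which is why correlation-aware uncertainty quantification is needed downstream.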
|
11:45
15 mins
|
Investigating Approximate Computing to design an energy-efficient deep learning architecture for anomaly detection from ECG signals
Dario Capitani, Ghayoor Gillani, Arlene John
Abstract: Background: Cardiovascular diseases (CVD) account for 32% of global deaths, affecting approximately 640 million people. Early detection is critical to prevent fatalities, with electrocardiograms (ECG) being a key tool for detecting irregular heart activities, or arrhythmias. However, many existing arrhythmia detection models are either computationally intensive or sacrifice information through input dimension reduction. This study presents an efficient anomaly detection model using approximate computations in a Multi-Layer Perceptron (MLP).
Methods: Following Chen Zhang et al., an MLP model initially consisting of 76,040 parameters was pruned and optimized down to 4,480 parameters to improve efficiency. To account for unseen arrhythmias, an “other” class was introduced, reducing false positives for unfamiliar patterns. The model was trained using the MIT-BIH Arrhythmia Database and a large-scale 12-lead ECG database, enabling it to generalize across 7 arrhythmia types. Input data were segmented into 256 samples per heartbeat, centered on the R-peak to maintain accurate feature representation. For hardware implementation, the resulting model was deployed on a Field Programmable Gate Array (FPGA) due to its capacity for parallel processing and reconfigurability, which also makes it suitable for Application-Specific Integrated Circuit (ASIC) prototyping. Approximate computing methods, including power-efficient 4x4 multipliers and area-efficient adders, reduced resource consumption by allowing controlled computational errors in the multiplication and addition operations of the dense layers (99.99% of all operations).
Results: The final model architecture included 256 input neurons, two hidden layers of 16 neurons each, and an output layer with 7 classes, utilizing ReLU and softmax activations. The model achieved an average sensitivity of 93%, with sufficient accuracy for practical application. The model size was reduced to 8.75KB using 16-bit fixed-point representation, which is ideal for resource-constrained settings. At a 100 MHz clock frequency, the accurate neuron’s dynamic power was 0.59 mW, while two different approximated neurons used a total of 0.21 mW, resulting in a sensitivity reduction of 2.14%.
Conclusions: This compact MLP model provides a feasible solution for real-time arrhythmia detection. Its efficient FPGA implementation and use of approximate computing techniques make it promising for wearable, low-power cardiac monitoring applications.
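For illustration, the 256-16-16-7 topology and the storage arithmetic from the abstract can be sketched as follows; the weights are random placeholders, not the trained, pruned model, and only the parameter-count arithmetic is taken from the abstract:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# 256 -> 16 -> 16 -> 7 topology from the abstract; weights are random
# placeholders, not the trained model.
rng = np.random.default_rng(0)
W1, b1 = 0.05 * rng.normal(size=(16, 256)), np.zeros(16)
W2, b2 = 0.05 * rng.normal(size=(16, 16)), np.zeros(16)
W3, b3 = 0.05 * rng.normal(size=(7, 16)), np.zeros(7)

beat = rng.normal(size=256)      # one 256-sample, R-peak-centred segment
h1 = relu(W1 @ beat + b1)
h2 = relu(W2 @ h1 + b2)
probs = softmax(W3 @ h2 + b3)    # scores over the 7 arrhythmia classes

# Storage check: 4,480 parameters at 2 bytes each (16-bit fixed point)
# gives 8.75 KB, consistent with the reported model size.
size_kb = 4480 * 2 / 1024
```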
|
12:00
15 mins
|
Exploring cath lab efficiency: a two-step approach
Emanuele Frassini, Teddy Vijfvinkel, Rick Butler, Maarten van der Elst, Benno Hendricks, John van den Dobbelsteen
Abstract: Efficient management of cardiac catheterization laboratories (cath labs) is crucial for maintaining high-quality patient care while controlling costs. This research focused on employing deep learning (DL) models to improve workflow. Our two-step approach consists of first automating the classification of procedural phases during cardiac angiographies (CAGs) and then using these classifications to accurately predict procedure durations. We aim to gain insight into the procedures performed, which could reduce the turnover times between procedures and ultimately improve efficiency.
We collected a dataset of 222 CAG procedures performed in Reinier de Graaf Hospital, Delft. Specifically, we collected video recordings, clinical data, and system logs. We combined the three data sources into one dataset used to train deep learning models, such as Recurrent Neural Networks (RNN), for the automatic classification of workflow phases. Our LSTM model achieved a weighted accuracy of 91.3%, slightly outperforming the LSTM-FCN (90.8%). This reliable classification of procedural phases formed the basis for the subsequent prediction of procedure durations.
We then applied six different DL models to predict the end times of CAG procedures with the classified phases as only input. We employed Convolutional Neural Networks (CNN), RNN and Attention based models. CNN based models, such as InceptionTime, emerged as the best-performing model, with a Mean Absolute Error of less than 5 minutes and Symmetric Mean Absolute Percentage Error below 6%. The end time prediction can be leveraged, for instance, to call the next patient at the optimal time, and thus potentially reducing the turnover time between procedures.
Furthermore, we are currently performing Monte Carlo simulations of the hospital's scheduling process. The aim is to show what could happen if such an automated tool were able to reduce the turnover time between procedures.
This study demonstrates that the integration of three data sources can be reliably employed to automatically classify procedure phases. DL models, particularly CNN-based architectures, can be effectively used to build an automated tool to predict procedure durations with high accuracy. A Monte Carlo simulation model is used to explore potential scenarios that could emerge from integrating this tool into the existing workflow.
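A Monte Carlo scheduling experiment of the kind mentioned above can be sketched as follows; the procedure-duration distribution, case count, and turnover times are illustrative assumptions, not the hospital's fitted values:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_day(n_procedures, turnover_min, rng):
    # Illustrative lognormal durations (~30 min median); a real model
    # would use distributions fitted to the hospital's own data.
    durations = rng.lognormal(mean=3.4, sigma=0.3, size=n_procedures)
    turnover_total = turnover_min * (n_procedures - 1)
    return durations.sum() + turnover_total

n_days = 10_000
baseline = np.array([simulate_day(8, 20, rng) for _ in range(n_days)])
with_tool = np.array([simulate_day(8, 15, rng) for _ in range(n_days)])

# Expected gain: 7 turnovers/day x 5 min saved = ~35 min of cath lab time.
saved_min_per_day = baseline.mean() - with_tool.mean()
```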
|
12:15
15 mins
|
A Skewed-Gaussian Model for Pulse Decomposition Analysis of Photoplethysmography Signals
Giulio Basso, Reinder Haakma, Xi Long, Rik Vullings
Abstract: Pulse Decomposition Analysis (PDA) has been proposed to extract reliable information from photoplethysmography (PPG) morphology, addressing the challenges of remote patient monitoring [1]. PDA decomposes each PPG pulse into its physiological sub-waves using so-called basis functions and then analyses their parameters.
A Gaussian model has been widely used in the literature, even though it often underperforms because it is limited to symmetric morphologies. More advanced asymmetric models, such as the Gamma model, have been proposed to achieve improved accuracy. However, the physiological interpretation of the Gamma model is less effective than the Gaussian model, mainly because it does not have an explicit parameter associated with the temporal location of the sub-waves. These locations carry significant cardiovascular information, and they are used for extracting indexes of clinical interest, such as the stiffness index.
This study aimed to design a PDA model with improved accuracy and effective physiological interpretation. We presented the novel Skewed-Gaussian model, which extends the Gaussian model towards asymmetrical shapes. We implemented multiple models varying the number of basis functions and tested them on 8000 PPG pulses from the MIMIC-III Waveform Database. Performance was compared with the reference Gamma-Gaussian model [2]. Model accuracy was assessed using the residual sum of squares (RSS). Lastly, the model's sensitivity and robustness to the choice of initial values were evaluated using random initial values.
The results show that our model achieved higher accuracy, based on the significantly smaller RSS (median RSS: 6∙10^(-4) vs 8∙10^(-4), p-value<0.001 for proposed and reference best models, respectively) and higher consistency, indicated by the smaller RSS interquartile range (7∙10^(-4) vs. 19∙10^(-4)). The analysis with random initial values suggested that the model was less sensitive to the choice of these values and more robust. Finally, the model might offer effective clinical interpretation because its parameters can directly be associated with the physiological characteristics of the sub-waves, such as their magnitude, location, width and level of skewness.
The proposed model may help to establish a link between alterations in cardiovascular functions and variations detectable in the PPG signal, as well as opening up new avenues for PPG-based remote patient monitoring.
[1] Rubins, Uldis. "Finger and ear photoplethysmogram waveform analysis by fitting with Gaussians." Medical & biological engineering & computing 46 (2008): 1271-1276.
[2] Fleischhauer, Vincent, et al. "Pulse decomposition analysis in photoplethysmography imaging." Physiological Measurement 41.9 (2020): 095009.
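A skewed-Gaussian basis function can be sketched with a standard skew-normal form; the authors' exact parameterization is not given in the abstract, so the amplitude/location/width/skewness form below is an assumption:

```python
import numpy as np
from scipy.special import erf

def skew_gaussian(t, amp, loc, width, alpha):
    # Gaussian multiplied by a skewing term: amp ~ magnitude, loc ~ temporal
    # location, width ~ spread, alpha ~ skewness (alpha = 0 recovers the
    # symmetric Gaussian exactly).
    z = (t - loc) / width
    return amp * np.exp(-0.5 * z**2) * (1.0 + erf(alpha * z / np.sqrt(2.0)))

def pda_model(t, params, n_waves=3):
    # PDA: a PPG pulse modelled as the sum of n_waves basis functions,
    # one per physiological sub-wave.
    p = np.asarray(params).reshape(n_waves, 4)
    return sum(skew_gaussian(t, *row) for row in p)
```

Because `loc` appears explicitly, sub-wave timing (and hence indices such as the stiffness index) can be read directly from the fitted parameters, which is the interpretability advantage claimed over the Gamma model.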
|
12:30
15 mins
|
Multi-spectral optical transmission to investigate the origin of the photoplethysmography signal
Michael Kaya, Tom Knop, Wiendelt Steenbergen, Ata Chizari
Abstract: Background: Photoplethysmography (PPG) is a non-invasive optical technique that measures variations in the propagation of light through tissue driven by cardiac cycles. PPG is well known for its application in pulse oximetry and heart rate estimation in wearables. However, despite the widespread use of PPG, its origin remains uncertain. Several hypotheses have been proposed to explain the origin of the PPG signal. Currently, there are three leading hypotheses: 1) blood volume variations in the probed vascular bed during each cardiac cycle [1], 2) variations in the optical properties of blood [2] and 3) deformation of surrounding tissue by pulsating blood pressure [3].
Objective: This study introduces an experimental method by which the second hypothesis can be tested for a wide range of wavelengths in relation to the PPG signal.
Methods: We developed an experimental setup that accurately controlled and varied the fluid flow rate. Using this flow system, a fluid can be infused and withdrawn in a controlled manner. The flow system was used in conjunction with a setup for optical transmission measurements (400-800 nm). Experiments were carried out on human whole blood flowing through a rigid glass tube (1 mm inner diameter). During these measurements, the flow rate was varied from 0 to 18 mL/min, and we examined how the blood flow rate (through the associated shear rate) influenced optical transmittance across the measured wavelength range.
Results: The calculated normalized transmittance spectra indicate a dependency on flow rate (n = 3). The behaviour of the integrated transmittance spectra reveals three distinguishable flow intervals, corresponding to low, medium and high flow rates, each with a different dependency.
Conclusion: The results support hypothesis 2, concerning the variation in the optical properties of blood, as the normalized transmittance spectra varied with flow rate.
Significance: Our findings suggest that changes in the optical properties of flowing blood during cardiac cycles contribute to the measured PPG signal. Understanding the origin of the PPG signal can aid in the development of new biomedical applications and medical devices.
References:
[1] Moço AV, Stuijk S, de Haan G. New insights into the origin of remote PPG signals in visible light and infrared. Sci Rep. 2018 May 31;8(1):8501. doi: 10.1038/s41598-018-26068-2.
[2] Schmid-Schönbein H, Volger E, Klose HJ. Microrheology and light transmission of blood. II. The photometric quantification of red cell aggregate formation and dispersion in flow. Pflugers Arch. 1972;333(2):140-55. doi: 10.1007/BF00586913.
[3] Volkov MV, Margaryants NB, Potemkin AV, Volynsky MA, Gurov IP, Mamontov OV, Kamshilin AA. Video capillaroscopy clarifies mechanism of the photoplethysmographic waveform appearance. Sci Rep. 2017 Oct 16;7(1):13298. doi: 10.1038/s41598-017-13552-4.
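The normalization and reduction of transmittance spectra described in the Results can be sketched as follows; the spectra here are synthetic placeholders with an assumed flow dependence, not measured data:

```python
import numpy as np

wavelengths = np.linspace(400.0, 800.0, 201)   # nm, matching the setup range
flow_rates = np.array([0.0, 6.0, 12.0, 18.0])  # mL/min, as in the experiment

# Synthetic placeholder spectra: transmittance rises slightly with flow rate.
rng = np.random.default_rng(3)
spectra = (0.5 + 0.02 * flow_rates[:, None] / 18.0
           + 0.005 * rng.normal(size=(flow_rates.size, wavelengths.size)))

# Normalize each spectrum to the zero-flow reference, then reduce each
# normalized spectrum to a single number to study its flow dependence.
normalized = spectra / spectra[0]
integrated = normalized.mean(axis=1)
```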
|
12:45
15 mins
|
Effects of cardiac patch implantation in the infarcted heart
Britt van Kerkhof, Koen Janssens, Peter Bovendeerd
Abstract: Myocardial infarction (MI) is one of the leading causes of death worldwide. Following MI, adverse remodelling can lead to ventricular dilation, fibrosis, and a reduction in overall contractile function, which may progress to heart failure. Cardiac patches, consisting of scaffolds seeded with contractile cardiomyocytes, may promote the development of mature, contractile tissue in vitro. When mounted on the heart, these patches may provide long-term support by enhancing cardiac function and reducing adverse ventricular remodelling.
In a previous study, we employed a finite element model of cardiac mechanics to model a cardiac patch implanted over a chronically remodelled infarct [1]. The infarct had a circular shape, lacked active contraction, and had increased passive tissue stiffness. The patch had a size of 40 by 60 mm and a thickness of 2 mm, equivalent to a volume of 5 ml. While that study provided important insights into the local function of native cardiac tissue and patch functionality, it was limited in its description of the left ventricular (LV) shape, which was approximated by a thick-walled ellipsoid. Additionally, the material properties of the cardiac patch were assumed to match those of native, healthy myocardium; however, the passive material properties of the patch also depend on the characteristics of the scaffold. In this study, we incorporated an anatomically realistic geometry along with more accurate passive material properties for both the infarcted tissue and the patch.
A 15% infarct led to a 34% reduction in cardiac function. The patch restored 5% of the lost stroke work, with one-third of this improvement attributed to the patch and the remaining two-thirds to improvements in the native tissue, caused by a favorable change in mechanical loading conditions induced by the patch.
The low patch-induced improvement in cardiac function is due to both the low patch volume and the limited strain excursion of the patch tissue, resulting from the underlying stiff infarct. Furthermore, the stiffer properties of the patch, influenced by the characteristics of the scaffold, further diminished the patch's functionality. The incorporation of a realistic geometry did not significantly impact the results compared to our previous study [1].
[1] Janssens, K.L.P.M., & Bovendeerd, P.H.M. (2024). Impact of cardiac patch alignment on restoring post-infarct ventricular function. Biomechanics and Modeling in Mechanobiology, 1-14.
|