This paper reports an automatic method for characterizing the quality of the RR time series in the stress test database known as DICARDIA. The proposed methodology is simple and consists of subdividing the RR time series into a set of windows and estimating the quantity of artifacts based on a threshold value that depends on the standard deviation of the RR time series for each recorded lead. In a first stage, a manual annotation was performed considering four quality classes for the RR time series (Reference Lead, Good Lead, Low Quality Lead and Useless Lead). Automatic annotation was then performed varying the number of windows and the threshold value for the standard deviation of the RR time series. The metric used for evaluating the quality of the annotation was the Matching Ratio. The best results were obtained using a higher number of windows and considering only three classes (Good Lead, Low Quality Lead and Useless Lead). The proposed methodology enables the use of the online-available DICARDIA Stress Test database for different types of research.
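As an illustration of the windowing idea described above, the following Python sketch flags artifact samples whose deviation exceeds a multiple of the series standard deviation and classes a lead by its artifact fraction. The deviation rule, the class thresholds and all parameter values are illustrative assumptions, not the exact DICARDIA annotation criteria.

import numpy as np

def annotate_rr_quality(rr, n_windows=20, k=2.0, low_q=0.05, useless=0.20):
    """Class an RR lead from the fraction of artifact-like samples.

    A sample counts as an artifact when it deviates from its window mean
    by more than k times the standard deviation of the full series.
    Thresholds (low_q, useless) are hypothetical, for illustration only.
    """
    rr = np.asarray(rr, dtype=float)
    threshold = k * rr.std()
    fractions = []
    for window in np.array_split(rr, n_windows):
        artifacts = np.abs(window - window.mean()) > threshold
        fractions.append(artifacts.mean())
    ratio = float(np.mean(fractions))
    if ratio < low_q:
        return "Good Lead", ratio
    if ratio < useless:
        return "Low Quality Lead", ratio
    return "Useless Lead", ratio

# Example: a clean synthetic RR series with a few injected spikes
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(1000)
rr[::97] = 2.0  # simulated ectopic/artifact beats
print(annotate_rr_quality(rr))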
A comparison of several Level Set algorithms is performed with respect to 2D left ventricle segmentation in Multi-Slice CT (MSCT) images. Five algorithms are compared by calculating the Dice coefficient between the resulting segmentation contour and a reference contour traced by a cardiologist. The algorithms are also tested on images contaminated with Gaussian noise for several values of PSNR. Additionally, an algorithm for providing the initialization shape is proposed. This algorithm is based on a combination of mathematical morphology tools with watershed and region growing algorithms. Results on the set of test images are promising and suggest extending the approach to 3D MSCT database segmentation.
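A minimal Python sketch of the two evaluation ingredients mentioned above: the Dice coefficient between binary segmentation masks, and Gaussian contamination at a target PSNR. Function names and the max_val default are assumptions for illustration.

import numpy as np

def dice_coefficient(seg, ref):
    """Dice similarity between two binary masks: 2|A and B| / (|A| + |B|)."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    denom = seg.sum() + ref.sum()
    # Define two empty masks as perfect agreement
    return 1.0 if denom == 0 else 2.0 * np.logical_and(seg, ref).sum() / denom

def add_gaussian_noise(img, psnr_db, max_val=255.0):
    """Add zero-mean Gaussian noise whose standard deviation yields the
    target PSNR, using PSNR = 20*log10(max_val / sigma)."""
    sigma = max_val / 10.0 ** (psnr_db / 20.0)
    return img + np.random.default_rng(0).normal(0.0, sigma, np.shape(img))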
KEYWORDS: Information security, Signal processing, Telemedicine, Electrocardiography, Electronic health records, Data storage, Databases, Cardiology, Lead, Telecardiology
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the functions of the DIGICARDIAC electrocardiograph. The processing tools support analysis of the HRECG signal in search of patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows the system to communicate with other health care centers, decreasing the access time to patient information. The CEHR system was developed entirely with open source software. Preliminary results of the process validation showed the efficiency of the system.
This paper reports a comparison between three fetal ECG (fECG) detectors developed during the CinC 2013 challenge for fECG detection. Algorithm A1 is based on Independent Component Analysis, A2 on detection of the RS slope, and A3 on Expectation-Weighted Estimation of Fiducial Points. The proposed methodology was validated using the annotated database available for the challenge. Each detector was characterized in terms of its performance using measures of sensitivity (Se), positive predictive value (P+) and delay time (td). Additionally, the database was contaminated with white noise for two SNR conditions. Decision fusion was tested considering the most common types of detector combination. Results show that the decision fusion of A1 and A2 improves fQRS detection, maintaining high Se and P+ even under low SNR conditions without a significant increase in td.
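The performance measures can be computed from detection and reference annotation times as in the following sketch; the matching tolerance and the AND-style fusion rule are assumptions for illustration, not the official CinC 2013 scoring window or the paper's exact fusion scheme.

import numpy as np

def qrs_performance(detected, reference, tol=0.05):
    """Sensitivity (Se) and positive predictive value (P+) of a detector.

    A detection is a true positive when it lies within tol seconds of an
    as-yet-unmatched reference annotation (greedy nearest matching).
    """
    detected = np.sort(np.asarray(detected, float))
    reference = np.sort(np.asarray(reference, float))
    used = np.zeros(len(detected), dtype=bool)
    tp = 0
    for r in reference:
        if len(detected) == 0:
            break
        i = int(np.argmin(np.abs(detected - r)))
        if not used[i] and abs(detected[i] - r) <= tol:
            used[i] = True
            tp += 1
    fn, fp = len(reference) - tp, len(detected) - tp
    se = tp / (tp + fn) if (tp + fn) else 0.0
    ppv = tp / (tp + fp) if (tp + fp) else 0.0
    return se, ppv

def fuse_and(det_a, det_b, tol=0.05):
    """One common fusion rule: keep A's detections confirmed by B within tol."""
    det_b = np.sort(np.asarray(det_b, float))
    return [d for d in det_a
            if len(det_b) and np.min(np.abs(det_b - d)) <= tol]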
Optical flow enables the accurate estimation of cardiac motion. In this research, a sparsity-based algorithm is used to estimate the optical flow in cardiac magnetic resonance images. The dense optical flow field is represented using a discrete cosine basis dictionary aiming at a sparse representation. The optical flow is estimated in this transformed space by solving an L1 minimization problem inspired by compressive sensing techniques. The algorithm is validated using four synthetic image sequences whose velocity field is known. A comparison is performed with respect to the Horn and Schunck and the Lucas and Kanade algorithms. Then, the technique is applied to a magnetic resonance image sequence. The results show average magnitude errors as low as 0.35% for the synthetic sequences; however, results on real data are not consistent with those obtained by the other algorithms, suggesting the need for additional constraints to cope with the dense noise.
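To illustrate the sparse DCT representation, the following Python sketch performs a single L1 proximal step on one flow-field component: with an orthonormal DCT dictionary, the problem argmin_c 0.5*||u - IDCT(c)||^2 + lambda*||c||_1 has the closed-form solution of soft-thresholding the DCT coefficients. This is only one ingredient of such a method, not the paper's full estimator, which couples the sparsity prior with an optical-flow data term; the lambda value and grid size are assumed.

import numpy as np
from scipy.fft import dctn, idctn

def sparse_dct_flow(flow, lam=0.05):
    """Sparse DCT approximation of a dense flow component via
    soft-thresholding of its orthonormal DCT coefficients."""
    c = dctn(flow, norm="ortho")
    c = np.sign(c) * np.maximum(np.abs(c) - lam, 0.0)  # soft threshold
    return idctn(c, norm="ortho"), c

# Example: a smooth synthetic flow component compresses to few coefficients
y, x = np.mgrid[0:64, 0:64]
u = np.sin(2 * np.pi * x / 64) * np.cos(2 * np.pi * y / 64)
u_hat, coeffs = sparse_dct_flow(u)
print(np.count_nonzero(coeffs), "nonzero DCT coefficients")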
The dynamics of curvature and torsion are important for the geometric description of arteries and for the distribution of accumulating plaque. In this research, two methods for estimating curvature and torsion are analyzed with respect to their accuracy. The first method estimates the curvature and torsion of the artery centerline using the Fourier transform. Since the centerline always represents an open curve, extensions ensuring a minimal spectral energy are added on both ends to obtain a closed curve suitable for Fourier analysis. The second method has been previously used for analyzing the motion of coronary arteries and is based on the least squares fitting of a cubic polynomial to the centerline of the artery. Validation is performed using two mathematical, time-varying phantoms as well as 4-D (3-D plus time) in-vivo data of coronary arteries reconstructed by fusion of biplane angiograms and intravascular ultrasound images. Results show that both methods estimate curvature and torsion accurately, with average errors below 2.15%.
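The second method can be sketched in Python as below: each coordinate of a short centerline segment is fitted with a cubic polynomial by least squares, and curvature and torsion follow from the Frenet formulas kappa = |r' x r''| / |r'|^3 and tau = (r' x r'') . r''' / |r' x r''|^2. The sampling and parametrization choices are illustrative assumptions.

import numpy as np

def curvature_torsion_cubic(points, t):
    """Curvature and torsion at the midpoint of a centerline segment,
    from a least-squares cubic fit of each coordinate."""
    points = np.asarray(points, dtype=float)          # shape (n, 3)
    coeffs = [np.polyfit(t, points[:, i], 3) for i in range(3)]
    tm = t[len(t) // 2]
    d1 = np.array([np.polyval(np.polyder(c, 1), tm) for c in coeffs])
    d2 = np.array([np.polyval(np.polyder(c, 2), tm) for c in coeffs])
    d3 = np.array([np.polyval(np.polyder(c, 3), tm) for c in coeffs])
    cross = np.cross(d1, d2)
    kappa = np.linalg.norm(cross) / np.linalg.norm(d1) ** 3
    tau = np.dot(cross, d3) / np.dot(cross, cross)
    return kappa, tau

# Example: a helix (cos s, sin s, 0.5 s) has constant curvature 0.8 and
# torsion 0.4; a cubic fit on a short arc recovers them approximately.
s = np.linspace(0.0, 0.6, 60)
helix = np.column_stack([np.cos(s), np.sin(s), 0.5 * s])
print(curvature_torsion_cubic(helix, s))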
KEYWORDS: Intravascular ultrasound, Angiography, Arteries, 3D modeling, In vivo imaging, Image segmentation, Data modeling, Bismuth, Adaptive optics, Data fusion
Plaque in native coronary arteries is hypothesized to accumulate more likely along the inner curvature of a vessel segment than along its outer curvature. This behavior is likely associated with differences in local shear stress, which tends to be lower on the inner bend of a curved vessel than on the outer bend. The reported in-vivo study evaluated how the circumferential plaque distribution depends on local vessel curvature in coronaries from a limited set of 12 patients. Geometrically correct models of the vessel segments were generated utilizing fusion between biplane angiography and intravascular ultrasound. The plaque thickness was derived from the 3-D borders of the lumen/plaque and media/adventitia interfaces. Within each frame, plaque thickness was classified into "below-average" and "above-average" regions. A local curvature index was defined for each point: a positive value indicates the "inner" curvature, a negative value the "outer" curvature, with the magnitude determined from differential geometry. In the majority of the examined vessels, the "below-average/outer-curvature" and "above-average/inner-curvature" regions combined outweighed the "below-average/inner-curvature" and "above-average/outer-curvature" regions. This ratio increased with the threshold used to exclude lower-curvature regions, confirming the hypothesis that plaque is more likely to accumulate on the luminal surface along the inner curvature of the coronary segment.
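A simplified Python sketch of the cross-tabulation described above: plaque thickness is split into below/above-average groups and crossed with the sign of the local curvature index. The global-mean classification (the study works per frame) and the scalar agreement metric are simplifying assumptions for illustration.

import numpy as np

def curvature_plaque_agreement(thickness, curvature_index):
    """Fraction of points in the hypothesis-supporting quadrants
    (above-average/inner or below-average/outer).
    Uses a global mean instead of the study's per-frame classification."""
    thickness = np.asarray(thickness, float)
    curvature_index = np.asarray(curvature_index, float)
    if thickness.size == 0:
        return float("nan")
    above = thickness > thickness.mean()
    inner = curvature_index > 0.0          # positive index = inner bend
    supporting = (above & inner) | (~above & ~inner)
    return supporting.mean()

def agreement_vs_threshold(thickness, curvature_index, thresholds):
    """Agreement restricted to |curvature index| above each threshold,
    mirroring the exclusion of lower-curvature regions."""
    curvature_index = np.asarray(curvature_index, float)
    thickness = np.asarray(thickness, float)
    return [curvature_plaque_agreement(thickness[np.abs(curvature_index) >= th],
                                       curvature_index[np.abs(curvature_index) >= th])
            for th in thresholds]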