We introduce a novel remote volume rendering pipeline for medical visualization targeted at mHealth (mobile health) applications. The need for such a pipeline stems from the large size of the medical imaging data produced by current CT and MRI scanners, combined with the complexity of volumetric rendering algorithms. For example, the resolution of typical CT Angiography (CTA) data easily reaches 512^3 voxels and can exceed 6 gigabytes once the time domain is included, as when capturing a beating heart. This explosion in data size makes transfers to mobile devices challenging, and even when the transfer problem is resolved, the rendering performance of the device remains a bottleneck. To address this issue, we propose a thin-client architecture in which all of the data reside on a remote server, where the image is rendered and then streamed to the client mobile device. We utilize the display and interaction capabilities of the mobile device while performing interactive volume rendering on a server capable of handling large datasets. Specifically, upon user interaction the volume is rendered on the server and encoded into an H.264 video stream. H.264 is ubiquitously hardware accelerated, resulting in faster compression and lower power requirements. The choice of low-latency CPU- and GPU-based encoders is particularly important in enabling the interactive nature of our system. We demonstrate a prototype of our framework using various medical datasets on commodity tablet devices.
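To make the server-side render-and-encode loop concrete, the following is a minimal Python sketch, not the paper's implementation: it assumes a hypothetical render_volume() renderer and a hypothetical user-interaction event source, and pipes raw frames into an ffmpeg libx264 encoder configured for low latency (a hardware encoder such as NVENC could be substituted). The resolution, frame rate, and client address are illustrative placeholders.

import subprocess
import numpy as np

WIDTH, HEIGHT, FPS = 1024, 768, 30

def render_volume(camera_params):
    """Placeholder for the server-side volume renderer (hypothetical).
    Returns one rendered RGB frame as a (HEIGHT, WIDTH, 3) uint8 array."""
    return np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)

def iter_user_interactions():
    """Hypothetical event source: yields camera parameters on each user interaction."""
    for angle in range(0, 360, 5):
        yield {"azimuth": angle}

# Launch an H.264 encoder tuned for low latency and stream MPEG-TS over UDP
# to the mobile client. Raw RGB frames are fed to ffmpeg on stdin.
encoder = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "rgb24",
        "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
        "-i", "-",                                  # raw frames arrive on stdin
        "-c:v", "libx264",
        "-preset", "ultrafast", "-tune", "zerolatency",
        "-f", "mpegts", "udp://client-tablet.local:5000",  # hypothetical client address
    ],
    stdin=subprocess.PIPE,
)

# On each interaction, re-render the volume and push the frame to the encoder,
# which compresses and streams it to the client device.
for camera_params in iter_user_interactions():
    frame = render_volume(camera_params)
    encoder.stdin.write(frame.tobytes())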
The goal of this paper is to investigate and evaluate image quality in ultrasound imaging under different imaging conditions, based on quality metrics and visual perception. We first generate and simulate B-mode ultrasound images of various objects using the Field-II simulation toolbox [1]. We then implement and embed front-end functional modules, mid-end functions including receive beamforming, envelope detection, and log compression, and back-end image processing methods (filtering and image enhancement techniques). Ultrasound images are evaluated in pairs using various image quality metrics and visual perception assessments. The experimental results of this study show that: (1) significantly better image quality is obtained using over-sampling and signal emphasis (high signal level); (2) normalized images with speckle filters are rated better in terms of quality index; and (3) enhanced images show better visual perception on all simulation datasets. This paper shows the utility of our MATLAB test bench for assessing simulated image quality and further demonstrates that this evaluation design is an important pre-processing step, especially for the hardware design of ultrasound systems.
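As an illustration of the mid-end stages named above, the following is a minimal Python sketch of envelope detection and log compression, not the paper's MATLAB test bench. It assumes beamformed RF scan lines (e.g., exported from a Field-II simulation) as input, and the 60 dB dynamic range is an illustrative choice.

import numpy as np
from scipy.signal import hilbert

def envelope_detect(rf_lines):
    """Envelope detection via the analytic signal (Hilbert transform).
    rf_lines: 2-D array with one beamformed RF scan line per column."""
    return np.abs(hilbert(rf_lines, axis=0))

def log_compress(envelope, dynamic_range_db=60.0):
    """Log-compress the envelope to the given dynamic range, normalized to [0, 1]."""
    env = envelope / (envelope.max() + np.finfo(float).eps)
    db = 20.0 * np.log10(env + np.finfo(float).eps)
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)

# Example: turn simulated RF data into a displayable B-mode image.
rf = np.random.randn(2048, 128)            # placeholder for beamformed RF lines
bmode = log_compress(envelope_detect(rf))  # values in [0, 1], ready for display or metrics

Pairwise quality metrics and back-end filtering or enhancement would then be applied to the log-compressed B-mode output.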