A smartphone mobile medical application, previously presented as a tool for individuals with hand arthritis to assess and monitor the progress of their disease, has been modified and expanded to extract anatomical features of the hand (joint/finger width and angulation) and foot (length, width, big toe angle, and arch height index) from smartphone camera images.
Image processing algorithms and automated measurements were validated through tests on digital hand models, rigid plastic hand models, and real human hands and feet to determine accuracy and reproducibility compared with conventional measurement tools such as calipers, rulers, and goniometers. The mobile application provided finger joint width measurements with an accuracy better than 0.34 (±0.25) millimeters and joint angulation measurements with an accuracy better than 0.50 (±0.45) degrees. The automatically calculated foot length accuracy was 1.20 (±1.27) millimeters and the foot width accuracy was 1.93 (±1.92) millimeters. Hallux valgus angle (used in assessing bunions) accuracy was 1.30 (±1.29) degrees. Arch height index (AHI) measurements had an accuracy of 0.02 (±0.01).
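The sketch below illustrates two of the quantities reported above; it is not the authors' code. It assumes the conventional AHI definition (dorsum height at 50% of foot length divided by truncated foot length) and assumes that "accuracy" is summarized as the mean absolute deviation (±standard deviation) between app measurements and a reference instrument such as a ruler or caliper; the numeric values are illustrative only.

```python
# Minimal sketch (assumptions stated above), not the published implementation.
from statistics import mean, stdev


def arch_height_index(dorsum_height_mm: float, truncated_foot_length_mm: float) -> float:
    """AHI = dorsum height at 50% of foot length / truncated foot length (standard definition)."""
    return dorsum_height_mm / truncated_foot_length_mm


def accuracy_summary(app_values, reference_values):
    """Mean absolute deviation (±SD) between app measurements and a reference tool."""
    errors = [abs(a - r) for a, r in zip(app_values, reference_values)]
    return mean(errors), stdev(errors)


# Illustrative example: foot length from the app vs. a ruler (values are made up).
app_mm = [251.2, 243.8, 260.5]
ruler_mm = [250.0, 245.0, 259.0]
mae, sd = accuracy_summary(app_mm, ruler_mm)
print(f"foot length accuracy: {mae:.2f} (±{sd:.2f}) mm")
print(f"AHI example: {arch_height_index(62.0, 180.0):.2f}")
```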
Combined with in-app documentation of symptoms, treatment, and lifestyle factors, the anatomical feature measurements can be used by both healthcare professionals and manufacturers. Applications include: diagnosing hand osteoarthritis; providing custom finger splint measurements; providing compression glove measurements for burn and lymphedema patients; determining foot dimensions for custom shoe sizing, insoles, orthotics, or foot splints; and assessing arch height index and bunion treatment effectiveness.
A smartphone mobile medical application is presented that analyzes the health of facial skin using a smartphone image and cloud-based image processing techniques. The mobile application uses the camera to capture a front face image of a subject, after which the captured image is spatially calibrated based on fiducial points such as the position of the iris of the eye. A facial recognition algorithm is used to identify features of the human face image, to normalize the image, and to define facial regions of interest (ROIs) for acne assessment. Acne lesions are identified and classified into two categories: papules and pustules.
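The abstract states only that spatial calibration uses fiducial points such as the iris position. The sketch below shows one common way such a calibration can work, assuming (as a labeled assumption, not a statement of the published method) that the scale is set from a nominal population-average horizontal iris diameter of about 11.7 mm.

```python
# Sketch of iris-based spatial calibration; the nominal iris diameter and the
# helper names are assumptions for illustration, not the app's actual API.

NOMINAL_IRIS_DIAMETER_MM = 11.7  # assumed population-average value


def mm_per_pixel(iris_diameter_px: float,
                 iris_diameter_mm: float = NOMINAL_IRIS_DIAMETER_MM) -> float:
    """Scale factor converting pixel distances in the face image to millimeters."""
    return iris_diameter_mm / iris_diameter_px


def lesion_size_mm(lesion_diameter_px: float, scale_mm_per_px: float) -> float:
    """Convert a detected lesion's pixel diameter to millimeters using the scale factor."""
    return lesion_diameter_px * scale_mm_per_px


# Example: an iris segmented at 96 px across gives ~0.12 mm/px,
# so a 30 px lesion is roughly 3.7 mm in diameter.
scale = mm_per_pixel(96.0)
print(f"{scale:.3f} mm/px, lesion ≈ {lesion_size_mm(30.0, scale):.1f} mm")
```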
Automated facial acne assessment was validated through tests on 60 digital human model images and 10 real human face images. The application identified 92% of acne lesions within five facial ROIs, and the classification accuracy for separating papules from pustules was 98%.
Combined with in-app documentation of treatment and lifestyle factors, the automated facial acne assessment can be used in both cosmetic and clinical dermatology. It allows users to quantitatively self-measure acne severity and treatment efficacy on an ongoing basis, helping them manage their chronic facial acne.