
Proceedings Paper

Visualizing and enhancing a deep learning framework using patients age and gender for chest x-ray image retrieval
Author(s): Yaron Anavi; Ilya Kogan; Elad Gelbart; Ofer Geva; Hayit Greenspan

Paper Abstract

We explore the combination of text metadata, such as patients' age and gender, with image-based features for X-ray chest pathology image retrieval. We focus on a feature set extracted from a pre-trained deep convolutional network, shown in earlier work to achieve state-of-the-art results. Two distance measures are explored: a descriptor-based measure, which computes the distance between image descriptors, and a classification-based measure, which compares the corresponding SVM classification probabilities. We show that retrieval results improve once the age and gender information is combined with the features extracted from the last layers of the network, with the best results obtained using the classification-based scheme. Visualization of the X-ray data is presented by embedding the high-dimensional deep learning features in a 2-D space while preserving the pairwise distances using the t-SNE algorithm. The 2-D visualization gives the unique ability to find groups of X-ray images that are similar to the query image and to one another, a characteristic not available in a traditional 1-D ranking.
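To make the two distance measures and the t-SNE visualization described above more concrete, the following is a minimal Python sketch of such a retrieval pipeline. It is illustrative only: it assumes deep descriptors have already been extracted from a pre-trained CNN, uses random placeholder data, and fuses age and gender by simple concatenation; all variable names and parameter choices are assumptions, not the paper's exact method.

```python
# Hypothetical sketch: descriptor-based vs. classification-based retrieval
# distances, with age/gender metadata fused into the image descriptors,
# plus a 2-D t-SNE embedding for visualization. Placeholder data throughout.
import numpy as np
from sklearn.svm import SVC
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_images, n_deep = 200, 4096                       # e.g. an fc7-sized descriptor (assumption)
deep_features = rng.normal(size=(n_images, n_deep))        # stand-in for CNN features
age = rng.integers(18, 90, size=(n_images, 1)).astype(float)
gender = rng.integers(0, 2, size=(n_images, 1)).astype(float)
labels = rng.integers(0, 4, size=n_images)                 # placeholder pathology classes

# Fuse image descriptors with age/gender metadata by concatenation
# (one plausible scheme; the paper's exact weighting may differ).
X = StandardScaler().fit_transform(np.hstack([deep_features, age, gender]))

query, database = X[0], X[1:]
db_labels = labels[1:]

# 1) Descriptor-based measure: Euclidean distance between fused descriptors.
descriptor_dist = np.linalg.norm(database - query, axis=1)
ranking_descriptor = np.argsort(descriptor_dist)

# 2) Classification-based measure: distance between SVM class-probability vectors.
svm = SVC(probability=True).fit(database, db_labels)
probs_db = svm.predict_proba(database)
probs_q = svm.predict_proba(query[None, :])
classification_dist = np.linalg.norm(probs_db - probs_q, axis=1)
ranking_classification = np.argsort(classification_dist)

# 2-D visualization: embed the high-dimensional features with t-SNE so that
# groups of mutually similar X-rays become visible, unlike a 1-D ranked list.
embedding_2d = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
```

With real data, the two rankings could be compared directly against ground-truth pathology labels, and the 2-D embedding plotted with the query highlighted to inspect its neighborhood.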

Paper Details

Date Published: 7 July 2016
PDF: 6 pages
Proc. SPIE 9785, Medical Imaging 2016: Computer-Aided Diagnosis, 978510 (7 July 2016); doi: 10.1117/12.2217587
Author Affiliations:
Yaron Anavi, Tel Aviv Univ. (Israel)
Ilya Kogan, Tel Aviv Univ. (Israel)
Elad Gelbart, Tel Aviv Univ. (Israel)
Ofer Geva, Tel Aviv Univ. (Israel)
Hayit Greenspan, Tel Aviv Univ. (Israel)


Published in SPIE Proceedings Vol. 9785:
Medical Imaging 2016: Computer-Aided Diagnosis
Georgia D. Tourassi; Samuel G. Armato III, Editor(s)

© SPIE.