Capturing contextual dependencies in medical imagery using hierarchical multi-scale models

P. Sajda1, C. Spence2, L. Parra2
1Department of Biomedical Engineering, Columbia University, New York, NY, USA
2Vision Technologies Laboratory, Sarnoff Corporation, Princeton, NJ, USA

Abstract

In this paper we summarize our results for two classes of hierarchical multi-scale models that exploit contextual information for the detection of structure in mammographic imagery. The first, the hierarchical pyramid neural network (HPNN), is a discriminative model capable of integrating information either coarse-to-fine or fine-to-coarse for microcalcification and mass detection. The second, the hierarchical image probability (HIP) model, captures short-range and contextual dependencies through a combination of coarse-to-fine factoring and a set of hidden variables. Being a generative model, the HIP model has broad utility, and we present results for classification, synthesis, and compression of mammographic mass images. Together, the two models demonstrate the utility of the hierarchical multi-scale framework for computer-assisted detection and diagnosis.
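To make the coarse-to-fine integration concrete, the following is a minimal, untrained sketch (not the authors' implementation): an image pyramid is processed from the coarsest level downward, and the output map produced at each coarse level is upsampled and supplied as an additional contextual input to the next finer level. All names here (gaussian_pyramid, coarse_to_fine_detector, the single-layer pixel-wise "network", the random weights) are illustrative assumptions only; the actual HPNN and HIP models use trained networks and a probabilistic factoring, respectively.

import numpy as np

def gaussian_pyramid(image, levels):
    """Build a crude image pyramid by 2x2 averaging and factor-2 decimation."""
    pyramid = [image]
    for _ in range(levels - 1):
        img = pyramid[-1]
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
        img = img[:h, :w]
        coarser = (img[0::2, 0::2] + img[1::2, 0::2] +
                   img[0::2, 1::2] + img[1::2, 1::2]) / 4.0
        pyramid.append(coarser)
    return pyramid  # pyramid[0] is the finest level

def upsample(feature_map, target_shape):
    """Nearest-neighbour upsampling so coarse outputs align with the finer grid."""
    reps0 = int(np.ceil(target_shape[0] / feature_map.shape[0]))
    reps1 = int(np.ceil(target_shape[1] / feature_map.shape[1]))
    up = np.repeat(np.repeat(feature_map, reps0, axis=0), reps1, axis=1)
    return up[:target_shape[0], :target_shape[1]]

def pixelwise_net(inputs, weights, bias):
    """Toy one-layer pixel-wise 'network': sigmoid of a weighted sum of input maps."""
    z = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + np.exp(-z))

def coarse_to_fine_detector(image, levels=3, seed=None):
    """Process the pyramid from coarsest to finest; each coarse output map is
    upsampled and fed as an extra (contextual) input at the next finer level."""
    rng = np.random.default_rng(seed)
    pyramid = gaussian_pyramid(image, levels)
    context = None
    for level in reversed(range(levels)):
        feat = pyramid[level]
        inputs = [feat]
        if context is not None:
            inputs.append(upsample(context, feat.shape))
        # Untrained random weights; in practice these would be learned per level.
        weights = rng.normal(size=len(inputs))
        context = pixelwise_net(inputs, weights, bias=0.0)
    return context  # full-resolution map of (toy) detection probabilities

if __name__ == "__main__":
    image = np.random.default_rng(0).random((64, 64))
    prob_map = coarse_to_fine_detector(image, levels=3, seed=1)
    print(prob_map.shape, float(prob_map.min()), float(prob_map.max()))

The point of the sketch is only the information flow: contextual evidence computed at low resolution constrains decisions made at high resolution, which is the mechanism both the HPNN and the HIP model exploit in different (discriminative vs. generative) ways.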

Keywords

Biomedical imaging; Context modeling; HIP; Image analysis; Medical diagnostic imaging; Neural networks; Biomedical engineering; Image coding; Robustness; Object detection
