This post continues from Biomedical Image Processing – II.
DICOM Image Standard
Medical specialists have been slow to adopt widely accepted standards for image/film storage, display, and transmission. However, one standard has been adopted with reasonable success in the radiology community: DICOM (Digital Imaging and Communications in Medicine), developed progressively since 1983 by ACR/NEMA (the American College of Radiology and the National Electrical Manufacturers Association). DICOM defines a standard for the exchange and storage of medical images from various imaging modalities, including MRI, CT, and ultrasound. It focuses on the communication interface between a host computer and the scanner, but it also defines file format standards to which a system must adhere in order to be considered DICOM-compliant. Some examples of DICOM images are depicted in Figure 7.
Good starter links:
http://www.scispy.com/Standards/DICOM.html
ftp://ftp.philips.com/pub/ms/dicom/DICOM_Information/CookBook.pdf
Figure 7. Radiological images stored in the DICOM standard image format [Vepro Computersysteme GmbH, Cardio Viewing Station, Version 4.41].
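As a quick, illustrative sketch (not part of the standard itself), the snippet below reads a DICOM file and inspects a few header elements plus the pixel data using the open-source pydicom library. The file name example.dcm is hypothetical, and exactly which tags are present depends on the modality and vendor.

```python
# A minimal sketch of reading a DICOM file with the open-source pydicom
# library. "example.dcm" is a hypothetical file name; the tags actually
# present vary by modality and vendor.
import pydicom

ds = pydicom.dcmread("example.dcm")

# A few common header elements (attribute names follow the DICOM keyword
# convention that pydicom exposes).
print("Modality:      ", ds.Modality)          # e.g. "CT", "MR", "US"
print("Image size:    ", ds.Rows, "x", ds.Columns)
print("Bits allocated:", ds.BitsAllocated)

# The decoded pixel data as a NumPy array (NumPy must be installed).
pixels = ds.pixel_array
print("Pixel array:", pixels.shape, pixels.dtype)
```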
Image Analysis
Image analysis encompasses a set of basic image processing operations whose purpose is to query, but not alter, an image. These operations include
- intensity histogram generation,
- image classification, and
- connected components labeling.
Intensity Histogram
An intensity histogram describes the distribution of pixel intensities within an image. It is displayed as the number of pixels (counts) at each intensity level (see Figure 9). Histograms can also be computed for color images, where a separate distribution is displayed for each of the red, green, and blue channels.
Figure 9. Depiction of a histogram for an 8-bit gray scale image (256 intensity levels).
Figure 10. Histogram: 256-level gray scale image (X-ray of a human abdomen) [http://www.medphys.ucl.ac.uk/research/borg/research/NIR_topics/imaging_exp.htm].
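To make this concrete, here is a minimal NumPy sketch that computes the 256-bin histogram of an 8-bit grayscale image; the image is synthetic random data purely so the example is self-contained. For a color image, the same call would be applied to each of the red, green, and blue channels separately.

```python
# A minimal sketch: intensity histogram of an 8-bit grayscale image.
# The "image" is synthetic random data, used only for illustration.
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)

# One count per intensity level, 0..255.
histogram = np.bincount(image.ravel(), minlength=256)

print("Intensity levels:", histogram.size)    # 256
print("Total counts:    ", histogram.sum())   # equals the number of pixels
print("Most common level:", histogram.argmax())
```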
Image Classification
Image classification covers a broad set of algorithms for identifying portions of an image that are related to one another. It is closely related to segmentation, which explicitly separates those regions from the rest of the image. For example, in a fluorescence image of cells, a researcher might want an algorithm that localizes the cell nuclei (see Figure 11) so they can be counted.
Figure 11. Results of an algorithm that attempts to classify and segment portions of a fluorescence image that correspond to cell nuclei.
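As a rough outline of the nucleus-counting idea above, the sketch below classifies pixels with a simple intensity threshold and then counts the resulting bright regions using scipy.ndimage.label. A real pipeline would add noise filtering and a data-driven threshold (for example Otsu's method); this is only an illustration, not the algorithm behind Figure 11.

```python
# A rough sketch of locating and counting bright "nuclei" in a
# fluorescence-style image: threshold, then group the foreground pixels
# into labeled regions. Real data would need smoothing and a robust threshold.
import numpy as np
from scipy import ndimage

def count_nuclei(image, threshold):
    """Return the number of bright connected regions above `threshold`."""
    foreground = image > threshold                    # crude classification step
    labels, num_regions = ndimage.label(foreground)   # segmentation step
    return num_regions

# Synthetic example: a dark field containing two bright square "nuclei".
img = np.zeros((100, 100), dtype=np.uint8)
img[10:20, 10:20] = 200
img[60:75, 40:55] = 220

print(count_nuclei(img, threshold=128))   # -> 2
```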
Connected Components Labeling
Connected components labeling is a process whereby an algorithm scans an image and groups sets of pixels based on a common feature (such as pixel intensity). Once grouped, the pixels are all assigned the same value and labeled as a region. This process is illustrated in Figure 12. Connected components labeling differs from classification in that the algorithm makes no judgment about which components exhibit similar properties; it simply groups pixels that are connected and share the chosen feature.
Figure 12. Example of component labeling based on nearest neighbor analysis [http://www.dai.ed.ac.uk/HIPR2/label.htm].
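To show the scanning-and-grouping step explicitly, here is a small hand-rolled sketch of 4-connected component labeling on a binary image. In practice an optimized routine such as scipy.ndimage.label would be used, but the loop below mirrors the description above: find an unlabeled foreground pixel, assign it the next label, and flood-fill that label to every connected neighbour.

```python
# An illustrative connected components labeling routine for a binary image:
# scan for unlabeled foreground pixels and flood-fill each 4-connected group
# with the next label value. Optimized libraries exist; this is for clarity.
from collections import deque

def label_components(binary):
    """Assign the same integer label to every 4-connected group of 1s."""
    rows, cols = len(binary), len(binary[0])
    labels = [[0] * cols for _ in range(rows)]
    next_label = 0

    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not labels[r][c]:
                next_label += 1                      # start a new region
                labels[r][c] = next_label
                queue = deque([(r, c)])
                while queue:                         # flood-fill the region
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
    return labels

binary = [[1, 1, 0, 0],
          [0, 1, 0, 1],
          [0, 0, 0, 1],
          [1, 0, 0, 1]]

for row in label_components(binary):
    print(row)   # three regions: labels 1, 2, and 3
```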