One Year’s Results from a Server-Based System for Performing Reject Analysis and Exposure Analysis in Computed Radiography
Abstract
Rejected images represent both unnecessary radiation exposure to patients and inefficiency in the imaging operation. Rejected images are inherent to projection radiography, where patient positioning and alignment are integral components of image quality. Patient motion and artifacts unique to digital image receptor technology can also result in rejected images. We present a centralized, server-based solution for the collection, archival, and distribution of rejected image and exposure indicator data that automates the data collection process. Reject analysis program (RAP) and exposure indicator data were collected and analyzed during a 1-year period. RAP data were sorted both by reason for repetition and by body part examined. Data were also stratified by clinical area for further investigation. The monthly composite reject rate for our institution fluctuated between 8% and 10%. Positioning errors were the main cause of repeated images (77.3%). Stratification of data by clinical area revealed that areas where computed radiography (CR) is seldom used suffer from higher reject rates than areas where it is used frequently. S values were log-normally distributed for examinations performed under either manual or automatic exposure control. The distributions were positively skewed and leptokurtic. Decreases in S value attributable to radiologic technology student rotations and to CR plate reader calibrations were observed. Our data demonstrate that reject analysis is still necessary and useful in the era of digital imaging. It is vital, though, that reject analysis be combined with exposure indicator analysis, as digital radiography is not self-policing in terms of exposure. When combined, the two programs are a powerful tool for quality assurance.
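The sketch below is a minimal illustration of the kind of analysis the abstract describes: computing a monthly composite reject rate, tabulating rejects by reason and by clinical area/body part, and examining the shape of the S-value distribution. It is not the authors' RAP implementation; the file name exposure_log.csv and the column names (date, clinical_area, body_part, reject_reason, S) are assumptions made for illustration only.

```python
# Illustrative sketch, not the authors' RAP tool. Assumes a hypothetical
# per-exposure log exported as CSV with columns: date, clinical_area,
# body_part, reject_reason (empty for accepted images), and S (exposure
# indicator). All names below are assumptions.
import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("exposure_log.csv", parse_dates=["date"])
df["rejected"] = df["reject_reason"].notna() & (df["reject_reason"] != "")

# Monthly composite reject rate: rejected exposures / total exposures, per month.
monthly_rate = df.groupby(df["date"].dt.to_period("M"))["rejected"].mean() * 100
print("Monthly composite reject rate (%):")
print(monthly_rate.round(1))

# Rejects stratified by reason, and by clinical area and body part.
rejects = df[df["rejected"]]
print(rejects["reject_reason"].value_counts(normalize=True).mul(100).round(1))
print(rejects.groupby(["clinical_area", "body_part"]).size())

# Shape of the S-value distribution: a log-normally distributed indicator
# shows positive skewness and excess (lepto-) kurtosis in its raw form and
# looks approximately normal after a log transform.
s = df["S"].dropna()
print("skewness:", stats.skew(s), "excess kurtosis:", stats.kurtosis(s))
print("Shapiro-Wilk p for log(S):", stats.shapiro(np.log(s)).pvalue)
```

Grouping by clinical area in this way is what would surface the pattern reported in the abstract, namely that areas using CR infrequently exhibit higher reject rates than areas using it routinely.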