PACS-Based Multimedia Imaging Informatics - H. K. Huang - E-Book


H. K. Huang

Description

Thoroughly revised to present the very latest in PACS-based multimedia in medical imaging informatics--from the electronic patient record to the full range of topics in digital medical imaging--this new edition by the founder of PACS and multimedia imaging informatics features even more clinically applicable material than ever before. It uses the framework of PACS-based imaging informatics, not physics or engineering principles, to explain PACS-based multimedia informatics and its application in clinical settings and labs. New topics include Data Grid and Cloud Computing, the IHE XDS-I Workflow Profile (Integrating the Healthcare Enterprise Cross-enterprise Document Sharing for Imaging), and extending XDS to share images, diagnostic reports, and related information across a group of enterprise health care sites. PACS-Based Multimedia Imaging Informatics is presented in 4 parts. Part 1 covers the beginning and history of Medical Imaging, PACS, and Imaging Informatics. The other three parts cover Medical Imaging, Industrial Guidelines, Standards, and Compliance; Informatics, Data Grid, Workstation, Radiation Therapy, Simulators, Molecular Imaging, Archive Server, and Cloud Computing; and Multimedia Imaging Informatics, Computer-Aided Diagnosis (CAD), Image-Guided Decision Support, Proton Therapy, Minimally Invasive Multimedia Image-Assisted Surgery, and Big Data.
* New chapter on Molecular Imaging Informatics
* Expanded coverage of PACS and eHRs (Electronic Health Records), with HIPAA compliance
* New coverage of PACS-based CAD (Computer-Aided Diagnosis)
* Reorganized and expanded clinical chapters, each discussing one distinct clinical application
* Minimally invasive image-assisted surgery in translational medicine
* Authored by the world's first and still leading authority on PACS and medical imaging

PACS-Based Multimedia Imaging Informatics: Basic Principles and Applications, 3rd Edition is the single most comprehensive and authoritative resource that thoroughly covers the critical issues of PACS-based hardware and software design and implementation in a systematic and easily comprehensible manner. It is a must-have book for all those involved in designing, implementing, and using PACS-based multimedia imaging informatics.


Page count: 1185




Table of Contents

Cover

Foreword 1

PACS–Based Multimedia Imaging Informatics 3rd Edition, 2018

Foreword 2

A Thought on PACS and CAD

Foreword 3

Preface to the Third Edition

The Beginning

My Experience

The Third Edition

The Future Growth

Preface to the Second Edition

My Interest in PACS and Medical Imaging Informatics

PACS and Imaging Informatics Development since the 2004 Book

PACS and Imaging Informatics, The Second Edition

Acknowledgments

H.K. Huang Short Biography

List of Acronyms

Part 1: The Beginning: Retrospective

1 Medical Imaging, PACS and Imaging Informatics: Retrospective

PART I TECHNOLOGY DEVELOPMENT AND PIONEERS

1.1 Medical Imaging

1.2 PACS and its Development

1.3 Key Technologies: Computer and Software, Storage, and Communication Networks

1.4 Key Technologies: Medical Imaging Related

PART II COLLABORATIONS AND SUPPORTS

1.5 Collaboration with Government Agencies, Industry and Medical Imaging Associations

1.6 Medical Imaging Informatics

1.7 Summary

1.8 Acknowledgments

References

Part 2: Medical Imaging, Industrial Guidelines, Standards, and Compliance

2 Digital Medical Imaging

2.1 Digital Medical Imaging Fundamentals

2.2 Two‐Dimensional Medical Imaging

2.3 Three‐Dimensional Medical Imaging

2.4 Four‐Dimensional, Multimodality, and Fusion Imaging

2.5 Image Compression

Further Reading

3 PACS Fundamentals

3.1 PACS Components and Network

3.2 PACS Infrastructure Design Concept

3.3 Generic PACS‐Based Multimedia Architecture and Workflow

3.4 PACS‐Based Architectures

3.5 Communication and Networks

Further Reading

4 Industrial Standards: Health Level 7 (HL7), Digital Imaging and Communications in Medicine (DICOM) and Integrating the Healthcare Enterprise (IHE)

4.1 Industrial Standards

4.2 The Health Level 7 (HL7) Standard

4.3 From ACR‐NEMA to DICOM

4.4 DICOM 3.0 Standard

4.5 Examples of using DICOM

4.6 DICOM Organizational Structure and New Features

4.7 IHE (Integrating the Healthcare Enterprise)

4.8 Some Operating Systems and Programming Languages useful to HL7, DICOM and IHE

4.9 Summary of Industrial Standards: HL7, DICOM and IHE

References

Further Reading

5 DICOM‐Compliant Image Acquisition Gateway and Integration of HIS, RIS, PACS and ePR

5.1 DICOM Acquisition Gateway

5.2 DICOM‐Compliant Image Acquisition Gateway

5.3 Automatic Image Data Recovery Scheme for DICOM Conformance Device

5.4 Interface PACS Modalities with the Gateway Computer

5.5 DICOM Compliance PACS Broker

5.6 Image Preprocessing and Display

5.7 Clinical Operation and Reliability of the Gateway

5.8 Hospital Information System (HIS), Radiology Information System (RIS), and PACS

References

6 Web‐Based Data Management and Image Distribution

6.1 Distributed Image File Server: PACS‐Based Data Management

6.2 Distributed Image File Server

6.3 Web Server

6.4 Component‐based Web Server for Image Distribution and Display

6.5 Performance Evaluation

6.6 Summary of PACS Data Management and Web‐based Image Distribution

Further Reading

7 Medical Image Sharing for Collaborative Healthcare Based on IHE XDS‐I Profile

7.1 Introduction

7.2 Brief Description of IHE XDS/XDS‐I Profiles

7.3 Pilot Studies of Medical Image Sharing and Exchanging for a Variety of Healthcare Services

7.4 Results

7.5 Discussion

Acknowledgements

References

Part 3: Informatics, Data Grid, Workstation, Radiotherapy, Simulators, Molecular Imaging, Archive Server, and Cloud Computing

8 Data Grid for PACS and Medical Imaging Informatics

8.1 Distributed Computing

8.2 Grid Computing

8.3 Data Grid [5]

8.4 Fault‐Tolerant Data Grid for PACS Archive and Backup, Query/Retrieval, and Disaster Recovery

References

Further Reading

9 Data Grid for Clinical Applications

9.1 Clinical Trials and Data Grid

9.2 Dedicated Breast MRI Enterprise Data Grid

9.3 Administrating the Data Grid

9.4 Summary

References

Further Reading

10 Display Workstations

10.1 PACS‐Based Display Workstation

10.2 Various Types of Image Workstation

10.3 Image Display and Measurement Functions

10.4 Workstation Graphic User Interface (GUI) and Basic Display Functions [3–15]

10.5 DICOM PC‐Based Display Workstation Software

10.6 Post‐Processing Workflow, PACS‐Based Multidimensional Display, and Specialized Post‐Processing Workstation

10.7 DICOM‐Based Workstations in Progress

References

11 Multimedia Electronic Patient Record (EPR) System in Radiotherapy (RT)

11.1 Multimodality 2‐D and 3‐D Imaging in Radiotherapy

11.2 Multimedia ePR System in Radiation Treatment

11.3 Radiotherapy Planning and Treatment

11.4 Radiotherapy Workflow

11.5 The ePR Data Model and DICOM‐RT Objects

11.6 Infrastructure, Workflow and Components of the Multimedia ePR in RT

11.7 Database Schema

11.8 Graphical User Interface Design

11.9 Validation of the Concept of Multimedia ePR System in RT

11.10 Advantages of the Multimedia ePR system in Radiotherapy for Daily Clinical Practice

11.11 Use of the Multimedia ePR System in RT For Image‐Assisted Knowledge Discovery and Decision Making

11.12 Summary

Acknowledgement

References

12 PACS‐Based Imaging Informatics Simulators

12.1 Why Imaging Informatics Simulators?

12.2 PACS–ePR Simulator

12.3 Data Grid Simulator

12.4 CAD–PACS Simulator

12.5 Radiotherapy (RT) ePR Simulator

12.6 Image‐Assisted Surgery (IAS) ePR Simulator

12.7 Summary

Acknowledgments

References

13 Molecular Imaging Data Grid (MIDG)

13.1 Introduction

13.2 Molecular Imaging

13.3 Methodology

13.4 Results

13.5 Discussion

13.6 Summary

Acknowledgments

References

14 A DICOM‐Based Second‐Generation Molecular Imaging Data Grid (MIDG) with the IHE XDS‐i Integration Profile

14.1 Introduction

14.2 Methodology

14.3 System Implementation

14.4 Data Collection and Normalization

14.5 System Performance

14.6 Data Transmission, MIDG Implementation, Workflow and System Potential

14.7 Summary

Acknowledgments

References

15 PACS‐Based Archive Server and Cloud Computing

15.1 PACS‐Based Multimedia Biomedical Imaging Informatics

15.2 PACS‐Based Server and Archive

15.3 PACS‐Based Archive Server System Operations

15.4 DICOM‐Compliant PACS‐Based Archive Server

15.5 DICOM PACS‐Based Archive Server Hardware and Software

15.6 Backup Archive Server and Data Grid

15.7 Cloud Computing and Archive Server

Acknowledgements

References

Part 4: Multimedia Imaging Informatics, Computer-Aided Diagnosis (CAD), Image-Guided Decision Support, Proton Therapy, Minimally Invasive Multimedia Image-Assisted Surgery, Big Data

16 DICOM‐Based Medical Imaging Informatics and CAD

16.1 Computer‐Aided Diagnosis (CAD)

16.2 Integration of CAD with PACS‐Based Multimedia Informatics

16.3 The CAD–PACS Integration Toolkit

16.4 Data Flow of the three CAD–PACS Editions Integration Toolkit

References

Further Reading

17 DICOM‐Based CAD: Acute Intracranial Hemorrhage and Multiple Sclerosis

17.1 Computer‐Aided Detection (CAD) of Small Acute Intracranial Hemorrhage on CT of the Brain

17.2 Development of the CAD Algorithm for AIH on CT

17.3 CAD–PACS Integration

17.4 Multiple Sclerosis (MS) on MRI

References

Further Reading

18 PACS‐Based CAD: Digital Hand Atlas and Bone Age Assessment of Children

18.1 Average Bone Age of a Child

18.2 Bone Age Assessment of Children

18.3 Method of Analysis

18.4 Integration of CAD with PACS‐Based Multimedia Informatics for Bone Age Assessment of Children: The CAD System

18.5 Validation of the CAD and the Comparison of CAD Result with Radiologists’ Assessment

18.6 Clinical Evaluation of the CAD System for Bone Age Assessment (BAA)

18.7 Integrating CAD for Bone Age Assessment with Other Informatics Systems

18.8 Research and Development Trends in CAD–PACS Integration

Acknowledgements

References

Further Reading

19 Intelligent ePR System for Evidence‐Based Research in Radiotherapy

19.1 Introduction

19.2 Proton Therapy Clinical Workflow and Data

19.3 Proton Therapy ePR System

19.4 System Implementation

19.5 Results

19.6 Conclusion and Discussion

Acknowledgements

References

20 Multimedia Electronic Patient Record System for Minimally Invasive Image‐Assisted Spinal Surgery

20.1 Integration of Medical Diagnosis with Image‐Assisted Surgery Treatment

20.2 Minimally Invasive Spinal Surgery Workflow

20.3 Multimedia ePR System for Image‐Assisted MISS Workflow and Data Model

20.4 ePR MISS System Architecture

20.5 Pre‐Op Authoring Module

20.6 Intra‐Op Module

20.7 Post‐Op Module

20.8 System Deployment, User Training and Support

20.9 Summary

References

21 From Minimally Invasive Spinal Surgery to Integrated Image‐Assisted Surgery in Translational Medicine

21.1 Introduction

21.2 Integrated Image‐Assisted Minimally Invasive Spinal Surgery

21.3 IIA‐MISS EMR System Evaluation

21.4 To Fulfill some Translational Medicine Aims

21.5 Summary

21.6 Contribution from Colleagues

Acknowledgement

References

22 Big Data in PACS‐Based Multimedia Medical Imaging Informatics

22.1 Big Data in PACS‐Based Multimedia Medical Imaging Informatics

22.2 Characters and Challenges of Medical Image Big Data

22.3 Possible and Potential Solutions of Big Data in DICOM PACS‐Based Medical Imaging and Informatics

22.4 Research Projects Related to Medical Imaging Big Data

22.5 Summary of Big Data

Acknowledgements

References

Index

End User License Agreement

List of Tables

Chapter 01

Table 1.1 Equipment at IPL, UCLA, 1987.

Chapter 02

Table 2.1 Sizes of some common 2‐D, 3‐D, 4‐D, and fusion medical images.

Chapter 03

Table 3.1 Major functions of the PACS server and archive.

Table 3.2 Major functions of PACS workstations.

Table 3.3 Seven‐layer open systems interconnect (OSI) protocols.

Table 3.4 Current available wireless LAN technology.

Table 3.5 Transmission rate of current WAN technology.

Table 3.6 Performance data for different sites connected with international Internet 2.

Table 3.7 Comparison of performance data between sites that utilized the Web 100 tuning protocol (last column) and the same sites using standard Linux without tuning.

Table 3.8 Performance data for IPI connectivity to both PolyU and InCor using a commercial high‐speed ISP vendor.

Chapter 04

Table 4.1 DICOM service classes.

Table 4.2 DICOM information object classes.

Table 4.3 Normalized DICOM message service element (DIMSE).

Table 4.4 Composite DICOM message service element (DIMSE).

Chapter 05

Table 5.1 Information transferred among HIS, RIS, and PACS triggered by the PACS server.

Chapter 07

Table 7.1 Time intervals for displaying the first image in a series after issuing an ITI‐43 request in the online sharing model.

Table 7.2 Time intervals for displaying the first image in a series after issuing an ITI‐43 request in the near‐line sharing model.

Table 7.3 Comparisons of three pilot studies in work flows, PACS interfacing, implementation of IHE XDS‐I actors, and transactions (S = Standard Implementation, C = Customized Implementation).

Chapter 10

Table 10.1 Advantages and disadvantages of the LCD over the CRT.

Table 10.2 Important software functions in a display WS.

Table 10.3 Description of the user interface icons and toolbars shown in Figure 10.17.

Chapter 13

Table 13.1 Preclinical Molecular Imaging File Formats Collected for Evaluation from USC Molecular Imaging Center.

Table 13.2 Performance tests measuring the time it takes to archive and retrieve a study dataset from six different preclinical molecular imaging modality types over a 100 Mbps network.

Chapter 14

Table 14.1 (A) Molecular imaging datasets systematically collected for evaluation from the USC MIC. (SOP: Service‐Object Pair; SC: Secondary Capture).

Table 14.1 (B) List of DICOM tags that are labeled by the MIDG during the upload process.

Table 14.2 Comparison of upload and download performance of the first‐generation MIDG and second‐generation MIDG, using datasets from four preclinical imaging modalities.

Chapter 16

Table 16.1 (A) Topics and numbers of CAD‐related presentations at the RSNA meetings, 2000–2008.

Table 16.1 (B) Number of CAD presentations at the RSNA meetings, 2003–2008.

Table 16.2 CAD without PACS and with or without digital input.

Table 16.3 (A) CAD with DICOM PACS: PACS WS Q/R, CAD WS, and detect.

Table 16.3 (B) CAD with DICOM PACS: CAD WS Q/R and detect.

Table 16.3 (C) CAD with PACS: PACS WS with diagnosis software.

Table 16.3 (D) Integration of CAD server with DICOM PACS and/or MIII.

Table 16.4 Comparison of the three CAD–PACS integration editions (SR: Structured Reporting).

Chapter 17

Table 17.1 Details of individual image processing and analysis steps in the CAD for AIH algorithm, as outlined in Figure 17.1.

Table 17.2 Sample rules used in the knowledge base classification in the CAD system for AIH on CT brain.

Table 17.3 CAD results based on training data and validation data.

Table 17.4 Average performance indicators, including sensitivity, specificity, positive predictive value, and negative predictive value, for different clinician groups with and without CAD support.

Table 17.5 Cases in which clinicians changed their diagnostic decisions after CAD.

Table 17.6 Tasks and requirements for integrating CAD with PACS systems in the three examples.

Chapter 18

Table 18.1 Images and data contained in the Digital Hand Atlas.

Table 18.2 Clinical reliability of using the three ROIs for BAA in different age groups.

Table 18.3 Mean difference between bone age and chronologic age according to race and sex as assessed by radiologists.

Chapter 19

Table 19.1 Proton therapy data collected from one prostate cancer patient.

Table 19.2 Summary of main pretreatment clinical data (n = 39 patients).

Table 19.3 Summary of follow‐up clinical data (n = 39 patients).

Table 19.4 Patient New's information and treatment plan in comparison to search criteria.

Table 19.5 Comparison of volume percentage of the tumor target and critical structures between previous plan and modified plan of Patient New.

Chapter 20

Table 20.1 Default values for the safe ranges of the patient’s vital signs during MISS operation used as indicators to set off alarms on the real‐time intra‐op display.

Chapter 21

Table 21.1 Comparison of minimally invasive spinal surgery (MISS) operation times between procedures with and without the EMR system, with different numbers of vertebrae per procedure.

Table 21.2 Comparison between the IIAS platform basic common components (columns 3 and 4) and the customized components in column 2 (SW) and column 4 (C) for a specific surgical application.

List of Illustrations

Chapter 01

Figure 1.1 Robert Steven Ledley (June 28, 1926–July 24, 2012, aged 86).

Figure 1.2 Innovative medical imaging components in the Pattern Recognition Laboratory in the mid‐1970s: (A) FIDAC (Film input to Digital Automatic Computer); (B) DRIDAC (Drum Input to Digital Automatic Computer); (C) SPIDAC (Specimen Input to Digital Automatic Computer).

Figure 1.3 Pattern Recognition Laboratory, National Biomedical Research Foundation.

Figure 1.4 The ACTA, the first whole‐body CT scanner (two slices per scan in 4½ min), with Professor Ledley.

Figure 1.5 Top: PACS system and components. Bottom: Imaging informatics platform.

Figure 1.6 (A) Professor Moses Greenfield (March 8, 1916–July 26, 2012, aged 96); (B) Professor Hooshang Kangarloo (Dec 24, 1944–May 15, 2012, aged 67) and his faculty and residents in the pediatric radiology reading room using the PACS three 1K × 1K monitor viewing workstation.

Figure 1.7 Research and clinical components at IPL connected by the Ethernet communication system. This historical drawing shows the Image Processing Laboratory (IPL), Department of Radiological Sciences at UCLA; the IPL was established in 1982 and developed the first PACS for clinical operation between 1987 and 1991. Bottom left: the IPL. Right: clinical laboratory. Top right: the PACS workstations at three experimental laboratories in the hospital. This equipment was the beta version of the PACS workstations later installed in the clinical sites. The ID at the bottom of each block is the room number in the Department and the UCLA Hospital.

Figure 1.8 The VAX/11 750 computer. Left: the Gould DeAnza multiple display controller (middle, blue).

Figure 1.9 Large‐capacity Optical Disk Jukebox by Kodak and the RAID disks running the AMASS software (right).

Figure 1.10 The first Konica laser film scanner at UCLA, 1984.

Figure 1.11 The first Fuji computed radiography (CR) system in the United States was installed at the Ochsner Clinics, New Orleans; the second system was installed at UCLA in late 1985.

Figure 1.12 The first digital interface unit using a ping‐pong buffer and the DR11‐W interface technology to transmit CR images in real time to the outside of the CR reader. It was designed and implemented by the UCLA PACS team. The black box is shown in (A). The architecture module was a gateway to the PACS input system in (B). The novelty of the design at that time was the use of the ping‐pong buffers to allow the continuous Fuji image data transfer to both the laser film recorder as well as the direct digital capture mechanism of the VAX 11/750 computer.

Figure 1.13 The prototype Konica digital radiography system at the UCLA PACS clinical laboratory. There were three imaging plates in the housing, allowing three consecutive exposures in one examination.

Figure 1.14 (A) The multiple viewing workstations laboratory (WS Lab) at UCLA with multiple‐resolution workstations, including two six‐monitor display systems (512 × 512), one 1400‐line single‐monitor system, and one three‐monitor display system (1K × 1K). This workstation room was used for the first large‐scale study on the quality of image display with different spatial and density resolutions. (B) A 2K‐line two‐monitor display was not yet available at that time; one was later developed by a commercial company, Megascan, and installed in the WS Lab.

Figure 1.15 The NATO ASI [22].

Figure 1.16 Adapted from Professor Irie's concept of the Medical Record System (MIS) at Hokkaido University Hospital [28,29].

Figure 1.17 The Proceedings of the first SPIE Picture Archiving and Communication Systems (PACS) Conference at San Diego, CA, 1982.

Figure 1.18 Automatic chromosome karyotyping innovation by Professor Ledley in the 1970s, probably the earliest concept in systematic medical imaging informatics. Process steps included medical image acquisition, metaphase cell determination, patient data, interactive display, automatic measurement of chromosomes and karyotyping, the determination of normal vs. abnormal chromosomes, that led to the final diagnosis. PACS components included MACDAC (man–machine interface to digital automatic computer). SPIDAC + VIDAC (video memory) + MACDAC (interface) + IBM360/44 were equivalent of today’s Pathology PACS. The new knowledge discovery component showed an example of automatic chromosome analysis—microscopic scanning, detecting two metaphase cells (low resolution), and chromosomes (high), analyzing each chromosome, and karyotyping that could lead to the prenatal diagnosis of birth defects.

Figure 1.19 Some examples of today's (2020s) medical imaging informatics applications in three types of patient treatment: surgery, rehabilitation, and radiation therapy. Upper row: use of medical images from various body parts for the three applications. Lower row: image‐assisted neurosurgery, neuro‐rehabilitation, and radiation therapy for cancer treatment.

Chapter 02

Figure 2.1 (A) Terminology used in medical images: image types, sizes, and number of pixels/image. (N × N): total number of pixels of a 2‐D image; (x,y): coordinates of the pixel in the 2‐D image; f(x,y): gray level value at (x,y), which can be from 8 to 12 bits in gray level or 24 bits in a color image. The total number of bits per image is commonly denoted by (N × N × 12) or (N × N × 24). For 12 bits/pixel, the pixel value is stored in 2 bytes. (B) 3‐D image set: (i) a 3‐D spatial image set with z as the third dimension; (ii) a 3‐D temporal image set with t (time) as the third dimension. (C) 4‐D image set: a 4‐D image set consisting of sequential 3‐D spatial sets with t as the fourth dimension. (D) Fusion images: (i) PET fused with CT: physiology (color) on anatomy (gray level); (ii) MR (color) fused with CT (gray level): enhancement of soft tissue definition on anatomy.
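The sizing convention in the Figure 2.1 caption (N × N pixels, with 12‐bit pixels stored in 2 bytes) can be sketched as a small calculation. The helper below is illustrative only, not from the book:

```python
import math

def image_size_bytes(n, bits_per_pixel):
    """Storage for one N x N image; pixels deeper than 8 bits are
    stored in whole bytes (e.g. 12-bit pixels occupy 2 bytes each)."""
    bytes_per_pixel = math.ceil(bits_per_pixel / 8)
    return n * n * bytes_per_pixel

# A 512 x 512 x 12-bit CT slice: 512 * 512 * 2 = 524288 bytes (0.5 MB)
print(image_size_bytes(512, 12))
```

The same convention scales directly: a 24‐bit color image of the same matrix size takes three bytes per pixel instead of two.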

Figure 2.2 Illustration of spatial and density resolutions and signal‐to‐noise ratio, using an abdominal CT (computed tomography) image (512 × 512 × 12 bits) as an example. (A) Four images with a fixed spatial resolution (512 × 512) but variable density resolutions (12, 8, 6, and 4 bits/pixel, respectively). (B) The original and three images with a fixed density resolution (12 bits/pixel) but variable spatial resolutions (512 × 512, 256 × 256, 128 × 128, and 32 × 32 pixels, respectively). (C) The abdominal CT image (512 × 512 × 12) shown in (A). Random noise was inserted in 1000, 10 000, and 100 000 pixels, respectively. The coordinates of each randomly selected noise pixel within the body region were obtained from a random generator. The new pixel value, between 0.7 and 1.3 times the original value, was determined by a second random generator. Clearly, the quality of the CT image decreases progressively, starting from the original.
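The two‐random‐generator noise‐insertion procedure described in Figure 2.2(C) can be sketched in a few lines. The function name and the plain list‐of‐lists image are hypothetical stand‐ins for the book's actual experiment:

```python
import random

def insert_noise(image, n_pixels, lo=0.7, hi=1.3, seed=None):
    """Return a degraded copy of a 2-D image (list of rows): n_pixels
    randomly chosen pixels are replaced by lo..hi times their original
    value, mimicking the scheme described for Figure 2.2(C)."""
    rng = random.Random(seed)
    rows, cols = len(image), len(image[0])
    noisy = [row[:] for row in image]  # leave the original untouched
    for _ in range(n_pixels):
        # first random generator picks the pixel coordinates
        y, x = rng.randrange(rows), rng.randrange(cols)
        # second random generator picks the new value from the original
        noisy[y][x] = int(image[y][x] * rng.uniform(lo, hi))
    return noisy
```

Running it with n_pixels of 1000, 10 000, and 100 000 on a 512 × 512 image reproduces the progressive quality loss the caption describes.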

Figure 2.3 Generic radiology workflow. Note that steps 4, 8, 9, 10, 12, and 14 (→) are replaced by “PACS‐based” related systems.

Figure 2.4 (A) Data flow of an upright CR system with non‐removable imaging plates. (1) Formation of the latent image on the imaging plates; (2) Imaging plates being scanned by the laser beam; (3) Light photons converted to electronic signals; (4) Electronic signals converted to the digital signals that form a CR image (courtesy of Konica Corporation, Japan). (B) The Fuji CR XG5000 reader (Footprint: 26 × 29 × 58 in.) with a stacker accommodates four image cassettes (left), and image processing workstation and quality assurance monitor (right).

Figure 2.5 (A) A pediatric CR image, with a white background (right arrows) as seen on a video monitor. (B) A better visual quality image after the white background has been automatically removed.

Figure 2.6 (A) Workflow steps in the formation of a digital radiography (DR) image, compared with those of a CR image (shown in Figure 2.4A). (B) An add‐on DR system that utilizes an existing x‐ray unit with the patient bed.

Figure 2.7 A slot‐scanning digital mammography system. The slot, with a 300‐pixel width, covers the x‐direction (arrow) with 4400 pixels; the x‐ray beam sweeps in the y‐direction (arrow), producing over 5500 pixels. Top X: x‐ray and collimator housing; Middle C: breast compressor.

Figure 2.8 A 4 K × 5 K × 12‐bit CC view digital mammogram shown on a 2 K × 2.5 K monitor (left). A localized digital mammogram for needle biopsy verification (right).

Figure 2.9 Schematic of a general gamma camera used in nuclear medicine imaging. PMT: Photomultiplier tube; NaI (TI): Thallium‐activated sodium iodide.

Figure 2.10 Block diagram of a B‐mode ultrasound scanner system. TGC: Time gain compensation; RF: radio frequency; HV: high voltage.

Figure 2.11 Color Doppler ultrasound of blood flow, showing convergent pulmonary vein inflow.

Figure 2.12 Block diagram of a digital microscopic system with a CCD camera connected to the digital chain. The system is mostly used in the pathological environment and digital video endoscopic imaging. A/D: analog to digital. D/A: digital to analog.

Figure 2.13 (A) Schematics of a generic digital endoscope (not to scale) with a CCD camera. A/D: analog to digital. D/A: digital to analog.

Figure 2.13 (B) Endoscopic images of thoracic vertebra 9 and 10 acquired in real time during image‐guided minimally invasive spinal surgery. See also Chapters 20 and 21 for minimally invasive image‐assisted spinal surgery.

Figure 2.14 Two 3‐D image coordinate systems: (A) 3‐D spatial image set with z as the third dimension. Images from 1 to z show the anatomical changes of the cross‐sectional chest CT images; (B) a 3‐D temporal image set with t (time) as the third dimension. Images from 1 to t show the same anatomy as image 1; the difference would be, for example, the flow of the contrast media injected to the patient from time 1 to t.

Figure 2.15 Principle of the Fourier transform (FT) projection theorem for image reconstruction from 180 degrees of 1‐D projections of the input energy. (A) Spatial domain: a CT image to be reconstructed from 1‐D projection energy data 1, 2, …, 180; f(x,y) is the image to be reconstructed. (B) Frequency domain: F(0, 0) is the center of the 2‐D FT; low‐frequency components are located at the center region, and high‐frequency components at the periphery. P(x,θ), left: x‐ray projection at angle θ (green), where x is the distance from left to right of the projection. F(u,θ): 1‐D Fourier transform of P(x,θ); red and green are corresponding projections in the spatial domain and their 1‐D Fourier transforms (1‐D FT) in the frequency domain, respectively. IFT: inverse Fourier transform.
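The projection theorem the caption describes can be written out as a worked equation. This is the standard statement of the Fourier slice theorem, with symbols matched to the caption's P(x,θ), F(u,θ), and f(x,y):

```latex
% Projection at angle \theta: line integral of f(x,y) along the beam direction s
P(x',\theta) = \int_{-\infty}^{\infty}
  f\bigl(x'\cos\theta - s\sin\theta,\; x'\sin\theta + s\cos\theta\bigr)\, ds

% Fourier slice theorem: the 1-D FT of the projection equals the central
% slice of the 2-D FT \hat{f} of the image, taken at the same angle \theta
F(u,\theta) = \int_{-\infty}^{\infty} P(x',\theta)\, e^{-i 2\pi u x'}\, dx'
            = \hat{f}(u\cos\theta,\; u\sin\theta)

% Collecting slices for \theta over [0, 180) degrees fills the 2-D frequency
% plane; the inverse 2-D FT (IFT) then recovers the image f(x,y)
```

Each measured projection thus contributes one radial line of the 2‐D frequency plane, which is why 180 degrees of projections suffice for reconstruction.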

Figure 2.16 Geometry of multi‐slice computed tomography producing 3‐D CT images. The patient axis (parallel to the bed) is in the z‐direction. The x‐ray source (orange), shaped as a collimated cone beam, rotates continuously around the z‐axis by 360 degrees, in sync with the patient's bed movement in the z‐direction. The detector system is a combination of detector arrays on a concave surface (not to scale) perpendicular to the x‐ray beams. The number of slices per 360‐degree rotation is determined by two factors: the number of detector arrays (channels) in the z‐direction and the method used to recombine the cone beam projection data into transverse sectional projections (Figure 2.15). The standard reconstructed images are in the transverse (axial) view perpendicular to the z‐axis; the projection raw data can also be recombined to reconstruct sagittal, coronal, or oblique view images. If the cone beam does not rotate while the patient's bed is moving, the reconstructed image is equivalent to a digital projection image (scout view).

Figure 2.17 The extent of z‐dimensional scanning of the entire human anatomy with variable multi‐slice scans. Detector arrays can range from 4, 16, and 64 up to 256 slices or more in one scan. For example, in the 256‐array detector system, a certain number of rotations of the x‐ray system can capture the entire heart (256 mm), minimizing possible heartbeat artifacts. As another example, a whole‐body CT scanner can scan the body of the patient from head to toe, with the scanner continuing to rotate as the bed moves in the z‐direction (see Figure 2.16 for the z‐direction).

Figure 2.18 Data flow components of an X‐ray CT scanner. The scanning and data collection times, in general, are shorter than the image reconstruction time. A hardware back‐projector unit (green) is used to speed up the reconstruction time. A/D: analog to digital; WS: workstation.

Figure 2.19 (A) 3‐D multi‐slice CT showing a large right parietal hematoma with edema: (upper left) Transverse; (right) coronal. The bottom images are CT angiograms showing the 2‐D sagittal view (left) and the coronal view (right) extracted from the 3‐D CT 0.5‐mm isotropic dataset. (B) 3‐D neurodigital subtraction angiogram: (top two and left bottom) sagittal, coronal, and transverse contrast CT images; (bottom right) 3‐D angiogram obtained by tissue and bone subtraction from the 3‐D CT images. (C) Multi‐planar reconstruction CT images. The 16‐slice helical CT of the head and neck was obtained during a bolus of intravenous contrast administration. Volumetric data were then transferred to a Vitrea 3‐D workstation for post‐processing, including 3‐D volumetric rendering and multiplanar reconstructions (MPR) in the (i) sagittal, (ii) coronal, and (iii) transverse planes. This study was conducted in an adult female with transient ischemic attacks. The patient was found to have significant unilateral internal carotid artery stenosis due to atherosclerotic plaque.

Figure 2.20 (A) 3‐D CT dataset can also be used to produce 3‐D volume rendering images. Left and right show the anterior–posterior (A–P) and posterior–anterior (P–A) views of the thoracic cage, revealing fractures of ribs 7–10 in the P–A view (courtesy of GE Medical Systems). (B) 3‐D CT abdominal dataset obtained by using a 64 multi‐slice scanner showing 3‐D volume rendering of bone, blood vessels, and kidneys, and in particular, an infrarenal abdominal aortic aneurysm with mural thrombus. The scan protocol used was a 135 kVp, 175 mAs, 325‐mm scan range taking 12 seconds (courtesy of Toshiba Medical Systems).

Figure 2.21 Schematic of single photon emission CT (SPECT). Refer to Figure 3.24 for nuclear medicine scanning.

Figure 2.22 Block diagram of a positron emission tomography (PET) system showing two array banks of detectors.

Figure 2.23 (A) Positron emission tomography (PET) study of the brain. 18F‐fluorodeoxyglucose (18F‐FDG) was administered to the patient, and approximately 60 minutes later the patient was positioned on a PET scanner and images (i: transverse, ii: coronal, and iii: sagittal) were obtained from the skull apex to the skull base. Causes of cognitive impairment, such as Alzheimer's disease, are among the indications for brain 18F‐FDG PET (courtesy of Dr P. Moin). (B) Images of transverse, coronal, and sagittal orthogonal planes (right to left), as well as the posterior–anterior projection image (leftmost), of the whole‐body PET image with fluoride ion (18F−).

Figure 2.24 (A) Three scanning modes of 3‐D US scanning: (left) linear translation; (middle) tilt scanning; and (right) rotational scanning. (B) 3‐D ultrasound image of a 25‐week fetal face.

Figure 2.25 Block diagram of a generic magnetic resonance imaging (MRI) system. Dotted line separates the digital domain from the MR signal generation. A/D: analog to digital; D/A: digital to analog; RF: radio frequency.

Figure 2.26 Data flow of forming an MR image. 2‐D: two‐dimensional; FID: free induction decay; FT: Fourier transform.

Figure 2.27 Clinical 3 T MR imaging system in a magnetic shielded room at the Health Science Campus, USC.

Figure 2.28 T1 weighted (A) transverse, (B) thin section coronal, and (C) sagittal images from an MRI performed to evaluate a possible structural cause for this patient’s recent onset of seizures. Thin‐section coronals (B) were obtained (only one is shown) to allow better visualization of the hippocampi, a region of interest in the imaging evaluation of seizures. Directions: A: anterior; P: posterior; R: right; L: left; F: front; B: bottom.

Figure 2.29 Two 3‐D fetal 3 T MR images using ultrafast sequences.

Figure 2.30 Two views of 3‐D dedicated breast MR angiogram (1.5 T).

Figure 2.31 MRA using a 3 T MRI scanner, approaching the image resolution possible with digital subtraction angiography (DSA) (Section 2.3.2.2).

Figure 2.32 Tractographic reconstruction of neural connections via MRI diffusion tensor imaging (DTI).

Figure 2.33 Principles of the 3‐D fluorescence confocal microscope used to generate serial sections of an object of interest from the specimen. x–y scanning mirrors guide a laser beam (red lines) that excites objects tagged with fluorescent dye molecules (yellow) in the sample (size not to scale). The dichromatic mirror only allows excited light (green) to pass. The optical pinhole mechanism accepts in‐focus excited emission light (thick green lines) to be recorded by the detector, and rejects out‐of‐focus emission (thin green line).

Figure 2.34 (A) 3‐D rendering of the skeleton of a rat from a set of 1000 slices scanned by a micro XCT scanner with 50‐micron pixels; 500 MB of image data were generated. The skeletal structure demonstrates the 3‐D rendering of the display (courtesy of ORNL). (B) Two time series—weeks 1, 2, 3, and 5—of molecular images of a mouse with prostate tumor cells in its bone marrow. The three dimensions are projections of the x–y plane over time: (top row) no intervention; the tumor grew rapidly from injection to week 5; (bottom row) with chemotherapy, the tumor seen at week 5 started to respond and shrank continuously after one week.

Figure 2.35 (A) Two 3‐D image coordinate systems: (i) A 3‐D spatial image set with z as the third dimension. Images from 1 to z show the anatomical changes of the cross‐sectional chest; (ii) A 3‐D temporal image set with t (time) as the third dimension. Images from 1 to t show the same anatomy as image 1; the difference would be, for example, the flow of the contrast media injected into the patient from time 1 to t. (B) A 4‐D image set consisting of sequential 3‐D spatial sets, with t as the fourth dimension. (C) (i) PET fuses with CT: physiology on anatomy; (ii) MR fuses with CT: enhancement of soft tissue definition on anatomy.

Figure 2.36 Dual x‐ray energy CT scan in a gantry producing high‐ and low‐energy images that can distinguish between tissue types. The single‐energy scan (left) does not contain enough information to differentiate lipid degeneration, which is shown clearly in the dual‐energy result (right) by the dark red color coding inside the white circle.

Figure 2.37 Fetal images taken with 2‐D, 3‐D, and 4‐D US imaging. 2‐D, 3‐D, and 4‐D US images can be used to observe and monitor the growth of the fetus before the baby is born. (A) Left, 2‐D US image with audio heart beat at the bottom. Two 3‐D US images reveal the breathing movement of the chest wall and the stretching of the legs. (B) Sequences of 4‐D time frames with 3‐D US image frames (not in equal time intervals) showing the movements of the legs, turning of the body, and the face (frames 5, 6, 7), and the movement of the arms and legs.

Figure 2.38 (A) PET‐CT system with 64 slices at the Nuclear Medicine Division, Department of Radiology, USC. The gantry houses a 3‐D CT and a 3‐D PET. (B) Hardware image fusion with a PET‐CT combined scanner. The hardware fusion method minimizes patient position change and movement during image registration before fusion. The mockup shows a PET‐CT scan of a patient 1 hour after injection of 18F‐fluorodeoxyglucose. The patient was positioned in a PET‐CT scanner. Workflow steps: (1a) 64 multi‐slice CT followed by (1b) a PET scan; (2) 3‐D CT images of a full reconstructed coronal section along with (3) brain transverse, coronal, and sagittal sections; (4) 3‐D CT data used to perform attenuation correction of PET data; (5) 3‐D PET image reconstruction; (6) 3‐D PET images obtained showing a coronal section; (7) registered and fused CT and PET images with the corresponding coronal section. The “fused” PET‐CT images allow for increased sensitivity in the detection of neoplastic disease by combining identified abnormal physiologic activity (PET) with precise anatomic localization (CT).

Figure 2.39 PET‐CT fusion images of a coronal view of the whole body from a dual gantry PET‐CT scanner indicating normal distribution of FDG (18F‐fluorodeoxyglucose); (left) CT image; (middle) PET image; (right) fusion image with pseudo‐color lookup table (LUT) PET image (physiology) overlaying CT image (anatomy); FDG accumulation is shown in the cerebral‐cerebellar cortex, myocardium, liver, kidneys, renal pelvis, bone marrow, and urinary bladder.

Figure 2.40 PET‐CT hardware combined fusion coronal sectional images with 18F‐FDG. Of the 25 equally spaced 1.0‐cm sequential images, 00, 04, 08, 12, 16, 20, and 24 are shown. A sagittal section is also displayed (lower right).

Figure 2.41 Original CT body image (upper left), followed clockwise by reconstructed images with compression ratios of 4:1, 8:1, 17:1, 26:1, and 37:1 (the full‐frame method was used).

Figure 2.42 Two‐level 2‐D wavelet decomposition of an MR head sagittal image. (A) original image; (B) first‐level decomposition into four images; (C) second‐level decomposition. In each level, the upper left corner is the smooth image, and the other three quadrants are the detailed (sharper) images. Observe that in the MR image, the detailed images at each level contain visible sharp anatomical information. Image compression compresses each level according to the characteristics of the anatomy and the acceptable compression ratio; the required quality of the compression depends on the acceptable compression ratio discussed in Section 2.5.2.
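The quadrant structure the caption describes can be sketched in code. A minimal illustration, assuming an unnormalized Haar wavelet on a toy 4×4 image (clinical codecs use smoother wavelet filters; a second level would repeat the same step on the smooth quadrant):

```python
# One-level 2-D Haar wavelet decomposition (unnormalized averages/differences).
# Rows are transformed first, then columns, yielding the four quadrants the
# figure shows: LL (smooth, upper left), LH, HL, HH (detail).

def haar_1d(row):
    """Split a sequence into (averages, differences) halves."""
    avg = [(row[2 * i] + row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    dif = [(row[2 * i] - row[2 * i + 1]) / 2 for i in range(len(row) // 2)]
    return avg, dif

def haar_2d(image):
    """One decomposition level; returns the LL, LH, HL, HH quadrants."""
    # Transform every row: left half = smooth, right half = detail.
    rows = [a + d for a, d in (haar_1d(r) for r in image)]
    # Transform every column of the row-transformed image.
    cols_t = [a + d for a, d in (haar_1d(c) for c in zip(*rows))]
    out = [list(r) for r in zip(*cols_t)]     # back to row-major order
    h, w = len(image) // 2, len(image[0]) // 2
    ll = [r[:w] for r in out[:h]]             # smooth (upper-left quadrant)
    lh = [r[w:] for r in out[:h]]
    hl = [r[:w] for r in out[h:]]
    hh = [r[w:] for r in out[h:]]
    return ll, lh, hl, hh

# A flat 4x4 image has all its energy in the smooth quadrant.
ll, lh, hl, hh = haar_2d([[8, 8, 8, 8]] * 4)
print(ll)  # [[8.0, 8.0], [8.0, 8.0]] -- the three detail quadrants are zero
```

Applying `haar_2d` again to the LL quadrant gives the second‐level decomposition shown in (C).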

Figure 2.43 The 3‐D volume dataset after two‐level decomposition using the 3‐D wavelet transform. Each level is decomposed into eight components. f0 (left) is the original 3‐D dataset, the f1s are the eight first‐level decomposition components (blue), and the f2s are the second‐level decomposition components (red). In the middle drawing, the darker blue set is the smooth image set and the other seven blue sets are the sharper image sets. In the rightmost second‐level decomposition (pink), the darker pink set in the upper left is the smooth dataset, and the remaining seven sets are the sharper image sets. f’1 and f’2 are the seven sets of higher resolution/sharper data of each level.

Figure 2.44 Performance comparison using 3‐D versus 2‐D wavelet compression on a 3‐D MR head image set. Note that 3‐D wavelet transform is superior to the 2‐D transform for the same peak signal‐to‐noise ratio (PSNR).
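The PSNR used in this comparison is computed from the mean squared error between the original and reconstructed images. A minimal sketch, assuming 8‐bit pixels flattened into equal‐length lists:

```python
import math

def psnr(original, reconstructed, max_value=255):
    """Peak signal-to-noise ratio (dB) between two equal-length pixel lists."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return math.inf          # identical images: no noise at all
    return 10 * math.log10(max_value ** 2 / mse)

# A reconstruction that is off by one grey level everywhere: MSE = 1.
print(round(psnr([10, 20, 30, 40], [11, 21, 31, 41]), 2))  # 48.13
```

At a fixed compression ratio, the method giving the higher PSNR (here, the 3‐D wavelet) reconstructs the image with less error.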

Figure 2.45 One slice of a 3‐D CT volume data compressed with a compression ratio of 20:1 with 3‐D wavelet, 2‐D wavelet, and cosine transform JPEG compression methods: (A) original image; (B) 3‐D wavelet reconstructed image; (C) 2‐D wavelet reconstructed image; and (D) JPEG reconstructed image. The square covering the vertebra is used to explain Figure 2.46.

Figure 2.46 (A) Image from Figure 2.45B with a square added. (B’), (C’), and (D’): subtracted images in an enlarged square region near the vertebra. See also Figure 2.45B. (B’) Original – 20:1 3‐D wavelet; (C’) Original – 20:1 2‐D wavelet; and (D’) Original – 20:1 JPEG.

Chapter 03

Figure 3.1 Yellow and green boxes: picture archiving and communications system (PACS) basic components. Blue boxes: External information systems including hospital and radiology information systems (HIS/RIS), web servers, and application servers. Data flow: blue lines for internal PACS components connection; green and red: PACS connection to related external information components. Application servers and web servers: for enriching PACS infrastructure to other clinical, research, and educational applications.

Figure 3.2 A generic PACS‐based workflow. Compare the PACS‐based workflow with the PACS‐based components and workflow shown in Figure 3.1, and the classical radiology workflow depicted in Chapter 2, Figure 2.3. QC WS: quality control workstation; RIS: radiology information systems; WSs: workstations.

Figure 3.3 Stand‐alone PACS‐based model and general data flow. The data flow starts when RIS notifies imaging modality and the PACS server that a patient has registered (1). Images are sent from the modality to the PACS server (2), PACS server archives the images (3) and sends them to WSs automatically (single‐headed orange arrows, 4) along with other prefetched images (single‐headed orange arrows, 5); images can also be queried/retrieved by the WSs (double‐headed orange arrows, 6). All WSs have local storage (7). Diagnostic reports are sent back to the PACS server or directly to RIS (purple arrow, 8).

Figure 3.4 Client‐server PACS‐based model and general data flow. The first three data flow steps are the same as those of the stand‐alone model. Data flow starts when RIS notifies the imaging modality and the PACS server that a patient has registered (1). Images are sent from the modality to the PACS server (2), and the PACS server archives the images (3). The client WSs have access to the complete current worklist, as well as the list of the same patient’s historical exams. Current images as well as historical images can be retrieved from the worklist for review and viewing (4, 5, double‐headed orange arrows). All reviewed images are discarded from the WS after review (6). Diagnostic reports are sent back to the PACS server or directly to RIS (purple arrow, 7), the same as in the stand‐alone model (Figure 3.3).

Figure 3.5 Basic teleradiology model. The management center monitors the operation to direct workflow between imaging centers and expert centers. RIS: radiology information system; WS: workstation.

Figure 3.6 The PACS and teleradiology combined model. The top rectangle is the PACS components and workflow (detail from Figure 3.1), and the bottom rectangle is the teleradiology model, modified from Figure 3.5. Red lines show communication between PACS and teleradiology. HIS: hospital information system; RIS: radiology information system; WSs: workstations.

Figure 3.7 Enterprise PACS and ePR system with images. The enterprise data center supports all sites in the enterprise. The primary data center has a secondary data center for backup to avoid a single point of failure (SPOF). The enterprise ePR system is accessible from any ePR Web clients, and allows image distribution of the patient’s electronic record within the enterprise.

Figure 3.8 Correspondence between the seven‐layer open systems interconnect (OSI, yellow) and the four‐layer Department of Defense (DOD, blue) communication protocols. TCP/IP, light purple in DOD is the most popular in medical imaging and PACS applications. FTP: file transfer protocol.

Figure 3.9 Example of data block transmission from one network node to another node with the DOD TCP/IP. The data block is divided into segments. The figure illustrates how a segment of data (yellow) is encapsulated with the application header (blue), the TCP header (purple), IP header (pink), and the packet header (green) and the packet trailer (green). All these headers and the trailer (color blocks) are the data transmission overheads.
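The overhead illustrated here can be quantified. A back‐of‐envelope sketch, assuming minimum 20‐byte TCP and IP headers, a 14‐byte Ethernet packet header with a 4‐byte trailer, and a hypothetical 8‐byte application header (the figure itself does not give sizes):

```python
# Payload efficiency of one encapsulated segment, per the layering in
# Figure 3.9. Header sizes are typical minimums; the 8-byte application
# header is an assumed placeholder, not a value from the text.
APP_HDR, TCP_HDR, IP_HDR = 8, 20, 20
FRAME_HDR, FRAME_TRL = 14, 4          # packet header / packet trailer

def efficiency(segment_bytes):
    """Fraction of transmitted bytes that is actual image data."""
    overhead = APP_HDR + TCP_HDR + IP_HDR + FRAME_HDR + FRAME_TRL
    return segment_bytes / (segment_bytes + overhead)

# Larger segments amortize the fixed per-packet overhead.
print(f"{efficiency(512):.3f}")    # 0.886
print(f"{efficiency(1400):.3f}")   # 0.955
```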

Figure 3.10 An example of a scheme combining the Gbit Ethernet switch with the asynchronous transfer mode (ATM) optical carrier (OC) ‐12 for PACS‐based multimedia application. Blue: Gbit/s; light green: 100 Mbits/s; pink: 10 Mbits/s; purple: <10 Mbits/s; left: dark green: ATM OC‐12 622 Mbits/s; light green: OC‐3 155 Mbits/s; bottom left: The Cloud (Chapter 15).
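The practical difference between the link tiers in this scheme is image delivery time. An idealized sketch, assuming raw bit rates only (no protocol overhead or contention) and a hypothetical 20 MB CT series:

```python
# Time to move a hypothetical 20 MB CT series over the link tiers of
# Figure 3.10 (idealized: raw bit rate only, no overhead or contention).
SERIES_BYTES = 20 * 10**6

def transfer_seconds(mbps):
    """Idealized transfer time over a link of the given raw bit rate (Mbit/s)."""
    return SERIES_BYTES * 8 / (mbps * 10**6)

for name, mbps in [("Gbit Ethernet", 1000), ("ATM OC-12", 622),
                   ("Fast Ethernet", 100), ("Ethernet", 10)]:
    print(f"{name:14s} {transfer_seconds(mbps):6.2f} s")
# Gbit Ethernet delivers in 0.16 s; 10 Mbit/s Ethernet needs 16.00 s.
```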

Figure 3.11 Major international interconnection points of Internet2 at the Abilene backbone. Most early international connections to Internet2 were from a hub near Chicago. OC‐48 (2.5 Gbits/s), OC‐192 (10 Gbits/s).

Figure 3.12 Topology of international Internet 2 connectivity between three test sites on three continents linking IPI/USC, Los Angeles, North America; InCor, Sao Paulo, Brazil, South America; and Polytechnic University (PolyU), Hong Kong. Flagtel (DS3) was the ISP vendor connecting IPILab to Hong Kong with points of presence (PoPs) in Tokyo and Hong Kong. The routing path from InCor to PolyU used AMPATH, and the path between IPI and InCor used CLARA. Both AMPATH and CLARA are ISP providers.

Chapter 04

Figure 4.1 DICOM and HL7 standards in the PACS workflow, in which all images and pertinent data must first be converted to DICOM and HL7, respectively. PACS basic components (yellow) and data flow (blue: internal; green and red: external between PACS and other information systems); other information systems (light blue). HIS: hospital information system; RIS: radiology information system.

Figure 4.2 (A) Architecture of the DICOM data communication model and DICOM parts (Sections 4.3.2 and 4.3.3). There are two communication models: the network layers model (left) and the media storage interchange model (right). Both models share the upper‐level data structure described in DICOM parts 3–6. In part 7, message exchange is used for communication only, whereas in part 10, file format is used for media exchange. Below the upper levels, the two models are completely different. (B) The simplified DICOM data model of the real world with four levels: patient, study, series, and images. This simplified model can be extended to more complicated data models for various applications, including ePR, radiation therapy, and surgery.

Figure 4.3 “0008,0000” in the “element tag and value” column is the tag for the 0008 Group. “726” is the value for the Group length, meaning there are 726 bytes in this group. The corresponding binary coding of this tag and value is shown on the same line in the “binary coding” column. The next few lines are the tags and values, with the corresponding coding, for “specific character set,” “SOP class UID,” “modality,” and “study description.” The image pixel data is not in the 0008 Group; its tag is “7FE0 0010,” and following the tag is the coding for the pixel data. The element tag and value “0008 …” becomes “08 00 …” in binary coding because of the little‐endian “byte swapping”.
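The byte swapping in the caption is simply little‐endian encoding: each 16‐bit half of the tag, and the 32‐bit value, are stored least significant byte first. A minimal sketch using Python’s struct module:

```python
import struct

# Encode the group-length element tag (0008,0000) and its value 726 in
# little-endian byte order, as DICOM little-endian transfer syntaxes do.
tag = struct.pack("<HH", 0x0008, 0x0000)   # two 16-bit halves of the tag
value = struct.pack("<I", 726)             # 32-bit group length in bytes

print(tag.hex(" "))    # 08 00 00 00  -- "0008" becomes "08 00" on disk
print(value.hex(" "))  # d6 02 00 00  -- 726 = 0x02D6, bytes swapped
```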

Figure 4.4 Data flow as a set of CT images is sent from the scanner (left) to the WS (right). Within a device the data flow is called a service; between devices it is called a protocol. DIMSE: DICOM message service elements; IP: Internet protocol; TCP: transmission control protocol.

Figure 4.5 DICOM send and receive operations. The example shows the steps involved in sending a set of CT images from the scanner to the acquisition gateway (see Figure 4.1 for the PACS components involved).

Figure 4.6 DICOM query and retrieve (Q/R) operation. The example shows the steps involved when a WS queries and retrieves a set of images from the server (see Figure 4.1 for the PACS components involved).

Figure 4.7 (A) Demonstration scene during an IHE Integration Workshop in 2006. Over 100 vendors were involved in this worldwide connectathon, and during that year tested 5 technical frameworks with 37 integration profiles at major conferences worldwide, where 15 active national chapters from 4 continents participated. (B) IHE Europe 2016 Connectathon in Bochum, http://connectathon.ihe‐europe.net/.

Figure 4.8 IHE framework Infrastructure with nine domains. The IHE IT Infrastructure domain in the center is the major support of the framework (modified by the courtesy of IHE, 2010; Sections 4.7.2 to 4.7.4 are based on existing data from IHE, 2010).

Figure 4.9 Sample “IHE use case”. The four actors and their respective roles are shown.

Figure 4.10 IHE scheduled workflow profile including three systems—HIS, RIS, and PACS—and two acquisition devices—conventional films with a digitizer and a CR.

Figure 4.11 IHE post‐processing workflow profile (PWF) for a radiology imaging study. This profile is often used in the integration of images from a third‐party WS (Chapter 10) or in the integration of CAD with PACS (Chapters 17 and 18) (source: IHE).

Figure 4.12 IHE key image note (KIN) workflow profile for a radiology imaging study. This profile is often used in PACS‐based CAD (see Chapters 16 to 18).

Chapter 05

Figure 5.1 PACS basic components (yellow) and data flow (blue: internal; green and red: external between PACS and other information systems); other information systems (light blue); the color of the gateway (red). HIS: hospital information system; RIS: radiology information system.

Figure 5.2 Schematic of the DICOM‐compliant PACS image acquisition gateway with the DICOM C‐STORE service–object pairs (SOP) connecting the imaging device service class user (SCU) with the gateway service class provider (SCP).

Figure 5.3 Acquisition gateway components and their workflow. Four elementary components are storage service class provider (SCP), storage service class user (SCU), local image storage, and database management system (DBMS). The three error‐handling and image recovery software components (shaded) are query and retrieve (Q/R) SCU, integrity check, and acquisition delete.

Figure 5.4 Gateway computer database management hierarchies for patient, study, series, and image tables. (*) denotes primary key and (#), foreign key.

Figure 5.5 The general processing flow diagram of the automatic DICOM query and retrieve image recovery scheme. The scheme starts with the acquisition computer issuing a C‐FIND command (purple, upper left). The recovery starts with a C‐MOVE command (lower left, green). Both commands are received by the imaging device’s query/retrieve computer.

Figure 5.6 Connection of the US PACS module with several US scanners (left, yellow) to the hospital integrated PACS (HI‐PACS) gateway (purple). Two gateways are used: the US PACS gateway (purple) and the HI‐PACS gateway (blue).

Figure 5.7 Functions of the DICOM‐based PACS broker.

Figure 5.8 Manual switch (inside right) to replace the failed acquisition gateway (yellow). Automatic replacement (outside left, green) is preferred.

Figure 5.9 The failsafe tandem gateway system, with the left as the primary gateway. The ePR server hardware on the right has multiple processors; one processor is dedicated as the backup gateway. Two identical gateway software programs run simultaneously, one on the gateway hardware and one on a processor in the ePR server hardware (see Chapter 20, Figure 20.16 for more details).

Figure 5.10 DICOM (Chapter 4) PACS basic components (yellow) and data flow (blue lines: internal; green and red: external between PACS and other information systems); other information systems (light blue). The four components integrated together in Chapters 4 and 5 (HIS/RIS, database gateway, PACS server and archive, and ePR) are shown in orange.

Figure 5.11 A typical HIS system with two categories of software packages: business (top), and administration and clinical operation (bottom). Rectangles are major components in each category. The software package STOR (bottom right) is a trade name (other HIS may use different names) that provides a path for the HIS to distribute HL7‐formatted data to the outside world, including PACS.

Figure 5.12 Database‐to‐database transfer using common data format (HL7) and communication protocol (TCP/IP). Data from HIS is accumulated periodically at STOR (see bottom, Figure 5.11) and broadcast to RIS.
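HL7 v2, the common data format named here, is delimited text: segments separated by carriage returns and fields by “|”. A minimal parsing sketch on a hypothetical, simplified ADT message (the field contents are illustrative only, not from the text):

```python
# Parse an HL7 v2 message into {segment_id: [fields]} form.
# The message below is a hypothetical, simplified ADT registration example.
message = "\r".join([
    "MSH|^~\\&|HIS|HOSP|RIS|RAD|202401010830||ADT^A04|00001|P|2.3",
    "PID|1||123456||DOE^JANE||19700101|F",
    "PV1|1|O",
])

def parse_hl7(msg):
    """Split a message into segments (CR-delimited) and fields (|-delimited)."""
    segments = {}
    for line in msg.split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

seg = parse_hl7(message)
print(seg["PID"][5])   # DOE^JANE  -- the patient name field (PID-5)
```

A real interface engine would also honor the encoding characters declared in MSH‐2 and repeat/split components, but the segment/field structure above is the core of the transfer format.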

Figure 5.13 The principle of the interface engine. (Left) HL7 textual data; (right) DICOM image data; (bottom) Web‐based electronic patient record (ePR) system (Chapter 11), showing image and textual data, or messages and images. Message standards depicted are: LOINC: logical observation identifier names and codes; NDC: national drug codes; UMDNS: universal medical device nomenclature system; IUPAC: International Union of Pure and Applied Chemistry; HOI: health outcomes institute; UMLS: unified medical language system; SNOMED: systemized nomenclature of medicine; ICD (ICD‐9‐CM): the International Classification of Diseases, 9th edition, Clinical Modification.

Figure 5.14 Information transfer between the HIS, RIS, and PACS. Numerals represent steps of information transfer explained in Table 5.1.

Figure 5.15 RIS–PACS interface architecture implemented with a database‐to‐database transfer using a trigger mechanism.

Figure 5.16 RIS–PACS interface with a query protocol. Start at green from the upper left (blue): the PACS queries the RIS (top); the request is sent through TCP/IP to the RIS (right, yellow); the RIS executes the query and forwards the results, which are packed and received by the PACS (bottom, blue).

Figure 5.17 The IHE (Integrating the Healthcare Enterprise) patient reconciliation workflow profile used as an example of the HIS–RIS–PACS interface. Step 1: yellow boxes and green arrows: trauma patient examination without the patient’s ID. Step 2: blue boxes and orange arrows, starting at step 8: use the IHE workflow profile to reconcile the trauma patient’s ID and demographic data when they become available.

Chapter 06

Figure 6.1 PACS data management and Web‐based image distribution. PACS data management, patient folder management, Web‐based server, and wireless remote control of clinical image workflows are related to the four pink color boxes in the PACS components and workflow with Web servers.

Figure 6.2 Several distributed image file servers connected to the PACS server. Each of these servers provides specific applications for a given cluster of users. For example, the physician desktop server (WS, purple) is used as an illustration. The concept of the Web server is described in Figure 6.3.

Figure 6.3 Basic architecture of a Web server allowing Web browsers to query and retrieve image/data from PACS through the Web‐based server. The DICOM/HTTP interpreter is the key component.

Figure 6.4 Typical query/retrieval session from the Web browser through the Web server, requesting images and related data from the PACS server. The session requires eight steps involving both Web (yellow) and PACS (blue) technologies. The resources required in the Web server (Figure 6.3) for such a session are detailed in the Web broker (middle, orange).

Figure 6.5 Architecture of the component‐based Web server for image/data distribution and display (Zhang et al., 2003). IIS: Microsoft Internet information server; ASP: active server pages. The concept is to move the image processing and manipulation objects to the browser during viewing. Blue: DICOM technology; yellow: Web technology.

Figure 6.6 Data flow of the query/retrieve operation in the component‐based Web server, which preserves the 12 bits/pixel DICOM image in the browser. The DICOM images can be displayed at their full resolution by the display and processing component in the browser. Numerals are the workflow steps. Blue: PACS technology; yellow: Web technology; pink: processing component objects (Zhang et al.). (1)–(1)’–(1)”: query; (2)–(2)’–(2)”–(3)”–(3)’–(3): retrieve.

Figure 6.7 Component architecture of a diagnostic WS for displaying and processing DICOM images in a Web‐based server. iViewer, iProcessor, and iController are image processing and display interface software (Zhang et al.). API: application program interface; ODBC: open database connectivity.

Figure 6.8 Workflow of the display and processing (DP) component in a Web‐based server for display and processing of DICOM images (Zhang et al.).

Figure 6.9 (A) Comparison of the averaged speeds of image loading and display, from one to six CR images, between a PACS diagnostic WS and a Web server distributing images to one client. (B) Averaged speeds of distributing different modality images (CT, MRI, CR) from the Web server to one to four clients. All clients requested the Web server at the same time. MB/s: megabytes per second (Zhang et al.).

Chapter 07

Figure 7.1 The diagram of actors (boxes) and transactions (lines) used in the XDS‐I.b profile.

Figure 7.2 The architecture, major components, and workflows of an XDS‐I‐based iEHR for image sharing with federated integration.

Figure 7.3 (A) Protocols and data flows for the online sharing model in an Edge appliance. (B) Protocols and data flows for the near‐line sharing model in an Edge appliance.

Figure 7.4 Performance comparison of the online and near‐line sharing models. The vertical axis shows the time interval (in seconds) for the appearance of the first image of different data series after issuing an ITI‐43 request in the IHE XDS‐I‐based image sharing platform.

Figure 7.5 The architecture, major components, and data flows of the RSNA Image Sharing Network solution.

Figure 7.6 The architecture of the grid‐based XDS‐I image sharing system for regional imaging collaborative diagnosis, integrated with an existing EHR system. There are two service groups (#1, top, and #2, bottom) in this diagram, and radiologists in hospitals B and F (blue) perform the final reporting. The image manager is the central storage archive, which provides image archiving functions to all hospitals in one district.

Figure 7.7 The data models for the XDS Provide and Register Document Set‐b transaction. The models are related to submitting the preliminary and final reporting datasets.

Figure 7.8 The two new communication services in the interface integration between the grid‐based XDS‐I image sharing system and the existing EHR system. (A) The image readiness notification allows information to flow from the XDS registry to an existing EHR system. (B) Different mechanisms are shown for a web‐based EHR portal client to access published images from the grid‐based XDS‐I image sharing system, which is integrated with an existing EHR system.

Figure 7.9 Monthly statistical chart of numbers of imaging studies generated in 18 community hospitals in Xuhui District and the number of studies sent to a remote central hospital for both preliminary and final reporting through the iEHR system from January to December 2014.

Figure 7.10 (A) Major sites in Shanghai involved with the clinical pilot studies since 2011. Three methods of integrating image sharing are described in this chapter: 1) image sharing for cross‐enterprise healthcare with federated integration; 2) XDS‐I‐based patient‐controlled image sharing; and 3) collaborative imaging diagnosis with electronic health record integration in regional healthcare. The second method, XDS‐I, was initiated by and demonstrated at RSNA. Only a few pilot studies of the first and third methods have been implemented in clinical environments; among them are hospitals and HMOs in Shanghai, China, a city of 20 million people.

Figure 7.10 (B) XuHui (lower west) and ZhaBei (upper west), two districts in Shanghai, were selected for the third pilot project: collaborative imaging diagnosis with electronic health record integration in regional healthcare. XuHui District, also known as Xujiahui, is one of the most prestigious living areas and a busy commercial and residential center, at the convergence of five main roads (including Huai Hai Road). The famous Jiao Tong and Donghua universities are located here, as well as various consulate offices; the district is densely populated, with heavy traffic.

Chapter 08

Figure 8.1 Distributed computing is used to perform a PACS CAD task.

Figure 8.2 Five layers of the grid computing technology.

Figure 8.3 Five‐layer grid architecture defined by the Globus Toolkit 4.0: Fabric, connectivity, resource, collective, and application. The left‐hand side depicts its correspondence to the open system interconnection (OSI) seven‐layer Internet protocol (physical, data link, network, transport, session, presentation, and application layer). The right‐hand side describes its functions.

Figure 8.4 The five‐layer Data Grid architecture integrating DICOM services and the Globus Toolkit for PACS and MIII applications [4]. Resources (fabric) layer: bottom: the five leftmost clear boxes are existing resources from PACS; Internet 2 (I2); Rede Nacional de Ensino e Pesquisa (RNP2); storage area network (SAN) is for PACS archive (Chapter 15); the clear box Replica Database is a Globus tool; the rightmost metadata database is for fault‐tolerant Data Grid and computing grid (shadow) application. Core middleware (connectivity layer and resource layer): The four leftmost boxes are Globus tools used for data management in the PACS Data Grid; the rest are other Globus tools. Replica (shadow) and resource management (green shadow) are also used for the computing grid. User level middleware (collective layer): Metadata catalog service and the Globus info services tool are included for fault tolerance. Both resources are also used for computing grid applications (shadow boxes). Data grid application layer: This consists of the DICOM storage, query, and retrieve services. Light shaded boxes with bold red external rectangles are DICOM resources, and the metadata database for fault tolerance. Services in these boxes were developed at the Imaging Processing and Informatics Laboratory (IPILab), USC.

Figure 8.5 Existing Data Grid and its three current applications developed at IPILab, USC.

Figure 8.6 Data Grid platform at IPILab used to customize for PACS and MIII applications. The major components are the data grid (DG) grid access point (GAP) and the DICOM GAP for connection to PACS sites, other MIII servers, and the Data Grid simulator. The Data Grid simulator can be used for prototyping other Data Grid applications.

Figure 8.7 Enterprise PACS with three PACS sites (light yellow boxes). Each site has a stand‐alone PACS with its own server, WSs, storage area network (SAN) archive, and storage backup, and each site operates independently. An enterprise PACS is formed when these three (or more) PAC systems are connected together to share images. In an enterprise PACS, a WS at each site can Q/R images from its own SAN for image display. A WS of any of the three PAC systems can also Q/R images from other sites using an image routing mechanism (IRM, yellow box) shown on the left. The weakness of this method of connecting the three PAC systems is that two single points of failure can occur: when a PACS server or the SAN fails, the interconnectivity of the three PAC systems breaks down. On the other hand, the fault‐tolerant Data Grid architecture shown in green can restore each site’s backup and its connections to the IRM, maintaining interconnectivity of the three systems in real time without human intervention. There are two types of PACS GAP in this architecture: the DICOM GAP (bottom) and the PACS GAP (middle left). The former is for PACS WSs that use the DICOM standard for image Q/R; the latter is for the DICOM file transfer used by some PACS.

Figure 8.8 General architecture of the fault‐tolerance metadata system for the Data Grid. There are three levels of fault tolerance: (top) Multiple grid access points (GAP); (middle) data access interface (DAI) servers; and (bottom) multiple metadata storage nodes.

Figure 8.9 Workflow of the Data Grid during image data archiving. Solid black lines (left) show the normal archive and backup operations: the first copy of the image file is sent from the acquisition gateway to its SAN1 P1, and two backup copies are sent to the Data Grid SAN2 P2 and SAN3 P2 for backup storage through its designated GAP1. The dotted black line shows when GAP 1 fails (red crossed lines) and GAP 2 takes over GAP 1’s functions automatically.

Figure 8.10 Workflows of the Data Grid during query/retrieve (Q/R) of an image file. (A) The PACS WS performs Q/R of its own PACS image file, either from its local archive or from the Data Grid; (B) the PACS WS performs Q/R of another PACS's image file from the Data Grid.

Figure 8.11 Three tasks (upper right, heavy green lines) of the Data Grid during disaster recovery when either a PACS server or the SAN fails; site 3 is used as an example. Task 1: allow the site 3 PACS WS to Q/R its own images from the Data Grid so that clinical operation can continue. Task 2: after the server and SAN have been restored, the Data Grid rebuilds P1 of SAN 3 with the site's own images. Task 3: after the server and SAN have been restored, the Data Grid rebuilds P2 of SAN 3, which holds the backup images of the other PACS connected to the Data Grid. All three tasks are performed without human intervention. The workflows and operations of Figures 8.9 and 8.10 allow the Data Grid to complete the three tasks automatically.
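The three recovery tasks above can be expressed as a short sketch. Assuming (hypothetically) that the grid's backup copies are addressable by image identifier, the logic reduces to serving the failed site's own images while it is down, then repopulating its two SAN partitions after restoration. The data structures here are illustrative stand-ins, not an actual Data Grid interface.

```python
# Hedged sketch of the Figure 8.11 disaster-recovery tasks for a failed site.
# grid_backups: illustrative mapping of image_id -> image data held in the grid.

def recover_site(grid_backups, own_ids, other_ids):
    # Task 1 (while the site is down): its WSs Q/R their own images
    # directly from the Data Grid, so clinical operation continues.
    serve_while_down = {i: grid_backups[i] for i in own_ids}

    # Tasks 2 and 3 (after server/SAN restoration), without human intervention:
    p1 = {i: grid_backups[i] for i in own_ids}    # Task 2: rebuild P1, own images
    p2 = {i: grid_backups[i] for i in other_ids}  # Task 3: rebuild P2, other sites' backups
    return serve_while_down, p1, p2
```

The key design point the figure makes is that Tasks 2 and 3 draw on the same replicated copies already used for Task 1, so no separate restore medium is needed.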

Chapter 09

Figure 9.1 Organizational chart of an imaging‐based clinical trial. Radiology core (blue) is responsible for all image‐related operations.

Figure 9.2 Typical workflow of an imaging‐based clinical trial. The images are sent from field sites (1, 2, …, n) to a radiology core (blue), where the images are checked by a quality control (QC) workstation (WS) (point a). Images are stored in the image server (point b). The server archives the images in an image repository and stores the metadata of each image in the database (point c). The images in the repository are backed up (point d).
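The four-point pipeline of Figure 9.2 (QC at point a, server storage at b, repository plus metadata database at c, backup at d) can be sketched as a simple ingest function. The field names and the QC rule are hypothetical placeholders, chosen only to make the flow concrete.

```python
# Illustrative sketch of the Figure 9.2 clinical-trial image workflow.
# Images are plain dicts here; a real system would handle DICOM objects.

def quality_control(image):
    # point a: a stand-in QC rule -- reject images missing key identifiers
    return all(image.get(k) for k in ("patient_id", "series_id", "pixels"))

def ingest(images):
    repository, metadata_db, backup = [], [], []
    for img in (i for i in images if quality_control(i)):     # point a
        repository.append(img["pixels"])                      # points b, c: image data
        metadata_db.append(                                   # point c: metadata only
            {k: img[k] for k in ("patient_id", "series_id")})
        backup.append(img["pixels"])                          # point d
    return repository, metadata_db, backup
```

Separating pixel data (repository) from metadata (database) at point c is what later lets the Data Grid of Figure 9.3 replicate the bulk image files independently of the trial database.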

Figure 9.3 Three‐core Data Grid architecture (green) for image backup in clinical trials. Cores A, B, and C have the same setup. The image backup storage (e.g. SAN) in every radiology core is separated into two partitions, P1 and P2. P1 is used for local backup, while P2 is contributed to the Data Grid to back up images from the other cores. Note that the Data Grid does not intrude on the core's backup operation in the image repository. Figure 9.5 also describes the data migration process from the current local hard disk backup to P1 of the SAN.
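Under the partition scheme of Figure 9.3, each core's image has one local backup in its own P1 and a grid copy in the P2 partition of every other core. A minimal sketch of this placement rule, with illustrative core and image identifiers:

```python
# Sketch of the three-core P1/P2 backup placement in Figure 9.3.
# Core and image names are illustrative.

def place_backups(cores, owner, image_id):
    """Return a mapping of partition -> image ids for one archived image."""
    placement = {f"{owner}/P1": [image_id]}  # local backup in the owner's P1
    for core in cores:
        if core != owner:
            # the other cores contribute their P2 partitions to the grid
            placement.setdefault(f"{core}/P2", []).append(image_id)
    return placement

# place_backups(["A", "B", "C"], "A", "IMG7")
# -> {"A/P1": ["IMG7"], "B/P2": ["IMG7"], "C/P2": ["IMG7"]}
```

Because every image ends up in two remote P2 partitions, any single core can fail without losing access to its trial images, which is the fault-tolerance property validated in the Figure 9.4 test bed.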

Figure 9.4 Data Grid test bed (green) with three international sites: the Image Processing and Informatics Laboratory (IPI), University of Southern California; the Hong Kong Polytechnic University (PolyU); and the Heart Institute (InCor), University of São Paulo, Brazil. The test bed used two real clinical trial image sets (MRI T1 and T2 images) for validation of the Data Grid. IPI (left, blue) is used as the site for testing data migration. GAP: grid access point; P1, P2: partition 1, 2; RAID: redundant array of independent disks.

Figure 9.5 Data migration from existing backup storage to the Data Grid (right, green) during Data Grid deployment. Image migration from the local backup of the radiology core (left, blue) to P1 of the IPI SAN and the Data Grid is through a three‐step procedure. In this scenario, after completion of the Data Grid deployment, the SAN becomes the local backup archive for the Laboratory's clinical trial images. P1 is the core's own backup, and P2 is used to back up images from the other cores. GAP: grid access point; T: trial.

Figure 9.6 (A) Left: Relative size of the AURORA dedicated stand‐alone breast MRI scanner. Right: Major components of the scanner. (B) Multimodality Web‐based viewing workstation: sagittal MRI, axial MRI, 3‐D US, digital mammography, and CAD structured reporting. (C) One view of a 3‐D MRI study using the AURORA dedicated breast MRI scanner. The patient was in the prone position with her breasts extending through the two round openings in the table (courtesy of AURORA Imaging Technology, Inc.). (D) Breast imaging data grid (BIDG) platform ready to be connected to three MRI sites. The ePR Web server is an additional component in the data grid (DG) for the enterprise system's electronic patient records.

Figure 9.7 The DICOM‐based enterprise dedicated MRI breast imaging Data Grid (BIDG) is an infrastructure that supports large‐scale breast imaging archive and distribution management. The AURORA‐dedicated MRI enterprise is used as an example. Compare the similarity between this figure and the general Data Grid architecture shown in Figure 8.11 in Chapter 8. Two additional components in this figure are the ePR Web server and the DICOM conversion unit (DCU) explained in the text (courtesy of AURORA Imaging Technology, Inc.). DB: database; GAP: grid access point; IPILab: Image Processing and Informatics Laboratory, University of Southern California; SAN: storage area network; WS: workstation.

Figure 9.8 The Breast Imaging Data Grid is based on the IHE XDS‐I workflow profile. Left (green): input sources. Right: consumers.