Biomedical Photonic Technologies
A state-of-the-art examination of biomedical photonic research, technologies, and applications
In Biomedical Photonic Technologies, a team of distinguished researchers delivers a methodical inquiry and evaluation of the latest developments in the field of biomedical photonics, with a focus on novel technologies, including optical microscopy, optical coherence tomography, fluorescence imaging-guided surgery, photodynamic therapy dosimetry, and optical theranostic technologies.
Each discussion of individual technologies includes examples of their contemporary application in areas like cancer therapy and drug delivery. Readers will discover the major research advancements in biomedical photonics from the last 20 years, ascertaining the basic principles of formation, development, and derivation of biomedical photonics phenomena at a variety of scales. Readers will also find:
Perfect for biophysicists and applied physicists, Biomedical Photonic Technologies will also benefit bioengineers and biotechnologists in academia and in industry.
Page count: 540
Year of publication: 2023
Cover
Title Page
Copyright
Preface
1 Advanced Wide‐Field Fluorescent Microscopy for Biomedicine
1.1 Introduction
1.2 Optical Sectioning by Structured Illumination
1.3 Super‐Resolution Imaging with Structured Illumination
1.4 3D Imaging with Light Sheet Illumination
1.5 Summary
References
2 Fluorescence Resonance Energy Transfer (FRET)
2.1 Fluorescence
2.2 Characteristics of Resonance Energy Transfer
2.3 Theory of Energy Transfer for a Donor–Acceptor Pair
2.4 Types of FRET Application
2.5 Common Fluorophores for FRET
2.6 Effect of FRET on the Optical Properties of Donor and Acceptor
2.7 Qualitative FRET Analysis
2.8 Quantitative FRET Measurement
2.9 Conventional Instrument for FRET Measurement
2.10 Applications of FRET in Biomedicine
References
3 Optical Coherence Tomography Structural and Functional Imaging
3.1 Introduction
3.2 Principles of OCT
3.3 Performance of OCT
3.4 Development of OCT Imaging
3.5 OCT Angiography
3.6 OCTA Quantification
3.7 Applications of OCT
3.8 Conclusion
References
4 Coherent Raman Scattering Microscopy and Biomedical Applications
4.1 Introduction
4.2 Coherent Anti‐Stokes Raman Scattering (CARS) Microscopy
4.3 Stimulated Raman Scattering (SRS) Microscopy
4.4 Biomedical Applications of CRS Microscopy
4.5 Prospects and Challenges
References
5 Fluorescence Imaging‐Guided Surgery
5.1 Introduction
5.2 Basics of Fluorescence Image‐Guided Surgery
5.3 Fluorescence Probes for Imaging‐Guided Surgery
5.4 Typical Fluorescence Imaging‐Guided Surgeries
5.5 Limitations, Challenges, and Possible Solutions
References
6 Enhanced Photodynamic Therapy
6.1 Introduction
6.2 Photosensitizers for Enhanced PDT
6.3 Light Sources for Enhanced PDT
6.4 Oxygen Supply for Enhanced PDT
6.5 Synergistic Therapy for Enhanced PDT
6.6 PDT Dosimetry
6.7 Clinical Applications
6.8 Future Perspective
Acknowledgments
References
7 Optogenetics
7.1 Introduction
7.2 Introduction of Optogenetics
7.3 The History and Development of Optogenetics
7.4 Photosensitive Protein
7.5 Precise Optogenetics
7.6 Application and Development of Optogenetics
7.7 Prospects
References
8 Optical Theranostics Based on Gold Nanoparticles
8.1 Thermoplasmonic Effects of AuNP
8.2 Gold Nanoparticles‐Mediated Optical Diagnosis
8.3 Gold Nanoparticle‐Based Anticancer Applications
8.4 Precise Manipulation of Molecules by Laser Gold Nanoparticles Heating
8.5 Gold Nanoparticles in Clinical Trials
References
Index
End User License Agreement
Chapter 2
Table 2.1 Representative Förster distance for various donor‐acceptor pairs....
Table 2.2 Förster distance for tryptophan‐acceptor pairs.
Table 2.3 Calculated Förster distances (in nanometers) for FP pairings.
Chapter 3
Table 3.1 Statistical characteristics of OCT amplitudes and AD OCTA signals ...
Table 3.2 Main research group and their proposed OCTA algorithms [60].
Chapter 5
Table 5.1 Excitation and emission wavelengths of these FDA‐approved probes a...
Table 5.2 Fluorescence imaging‐guided minimal invasive surgeries with each o...
Table 5.3 Major plastic surgeries that use ICG‐based fluorescence imaging ...
Chapter 6
Table 6.1 Treatment parameters of typical clinical PSs.
Table 6.2 Advantages of different monotherapies.
Table 6.3 PDT‐related triple/multiple‐modal therapy.
Table 6.4 Available techniques for measuring dosimetric parameters.
Table 6.5 Clinical applications of PDT with typical PSs.
Chapter 1
Figure 1.1 (a) Comparison of wide‐field microscopy. (b) Point‐scanning micro...
Figure 1.2 (a) Sketch of the depth of field. (b) The defocused signals cause...
Figure 1.3 Structured illumination with grating for optical section.
Figure 1.4 Structured illumination with digital micromirror device.
Figure 1.5 Structured illumination with LED array. (a) Schematic of structur...
Figure 1.6 Structured illumination with random speckle patterns. (a) Schemat...
Figure 1.7 Algorithm for HiLo reconstruction. (a) The flow chart of HiLo alg...
Figure 1.8 Optical section algorithms with Hilbert–Huang transform.
Figure 1.9 (a) Vertical sections through the 3D PSFs and resulting definitio...
Figure 1.10 Principle of the super‐resolution SIM. (a) Schematic of structur...
Figure 1.11 The layout of the 2D‐SIM setup based on the proposed polarizatio...
Figure 1.12 (a) Principle of light sheet. (b) The thickness and field of vie...
Figure 1.13 Light sheet with cylinder lens. (a) Top view and side view of il...
Figure 1.14 Light sheet with scanning beams. (a) Gaussian beam. (b) Bessel b...
Figure 1.15 Light sheet with multi‐direction illumination and detection. (a,...
Figure 1.16 Light sheet microscope for extended field of view. (a) Schematic...
Figure 1.17 Light sheet with a single objective.
Figure 1.18 Single‐lens light sheet illumination with tilt angle corrected....
Chapter 2
Figure 2.1 One form of a Jablonski diagram.
Figure 2.2 Absorption and emission spectra of CFP.
Figure 2.3 Absorption and emission spectra of YFP.
Figure 2.4 Energy levels of FRET.
Figure 2.5 Dependence of the energy transfer efficiency (E) on distance. R0 ...
Figure 2.6 Intra‐molecular FRET.
Figure 2.7 Inter‐molecular FRET.
Figure 2.8 Ratio imaging of intracellular calcium ion.
Figure 2.9 Excitation (a) and Emission (b) spectra of FPs.
Figure 2.10 Illustration of FRET constructs before and after complete and pa...
Figure 2.11 FRET constructs of 1D‐nA after partial acceptor photobleaching,
Figure 2.12 Type 1: Interaction.
Figure 2.13 Type 2: Kinase activity assay.
Figure 2.14 Type 3: Conformational change of molecule.
Figure 2.15 The balance of anti‐apoptotic and pro‐apoptotic BCL‐2 proteins d...
Figure 2.16 Kinetics of Bax oligomerisation on the single‐cell level as dete...
Figure 2.17 Improved SCAT probes for caspases activation. (a) Scheme of SCAT...
Figure 2.18 Single‐cell imaging analysis of SCAT3‐expressing living HeLa cel...
Figure 2.19 Construct of Cameleon.
Figure 2.20 Ratio imaging of intracellular calcium ion.
Figure 2.21 FRET sensor for estrogen ligand binding.
Figure 2.22 FRET sensor for protein phosphorylation.
Chapter 3
Figure 3.1 Penetration depth and resolution of OCT and other imaging technol...
Figure 3.2 Michelson interferometer and optical low coherence interference. ...
Figure 3.3 Reconstruction of 3D OCT image for mouse full eye in vivo. (a) Wi...
Figure 3.4 Schematic of Fourier domain low‐coherence interference. (a) spect...
Figure 3.5 Lateral resolution and depth of focus of OCT system adopting obje...
Figure 3.6 Sensitivity falling off versus depth. The actual imaging depth of...
Figure 3.7 3D imaging of the whole anterior segment of the human eye. (a) 3D...
Figure 3.8 OCT images of chick embryonic heart. (a) Longitudinal section; (b...
Figure 3.9 (a) Schematic of an OCT structural cross section. Due to finite r...
Figure 3.10 A schematic diagram of the calculation of complex decorrelation....
Figure 3.11 A mouse cortex microangiography via complex cross‐correlation‐ba...
Figure 3.12 Flow phantom data validate the feasibility of ID‐OCTA. (a) Struc...
Figure 3.13 Schematic of the vascular shape and performance of 3D Hessian fi...
Figure 3.14 Proposed ID‐binary image similarity (ID‐BISIM) thresholding meth...
Figure 3.15 The numerical simulation results. (a) Plot of decorrelation vers...
Figure 3.16 Stimulus‐evoked hemodynamic responses in rat cortex in vivo. (a)...
Figure 3.17 (a) Schematic of a typical sample arm in an OCT system. (b) B‐sc...
Figure 3.18 Longitudinal monitoring of chronic post‐PT vascular response in ...
Figure 3.19 The schematic diagram of INS‐fOCT and INS‐evoked spatial and tem...
Figure 3.20 OCTA angiograms of retina from mouse with ID‐OCTA algorithm. (a–...
Figure 3.21 ID‐OCT angiograms of human retina with prototype OCTA system. (a...
Figure 3.22 Representative OCT structural and angiographic images of mouse d...
Chapter 4
Figure 4.1 (a) Energy diagram and optical transitions of IR absorption, Rayl...
Figure 4.2 Transition diagrams of (a) CARS and (b) SRS. Here, the particular...
Figure 4.3 Nonresonant background of CARS. (a) Diagrams of the four‐wave‐mix...
Figure 4.4 Principle of stimulated Raman scattering. (a) Energy diagram of S...
Figure 4.5 The setup (a) and principle (b) of hyperspectral stimulated Raman...
Figure 4.6 Techniques to speed up the rate of stimulated Raman scattering mi...
Figure 4.7 Label‐free histology for rapid diagnosis of stimulated Raman scat...
Figure 4.8 (a) The SRS spectrum of plaque and normal tissue, showing the blu...
Figure 4.9 Labeling Raman imaging. (a) SRS imaging with deuterated palmitic ...
Chapter 5
Figure 5.1 Publications related to fluorescence imaging‐guided surgery searc...
Figure 5.2 Basic setup of the fluorescence imaging system for imaging‐guided...
Figure 5.3 ALA‐based fluorescence imaging system for glioma resection at Dar...
Figure 5.4 Intraoperative paired fluorescence and white light images during ...
Figure 5.5 Intraoperative fluorescence images (Top) of different types of th...
Figure 5.6 Pre‐surgery MRI image (a), intraoperative RGB (b), and fluorescen...
Figure 5.7 Intraoperative ICG fluorescence images of a breast cancer patient...
Figure 5.8 Images of the surgical field before dissection of Calot's triangl...
Figure 5.9 Comparison between traditional (a) and ICG fluorescence (b) laparo...
Figure 5.10 ICG images of an ICG‐positive laryngeal carcinoma (malignant) on...
Figure 5.11 Intraoperative fluorescence images of the recipient of a living ...
Figure 5.12 Commercialized fluorescence imaging systems for plastic surgery....
Figure 5.13 Intraoperative bone blood perfusion imaging setup.
Figure 5.14 Images acquired intraoperatively during surgical treatment of an...
Figure 5.15 Comparison of autofluorescence in different tissues in the surgi...
Figure 5.16 Images of intraoperative auto‐fluorescence images of the PGs. (a...
Figure 5.17 Photo and image of the goggle and the surgical cavity seen by th...
Chapter 6
Figure 6.1 Publications related to PDT searched in Web of Science over a dur...
Figure 6.2 Hot topics in PDT research.
Figure 6.3 Simplified Jablonski diagram for PDT of Type I, II, and III.
Figure 6.4 Pathways of PDT response.
Figure 6.5 New LEDs for potential PDT applications.
Figure 6.6 Oxygen supply strategies for enhanced PDT.
Figure 6.7 Light fractionation and metronomic PDT.
Figure 6.8 Synergistic therapy for enhanced PDT.
Figure 6.9 Schematic diagram of the time‐ and spectral‐resolved ¹O₂ luminesc...
Figure 6.10 Time (a) and spectrally (b) resolved NIR luminescence from 15 μM...
Figure 6.11 Clonogenic surviving fraction versus cumulative ¹O₂ luminescence...
Figure 6.12 Schematic diagram of ¹O₂ luminescence imaging system.
Figure 6.13 Dynamic monitoring NIR luminescence in blood vessels in DSWC mod...
Figure 6.14 Schematic diagram for individual PDT with precise dosimetry.
Chapter 7
Figure 7.1 Recording and stimulation: past and present. (a) The first action...
Figure 7.2 A variety of applications use optogenetic probes to both read out...
Figure 7.3 XFP expression in Brainbow transgenic mice. (a, b) Thy1‐Brainbow‐...
Figure 7.4 Single optrode with dual stimulation and recording functions [17]...
Figure 7.5 Basic Properties of Known Single‐Component Optogenetic Tools with...
Figure 7.6 Combining NpHR with ChR2 for noninvasive optical control. (a) Hip...
Figure 7.7 Light propagation in brain tissue for in vivo optogenetics [23]. ...
Figure 7.8 Overview of the optical neural interface [2]. (a) Schematic of op...
Figure 7.9 Fiberless optical stimulation using μLEDs [37]. (a) GaN μLEDs gro...
Figure 7.10 Characterization of fluorescence imaging with intensity modulati...
Figure 7.11 Schematic diagram of laser scanning optical system [49]. (a) Sch...
Figure 7.12 Parallel light‐targeting methods [49]. (a) Optical devices used ...
Figure 7.13 Experimental setup and principle of MTF‐LS [57]. (a) CGH based o...
Figure 7.14 Experimental device for parallelization of light stimulus based ...
Figure 7.15 Photochemical reactions of various photosensitive elements under...
Figure 7.16 Photostimulation in freely moving mice performing a detection ta...
Figure 7.17 Transfer of the gene encoding ChR2 renders heart muscle cells se...
Chapter 8
Figure 8.1 Schematic of transient events involved in thermoplasmonic effects...
Figure 8.2 Plasmonic absorption cross section σabs as a function of wav...
Figure 8.3 Temporal evolution of electron temperature Te (dashed line) and A...
Figure 8.4 Illustration of heat conduction through Au–water interface.
Figure 8.5 Spatiotemporal plot of generation and dissipation of heat in AuNP...
Figure 8.6 Temporal evolution of temperature rise for AuNP (red) and water a...
Figure 8.7 Characteristic diffusion time as a function of nanoparticle radiu...
Figure 8.8 Fluence thresholds as a function of NP radius for 355 nm (blue) a...
Figure 8.9 Two‐photon photoluminescence (TPL) and TPL lifetime imaging of Au...
Figure 8.10 Photothermal imaging of small AuNPs in a cell. (a) DIC microscop...
Figure 8.11 Photo‐acoustic imaging of AuNPs in vivo. (a) A typical experimen...
Figure 8.12 (a) Experimental (symbols) and calculated (solid lines) temperat...
Figure 8.13 Schematic of protein inactivation by laser gold nanoparticles he...
Figure 8.14 Highly localized chromosome manipulation using site‐directed pos...
Figure 8.15 Photothermal substrates used for VNB‐mediated photoporation. (a)...
Edited by Zhenxi Zhang, Shudong Jiang, and Buhong Li
Editors
Prof. Zhenxi Zhang
Xi'an Jiaotong University
Institute of Biomedical Photonics and Sensing
28 Xianning Xi Road
710049 Xi'an
China
Prof. Shudong Jiang
Dartmouth College
Thayer School of Engineering
14 Engineering Drive
Hanover, NH 03755
USA
Prof. Buhong Li
Hainan University
School of Science
58 Renmin Road
570228 Haikou
China
Cover Image: © GrAl/Shutterstock
All books published by WILEY‐VCH are carefully produced. Nevertheless, authors, editors, and publisher do not warrant the information contained in these books, including this book, to be free of errors. Readers are advised to keep in mind that statements, data, illustrations, procedural details or other items may inadvertently be inaccurate.
Library of Congress Card No.: applied for
British Library Cataloguing‐in‐Publication Data
A catalogue record for this book is available from the British Library.
Bibliographic information published by the Deutsche Nationalbibliothek
The Deutsche Nationalbibliothek lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at <http://dnb.d-nb.de>.
© 2023 WILEY‐VCH GmbH, Boschstraße 12, 69469 Weinheim, Germany
All rights reserved (including those of translation into other languages). No part of this book may be reproduced in any form – by photoprinting, microfilm, or any other means – nor transmitted or translated into a machine language without written permission from the publishers. Registered names, trademarks, etc. used in this book, even when not specifically marked as such, are not to be considered unprotected by law.
Print ISBN: 978‐3‐527‐34656‐1
ePDF ISBN: 978‐3‐527‐82353‐6
ePub ISBN: 978‐3‐527‐82354‐3
oBook ISBN: 978‐3‐527‐82355‐0
The development of laser technology has brought tremendous changes to optical science and technology, making transformative research achievements and application technologies all but inevitable. Biomedical photonics underpins many laser applications in biomedical engineering and is widely used in cellular micro/nano surgery, ophthalmic refractive surgery, tumor diagnosis and treatment, and other fields. In‐depth study of its physical mechanisms is of great significance for advancing the related application technologies. This book presents the latest developments in biomedical photonics and provides an effective reference for further research, with a novel and unique perspective, clear and well‐reasoned research methods, and systematic, rich content. The book consists of eight chapters, which introduce the theoretical basis of biomedical photonics in fields ranging from imaging to therapy. Drawing on the latest technological achievements and applications, and combined with the current state of research and development, the book discusses in detail various imaging and therapeutic technologies based on the classical physical mechanisms of laser–tissue interaction, and systematically introduces the latest developments and applications of biomedical photonics.
The book proceeds from the basic theory of biomedical photonics to its applications, and summarizes current research hotspots, research ideas, applied technologies and methods, and future development trends. It also includes the authors' main research achievements in biomedical photonics over the past 20 years, describing the basic principles of the formation, development, and derivation of biomedical photonic phenomena at multiple scales, while introducing related technologies and applications from laser optics, physics, biology, thermal science, and nanomaterials. The book embodies the character of interdisciplinary research and presents a new interdisciplinary field of great vitality. It can be used as a textbook for information science, physical science, and life science, as well as a reference for researchers in related fields of science and technology.
This book focuses on biomedical applications, and the main contents are as follows:
In Chapter 1, the principles and biomedical applications of advanced wide‐field fluorescence microscopy based on optical sectioning are summarized.
In Chapter 2, the physical and molecular processes, quantitative measurement, and applications of fluorescence resonance energy transfer in living cells are systematically analyzed.
In Chapter 3, the basic theory and the structural and functional imaging of optical coherence tomography are summarized.
In Chapter 4, the basic principles of coherent Raman scattering microscopy and its recent imaging applications in histopathology and biological studies are introduced.
In Chapter 5, the history and basis of fluorescence imaging‐guided surgery are systematically described, and research on and applications of typical fluorescence imaging‐guided surgeries are summarized.
In Chapter 6, the latest developments and clinical applications of photodynamic therapy are reviewed, and its main challenges and prospects are discussed.
In Chapter 7, research on opsins and precise optogenetics is introduced, and the limitations and challenges of the technology are set out based on its applications in neuroscience.
In Chapter 8, the classical physical mechanisms of the thermoplasmonic effects of gold nanoparticles and the related gold‐nanoparticle‐enhanced optical imaging, detection, and phototherapy applications are described.
The authors of each chapter are:
Chapter 1: Prof. Hui Li and Dr. Chong Chen
Chapter 2: Prof. Tongsheng Chen
Chapter 3: Prof. Peng Li and Prof. Zhihua Ding
Chapter 4: Prof. Minbiao Ji
Chapter 5: Prof. Shudong Jiang
Chapter 6: Profs. Buhong Li and Li Lin
Chapter 7: Prof. Ke Si
Chapter 8: Prof. Cuiping Yao, Dr. Xiaoxuan Liang, Prof. Sijia Wang, and Prof. Jing Xin
The draft of this book was completed by Ph.D. candidate Ping Wang.
We sincerely hope that the publication of this book will help readers to understand the development of biomedical photonics, especially in China.
16 June 2022
Zhenxi Zhang, Shudong Jiang and Buhong Li
Chong Chen and Hui Li
Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Science, Suzhou, 215163, China
Life is assembled, in very different ways at each level, from molecules, subcellular organelles, cells, and tissues up to organs and the whole organism. The assembly at each level has its own structure, dynamics, and functions, making for a complex and beautiful living world. To study such complex structures, as Nobel laureate Richard Feynman said, "It is very easy to answer many of these fundamental biological questions; you just look at the thing!" However, different imaging tools must be developed for different purposes to look at the various biological objects, across scales from nanometers to centimeters.
Among all imaging tools, optical microscopy plays the most important role in inspecting the microscale biological world. Although the optical microscope was invented more than 300 years ago, the last 30 years have seen significant improvements in microscope technique. These improvements fall mostly into two areas: sample‐labeling techniques and new imaging modalities.
Image contrast is the first concern in optical imaging. To date, fluorescence imaging has provided the highest contrast, because the excitation light is filtered out. Organic dyes, quantum dots, and fluorescent proteins are the three most widely used fluorescent labels. Fluorescent proteins, recognized with the Nobel Prize in Chemistry in 2008, provide a genetic means of labeling, making fluorescence imaging of live cells, organelles, and even live animals possible.
High‐end microscopes fall into two categories: wide‐field microscopes and point‐scanning microscopes (Figure 1.1). A wide‐field microscope forms images on a camera and usually offers high speed and high photon efficiency. Typical examples are the TIRF microscope, the structured illumination microscope, and the single‐molecule localization super‐resolution microscope. A point‐scanning microscope forms images by rapidly scanning the excitation laser beam or the sample; it is usually slower but has higher axial sectioning capability. Typical examples include laser scanning confocal microscopes, two‐photon microscopes, and STED super‐resolution microscopes.
Figure 1.1 Comparison of (a) wide‐field microscopy and (b) point‐scanning microscopy.
Source: Chong Chen.
This chapter introduces advances in wide‐field fluorescence microscopy over the last ten years. We first introduce methods that improve optical sectioning and resolution by structured illumination, and then methods based on light‐sheet illumination. The optical principle, setup, and image‐processing method are introduced in each section. The chapter ends with prospects for future development.
Optical sectioning in microscopy defines the capability to resolve structure axially. In an epifluorescence microscope, the entire sample volume is illuminated, and all of the excited fluorescence collected by the objective reaches the array detector. Consequently, when part of the sample goes out of focus, its image becomes blurred, but its signal does not disappear. This problem is a significant hindrance in wide‐field microscopy.
In optical microscopy, the depth of focus is how far the sample plane can move while the specimen remains in acceptable focus. The numerical aperture of the objective lens is the main factor determining the depth of focus D:

D = λn/NA² + n·e/(M·NA)

where λ is the wavelength of the fluorescent light, n is the refractive index of the medium [usually air (1.000) or immersion oil (1.515)], and NA is the numerical aperture of the objective lens. The variable e is the smallest distance resolvable by a detector placed in the image plane of the microscope objective, whose lateral magnification is M. For a high‐end fluorescence microscope with an NA 1.4, 100× objective, the depth of focus is on the order of 500–700 nm, depending on the fluorescence wavelength. This depth of field defines the best optical sectioning an epifluorescence microscope can achieve.
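The two‐term depth‐of‐focus expression D = λn/NA² + n·e/(M·NA), consistent with the variables defined above, can be checked numerically. The particular values used here (520 nm emission, a 6.5 μm camera pixel) are illustrative assumptions, not parameters from the text:

```python
# Depth of focus D = lambda*n/NA^2 + n*e/(M*NA) for a wide-field microscope.
# The numerical values below are illustrative assumptions.

def depth_of_focus_nm(wavelength_nm, n, na, pixel_um, mag):
    """Total depth of focus in nanometers."""
    wave_term = wavelength_nm * n / na**2               # diffraction-limited term
    detector_term = n * (pixel_um * 1000) / (mag * na)  # detector-resolution term
    return wave_term + detector_term

# NA 1.4 oil-immersion objective (n = 1.515), 100x magnification,
# 6.5 um camera pixel, green emission at 520 nm:
print(round(depth_of_focus_nm(520, 1.515, 1.4, 6.5, 100)))  # ~470 nm
```

At longer emission wavelengths the result moves toward the upper end of the few‐hundred‐nanometer range quoted above, with the diffraction term dominating over the detector term.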
Figure 1.2 (a) Sketch of the depth of field. (b) The defocused signals cause blur of the focused image.
Source: Chong Chen.
However, when imaging samples thicker than the microscope's depth of focus, out‐of‐focus planes of the sample are also excited and form defocused images at the camera plane (Figure 1.2). The superposition of these defocused images lowers the contrast of the captured image and, in practice, lowers the axial resolution. Imperfections in the microscope's optics and scattering of the fluorescence by the sample itself make the situation even worse. The priority in improving wide‐field performance therefore lies in eliminating the out‐of‐focus signal to yield better optical sectioning.
In a laser scanning confocal microscope, the out‐of‐focus light is rejected by a pinhole. In a wide‐field microscope no pinhole can be used, since imaging relies on an array detector. One way to reject out‐of‐focus signals is structured illumination. Using a grating or a digital micromirror device (DMD), stripe patterns are projected onto the sample plane so that a structured illumination excites the fluorescent molecules within the focal plane. The stripes can be made very sharp at focus if a proper grating period is used. Out of focus, the stripe pattern washes out into a uniform, unmodulated background. The image formed by the microscope therefore consists of striped in‐focus features superposed on uniformly illuminated out‐of‐focus features. A post‐processing algorithm can then reject this background, yielding better optical sectioning.
Optical sectioning with structured illumination thus requires two ingredients: optical instrumentation to create the structured illumination, and a reconstruction algorithm for the optical section. These two aspects are discussed in the following sections.
For structured illumination, the excitation light field needs to be patterned, and the pattern needs to be shifted or rotated to capture all sample information. Several methods have been developed for this purpose.
The fluorescence grating imager inserts a grid into the field‐diaphragm plane of a fluorescence microscope (Figure 1.3). The grid projection can be shifted by translating the grating or by tilting a plane‐parallel glass plate located directly behind the grid. This method requires very little modification of the microscope, so it is easy to implement and low cost; Zeiss uses the technique in its ApoTome accessory for 3D imaging [1]. However, the imaging speed is limited by the mechanical movement of the grating or glass plate. Generally, about 10 frames per second can be obtained, which is not fast enough for some subcellular dynamics. Another drawback is that the period of a grating is fixed, so the grid must be exchanged to use a different period.
Figure 1.3 Structured illumination with grating for optical section.
Source: Chong Chen.
Figure 1.4 Structured illumination with digital mirror device.
Source: Chong Chen.
To avoid mechanical translation, a DMD was introduced to project fringe patterns onto the sample plane [2]. The DMD is a micro‐electro‐mechanical system (MEMS) consisting of a few hundred thousand tiny switchable mirrors with two stable states (e.g. −12° and +12°). When a micromirror is set at +12° toward the illumination, it is in the "on" state; at −12° it is in the "off" state. The mirrors are highly reflective and offer a higher refresh rate and a broader spectral response than a grating (Figure 1.4). However, since the DMD is a reflective device and must be positioned at 12° in the light path, the optical layout is more complex than with a grating [3–6].
Structured illumination can also be realized with an LED array as the light source, as implemented by V. Poher et al. [7]. The microstructured light source is an InGaN LED consisting of 120 side‐by‐side, individually addressable microstripe elements. Each stripe is 17 μm wide and 3600 μm long, with a center‐to‐center spacing of 34 μm, giving an overall diode structure of 3.6 × 4.08 mm. A dedicated electrical driver allows arbitrary combinations of stripes to be driven simultaneously to produce programmable line patterns (Figure 1.5). Using an LED array for structured illumination requires no modification of the microscope, only a change of light source. The system contains no moving parts, and the LED array can display up to 50 000 independent line patterns per second. However, the brightness of such LED arrays is still limited.
Figure 1.5 Structured illumination with LED array. (a) Schematic of structured illumination microscopy; (b) Schematic of LED array.
Source: Chong Chen.
Instead of a well‐defined strip pattern, structured illumination could also be random speckle patterns from a laser, as invented by J. Mertz group
[8]
. Speckle patterns are random and granular‐intensity patterns that exhibit inherently high contrast. Fluorescence images obtained with speckle illumination are therefore also granular; however, the contrast of the observed granularity provides a measure of how in focus the sample is: High observed contrast indicates that the sample is dominantly in focus, whereas low observed contrast indicates it is dominantly out of focus. The diffuser randomizes the phase front of the laser beam, resulting in a speckle pattern that is projected into the sample via a microscope objective. To obtain the latter, random the speckle patterns are applied within a single exposure of the camera, effectively simulating uniform illumination. Randomization of the speckle pattern is easily achieved by translating or rotating the diffuser (
Figure 1.6
). When the diffuser is static, the speckle illumination exhibits high contrast (top right panel). When the diffuser is rapidly oscillated by a galvanometric motor, the resulting speckle becomes blurred over the course of the camera exposure, effectively simulating uniform illumination (bottom right panel). The observed speckle contrast thus serves as a weighting function, indicating the in-focus to out-of-focus ratio in a fluorescence image. While effective, this technique proved to be slow, since several images were required to obtain an accurate estimate of speckle contrast. A later implementation evaluated speckle contrast in space rather than time, using a single image
[9]
.
Figure 1.6 Structured illumination with random speckle patterns. (a) Schematic of speckle illumination microscopy; (b) random speckle illumination image and uniform wide‐field image.
Source: Chong Chen.
In optical sectioning with structured illumination, multiple images are captured for each layer with the same stripe period but different phases. A reconstruction algorithm is then used to calculate a sectioned image from the multiple raw images. Several algorithms with different performance characteristics have been developed to date.
The optical system consists simply of an illumination mask $S(t_0, w_0)$, which is imaged onto an object of amplitude transmittance or reflectance $\tau(t_1, w_1)$; the final image is recorded by a CCD camera in the image plane $(t, w)$ [10, 11]. The mask is illuminated incoherently, which permits us to write the image intensity as
$$I(t,w) = \iint S(t_0,w_0)\,\Bigl|\iint h_1(t_0+t_1,\,w_0+w_1)\,\tau(t_1,w_1)\,h_2(t_1+t,\,w_1+w)\,dt_1\,dw_1\Bigr|^2\,dt_0\,dw_0 \tag{1.2}$$
where $h_{1,2}$ represents the amplitude point-spread function (PSF) of the optical system. The optical coordinates $(t, w)$ are related to the real coordinates $(x, y)$ through $(t, w) = (2\pi/\lambda)(x, y)\,n\sin\alpha$, where $n\sin\alpha$ is the numerical aperture (NA) and $\lambda$ denotes the wavelength.
The illumination mask takes the form of a one-dimensional grid and can be written for simplicity as
$$S(t_0, w_0) = 1 + m\cos(\tilde{\nu}\, t_0 + \varphi_0) \tag{1.3}$$
where $m$ denotes a modulation depth and $\varphi_0$ is an arbitrary spatial phase. The normalized spatial frequency $\tilde{\nu}$ is related to the actual spatial frequency $\nu$ through $\tilde{\nu} = \beta\lambda\nu/\mathrm{NA}$, where $\beta$ denotes the magnification between the grid plane and the specimen plane. Substituting Eq. (1.3) into Eq. (1.2) gives
$$I = I_0 + I_c\cos\varphi_0 + I_s\sin\varphi_0 \tag{1.4}$$
where $I_0$ represents a conventional wide-field image, and $I_c$ and $I_s$ represent the images that are due to masks of the forms $m\cos(\tilde{\nu}\, t_0)$ and $m\sin(\tilde{\nu}\, t_0)$, respectively. These definitions suggest that if we are able to form $I_p = (I_c^2 + I_s^2)^{1/2}$, we would remove the grid pattern from the image of the specimen. This result can be readily achieved by taking three images, $I_1$, $I_2$, and $I_3$, corresponding to the relative spatial phases $\varphi_0 = 0$, $\varphi_0 = 2\pi/3$, and $\varphi_0 = 4\pi/3$, respectively. So
$$I_p = \frac{\sqrt{2}}{3}\bigl[(I_1 - I_2)^2 + (I_1 - I_3)^2 + (I_2 - I_3)^2\bigr]^{1/2} \tag{1.5}$$
which is analogous to square-law detection in communications systems. Alternatively, $I_p$ can be formed as
$$I_p = \frac{2}{3}\bigl|I_1 + I_2\,e^{2\pi i/3} + I_3\,e^{4\pi i/3}\bigr| \tag{1.6}$$
which is the equivalent of homodyne detection. We note that the conventional image, $I_0$, can be recovered from $I_0 = (I_1 + I_2 + I_3)/3$.
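The three-phase square-law reconstruction can be sketched numerically as follows (a minimal illustration, not code from the references; the synthetic object, modulation depth, and grid frequency are assumptions chosen for demonstration):

```python
import numpy as np

def os_sim_section(i1, i2, i3):
    """Square-law optical-section reconstruction from three
    structured-illumination images with phases 0, 2*pi/3, 4*pi/3."""
    return np.sqrt(2.0) / 3.0 * np.sqrt(
        (i1 - i2) ** 2 + (i1 - i3) ** 2 + (i2 - i3) ** 2)

def widefield(i1, i2, i3):
    """Conventional wide-field image recovered from the same raw data."""
    return (i1 + i2 + i3) / 3.0

# Synthetic in-focus object modulated by a one-dimensional grid pattern.
x = np.linspace(0, 2 * np.pi, 256)
sample = np.outer(np.ones(256), 1.0 + 0.5 * np.sin(3 * x))  # arbitrary object
m, nu = 0.8, 10.0  # modulation depth and grid frequency (assumed values)
raws = [sample * (1.0 + m * np.cos(nu * x + phi))
        for phi in (0.0, 2 * np.pi / 3, 4 * np.pi / 3)]

ip = os_sim_section(*raws)
i0 = widefield(*raws)
# For a perfectly in-focus object the grid pattern is removed exactly:
# ip equals m * sample and i0 equals the plain wide-field image.
print(np.allclose(ip, m * sample), np.allclose(i0, sample))
```

For a defocused layer the modulation depth of the grid, and hence `ip`, falls off rapidly, which is precisely the sectioning mechanism.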
HiLo microscopy is based on the acquisition of two images with different types of illumination in order to obtain one optically sectioned image (Figure 1.7) [12, 13]. A uniform-illumination image is used to obtain the high-frequency (Hi) components of the image, and a nonuniform-illumination image is used to obtain the low-frequency (Lo) components of the image. The corresponding intensity distributions of the uniform- and structured-illumination images are denoted as $I_u(\mathbf{r})$ and $I_s(\mathbf{r})$, respectively. The intensity distributions of the high- and low-frequency images are referred to as $I_{Hi}(\mathbf{r})$ and $I_{Lo}(\mathbf{r})$, with the spatial, two-dimensional coordinates $\mathbf{r} = (x, y)$. The resulting full-frequency optically sectioned image is then obtained by
$$I_{HiLo}(\mathbf{r}) = I_{Hi}(\mathbf{r}) + \eta\,I_{Lo}(\mathbf{r}) \tag{1.7}$$
with $\eta$ being a scaling factor that depends on the experimental configuration of the setup.
In order to obtain the high-frequency in-focus components, a typical characteristic of the optical transfer function (OTF) of a standard wide-field microscope is exploited: high-frequency components are well resolved only as long as they are in focus, while low-frequency components remain visible even if they are out of focus. Hence, the high-frequency components are directly extracted by using
$$I_{Hi}(\mathbf{r}) = \mathrm{HP}\bigl[I_u(\mathbf{r})\bigr] \tag{1.8}$$
whereby HP denotes a Gaussian high-pass filter with the cutoff frequency $k_c$, applied in the frequency domain.
The low-frequency component of the image is obtained by calculating
$$I_{Lo}(\mathbf{r}) = \mathrm{LP}\bigl[C(\mathbf{r})\,I_u(\mathbf{r})\bigr] \tag{1.9}$$
with the complementary low-pass filter LP. The speckle contrast $C(\mathbf{r})$ acts as a weighting function for extracting the in-focus contributions and rejecting the out-of-focus contributions of the uniform-illumination image $I_u(\mathbf{r})$. The overall spatial contrast is influenced by the speckles in the illumination as well as sample-induced speckles. In order to correct for the influence of the sample-induced speckles, the difference image
$$I_\delta(\mathbf{r}) = I_s(\mathbf{r}) - I_u(\mathbf{r}) \tag{1.10}$$
is used for the speckle contrast calculation. The speckle contrast is defined as
$$C(\mathbf{r}) = \frac{\sigma_\delta}{\mu_u} \tag{1.11}$$
where $\mu_u$ and $\sigma_\delta$ are the mean of $I_u$ and the standard deviation of $I_\delta$, respectively. The speckle contrast is calculated over a partition of local evaluation areas $A$. It is assumed that each area is large enough to encompass several imaged speckle grains. The axial resolution is further increased by applying a band-pass filter of tunable width $\sigma_w$ to the difference image $I_\delta$ before evaluating $C(\mathbf{r})$. As a result, the optical sectioning depth of the Lo component can be adjusted by tuning $\sigma_w$. In order to also adjust the optical sectioning depth of the Hi component, the cutoff frequency of the Gaussian high-pass filter is tuned by setting $k_c = 0.18\,\sigma_w$. Since the high- and low-frequency components of the image are now determined, the resulting optically sectioned HiLo image is eventually obtained using Eq. (1.7). Because the measurement rate equals half of the camera frame rate, the HiLo technique provides a powerful method for fast two-dimensional image acquisition.
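The HiLo pipeline can be sketched as follows (a minimal illustration, not the published implementation; the Gaussian filter width, contrast window, and synthetic test images are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def hilo(i_u, i_s, sigma_lp=4.0, window=7, eta=1.0):
    """Merge a uniform-illumination image i_u and a speckle-illumination
    image i_s into one optically sectioned HiLo image.
    sigma_lp: width (pixels) of the Gaussian low-pass LP; the high-pass
    HP is its complement. window: side of the local area A used for the
    speckle-contrast estimate."""
    i_d = i_s - i_u                                 # difference image I_delta
    mean_d = uniform_filter(i_d, window)
    var_d = uniform_filter(i_d ** 2, window) - mean_d ** 2
    mu_u = uniform_filter(i_u, window) + 1e-12      # avoid division by zero
    c = np.sqrt(np.clip(var_d, 0.0, None)) / mu_u   # local speckle contrast
    i_lo = gaussian_filter(c * i_u, sigma_lp)       # Lo: contrast-weighted LP
    i_hi = i_u - gaussian_filter(i_u, sigma_lp)     # Hi: complementary HP
    return i_hi + eta * i_lo

# Demo on synthetic data (sizes and noise levels are illustrative).
rng = np.random.default_rng(0)
i_u = 1.0 + 0.1 * rng.standard_normal((64, 64))
i_s = i_u * (1.0 + 0.6 * rng.standard_normal((64, 64)))
out = hilo(i_u, i_s)
print(out.shape)
```

In practice, η is chosen so that the Hi and Lo spectra match at the crossover frequency of the two filters.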
Figure 1.7 Algorithm for HiLo reconstruction. (a) The flow chart of HiLo algorithm; (b) A reconstruction of pumpkin stem section. Is‐1, Is‐2 structured illumination images; WF wide field image; HP high‐pass filter image; LP low‐pass filter image.
Source: Chong Chen.
Xing Zhou et al. proposed a one-dimensional (1-D) sequence Hilbert transform (SHT) algorithm to decode the in-focus information (Figure 1.8) [14].
A key step of structured illumination microscopy (SIM) is to project a sinusoidal fringe onto the specimen of interest. The captured structured images can then be decomposed into in-focus and out-of-focus components:
$$I(x, y) = I_n(x, y) + I_m(x, y)\bigl[1 + \cos(2\pi\nu x + \phi)\bigr] \tag{1.13}$$
where $I_n$ is the out-of-focus background, $I_m$ is the in-focus information, and $\nu$ and $\phi$ are the spatial frequency and initial phase of the projected sinusoidal fringe, respectively. Because the intensity of the out-of-focus background $I_n$ remains constant, we can subtract two phase-shifted raw images to eliminate the background. Thus, we obtain an input image $I_S$ with sinusoidal amplitude modulation, described in Eq. (1.14):
$$I_S(x, y) = A\,I_m(x, y)\cos(2\pi\nu x + \phi_s) \tag{1.14}$$
where the constant $A$ and a fixed phase offset arising from the subtraction are absorbed into the modulation, and $\phi_s$ is determined by the mean value of the two arbitrary initial phases of the two raw images. The next step is to demodulate the sinusoidal amplitude and obtain the in-focus information $I_m$ from Eq. (1.14). In principle, $I_m$ can be solved from Eq. (1.14) directly; however, $I_m$ then becomes extremely sensitive to the precision of $\phi_s$. Thus, $I_S$ must be decoded in another way to eliminate the residual sinusoidal pattern. Here, we utilize the Hilbert transform (HT) and construct a complex analytical signal $I_A$ of the form
$$I_A = I_S + i\,I_{SH} \tag{1.15}$$
where $i$ is the imaginary unit and the imaginary part $I_{SH}$ of $I_A$ is the HT of the input pattern $I_S$. In optical interferometry, interferograms may contain complex structures with different frequency components, so the decoding process must be based on the 2-D HT. In SIM, however, the projection fringe contains only a single spatial frequency in one orientation (either the x direction or the y direction), so the 2-D image $I_S$ can be treated as a sequence of 1-D sinusoidal amplitude-modulated signals. In this case, a 1-D signal-processing algorithm can be used for the 2-D image demodulation. In 1-D signal analysis, the HT is a powerful demodulation tool. Based on the characteristics of the HT, the HT of a cosine-modulated function becomes a sine-modulated function:
$$\mathrm{HT}_x\bigl[I_m(x, y)\cos(2\pi\nu x + \phi_s)\bigr] = I_m(x, y)\sin(2\pi\nu x + \phi_s) \tag{1.16}$$
where $\mathrm{HT}_x$ denotes the HT operation in the x direction. Applying the 1-D Hilbert transform to $I_S$, we obtain the analytical signal
$$I_A(x, y) = A\,I_m(x, y)\,e^{\,i(2\pi\nu x + \phi_s)} \tag{1.17}$$
Finally, the optically sectioned image $I_m$ can be obtained by
$$I_m(x, y) \propto \bigl|I_A(x, y)\bigr| \tag{1.18}$$
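A minimal demonstration of Hilbert-transform demodulation, using SciPy's `hilbert` (which returns the analytic signal directly); the envelope, fringe frequency, and phase below are illustrative assumptions, not values from the text:

```python
import numpy as np
from scipy.signal import hilbert

# One row of a background-free structured image: a smooth in-focus
# envelope multiplied by a sinusoidal fringe of unknown phase.
x = np.arange(512)
i_m = np.exp(-((x - 256) / 120.0) ** 2)         # in-focus envelope (assumed)
nu, phi_s = 0.05, 0.7                           # fringe frequency and phase
i_s = i_m * np.cos(2 * np.pi * nu * x + phi_s)  # input image I_S

# hilbert() returns the analytic signal I_A = I_S + i * HT[I_S];
# its magnitude recovers the envelope independently of phi_s.
i_rec = np.abs(hilbert(i_s))

# Away from the array edges the recovered envelope matches i_m closely.
err = np.max(np.abs(i_rec[64:448] - i_m[64:448]))
print(err < 0.05)
```

The demodulation works well here because the fringe frequency is much larger than the bandwidth of the envelope, which is exactly the regime of a SIM fringe on a diffraction-limited image.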
Lateral resolution receives the most attention in optical microscopy since it directly determines the finest structures that can be resolved. The lateral resolution of the optical system and the pixel size of the camera are the two factors that need to be considered first when choosing the magnification of the optical system. According to the Nyquist sampling rule, the pixel resolution (pixel size divided by the magnification) should be no larger than half of the smallest feature size to be observed.
Figure 1.8 Optical sectioning algorithms with the Hilbert–Huang transform.
Source: Chong Chen.
However, in optical microscopy, the lateral resolution is determined not only by the pixel resolution but also by the PSF. Due to the wave diffraction of light, collimated light focused by the optical system will not be an infinitesimal point but a diffraction-limited spot. Similarly, a point-like object will be imaged by an optical system as a blurry spot of finite size. The intensity distribution of this spot is defined as the PSF. The lateral PSF in 2-D can be described as an Airy pattern or, in general, approximated as a Gaussian function. The PSF depends on the light wavelength and the numerical aperture of the imaging objective (Figure 1.9).
Two point-like objects next to each other will be imaged as the superposition of two corresponding PSFs. When the two point-like objects are too close to each other, they cannot be resolved as separate points. The lateral resolution of an optical system can thus be defined as the minimum resolvable distance between two point-like objects in the image, given by the Rayleigh criterion:
$$R_{lateral} = \frac{0.61\,\lambda}{\mathrm{NA}} \tag{1.19}$$
Another way to define the resolution is by utilizing the OTF. The OTF is the Fourier transform of the PSF. The resolution of a light microscope is determined by the cutoff frequency of the OTF:
$$k_c = \frac{2\,\mathrm{NA}}{\lambda} \tag{1.20}$$
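A small worked example of these relations, using the standard Rayleigh criterion $R = 0.61\lambda/\mathrm{NA}$ and the Nyquist rule above (the wavelength, NA, and magnification are illustrative values, not from the text):

```python
def rayleigh_resolution(wavelength_nm: float, na: float) -> float:
    """Lateral Rayleigh resolution in nanometers."""
    return 0.61 * wavelength_nm / na

def max_pixel_size(wavelength_nm: float, na: float, magnification: float) -> float:
    """Largest camera pixel (nm) that still satisfies Nyquist sampling:
    the sample-space pixel must be at most half the resolvable feature."""
    return magnification * rayleigh_resolution(wavelength_nm, na) / 2.0

# Illustrative numbers: GFP emission (520 nm), 1.4-NA oil objective, 100x.
r = rayleigh_resolution(520.0, 1.4)    # lateral resolution, ~226.6 nm
p = max_pixel_size(520.0, 1.4, 100.0)  # max camera pixel, ~11.33 um
print(round(r, 1), round(p / 1000.0, 2))
```

A camera with pixels larger than this would undersample the PSF, so the effective resolution would be set by the pixels rather than by diffraction.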
Figure 1.9 (a) Vertical sections through the 3D PSFs and resulting definition. The contrast has been greatly enhanced to show the weak side lobes (which are the rings of the airy disk, viewed edge on). (b) Schematic of horizontal resolution definition. (c) Schematic diagram of axial resolution definition.
Source: Chong Chen.
Only those spatial frequencies of the object that lie inside the support of the OTF, i.e. smaller than the cutoff frequency, are detectable. In the last twenty years, several super-resolution techniques have been developed to break the diffraction limit, and they have been widely used in many biological applications. In the following, the structured illumination microscopy technique is introduced as a typical wide-field super-resolution method that is well suited for live-cell dynamic imaging.
In 2000, M. G. Gustafsson proposed a wide-field imaging technique with resolution beyond the diffraction limit based on structured illumination [15]. The principle is to project a spatially patterned illumination onto the sample such that Moiré fringes are created. The recorded Moiré fringes of the fluorescent image contain both the frequency of the illumination structure and the spatial frequencies of the sample. High-spatial-frequency information of the sample that was outside the passband of the microscope's OTF is thereby downmodulated into the passband of the OTF. By acquiring several images, a super-resolved SIM image can be obtained through computational reconstruction (Figure 1.10). In order to achieve isotropic resolution improvement, nine images are usually acquired: three orientation angles with three phases for each orientation. In the last twenty years, the SR-SIM technique has been extensively studied and improved and has become the most popular super-resolution technique for live-cell dynamic imaging. With linear excitation, linear SIM can achieve a twofold resolution improvement [16]. Nonlinear SIM has no theoretical resolution limit and has experimentally attained better than 40 nm resolution with saturated excitation [17] or sequential activation and excitation [18].
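The Moiré downmodulation can be illustrated numerically in one dimension (the frequencies and cutoff below are arbitrary illustrative choices, picked to sit exactly on the FFT grid):

```python
import numpy as np

# Mixing an above-cutoff sample frequency with a structured-illumination
# frequency produces a Moire beat at the difference frequency, which
# falls inside the (assumed) passband even though the sample frequency
# itself does not.
n = 1024
x = np.arange(n)
k_s, k_i, k_cut = 0.3125, 0.25, 0.28          # cycles/pixel (illustrative)
sample = 1.0 + np.cos(2 * np.pi * k_s * x)    # object detail beyond cutoff
illum = 1.0 + np.cos(2 * np.pi * k_i * x)     # sinusoidal illumination
moire = sample * illum                        # recorded intensity pattern

spectrum = np.abs(np.fft.rfft(moire - moire.mean()))
freqs = np.fft.rfftfreq(n)
inside = freqs[(spectrum > 0.1 * spectrum.max()) & (freqs < k_cut)]
# The beat k_s - k_i = 0.0625 cycles/pixel is now detectable.
print(np.isclose(inside, k_s - k_i).any())
```

Shifting the detected beat back by the known illumination frequency is what the SIM reconstruction algorithms described later perform in two dimensions.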
Figure 1.10 Principle of the super‐resolution SIM. (a) Schematic of structured illumination pattern; (b) 2D OTF of SIM with excitation pattern in only one orientation; (c) 2D OTF of SIM with excitation pattern in three orientations.
Source: Chong Chen.
Although structured illumination is used in both optical sectioning and super-resolution, the principles, optical setups, and image reconstruction algorithms are quite different. The structured illumination in OS-SIM is generally formed by projecting a fringe pattern onto the microscope field of view (FOV) using an incoherent light source, while the structured illumination for SR-SIM is typically formed by laser interference because of the finer pattern period required.
Many early SIM systems that use a mechanically moving physical grating to generate the spatial patterned illumination cannot guarantee the precise shift of the illumination pattern during image acquisition [16, 19]. The spatial light modulator (SLM)‐based SIM system arose from the need for fast illumination pattern switching and an accurate pattern shift for video‐rate observation in living cells [20, 21].
Figure 1.11 The layout of the 2D‐SIM setup based on the proposed polarization optimization method. (a) Schematic of interference structured illumination microscopy; (b) Polarization of the three orientations; (c) Schematic of the polarization control method.
Source: Chong Chen.
A ferroelectric‐liquid‐crystal‐on‐silicon (LCoS) display used as an SLM has two stable crystal axes driven by an applied voltage. Due to the pixelated structure of the display, additional unwanted diffraction orders are created and lead to a jagged edge in the illumination pattern in the sample plane. In SIM, a Fourier filter is used to block those unwanted diffraction orders. In the two‐beam SIM system, the zeroth illumination order is blocked (Figure 1.11). Only the ±1st diffraction orders pass through the objective and form an interference pattern in the sample plane. This leads to the same interference pattern in the sample plane for both the grating pattern and its inverse image displayed on the SLM.
Fast switching up to several kHz is offered by ferroelectric SLMs. However, low diffraction efficiency is a major drawback. In addition, the SLM has to display the inverse image of the previous grating pattern to prevent damage to the LCoS. Nevertheless, as the required illumination intensity for acquiring raw SIM images is relatively low, it is preferable to use a ferroelectric SLM to modulate the illumination light.
The simplified sketch of the first prototype of the fastSIM setup is shown in Figure 1.11. An acousto‐optical tunable filter (AOTF) right behind the CW laser ensures precise and fast switching of the illumination light. An illumination grating pattern is generated by an SLM placed in the intermediate image plane and is projected on the sample. In order to achieve high contrast of the illumination grating in the sample plane, a liquid crystal variable retarder (LCVR) and a customized azimuthally patterned polarizer (pizza polarizer) are placed after the SLM to achieve azimuthal polarization. A passive mask as a Fourier filter is used to block all unwanted diffraction orders except the ±1st diffraction orders.
Polarization modulation is the most important part of the SIM optical path. In addition to the above methods, more optimized schemes have been developed in recent years. In this scheme, a polarization beam splitter (PBS) was used to shorten the length of the optical path. In order to modulate the polarization of the ±1st-order diffraction beams in three different orientations, a combinatorial waveplate composed of six fan-shaped achromatic pizza HWPs was employed. The three orientations of the illumination light are defined as D1, D2, and D3, respectively. The fast axes of the two opposite HWPs for D1 are set to be parallel to the electrical vibration direction of the incident beam. The fast axes of the remaining four HWPs for D2 and D3 are rotated by +30° and −30°, respectively. The polarization of the ±1st-order diffraction beams is rotated by 0° and ±60° in the three orientations, corresponding to the above fast-axis rotations. After pizza-HWP modulation, an LCVR is utilized to compensate for the additional phase generated by the dichroic mirror (DM). The fast axis of the LCVR is set to be parallel to the electrical vibration direction of the incident beam, as shown in Figure 1.11.
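The polarization rotations quoted above can be checked with Jones calculus (a generic sketch; only the ±30° fast-axis angles come from the text): a half-wave plate with its fast axis at angle θ rotates linear polarization from angle α to 2θ - α, so fast axes at ±30° rotate an x-polarized beam by ±60°.

```python
import numpy as np

def hwp(theta):
    """Jones matrix of a half-wave plate with fast axis at angle theta (rad)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]])

def linear(alpha):
    """Jones vector of linear polarization at angle alpha (rad)."""
    return np.array([np.cos(alpha), np.sin(alpha)])

# Incident beam polarized along x (alpha = 0), as for orientation D1.
incident = linear(0.0)
for deg in (0.0, 30.0, -30.0):  # fast-axis angles for D1, D2, D3
    out = hwp(np.radians(deg)) @ incident
    angle = np.degrees(np.arctan2(out[1], out[0]))
    print(round(angle, 1))  # prints 0.0, 60.0, -60.0
```

Keeping the polarization azimuthal (parallel to the fringe direction) in all three orientations is what maximizes the interference contrast of the illumination pattern.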
Linear super-resolution structured illumination microscopy (SR-SIM) is a wide-field imaging method that doubles the spatial resolution of fluorescent images [22]. The final SR image is reconstructed by postprocessing image algorithms. Here, we first briefly introduce the principle and implementation of the conventional Wiener SIM reconstruction algorithm (hereinafter referred to as "Wiener-SIM"). We start by considering two-dimensional (2D) SIM raw data of the form
$$I_{\theta,n}(\mathbf{r}) = \Bigl\{S_{in}(\mathbf{r})\Bigl[1 + m_\theta\cos\Bigl(2\pi\mathbf{k}_\theta\cdot\mathbf{r} + \phi_\theta + \frac{2\pi(n-1)}{3}\Bigr)\Bigr]\Bigr\}\otimes\mathrm{PSF}(\mathbf{r}) + S_{out}(\mathbf{r}) + N(\mathbf{r}) \tag{1.21}$$
where the subscripts θ and n are the orientation and phase index of the illumination pattern (θ = 1, 2, 3; n = 1, 2, 3); $\mathbf{r}$ is the image spatial coordinate; $S_{in}(\mathbf{r})$ is the sample on the objective focal plane; $m_\theta$, $\mathbf{k}_\theta$, and $\phi_\theta$ are the modulation depth, spatial frequency, and initial phase of the illumination pattern in the raw data, respectively; the symbol ⊗ denotes the convolution operation; $\mathrm{PSF}(\mathbf{r})$ is the PSF of the microscope; $S_{out}(\mathbf{r})$ is the out-of-focus background fluorescent signal; and $N(\mathbf{r})$ is noise.
The spectrum of the SIM raw data is obtained in the frequency domain by applying the Fourier transform to Eq. (1.21):
$$\tilde{I}_{\theta,n}(\mathbf{k}) = \Bigl[\tilde{S}_{in}(\mathbf{k}) + \frac{m_\theta}{2}e^{\,i\phi_{\theta,n}}\,\tilde{S}_{in}(\mathbf{k}-\mathbf{k}_\theta) + \frac{m_\theta}{2}e^{-i\phi_{\theta,n}}\,\tilde{S}_{in}(\mathbf{k}+\mathbf{k}_\theta)\Bigr]\,\mathrm{OTF}(\mathbf{k}) + \tilde{S}_{out}(\mathbf{k}) + \tilde{N}(\mathbf{k}) \tag{1.22}$$
where $\phi_{\theta,n} = \phi_\theta + 2\pi(n-1)/3$; $\mathrm{OTF}(\mathbf{k})$ is the OTF of the microscope, which is the Fourier transform of $\mathrm{PSF}(\mathbf{r})$; $\tilde{S}_{in}(\mathbf{k})$ is the spectrum component of the sample at the focal plane; $\tilde{S}_{in}(\mathbf{k}\mp\mathbf{k}_\theta)$ are the spectrum components that contain the otherwise unresolvable high-frequency signals; and $\tilde{S}_{out}(\mathbf{k})$ and $\tilde{N}(\mathbf{k})$ are the spectra of the out-of-focus background and the noise signal, respectively.
For 2D-SIM, the raw data of three different phases are collected in the same orientation, and all the spectrum components in Eq. (1.22) can be separated by solving Eq. (1.23):
$$\begin{bmatrix}\tilde{I}_{\theta,1}(\mathbf{k})\\ \tilde{I}_{\theta,2}(\mathbf{k})\\ \tilde{I}_{\theta,3}(\mathbf{k})\end{bmatrix} = \begin{bmatrix}1 & \frac{m_\theta}{2}e^{\,i\phi_{\theta,1}} & \frac{m_\theta}{2}e^{-i\phi_{\theta,1}}\\ 1 & \frac{m_\theta}{2}e^{\,i\phi_{\theta,2}} & \frac{m_\theta}{2}e^{-i\phi_{\theta,2}}\\ 1 & \frac{m_\theta}{2}e^{\,i\phi_{\theta,3}} & \frac{m_\theta}{2}e^{-i\phi_{\theta,3}}\end{bmatrix}\begin{bmatrix}\tilde{S}_{in}(\mathbf{k})\,\mathrm{OTF}(\mathbf{k})\\ \tilde{S}_{in}(\mathbf{k}-\mathbf{k}_\theta)\,\mathrm{OTF}(\mathbf{k})\\ \tilde{S}_{in}(\mathbf{k}+\mathbf{k}_\theta)\,\mathrm{OTF}(\mathbf{k})\end{bmatrix} \tag{1.23}$$
To accurately extract the separated components in Eq. (1.23) and then shift them to their correct positions, the illumination pattern parameters $m_\theta$, $\mathbf{k}_\theta$, and $\phi_\theta$ must be accurately determined from the raw data. Typically, these reconstruction parameters can be estimated by cross-correlation methods. In this chapter, we developed a robust reconstruction parameter estimation method that combines normalized cross-correlation with a spectrum notch to determine the correct reconstruction parameters.
Then, these separated components are shifted back to their correct positions with sub-pixel precision:
$$\tilde{S}_{in}(\mathbf{k}\mp\mathbf{k}_\theta)\,\mathrm{OTF}(\mathbf{k}) \;\longrightarrow\; \tilde{S}_{in}(\mathbf{k})\,\mathrm{OTF}(\mathbf{k}\pm\mathbf{k}_\theta) \tag{1.24}$$
where $\tilde{S}_{in}(\mathbf{k})\,\mathrm{OTF}(\mathbf{k})$, $\tilde{S}_{in}(\mathbf{k})\,\mathrm{OTF}(\mathbf{k}-\mathbf{k}_\theta)$, and $\tilde{S}_{in}(\mathbf{k})\,\mathrm{OTF}(\mathbf{k}+\mathbf{k}_\theta)$ are the shifted spectrum components.
Thus, the reconstructed spectrum of SR-SIM can be obtained by the traditional generalized Wiener filtering deconvolution
$$\tilde{S}_{SR}(\mathbf{k}) = \frac{\displaystyle\sum_{\theta}\sum_{j=-1}^{1} m_{\theta,j}\,\mathrm{OTF}^{*}(\mathbf{k}+j\mathbf{k}_\theta)\,\tilde{C}_{\theta,j}(\mathbf{k})}{\displaystyle\sum_{\theta}\sum_{j=-1}^{1} m_{\theta,j}^{2}\,\bigl|\mathrm{OTF}(\mathbf{k}+j\mathbf{k}_\theta)\bigr|^{2} + w^{2}}\,A(\mathbf{k}) \tag{1.25}$$
where $\tilde{C}_{\theta,j}(\mathbf{k})$ are the separated and shifted spectrum components, with $m_{\theta,0} = 1$ and $m_{\theta,\pm1} = m_\theta/2$; the symbol * is the conjugate operation; $w$ is the Wiener constant, which is an empirical value; and $A(\mathbf{k})$ is the apodization function used to suppress artifacts. In the Wiener-SIM we implemented, a theoretical OTF based on the imaging conditions was employed as the apodization function:
$$A(\mathbf{k}) = \mathrm{OTF}_{theo}(\mathbf{k}) \tag{1.26}$$
Finally, the reconstructed SR image is
$$S_{SR}(\mathbf{r}) = \mathrm{FFT}^{-1}\bigl[\tilde{S}_{SR}(\mathbf{k})\bigr] \tag{1.27}$$
where the symbol $\mathrm{FFT}^{-1}$ represents the inverse Fourier transform.
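The phase-separation step of Eq. (1.23) can be sketched in one dimension (noise-free data, no PSF blur, and an on-grid pattern frequency are simplifying assumptions made for this illustration): with three known phases, the 3×3 mixing matrix is inverted pointwise in k to recover the three spectrum components.

```python
import numpy as np

# One-dimensional sketch of the phase-separation step of Eq. (1.23).
# Object, modulation depth, pattern frequency, and phase are illustrative.
n = 256
x = np.arange(n)
sample = np.exp(-((x - 100) / 12.0) ** 2) + 0.7 * np.exp(-((x - 180) / 8.0) ** 2)
m, phi0 = 0.9, 0.3
k0 = 40 / n                                   # pattern frequency (on the FFT grid)
phases = phi0 + 2 * np.pi * np.arange(3) / 3  # the three raw-data phases

# Noise-free raw images for one orientation (PSF blur omitted for brevity).
raw = [sample * (1 + m * np.cos(2 * np.pi * k0 * x + p)) for p in phases]
spectra = np.array([np.fft.fft(r) for r in raw])

# Mixing matrix of Eq. (1.23); its inverse separates the three components.
M = np.array([[1, (m / 2) * np.exp(1j * p), (m / 2) * np.exp(-1j * p)]
              for p in phases])
comp = np.linalg.inv(M) @ spectra

s_k = np.fft.fft(sample)
# comp[0] is the unshifted spectrum; comp[1] carries the spectrum shifted
# by the pattern frequency, ready to be moved back to its true position.
print(np.allclose(comp[0], s_k), np.allclose(comp[1], np.roll(s_k, 40)))
```

In a full 2D reconstruction, the same inversion is applied per orientation, followed by the sub-pixel shift and the Wiener-filtered recombination of Eq. (1.25).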
Conventionally, the raw SIM data with high modulation contrast and high signal‐to‐noise ratio (SNR), the PSF that matches the imaging conditions during raw data acquisition, and the reconstruction algorithm with excellent artifact suppression performance are three key factors for obtaining SR‐SIM images with minimum artifacts. In general, Wiener‐SIM usually estimates the illumination pattern parameters from the raw SIM data and then reconstructs the SR images based on the generalized Wiener filtering deconvolution (Eq. (1.25)). However, Wiener‐SIM is not the optimal solution for reconstructing high‐quality SR‐SIM images because it still faces the following challenges.
Typical artifacts often appear in SR‐SIM images, such as "hatching," "honeycomb," "snowflake," "sidelobe," and "hammerstroke" artifacts, so the fidelity and quantitative reliability of SIM images remain an ongoing challenge [23]. To pursue high-quality SR images with minimal artifacts, many efforts have been made, including in-depth imaging-system establishment protocols, accurate reconstruction parameter estimation, practical system calibration and sample preparation guidelines, guides for adjusting user-defined parameters, and open-source reconstruction tools. These works have given researchers insight into the causes, features, and suppression methods of typical artifacts. So far, the sources of artifacts in SR-SIM images have been well studied and summarized in recently published papers. Although many artifacts can be distinguished subjectively, they cannot be robustly eliminated, and certain artifacts still frequently appear in published SIM images. The presence of artifacts limits the accessibility of SIM to a few experts, hindering its wider use as a general imaging tool. More importantly, new structures found using SIM must be interpreted with special care to avoid incorrectly identifying artifacts as real features [24–26].
To reduce reconstruction artifacts, Wiener-SIM usually emphasizes collecting raw data with high modulation contrast and a high SNR [27, 28]. However, in actual experiments, the modulation contrast and SNR of the raw SIM data are mainly determined by the quality of the illumination patterns, the contrast properties of the labeled biological samples, and the optical properties of the imaging system, so raw SIM data with high modulation contrast and high SNR cannot always be collected. For example, when implementing ultrafast SR-SIM imaging, it is easy to collect raw data with low SNR. Meanwhile, as the imaging depth increases, the modulation contrast of the raw data also gradually decreases due to the rapid increase of the out-of-focus background fluorescence. Moreover, when the contrast of the labeled sample is poor, the raw data show low modulation contrast even if the illumination pattern at the objective focal plane has high modulation contrast. Furthermore, raw data with suboptimal modulation contrast is also unavoidable in both home-built and commercial SIM systems when imaging devices are in imperfect states or users are unskilled in operation.
For low SNR raw data, Wiener‐SIM may not be able to estimate the correct pattern wave vectors, and noise‐related artifacts in the reconstructed image are also more serious. For raw data with suboptimal modulation contrast, Wiener‐SIM also faces the challenge that the correct reconstruction parameters may not be determined. For example, the estimated modulation depth is less than the actual value. According to Eq. (1.26), the smaller modulation factor can cause over‐amplification of the high‐frequency components, resulting in “snowflake” artifacts and artifacts related to high‐frequency noise. In addition, for raw data with strong background fluorescence, out‐of‐focus fluorescence can cause “honeycomb” artifacts or “hammerstroke” artifacts.
In practice, in order to avoid the risk of artifacts, a large number of suboptimal raw data sets are not effectively utilized, or are even abandoned. This not only wastes time and money in SIM imaging experiments but, more importantly, limits the application scenarios for which SIM is suitable.
Light sheet fluorescence microscopy (LSFM) uses a thin plane of light to optically section transparent tissues or whole organisms that have been labeled with a fluorophore. Compared with confocal and two‐photon microscopy, LSFM is able to image thicker tissues (>1 cm) with reduced photobleaching and phototoxicity because the specimen is exposed only to a thin light sheet. In addition, LSFM is a nondestructive method that produces well‐registered optical sections that are suitable for three‐dimensional reconstruction and can be processed by other histological methods (e.g. mechanical sectioning) after imaging.
The first published account of a very simple version of an LSFM (called ultramicroscopy) was given by Siedentopf and Zsigmondy (1903), in which sunlight was projected through a slit aperture to observe gold particles. In 1993, Voie and colleagues developed a light sheet microscope system called orthogonal-plane fluorescence optical sectioning (OPFOS) [29]. OPFOS was developed by investigators in Francis Spelman's laboratory at the University of Washington who were attempting to quantitatively assess hair cell structure and other cochlear features to improve the cochlear implant. OPFOS featured all of the elements that are present in current LSFM devices, namely a laser, a beam expander, a cylindrical lens to generate the light sheet, a specimen chamber, orthogonal illumination of the specimen, specimen movement for z-stack creation, and specimen clearing and staining for producing fluorescent optical sections. In 1994, an obliquely illuminating confocal microscope, called the confocal theta microscope, was developed in Ernst Stelzer's laboratory to improve the axial resolution of confocal microscopy [30, 31]. Theta confocal microscopy laid the foundation for the laboratory's subsequent LSFM device, called selective or single-plane illumination microscopy (SPIM). As shown in Figure 1.12a, a typical light sheet microscope is an L-shaped structure composed of two objective lenses (one illumination objective and one detection objective).
Figure 1.12 (a) Principle of light sheet. (b) The thickness and field of view of light sheet.
Source: Chong Chen.
Obviously, in an LSFM, the excitation and collection branches are uncoupled; therefore, it is helpful to study the excitation and collection arms of the LSFM separately. The FOV, lateral resolution, and axial resolution are the key parameters to be considered when constructing an LSFM system. In an LSFM, a focused light beam is used to produce the excitation light sheet (Figure 1.12). When using a Gaussian beam, its beam waist ($w_0$) can be related to the sectioning ability and, therefore, in a first approximation, to the axial resolution, $R_{axial}$, of the final image as
$$R_{axial} \approx 2w_0 = \frac{4\lambda f}{\pi n D}$$
where $f$ is the focal length of the illumination objective, $D$ is the entrance pupil diameter of the illumination objective, and $n$ is the refractive index (for a formal definition of resolution, see Section 1.2.1). Similarly, the Rayleigh range, $z_r$, can be related to the FOV of the image and is given by
$$\mathrm{FOV} \approx 2z_r = \frac{2\pi n w_0^2}{\lambda}$$
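The trade-off between sheet thickness and usable FOV can be tabulated with a short script (standard Gaussian-beam relations; the wavelength, refractive index, and waist values are illustrative assumptions):

```python
import math

def sheet(waist_um: float, wavelength_um: float = 0.488, n: float = 1.33):
    """Return (thickness, usable FOV) in microns for beam waist w0:
    thickness ~ 2*w0, FOV ~ 2*z_r with z_r = pi*n*w0^2/lambda."""
    z_r = math.pi * n * waist_um ** 2 / wavelength_um
    return 2.0 * waist_um, 2.0 * z_r

# Thinner sheets give better sectioning but a much shorter uniform FOV,
# since the Rayleigh range scales with the square of the waist.
for w0 in (1.5, 3.0, 6.0):
    t, fov = sheet(w0)
    print(f"w0={w0:>4} um  thickness={t:>5.1f} um  FOV={fov:>7.1f} um")
```

With these assumed values, a 6 µm waist yields a usable FOV of roughly 600 µm, which is of the same order as the 660 µm figure quoted below for a 10×, 0.30-NA detection lens.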
The Ernst H. K. Stelzer laboratory published the first true light sheet fluorescence microscopy article [32]. They used SPIM to visualize all muscles in vivo in the transgenic Medaka line Arnie, which expresses green fluorescent protein in muscle tissue. SPIM can also be applied to visualize the embryogenesis of the relatively opaque Drosophila melanogaster in vivo. The SPIM was capable of resolving the internal structures of the entire organism with high resolution (better than 6 µm) as deep as 500 μm inside the fish, a penetration depth that cannot be reached using confocal LSM. The main components of the L-type light sheet microscope are described below.
A series of lasers provides lines for fluorescence excitation. An optical system that includes a cylindrical lens focuses the laser light into a thin light sheet. The sample is embedded in a transparent, low-concentration (0.5%) agarose gel (Figure 1.13). This agarose is prepared from an aqueous solution adequate for the sample, in our case phosphate-buffered saline (PBS), providing a suitable environment for a live sample. The cylinder of agarose containing the sample is immersed in PBS, which virtually eliminates refractive imaging artifacts at the agarose surface. The cylinder containing the sample is supported from above by a micropositioning device. By using the four available degrees of freedom (three translational and one rotational), the sample can be positioned such that the excitation light illuminates the plane of interest. An objective lens, detection filter, and tube lens are used to image the distribution of fluorophores in the illumination plane onto a CCD camera, with the detection axis arranged perpendicular to the axis of illumination. The light sheet thickness is adapted to the detection lens, i.e. the light sheet is made as thin as possible while keeping it uniform across the complete FOV of the objective lens. Its thickness is typically between 3 and 10 μm; e.g. for a 10×, 0.30-NA objective lens, the light sheet beam waist can be reduced to 6 μm, and the resulting width will vary by less than 42% across the FOV of 660 μm.
Figure 1.13 Light sheet with cylinder lens. (a) Top view and side view of illumination path. (b) 3D diagram of a typical sample pool.
Source: Chong Chen.
Any fluorescence imaging system suffers from scattering and absorption in the tissue; in large and highly scattering samples, the image quality decreases as the optical path length in the sample increases. This problem can be reduced by multiview reconstruction, in which multiple 3D data sets of the same object are collected from different directions and combined in a postprocessing step. The sample was rotated mechanically and for each orientation (0°, 90°, 180°, and 270°) a stack was recorded. The stacks were then reoriented in the computer to align them with the stack recorded at 0°. The fusion of these four data stacks yields a superior representation featuring similar clarity and resolution throughout the entire specimen.
As is well known, the intensity profile of a Gaussian beam is nonuniform. Digital scanned laser light sheet fluorescence microscopy (DSLM) was developed to achieve the imaging speed and quality required for recording large specimens [32] (Figure 1.14). The idea behind DSLM is to generate a "plane of light" with a laser scanner that rapidly moves a micrometer-thin beam of laser light vertically and horizontally through the specimen. DSLM has several advantages over standard light sheet microscopy. First, DSLM illuminates each line in the specimen with the same intensity, a crucial prerequisite for quantitative imaging of large specimens. Second, in contrast to standard light sheet-based microscopy, DSLM does not rely on apertures to form the laser profile, which reduces optical aberrations and thereby provides exceptional image quality. Third, the entire illumination power of the light source is focused onto a single line, resulting in an illumination efficiency of 95% as compared with 3% in standard light sheet microscopy. Fourth, DSLM allows the generation of intensity-modulated illumination patterns (structured illumination), which can be used to enhance the image contrast in highly light-scattering specimens, such as large embryos.
One of the fundamental novel ideas of the DSLM concept is the use of laser scanners to create a 2D sample illumination profile perpendicular to the detection axis. In the standard mode of DSLM operation, one of the scan mirrors of the scan head moves at a constant speed
