Most cameras are designed to mimic what the human eye sees: they have three RGB channels and achieve around 30 frames per second (FPS). However, some cameras are designed to capture other modalities: spectra from near UV to near IR rather than RGB, polarimetry, the time of flight of light, etc. Such modalities are difficult for humans to perceive, but they provide rich data about the scene being captured. This book focuses on the emerging computer vision techniques known as computational imaging, which include capturing, processing and analyzing such modalities for various applications in scene understanding.
Page count: 510
Publication year: 2024
Cover
Table of Contents
Title Page
Copyright Page
Introduction
PART 1: Transient Imaging and Processing
1 Transient Imaging
1.1. Introduction
1.2. Mathematical formulation
1.3. Capturing light in flight
1.4. Applications
1.5. Non-line-of-sight imaging
1.6. Conclusion
1.7. References
2 Transient Convolutional Imaging
2.1. Introduction
2.2. Time-of-flight imaging
2.3. Transient convolutional imaging
2.4. Transient imaging in scattering media
2.5. Present and future directions
2.6. References
3 Time-of-Flight and Transient Rendering
3.1. Introduction
3.2. Mathematical modeling
3.3. How to render time-of-flight cameras?
3.4. Open-source implementations
3.5. Applications of transient rendering
3.6. Future directions
3.7. References
PART 2: Spectral Imaging and Processing
4 Hyperspectral Imaging
4.1. Introduction
4.2. 2D (raster scanning) architectures
4.3. 1D scanning architectures
4.4. Snapshot architectures
4.5. Comparison of snapshot techniques
4.6. Conclusion
4.7. References
5 Spectral Modeling and Separation of Reflective-Fluorescent Scenes
5.1. Introduction
5.2. Related Work
5.3. Separation of reflection and fluorescence
5.4. Estimating the absorption spectra
5.5. Experiment results and analysis
5.6. Limitations and conclusion
5.7. References
6 Shape from Water
6.1. Introduction
6.2. Related works
6.3. Light absorption in water
6.4. Bispectral light absorption for depth recovery
6.5. Practical shape from water
6.6. Co-axial bispectral imaging system and experiment results
6.7. Trispectral light absorption for depth recovery
6.8. Discussions
6.9. Conclusion
6.10. References
7 Far Infrared Light Transport Decomposition and Its Application for Thermal Photometric Stereo
7.1. Introduction
7.2. Related work
7.3. Far infrared light transport
7.4. Decomposition and application
7.5. Experiments
7.6. Conclusion
7.7. References
8 Synthetic Wavelength Imaging: Utilizing Spectral Correlations for High-Precision Time-of-Flight Sensing
8.1. Introduction
8.2. Synthetic wavelength imaging
8.3. Synthetic wavelength interferometry
8.4. Synthetic wavelength holography
8.5. Fundamental performance limits of synthetic wavelength imaging
8.6. Conclusion and future directions
8.7. Acknowledgment
8.8. References
PART 3: Polarimetric Imaging and Processing
9 Polarization-Based Shape Estimation
9.1. Fundamental theory of polarization
9.2. Reflection component separation
9.3. Phase angle of polarization
9.4. Surface normal estimation from the phase angle
9.5. Degree of polarization
9.6. Surface normal estimation from the degree of polarization
9.7. Stokes vector
9.8. Surface normal estimation from the Stokes vector
9.9. References
10 Shape from Polarization and Shading
10.1. Introduction
10.2. Related works
10.3. Problem setting and assumptions
10.4. Shading stereoscopic constraint
10.5. Polarization stereoscopic constraint
10.6. Normal estimation with two constraints
10.7. Experiments
10.8. Conclusion and future works
10.9. References
11 Polarization Imaging in the Wild Beyond the Unpolarized World Assumption
11.1. Introduction
11.2. Mueller calculus
11.3. Polarizing filters
11.4. Polarization imaging
11.5. Image formation model
11.6. Polarization imaging reflectometry in the wild
11.7. Digital single-lens reflex (DSLR) setup
11.8. Reflectance recovery
11.9. Results and analysis
11.10. References
12 Multispectral Polarization Filter Array
12.1. Introduction
12.2. Multispectral polarization filter array with a photonic crystal
12.3. Generalization of imaging and demosaicking with multispectral polarization filter arrays
12.4. Demonstration
12.5. Conclusion
12.6. References
List of Authors
Index
End User License Agreement
Chapter 4
Table 4.1. The required number of detector pixels MxMy and the utilization eff...
Chapter 5
Table 5.1. The mean percent difference between k1 and k2 for 183 absorption sp...
Chapter 10
Table 10.1. SONY dataset: mean absolute difference (in degrees) of an object’s...
Table 10.2. SONY dataset: light direction estimation error (degrees). The best...
Table 10.3. Lumenera dataset: mean absolute difference (in degrees) of an obje...
Table 10.4. Lumenera dataset: light direction estimation error (degree). The b...
Chapter 11
Table 11.1. Statistical variation in surface normals of “red book” under diffe...
Chapter 1
Figure 1.1. Example capture of a light pulse propagating through a bottle fill...
Figure 1.2. Synthetic example of time-resolved light transport (data from Jara...
Figure 1.3. Time-resolved incoming radiance (bottom) at the ground plane for d...
Figure 1.4. Left: ray diagram illustrating the two-planes light field parametr...
Figure 1.5. Left: measured temporal impulse response of a 20-μm CMOS SPAD with...
Figure 1.6. Illustration of our reconstruction setup: a laser pulse is emitted...
Figure 1.7. Left: the propagation time from a point x at a hidden surface form...
Figure 1.8. Reconstruction result of hidden scene (a) consisting of a mannequi...
Figure 1.9. The virtual imaging process. (a) A virtual light source at the rel...
Figure 1.10. Reconstruction result of a complex hidden scene (a) with signific...
Chapter 2
Figure 2.1. Working principle of a PMD pixel. Photo-generated charges are dire...
Figure 2.2. Top: Operating principle of a conventional correlation PMD sensor....
Figure 2.3. In the scene on the top, contributions from direct and indirect li...
Figure 2.4. Left: our capture setup for transient images (from left: computer,...
Figure 2.5. Time slices from three transient images captured with our setup an...
Figure 2.6. Transient imaging in combination with spatial modulation. (a) Scen...
Figure 2.7. Example of imaging in scattering media using our approach. Left: o...
Chapter 3
Figure 3.1. An example light path for surface only mesh topology.
Figure 3.2. Various time-of-flight rendering tasks: for the Cornell box scene,...
Figure 3.3. Path sampling with pathlength constraint: in bidirectional path tr...
Figure 3.4. Transient imaging in dynamic scenes: to render transients of dynam...
Figure 3.5. Proximity detection camera: we simulate measurements from such a c...
Figure 3.6. CW-ToF depth-selective camera: we simulate a CW-ToF camera using m...
Figure 3.7. Optimal imaging system design for scanning-based non-line-of-sight...
Chapter 4
Figure 4.1. The portions of the datacube collected during a single detector in...
Figure 4.2. (a) The transmission spectra for six of Landsat 8’s spectral filte...
Figure 4.3. Swath is cross-track; track is the platform motion direction
Figure 4.4. A fiber spectrometer accepts light focused from the fore-optics on...
Figure 4.5. (a) An f/10 Czerny–Turner spectrometer, including a (non-conventio...
Figure 4.6. (a) An f/2.8 transmission grating spectrometer. (b) An f/2.5 folde...
Figure 4.7. A direct-view f/2.3 grism spectrometer.
Figure 4.8. An f/1.8 coded aperture spectrometer. The objective lens does not ...
Figure 4.9. An f/6 echelle spectrometer. An example raw image on the detector ...
Figure 4.10. (a) An f/3.8 Offner spectrometer.
Figure 4.11. An f/2 Michelson-type Fourier transform spectrometer. In practice...
Figure 4.12. Fabry–Perot filter. The transmission function shown here uses θ...
Figure 4.13. (a) A three-stage Lyot filter and (b) a six-stage Şolc filter. In...
Figure 4.14. An acousto-optic tunable filter (AOTF) used in a line-imaging con...
Figure 4.15. Various views of a Bowen–Walraven image slicer, illustrating how ...
Figure 4.16. The system layout (a) for an image slicer, and closeup (b) of the...
Figure 4.17. A faceted mirror constructed with (a) a set of individually polis...
Figure 4.18. The image mapping spectrometer (IMS) system layout. Three differe...
Figure 4.19. The system layout for an integral field spectrometer with coheren...
Figure 4.20. The system layout for an integral field spectrometer with lenslet...
Figure 4.21. Layouts for a filter array camera (FAC) system, using (a) a mosai...
Figure 4.22. The system layout for a computed tomography imaging spectrometry ...
Figure 4.23. (Top) The system layout for a coded aperture snapshot spectral im...
Figure 4.24. Diagrams showing how the detector utilization efficiency μ is cal...
Chapter 5
Figure 5.1. (a) The scene captured under white light. (b) The recovered reflec...
Figure 5.2. An example of absorption and emission spectra from the McNamara an...
Figure 5.3. An example of a captured scene (a). When a reflective-fluorescent ...
Figure 5.4. Sinusoidal illuminant patterns. The blue and pink solid lines deno...
Figure 5.5. The percentage of absorption spectra in the McNamara and Boswell f...
Figure 5.6. Absorption and emission spectra of two fluorescent materials.
Figure 5.7. All test errors sorted in ascending order. Sixty-seven percent of ...
Figure 5.8. Examples of estimated absorption spectra and their root mean squar...
Figure 5.9. Evaluation of our separation method on a pink sheet. (a) Two high-...
Figure 5.10. Recovered reflectance r(λ), fluorescence emission e(λ)
Figure 5.11. Recovered reflectance spectra for the ordinary reflective materia...
Figure 5.12. Comparison results on the fluorescent yellow sheet.
Figure 5.13. The separation results on four channels of the hyperspectral imag...
Figure 5.14. The relighting results for a scene with fluorescent and non-fluor...
Figure 5.15. Separation and relighting results for a fluorescent and a non-flu...
Figure 5.16. Separation and relighting results for a scene with fluorescent an...
Figure 5.17. The two high-frequency filters (a) and their spectra (b).
Figure 5.18. Recovered reflectance r(λ) and fluorescence emission e(λ...
Figure 5.19. The separation results with the high-frequency filters. The spect...
Figure 5.20. The separation results with ambient light. The spectra of illumin...
Figure 5.21. The separation results under strong ambient light, which is typic...
Chapter 6
Figure 6.1. (a) and (b) The scene at 905 nm and 950 nm after normalizing the i...
Figure 6.2. (a) The water absorption curve in the range from 400 nm to 1400 nm...
Figure 6.3. Images of 950 nm of water pouring into a cup
Figure 6.4. (a) The relative depth error with respect to the reflectance spect...
Figure 6.5. Reflectance spectra database in vis–NIR range from 400 to 1,400 nm...
Figure 6.6. Configuration of perspective light and camera.
Figure 6.7. (a) Our co-axial bispectral imaging system and (b) the spectral re...
Figure 6.8. Depth estimation error by the bispectral method for three planar p...
Figure 6.9. Depth estimation error by the bispectral method for white target i...
Figure 6.10. Shape recovery of objects by the bispectral method with complex g...
Figure 6.11. Shape recovery of translucent objects by the bispectral method. O...
Figure 6.12. Shape recovery of a moving hand captured at video rate by the bis...
Figure 6.13. Linear approximation of the reflectance spectrum in the wavelengt...
Figure 6.14. Relative reflectance spectrum difference for wood (a), cloth (b),...
Chapter 7
Figure 7.1. A ball captured by a conventional color camera and a thermal camer...
Figure 7.2. Far infrared light transport. While far infrared light can partial...
Figure 7.3. Far infrared light and heat transport components. Similar to the v...
Figure 7.4. The architecture of a typical thermal sensor, a microbolometer. A ...
Figure 7.5. Transient properties of far infrared light transport. Because the ...
Figure 7.6. Double exponential fitting result to the FTCS curves of different ...
Figure 7.7. Other viable approaches. (a) By turning on and off the light sourc...
Figure 7.8. Experimental setup. The object is illuminated by far infrared ligh...
Figure 7.9. Decomposition result for a black-painted wooden ball. (a) The scen...
Figure 7.10. Exponential fitting results. Double exponential curves fit the ob...
Figure 7.11. Results of the thermal photometric stereo. (a–c) Decomposed diffu...
Figure 7.12. The effectiveness of decomposition. Photometric stereo result wit...
Figure 7.13. Results on various materials. Spheres made of wood, glass, plasti...
Chapter 8
Figure 8.1. Formation of the synthetic wave field, explained at the example of...
Figure 8.2. Synthetic wavelength imaging exploits spectral diversity, i.e. enc...
Figure 8.3. High-precision ToF imaging with synthetic wavelength interferometr...
Figure 8.4. Measurement of a plaster bust with a focal plane array (FPA)-based...
Figure 8.5. Measurement of a plaster bust with a flutter-shutter camera-based ...
Figure 8.6. Schematic setups for NLoS imaging around corners (a) and NLoS imag...
Figure 8.7. Synthetic wavelength holography (SWH) image formation: The scene/o...
Figure 8.8. Imaging around corners with synthetic wavelength holography (exper...
Figure 8.9. a) Photo of the two objects (character “N” and “U”) used for the s...
Figure 8.10. Imaging through scatterers with synthetic wavelength holography (...
Figure 8.11. Key attributes of synthetic wavelength holography (SWH) and poten...
Figure 8.12. Measurement of the “synthetic diffraction disk” via reconstructio...
Chapter 9
Figure 9.1. Polarization
Figure 9.2. (a) Linear polarizer illustrated as a circle with a line grid insi...
Figure 9.3. Sinusoid of the brightness while rotating the polarizer
Figure 9.4. Assorted pixels of polarization imaging camera
Figure 9.5. Separation of the specular reflection component and the diffuse re...
Figure 9.6. Reflection, transmission and refraction
Figure 9.7. Intensity reflectivity and intensity transmissivity
Figure 9.8. Relationship between the surface normal and the POI: (a) single vi...
Figure 9.9. Shape from polarization and space carving: (a) target object, (b) ...
Figure 9.10. Algorithm flow
Figure 9.11. Target object (ellipsoid)
Figure 9.12. Estimated shape (ellipsoid)
Figure 9.13. Intersection shape (ellipsoid)
Figure 9.14. Target object (stripe)
Figure 9.15. Estimated shape (stripe)
Figure 9.16. Degree of polarization of the reflected light
Figure 9.17. Emission
Figure 9.18. Degree of polarization of thermal radiation
Figure 9.19. Stokes vector
Figure 9.20. Acquisition system for measuring transparent surfaces based on po...
Figure 9.21. Dog-shaped glass object: (1a) (2a) real image, (1b) (2b) initial ...
Chapter 10
Figure 10.1. Normal vector n(p) is defined in the world coordinate system by a...
Figure 10.2. Diffuse reflection at a surface point associated with pixel p and...
Figure 10.3. Three polarization images taken at a zero polarizer angle for thr...
Figure 10.4. Algorithm 1: Average error for all image pixels and different wei...
Figure 10.5. Algorithm 1: Error map of image pixels for different weights give...
Figure 10.6. Porcelain doll in our experiment. Images shown have a zero polari...
Figure 10.7. Experimental setup with a) Lumenera and b) SONY cameras.
Figure 10.8. Ten porcelain objects and scatter plot of all eight light directi...
Figure 10.9. SONY dataset: examples of surface normal reconstruction using the...
Figure 10.10. Distribution of the surface normal difference by the zenith angl...
Figure 10.11. Six objects and scatter plot of all eight light directions of th...
Figure 10.12. Lumenera dataset: examples (Doll and Bear2) of surface normal re...
Figure 10.13. Shapes reconstructed with Algorithm 2 without a prior on light s...
Figure 10.14. Shapes reconstructed with Algorithm 2 without a prior on light s...
Chapter 11
Figure 11.1. Plots of the Fresnel equations and degree of polarization at an a...
Figure 11.2. Visualizing polarization: When looking down the propagation direc...
Figure 11.3. Geometry of a general linear polarizer rotated at an angle ϕo fro...
Figure 11.4. Geometry of a general reflecting optical element rotated by an an...
Figure 11.5. Polarization imaging geometry: The angle of polarization ψi of th...
Figure 11.6. Principal polarization imaging setup: Commodity photography equip...
Figure 11.7. Polarization imaging reflectometry in the wild: A set of high dyn...
Figure 11.8. TRS fitting: For each near-Brewster view, a per-pixel fit of equa...
Figure 11.9. Polarization imaging reflectometry: Example reflectance maps reco...
Figure 11.10. Change in reflected radiance s_r,0 due to incident polarized illu...
Figure 11.11. Reflectance maps ((a)–(d)) estimated from two views of the sampl...
Figure 11.12. Comparisons of sample photographs (a) to matching renderings und...
Figure 11.13. Brewster angle measurement validation: Simulated TRS for a glass...
Figure 11.14. Diffuse-specular separation at normal incidence: At 6 pm, the sk...
Chapter 12
Figure 12.1. Multispectral and polarization filter array examples.
Figure 12.2. Conceptual diagram of a multilayer photonic crystal.
Figure 12.3. Fabrication process for photonic crystal filter array.
Figure 12.4. Design overview for prototype photonic crystal multispectral filt...
Figure 12.5. Spectral transmittance characteristics of deposited filter array.
Figure 12.6. Visual appearance of filter array.
Figure 12.7. Overview of imaging and demosaicking by multispectral polarizatio...
Figure 12.8. Captured area in the demonstration.
Figure 12.9. Captured and demosaicked images.
SCIENCES
Image, Field Director – Laure Blanc-Féraud
Sensors and Image Processing, Subject Head – Cédric Demonceaux
Coordinated by
Takuya Funatomi
Takahiro Okabe
First published 2024 in Great Britain and the United States by ISTE Ltd and John Wiley & Sons, Inc.
Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licenses issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned address:
ISTE Ltd
27-37 St George’s Road
London SW19 4EU
UK
www.iste.co.uk
John Wiley & Sons, Inc.
111 River Street
Hoboken, NJ 07030
USA
www.wiley.com
© ISTE Ltd 2024
The rights of Takuya Funatomi and Takahiro Okabe to be identified as the authors of this work have been asserted by them in accordance with the Copyright, Designs and Patents Act 1988.
Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s), contributor(s) or editor(s) and do not necessarily reflect the views of ISTE Group.
Library of Congress Control Number: 2023942088
British Library Cataloguing-in-Publication Data
A CIP record for this book is available from the British Library
ISBN 978-1-78945-150-4
ERC codes:
PE6 Computer Science and Informatics
PE6_8 Computer graphics, computer vision, multimedia, computer games
PE6_11 Machine learning, statistical data processing and applications using signal processing (e.g. speech, image, video)
PE6_12 Scientific computing, simulation and modelling tools
Takuya FUNATOMI1 and Takahiro OKABE2
1Division of Information Science, Nara Institute of Science and Technology, Japan
2Department of Artificial Intelligence, Kyushu Institute of Technology, Fukuoka, Japan
In our physical world, light propagates from various positions in various directions. Light is emitted from light sources such as the sun and lamps, and is reflected by surfaces such as walls and glass. The amount of light flowing in every direction through every point is described as a light field. As readers know, light is an electromagnetic wave that oscillates perpendicular to its direction of travel. Therefore, light is characterized both by the spatial period of its oscillation, i.e. the wavelength, and by the direction of its oscillation, i.e. the polarization state. Consequently, the light field is a high-dimensional function of the time t, position (x, y, z), direction (θ, ϕ), wavelength λ and polarization state s of light.
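Written out compactly (a notational sketch only, using the symbols introduced above rather than a formula from any particular chapter):

```latex
% Time-resolved, spectral, polarized light field (plenoptic function):
% the radiance observed at time t, at position (x, y, z), in direction
% (theta, phi), at wavelength lambda and in polarization state s.
L = L(t,\, x, y, z,\, \theta, \phi,\, \lambda,\, s)
```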
Most cameras are inherently designed to mimic the human eye by having three channels of red, green and blue (RGB) color and achieving about 30 frames per second. Therefore, conventional cameras capture only part of the modalities of a light field, with limited spatial, temporal and spectral resolution. Some cameras are designed to capture other modalities, for example spectra from near UV to near IR rather than RGB, polarimetry and the time of flight of light. Such modalities are difficult to perceive, but provide much information about scenes.
This book focuses on emerging computer vision techniques known as computational imaging. These techniques include capturing, processing and analyzing light modalities for various applications in scene understanding.
This book is divided into three parts corresponding to the time-of-flight, spectral and polarimetric domains. Part 1 focuses on transient imaging, which enables us to capture events at temporal resolutions fine enough to resolve the propagation of light. This field is growing rapidly, with applications beyond range imaging, e.g. non-line-of-sight reconstruction. This rapid growth rests on imaging technologies built around consumer cameras and on analysis-by-synthesis techniques.
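To make these temporal scales concrete, the basic time-of-flight relation (standard physics, independent of any particular chapter) links the round-trip delay τ of a light pulse to depth d:

```latex
d = \frac{c\,\tau}{2}, \qquad c \approx 3 \times 10^{8}\ \mathrm{m/s}
% A round-trip delay of tau = 1 ps corresponds to
% d = (3 x 10^8 m/s)(10^{-12} s)/2 = 0.15 mm,
% so resolving millimeter-scale light transport requires
% picosecond-scale timing.
```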
Part 1 has three chapters. Chapter 1 begins with an overview of transient imaging techniques. This chapter also separately addresses one of the most attractive topics in this field, non-line-of-sight (NLOS) imaging, which enables us to see scenes that are not directly visible. Chapter 2 introduces a transient imaging technique with a correlation image sensor used in consumer-level time-of-flight cameras. Chapter 3 reviews the time-of-flight/transient rendering techniques fundamental to the analysis-by-synthesis approach.
Part 2 focuses on spectral imaging and processing, which makes use of spectral information from near UV to IR rather than conventional RGB for scene understanding. Spectral imaging enables us not only to increase the spectral resolution of captured images but also to study wavelength-dependent phenomena such as refraction, scattering and absorption.
This part covers Chapters 4–8: it begins with the principles and architectures of hyperspectral cameras in Chapter 4, and then introduces emerging techniques based on spectral imaging. Chapter 5 addresses absorption and emission due to fluorescent materials and shows that spectral imaging with programmable illumination in the spectral domain is useful for separating the reflective and fluorescent components of images. Chapter 6 exploits the fact that water absorbs light at near IR wavelengths and shows that the shape of an underwater scene can be recovered from near IR imaging. Chapter 7 studies the temporal transport of far IR light and heat and shows that thermal imaging enables shape recovery of challenging objects made of transparent and semi-transparent materials. Chapter 8 introduces a unique approach to time-of-flight and NLOS imaging based on spectral interferometry.
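As a flavor of how wavelength-dependent absorption can be turned into geometry, here is a minimal Python sketch of bispectral depth recovery in the spirit of Chapter 6. It assumes a Beer–Lambert image model, normalized illumination and near-equal reflectance at the two (close) wavelengths; the wavelengths and absorption coefficients are illustrative values, not the chapter's calibrated ones.

```python
import numpy as np

# Illustrative water absorption coefficients [1/cm]; water absorbs
# noticeably more strongly near 950 nm than near 905 nm.
ALPHA_905 = 0.07
ALPHA_950 = 0.28

def depth_from_bispectral(img_905: np.ndarray, img_950: np.ndarray) -> np.ndarray:
    """Per-pixel light path length through water, in cm.

    Model: I(lam) = r(lam) * exp(-alpha(lam) * d). Taking the log of the
    ratio of the two images cancels the (assumed equal) reflectance r:
        ln(I_905 / I_950) = (ALPHA_950 - ALPHA_905) * d
    """
    eps = 1e-6  # guard against division by zero / log(0) in dark pixels
    ratio = np.clip(img_905, eps, None) / np.clip(img_950, eps, None)
    return np.log(ratio) / (ALPHA_950 - ALPHA_905)
```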
Part 3 focuses on polarimetric imaging and processing, which uses the oscillation direction of light for scene understanding. The polarization state of light depends on the surface normal and the refractive index of the object from which it is reflected, so polarimetric imaging is useful for estimating an object’s geometry and material. Polarization imaging is an actively studied area, as image sensors with polarization filter arrays have become widespread.
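For a concrete sense of the raw measurements these chapters build on, the sketch below shows the standard estimate of the linear Stokes components from a four-angle (0°, 45°, 90°, 135°) polarization filter array, and the degree and angle of linear polarization derived from them. The function names are our own, and a real pipeline would add demosaicking and calibration.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Per-pixel linear Stokes components from four polarizer-angle images."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                        # 0-degree vs 90-degree preference
    s2 = i45 - i135                      # 45-degree vs 135-degree preference
    return s0, s1, s2

def dolp_aolp(s0, s1, s2):
    """Degree and angle of linear polarization, the cues used for shape."""
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-6)
    aolp = 0.5 * np.arctan2(s2, s1)  # radians, in (-pi/2, pi/2]
    return dolp, aolp
```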
This part covers the fundamental theory of polarization and the emerging techniques based on polarimetric imaging. It consists of Chapters 9–12. Chapter 9 begins with the fundamental theory of polarization, followed by various techniques for polarization-based shape recovery: shape from the phase angle, from the degree of polarization and from the Stokes vector. Chapter 10 shows that integrating complementary clues, i.e. the polarimetric clue and the shading clue, is useful for shape recovery. Chapter 11 achieves shape recovery in the wild, in particular by considering polarized illumination such as the sky. Finally, Chapter 12 is a bridge between multispectral imaging and polarimetric imaging: it realizes a multispectral polarization filter array using a photonic crystal and proposes a demosaicking algorithm for it.
Each part begins with an introductory chapter and then reviews various achievements by leading researchers in the field. We hope that this collection, ranging from overviews of the fundamentals to cutting-edge technologies, inspires new directions of research and development in these modalities for various applications of scene understanding.