Copyright 2016 Furkan Emre Sahin
NOVEL IMAGING SYSTEMS FOR INTRAOCULAR RETINAL PROSTHESES
AND WEARABLE VISUAL AIDS
by
Furkan Emre Sahin
A Dissertation Presented to the
FACULTY OF THE GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(ELECTRICAL ENGINEERING)
May 2016
ACKNOWLEDGEMENTS
This work is the result of years of hard work that I could not have completed
without the great support, guidance, and encouragement of many excellent people.
First, I would like to thank my adviser, Professor Armand R. Tanguay, Jr., for
his trust in me, and also for giving me the honor and privilege of being a member of his
Optical Materials and Devices Laboratory (OMDL). His suggestions, guidance,
continuous support, and friendship made my Ph.D. journey fruitful and pleasant, and
helped me become the scholar that I have always dreamed of becoming.
I would like to extend a very warm and hearty thank you to Dr. Patrick J.
Nasiatka, a mentor at OMDL but, more importantly, a wonderful friend who always
amazed me with the breadth of his knowledge on virtually any topic. I also thank other
colleagues and friends at OMDL, especially Dr. Ben P. McIntosh. I owe a great deal to
both Patrick and Ben for their many discussions and contributions to my research. They made
the late nights, the long weekends, and our famous EE 529 presentation sessions
enjoyable.
I would also like to thank our collaborators in the wearable visual aid project,
Dr. James D. Weiland, Dr. Laurent Itti, Dr. Gerard G. Medioni, and their wonderful
students. I also thank Dr. Mark S. Humayun, the leader of the interdisciplinary
NSF Biomimetic MicroElectronic Systems Engineering Research Center at USC and
of the retinal prosthesis project. Thank you also to my committee members, Dr. James
D. Weiland, Dr. Alexander (Sandy) Sawchuk, Dr. Laurent Itti, and Dr. Wei Wu.
I am also honored to be among the first cohort of students in the Health,
Technology, and Engineering (HTE@USC) program at USC. Thank you Dr. George
Tolomiczenko and Dr. Terry Sanger for starting such a revolutionary program and
introducing and training me in all aspects of medical technology development. I also
extend my thanks to my HTE project team members, Karthik Murali, Vladimir
Ljubimov, and Dr. Pamela Mar. I will always remember my Sunday meetings with
Karthik, spending long hours either at a white board or in the lab.
Lastly, I would like to express my deepest thanks to my family and friends for
all of their support throughout the years. Thank you to my mother, my father, and my
brother for their endless support. It was very difficult for me to leave them to come
to USC and not see them sometimes for up to a year. However many miles away they
may be, I always feel the support of their prayers. Finally, I want to thank my fiancée
and soon-to-be wife for her support during the writing period of this thesis.
TABLE OF CONTENTS
Acknowledgements ii
List of Tables vii
List of Figures xi
Abstract xxxii
Chapter 1. Introduction 1
1.1 Wearable Visual Aids 1
1.2 Intraocular Retinal Prostheses 3
1.3 Eye-Tracked Extraocular Camera 4
1.4 Ultraminiature Intraocular Camera for Retinal Prostheses 8
1.5 Organization of the Thesis 9
Chapter 2. Computational Cameras and Wide Angle Lenses 15
2.1 Computational Cameras 15
2.2 Wide Angle Lenses 19
2.3 Barrel Distortion 22
2.4 Dewarping of Wide Angle Images 24
2.5 An Example Dewarping Case 28
2.6 Dewarped Field of View 31
2.7 Integrated Circuits for Dewarping 35
2.8 Methods for Evaluating Lens Designs 36
2.8.1 Modulation Transfer Function 36
2.8.2 Spatial Frequency Response 39
2.8.3 Correlation Coefficient 43
2.8.4 Image Simulation 44
2.9 Wells Research Optical Test Bench 45
2.10 Summary 49
Chapter 3. Optimal Design of Wide Angle Computational Cameras 55
3.1 Relative Illumination Considerations 57
3.2 Methods 58
3.3 Results 63
3.4 Summary 74
Chapter 4. Miniature Wide Field of View Lens Design and Optimization of Lateral Chromatic Aberration 77
4.1 Single Meniscus Lens Design 77
4.2 Two Lens Design 79
4.3 Lateral Chromatic Aberration Correction 82
4.4 Wide Angle Lens Design with Pixel-Limited Lateral Chromatic Aberration 85
4.5 Wide Angle Lens Design with Unconstrained Lateral Chromatic Aberration 89
4.6 Analysis of Results 94
4.7 Custom Wide Angle Lens with COTS Elements 97
4.8 Lens Testing and Verification 102
4.9 Summary 113
Chapter 5. Wide Angle Lens Design with an Aspherical Element 116
5.1 Lens Design Approach 117
5.2 Lens Testing and Verification 131
5.3 Comparison of the Two Lens Designs with COTS Lens Elements 134
5.4 Imaging Performance 138
5.5 Summary 140
Chapter 6. Miniature Wide Field of View Camera Based on Wafer Level Optics 144
6.1 Afocal Attachment 147
6.2 Custom Afocal Attachment Design 150
6.3 Custom Afocal Attachment Design with COTS Elements 152
6.4 Summary 156
Chapter 7. Eye-Tracked Extraocular Camera 160
7.1 Wide Field of View Scene Camera 161
7.2 Eye-Tracking Camera 163
7.3 Belt Worn Computer 166
7.4 Software Development 167
7.5 Testing and Benchmarking 171
7.6 Summary 174
Chapter 8. Ultraminiature Intraocular Cameras 178
8.1 Design Strategy for Ultraminiaturization 179
8.2 Alternative Image Sensor Arrays 183
8.3 Ultraminiature IOC with a Glass Lens 186
8.4 Alps Electric Miniature Lenses 198
8.5 Ultraminiature Intraocular Camera with Spherical Lenses 203
8.6 Two Lens Intraocular Cameras with COTS Spherical Lenses 208
8.6.1 Initial Design 208
8.6.2 Elongated IOC Design 214
8.7 Summary 219
Chapter 9. Summary and Future Research Directions 224
9.1 Summary 224
9.2 Future Research Directions 228
Bibliography 232
LIST OF TABLES
Table 3-1. Specifications of the custom four lens wide angle optical
system. All dimensions are in mm. The lens has 50% barrel
distortion at a 60° field angle. The thickness parameter is the
distance between the surface indicated and the following
surface. 73
Table 4-1. Specifications of the designed custom single meniscus lens, as
shown in Figure 4-1. All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and
the following surface. 79
Table 4-2. Specifications of the designed custom wide field of view lens
with two meniscus elements, as shown in Figure 4-2. All
dimensions are in mm. The thickness parameter is the
distance between the surface indicated and the following
surface. 81
Table 4-3. Wavelength weights for the three color channels extracted
from the plot shown in Figure 4-4. Wavelength weights of
<10% were set equal to zero, as they typically have a minimal
effect on the optical system design and analysis. 84
Table 4-4. Specifications of the four lens, wide angle optical system
design that was optimized to have pixel-limited LCA across the
visible spectrum. All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and
the following surface. 88
Table 4-5. Specifications of the four lens, wide angle optical system
design that was optimized with the RGB weights extracted
from the spectral graph shown in Figure 4-4, and listed in Table
4-3, with unconstrained LCA. All dimensions are in mm. The
thickness parameter is the distance between the surface
indicated and the following surface. 91
Table 4-6. Comparison of monochromatic RMS spot sizes for the
pixel-limited LCA lens design and the unconstrained LCA lens
design. The unconstrained LCA lens design has smaller
monochromatic RMS spot sizes. 91
Table 4-7. Correlation coefficients for different color channels in the
simulated and dewarped images. The corner patch coefficients
are derived from the slanted black square in the top left corner
of the images, as shown in Figure 4-12. 96
Table 4-8. Specifications of the custom four lens COTS optical system with
reduced optical LCA. All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and
the following surface. 99
Table 4-9. Specifications of the custom four lens COTS optical system with
unconstrained LCA (EO-Lens). All dimensions are in mm. The
thickness parameter is the distance between the surface
indicated and the following surface. 102
Table 4-10. Correlation coefficients for the different color channels in the
overall captured and dewarped image of the ISO 12233 test
target, and for a corner patch of the image (as shown in Figure
4-23). 111
Table 5-1. Specifications of the custom two lens wide field of view optical
system (as shown in Figure 5-5, Left). All dimensions are in
mm. The thickness parameter is the distance between the
surface indicated and the following surface. 124
Table 5-2. Specifications of the LP-Custom lens, a custom wide field of
view lens with a central COTS aspherical lens element. All
dimensions are in mm. The thickness parameter is the
distance between the surface indicated and the following
surface. 125
Table 5-3. Specifications of the full custom wide field of view lens design
shown in Figure 5-7. The first and the last elements have
spherical surfaces, and the central lens element has aspherical
surfaces. All dimensions are in mm. The thickness parameter
is the distance between the surface indicated and the following
surface. 128
Table 5-4. Specifications of the LP-Lens, a custom wide field of view lens
with COTS lens elements. All dimensions are in mm. The
thickness parameter is the distance between the surface
indicated and the following surface. 131
Table 5-5. Comparison of the specifications of the two designed and
implemented lenses, the spherical surface EO-Lens (Chapter 4)
and the aspherical surface LP-Lens (Chapter 5). 136
Table 6-1. Optical specifications for the OmniVision OVM7692 wafer level
camera [2]. 145
Table 6-2. Specifications of the idealized, multi-element model for the
OVM7692 wafer level camera optical system. All dimensions
are in mm. The thickness parameter is the distance between
the surface indicated and the following surface. 146
Table 6-3. Specifications of the custom afocal attachment designed for the
OVM7692 wafer level camera. All dimensions are in mm. The
thickness parameter is the distance between the surface
indicated and the following surface. 151
Table 6-4. Specifications of the custom afocal attachment with COTS
optical elements. All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and
the following surface. 153
Table 7-1. Specifications for the Ainol Mini PC and the iMac desktop
computer used for benchmarking ET-EOC software. 172
Table 7-2. Benchmarking results for the eye-tracked extraocular camera
(ET-EOC) software running on a belt worn computer and on a
desktop computer. 173
Table 7-3. Accuracy testing results for the eye-tracked extraocular
camera (ET-EOC). Pixel locations for the centers of the targets,
locations predicted by the software after calibration, and the
differences in image coordinates between these two values are
listed, as well as the distance between the two pixel locations
(Error). All values are in units of pixels. 174
Table 8-1. Comparison of specifications for the OmniVision OV6930, the
Awaiba NanEye, the Fujikura G2, and the OmniVision OV6946
image sensor arrays. 185
Table 8-2. Specifications of the designed custom ultraminiature IOC lens
with conic surfaces. All dimensions are in mm. 188
Table 8-3. Specifications of the designed custom ultraminiature IOC
optical system with plano convex lens elements, as shown in
Figure 8-19. All dimensions are in mm. 204
Table 8-4. Specifications of the ultraminiature IOC optical system with
COTS spherical lens elements, as shown in Figure 8-24. All
dimensions are in mm. 209
Table 8-5. Specifications of the elongated ultraminiature IOC optical
system with COTS spherical lens elements, as shown in Figure
8-30 below. All dimensions are in mm. 215
Table 8-6. Comparison of specifications for four custom ultraminiature
IOC lenses presented in this chapter, with each optical imaging
system modeled in the Liou-Brennan optical model of the eye
[11]. RMS spot sizes for three field angles are listed. The
reference wavelengths for the RMS spot sizes are 630 nm,
540 nm, and 470 nm. The RMS spot size contribution at
540 nm is weighed twice as much as the 630 nm and 470 nm
RMS spot size contributions in the calculation. 221
LIST OF FIGURES
Figure 1-1. Retinal prosthesis with externally mounted (extraocular)
camera; from [8]. The diagram of the eye has been rotated
counterclockwise by 90°, and the ultra-flexible ribbon cable
placed in the opposite hemisphere, for clarity of illustration. 3
Figure 1-2. Retinal prosthesis with internally mounted (intraocular)
camera; from [8]. 5
Figure 1-3. Retinal prosthesis with an eye-tracked extraocular camera;
modified after [8]. 5
Figure 2-1. (Left) Square grid imaged through an optical system with
barrel distortion. (Right) Square grid imaged through an
optical system with pincushion distortion. 24
Figure 2-2. Pixel values for the dewarped image (shown as blue dots on
the right) are sampled from locations on the captured image
(red dots on the left). The location for sampling is determined
by the mapping function for dewarping. In the case of
non-integer pixel locations, the pixel value is interpolated from
the pixel values of neighboring pixels (green dots). 25
Figure 2-3. Image of an ISO 12233 test target captured with the SunVision
lens integrated with an OmniVision OV10633 image sensor
array. 28
Figure 2-4. Image of a chessboard calibration target captured with the
SunVision lens integrated with an OmniVision OV10633 image
sensor array. 29
Figure 2-5. (Left) The same captured image shown in Figure 2-3. The
regions omitted by the dewarping algorithm are shown in
black. (Right) The same image following dewarping. The
image is now rectilinear. 29
Figure 2-6. Resolution of the dewarped image as a function of the field
angle. Areas in the corners are sampled from smaller regions
in the captured image as compared to areas in the center. 30
Figure 2-7. (Left) Image shown in Figure 2-3 after dewarping, with the
aspect ratio and the vertical field of view kept constant. (Right)
Image shown in Figure 2-3 after dewarping, with the aspect
ratio and the horizontal field of view kept constant. Some
regions in the dewarped image are empty, since information
for these regions was not available in the captured image. 32
Figure 2-8. Lens nodal points and rear nodal distance (r_nd). 33
Figure 2-9. (Left) Leopard Imaging LI-USB30-M034WDR camera with
Aptina AP0100 real-time dewarping chip. (Center) Example
image captured with this camera. (Right) Same image as in
the center, after on-chip dewarping (Images provided by
Leopard Imaging [40]). 35
Figure 2-10. (Left) Sinusoidal line pattern with 100% modulation.
(Right) Sinusoidal line pattern with 40% modulation. 37
Figure 2-11. A typical MTF plot for an optical system. Different colored lines
correspond to different field angles in the field of view. Solid
lines correspond to tangential (meridional) fans of rays and
dashed lines correspond to radial (sagittal) fans of rays. 38
Figure 2-12. (Left) Tangential rays for a lens. The intersection of the
tangential (meridional) plane and the front surface of the lens
is shown with a purple line. (Right) Radial rays for a lens. The
intersection of the radial (sagittal) plane and the front surface
of the lens is shown with a purple line. 39
Figure 2-13. ISO 12233 Resolution Target. SFR measurements from regions
shown in the red boxes will be averaged and called “Center
SFRs”, SFR measurements from regions shown in the blue
boxes will be averaged and called “Edge SFRs”, and SFR
measurements from regions shown in the green boxes will be
averaged and called “Corner SFRs” later in this thesis. 42
Figure 2-14. Wells Research OS-400-25 Optical Test Bench [46]. 46
Figure 2-15. Normalized intensity spectrum of the LED light sources in the
OS-400-25 Optical Test Bench. Different color LEDs are plotted
with their respective colors. 47
Figure 2-16. The OS-400-25 measurement reticle with a square aperture
(shown in gray in the background), is placed in the back focal
plane of a sample lens being tested. The reticle is imaged
through the lens under test and the measurement camera lens
to the image plane of the measurement camera. The regions
used for generating MTF plots are highlighted in red. 48
Figure 3-1. (Left) An input grid of lines (black) and its simulated image
(superimposed in red) for a lens design with a ±60° diagonal
field of view and 30% barrel distortion at a 60° (half) field
angle. Blue arrows show the local magnitudes and directions
for reverse mapping to implement dewarping. (Right) An
input grid of lines (black) and its simulated image
(superimposed in red) for a lens design with a ±60° diagonal
field of view and 60% barrel distortion at a 60° (half) field
angle. In the higher distortion lens, corner regions are more
compressed as compared to the lower distortion lens design. 56
Figure 3-2. Original Roosinov lens form, representing a wide field of view,
compact lens design; from U.S. Patent US2516724 [6]. 60
Figure 3-3. (Left) Schematic diagram of a wide angle lens optimized to
have 5% barrel distortion at a 60° (diagonal) field angle.
(Center) Schematic diagram of a wide angle lens optimized to
have 30% barrel distortion at a 60° field angle. (Right)
Schematic diagram of a wide angle lens optimized to have 45%
barrel distortion at a 60° field angle. The last two elements
were merged to form a doublet as indicated by successive
optimization cycles. The reference wavelength for the
diagrams is 540 nm. 61
Figure 3-4. (Top Left) Input high resolution test image array, comprising
a 5 × 5 array of ISO 12233 test targets. (Top Right) Image
simulated through the lens with 40% barrel distortion at full
field (±60° diagonal FOV). The reference wavelengths for the
image simulation are 630 nm, 540 nm, and 470 nm. (Bottom
Left) The simulated image dewarped with distortion
parameters extracted from the lens model with the “Distortion
Grid” tool in CODE V. (Bottom Right) The simulated image
dewarped and flat-field corrected. 63
Figure 3-5. The correlation coefficients for a set of lens designs with
different degrees of barrel distortion at full field (±60° dFOV).
Note the offset vertical axis, spanning correlation coefficient
values from 0.3 to 1.0. 64
Figure 3-6. Variation of illumination as a function of the field angle for the
dewarped images, from eight different wide angle lens designs,
each with different degrees of barrel distortion at full field
(±60° dFOV). 66
Figure 3-7. (Top Left) Image simulated through the lens with 5% barrel
distortion at full field, with 10 dB PSNR noise added. The
reference wavelengths for the image simulation are 630 nm,
540 nm, and 470 nm. (Center Left) The same image as on the
top left following dewarping. (Bottom Left) The same image
as on the top left following dewarping and flat-field correction.
(Top Right) Image simulated through the lens with 50% barrel
distortion at full field and 10 dB PSNR noise added. The
reference wavelengths for the image simulation are 630 nm,
540 nm, and 470 nm. (Center Right) The same image as on the
top right following dewarping. (Bottom Right) The same
image as on the top right following dewarping and flat-field
correction. 67
Figure 3-8. Correlation coefficient variation as a function of field angle for
lens designs with different degrees of barrel distortion at full
field (±60° diagonal FOV), without flat-field correction.
Correlation coefficients corresponding to the same field angles
are averaged, and the resulting values are plotted in the graph.
Note the offset vertical axis, spanning correlation coefficient
values from 0.5 to 1.0. 68
Figure 3-9. Correlation coefficient variation as a function of field angle for
lens designs with different degrees of barrel distortion at full
field (±60° diagonal FOV), after flat-field correction.
Correlation coefficients corresponding to the same field angles
are averaged, and the resulting values are plotted in the graph.
Note the offset vertical axis, spanning correlation coefficient
values from 0.75 to 1.0. 69
Figure 3-10. (Top Left) Image simulated through the lens design with 10%
barrel distortion at full field following dewarping and flat-field
correction. The reference wavelengths for the image
simulation are 630 nm, 540 nm, and 470 nm. (Bottom Left)
Center patch of the image shown on the top left. (Top Right)
Image simulated through the lens design with 50% barrel
distortion at full field following dewarping and flat-field
correction. The reference wavelengths for the image
simulation are 630 nm, 540 nm, and 470 nm. (Bottom Right)
Center patch of the image shown on the top right. Higher
resolution in the central patch of the 50% barrel distortion
design is readily apparent. 71
Figure 3-11. Spatial frequency corresponding to a modulation of 0.5 for lens
designs with different degrees of barrel distortion at full field
(±60° dFOV), derived from simulated images after dewarping
and flat-field correction. 72
Figure 3-12. Schematic diagram of a wide angle lens that was optimized to
have 50% barrel distortion at full field (±60° diagonal FOV).
The last two elements were merged to form a doublet as
indicated by successive optimization cycles. This lens design
was determined to be optimal for the given constraints. 72
Figure 4-1. (Top Left) Schematic diagram of a single meniscus, wide angle
lens design, based on the Wollaston lens. The reference
wavelength for the diagram is 540 nm. (Top Right) Image
simulation result through the lens shown on the left, after
dewarping. The image simulation was performed with the
individual color channel wavelength weights listed in
Table 4-3. The image is not flat-field corrected to show the
variation of illumination at the image plane. (Bottom) The
same image as on the top right after flat-field correction. 78
Figure 4-2. (Top Left) Schematic diagram of a two lens, wide angle lens
design. The reference wavelength for the diagram is 540 nm.
(Top Right) Image simulation result through the lens shown
on the left, after dewarping. The image simulation was
performed with the individual color channel wavelength
weights listed in Table 4-3. The image is not flat-field
corrected to show the variation of illumination at the image
plane. (Bottom) The same image as on the top right after
flat-field correction. 80
Figure 4-3. Lateral Chromatic Aberration (LCA) in an optical system will
cause the image of a white point (Left) to separate into
different colors (Center) at the image plane. Each color
channel of the captured digital image (red, green, and blue) can
be realigned digitally in software to minimize LCA (Right). 82
Figure 4-4. Spectral response of the three different color filters of a
commercially available color image sensor array, the Aptina
MT9M034; from [8]. The cutoff frequency of an IR-cut filter
that is typically employed in cameras is shown with the dashed
line. 83
Figure 4-5. (Left) Schematic diagram of a four lens, wide angle optical
system design that was optimized to have pixel-limited LCA
across the visible spectrum. The reference wavelength for the
diagram is 540 nm. (Right) Spot diagram for the lens shown
on the left, for five field angles and with three wavelengths
(red: 630 nm; green: 540 nm; and blue: 470 nm). 86
Figure 4-6. (Top) Image simulation result through the optical system with
pixel-limited LCA shown in Figure 4-5, Left. The image
simulation was performed with the individual color channel
wavelength weights listed in Table 4-3. (Bottom) The same
image as on the top following software dewarping with a single
LUT (no digital chromatic aberration correction). The image is
not flat-field corrected to show the variation of illumination at
the image plane. 87
Figure 4-7. The same image as in Figure 4-6, Bottom after flat-field
correction. 88
Figure 4-8. (Left) Schematic diagram of a four lens, wide angle lens design
optimized with the individual color channel wavelength
weights listed in Table 4-3, with unconstrained LCA. The
reference wavelength for the diagram is 540 nm. (Right) Spot
diagram for the lens shown on the left, for five field angles and
with three wavelengths (red: 630 nm, green: 540 nm, and
blue: 470 nm). The units of the RMS spot diameters are in µm. 90
Figure 4-9. (Top) Image simulation result through the lens with
unconstrained LCA shown in Figure 4-8. The image simulation
was performed with the individual color channel wavelength
weights listed in Table 4-3. (Bottom) The same image as on
the top after dewarping with a single LUT. The image is not
flat-field corrected to show the variation of illumination at the
image plane. 92
Figure 4-10. (Top) The same image shown in Figure 4-9, Top after
dewarping with 3 separate LUTs, one for each color channel.
The image is not flat-field corrected to show the variation of
illumination at the image plane. (Bottom) The same image as
on the top after flat-field correction. 93
Figure 4-11. (Top) Spatial frequency response (SFR) plot of the designed
lenses in the center region. (Bottom) SFR plot of the designed
lenses at the edges. (Note that the EO-Lens (orange) and the
COTS Design – Minimal LCA (black) curves almost overlap in
this case). Image simulations of the ISO 12233 test target
(used in the SFR calculations) were performed with the
individual color channel wavelength weights listed in Table
4-3. 95
Figure 4-12. Visual comparison of lens performance in the center regions
(Top Row) and corner regions (Bottom Row) of images
simulated with the designed lenses, after dewarping. For the
Custom Four Lens Design with Unconstrained LCA, the
dewarping was performed with three separate look-up tables,
each optimized for an individual color channel (red, green, and
blue). For the COTS Design with Unconstrained LCA
(EO-Lens), the dewarping was also performed with three
separate look-up tables, each optimized for an individual color
channel (red, green, and blue). Image simulations of the
ISO 12233 test target were performed with the individual
color channel wavelength weights listed in Table 4-3. 97
Figure 4-13. Schematic diagram of the custom lens design using COTS
elements with reduced optical LCA. The reference wavelength
for the diagram is 540 nm. 100
Figure 4-14. (Top) Schematic diagram of the custom lens design with
unconstrained LCA using COTS elements (EO-Lens). The
reference wavelength for the diagram is 540 nm. (Bottom)
Size comparison of the designed and fabricated lens (Bottom
Right) with a commercially available lens of similar field of
view (Bottom Left). The fabricated lens diameter could be
reduced from 12 mm (set in this implementation by the size of
the standard M12 × 0.5 thread employed for ease of optical
mounting for lens testing) to approximately 7 mm. 101
Figure 4-15. Comparison of the measured distortion profile of the COTS
EO-Lens designed with unconstrained LCA to the distortion
profile obtained from the CODE V model. 103
Figure 4-16. First modulation transfer function (MTF) measurement of the
EO-Lens. Tangential MTF (Top) and radial MTF (Bottom)
curves for the designed and implemented EO-Lens. The MTF
measurements were performed with the green LED of the
OS-400-25 system, with the spectrum as shown in Figure 2-15.
CODE V results were also obtained with wavelength weights
derived from the spectrum of the green LED as shown in
Figure 2-15. The measurement results show close agreement
to the analysis results obtained from the CODE V model of the
lens at all field angles. 105
Figure 4-17. Second MTF measurement of the EO-Lens. Tangential MTF
(Top) and radial MTF (Bottom) curves for the designed and
implemented EO-Lens. The MTF measurements were
performed with the green LED of the OS-400-25 system, with
the spectrum as shown in Figure 2-15. CODE V results were
also obtained with wavelength weights derived from the
spectrum of the green LED as shown in Figure 2-15. The
measurement results show close agreement to the analysis
results obtained from the CODE V model of the lens at all field
angles. 106
Figure 4-18. Comparison of the modulation transfer function (MTF)
averaged over five field angles for two separate measurements
with the result obtained from the CODE V model of the
EO-Lens. These results are based on the data presented in
Figures 4-16 and 4-17, and show excellent agreement among
the average results for the two measurements and the average
result obtained from the CODE V model. 107
Figure 4-19. Comparison of spatial frequency response (SFR) plots for
captured and simulated images with the EO-Lens integrated
with an Aptina MT9M034 image sensor array after dewarping,
with the object placed at infinity. Image simulations of the
ISO 12233 test target (used in the SFR calculations) were
performed with the individual color channel wavelength
weights listed in Table 4-3. 108
Figure 4-20. The total resolution of the dewarped image was determined by
estimating the resolution in different regions of the image,
including corners, top and bottom edges, left and right edges,
and center regions of an ISO 12233 test target. 109
Figure 4-21. Image of an ISO 12233 test target, with a Macbeth color chart
placed on top of it, captured with the as-fabricated EO-Lens
integrated with an Aptina MT9M034 wide dynamic range
image sensor array. 110
Figure 4-22. (Top) The same image as shown in Figure 4-21 after
dewarping with a single LUT, without flat-field correction.
(Bottom) The same image as shown in Figure 4-21 after
dewarping with 3 separate LUTs, one for each color channel,
without flat-field correction. 112
Figure 4-23. (Left) Expanded view of the top left region from Figure 4-22,
Top. (Right) Expanded view of the top left region from Figure
4-22, Bottom, showing the significant improvement in lateral
chromatic aberration through software correction (separate
color channel dewarping). 113
Figure 4-24. The same image as shown in Figure 4-21 after dewarping with
3 separate LUTs, one for each color channel, and flat-field
correction to remove relative illumination variation. 113
Figure 5-1. The sag (z) of an aspherical surface is defined as the deviation
from a planar surface measured from the vertex. 116
Figure 5-2. Schematic diagram of the LightPath Technologies 355150 lens
(modeled over the visible wavelength range). Large field angle
rays are focused closer to the lens than the paraxial rays.
Therefore, in order to achieve the smallest spot sizes for all
field angles without field flattening, a curved image surface
would need to be incorporated. The curved image surface
drawn with the dashed orange line shows this curved surface
for the drawn set of rays. The reference wavelength for the
diagram is 540 nm. 119
Figure 5-3. (Left) Spot diagram for the LightPath Technologies 355150
lens operating at f/2, for four field angles and at 780 nm
wavelength. (Center) Spot diagram for the LightPath
Technologies 355150 lens operating at f/2, for four field angles
and at 540 nm wavelength. (Right) Spot diagram for the
LightPath Technologies 355150 lens operating at f/2, for four
field angles and with three wavelengths (red: 630 nm; green:
540 nm; and blue: 470 nm). 121
Figure 5-4. (Top Left) Modulation transfer function (MTF) plot for the
LightPath Technologies 355150 lens operating at f/2, for four
field angles and at 780 nm wavelength. (Top Right) MTF plot
for the LightPath Technologies 355150 lens operating at f/2,
for four field angles and at 540 nm wavelength. (Bottom) MTF
plot for the LightPath Technologies 355150 lens operating at
f/2, for four field angles and with three wavelengths (red:
630 nm; green: 540 nm; and blue: 470 nm). 122
Figure 5-5. (Left) Schematic diagram of the LightPath 355150 lens with a
custom spherical (negative) element added to increase the
usable field of view. (Right) Schematic diagram of the
LightPath Technologies 355150 lens with two custom
spherical (negative) elements added (LP-Custom lens) to
increase the usable field of view and reduce aberrations over
the wide field of view. The reference wavelength for the
diagrams is 540 nm. 123
Figure 5-6. Tangential (T) and radial (R) field curves for the single
aspherical lens, the two lens design (as shown in Figure 5-5
Left), the LP-Custom lens (as shown in Figure 5-5 Right), and
the LP-Lens (as shown in Figure 5-9). 126
Figure 5-7. Schematic diagram of a full custom wide field of view lens
design. The first and the last lens elements have spherical
surfaces, and the central lens element has aspherical surfaces.
The reference wavelength for the diagram is 540 nm. 127
Figure 5-8. Comparison of the SFR plots obtained from simulated images
with the LP-Lens, the LP-Custom lens, and the full custom lens
after dewarping each image with three corresponding look-up
tables (one for each color channel). The simulated images have
a 55° half field of view (110° full field of view), the edge region
corresponds to a 45° field angle, and the corner region
corresponds to a 50° field angle. Image simulations of the ISO
12233 test target (used in the SFR calculations) were
performed with the individual color channel wavelength
weights listed in Table 4-3. 129
Figure 5-9. (Left) Schematic diagram of the LP-Lens. (Right) Schematic
diagram of the LP-Lens, after reducing the edge thickness of
the first COTS lens element. The reference wavelength for the
diagrams is 540 nm. 130
Figure 5-10. Comparison of measured distortion profile of the LP-Lens to
the distortion profile obtained from the CODE V model. 132
Figure 5-11. Tangential MTF (Top) and radial MTF (Bottom) curves for the
designed and implemented LP-Lens. Measurement results as
well as the simulation results obtained from the CODE V model
of the lens are presented. The MTF measurements were
performed with the green LED of the OS-400-25 system, with
the spectrum as shown in Figure 2-15. CODE V results are also
obtained with wavelength weights derived from the spectrum
of the green LED as shown in Figure 2-15. 133
Figure 5-12. Average modulation transfer function (MTF) curves for the
LP-Lens, showing the average of the radial and tangential MTF
curves at a given field angle. Measurement results show close
agreement with the results obtained from the CODE V model
of the lens. 134
Figure 5-13. Comparison of the SFR plots obtained from simulated images
with the LP-Lens and the EO-Lens after dewarping with three
look-up tables (one for each color channel). Image simulations
of the ISO 12233 test target (used in the SFR calculations) were
performed with the individual color channel wavelength
weights listed in Table 4-3. 135
Figure 5-14. Lateral color plot for the LP-Lens and the EO-Lens. 136
Figure 5-15. (Top) Image captured with the LP-Lens integrated with the
LI-USB30-M034WDR camera module. The camera module is
programmed to fully correct for the barrel distortion. The
image has a 100° (measured) diagonal field of view. (Bottom)
Image captured with the LP-Lens integrated with the
LI-USB30-M034WDR camera module. In this case, the camera
module is programmed to correct for most of the barrel
distortion, with minimal uncorrected barrel distortion at the
corners. The image has 110° (measured) diagonal field of
view. 139
Figure 5-16. Comparison of SFR plots for both captured and simulated
images with the LP-Lens after dewarping, with the object
placed at infinity. Image simulation of the ISO 12233 test
target (used in the SFR calculations) was performed with the
individual color channel wavelength weights listed in Table
4-3. 141
Figure 5-17. Image of an ISO 12233 target captured with the LP-Lens
integrated with the LI-USB30-M034WDR camera module. The
image is flat-field corrected to remove relative illumination
variations. 142
Figure 6-1. OmniVision OVM7692 wafer level camera shown next to a U.S.
penny for size comparison. 144
Figure 6-2. (Left) Schematic diagram of the idealized, multi-element
model for the OVM7692 wafer level camera optical system.
The reference wavelength for the diagram is 540 nm.
(Right) Spot diagram for the lens shown on the left, for five
field angles and with three wavelengths (red: 630 nm;
green: 540 nm; and blue: 470 nm). 147
Figure 6-3. Schematic diagram of a Galilean telescope design. 148
Figure 6-4. Spatial frequency response (SFR) analysis of the OmniVision
OVM7692 wafer level camera in the center and edge regions. 150
Figure 6-5. (Left) Schematic diagram of the custom afocal attachment
designed for the OVM7692 wafer level camera. Note that the
exit rays are parallel for each field angle, as desired. The
reference wavelength for the diagram is 540 nm. (Right)
Modulation transfer function (MTF) plot for the combination
of the custom afocal attachment and the pixel-size-limited
optical model of the OVM7692 wafer level camera. The
reference wavelengths for the MTF plot are 630 nm, 540 nm,
and 470 nm. The MTF contribution at 540 nm is weighed
twice as much as the 630 nm and 470 nm MTF contributions
in the calculation. 152
Figure 6-6. (Left) Schematic diagram of the custom designed afocal
attachment with COTS elements. The reference wavelength for
the diagram is 540 nm. (Right) Modulation transfer function
(MTF) plot for the combination of the custom afocal
attachment with COTS elements and the pixel-size-limited
optical model of the OVM7692 wafer level camera. The
reference wavelengths for the MTF plot are 630 nm, 540 nm,
and 470 nm. The MTF contribution at 540 nm is weighed
twice as much as the 630 nm and 470 nm MTF contributions
in the calculation. 153
Figure 6-7. (Left) As-fabricated custom afocal attachment with COTS
elements integrated in a housing for field of view expansion of
a wafer level camera. (Right) The afocal attachment integrated
with the OVM7692 wafer level camera development kit. The
afocal attachment is placed on top of the wafer level camera
and is held in place behind the round silver mount shown in
the photo. 154
Figure 6-8. Spatial frequency response (SFR) analysis of the wafer level
camera with and without the afocal attachment. 155
Figure 6-9. Image of ISO 12233 test target captured with the OVM7692
wafer level camera. The image is contrast enhanced to show
detail. 156
Figure 6-10. (Top) Image of the same target as shown in Figure 6-9
captured with the designed afocal attachment integrated with
the wafer level camera. (The test target is brought closer to the
camera to capture the whole test target, as the field of view has
been significantly increased). (Bottom) The image on the top
following software dewarping. The images are contrast
enhanced to show detail. 158
Figure 7-1. Retinal prosthesis with an eye-tracked extraocular camera;
modified after [1]. 160
Figure 7-2. Scene camera used in the designed and implemented
eye-tracked extraocular camera prototype. The scene camera
comprises the LI-USB30-M034WDR camera module from
Leopard Imaging [6] and the custom designed and
implemented LP-Lens, as described in Chapter 5. 161
Figure 7-3. (Left) An indoor hallway scene captured with the ET-EOC
scene camera. (Right) An outdoor scene captured with the
ET-EOC scene camera. 163
Figure 7-4. HBV-1306 USB camera module shown next to a U.S. penny. 164
Figure 7-5. The Ainol Mini PC, a small computer with a built-in battery that
was used as the real time image processor in the eye-tracked
extraocular camera. 166
Figure 7-6. Eye-tracked extraocular camera (ET-EOC) images captured
during the calibration phase. (Left) Green channel of the
captured eye-tracking camera image. The location of the pupil
is correctly identified by the software as shown with an
overlaid red square around the pupil. (Right) Captured scene
camera image. The location of the target is correctly identified
by the software, as shown with an overlaid red square. 168
Figure 7-7. Schematic representation of the eye-tracker and the scene
camera images. The coordinates p_x and p_y are the x and y pixel
locations of the pupil center in the eye-tracker image (shown
with the green dot), while pc_x and pc_y are the x and y pixel
locations of the eye-tracker calibration center in the
eye-tracker image (shown with the red dot). The coordinates
s_x and s_y are the x and y pixel locations of the gaze target
center in the scene camera image (shown with the purple dot),
while sc_x and sc_y are the x and y pixel locations of the scene
camera calibration center in the scene camera image (shown with
the blue dot). Four quadrants in both images are also shown. 169
Figure 7-8. Integrated eye-tracked extraocular camera (ET-EOC), showing
both the scene camera and the eye-tracking camera mounted
on a pair of clear safety glasses. 171
Figure 7-9. The implemented prototype eye-tracked extraocular camera
(ET-EOC) as worn for field testing. The camera feeds are
routed through two separate USB cables to the belt-mounted
computer. 175
Figure 8-1. Schematic diagram of the glass lens (with two conic surfaces)
designed for the ultraminiature intraocular camera (IOC),
shown in the Liou-Brennan [11] optical model of the eye. The
cover glass above the image sensor array is also shown. The
reference wavelength for the diagram is 540 nm. 187
Figure 8-2. Modulation transfer function (MTF) plot for the designed
ultraminiature intraocular camera lens with two conic
surfaces, in the Liou-Brennan optical model of the eye and with
the object at infinity. The lens has a modulation of 0.5 or
greater up to the Nyquist frequency (83 lp/mm) at all field
angles. The reference wavelengths for the MTF plot are
630 nm, 540 nm, and 470 nm. The MTF contribution at
540 nm is weighed twice as much as the 630 nm and 470 nm
MTF contributions in the calculation. 187
Figure 8-3. Portion of the USAF 1951 resolution target used for image
simulation of ultraminiature IOC designs. The target has a 6:10
aspect ratio, matching the aspect ratio of the implanted
electrodes of the Argus II intraocular retinal prosthesis. Next
generation retinal implants are expected to have up to 24 × 40
electrodes, which can support at most 12 line pairs/vertical
height (i.e., 12 pairs of black and white lines over the vertical
field of view). 190
Figure 8-4. Image shown in Figure 8-3 simulated through the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-
Brennan optical model of the eye with the object placed at
infinity. The diagonal FOV is 20°. The reference wavelengths
for the image simulation are 630 nm, 540 nm, and 470 nm,
with equal weights. 191
Figure 8-5. Image simulation result shown in Figure 8-4 following
resampling at an image sensor resolution of 48 × 80 pixels. 192
Figure 8-6. Image simulation result shown in Figure 8-4 following
resampling at an electrode array resolution of 24 × 40
grayscale pixels. 192
Figure 8-7. Modulation transfer function (MTF) plot for the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-
Brennan optical model of the eye, with the object placed at
20 cm (approximately twice the hyperfocal distance). The
reference wavelengths for the MTF plot are 630 nm, 540 nm,
and 470 nm. The MTF contribution at 540 nm is weighed
twice as much as the 630 nm and 470 nm MTF contributions
in the calculation. 193
Figure 8-8. Image shown in Figure 8-3 simulated through the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-
Brennan optical model of the eye with the object placed at
20 cm (approximately twice the hyperfocal distance). The
diagonal field of view is 20°. The reference wavelengths for the
image simulation are 630 nm, 540 nm, and 470 nm, with equal
weights. 193
Figure 8-9. Image simulation result shown in Figure 8-8 following
resampling at an image sensor resolution of 48 × 80 pixels. 194
Figure 8-10. Image simulation result shown in Figure 8-8 following
resampling at an electrode array resolution of 24 × 40
grayscale pixels. 194
Figure 8-11. Modulation transfer function (MTF) plot for the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-
Brennan optical model of the eye, with the object placed at
6 cm (approximately half the hyperfocal distance). The
reference wavelengths for the MTF plot are 630 nm, 540 nm,
and 470 nm. The MTF contribution at 540 nm is weighed
twice as much as the 630 nm and 470 nm MTF contributions
in the calculation. 195
Figure 8-12. Image shown in Figure 8-3 simulated through the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-
Brennan optical model of the eye, with the object placed at
6 cm (approximately half the hyperfocal distance). The
diagonal field of view is 20°. The reference wavelengths for the
image simulation are 630 nm, 540 nm, and 470 nm, with equal
weights. 196
Figure 8-13. Image simulation result shown in Figure 8-12 following
resampling at an image sensor resolution of 48 × 80 pixels. 197
Figure 8-14. Image simulation result shown in Figure 8-12 following
resampling at an electrode array resolution of 24 × 40
grayscale pixels. 197
Figure 8-15. (Left) Spot diagram for the designed ultraminiature IOC lens
(with two conic surfaces) in the Liou-Brennan optical model of
the eye, with the object placed at infinity. (Center) Spot
diagram for the designed ultraminiature IOC lens (with two
conic surfaces) in the Liou-Brennan optical model of the eye,
with the object placed at 20 cm (approximately twice the
hyperfocal distance). (Right) Spot diagram for the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-
Brennan optical model of the eye, with the object placed at
6 cm (approximately half the hyperfocal distance). The
reference wavelengths for the spot diagrams are 630 nm,
540 nm, and 470 nm. The RMS spot diagram contribution at
540 nm is weighed twice as much as the 630 nm and 470 nm
RMS spot diagram contributions in the calculation. 198
Figure 8-16. The ALPS Electric FLAS0SG01A aspherical lens shown next to
a U.S. penny. 199
Figure 8-17. (Top) Schematic diagram of the Alps Electric FLAS0SG01A lens
shown in the Liou-Brennan optical model of the eye [11]. The
reference wavelength for the diagram is 540 nm. (Bottom)
MTF plot for the Alps Electric FLAS0SG01A lens in the Liou-
Brennan optical model of the eye, with the object placed at
infinity. The reference wavelengths for the MTF plot are
630 nm, 540 nm, and 470 nm. The MTF contribution at
540 nm is weighed twice as much as the 630 nm and 470 nm
MTF contributions in the calculation. 201
Figure 8-18. (Top) Image simulation through the Alps Electric
FLAS0SG01A aspherical lens operating in air at f/0.84, and
with a 200 × 400 pixel image sensor resolution. The reference
wavelengths for the image simulation are 630 nm, 540 nm, and
470 nm. (Bottom) Captured image with the Alps Electric
FLAS0SG01A lens and the OV6930 image sensor array also
operating in air at f/0.84, and with a 200 × 400 pixel image
sensor resolution. The central box outlined in red dashes
represents the central 20° diagonal field of view. 202
Figure 8-19. Schematic diagram of the designed ultraminiature two lens
IOC optical system, shown in the Liou-Brennan optical model
of the eye [11]. The cover glass above the image sensor array
is also shown. The reference wavelength for the diagram is
540 nm. 204
Figure 8-20. Modulation transfer function (MTF) plot for the designed
ultraminiature two lens IOC optical system (as shown in Figure
8-19) in the Liou-Brennan optical model of the eye and with
the object at infinity. The lens has a modulation of 0.46 or
greater up to the Nyquist frequency (83 lp/mm) at all field
angles. The reference wavelengths for the MTF plot are
630 nm, 540 nm, and 470 nm. The MTF contribution at
540 nm is weighed twice as much as the 630 nm and 470 nm
MTF contributions in the calculation. 206
Figure 8-21. Image shown in Figure 8-3 simulated through the designed
ultraminiature two lens IOC optical system (as shown in Figure
8-19) in the Liou-Brennan optical model of the eye and with
the object placed at infinity. The diagonal field of view is 20°.
The reference wavelengths for the image simulation are
630 nm, 540 nm, and 470 nm, with equal weights. 206
Figure 8-22. Image simulation result shown in Figure 8-21 following
resampling at an image sensor resolution of 48 × 80 pixels. 207
Figure 8-23. Image simulation result shown in Figure 8-21 following
resampling at an electrode array resolution of 24 × 40
grayscale pixels. 207
Figure 8-24. Schematic diagram of the designed two lens ultraminiature
IOC optical system with COTS lens elements shown in the
Liou-Brennan optical model of the eye [11]. The cover glass
above the image sensor array is also shown. The reference
wavelength for the diagram is 540 nm. 208
Figure 8-25. Modulation transfer function (MTF) plot for the designed two
lens ultraminiature IOC optical system with COTS lens
elements (as shown in Figure 8-24) in the Liou-Brennan
optical model of the eye and with the object at infinity. The lens
has a modulation of 0.45 or greater up to the Nyquist frequency
(100 lp/mm) at all field angles. The reference wavelengths for
the MTF plot are 630 nm, 540 nm, and 470 nm. The MTF
contribution at 540 nm is weighed twice as much as the
630 nm and 470 nm MTF contributions in the calculation. 210
Figure 8-26. Image shown in Figure 8-3 simulated through the designed
two lens ultraminiature IOC optical system with COTS lens
elements (as shown in Figure 8-24) in the Liou-Brennan
optical model of the eye with the object placed at infinity. The
diagonal field of view is 20°. The reference wavelengths for the
image simulation are 630 nm, 540 nm, and 470 nm, with equal
weights. 211
Figure 8-27. Image simulation result shown in Figure 8-26 following
resampling at an image sensor resolution of 48 × 80 pixels. 211
Figure 8-28. Image simulation result shown in Figure 8-26 following
resampling at an electrode array resolution of 24 × 40
grayscale pixels. 212
Figure 8-29. Tangential MTF (Top) and radial MTF (Bottom) curves for the
designed and implemented compound IOC lens with two COTS
lens elements, as shown in Figure 8-24. The MTF
measurements were performed with four LEDs covering the
visible spectrum, as shown in Figure 2-15. CODE V results
were also obtained with wavelength weights derived from the
plot shown in Figure 2-15. The measurement results show
close agreement to the analysis results obtained from the
CODE V model of the lens at all field angles. 213
Figure 8-30. (Top) Schematic diagram of the elongated IOC lens design
(with two COTS spherical lens elements) shown in the
Liou-Brennan optical model of the eye [11]. The reference
wavelength for the diagram is 540 nm. (Bottom) Modulation
transfer function (MTF) plot of the same lens in the Liou-
Brennan optical model of the eye with the object placed at
infinity, and evaluated at field angles up to ±10° (a 20° diagonal
field of view). The lens has a modulation of 0.28 or greater up
to the Nyquist frequency (56 lp/mm) at all field angles. The
reference wavelengths for the MTF plot are 630 nm, 540 nm,
and 470 nm. The MTF contribution at 540 nm is weighed
twice as much as the 630 nm and 470 nm MTF contributions
in the calculation. 216
Figure 8-31. Tangential MTF (Top) and radial MTF (Bottom) curves for the
elongated IOC compound lens designed and implemented with
two COTS lens elements, as shown in Figure 8-30. The MTF
measurements were performed with four LEDs covering the
visible spectrum, as shown in Figure 2-15. CODE V results
were also obtained with wavelength weights derived from the
plot shown in Figure 2-15. The measurement results show
substantial agreement with the analysis results obtained from
the CODE V model of the lens. 217
Figure 8-32. (Top) Image simulation through the elongated compound IOC
lens with COTS spherical elements (as shown Figure 8-30)
operating in air, and with a 200 × 400 pixel image sensor
resolution. The reference wavelengths for the image
simulation are 630 nm, 540 nm, and 470 nm.
(Bottom) Captured image with the lens (in air) and the
OV6930 image sensor array operating at a 200 × 400 pixel
image sensor resolution. The central box outlined in red
dashes represents the central 20° diagonal field of view. 218
Figure 9-1. Schematic diagram of a one-to-one imaging system for
implantable cameras, based on the designed ultraminiature
IOC, as shown in Figure 8-24. The reference wavelength for the
diagram is 540 nm. 230
ABSTRACT
The development of advanced wearable visual aid systems and implantable
retinal prosthesis devices can enable quality of life improvements for millions of blind
and low-vision patients.
In this thesis, designs and design principles for the cameras in these devices
are presented, including the design and implementation of wide angle, wide dynamic
range scene cameras for eye-tracked extraocular camera and wearable visual aid
applications, as well as ultraminiature intraocular cameras (IOCs). Unique optical
designs for miniature wide angle lenses are presented, together with the implementation
and testing of these lenses using the closest matching commercial off-the-shelf lens
elements.
A design framework for the optimal design of wide angle computational
cameras is also presented. In this hybrid optical/digital approach, the optical system
produces an image with an optimal degree of barrel distortion and lateral chromatic
aberration, and these are then corrected in software post-processing.
In order to restore foveation in retinal prosthesis patients, a proof-of-principle
eyeglass-mounted eye-tracked extraocular camera design with a custom designed
and implemented wide angle camera, an integrated eye-tracking system, and
software for image capture, gaze extraction, and region selection is presented.
Ultraminiature optical systems that allow for a significant reduction in size
compared to earlier intraocular camera designs are also presented, and excellent
imaging performance is demonstrated.
Chapter 1
INTRODUCTION
The World Health Organization (WHO) estimated the number of visually
impaired people in the world to be 285 million in a 2010 study, with 39 million blind
and 246 million with low vision [1]. These individuals often have difficulty
accomplishing routine daily tasks, such as indoor and outdoor navigation, and
frequently require assistance from others.
In this thesis, highly miniaturized optical imaging systems are described that
allow for significant advances in three different applications that can provide vision
based assistance to the blind and to those with low vision: (1) wearable visual aids,
(2) intraocular retinal prostheses with an eye-tracked extraocular camera, and
(3) intraocular retinal prostheses with an intraocular camera.
1.1 Wearable Visual Aids
Recent advances in optical imaging, machine vision, and artificial intelligence
can be leveraged to create successful wearable visual aids for patients with some
degree of visual impairment. These devices can have the capability to capture and
recognize information about the outside world and provide auditory or tactile
feedback, enabling more successful object recognition, object localization, reach and
grasp, and safer and more flexible navigation and mobility for patients. Any such visual
aid device will most likely incorporate an optical imaging system that either
replaces or complements the dysfunctional visual system in these patients [2, 3, 4].
In order to have wide acceptance by patients with visual impairments, a visual
aid system should be both robust and able to work in a variety of real-life conditions.
The imaging system in such a device should be able to capture high quality video
under the very wide dynamic range of brightness levels that is characteristic of both
indoor and outdoor environments. The imaging system should also provide a wide
field of view (e.g., 90° to 120°) that encompasses both central and peripheral vision,
such that objects of interest in the environment can be identified by computer vision
algorithms without the need for mechanically scanning the scene by means of head
motion. The visual aid system might also have two cameras for extraction of depth
information in a scene using stereo matching algorithms. This depth information is
very useful for aiding the patient in navigation and mobility, as well as reach and
grasp tasks. The incorporation of wide field of view cameras for such a depth sensing
system has the potential to allow for improved performance, as more scene features
(e.g., corners and edges) can be captured for stereo matching, and the addition of
depth information over a wider field of view in turn allows for more robust navigation
planning. In addition, the video cameras themselves should be lightweight, low
power, and of small form factor for unobtrusive integration into wearable devices.
Current commercially available wide angle lenses are much too large and bulky to
satisfy these constraints.
1.2 Intraocular Retinal Prostheses
One potential solution for the treatment of blindness caused by retinal
degenerative diseases such as retinitis pigmentosa (RP) and age-related macular
degeneration (AMD) is the surgical implantation of intraocular retinal prostheses
(IRPs). In this approach, a microelectrode array stimulates the neurons in the retina
to create the percept of an image. One possible approach to stimulating the neurons
is with an epiretinal implant in which the microelectrode array is placed in direct
contact with the inner surface of the retina, an approach developed by Dr. Mark S.
Humayun, et al. [5, 6]. The current generation of this retinal prosthesis, called the
Argus II (Second Sight Medical Products, Sylmar, California), has 60 electrodes in a
6 × 10 array covering a 20° diagonal field of view [7].
Figure 1-1. Retinal prosthesis with externally mounted (extraocular) camera;
from [8]. The diagram of the eye has been rotated counterclockwise by 90°, and
the ultra-flexible ribbon cable placed in the opposite hemisphere, for clarity of
illustration.
Current retinal prostheses employ an extraocular video camera that is
typically mounted on eyeglasses as shown in Figure 1-1. This system does not allow
for foveation or for the natural coupling between eye and head motions [9, 10].
In order to scan the scene, patients must move their heads to orient the camera in the
direction they want to look, as well as compensate for the disparity between the
direction their head is pointed and the direction their eyes are pointed. Foveation
can be restored by feeding the portion of the camera image that is in the direction of
the eye's gaze to the microelectrode array.
This can be achieved in two ways, either by implanting a miniature camera
directly in the eye (an intraocular camera) as shown in Figure 1-2, or by using a wide
field of view extraocular camera (EOC) together with an eye tracking system as shown
in Figure 1-3. In the intraocular camera (IOC) approach, the camera will be physically
pointed in the direction of gaze. In the eye-tracked extraocular camera (ET-EOC)
approach, a subregion corresponding to the direction of gaze (determined by the eye
tracking system) can be extracted from the wide field of view image captured by the
extraocular camera, and this image can then be sent to the microelectrode array [8].
The advantages of restoring foveation have been demonstrated by means of visual
prosthesis simulation on sighted individuals [11-15].
1.3 Eye-Tracked Extraocular Camera
The human eye has numerous extraordinary properties, both as an optical
sensing system and as an optical imaging system. One such property is that the
central part of the field of view (fovea) has very high resolution; the resolution
decreases in peripheral vision. A natural way to see a high resolution image of an
object is therefore to align the high resolution foveal region in the direction of the
object in the field of view, a concept known as foveation.
Figure 1-2. Retinal prosthesis with internally mounted (intraocular) camera;
from [8].
Figure 1-3. Retinal prosthesis with an eye-tracked extraocular camera;
modified after [8].
Another key property of the human eye is the ability to clearly see very bright
and dark regions simultaneously in the same scene due to its very wide dynamic
range. Although the dynamic range of the human eye is dependent on many factors,
generally 120 dB is considered to be a reasonable estimate [16].
These two properties taken together allow for a very robust visual system. As
the goal of any prosthetic device is to restore the original capabilities of a
dysfunctional organ or sense, the imaging system of a prosthetic vision device should
mimic the properties of the human eye insofar as it is possible. As such, the imaging
system should be able to handle high brightness variations within a scene, operate
over a wide range of illumination conditions, and also allow for natural foveation.
In order to capture an image without having some bright regions overexposed
and some dark regions underexposed, the eye-tracked extraocular camera should
have a wide dynamic range image sensor array (as should the scene camera in the
wearable visual aid, and the intraocular camera for retinal prostheses). The dynamic
range of an image sensor array depends on many factors such as noise, the well
capacity of the photosites, and the bit resolution of the analog-to-digital converter
used. Typical image sensor arrays have dynamic ranges between 48 dB and 72 dB.
Special image sensor arrays, usually designed for security and automotive
applications, can have wide dynamic ranges of more than 100 dB.
The human eye has a full range of motion of about 90°. However, most people
prefer to use combined head and eye motion for looking at objects in oblique
locations. As a consequence, the eye has a more comfortable range of motion of about
40° [17]. For a given orientation of the head, the field of view of a sighted individual
is very close to 180°, including the fields of view of the eyes themselves as well as
their orientation.
The implanted electrodes in intraocular retinal prostheses typically cover a
±10° diagonal field of view in the central region of the retina. The camera in an
eye-tracked EOC system should have as large a field of view as possible up to 180° in
order to cover the full field of view of a sighted individual, and be capable of providing
visual input to the retinal prosthesis at its nominal resolution over the central 60° of
the field of view. As a full 180° field of view is not likely to be feasible, optical imaging systems
with 90° to 120° fields of view are being explored.
In addition, the wide field of view video camera should be lightweight, low
power, and of small form factor for unobtrusive integration into eyeglasses.
Eyeglass mounted eye-tracking systems typically consist of a camera working
in the near infrared (700 nm to 1000 nm wavelength range), directed and focused on
the eye. In the near infrared, the pupil of the eye typically images with a high contrast
compared to the surrounding regions, and therefore can easily be identified by
computer vision algorithms using image intensity thresholding. The field of view of
this eye-tracking camera is important in determining where the camera can be placed relative to the head.
A narrow field of view camera must be placed at a sufficient distance to capture the
entire eye, while a wide field of view camera can be placed much closer to the eye,
and therefore allow for a more unobtrusive system.
A fully functional version of an eye-tracked extraocular camera forms an
integral part of the Visual Prosthesis Simulator developed at the Optical Materials and
Devices Laboratory (OMDL). This system uses commercial off-the-shelf (COTS)
components, and is much too large to be routinely wearable [18]. The lens designs
presented in this thesis allow for the miniaturization and unobtrusive integration of
an eye-tracked extraocular camera [19].
1.4 Ultraminiature Intraocular Camera for Retinal Prostheses
An alternative method to achieve natural foveation with retinal prosthesis
patients is to implant a miniature camera in the eye of the patient. In this case, the
output of the camera will drive the microelectrode array on the retina after video
processing on a belt-worn visual processing unit (VPU). As the intraocular camera is
designed to be implanted in the crystalline lens sac, the intraocular camera (IOC) will
move with the natural foveation of the eye, and the stimulation on the retina will
automatically match the gaze direction.
The visual field criterion of the legal blindness definition in the United States
is that a subject’s field of view must be less than 20° [20]. A principal goal of current
and next generation retinal prostheses in restoring (legal) sight is to meet this
requirement by covering at least a ±10° diagonal field of view on the retina. As a
consequence, the diagonal field of view of the IOC should at least be ±10°, which also
matches the field of view of the current generation Argus II retinal prosthesis.
Current generations of the intraocular camera are capable of fitting within the
crystalline lens sac following removal of the crystalline lens. Next generation
intraocular cameras would prove more readily surgically implantable if they could be
further reduced in size to form ultraminiature intraocular cameras that are capable
of being mounted within an intraocular lens typical of those implanted during
cataract surgery.
1.5 Organization of the Thesis
This thesis presents several novel approaches to the design of miniature
optical systems for the various devices described above that are designed to restore
functionality to visually impaired patients.
Wide angle cameras for potential use in an eye-tracked extraocular camera as
well as in wearable visual aid systems are presented, as well as ultraminiature
cameras for intraocular implantation.
Miniaturization of the wide angle optical systems is achieved through a novel
hybrid optical and digital approach in which the optical system produces a wide field
of view image that exhibits some degree of barrel distortion and also lateral
chromatic aberration. This image is captured by a digital image sensor array, and
software post-processing is employed to dewarp the image to restore a rectilinear
image, and to also reduce lateral chromatic aberration. Surprisingly, the optimal
degree of barrel distortion that yields the highest quality post-processed images is
quite large [21, 22].
Ultraminiaturization of the intraocular camera optical system is achieved by
scaling down key design constraints and employing to a certain extent several of the
design principles used to miniaturize the wide field of view cameras. The overall
reduction in system volume allows for the use of high density optical glasses as
opposed to low density polymers, while still meeting the overall mass requirements
for IOCs.
Chapter 2 presents a brief introduction to computational cameras and
examples of recent research in this area. Different approaches to achieving wide field
of view imaging systems are also presented. Background on achieving rectilinear, wide
angle imaging systems through a computational approach, and the characterization
and correction of barrel distortion in wide angle images is also presented. Methods
for testing and characterization of imaging systems presented in this thesis are also
introduced.
Chapter 3 presents a design approach for the design of wide angle
computational cameras by combining an analysis of the optimal distortion in the lens
system with optimal dewarping.
Chapter 4 presents a wide angle lens with a 100° dewarped diagonal field of
view that was designed and optimized from scratch, and then implemented with the
closest matching off-the-shelf lens elements. The benefit of the correction of lateral
chromatic aberration in software post-processing is also explored. This lens is
primarily intended for use in the wearable visual aid application.
Chapter 5 presents an alternative wide angle lens design that is also based on
off-the-shelf lens elements and that provides an even wider field of view (110°
dewarped diagonal field of view), and is designed around a well-corrected, narrow
field of view lens. This lens is primarily intended for use in the eye-tracked
extraocular camera, as it has an enhanced field of view at a tolerable cost in
resolution.
Chapter 6 presents the design of an afocal attachment for a miniature wafer
level camera for the purpose of expanding the field of view. This approach allows for
the possibility of modifying mass-produced miniature wafer level cameras for wide
field of view applications.
Chapter 7 presents the implementation of a prototype eye-tracked extraocular
camera for restoration of foveation in retinal prosthesis patients, using components
presented in the previous chapters.
Chapter 8 presents design approaches for a next generation intraocular
camera prototype that uses ultraminiature glass lenses, resulting in a significant
reduction in camera size and volume while maintaining imaging performance.
Characterization results obtained from implemented lenses are also presented.
Chapter 9 summarizes the work presented in this dissertation and provides
future research directions.
Chapter 1 References
[1] World Health Organization, “Global Data on Visual Impairments 2010”.
Available online:
http://www.who.int/blindness/GLOBALDATAFINALforweb.pdf
[2] K. A. Thakoor, S. Marat, P. J. Nasiatka, B. P. McIntosh, F. E. Sahin,
A. R. Tanguay, Jr., J. D. Weiland, and L. Itti, “Attention Biased Speeded Up
Robust Features (AB-SURF): A Neurally-Inspired Object Recognition
Algorithm for a Wearable Aid for the Visually-Impaired”, 2013 IEEE
International Conference on Multimedia and Expo Workshops, pp. 1–6,
July 2013.
[3] A. Adebiyi, N. Mante, C. Zhang, F. E. Sahin, G. G. Medioni, A. R. Tanguay, Jr., and
J. D. Weiland, “Evaluation of Feedback Mechanisms for Wearable Visual Aids”,
2013 IEEE International Conference on Multimedia and Expo Workshops,
pp. 1–6, July 2013.
[4] F. E. Sahin, B. P. McIntosh, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “Design of a Compact Wide-Field-of-View Camera for
Retinal Prostheses”, 2013 Annual Meeting of the Association for Research in
Vision and Ophthalmology (ARVO), Seattle, Washington, May 5, 2013;
Investigative Ophthalmology and Visual Science, Vol. 54, No. 15, p. 1068,
June 2013.
[5] M. S. Humayun, E. de Juan, J. D. Weiland, G. Dagnelie, S. Katona, R. Greenberg,
and S. Suzuki, “Pattern Electrical Stimulation of the Human Retina”, Vision
Research, Vol. 39, No. 15, pp. 2569–2576, July 1999.
[6] M. S. Humayun, “Intraocular Retinal Prosthesis”, Transactions of the American
Ophthalmological Society, Vol. 99, pp. 271–300, January 2001.
[7] J. D. Dorn, A. K. Ahuja, A. Caspi, L. da Cruz, G. Dagnelie, J. Sahel, R. J. Greenberg,
M. J. McMahon, and Argus II Study Group, “The Detection of Motion by Blind
Subjects with the Epiretinal 60-Electrode (Argus II) Retinal Prosthesis”, JAMA
Ophthalmology, Vol. 131, No. 2, pp. 183–189, February 2013.
[8] N. R. B. Stiles, B. P. McIntosh, P. J. Nasiatka, M. C. Hauer, J. D. Weiland, M. S.
Humayun, and A. R. Tanguay, Jr., “An Intraocular Camera for Retinal
Prostheses: Restoring Sight to the Blind”, Chapter 20 in Optical Processes in
Microparticles and Nanostructures, Advanced Series in Applied Physics,
Volume 6, A. Serpenguzel and A. Poon (Eds.), Singapore: World Scientific,
2010, pp. 385–429.
[9] M. S. Humayun, J. D. Dorn, L. da Cruz, G. Dagnelie, J. A. Sahel, P. E. Stanga,
A. V. Cideciyan, J. L. Duncan, D. Eliott, E. Filley, A. C. Ho, A. Santos, A. B. Safran,
A. Arditi, L. V. Del Priore, and R. J. Greenberg, “Interim Results from the
International Trial of Second Sight’s Visual Prosthesis”, Ophthalmology,
Vol. 119, No. 4, pp. 779–788, May 2012.
[10] M. P. Barry and G. Dagnelie, “Use of the Argus II Retinal Prosthesis to Improve
Visual Guidance of Fine Hand Movements”, Investigative Ophthalmology &
Visual Science, Vol. 53, No. 9, pp. 5095–5101, January 2012.
[11] B. P. McIntosh, P. J. Nasiatka, N. R. B. Stiles, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “The Importance of Foveation in Retinal Prostheses:
Experiments with a Visual Prosthesis Simulator”, Neural Interfaces
Conference 2010, Long Beach, California, June 2010.
[12] B. P. McIntosh, N. R. B. Stiles, M. S. Humayun, and A. R. Tanguay, Jr., “Visual
Prosthesis Simulation: Effects of Foveation on Visual Search”, 2013 Annual
Meeting of the Association for Research in Vision and Ophthalmology (ARVO),
Seattle, Washington, May 5, 2013; Investigative Ophthalmology and Visual
Science, Vol. 54, No. 15, p. 1057, June 2013.
[13] B. P. McIntosh, N. R. B. Stiles, M. S. Humayun, and A. R. Tanguay, Jr., “Effects of
Foveation on Visual Search Task with Visual Prosthesis Simulation”, Annual
Meeting of the Vision Sciences Society, Naples, Florida, May 12, 2013; Journal
of Vision, Vol. 13, No. 9, p. 685, July 2013.
[14] N. R. B. Stiles, B. P. McIntosh, A. R. Tanguay, Jr., and M. S. Humayun, “Retinal
Prostheses: Functional Use of Monocular Depth Perception in the Low
Resolution Limit”, 2013 Annual Meeting of the Association for Research in
Vision and Ophthalmology (ARVO), Seattle, Washington, May 5, 2013;
Investigative Ophthalmology and Visual Science, Vol. 54, No. 15, p. 1042,
June 2013.
[15] A. R. Tanguay, Jr., N. R. B. Stiles, B. P. McIntosh, and M. S. Humayun,
“Functional Use of Monocular Depth Perception in the Low Resolution Limit”,
Annual Meeting of the Vision Sciences Society, Naples, Florida, May 14, 2013;
Journal of Vision, Vol. 13, No. 9, p. 1182, July 2013.
[16] A. Darmont, High Dynamic Range Imaging: Sensors and Architectures, 1st Ed.,
Washington: SPIE Press, 2012, p. 37.
[17] J. S. Stahl, “Eye-Head Coordination and the Variation of Eye-Movement
Accuracy with Orbital Eccentricity”, Experimental Brain Research, Vol. 136,
No. 2, pp. 200–210, January 2001.
[18] B. P. McIntosh, “Intraocular and Extraocular Cameras for Retinal Prostheses:
Effects of Foveation by Means of Visual Prosthesis Simulation”, Ph.D. Thesis,
University of Southern California, 2015.
[19] F. E. Sahin, B. P. McIntosh, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “Eye-Tracked Extraocular Camera for Retinal Prostheses”,
2015 OSA Frontiers in Optics, Art. No. FTu2C.3, San Jose, CA, October 2015.
[20] United States Social Security Administration, “Title XVI - Supplemental
Security Income for the Aged, Blind, and Disabled”, Section 1614: “Meaning of
Terms: Aged, Blind, or Disabled Individual”, in Compilation of the Social
Security Laws Volume 1, Act 1614, as amended through January 2007.
[21] F. E. Sahin, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and A. R. Tanguay, Jr.,
“Optimal Design of Miniature Wide-Angle Computational Cameras for Retinal
Prostheses and Wearable Visual Aids”, 2014 OSA Frontiers in Optics, Art. No.
FTu5F.1, Tucson, AZ, October 2014.
[22] F. E. Sahin, P. J. Nasiatka, and A. R. Tanguay, Jr., “Lateral Chromatic Aberration
Optimization in Wide-Field-of-View Computational Cameras”, 2015 OSA
Frontiers in Optics, Art. No. FTh1F.4, San Jose, CA, October 2015.
Chapter 2
COMPUTATIONAL CAMERAS AND WIDE ANGLE LENSES
The design and analysis of wide angle computational cameras constitutes a
principal focus of this thesis. In this chapter, a brief summary of several
computational camera approaches and recent wide angle lens designs is presented.
Background information on the various methods and tools used for the design of the
optical systems introduced in the upcoming chapters is also presented, as well as the
methods used for analyzing and comparing various designs.
2.1 Computational Cameras
Traditional film cameras transferred scene information to a two-dimensional
film, which was later printed to produce photographs. The goal of this imaging
process was to produce an accurate representation of the scene on photographic
paper. As the image on the film was transferred almost directly to paper, the image
on the film itself had to be an accurate representation of the scene. In order to
achieve this, the optical system in the film camera was designed to replicate the
scene information as faithfully as possible onto the image plane, which meant that
the optical system had to be well corrected to provide a sharp image.
With advances in digital image sensor arrays and image processing
algorithms, as well as in the development of fast hardware platforms such as graphics
processing unit (GPU), digital signal processing (DSP), and field programmable gate
array (FPGA) chips that are capable of running these algorithms in real time, a new
approach to designing imaging systems has emerged, spawning the new field of
computational cameras. In the case of computational cameras, the goal of the imaging
system is once again to produce an accurate representation of the captured scene.
However, in order to produce this final, sharp image, the image captured by the digital
image sensor array does not necessarily need to be an accurate representation of the
scene directly. Novel optical designs can instead produce an image that is captured
by the digital image sensor array and then digitally processed to form a final image
that is an accurate representation of the scene.
This approach of developing hybrid optical and digital systems with
co-optimization of the optical system and the computational algorithms is capable of
introducing imaging systems with new functionality. Computational cameras could
also allow for correction of optical system aberrations in the digital domain, and
therefore allow for both simpler and nontraditional optical designs. Several recent
examples of both of these computational camera approaches will be presented. Further
examples can be found in several excellent publications [1, 2, 3].
Designing a computational camera with an extended depth of field (an EDOF
camera) has been an active area of research. Once a traditional camera is focused on
an object at a specific distance, only a range of object distances will be imaged with
acceptable sharpness. This range is known as the depth of field, and depends on
parameters such as the image sensor pixel size, the optical system focal length, and
the f-number. An EDOF camera allows for a larger range of object distances to be in
acceptable focus in the final image compared to a traditional camera with similar
parameters. The pioneering work for computational cameras was an EDOF camera
developed by Dowski and Cathey [4]. They designed an optical system that
specifically incorporates a cubic phase mask that is located at the pupil plane. This
produces a wavefront coded image with an object-distance-independent misfocus
blur. The image is captured by a digital image sensor array, and this image is later
digitally filtered to recover a high resolution image with extended depth of field [4].
Another example of an added functionality made possible by the
computational camera approach is the elimination of motion blur artifacts in images.
Motion blur is apparent in photographs that contain objects in the scene that are not
stationary during camera exposure. Levin et al. designed a camera in which the image
sensor is moved horizontally during exposure, first at an initial velocity in one
direction of motion, then decreasing to zero followed by accelerating motion in the
opposite direction. This results in the blurring of all objects in the scene independent
of their velocities, and as a consequence in a constant point spread function (PSF) that
is independent of the relative motions of objects in the scene. The captured image is
later deconvolved with the known PSF in order to achieve a final image without
motion blur [5].
Cossairt and Nayar designed an extended depth of field imaging system with
intentional axial chromatic aberration, which they called a “Spectral Focal Sweep”
(SFS) camera [6]. In this approach, different wavelengths of light are in focus at the
image plane for different object distances. However, the broadband PSF is the same
for a range of object distances. The SFS camera has a grayscale image sensor array,
and the captured images are later deconvolved with the broadband PSF in order to
achieve a final image with extended depth of field. What is interesting about this
approach is that it both simplifies the optical system design (their design comprises
two commercial off-the-shelf (COTS) lens elements) and also provides the added
functionality of extending the depth of field of the camera [6].
Another example of a simplified optical system allowed by the computational
camera approach was introduced by Robinson et al. with their “Spherical Coded
Imagers” [7]. In this approach, a simple optical system (a singlet lens) is designed to
minimize all optical aberrations except for spherical aberration. Since spherical
aberration causes blurring in a field invariant and rotationally symmetric fashion, a
subsequent deconvolution step with a space invariant digital filter can be
implemented to achieve a sharp final image. This approach also results in an
extended depth of field [7].
The examples described above all rely on having a single PSF for the entire
image that is invariant to either object distance or motion. Having a single PSF for the
entire image is desirable because this makes the subsequent deconvolution step easy
and straightforward. In recent years, progress has been made in the development of
image processing algorithms for deconvolution of images with spatially varying PSFs
[8, 9]. Since most optical aberrations depend on the field position, imaging with
uncorrected lenses typically results in spatially varying PSFs at the image plane. With
this approach, imaging with a single lens and deconvolution of the captured image
with the appropriate regional PSFs has been demonstrated to produce high contrast
final images [10-12].
2.2 Wide Angle Lenses
Wide angle lenses typically cause straight lines in an object to be imaged as
curved lines in the image plane due to an optical aberration known as barrel
distortion (as explained in detail in Section 2.3). Capturing wide field of view
rectilinear images, such that straight lines in the object appear straight in the image,
has been an important concern since the early days of photography. In the era of the
slow daguerreotype photographic process, which required exposures of a few
seconds up to 30 minutes for well illuminated outdoor scenes, photographers usually
photographed static objects such as buildings. Curving of the straight lines of walls and
roofs due to distortion would have been easily noticeable [13]. As a consequence,
optical designers came up with lens designs that were symmetrical with respect to a
central aperture stop, and therefore were capable of eliminating this distortion.
These designs required either curved photographic plates or slow lenses with
apertures stopped down to f/30 or more. Such approaches would be hard to replicate
for the miniature lenses designed for compact video and cell phone cameras. In
modern day machine vision applications, capturing an accurate (distortion-free)
image of an object is still important for object identification purposes.
Novel designs in both wide angle and fisheye lenses have been introduced in
recent years. An interesting approach for achieving a wide angle, rectilinear imaging
system through eliminating barrel distortion optically was proposed by Jeffrey
Gohman with the Theia lens [14, 15]. In this approach, a specially designed wide
angle lens system creates an intermediate image that exhibits some degree of barrel
(negative) distortion. This intermediate image is then imaged by another set of lenses
to the final image plane. The latter lens system has pincushion (positive) distortion
that exactly reverses the effect of the barrel distortion produced by the front optical
system, achieving a final image that is rectilinear. Such a system requires many lenses
(15 lenses were specified in the referenced patent) and is very bulky (an example
lens, SY125A, is 36 mm in diameter and 59 mm long [16]). As such, it is also not
suitable for miniaturization.
High-resolution, wide field of view imaging with a curved image plane has also
been proposed [17, 18]. In this case, the optical system is a monocentric lens, in
which all of the lens elements have the same center of curvature. The monocentric
lens approach can eliminate distortion, coma, and astigmatism in the optical
design [19]. However, this approach requires a curved image plane. Approaches for
relaying the image from the curved image plane to multiple planar image sensor
arrays have been proposed, using both relay optics [20] and fiber bundles [21]. Even
though these approaches produce relatively small lenses compared to the high
resolution they can provide, they are still too large for unobtrusive integration in
wearable cameras.
Using only refractive elements (i.e., lenses) constitutes a dioptric optical
system. However, a set of reflective elements (a curved mirror) can be combined with
lenses to constitute a catadioptric optical system. Such a catadioptric system has
recently been proposed for wide angle imaging. By choosing the proper mirror shape,
distortion-free images can be captured [22, 23]. However, the mirrors in these
cameras in combination with the overall system length are usually too large for
wearable camera applications.
Another approach to the development of lenses with very wide fields of view
involves the design and implementation of fisheye lenses, which can exhibit fields of
view up to 180° (a full hemisphere). In order to image the entire hemisphere onto an
image sensor of finite size, these fisheye lenses are typically designed to have an
f-theta mapping of the object space into the image space. For optimal rectilinear
mapping in imaging systems, the image height is equal to f tan θ, in which f is the
effective focal length and θ is the field angle in object space. With f-theta mapping,
the image height on the image plane is equal to the focal length of the lens multiplied
by the field angle (in radians) in object space [24], which reduces the image height
significantly for large field angles. Traditional fisheye lenses with hemispherical
fields of view, however, tend to exhibit severe barrel distortion as well as a loss of
resolution in the corner and edge regions of the image.
A recent development in lenses with such extreme fields of view is the
Panomorph lens, first proposed and developed by ImmerVision [25].
Panomorph lenses have aspherical lens elements that allow for the control of the
space-variant distortion profile of the lens. As a convention, the f-theta mapping
employed in fisheye lenses is typically considered “linear” (in field angle), and the
Panomorph lenses have a distortion profile that deviates from this linear case,
resulting in a lens with a “nonlinear” distortion profile. This optical design approach
allows the resolution to be enhanced in certain regions of interest (such as at large
field angles) for specific applications [26, 27].
As a novel approach, we propose to design and implement wide angle
computational cameras. In this approach, we choose to sacrifice the ability to
minimize in the optical domain several optical aberrations (distortion and lateral
chromatic aberration) and nonlinearities (relative illumination variation) that are
typical of wide angle lenses, as this would necessarily result in bulky compound lens
designs with many lens materials and surfaces. Instead, we propose to optimize the
lens design in order to allow these aberrations to be optimally corrected in the digital
domain.
2.3 Barrel Distortion
The paraxial approximation in optics assumes that all incident and refracted
ray angles θ are small, and therefore sin θ ≈ θ. This approximation
greatly simplifies calculations and allows for quick analyses of simple optical systems.
With more complex optical systems covering wide fields of view, however, some rays
will likely have large incidence and refraction angles that will violate the accuracy of
the paraxial approximation [28].
A more accurate approximation can be achieved by expanding sin θ in a
Taylor series, keeping as many higher order terms as is computationally feasible
for optical analyses:
$\sin\theta \approx \theta - \dfrac{\theta^3}{3!} + \dfrac{\theta^5}{5!} - \dfrac{\theta^7}{7!} + \cdots$   (2-1)
Ludwig von Seidel (1821 – 1896) decided to include the first two terms of the
approximation, and derived several key deviations from the first-order, paraxial
theory. These deviations are known as the Seidel aberrations or third-order
aberrations, and consist of spherical aberration, coma, astigmatism, field curvature,
and distortion [28].
Seidel's analysis specifies that the magnitude of the distortion aberration is
proportional to the cube of the object height within the field of view. Therefore, the
corners of an image suffer the largest amount of distortion, as they correspond to the
largest field angles. Distortion at a given field angle is typically quoted as the
difference between the real image height (hr) and the ideal (distortion-free) image
height (hi), normalized to the ideal image height (hi) [29, 30].
$\text{Distortion}\ (\%) = \dfrac{h_r - h_i}{h_i} \times 100$   (2-2)
Cameras typically image large objects onto a small image plane, and therefore
the magnitude of the magnification (M) is less than unity (|M| < 1). In the case of barrel
(negative) distortion, the magnitude of the magnification decreases towards the
corners, and consequently the real image heights in those regions of the image will be
smaller than the corresponding ideal image heights (hr < hi). Therefore, the sign of
the distortion will be negative.
In the case of pincushion (positive) distortion, the magnitude of the
magnification increases towards the corners, and consequently the real image heights
in those regions of the image will be larger than the corresponding ideal image
heights (hr > hi). Therefore, the sign of the distortion will be positive [29, 30].
Figure 2-1. (Left) Square grid imaged through an optical system with barrel
distortion. (Right) Square grid imaged through an optical system with
pincushion distortion.
Wide angle lenses typically have some degree of barrel distortion, as shown in
Figure 2-1 (Left) for the imaging of a square grid image with barrel distortion.
2.4 Dewarping of Wide Angle Images
One possible method of achieving a rectilinear, wide angle imaging system is
to employ a hybrid optical and digital approach. In this case, the optical system is
designed to produce a wide field of view image with some degree of barrel distortion.
The image is first captured and digitized, and then the barrel distortion is removed in
software post-processing by remapping the pixels of the captured image. The
software post-processing thus effectively dewarps the captured image to achieve a
rectilinear image.
A mathematical function that relates each set of pixel coordinates on the
dewarped image to a corresponding set of image pixel coordinates on the captured
image is needed for the purpose of dewarping a barrel distorted image with reverse
mapping. In the case of reverse mapping, the dewarped image grid is iterated pixel
by pixel, and each pixel value is interpolated from a location on the captured image
determined by the mapping function, as shown in Figure 2-2. The iteration over the
pixels of the dewarped image ensures that all output pixels are computed, as the
reverse mapping constitutes an overall contraction mapping. Given the mapping
function, a look-up table (LUT) that maps each set of dewarped image pixel
coordinates to the corresponding set of captured image pixel coordinates can also be
generated. In the case of non-integer pixel coordinates, the resulting pixel value can
be interpolated from neighboring pixels on the captured image [31].
Figure 2-2. Pixel values for the dewarped image (shown as blue dots on the
right) are sampled from locations on the captured image (red dots on the left).
The location for sampling is determined by the mapping function for
dewarping. In the case of non-integer pixel locations, the pixel value is
interpolated from the pixel values of neighboring pixels (green dots).
The mapping function and the corresponding LUT can be generated by either
of two methods. If the exact optical specifications of the imaging lens system are
known, the lens system can be modeled in optical design software. This computer
model can be used to generate data points (relating rd to rc as explained below) and
the distortion polynomial coefficients (as shown in Equation 2-6 below) can be
estimated by fitting a polynomial to these data points.
If the optical specifications of the lens are not known, data extracted from
captured images can be used to estimate the distortion polynomial coefficients, using
camera calibration software. One such software package is the Caltech Camera
Calibration Toolbox described below.
Distortion is purely radial in nature. Therefore, the radial distance from the
center of the image for the captured image (rc) will be related in some manner to the
radial distance from the center of the image for the dewarped image (rd). A
polynomial fit for this relationship is given in Equation 2-6. This polynomial contains
only odd terms as given by the ray aberration equation for distortion [32]. Once rc is
calculated from rd, it can be translated into pixel coordinates (xc, yc) by means of
Equations 2-5, 2-7, and 2-8. The angle 𝜃 that is calculated from the dewarped pixel
coordinates in Equation 2-5 can also be used for calculating the pixel coordinates on
the captured image (xc, yc), since distortion is only radial.
$r_d = \sqrt{(x_d - x_{center})^2 + (y_d - y_{center})^2}$   (2-3)

$r_c = \sqrt{(x_c - x_{center})^2 + (y_c - y_{center})^2}$   (2-4)

$\theta = \arctan\left(\dfrac{y_d - y_{center}}{x_d - x_{center}}\right)$   (2-5)

$r_c = k_1 r_d + k_2 r_d^{\,3} + k_3 r_d^{\,5} + k_4 r_d^{\,7} + k_5 r_d^{\,9}$   (2-6)

$(x_c - x_{center}) = r_c \cos\theta$   (2-7)

$(y_c - y_{center}) = r_c \sin\theta$   (2-8)
In order to calibrate and dewarp the effects of barrel distortion for a wide
angle camera, various software approaches have been developed. One of the most
commonly used software packages for this purpose is the Caltech Camera Calibration
Toolbox mentioned above, developed by Jean-Yves Bouguet. The basis for this
toolbox is described in several key publications [33, 34]. A version of the Caltech
Camera Calibration Toolbox for Matlab® is available online, and a C implementation
of the toolbox is included in OpenCV, a popular open source image processing library
focused mainly on real-time computer vision [35].
The Caltech Camera Calibration Toolbox takes images of a planar chessboard
pattern at different object distances and orientations as its input, and then extracts
the corner points of the chessboard pattern from the image. It then both estimates
and refines a set of camera parameters, such as the focal length and the distortion, by
solving a linear least squares equation based on these corner point coordinates [34].
The set of coefficients {k_i} in Equation 2-6 is then available for dewarping, either by
using the built-in dewarping function of the toolbox or by using custom software.
2.5 An Example Dewarping Case
As an example, a commercially available miniature wide angle lens (SunVision
CCTV 2.8 mm) was calibrated using the Caltech Camera Calibration Toolbox. This
lens was mated with an OmniVision OV10633 (1280 × 800 resolution) CMOS color
image sensor array for testing [36]. The SunVision lens has a 2.8 mm focal length and
provides a 57° vertical and 95° horizontal field of view. However, this field of view
has severe barrel distortion, as shown in Figure 2-3.
Figure 2-3. Image of an ISO 12233 test target captured with the SunVision lens
integrated with an OmniVision OV10633 image sensor array.
A total of 40 calibration images, each image with the chessboard target at a
different orientation and location, were captured using this camera (Figure 2-4).
Once the camera parameters are determined, the toolbox also has built-in tools for
both the generation of the corresponding dewarping LUT, and for subsequent image
dewarping. Following dewarping using the toolbox, the dewarped field of view of the
original image reduced to 50° vertical and 74° horizontal, as shown in Figure 2-5.
Figure 2-4. Image of a chessboard calibration target captured with the
SunVision lens integrated with an OmniVision OV10633 image sensor array.
This calibration and dewarping example teaches two key lessons:
(1) Pixels of the dewarped image are interpolated from the captured image,
which causes a decrease in resolution towards the corners of the undistorted image
(Figure 2-5).
Figure 2-5. (Left) The same captured image shown in Figure 2-3. The regions
omitted by the dewarping algorithm are shown in black. (Right) The same
image following dewarping. The image is now rectilinear.
(2) Some regions of the captured image may not be used for generating the
dewarped image. For example, Figure 2-5 (Left) shows the regions omitted in
generating the dewarped image of Figure 2-5 (Right) in black. The regions omitted
are indirectly determined by applying a constraint to the dewarped image, such as
that the number of horizontal and vertical pixels in both images are the same, and
that the central image resolution of the captured image is maintained.
Figure 2-6. Resolution of the dewarped image as a function of the field angle.
Areas in the corners are sampled from smaller regions in the captured image as
compared to areas in the center.
In order to quantify the decrease in resolution that results from the dewarping
operation using the Caltech Camera Calibration Toolbox, the reverse mapping
sampling area (on the captured image) corresponding to a 5 pixel × 5 pixel patch of
the undistorted image was calculated as a function of the field angle (as plotted in
Figure 2-6). A 25 pixel² patch at the center of the dewarped image is derived from a
corresponding 25 pixel² region on the captured image. A 25 pixel² patch at the corner
(42° half field angle) of the dewarped image, however, is derived only from a
6 pixel² region on the captured image. Assuming that the camera resolution was pixel
size limited, corresponding to a total effective resolution of 1280 × 800 pixels
(1.23 megapixels), the dewarping process causes the effective resolution to drop to
0.53 megapixels.
2.6 Dewarped Field of View
The example dewarping case with the Caltech Camera Calibration Toolbox
discussed above showed that using the toolbox results in a reduction of both the
horizontal and vertical fields of view. This is a consequence of the fact that (as a
default) it outputs an image with the same total number of pixels as the original
image, namely, 1280 × 800, at the same time keeping the resolution in the central
(dewarped) field of view constant. Since portions of the captured image at large radii
are compressed, the dewarping process re-expands them, and they are then cropped
by the arbitrary restriction of outputting the same number of pixels as the original.
Alternatively, either the vertical or the horizontal field of view can be kept constant
following dewarping, as shown in Figure 2-7. If the horizontal field of view is kept
constant, this process results in local patches of the image that are rectilinear, but also
in a final image that does not have a rectilinear boundary. Therefore, if the aspect
ratio of the image is to be kept constant, and a rectilinear image boundary maintained,
then the maximum dewarped field of view will be limited by the vertical field of view
of the captured image in this case.
Figure 2-7. (Left) Image shown in Figure 2-3 after dewarping, with the aspect
ratio and the vertical field of view kept constant. (Right) Image shown in
Figure 2-3 after dewarping, with the aspect ratio and the horizontal field of view
kept constant. Some regions in the dewarped image are empty, since
information for these regions was not available in the captured image.
Two nodal points can be determined for any lens system. Any ray that is aimed
at the first nodal point (NP1) will travel through the lens system and then exit,
appearing as if it is coming from the second nodal point and with the same angle with
respect to the optical axis (Figure 2-8) as the incident ray. The distance from the
second nodal point (NP2) to the image plane is called the rear nodal distance (rnd),
which for simple lens systems (one or two elements) is nearly equal to the effective
focal length of the lens [28].
Assuming that the image plane size (h) is determined by the size of a given
image sensor array, the effective focal length (efl) required for any desired (full) field
of view (θ) can be easily estimated:
$\tan\left(\dfrac{\theta}{2}\right) = \dfrac{h/2}{rnd} \quad \text{and} \quad efl \approx rnd$   (2-9)
This in turn allows the lens designer to determine the initial lens design parameters.
In a rectilinear imaging system in which the image sensor is in fact the field
stop (the element that limits the field of view of the system), the vertical, horizontal,
and diagonal fields of view of the system can be expressed as functions of the rear
nodal distance (rnd) of the lens and the size of the image sensor as in Equations 2-10
through 2-12 below.
Figure 2-8. Lens nodal points and rear nodal distance (rnd).
$\tan\left(\dfrac{dFOV}{2}\right) = \dfrac{(\text{Diagonal Image Sensor Size})/2}{rnd}$   (2-10)

$\tan\left(\dfrac{hFOV}{2}\right) = \dfrac{(\text{Horizontal Image Sensor Size})/2}{rnd}$   (2-11)

$\tan\left(\dfrac{vFOV}{2}\right) = \dfrac{(\text{Vertical Image Sensor Size})/2}{rnd}$   (2-12)
in which dFOV is the (angular) diagonal field of view, hFOV is the (angular) horizontal
field of view, and vFOV is the (angular) vertical field of view.
Therefore,

$\dfrac{\tan(dFOV/2)}{(\text{Diagonal Image Sensor Size})/2} = \dfrac{\tan(hFOV/2)}{(\text{Horizontal Image Sensor Size})/2} = \dfrac{\tan(vFOV/2)}{(\text{Vertical Image Sensor Size})/2}$   (2-13)
With wide angle imaging systems, the field of view also depends on the degree
of distortion present at large field angles, and therefore the tangent relationship of
Equation 2-13 does not hold for the captured images.
Following dewarping, the image will be rectilinear. The dewarped vertical
field of view (vFOV_d) can be kept the same as the captured vertical FOV (vFOV_c),
assuming that the aspect ratio of the image sensor array is such that the vertical
dimension is the smallest. If the aspect ratio of the image is to be kept constant, the
vertical field of view will set the horizontal and diagonal fields of the view of the
dewarped, rectilinear image according to Equation 2-13, as shown in
Figure 2-7 (Left).
If the dewarped horizontal field of view (hFOV_d) is kept the same as the
captured horizontal field of view (hFOV_c), and if the aspect ratio of the image is again
kept constant, then there will be empty regions in the dewarped image due to a
corresponding lack of information in the captured image, as shown in
Figure 2-7 (Right).
2.7 Integrated Circuits for Dewarping
Several companies have announced the commercial availability of integrated
image processing chips that are capable of providing real-time barrel distortion
dewarping, including the OmniVision OV480, the Aptina AP0100, and the Geo
Semiconductor GW3100 [37, 38, 39]. These chips allow for an integrated approach
to providing dewarped images from wide field of view imaging systems, and can
eliminate the need for a general purpose processor to implement dewarping.
Figure 2-9. (Left) Leopard Imaging LI-USB30-M034WDR camera with Aptina
AP0100 real-time dewarping chip. (Center) Example image captured with this
camera. (Right) Same image as in the center, after on-chip dewarping (Images
provided by Leopard Imaging [40]).
A small camera module that includes one such chip (an Aptina AP0100) was
purchased from Leopard Imaging [40]. This particular camera module, an
LI-USB30-M034WDR, has a high resolution (1280 × 960), wide dynamic range
(115 dB) image sensor array from Aptina, the MT9M034. The AP0100 image
processing chip is programmable for different lens-dependent barrel distortion
profiles. The camera has a USB 3.0 interface, and the entire package measures only
26 mm × 26 mm × 14.7 mm (excluding the lens mount), as shown in Figure 2-9 (Left).
As explained in Chapters 5 and 7, this camera was used for testing one of the designed
wide field of view lenses and also as the scene camera in the eye-tracked extraocular
camera prototype.
2.8 Methods for Evaluating Lens Designs
In the design and optimization of both miniature wide angle computational
cameras and ultraminiature intraocular cameras, several metrics were used to
evaluate and compare the different designs. In what follows, a brief introduction to
these metrics and measurement methods is presented.
2.8.1 Modulation Transfer Function
The modulation transfer function (MTF) is a metric commonly used for
evaluating the overall performance of an optical system. The modulation (M) for a
repeating line pattern with intensity (I) varying sinusoidally (as shown in Figure
2-10) is defined as:
$M = \dfrac{I_{max} - I_{min}}{I_{max} + I_{min}}$   (2-14)
In an ideal optical imaging system, 100% modulation of a certain spatial
frequency at the input should result in 100% modulation of the same spatial
frequency at the output. Due to the aberrations inherent in any optical system and to
diffraction resulting from finite size apertures, the modulation of a sinusoidal input
pattern is typically degraded at the output for a real system, and is a function of the
spatial frequency [30]. This reduced modulation typically results in reduced contrast
images.
The MTF is usually plotted as the modulation at the image plane as a function
of spatial frequency (ξ, typically expressed in line pairs/mm) for perfect (100%)
modulation at the object. As the MTF might vary for different field angles, it should
be plotted separately for each field angle of interest.
Figure 2-10. (Left) Sinusoidal line pattern with 100% modulation.
(Right) Sinusoidal line pattern with 40% modulation.
In a digital imaging system, the maximum amount of information that can be
captured is limited by the pixel size of the image sensor array. The limiting spatial
frequency that can be captured by the optical imaging system, known as the Nyquist
frequency, is defined to be:
$\text{Nyquist Frequency (line pairs/mm)} = \dfrac{1}{2 \times (\text{Pixel Pitch})}$   (2-15)
The Nyquist frequency provides an upper limit for the highest spatial frequency that
can be captured with a nonzero value of the MTF.
In an ideal optical system, an infinitely small point at the input images to an
infinitely small point at the image plane. In a real optical system, such an infinitesimal
input point will image to a finite blur spot at the image plane, due to either aberrations
in the system, diffraction, or focus error. Even for a very well aberration corrected
system, there will be an output blur spot due to diffraction. The intensity blur spot at
the image plane that results from an infinitesimal (point) object in the object plane is
called the “Point Spread Function” (PSF) of the system. The PSF is the impulse
response of a focused optical system, and typically varies in size and shape for
different locations on the image plane. The 2D spatial Fourier transform of the PSF is
the “Optical Transfer Function” (OTF), and the modulus of the OTF is the MTF [41].
$OTF(\xi_x, \xi_y) = \mathcal{F}_{2D}\{PSF(x, y)\}$   (2-16)

$MTF = |OTF|$   (2-17)
Optical design software typically has built-in tools for generation of the MTF
for a given optical system. Sets of MTF plots will provide a key analytical tool for
evaluating different lens designs throughout this document, and an example MTF plot
is shown in Figure 2-11.
Figure 2-11. A typical MTF plot for an optical system. Different colored lines
correspond to different field angles in the field of view. Solid lines correspond
to tangential (meridional) fans of rays and dashed lines correspond to radial
(sagittal) fans of rays.
The MTF of an optical system is typically plotted for two different sets of rays.
Meridional (tangential) rays lie in the meridional plane, which is the plane that
contains both the chief ray (shown in red in Figure 2-12) and the optical axis. Sagittal
(radial) rays lie in the sagittal plane, which is the plane that contains the chief ray and
is perpendicular to the meridional plane [28]. For the set of rays parallel to the optical
axis, the meridional and the sagittal planes are the same.
Figure 2-12. (Left) Tangential rays for a lens. The intersection of the tangential
(meridional) plane and the front surface of the lens is shown with a purple line.
(Right) Radial rays for a lens. The intersection of the radial (sagittal) plane and
the front surface of the lens is shown with a purple line.
2.8.2 Spatial Frequency Response
Optical design software can provide accurate analysis and optimization of lens
quality by calculating the RMS spot sizes and MTF for a given field position and a given
object distance. The RMS spot size can be computed by first tracing a set of rays at a
given field angle through the optical system to the image plane. The square root
of the mean square distance between the image plane position of each ray and the
centroid focal position for the given field angle can then be calculated to find the RMS spot
radius [42].
However, for digitally dewarped wide angle imaging systems, these optical
imaging system parameters may not represent the most appropriate measure of the
final dewarped image quality, and may even be misleading for comparing different
lens designs. For example, two different lens designs covering the same field of view
with different degrees of barrel distortion can be considered, in which the lens
design with higher distortion has better, even diffraction limited, MTF
performance, whereas the lens design with lower distortion has
worse MTF performance. After dewarping the output images from these lenses, the
dewarped image obtained from the lens with the lower distortion might exhibit
higher contrast (effectively a better MTF) as compared to the lens with higher
distortion, as for the latter lens the regions in the corners and the edges of the image
are initially highly compressed by the distortion, and hence sampled at lower
resolution in the dewarping process. The exact opposite situation can also occur,
depending on the design constraints used to implement the high distortion and low
distortion optical imaging system designs. This post-dewarping image quality effect
will be analyzed in detail in Chapter 3.
The fundamental origin of this issue accrues to the fact that the MTF analysis
in optical design software describes only the performance of the optical imaging
system itself. In the case of a digitally dewarped wide angle imaging system, the final
image quality is also affected by the sampling on the image sensor array, and also by
the subsequent dewarping process. In order to compare the performance of different
wide angle lens designs for a given image sensor that cover the same dewarped field
of view, a metric for analyzing the quality of the final dewarped image is needed,
instead of relying only on optical analysis metrics such as the RMS spot size or the
MTF.
The Spatial Frequency Response (SFR) of an imaging system can be computed
to evaluate the overall system modulation for different spatial frequencies. The
International Standards Organization (ISO) has published a standard procedure and
example calculation algorithm for resolution measurement based on the SFR
(ISO 12233 [43]). This standard procedure is based on edge-gradient MTF analysis,
in which the Fourier transform of the one-dimensional line spread function (instead
of the two-dimensional PSF) is used to calculate the SFR [44]. The SFR can be an
accurate estimation of modulation for a camera, and it provides a total system
analysis that includes the effects of the lens and the image sensor as well as any digital
post-imaging corrections.
In order to analyze the wide angle imaging systems described in the following
chapters of this thesis, a high resolution ISO 12233 target image (Figure 2-13) will
either be simulated through different lens designs and sampled at a selected image
sensor resolution, or a physical target will be imaged with an implemented lens and
a matching image sensor array. The simulated or captured image will then be
dewarped with look-up tables (LUTs) generated from distortion parameters
extracted from the lens model in optical design software. Certain regions of the
dewarped image will then be used for the SFR measurement, as shown, for example,
in Figure 2-13 below. Sample SFR calculation code developed by Peter Burns is used
in SFR calculations [45], with equal weights for the individual color channels of the
input image.
In this thesis, presented MTF plots are generated either directly from the lens
model in CODE V optical design software [42], or through physical characterization
of the implemented lenses with the Wells Research Optical Test Bench [46] as
described in Section 2.9 below. SFR plots are generated from either simulated or
captured images of the ISO 12233 test target (after dewarping in the case of wide
angle lens designs).
Figure 2-13. ISO 12233 Resolution Target. SFR measurements from regions
shown in the red boxes will be averaged and called “Center SFRs”, SFR
measurements from regions shown in the blue boxes will be averaged and
called “Edge SFRs”, and SFR measurements from regions shown in the green
boxes will be averaged and called “Corner SFRs” later in this thesis.
2.8.3 Correlation Coefficient
The MTF can be calculated either as a monochromatic metric at a given design
wavelength, or as a polychromatic metric by weighting MTFs at multiple wavelengths
across the visible spectrum. The SFR can also be calculated as a polychromatic metric,
with the different color channels of an image (typically red, green, and blue) having
different weights.
Lateral chromatic aberration (LCA) is an optical aberration that results from
the inherent dispersive characteristics of optical materials used in imaging systems,
and if not corrected, results in a mismatch or shift among the color channels in the
captured image that causes degradation in the resulting image quality. Metrics such
as the MTF or the SFR do not reveal the amount of lateral chromatic aberration in an
optical system or a captured image.
A digital color image typically has three color values for each pixel; red, green,
and blue. When a color image of a black and white or a grayscale test target is
captured, the intensity values for these red, green, and blue channels of the captured
image should all be exactly the same at each pixel. In the case of a 24-bit image, in
which the intensity of each color channel is represented by 8-bit values, black regions
should have the value of “0” and white regions should have the value of “255” in each
color channel.
The correlation coefficient (r) is a metric that determines the overall similarity
of two images. It is defined as:
r = \frac{\sum_{m}\sum_{n}(A_{mn} - \bar{A})(B_{mn} - \bar{B})}{\sqrt{\left[\sum_{m}\sum_{n}(A_{mn} - \bar{A})^{2}\right]\left[\sum_{m}\sum_{n}(B_{mn} - \bar{B})^{2}\right]}}    (2-18)

in which A and B are two grayscale images, each with a resolution of m × n pixels, and
Ā and B̄ are the mean intensity values of the pixels that comprise each of the
respective images.
The values of 𝑟 range from −1 to 1; 𝑟 = 1 means that the images are exactly
the same, and values of 𝑟 close to 1 indicate a high correlation or similarity between
pairs of images, or between pairs of color channels within a given image [47].
In the case of ideal imaging, the captured image of a grayscale test target
should have exactly the same pixel intensities in the red, green, and blue channels.
The correlation coefficients between individual pairs of color channels should
therefore be unity. Any lateral chromatic aberration (LCA) will cause the images for
the different color channels to be spatially mismatched. The correlation coefficients
between pairs of color channels (e.g., red-green, or blue-green) can be calculated to
evaluate the degree of LCA in an image of a grayscale target. Values of 𝑟 close to 1
indicate a high correlation between pairs of color channels, resulting from low LCA.
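As a minimal illustration (with img assumed to be a captured RGB image of a grayscale target; the name is illustrative), the red-green correlation coefficient of Equation 2-18 can be computed directly in Matlab as follows.

    % Inter-channel correlation coefficient (Eq. 2-18) for LCA evaluation.
    R = double(img(:,:,1));   G = double(img(:,:,2));
    dR = R - mean(R(:));      dG = G - mean(G(:));
    r_rg = sum(dR(:) .* dG(:)) / sqrt(sum(dR(:).^2) * sum(dG(:).^2));
    % r_rg near 1 indicates well-registered red and green channels (low LCA);
    % the blue-green coefficient is computed in the same way.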
2.8.4 Image Simulation
Custom lens designs developed during this research were modeled and
optimized using the commercial optical modeling and design software CODE V®.
CODE V provides various tools for constrained optimization of optical systems, such
as Automatic Design and Global Synthesis. It also includes various tools for ray
tracing and analysis of designed optical systems, such as RMS spot size, MTF analysis,
and distortion analysis [42].
The Image Simulation (IMS) tool in CODE V provides a very accurate
simulation of the appearance of a 2D graphical image as imaged through the designed
optical system. For this simulation, the point spread function (PSF) of the system is
calculated at selected sampling points on the image plane, and the PSF is convolved
with the corresponding point of the test image. The PSF is interpolated for the pixels
between the sampling points. This simulation includes the blurring effects due to
aberrations (including distortion and chromatic aberrations), diffraction, and
illumination variations at the image plane [42].
Image simulation also allows the blurring effects due to finite-sized
rectangular pixels within the image sensor array to be simulated by convolving the
rectangular pixel with the PSF at a given image point [42]. If not stated otherwise,
this blurring effect due to image sensor pixels was always included in the simulations
described herein, using the pixel sizes of the image sensor array selected for the given
design.
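A rough Matlab equivalent of this pixel-aperture blurring, performed outside of CODE V, is to convolve the high resolution, PSF-blurred simulation output with a box kernel spanning one sensor pixel. In the sketch below, simImg is an illustrative name, four simulation samples per pixel side are assumed, and imfilter is from the Image Processing Toolbox.

    % Emulate finite pixel integration with a box (pixel-aperture) kernel.
    k = 4;                               % simulation samples per pixel side
    pixKernel = ones(k, k) / k^2;
    withPixelBlur = imfilter(simImg, pixKernel, 'replicate');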
2.9 Wells Research Optical Test Bench
In order to characterize and verify successful implementation of the lens
designs presented in this thesis, a high performance optical characterization station
was acquired. The OS-400-25 is a semi-automated optical lens testing system from
Wells Research (Lincoln, MA) [46], now part of TriOptics USA [48]. This instrument
(as shown in Figure 2-14, below) is the first of its kind to be installed worldwide, and
represents a significant upgrade to the previous methods available for evaluating
custom designed and fabricated ultraminiature optical systems at the Optical
Materials and Devices Laboratory here at USC. The Wells Research OS-400-25 system
is specifically designed for the testing of small, fast (low f/#) optical designs, and it
can accurately measure parameters such as the optical system’s on- and off-axis
Modulation Transfer Functions (MTF), lens effective focal lengths, and distortion.
Figure 2-14. Wells Research OS-400-25 Optical Test Bench [46].
The test bench has four LED light sources (blue, green, amber, and red) that
cover the visible spectrum for optical testing. Testing can be performed either with a
single LED turned on and therefore a narrow bandwidth, or with multiple LEDs
turned on to cover a wider spectrum. A typical balanced white light spectrum with
all of the LEDs turned on is shown in Figure 2-15. This balanced spectrum was
experimentally determined by measuring the combined output intensity spectrum of
the LEDs with an Ocean Optics Jaz Spectrometer [49], while simultaneously adjusting
the output power of the individual LEDs to achieve roughly equal peak intensity
levels. This balanced spectrum was achieved by having full output from the green
and the amber LEDs, and 75% output from the red and the blue LEDs.
Figure 2-15. Normalized intensity spectrum of the LED light sources in the
OS-400-25 Optical Test Bench. Different color LEDs are plotted with their
respective colors.
Among the many different testing configurations supported by the test bench,
the reverse projection configuration was used for the testing of the lenses described
herein. This configuration provides a quick and accurate way to achieve MTF
measurements. In this case, a backlit reticle (illuminated by the LEDs) is placed in the
back focal plane of the lens under test. The light rays coming out of the front surface
of the lens are collimated (for the typical case of an infinite conjugate lens). These
light rays are then picked up by a well corrected measurement camera lens that is
focused at infinity, and the output rays from the lens are focused onto a digital image
sensor array [50, 51].
A reticle with a square aperture (shown in Figure 2-16) is typically used for
testing. The one-dimensional radial and tangential line spread functions (LSF) are
calculated from the sides of the reticle. Measurement of the MTF is achieved by taking
the modulus of the one-dimensional spatial Fourier transform of the line spread
functions. Since the reticle is placed in the back focal plane of the lens under test, the
joint magnification of the lens under test and the measurement camera lens is
factored in for accurate MTF calculation.
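The LSF-to-MTF step is the same Fourier computation sketched above for the SFR; the only additional step is referring the measured spatial frequencies back to the image plane of the lens under test. A minimal sketch follows, in which nBins, nLSF, camPitch, and magJoint (the joint magnification) are illustrative names, and the direction of the frequency scaling depends on how the magnification is defined.

    % Rescale spatial frequencies measured at the camera sensor to the image
    % plane of the lens under test, using the joint magnification.
    fCam  = (0:nBins-1) / (nLSF * camPitch);  % cycles/mm at the camera sensor
    fLens = fCam * magJoint;                  % cycles/mm at the lens under test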
Figure 2-16. The OS-400-25 measurement reticle with a square aperture
(shown in gray in the background), is placed in the back focal plane of a sample
lens being tested. The reticle is imaged through the lens under test and the
measurement camera lens to the image plane of the measurement camera. The
regions used for generating MTF plots are highlighted in red.
With this semi-automated system, a lens is typically measured on-axis first.
This allows for setting the reticle position to the desired focus location of the lens.
Then by manually swiveling the reticle and the lens under test assembly to a desired
angle, off-axis measurements can be performed. In order to offset any effects of tilt
in mounting the lens, it is recommended to take measurements at both positive and
negative off-axis field angles and average the results. The measured MTF and
distortion curves can be compared to analysis results obtained from the model of the
lens under test in optical design software. It is sometimes challenging to exactly
match the location of the reticle to the location of the image plane in the optical design
software, resulting in discrepancies between the results.
As explained in detail in Chapter 4, the designed and implemented wide angle
lenses might have lateral chromatic aberration at large field angles. Therefore, in
order to verify successful implementation of the lenses, measurements with only the
green LED were performed. The comparison plots were also generated with
wavelength weights extracted from the normalized intensity spectrum of the green
LED shown in Figure 2-15. With the narrow field of view intraocular camera lens
designs, the full broadband spectrum was used in both the physical measurement and
the analysis in optical design software.
2.10 Summary
Several examples of computational cameras were presented in this chapter.
Recent advances in computational cameras allow for novel imaging system designs
with properties not possible with conventional cameras. In this hybrid optical/digital
design approach, the optical system produces an intermediate image that is captured
by a digital image sensor array, and the image is then computationally processed.
Several examples of recent wide field of view lens designs were also presented.
Wide angle lenses typically have a significant amount of barrel distortion. This effect
can be reversed in software post-processing (dewarping) to achieve a rectilinear
image. A method of implementing dewarping and an example test result with a
dewarping software package were also presented.
Several methods of evaluating lens designs and also overall imaging systems
were introduced, including the MTF, the SFR, and the correlation coefficients between
pairs of color channels. A test bench for measuring the MTF of implemented lenses
was also presented.
Building on these concepts of computational cameras and wide angle lenses, the next
chapter presents a wide angle computational camera design approach that optimizes
the amount of barrel distortion in the lens design.
Chapter 2 References
[1] C. Zhou and S. K. Nayar, “Computational Cameras: Convergence of Optics and
Processing”, IEEE Transactions on Image Processing, Vol. 20, No. 12,
pp. 3322-3340, December 2011.
[2] O. Cossairt, “Tradeoffs and Limits in Computational Imaging”, Ph.D. Thesis,
Columbia University, 2011.
[3] R. Lukac (Ed.), Computational Photography: Methods and Applications, Boca
Raton: CRC Press, 2015.
[4] E. R. Dowski and W. T. Cathey, “Extended Depth of Field Through Wave-Front
Coding”, Applied Optics, Vol. 34, No. 11, pp. 1859–1866, April 1995.
[5] A. Levin, P. Sand, T. S. Cho, F. Durand, and W. T. Freeman, “Motion-Invariant
Photography”, ACM Transactions on Graphics (TOG), Vol. 27, No. 3, Art. No. 71,
August 2008.
[6] O. Cossairt and S. Nayar, “Spectral Focal Sweep: Extended Depth of Field from
Chromatic Aberrations”, 2010 IEEE International Conference on
Computational Photography, pp. 1–8, March 2010.
[7] M. D. Robinson, G. Feng, and D. G. Stork, “Spherical Coded Imagers:
Improving Lens Speed, Depth-of-Field, and Manufacturing Yield through
Enhanced Spherical Aberration and Compensating Image Processing”,
Proceedings of SPIE, Vol. 7429, Art. No. 74290M, August 2009.
[8] O. Whyte, J. Sivic, A. Zisserman, and J. Ponce, “Non-Uniform Deblurring for
Shaken Images”, International Journal of Computer Vision, Vol. 98, No. 2,
pp. 168-186, June 2012.
[9] A. Gupta, N. Joshi, C. L. Zitnick, M. Cohen, and B. Curless, “Single Image
Deblurring Using Motion Density Functions”, Computer Vision–ECCV 2010,
pp. 171-184, 2010.
[10] E. Kee, S. Paris, S. Chen, and J. Wang, “Modeling and Removing
Spatially-Varying Optical Blur”, 2011 IEEE International Conference on
Computational Photography, pp. 1-8, April 2011.
[11] C. J. Schuler, M. Hirsch, S. Harmeling, and B. Schölkopf, “Non-Stationary
Correction of Optical Aberrations”, 2011 IEEE International Conference on
Computer Vision, pp. 659-666, November 2011.
[12] F. Heide, M. Rouf, M. B. Hullin, B. Labitzke, W. Heidrich, and A. Kolb,
“High-Quality Computational Imaging through Simple Lenses”, ACM
Transactions on Graphics (TOG) Vol. 32, No. 5, Art. No. 149, September 2013.
[13] R. Kingslake, A History of the Photographic Lens, 1st Ed., Boston: Academic
Press, 1989, Chap. 2.
[14] Theia Technologies, Wilsonville, OR. http://www.theiatech.com
[15] J. A. Gohman, “Wide Angle Lens System Having a Distorted Intermediate
Image”, U.S. Patent 7,009,765, 2006.
[16] Theia Technologies SY125A Specification Sheet. Available Online:
http://www.theiatech.com/files/specs/SY125_spec_sheet_web.pdf
[17] E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, “Design and Scaling of
Monocentric Multiscale Imagers”, Applied Optics, Vol. 51, No. 20,
pp. 4691-4702, July 2012.
[18] O. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel Computational Imaging”,
2011 IEEE International Conference on Computational Photography, pp. 1-8,
April 2011.
[19] I. Stamenov, I. P. Agurok, and J. E. Ford, “Optimization of Two-Glass
Monocentric Lenses for Compact Panoramic Imagers: General Aberration
Analysis and Specific Designs”, Applied Optics, Vol. 51, No. 31, pp. 7648-7661,
October 2012.
[20] D. L. Marks, S. S. Hui, J. Kim, and D. J. Brady, “Engineering a Gigapixel
Monocentric Multiscale Camera”, Optical Engineering, Vol. 51, No. 8,
Art. No. 083202, August 2012.
[21] I. Stamenov, A. Arianpour, S. J. Olivas, I. P. Agurok, A. R. Johnson, R. A. Stack,
R. L. Morrison, and J. E. Ford, “Panoramic Monocentric Imaging Using
Fiber-Coupled Focal Planes”, Optics Express, Vol. 22, No. 26, pp. 31708-31721,
December 2014.
[22] J. S. Chahl and M. V. Srinivasan, “Reflective Surfaces for Panoramic Imaging”,
Applied Optics, Vol. 36, No. 31, pp. 8275-8285, November 1997.
[23] R. A. Hicks and R. K. Perline, “Equiresolution Catadioptric Sensors”, Applied
Optics, Vol. 44, No. 29, pp. 6108-6114, October 2005.
[24] W. J. Smith, Modern Lens Design, 2nd Ed., New York: McGraw-Hill, 2005,
p. 402.
[25] Panamorph Lenses, ImmerVision, Montreal, QC.
http://www.immervisionenables.com/panomorph-technology
[26] J. Gauvin, M. Doucet, M. Wang, S. Thibault, and B. Blanc, “Development of New
Family of Wide-Angle Anamorphic Lens with Controlled Distortion
Profile”, Proceedings of SPIE, Vol. 5874, August 2005.
[27] S. Thibault, J. Gauvin, M. Doucet, and M. Wang, “Enhanced Optical Design by
Distortion Control”, Proceedings of SPIE, Vol. 5962, October 2005.
[28] E. Hecht, Optics, 4th Ed., Boston: Addison Wesley, 2001.
[29] W. J. Smith, Modern Optical Engineering, 3rd Ed., New York: McGraw-Hill,
2000, pp. 71-72.
[30] R. E. Fischer, B. Tadic-Galeb, and P. R. Yoder, Optical System Design, 2nd Ed.,
New York: McGraw-Hill, 2008.
[31] G. Wolberg, Digital Image Warping, 1st Ed., Los Alamitos: IEEE Computer
Society Press, 1990, pp. 42-45.
[32] M. Bass, C. DeCusatis, J. Enoch, V. Lakshminarayanan, G. Li, C. MacDonald,
V. Mahajan, and E. Van Stryland, Handbook of Optics, Volume I: Geometrical
and Physical Optics, Polarized Light, Components and Instruments, 3rd Ed.,
New York: McGraw-Hill, 2010, pp. 1.90-1.91.
[33] Z. Zhang, “Flexible Camera Calibration by Viewing a Plane from Unknown
Orientations”, Proceedings of IEEE International Conference on Computer
Vision, pp. 666-673, 1999.
[34] J. Heikkilä and O. Silvén, “A Four-Step Camera Calibration Procedure with
Implicit Image Correction”, Proceedings of IEEE Computer Society Conference
on Computer Vision and Pattern Recognition, pp. 1106-1112, 1997.
[35] J-Y. Bouguet, “Camera Calibration Toolbox for Matlab”, Available online:
http://www.vision.caltech.edu/bouguetj/calib_doc
[36] OmniVision OV10633 720p Wide-Dynamic Range Image Sensor Product
Brief. Available online: http://www.ovt.com
[37] OmniVision Technologies, “OmniVision Adds Powerful Electronic Distortion
Correction Solution to Its Automotive Product Line”, Press Release,
October 2012. Available online: www.ovt.com/news/presskit.php?ID=102
[38] Aptina AP0100 Product Flyer. Available online:
http://www.aptina.com, as of January 2015.
[39] Geo Semiconductor Inc. GW3100 Product Brief. Available online:
http://www.geosemi.com
[40] Leopard Imaging Inc., Milpitas, CA. http://www.leopardimaging.com
[41] J. W. Goodman, “Frequency Analysis of Optical Imaging Systems”, in
Introduction to Fourier Optics, 3rd Ed., Englewood, Colorado: Roberts &
Company Publishers, 2004, Chap. 6.
[42] CODE V® 10.7 Reference Manual, Synopsys Optical Solutions Group,
Pasadena, CA, 2014.
[43] Photography - Electronic Still Picture Cameras - Resolution Measurements,
ISO Standard 12233:2000.
[44] P. Burns, “Slanted-Edge MTF for Digital Camera and Scanner Analysis”,
Proceedings of IS&T Image Processing, Image Quality, Image Capture, Systems
Conference, pp. 135-138, 2000.
[45] P. Burns, “sfrmat3: SFR analysis for digital cameras and scanners”, Available
online: http://losburns.com/imaging/software/SFRedge/index.htm
[46] Wells Research, Lincoln, MA. http://www.wellsresearch.com
[47] A. Goshtasby, Image Registration: Principles, Tools and Methods, 1st Ed.,
London: Springer Science & Business Media, pp. 9-12, 2012.
[48] TriOptics USA, West Covina, CA. http://www.trioptics-usa.com
[49] Ocean Optics, Dunedin, FL. http://www.oceanoptics.com
[50] S. P. Sadoulet, “Optics Testing: MTF Quickly Characterizes the Performance
of Imaging Systems”, Laser Focus World, March 2006.
[51] M. Dahl, J. Heinisch, S. Krey, S. M. Bäumer, J. Lurquin, and L. Chen, “Ultra-Fast
MTF Test for High-Volume Production of CMOS Imaging Cameras”, SPIE
Annual Meeting, Optical Science and Technology, pp. 293-300, August 2003.
Chapter 3
OPTIMAL DESIGN OF WIDE ANGLE COMPUTATIONAL CAMERAS
If a wide angle computational camera is to be designed such that the optical
system produces an image with some degree of barrel distortion and this distortion
is removed digitally through dewarping, how much distortion would be optimal for
the optical system? As we will describe in detail below, our investigation shows that
given an initial wide angle lens design that can be used as a starting point for
optimization, relaxing the constraints on allowable barrel distortion during lens
optimization generally decreases the spot sizes and also decreases the illumination
variation at the image plane, producing a sharper and more uniformly illuminated
image. However, this increase in barrel distortion will lead to reduced resolution at
the corners of the final image, as the corner pixels of the final image will be sampled
from a smaller area of the captured image (as illustrated in Figure 3-1). An optimal
balance of distortion and the other primary and chromatic aberrations can be found
to allow for the design of a hybrid optical/digital system that will generate an
improved final image. Analysis of image simulations from wide angle lenses with
varying degrees of barrel distortion, following software dewarping of the simulated
images, allows the optical designer to quantitatively determine the optimal balance
of distortion and other aberrations in order to yield an optimized hybrid
optical/digital imaging system for each given application.
In this chapter, we describe the general methodology for wide angle lens
aberration optimization in a hybrid optical/digital imaging system. As a specific
example, we chose to focus on a multi-element lens design, with each element defined
by spherical surfaces, as such lens elements are both readily available and relatively
inexpensive to fabricate as compared with aspherical lenses. Designs specific to the
two key application areas of Wearable Visual Aids and Eye-Tracked Extraocular
Cameras are provided in later chapters.
Figure 3-1. (Left) An input grid of lines (black) and its simulated image
(superimposed in red) for a lens design with a ±60° diagonal field of view and
30% barrel distortion at a 60° (half) field angle. Blue arrows show the local
magnitudes and directions for reverse mapping to implement dewarping.
(Right) An input grid of lines (black) and its simulated image (superimposed in
red) for a lens design with a ±60° diagonal field of view and 60% barrel
distortion at a 60° (half) field angle. In the higher distortion lens, corner
regions are more compressed as compared to the lower distortion lens design.
A lens with four optical elements was used as a basis to create highly
optimized designs in CODE V [1], each design having predefined distortion values at
the most extreme field angles. Test images were simulated and then software
dewarped. The resulting final images were analyzed to determine the optimal barrel
distortion for the optical system design in order to meet general system
specifications.
Dewarping was implemented with a custom written Matlab program. Initially,
the distortion profile of a given lens was generated from its computer model in
CODE V, with the built-in Distortion Grid tool [1]. This allows for achieving a
polynomial fit relating the radial distance (from the center of the image) for the
simulated image (rc) to the radial distance for the dewarped image (rd), as explained
in detail in Section 2.4. This polynomial fit was later used to implement dewarping,
in which the pixels of the dewarped image were sampled from the distorted image by
means of reverse mapping with bilinear interpolation.
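A minimal sketch of this reverse-mapping procedure is given below, in which distorted is the simulated (distorted) image and polyCoeffs holds the polynomial fit of the captured radius (rc) as a function of the dewarped radius (rd); the variable names are illustrative, and the sketch is a simplified stand-in for the custom program, not its exact implementation.

    % Dewarping by reverse mapping with bilinear interpolation.
    [h, w, nc] = size(distorted);
    [xd, yd] = meshgrid(1:w, 1:h);            % dewarped (output) pixel grid
    cx = (w + 1) / 2;   cy = (h + 1) / 2;     % image center
    rd = hypot(xd - cx, yd - cy);             % radius in the dewarped image
    rc = polyval(polyCoeffs, rd);             % corresponding captured radius
    s = ones(h, w);   nz = rd > 0;
    s(nz) = rc(nz) ./ rd(nz);                 % per-pixel radial scale factor
    xs = cx + (xd - cx) .* s;                 % source coordinates in the
    ys = cy + (yd - cy) .* s;                 %   captured (distorted) image
    dewarped = zeros(h, w, nc);
    for c = 1:nc
        dewarped(:,:,c) = interp2(double(distorted(:,:,c)), xs, ys, 'linear', 0);
    end
    dewarped = uint8(dewarped);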
3.1 Relative Illumination Considerations
In typical optical imaging systems, the illuminance at the image plane decreases
as the field angle increases. In simple optical systems, this decrease in relative
illumination is proportional to the fourth power of the cosine of the field angle, known
as the “cosine fourth” law of illumination [2]. However, with more complex
multi-element lens designs, additional factors affect off-axis relative illumination. For
example, negative (barrel) distortion generally causes less darkening at the corners
of the image than that given by the cosine fourth law [3].
As with distortion, relative illumination variations can also be corrected
digitally, this time by applying a spatially varying gain. Once the optical system is
built and integrated with the image sensor array, a calibration procedure can be
carried out to determine the required gain coefficient for each pixel to offset the
effects of relative illumination, and also of variations in individual pixel properties.
Typically, the gain coefficient will be unity at the center of the image, and will increase
towards the corners. These gain coefficients for individual pixels can then be stored
in a look-up table, and each pixel value can be multiplied by the corresponding gain
coefficient to reverse the effects of relative illumination. This correction step is
known as “flat-field correction” [4, 5].
A custom flat-field correction program was developed in Matlab. The program
has an initial calibration step, in which it uses a simulated or captured (for the
implemented lenses explained in the later chapters) image of a uniform white target,
and calculates the gain coefficients for the individual pixels. Once the calibration step
is performed for a given lens design, during the correction step the calibrated gain
coefficients are then multiplied with the pixel values of the image undergoing
flat-field correction.
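A minimal sketch of both steps is given below, in which whiteImg is the calibration image of the uniform white target and img is the image undergoing correction; the names are illustrative.

    % Calibration: per-pixel gain coefficients from a uniform white target.
    whiteGray = mean(double(whiteImg), 3);
    whiteGray = max(whiteGray, 1);            % guard against division by zero
    gain = max(whiteGray(:)) ./ whiteGray;    % unity gain at the brightest pixel
    % Correction: multiply each pixel (all color channels) by its gain and clip.
    corrected = uint8(min(bsxfun(@times, double(img), gain), 255));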
As this relative illumination correction consists of multiplying captured pixel
values by predetermined gain coefficients, it will tend to amplify any noise present in
the captured image for gain coefficients greater than unity. Therefore, the optimal
degree of barrel distortion and subsequent digital dewarping can also be analyzed
from the relative illumination point of view in order to minimize the amplification of
image noise.
3.2 Methods
The case study that we have chosen for this chapter is to design an ultra wide
field of view (120° diagonal FOV) lens for a 1/3” optical format (6 mm diagonal)
digital image sensor array with a 4:3 aspect ratio (4.8 mm horizontal × 3.6 mm
vertical). The image sensor array is assumed to have 4 µm square pixels, typical of
image sensor arrays that are optimized for low-light video applications (such as
surveillance and automotive imaging) or wide dynamic range image sensor arrays.
The corresponding total resolution of the image sensor array is 1200 × 900
pixels (1.08 Megapixels).
The system is designed to be very compact; therefore, the total optical length
was constrained to be less than 7 mm, and the entrance pupil diameter was set to
0.8 mm. As explained in detail in Section 3.3 below, the lens design with an optimal
degree of barrel distortion (50% barrel distortion) has an effective focal length of
2.47 mm. With an entrance pupil diameter of 0.8 mm, the lens operates at f/3.1,
which provides a balance between reduction of optical aberrations and high light
throughput. In order to accommodate the cover glass of the image sensor array,
the back focal length was constrained to be greater than 0.5 mm. The system operates
in the visible spectrum and three sample wavelengths red (630 nm), green (540 nm),
and blue (470 nm) were used in optical system optimization.
As described in Chapter 2, the vertical field of view (vFOV, 92.2° in this case)
can be kept constant during dewarping. Therefore, the chief ray for half-vFOV (46.1°)
was constrained to hit the image plane at an image height of exactly 1.8 mm (half the
vertical size of the image sensor array).
Figure 3-2. Original Roosinov lens form, representing a wide field of view,
compact lens design; from U.S. Patent US2516724 [6].
Within these general constraints, multiple lenses were designed using CODE V,
each design having an exact constraint on the distortion value for the 60° half
(diagonal) field angle. Considering the field of view and cost requirements, the
starting lens design was inspired by a Roosinov lens. The Roosinov six-element
lens has a nearly symmetric structure with a positive central group and negative
meniscus lenses at the ends, as shown in Figure 3-2. It can provide wide field of view,
and has a compact volume [6, 7]. In order to reduce system complexity and
associated costs, the central doublets were replaced with singlets. This lens was
modified to match the stated constraints of the given system.
CODE V has an integrated global optimization tool called Global Synthesis [1].
Starting with a lens form that meets system constraints, Global Synthesis can be used
to automatically generate new lens forms that still meet the system constraints but
could potentially have reduced aberrations. It can also be used to confirm that a
selected lens form is optimal in the global lens solution space. For designs meeting
the above stated system constraints, the solution space was explored with this Global
Synthesis tool in CODE V, and the starting lens form was indeed confirmed to be the
optimal solution. The optimal lens system for the given case was determined to be a
roughly symmetrical structure with a set of two positive lenses positioned around a
central aperture stop, and with negative meniscus lenses on the outside as shown in
Figure 3-3, similar to the simplified Roosinov lens starting point. With this general
structure, the outer negative elements reduce the large field angles to narrower field
angles for the central positive elements. This general design was then highly
optimized for each case described below. The lens materials were selected using
CODE V's built-in Glass Expert tool [1]. Glass Expert is an optimization tool that
allows for the selection of optimal lens materials for a given constrained lens design.
Figure 3-3. (Left) Schematic diagram of a wide angle lens optimized to have 5%
barrel distortion at a 60° (diagonal) field angle. (Center) Schematic diagram of
a wide angle lens optimized to have 30% barrel distortion at a 60° field angle.
(Right) Schematic diagram of a wide angle lens optimized to have 45% barrel
distortion at a 60° field angle. The last two elements were merged to form a
doublet as indicated by successive optimization cycles. The reference
wavelength for the diagrams is 540 nm.
Image simulation of a high resolution test image (4800 × 3600 pixels),
comprising an array of 25 ISO 12233 resolution targets (Figure 3-4, Top Left), was
then performed in CODE V with the built-in Image Simulation (IMS) tool for each
individual lens design. Each color channel of the input image was simulated with the
corresponding wavelength in the lens, namely 630 nm for the red channel, 540 nm
for the green channel, and 470 nm for the blue channel. For this simulation, the point
spread function (PSF) of the system was calculated at 432 sample points on the image
plane (in a 24-by-18 sampling grid), and the local PSF was convolved with the
corresponding points of the test image. The image simulation output was
downsampled in Matlab at the selected image sensor resolution (1200 × 900 pixels).
The pixel values for the downsampled image were achieved by averaging the pixel
values in the corresponding 4 × 4 pixel regions of the high-resolution simulation
output image.
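This block-averaging downsample can be expressed in one line in Matlab (simOut being an illustrative name for the 4800 × 3600 simulation output; imresize is from the Image Processing Toolbox).

    % 4 x 4 block averaging down to the 1200 x 900 sensor resolution; the
    % 'box' kernel of imresize averages each 4 x 4 block of samples.
    sensorImg = imresize(simOut, 1/4, 'box');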
The simulation output for each lens design was then corrected for barrel
distortion using parameters extracted from the lens model. In this case, the pixels of
the dewarped image were sampled from the distorted image by means of reverse
mapping with bilinear interpolation.
This analysis forms the basis for the design of a compact, wide angle scene
camera for both the eye-tracked extraocular camera and wearable visual aid
applications. As opposed to the implemented lens designs presented in Chapters 4
and 5, the optimal lens design presented in this chapter has custom lens elements.
Small diameter negative meniscus lens elements similar to the outer elements of the
designed lens could not be found in lens catalogs. Therefore, the designed lens could
not be converted to a similar design with COTS elements, an approach that would
have allowed for quick and low-cost demonstration.
Figure 3-4. (Top Left) Input high resolution test image array, comprising a
5 × 5 array of ISO 12233 test targets. (Top Right) Image simulated through the
lens with 40% barrel distortion at full field (±60° diagonal FOV). The reference
wavelengths for the image simulation are 630 nm, 540 nm, and 470 nm.
(Bottom Left) The simulated image dewarped with distortion parameters
extracted from the lens model with the “Distortion Grid” tool in CODE V.
(Bottom Right) The simulated image dewarped and flat-field corrected.
3.3 Results
In order to quantitatively evaluate the performance of each lens design, the
correlation coefficient of the grayscale version of each simulated image with respect
to the grayscale input image (also downsampled in Matlab with averaging to
1200 × 900 pixels to match the sampling resolution at the image plane) was
calculated (with results as shown in Figure 3-5). The grayscale images are obtained
through averaging of pixel values (with equal weights) from the individual color
channels (red, green, and blue) of the color images. As explained above, the system
operates in the visible spectrum and three sample wavelengths red (630 nm), green
(540 nm), and blue (470 nm) were used in lens optimization, as well as in image
simulations.
Figure 3-5. The correlation coefficients for a set of lens designs with different
degrees of barrel distortion at full field (±60° dFOV). Curves are shown for
dewarped images, for dewarped and flat-field corrected images, and for both
cases with 10 dB PSNR noise added. Note the offset vertical axis, spanning
correlation coefficient values from 0.3 to 1.0.
These results are discussed in detail below. The effect of relative illumination
(as shown in Figure 3-6) can be reversed by flat-field correction of the dewarped
images. However, as described above, this flat-field correction will to a certain extent
amplify the noise in the final image as pixel values are multiplied by predetermined
gain coefficients that are greater than 1 in value. This effect was analyzed by adding
(separately for each individual color channel) 10 dB peak-signal-to-noise ratio
(PSNR) artificial Gaussian white noise to the simulated images before the dewarping
and flat-field correction steps (with results as shown in Figure 3-7). This allows for
the analysis of the designed imaging systems at high image sensor noise conditions
such as low ambient light settings. The PSNR is defined in Equations 3-1 and 3-2
below [8].
PSNR = 10 \log_{10}\!\left(\frac{R^{2}}{MSE}\right)    (3-1)

MSE = \frac{\sum_{m}\sum_{n}(A_{mn} - B_{mn})^{2}}{m \times n}    (3-2)
in which R is the range of the pixel values for each image. For 8-bit grayscale images,
R is equal to 255. The mean square error (MSE) is derived between image B, a
noise-free grayscale image with a resolution of 𝑚 × 𝑛 pixels, and image A, a grayscale
image also with a resolution of 𝑚 × 𝑛 pixels that has been degraded by noise. Given R,
the value of the MSE can be determined from Equation 3-1 for a given PSNR value,
which then provides a constraint on the level of random noise added to the noise-free
image to yield image A from image B through Equation 3-2.
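The noise addition can be sketched in Matlab as follows (img is an illustrative name for an 8-bit image); note that clipping the result to [0, 255] makes the realized PSNR slightly higher than the 10 dB target.

    % Add zero-mean Gaussian noise at a target PSNR (Eqs. 3-1 and 3-2),
    % applied separately to each 8-bit color channel.
    targetPSNR = 10;                            % dB
    R = 255;                                    % 8-bit pixel value range
    sigma = sqrt(R^2 / 10^(targetPSNR / 10));   % Eq. 3-1 solved for sqrt(MSE)
    noisy = double(img) + sigma * randn(size(img));
    noisy = uint8(min(max(noisy, 0), 255));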
For the given optical system design under consideration, 40% to 55%
distortion at full field produces the best images following dewarping alone on
noise-free images, as shown in Figure 3-5. However, after flat-field correction, the
image quality variation for designs with different degrees of barrel distortion is
minimal, with a slight decrease evident only at the lowest and the highest distortion
values.
Figure 3-6. Variation of illumination as a function of the field angle for the
dewarped images, from eight different wide angle lens designs, each with a
different degree of barrel distortion at full field (±60° dFOV): 10%, 20%, 30%,
35%, 40%, 45%, 50%, and 60%.
Adding 10 dB PSNR noise to the simulated images shows that, even with
flat-field correction, 40% to 55% distortion at full field will still be optimal for the
given design, considering operation under different levels of ambient illumination
and therefore different levels of noise. For this series of wide angle lens designs, the
corners of the low distortion designs will be highly vignetted, and therefore noise will
be amplified in these corner regions with flat-field correction. For the high distortion
designs, the offsetting decrease in illumination variation is quite significant, resulting
in less noise amplification with flat-field correction.
Figure 3-7. (Top Left) Image simulated through the lens with 5% barrel
distortion at full field, with 10 dB PSNR noise added. The reference wavelengths
for the image simulation are 630 nm, 540 nm, and 470 nm. (Center Left) The
same image as on the top left following dewarping. (Bottom Left) The same
image as on the top left following dewarping and flat-field correction.
(Top Right) Image simulated through the lens with 50% barrel distortion at full
field and 10 dB PSNR noise added. The reference wavelengths for the image
simulation are 630 nm, 540 nm, and 470 nm. (Center Right) The same image
as on the top right following dewarping. (Bottom Right) The same image as on
the top right following dewarping and flat-field correction.
Figure 3-8. Correlation coefficient variation as a function of field angle for lens
designs with different degrees of barrel distortion at full field (±60° diagonal
FOV), without flat-field correction. Correlation coefficients corresponding to
the same field angles are averaged, and the resulting values are plotted in the
graph. Curves correspond to designs with 10%, 20%, 30%, 40%, 45%, 50%, 55%,
and 60% barrel distortion. Note the offset vertical axis, spanning correlation
coefficient values from 0.5 to 1.0.
In order to evaluate the variation of quality in the various regions of the final
images for the lens designs with different degrees of barrel distortion, the correlation
coefficient of each repeating small patch of the final image (obtained through image
simulation, downsampling to image sensor array resolution, dewarping, and
grayscale conversion) with respect to the same region in the input image (after
downsampling to image sensor array resolution and grayscale conversion) was
calculated, with results as shown in Figure 3-8. In this figure, the change in
correlation coefficient as a function of the position on the final image is shown.
However, instead of the location on the image plane, the corresponding object space
field angle is plotted on the x-axis. Due to symmetry in the simulation input image
and the rotational symmetry in the designed optical systems, correlation coefficients
obtained from individual patches corresponding to the same object space field angles
(in magnitude), are averaged. Different color curves correspond to lens designs with
different degrees of barrel distortion. The final image obtained from the lens design
with 50% barrel distortion (as shown with the dashed blue line in the figure) has the
highest correlation coefficient up to about ±50° diagonal field angle. This analysis
confirms that 40% to 50% distortion produces overall better images at all regions of
the image. However, image quality starts decreasing towards the corners of the
image with 55% barrel distortion.
Figure 3-9. Correlation coefficient variation as a function of field angle for lens
designs with different degrees of barrel distortion at full field (±60° diagonal
FOV), after flat-field correction. Correlation coefficients corresponding to the
same field angles are averaged, and the resulting values are plotted in the graph.
Curves correspond to designs with 10%, 20%, 30%, 40%, 45%, 50%, and 55%
barrel distortion. Note the offset vertical axis, spanning correlation coefficient
values from 0.75 to 1.0.
If individual final images (obtained through image simulation, downsampling
to image sensor array resolution, dewarping, and grayscale conversion) are flat-field
corrected, and the correlation coefficient for each small patch is again calculated (as
shown in Figure 3-9), the decrease in correlation coefficient for large field angles is
not as significant compared to the images that are not flat-field corrected (as shown
in Figure 3-8) for designs with low distortion. Even though the correlation
coefficients for different designs in the center regions of the images (0° field angle)
have more variation compared to the corner regions (except for the design with 55%
barrel distortion), the variation is still not that significant (0.91 correlation coefficient
at the center for the design with 10% distortion at full field, as shown by the solid
blue curve, as compared to 0.98 correlation coefficient at the center for the design
with 50% distortion at full field, as shown by the dashed blue curve). However, visual
inspection of simulated, dewarped, and flat-field corrected images (as shown in
Figure 3-10) shows that there is significant improvement in resolution in the center regions
of images obtained from lens designs with higher degrees of distortion. In order to
analyze this effect, images containing edge patterns were simulated rather than
relying solely on the correlation coefficient as an evaluation metric. Spatial frequency
response (SFR) analysis on different regions of these images was performed after
dewarping and flat-field correction of the images.
Figure 3-10. (Top Left) Image simulated through the lens design with 10%
barrel distortion at full field following dewarping and flat-field correction. The
reference wavelengths for the image simulation are 630 nm, 540 nm, and
470 nm. (Bottom Left) Center patch of the image shown on the top left.
(Top Right) Image simulated through the lens design with 50% barrel
distortion at full field following dewarping and flat-field correction. The
reference wavelengths for the image simulation are 630 nm, 540 nm, and
470 nm. (Bottom Right) Center patch of the image shown on the top right.
Higher resolution in the central patch of the 50% barrel distortion design is
readily apparent.
Figure 3-11 shows a plot of spatial frequency at a modulation of 0.5 for lens
designs with different degrees of barrel distortion, as calculated for different regions
of the dewarped and flat-field corrected images. This analysis shows that by allowing
higher degrees of distortion, the resolution in the center of the images can be greatly
improved, with slightly enhanced resolution in the corner regions as well.
Figure 3-11 confirms that 40% to 50% barrel distortion is optimal for the given
design with the given constraints. The design with 50% barrel distortion has
considerably higher resolution in the center and also in the top and bottom side
regions compared to the lower distortion designs, and the resolution in the left and
right side and the corner regions is slightly enhanced relative to those of the lower
distortion designs. Therefore, this design can be considered optimal for the given
constraints.
Figure 3-11. Spatial frequency corresponding to a modulation of 0.5 for lens
designs with different degrees of barrel distortion at full field (±60° dFOV),
derived from simulated images after dewarping and flat-field correction. Curves
correspond to the center, the top and bottom side, the left and right side, and
the corner regions of the dewarped image.
Figure 3-12. Schematic diagram of a wide angle lens that was optimized to have
50% barrel distortion at full field (±60° diagonal FOV). The last two elements
were merged to form a doublet as indicated by successive optimization cycles.
This lens design was determined to be optimal for the given constraints.
The lens design with 50% barrel distortion at a 60° field angle, shown in
Figure 3-12, has an effective focal length of 2.47 mm and operates at f/3.1. The first
lens element is fabricated from N-FK51A, a fluorite crown glass with an index of
refraction of 1.487 at λ = 587.6 nm. The second lens element is fabricated from
N-LASF46B, a lanthanum dense flint glass with an index of refraction of 1.904 at
λ = 587.6 nm. The doublet lens element is fabricated from N-LASF40, a lanthanum
dense flint glass with an index of refraction of 1.834 at λ = 587.6 nm, and N-SF66, a
dense flint glass with an index of refraction of 1.923 at λ = 587.6 nm [9]. The
specifications for the lens are listed in Table 3-1.
Table 3-1. Specifications of the custom four lens wide angle optical system. All
dimensions are in mm. The lens has 50% barrel distortion at a 60° field angle.
The thickness parameter is the distance between the surface indicated and the
following surface.
Optical Element    Radius of Curvature    Thickness    Glass Material        Semi-Diameter
First Lens         −8.812                 0.598        N-FK51A-SCHOTT        2.0
                   1.309                  0.301                              2.0
Second Lens        2.178                  1.344        N-LASF46B-SCHOTT      1.1
                   4.24                   0.277                              1.1
Aperture Stop      Infinity               0.100                              0.477
Doublet Lens       10.044                 0.627        N-LASF40-SCHOTT       1.2
                   −0.763                 0.583        N-SF66-SCHOTT         1.2
                   −1.865                 3.172                              1.2
Image Plane        Infinity                                                  2.13
3.4 Summary
Most wide angle lenses require a dewarping step to achieve a rectilinear
image. A novel method for analyzing the optimal degree of barrel distortion for a
given lens design was developed. A general starting point test case was optimized
multiple times to generate lens designs with different degrees of barrel distortion at
full field. Through analysis of simulated and dewarped images with each design, the
design that yields the best dewarped final image (with or without flat-field correction,
and also with or without artificial image sensor noise addition) was determined.
Different analysis methods, such as the correlation coefficient with respect to the
input image and analysis of the variation of the image quality at different regions of
the image, were employed for dewarped images with or without flat-field correction
(and also with or without artificial image sensor noise addition). The results show
that designs with an initial 40% to 50% barrel distortion yield the best images after
dewarping. For these particular designs, the output images have high resolution in
the central field of view, and the resolution decreases for increasing field of view.
This method of determining optimal distortion for wide angle computational
cameras combines image simulation in optical design software with image dewarping
and flat-field correction in software post-processing, and includes careful analysis of
the final images. As has been demonstrated by the wide angle lens design described
in this chapter, this method is very powerful, and gives optical designers useful
degrees of freedom in an extended design space. In this chapter, miniaturization and
cost constraints (for example, the incorporation of only spherical surfaces) provided
the limiting factors on the optical system, and the overall imaging performance was
improved by designing the optical system with an optimal amount of distortion and
performing corrections in the digital domain. This approach can be generalized to
other wide angle systems with different system constraints.
Chapter 3 References
[1] CODE V® 10.7 Reference Manual, Synopsys Optical Solutions Group,
Pasadena, CA, 2014.
[2] M. Reiss, “The cos⁴ Law of Illumination”, Journal of the Optical Society of
America, Vol. 35, No. 4, pp. 283-288, 1945.
[3] V. N. Mahajan, Optical Imaging and Aberrations: Part 1. Ray Geometrical
Optics, Bellingham, WA: SPIE, 1998, Chap. 2.
[4] R. E. Fischer, B. Tadic-Galeb, and P. R. Yoder, Optical System Design, 2nd Ed.,
New York: McGraw-Hill, 2008, p. 696.
[5] J. A. Seibert, J. M. Boone, and K. K. Lindfors, “Flat-Field Correction Technique
for Digital Detectors”, Proceedings of SPIE: Medical Imaging, pp. 348-354,
July 1998.
[6] M. M. Roosinov, “Wide Angle Orthoscopic Anastigmatic Photographic
Objective”, U.S. Patent US2516724, 1950.
[7] R. Kingslake, A History of the Photographic Lens, 1st Ed., Boston: Academic
Press, 1989, Chap. 10.
[8] Q. Huynh-Thu and M. Ghanbari, “Scope of Validity of PSNR in Image/Video
Quality Assessment”, Electronics Letters, Vol. 44, No. 13, pp. 800-801, 2008.
[9] Schott Inc. Optical Glass Data Sheets.
Available online: http://www.schott.com
Chapter 4
MINIATURE WIDE FIELD OF VIEW LENS DESIGN AND
OPTIMIZATION OF LATERAL CHROMATIC ABERRATION
Prior to the invention of photography, optical design focused mainly on
narrow field of view systems like telescopes and microscopes. The first photographic
lens was designed by the English scientist W. H. Wollaston in 1812, interestingly
about 25 years before the invention of photography itself [1]. This lens was a single
meniscus lens with the convex side facing the image plane and an aperture stop
placed in front. It was capable of having a relatively wide field of view for the time,
45° at f/11 or f/16 [2].
4.1 Single Meniscus Lens Design
The simplest camera lens is a singlet lens with spherical surfaces, and as was
first shown by Wollaston, such a simple lens can provide a wide field of view.
Therefore, the starting point for the design of a wide field of view, miniature lens was
taken to be precisely this Wollaston lens. Wollaston’s original single meniscus design
was modified to cover a 100° dewarped diagonal field of view (dFOV) over a 4.8 mm
× 3.6 mm image sensor array with 3.75 µm × 3.75 µm pixels (1280 × 960 resolution),
and then optimized in CODE V. The final design has an effective focal length of 3.1 mm
and is f/3.1. The lens is fabricated from S-LAH59, a high index glass from Ohara Corp.
with an index of refraction of 1.816 at λ = 587.6 nm [3]. The total distance from the
aperture stop to the image plane is 5.1 mm. The lens specifications are listed in Table
4-1. As expected, a single lens with two spherical surfaces and a single glass material
does not provide enough degrees of freedom to minimize aberrations over a wide
field of view at such a low f-number (Figure 4-1). However, this single lens design
yields an extremely compact optical system with a wide field of view at moderate
resolution and with moderate image brightness uniformity.
Figure 4-1. (Top Left) Schematic diagram of a single meniscus, wide angle lens
design, based on the Wollaston lens. The reference wavelength for the diagram
is 540 nm. (Top Right) Image simulation result through the lens shown on the
left, after dewarping. The image simulation was performed with the individual
color channel wavelength weights listed in Table 4-3. The image is not flat-field
corrected to show the variation of illumination at the image plane. (Bottom)
The same image as on the top right after flat-field correction.
Table 4-1. Specifications of the designed custom single meniscus lens, as shown
in Figure 4-1. All dimensions are in mm. The thickness parameter is the
distance between the surface indicated and the following surface.
Optical Element    Radius of Curvature    Thickness    Glass Material    Semi-Diameter
Aperture Stop      Infinity               0.1                            0.5
Meniscus Lens      −8.886                 1.925        S-LAH59-OHARA     1.8
                   −2.172                 3.083                          1.8
Image Plane        Infinity                                              2.5
4.2 Two Lens Design
In order to improve the resolution of the single meniscus lens design, a second
spherical lens was added. The goal is to have a highly compact system, and therefore
the best location to place the second lens is after the first lens in order to keep the
overall system length short. The optimal shape for this second lens was determined
(through simulation and optimization) to again be a positive meniscus. Both lenses
are fabricated from Schott N-PSK53A, a dense phosphate crown glass with an index
of refraction of 1.618 at λ = 587.6 nm [4]. The optimal lens materials were selected
using CODE V's built-in Glass Expert tool (which resulted in the same lens material
for both lenses). The design of this two meniscus lens system with a front aperture
stop and with the convex surfaces of both lenses facing each other at the center
(Figure 4-2) was optimized to balance and minimize aberrations.
The final design has an effective focal length of 3.21 mm and is f/3.21. The
total distance from the aperture stop to the image plane is only 7 mm. The lens
specifications are listed in Table 4-2. This lens design has much better overall
aberration correction as compared to the single lens design (apparent in the central
region of Figure 4-2, Bottom), although an increase in lateral chromatic aberration is
evident near the periphery of the image plane (Figure 4-2, Top Right).
Figure 4-2. (Top Left) Schematic diagram of a two lens, wide angle lens design.
The reference wavelength for the diagram is 540 nm. (Top Right) Image
simulation result through the lens shown on the left, after dewarping. The
image simulation was performed with the individual color channel wavelength
weights listed in Table 4-3. The image is not flat-field corrected to show the
variation of illumination at the image plane. (Bottom) The same image as on
the top right after flat-field correction.
Table 4-2. Specifications of the designed custom wide field of view lens with
two meniscus elements, as shown in Figure 4-2. All dimensions are in mm. The
thickness parameter is the distance between the surface indicated and the
following surface.
Optical Element    Radius of Curvature    Thickness    Glass Material     Semi-Diameter
Aperture Stop      Infinity               0.133                           0.5
First Lens         −3.357                 1.905        N-PSK53A-SCHOTT    2.0
                   −2.161                 0.533                           2.0
Second Lens        3.864                  2.998        N-PSK53A-SCHOTT    3.0
                   8.815                  1.430                           3.0
Image Plane        Infinity                                               2.3
Prototype manufacturing and assembly of custom optical elements is
expensive. In order to evaluate a prototype system with similar performance that is
less expensive to produce, a lens system similar to the design shown in Figure 4-2 but
with closely matching commercially available off-the-shelf (COTS) lens elements can
be designed. COTS miniature (≤ 6 mm diameter) lens elements are usually plano
spherical lenses (plano convex or plano concave). Therefore, the two meniscus lens
design shown in Figure 4-2 was converted to a four lens design by just converting the
meniscus elements into plano convex and plano concave elements with the planar
sides in contact. Although this does not increase the number of spherical surfaces,
which would yield an extended design space, it provides the opportunity to use
different glasses for correction of chromatic aberrations. Before describing this four
lens wide angle lens design as well as its associated COTS design in detail, the
procedure employed for correcting lateral chromatic aberration is presented.
4.3 Lateral Chromatic Aberration Correction
Wide angle lenses typically suffer from lateral chromatic aberration (LCA) at
large field angles [5], as can be seen for example in the peripheral areas of the
simulated image shown in Figure 4-2. In the case of LCA, the focal point is shifted
laterally in the off-axis field for different wavelengths, causing a smearing of the
colors in high contrast areas as shown schematically in Figure 4-3. The traditional
optical method for correcting chromatic aberration is to incorporate both positive
and negative lens elements with different dispersion characteristics to minimize
chromatic aberration effects at the image plane. LCA is typically the hardest
aberration to correct in wide angle lenses [6].
Figure 4-3. Lateral Chromatic Aberration (LCA) in an optical system will cause
the image of a white point (Left) to separate into different colors (Center) at the
image plane. Each color channel of the captured digital image (red, green, and
blue) can be realigned digitally in software to minimize LCA (Right).
An alternative method is to correct the LCA in software post-processing [7].
This represents an added correction step for the overall optical imaging system;
however, it has the potential to both simplify the optical design and to provide the
designer with more freedom to correct for non-chromatic aberrations as well. In the
case of a digitally corrected wide angle imaging system, the captured image is
dewarped via a look-up table, as described previously. For the case of LCA, rather
than having a single look-up table, the red, green, and blue color channels can be
characterized and dewarped separately with different look-up tables to correct for
lateral chromatic aberration in software post-processing.
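A minimal sketch of this per-channel correction is given below; dewarpChannel stands for the single-channel reverse-mapping routine sketched in Chapter 3, and polyR, polyG, and polyB are hypothetical per-channel polynomial fits of the captured radius as a function of the dewarped radius (all names are illustrative).

    % Per-channel dewarping to correct LCA: each color channel uses its own
    % radial distortion fit instead of a single shared look-up table.
    polys = {polyR, polyG, polyB};
    corrected = zeros(size(img), 'uint8');
    for c = 1:3
        corrected(:,:,c) = dewarpChannel(img(:,:,c), polys{c});
    end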
Figure 4-4. Spectral response of the three different color filters of a
commercially available color image sensor array, the Aptina MT9M034;
from [8]. The cutoff frequency of an IR-cut filter that is typically employed in
cameras is shown with the dashed line.
Different wavelengths of light will typically experience different degrees of
aberrations, and since the individual color filters used in the image sensor array cover
a wide range of wavelengths to maximize light throughput, it is difficult to obtain
perfect reconstruction of the input after realignment in software. The design of a lens
system with digital correction of lateral chromatic aberration should therefore be
optimized considering a realistic range of wavelengths for the individual color
channels, corresponding to the spectra of the color filters on the image sensor array.
Optimization and subsequent analysis of the performance of this lens should also
account for the spectra of each of the color channels.
Table 4-3. Wavelength weights for the three color channels extracted from the
plot shown in Figure 4-4. Wavelength weights of <10% were set equal to zero,
as they typically have a minimal effect on the optical system design and
analysis.
Wavelength (nm)   Blue Weight   Green Weight   Red Weight
400               27            0              0
420               39            0              0
440               47            0              0
460               51            0              0
480               46            0              0
500               31            16             0
520               15            44             0
540               0             62             0
560               0             57             0
580               0             45             26
600               0             29             57
620               0             12             53
640               0             0              48
660               0             0              42
An analysis of the spectral response characteristics of Bayer color filters (a
Bayer color filter typically consists of one red filter, two green filters and one blue
filter, organized in a rectangular grid) from several companies [9] shows that
although the quantum efficiencies might be different, the cut-off wavelength
characteristics for specific color filters are usually similar. A typical spectral curve is
shown in Figure 4-4. Wavelength weights extracted from this curve (listed in Table
4-3) were used in image simulation analyses of the lenses presented in this chapter.
Wavelength weights corresponding to quantum efficiencies of less than 10% were
not included, as they had minimal effect on the analyses. The optical system is
assumed to include an IR-cut filter with a cut-off wavelength at 660 nm, typical for
cameras designed for the visible spectrum.
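The weight-extraction step itself is straightforward; the following Matlab sketch applies the 10% threshold to the sampled green channel values of Table 4-3 (the variable names are illustrative, and the resulting wavelength/weight pairs would be entered into the optical design software):

    % Per-channel wavelength weights from a sampled spectral response curve.
    wl = 400:20:660;                                  % wavelengths, in nm
    qeGreen = [0 0 0 0 0 16 44 62 57 45 29 12 0 0];   % green channel, Table 4-3
    qeGreen(qeGreen < 10) = 0;                        % drop weights below 10%
    weightsGreen = [wl(:), qeGreen(:)];               % wavelength/weight pairs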
4.4 Wide Angle Lens Design with Pixel-Limited Lateral Chromatic Aberration
In what follows, we first consider the optimization of the four lens design
described above with the goal of minimizing lateral chromatic aberration optically
without correction in software post-processing. Then we consider the
re-optimization of the lens, allowing lateral chromatic aberration in the optical
system design and correcting for it with software post-processing. As we will see, the
latter approach yields a computational camera design with overall improved
performance.
After converting the meniscus elements to plano spherical elements, the lens
design was optimized separately with two different design approaches as described
above. The first approach imposes the design constraint that the LCA be less than
the pixel size (3.75 µm) across the field of view over the visible spectrum,
yielding a pixel-limited LCA design. Lens glass materials were optimally selected to meet this
constraint using CODE V's built-in Glass Expert tool. The first and fourth lenses are
fabricated from N-SF66, a dense flint glass with an index of refraction of 1.923 at
λ = 587.6 nm. The second and third lenses are fabricated from N-LAK33B, a
lanthanum crown glass with an index of refraction of 1.755 at λ = 587.6 nm [4].
The final optical system design that meets the pixel-limited lateral chromatic
aberration design constraint is shown in Figure 4-5 together with the corresponding
spot diagram, and the image simulation through this lens is shown in Figure 4-6. The
lens specifications are listed in Table 4-4. As with all of the image simulations shown
in this chapter, this image simulation was performed with individual color channel
wavelength weights extracted from the spectral graph shown in Figure 4-4 and listed
in Table 4-3. As expected, there is no observable lateral color in this simulated image.
This image was then dewarped using a single look-up table (in Matlab, as explained
in detail in Chapter 3), and the result is shown in Figure 4-6, Bottom. The
flat-field corrected image is shown in Figure 4-7 below.
Figure 4-5. (Left) Schematic diagram of a four lens, wide angle optical system
design that was optimized to have pixel-limited LCA across the visible spectrum.
The reference wavelength for the diagram is 540 nm. (Right) Spot diagram for
the lens shown on the left, for five field angles and with three wavelengths (red:
630 nm; green: 540 nm; and blue: 470 nm).
Figure 4-6. (Top) Image simulation result through the optical system with
pixel-limited LCA shown in Figure 4-5, Left. The image simulation was
performed with the individual color channel wavelength weights listed in Table
4-3. (Bottom) The same image as on the top following software dewarping with
a single LUT (no digital chromatic aberration correction). The image is not
flat-field corrected to show the variation of illumination at the image plane.
Table 4-4. Specifications of the four lens, wide angle optical system design that
was optimized to have pixel-limited LCA across the visible spectrum. All
dimensions are in mm. The thickness parameter is the distance between the
surface indicated and the following surface.
Optical Element   Radius of Curvature   Thickness   Glass Material    Semi-Diameter
Aperture Stop     Infinity              0.169                         0.5
First Lens        −4.156                1.626       N-SF66-SCHOTT     2.0
                  Infinity              0.0025                        2.0
Second Lens       Infinity              1.044       N-LAK33B-SCHOTT   2.0
                  −2.411                0.386                         2.0
Third Lens        3.854                 1.474       N-LAK33B-SCHOTT   3.0
                  Infinity              0.0025                        3.0
Fourth Lens       Infinity              0.5         N-SF66-SCHOTT     3.0
                  5.874                 2.269                         3.0
Image Plane       Infinity                                            2.33
Figure 4-7. The same image as in Figure 4-6, Bottom after flat-field correction.
4.5 Wide Angle Lens Design with Unconstrained Lateral Chromatic Aberration
As a second approach to correcting for lateral chromatic aberration, the same
four lens initial design was also optimized without the pixel-limited LCA constraint,
with LCA correction applied digitally following image capture. As in the previous
case, the red, green, and blue color channels each had optimization weights extracted
from the spectral graph shown in Figure 4-4, and listed in Table 4-3. This allows for
the spot sizes for each color channel to be minimized, but does not impose the
constraint that spots corresponding to different color channels for a given field angle
should be located at the same point on the focal plane, resulting therefore in an
unconstrained LCA design.
The final design for this system is shown in Figure 4-8, and the image
simulation result through this lens is shown in Figure 4-9. The polychromatic RMS
spot sizes for large field angles are increased in this case relative to those obtained in
the previous case (Figure 4-5) due to the LCA. The lens specifications are listed in
Table 4-5. The monochromatic RMS spot sizes at three wavelengths for both the
pixel-limited and unconstrained LCA designs are listed in Table 4-6, and a comparison
shows a significant reduction in the monochromatic spot sizes with the
unconstrained LCA design. A separate look-up table was generated for each
individual color channel using CODE V’s built-in Distortion Grid tool and a custom
Matlab program, as explained before. The simulated image (shown in Figure 4-9,
Top) was dewarped using these three separate look-up tables (LUTs) for the red,
green, and blue color channels, and the result is shown in Figure 4-10. For
comparison, the same image dewarped with a single LUT (the LUT for the green
channel) is also shown in Figure 4-9 (Bottom).
Figure 4-8. (Left) Schematic diagram of a four lens, wide angle lens design
optimized with the individual color channel wavelength weights listed in
Table 4-3, with unconstrained LCA. The reference wavelength for the diagram
is 540 nm. (Right) Spot diagram for the lens shown on the left, for five field
angles and with three wavelengths (red: 630 nm; green: 540 nm; and blue:
470 nm). The RMS spot diameters are given in µm.
Table 4-5. Specifications of the four lens, wide angle optical system design
that was optimized with the RGB weights extracted from the spectral graph
shown in Figure 4-4, and listed in Table 4-3, with unconstrained LCA. All
dimensions are in mm. The thickness parameter is the distance between the
surface indicated and the following surface.
Optical Element   Radius of Curvature   Thickness   Glass Material    Semi-Diameter
Aperture Stop     Infinity              0.189                         0.5
First Lens        −4.0                  1.789       N-SF57-SCHOTT     2.0
                  Infinity              0.0025                       2.0
Second Lens       Infinity              1.035       N-LASF44-SCHOTT   2.0
                  −2.760                0.001                         2.0
Third Lens        4.470                 2.624       N-LAK33B-SCHOTT   3.0
                  Infinity              0.0025                        3.0
Fourth Lens       Infinity              0.5         N-SF66-SCHOTT     3.0
                  7.058                 1.857                         3.0
Image Plane       Infinity                                            2.36
Table 4-6. Comparison of monochromatic RMS spot sizes for the
pixel-limited LCA lens design and the unconstrained LCA lens design. The
unconstrained LCA lens design has smaller monochromatic RMS spot sizes.
                  Blue (470 nm)            Green (540 nm)           Red (630 nm)
Field Angle (°)   Pixel-Lim.   Unconst.    Pixel-Lim.   Unconst.    Pixel-Lim.   Unconst.
                  LCA (µm)     LCA (µm)    LCA (µm)     LCA (µm)    LCA (µm)     LCA (µm)
0                 14.6         10.2        13.6         9.8         13.3         9.5
15                14.3         10.6        13.8         10.1        13.9         9.8
30                15.3         12.5        15.8         12.1        16.4         11.4
45                13.4         11.7        15.1         11.4        16.5         10.8
50                12.7         10.3        14.7         10.1        16.3         9.7
Figure 4-9. (Top) Image simulation result through the lens with unconstrained
LCA shown in Figure 4-8. The image simulation was performed with the
individual color channel wavelength weights listed in Table 4-3. (Bottom) The
same image as on the top after dewarping with a single LUT. The image is not
flat-field corrected to show the variation of illumination at the image plane.
Figure 4-10. (Top) The same image shown in Figure 4-9, Top after dewarping
with 3 separate LUTs, one for each color channel. The image is not flat-field
corrected to show the variation of illumination at the image plane.
(Bottom) The same image as on the top after flat-field correction.
As can be seen by comparing Figures 4-7 and 4-10, Bottom, a significant
improvement in overall image quality has been achieved by first relaxing the optical
system design constraint on lateral chromatic aberration, and then dewarping with
three separate look-up tables, one for each principal color channel. A quantitative
analysis of the imaging performance is provided in the following section.
4.6 Analysis of Results
An overall metric for measuring the performance of an optical system is the
modulation of each spatial frequency at a given field angle, as described in detail in
Chapter 2.
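As a minimal sketch of how such a curve can be computed from a simulated or captured image (a simplified stand-in for the full ISO 12233 slanted-edge procedure of Chapter 2), the SFR of a region containing a near-vertical edge can be approximated by differentiating the edge spread function and Fourier transforming the result; the variable edgePatch (a grayscale patch containing the edge) is an assumed input:

    % Approximate SFR from a straight, near-vertical edge (simplified sketch).
    esf = mean(edgePatch, 1);                  % edge spread function
    lsf = diff(esf);                           % line spread function
    n   = numel(lsf);
    w   = 0.5 - 0.5*cos(2*pi*(0:n-1)/(n-1));   % Hann window (reduces ripple)
    mag = abs(fft(lsf .* w));
    sfr = mag(1:floor(n/2)) / mag(1);          % normalize to unity at DC
    p   = 3.75e-3;                             % pixel pitch, in mm
    freq = (0:numel(sfr)-1) / (n * p);         % frequency axis, in lp/mm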
Figure 4-11 shows the spatial frequency response (SFR) analysis of image
simulation results for the several different wide angle lens designs described above
after dewarping, both for the central region and also for the edge region (the central
and edge regions were defined in Section 2.8.2). In this case, the SFR calculated from
the simulated images is a better metric than the MTF calculated in CODE V
from the lens model, because we are more interested in the quality of the dewarped
images. Compared to the single lens case, there is a great improvement in overall
performance with the addition of the second lens, particularly in the center region.
The two plots in Figure 4-11 show that the improvement is not as dramatic for the
edge region. This is due in part to the dewarping process, which blurs all designs
almost equally, as all designs have similar distortion characteristics.
Figure 4-11 also shows that the four lens design with unconstrained LCA and
correction of LCA in software has much better quality in the center region compared
to the four lens design with pixel-limited LCA. The performance of both designs in
the edge regions is almost identical. The SFR analysis for the design implemented
with closest matching commercially available off-the-shelf (COTS) lens elements
(EO-Lens), as described in detail below, is also presented in Figure 4-11 for
comparison. This comparison shows that the EO-Lens (as described below) has
slightly worse, though comparable, performance relative to the unconstrained LCA
four lens custom design, both with software LCA correction.
Figure 4-11. (Top) Spatial frequency response (SFR) plot of the designed lenses
in the center region. (Bottom) SFR plot of the designed lenses at the edges.
(Note that the EO-Lens (orange) and the COTS Design – Minimal LCA (black)
curves almost overlap in this case). Image simulations of the ISO 12233 test
target (used in the SFR calculations) were performed with the individual color
channel wavelength weights listed in Table 4-3.
[Plots: Spatial Frequency Response (0–1) versus Spatial Frequency (0–125 lp/mm); curves: Single Meniscus Lens; Two Lens; Four Lens – Pixel-Limited LCA; Four Lens – Unconstrained LCA; COTS Implementation (EO-Lens); COTS Design – Minimal LCA.]
SFR analysis does not provide a quantitative indication of the amount of LCA
in an optical system. As a metric for LCA, the correlation coefficients between
different pairs of color channels (red, green, and blue) can be calculated. In the case
of a black and white test target, such as the ISO 12233 target, information in the
different color channels should ideally be identical or at the very least highly
correlated. Low correlation coefficient values indicate a mismatch across the color
channels due to LCA.
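This metric can be computed in a few lines of Matlab; in the following sketch, the file name and the corner patch coordinates are illustrative assumptions:

    % Correlation between color channels as a quantitative LCA metric.
    img = double(imread('dewarped.png')) / 255;
    r = img(:,:,1);  g = img(:,:,2);  b = img(:,:,3);
    cRG = corrcoef(r(:), g(:));  cRG = cRG(1,2);   % overall red-green
    cRB = corrcoef(r(:), b(:));  cRB = cRB(1,2);   % overall red-blue
    patch = img(1:200, 1:200, :);                  % corner patch (assumed extent)
    pR = patch(:,:,1);  pB = patch(:,:,3);
    cRBcorner = corrcoef(pR(:), pB(:));  cRBcorner = cRBcorner(1,2);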
Table 4-7. Correlation coefficients for different color channels in the simulated
and dewarped images. The corner patch coefficients are derived from the
slanted black square in the top left corner of the images, as shown in
Figure 4-12.
                                Overall                 Corner Patch
                                Red-Green   Red-Blue    Red-Green   Red-Blue
Single Lens                     0.982       0.842       0.962       0.686
Two Lens                        0.969       0.696       0.931       0.504
Four Lens – Pixel-Limited LCA   0.996       0.991       0.998       0.996
Four Lens – Unconstrained LCA   0.999       0.996       0.999       0.996
As can be seen in Figure 4-11, Figure 4-12, and Table 4-7, the constraint of
minimizing lateral chromatic aberration in such a highly constrained optical
system design space can be relaxed to provide additional degrees of freedom for
optimizing other aberrations. Because the LCA can be corrected in software,
relaxing the lateral chromatic aberration constraint yields an output image that
is better overall after dewarping with 3 separate LUTs for the red, green, and
blue color channels.
[Figure panels, left to right: Custom Single Lens Design; Custom Two Lens Design; Custom Four Lens Design with Pixel-Limited LCA; Custom Four Lens Design with Unconstrained LCA; COTS Design with Unconstrained LCA (EO-Lens).]
Figure 4-12. Visual comparison of lens performance in the center regions (Top
Row) and corner regions (Bottom Row) of images simulated with the designed
lenses, after dewarping. For both the Custom Four Lens Design with
Unconstrained LCA and the COTS Design with Unconstrained LCA (EO-Lens), the
dewarping was performed with three separate look-up tables, each optimized for
an individual color channel (red, green, and blue). Image simulations of the
ISO 12233 test target were performed with the individual color channel
wavelength weights listed in Table 4-3.
4.7 Custom Wide Angle Lens with COTS Elements
As can be seen in Table 4-7, the four lens optical system designs exhibit much
better chromatic aberration correction than either the single lens or the two lens
designs. In addition, both four lens optical system designs exhibit nearly identical
chromatic aberration correction. After designing and optimizing the four lens optical
system with unconstrained LCA shown in Figure 4-8, the closest matching optical
system using COTS elements was designed. Starting with the first lens element, the
lens was manually replaced with the closest matching COTS lens. Lens focal length
and center thickness were two key parameters considered in evaluating a COTS lens
for a match. After replacing the first lens, the radii of curvature of the nonplanar
surfaces and the thicknesses for the other lenses, as well as the lens separations, were
made variable, and the whole system was re-optimized. Then the next lens was
replaced with its closest matching COTS lens, and the system was again re-optimized
as before with the lens separations and the lens parameters for the remaining
elements as variables.
The set of available COTS elements is very limited, and therefore the
correction of chromatic aberrations by means of an optimal selection of glasses was
not possible. The first, third, and fourth lenses are fabricated from N-SF11, a dense
flint glass with an index of refraction of 1.785 at λ = 587.6 nm. The second lens is
fabricated from N-LASF44, a lanthanum dense flint glass with an index of refraction
of 1.804 at λ = 587.6 nm [4]. These lens elements are all commercially available from
Edmund Optics; this four lens COTS design will therefore be referred to as the
EO-Lens for short in the rest of this thesis. The first lens has an MgF2 coating,
and the last three lenses have a VIS 0° coating, to provide low reflectance
across the visible spectrum.
The spatial frequency response of this lens system is shown in Figure 4-11 as the solid
orange line.
The same procedure of replacing lens elements with their closest matching
COTS counterparts was applied for the four lens design with pixel-limited LCA as well
(as shown in Figure 4-5). Since the selection of small COTS lens elements is very
limited, the end result has the same exact lens elements as the EO-Lens. However,
trying to minimize the LCA optically results in a slightly different lens design, with
different lens spacings. This lens design has smaller (but not pixel-limited) LCA as
compared to the EO-Lens, and the SFR results with this lens are shown in Figure 4-11
as the solid black line. In this case, the images simulated with this lens were also
dewarped with three separate look-up tables, in order to reduce any residual LCA.
Figure 4-11 shows that the EO-Lens (shown with the orange line) has better contrast
in the center region compared to this COTS lens that has been designed to exhibit
reduced optical LCA. A schematic diagram of the COTS lens design with reduced
optical LCA is shown in Figure 4-13, and the lens specifications are listed in Table 4-8.
Table 4-8. Specifications of the custom four lens COTS optical system with
reduced optical LCA. All dimensions are in mm. The thickness parameter is the
distance between the surface indicated and the following surface.
Optical Element       Radius of Curvature   Thickness   Glass Material    Semi-Diameter
Aperture Stop         Infinity              0.17                          0.5
Edmund Optics 45373   −4.71                 1.0         N-SF11-SCHOTT     1.5
                      Infinity              0.0025                        1.5
Edmund Optics 47861   Infinity              1.7         N-LASF44-SCHOTT   2.0
                      −3.21                 0.001                         2.0
Edmund Optics 47460   4.71                  2.5         N-SF11-SCHOTT     3.0
                      Infinity              0.0025                        3.0
Edmund Optics 48341   Infinity              1.5         N-SF11-SCHOTT     3.0
                      7.06                  0.75                          3.0
Image Plane           Infinity              0.623                         2.41
Figure 4-13. Schematic diagram of the custom lens design using COTS elements
with reduced optical LCA. The reference wavelength for the diagram is 540 nm.
A schematic diagram of the custom wide angle four element COTS lens design
(EO-Lens) is shown in Figure 4-14. A custom lens housing for the final lens system
was designed and fabricated out of black Delrin® [10] material to minimize internal
reflections, as also shown in Figure 4-14. The housing outer thread was set to be
M12 × 0.5, which is a common, standard lens thread for miniature cameras, for ease
of lens mounting and testing. This thread has a 12 mm diameter; however, the largest
individual lens diameter in the system is only 6 mm. This allows for a much smaller
footprint for the lens housing in future custom camera implementations. The
individual lens elements were acquired, and the lenses were then integrated with the
housing. The total length of the lens is only 8 mm, with a 3.24 mm effective focal
length operating at f/3.24. The specifications for the designed lens are listed in
Table 4-9.
Figure 4-14. (Top) Schematic diagram of the custom lens design with
unconstrained LCA using COTS elements (EO-Lens). The reference wavelength
for the diagram is 540 nm. (Bottom) Size comparison of the designed and
fabricated lens (Bottom Right) with a commercially available lens of similar
field of view (Bottom Left). The fabricated lens diameter could be reduced from
12 mm (set in this implementation by the size of the standard M12 × 0.5 thread
employed for ease of optical mounting for lens testing) to approximately 7 mm.
Table 4-9. Specifications of the custom four lens COTS optical system with
unconstrained LCA (EO-Lens). All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and the following
surface.
Optical Element       Radius of Curvature   Thickness   Glass Material    Semi-Diameter
Aperture Stop         Infinity              0.47                          0.5
Edmund Optics 45373   −4.71                 1.0         N-SF11-SCHOTT     1.5
                      Infinity              0.03                          1.5
Edmund Optics 45224   Infinity              1.7         N-LASF44-SCHOTT   2
                      −3.21                 0.10                          2
Edmund Optics 45078   4.71                  2.5         N-SF11-SCHOTT     3
                      Infinity              0                             3
Edmund Optics 48333   Infinity              1.5         N-SF11-SCHOTT     3
                      7.06                  0.75                          3
Image Plane           Infinity              0.644                         2.34
4.8 Lens Testing and Verification
The optical performance of the as-implemented COTS EO-Lens designed with
unconstrained LCA was tested using a commercial optical test bench, the OS-400-25
from Wells Research [11]. The distortion simulation and corresponding
measurements are shown in Figure 4-15, from which it can be seen that the distortion
measurement closely matches the distortion profile generated from the lens model in
CODE V. Modulation transfer function (MTF) measurements were also performed
both on-axis and also at various field angles. The resulting MTF plots were compared
with the MTF plots generated with the CODE V model of the lens, with results shown
in Figures 4-16, 4-17, and 4-18. The results from the MTF measurements agree quite
well with the MTF plots obtained from the CODE V model of the lens, and show that
the physical implementation of the lens was successfully achieved.
Figure 4-15. Comparison of the measured distortion profile of the COTS
EO-Lens designed with unconstrained LCA to the distortion profile obtained
from the CODE V model.
It is challenging to exactly match the position of the reticle in the Wells
Research OS-400-25 optical test bench to the location of the image plane in the optical
design software, potentially resulting in discrepancies between the results. It is also
difficult to match the exact focal plane (reticle) location for different measurements
of the same lens. Figures 4-16 and 4-17 show two different MTF measurement
results of the same EO-Lens. Depending on the selected location of the focal plane,
the radial and tangential measurements for different field angles are slightly different.
[Figure 4-15 plot: Distortion (%) versus Field Angle (0°–50°); curves: Measured Distortion; Simulated Distortion.]
Figure 4-18 shows the MTF averaged over the five field angles measured for the radial
and tangential sets of rays, showing the average performance of the lens to be
compared with the average results obtained from the CODE V design software. The
excellent agreement among the average measured and simulated MTF plots for both
the tangential and radial cases shows not only that the measurements with the Wells
Research OS-400-25 test bench are highly repeatable, but also that the implemented
design performs as expected relative to the initial optical system design.
In order to test its imaging performance, the COTS EO-Lens was tested with
two image sensor arrays, an OV10633 from OmniVision Technologies and also an
MT9M034 from Aptina Imaging. The OV10633 is a high-definition, wide dynamic
range (115 dB dynamic range) image sensor array with 1280 × 800 pixel resolution
and 4.2 µm × 4.2 µm pixels [12]. The MT9M034 is also a high-definition, wide
dynamic range (115 dB dynamic range) image sensor array with 1280 × 960 pixel
resolution and 3.75 µm × 3.75 µm pixels [13]. As implemented with either image
sensor array, the integrated system forms a wide angle, wide dynamic range camera.
The results achieved with the higher resolution MT9M034 image sensor array are
presented below.
Figure 4-16. First modulation transfer function (MTF) measurement of the
EO-Lens. Tangential MTF (Top) and radial MTF (Bottom) curves for the
designed and implemented EO-Lens. The MTF measurements were performed
with the green LED of the OS-400-25 system, with the spectrum as shown in
Figure 2-15. CODE V results were also obtained with wavelength weights
derived from the spectrum of the green LED as shown in Figure 2-15. The
measurement results show close agreement to the analysis results obtained
from the CODE V model of the lens at all field angles.
[Figure 4-16 plots: Modulation (0–1) versus Spatial Frequency (0–125 lp/mm) for the Tangential MTF (Top) and Radial MTF (Bottom); simulation and measurement curves on-axis and at 10°, 20°, 30°, and 40° field angles.]
Figure 4-17. Second MTF measurement of the EO-Lens. Tangential MTF (Top)
and radial MTF (Bottom) curves for the designed and implemented EO-Lens.
The MTF measurements were performed with the green LED of the OS-400-25
system, with the spectrum as shown in Figure 2-15. CODE V results were also
obtained with wavelength weights derived from the spectrum of the green LED
as shown in Figure 2-15. The measurement results show close agreement to the
analysis results obtained from the CODE V model of the lens at all field angles.
[Figure 4-17 plots: Modulation (0–1) versus Spatial Frequency (0–125 lp/mm) for the Tangential MTF (Top) and Radial MTF (Bottom); simulation and measurement curves on-axis and at 10°, 20°, 30°, and 40° field angles.]
Figure 4-18. Comparison of the modulation transfer function (MTF) averaged
over five field angles for two separate measurements with the result obtained
from the CODE V model of the EO-Lens. These results are based on the data
presented in Figures 4-16 and 4-17, and show excellent agreement among the
average results for the two measurements and the average result obtained from
the CODE V model.
Characterization of the integrated camera comprising the EO-Lens, the
MT9M034 image sensor array and an IR-cut filter (Sunex IRC30 with a cutoff
wavelength of approximately 650 nm [14]) shows that it has a 97° dewarped diagonal
FOV, an 84° dewarped horizontal FOV, and a 68° dewarped vertical FOV. The SFR plot
for the lens up to the Nyquist frequency for the image sensor array (133 lp/mm,
corresponding to the pixel size of 3.75 µm) is shown in Figure 4-19. The integrated
camera was used to capture images of an ISO 12233 resolution target, as shown in
Figure 4-20, as well as of a Macbeth color chart, as shown in Figure 4-21, and the
captured images were dewarped using distortion parameters calculated in CODE V
from the lens model.
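The Nyquist frequency quoted above follows directly from the 3.75 µm pixel pitch p:

    f_{\mathrm{Nyquist}} = \frac{1}{2p} = \frac{1}{2 \times 3.75\ \mu\mathrm{m}} \approx 133\ \mathrm{lp/mm}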
[Figure 4-18 plot: Average MTF Comparison; Modulation (0–1) versus Spatial Frequency (0–125 lp/mm); curves: Measurement 1 (Tangential, Radial); Measurement 2 (Tangential, Radial); CODE V (Tangential, Radial).]
Figure 4-19. Comparison of spatial frequency response (SFR) plots for
captured and simulated images with the EO-Lens integrated with an Aptina
MT9M034 image sensor array after dewarping, with the object placed at
infinity. Image simulations of the ISO 12233 test target (used in the SFR
calculations) were performed with the individual color channel wavelength
weights listed in Table 4-3.
The SFR plot shows that the custom four lens COTS optical system has very
good performance at the center of the image, and that the resolution decreases
towards the corners. If a modulation of 0.3 is considered for resolution estimation
(with the corresponding spatial frequency obtained from the SFR plot in Figure 4-19
for the captured image), then the (optical) resolution would be 150 × 150 pixels in
the central ±20° (diagonal) region, and the total (optical) resolution would be
448 × 336 pixels. As shown with the SFR plot in Figure 4-11, if a full custom design
with four spherical surfaces is considered (purple curve), the resolution can be
improved. The total resolution is determined by estimating the resolution in different
regions of the image (corners, left and right edges, top and bottom edges, and center)
as shown in Figure 4-20. This estimation is achieved by first calculating the
corresponding SFR curve for each region (obtained from horizontal and vertical
edges in each region, as discussed in Section 2.8.2). As the standard ISO 12233 test
target does not contain vertical and horizontal edges in the defined top and bottom
edge regions, the SFR curve for the top and bottom edge regions is estimated by taking
an average of the center SFR curve and the left and right edge SFR curves, as these
two sets of curves are produced from edges located at an average distance from the
center of the image that is approximately equal to the distance from the center of the
image to the top and bottom edge regions. The estimated spatial resolution from the
SFR curve in each region at the selected modulation (0.3) is then multiplied by the
physical size that each region would occupy on the selected image sensor array in
order to estimate the resolution in each region.
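This calculation reduces to a simple relationship: if f_{0.3} denotes the spatial frequency (in lp/mm) at which a region's SFR falls to the 0.3 modulation threshold, and w is the physical extent of that region on the image sensor array (in mm), then the number of resolvable pixels across the region is approximately

    N \approx 2\, f_{0.3}\, w

in which the factor of 2 converts line pairs to pixels. As a purely illustrative example, a region spanning 1.5 mm whose SFR reaches 0.3 at 50 lp/mm supports approximately 2 × 50 × 1.5 = 150 resolvable pixels.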
Figure 4-20. The total resolution of the dewarped image was determined by
estimating the resolution in different regions of the image, including corners,
top and bottom edges, left and right edges, and center regions of an ISO 12233
test target.
The red, green, and blue color channels were characterized and dewarped
separately to minimize the effects of LCA. The visual effect of this LCA correction is
evident by comparison of the images in Figures 4-22 and 4-23. Table 4-10 lists the
correlation coefficient values among the color channels of a captured image of the
ISO 12233 test target (without the Macbeth color chart) after dewarping. The high
correlation among the color channels achieved after dewarping with three separate
look-up tables quantitatively demonstrates the importance of software LCA correction.
Figure 4-21. Image of an ISO 12233 test target with a Macbeth color chart
placed on top of it, captured with the as-fabricated EO-Lens integrated with an
Aptina MT9M034 wide dynamic range image sensor array.
The designed system exhibits a degree of relative illumination variation at the
image plane, as shown in Figure 4-21, which becomes even more apparent with
captured images after dewarping is performed (Figure 4-22). Analysis of these
captured and dewarped images shows that the illumination level at the corners is
approximately 75% of the illumination level at the center. This effect can also be
reversed in software processing by flat-field correction, as explained previously. The
improvement in lateral chromatic aberration achieved through separate color
channel dewarping is shown in Figure 4-23, and the final image after flat-field
correction is shown in Figure 4-24.
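A minimal Matlab sketch of this flat-field correction step follows; the relative illumination map relIllum (normalized so that the center of the field equals 1) is assumed to come either from the CODE V lens model or from a captured image of a uniformly illuminated white target:

    % Flat-field correction of relative illumination fall-off (sketch).
    img = double(imread('dewarped_3lut.png')) / 255;
    corrected = img ./ repmat(relIllum, [1 1 3]);   % divide out the fall-off
    corrected = min(corrected, 1);                  % clip to the valid range
    imwrite(corrected, 'flatfield_corrected.png');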
Table 4-10. Correlation coefficients for the different color channels in the
overall captured and dewarped image of the ISO 12233 test target, and for a
corner patch of the image (as shown in Figure 4-23).
            Overall                Corner Patch
            Single LUT   3 LUT     Single LUT   3 LUT
Red-Green   0.813        0.974     0.621        0.968
Red-Blue    0.589        0.914     −0.021       0.921
Figure 4-22. (Top) The same image as shown in Figure 4-21 after dewarping
with a single LUT, without flat-field correction. (Bottom) The same image as
shown in Figure 4-21 after dewarping with 3 separate LUTs, one for each color
channel, without flat-field correction.
Figure 4-23. (Left) Expanded view of the top left region from Figure 4-22, Top.
(Right) Expanded view of the top left region from Figure 4-22, Bottom, showing
the significant improvement in lateral chromatic aberration through software
correction (separate color channel dewarping).
Figure 4-24. The same image as shown in Figure 4-21 after dewarping with 3
separate LUTs, one for each color channel, and flat-field correction to remove
relative illumination variation.
4.9 Summary
In this chapter, the design steps for a miniature, simple wide angle lens were
presented. Initially, a single meniscus lens was designed and analyzed. Later, a
second lens was added to the design to expand the design optimization space. This
two lens design with positive meniscus lens elements was then converted to a four
lens design with plano spherical elements, with the intention of implementing a
similar lens with the closest matching COTS elements. All of these lens designs were
constrained to spherical surfaces only.
Optical design involves the minimization and balancing of aberrations to
achieve an overall better imaging system. As was shown with barrel distortion in
Chapter 3, correction of an optical aberration in software post-processing relaxes
constraints at the optical system design stage and allows for other aberrations to be
better optically corrected. Lateral Chromatic Aberration (LCA) is another aberration
that can be well corrected in software. This hybrid optical/digital approach of
designing a lens with unconstrained LCA, and then correcting the LCA in software was
shown to improve the performance of the overall imaging system.
This approach was crucial for the design and implementation of the lens with
closest matching COTS elements. COTS lens elements (especially those with small
diameters) are typically fabricated out of common glasses, and it is often difficult to
find the optimal selection of glasses for optical minimization of LCA. Even so, after
dewarping with three look-up tables (one for each color channel), the effects of LCA
can be minimized in the final images.
The designed and implemented COTS EO-Lens has a wide field of view (97°
diagonal FOV), a miniature footprint, and provides high resolution. It was integrated
with a wide dynamic range (115 dB) image sensor array to form a wide angle, wide
dynamic range camera. As such, this miniature hybrid optical/digital camera is ideal
for integration in wearable visual aid systems.
Chapter 4 References
[1] R. Kingslake, A History of the Photographic Lens, 1st Ed., Boston: Academic
Press, 1989, pp. 23-26.
[2] K. Henney, and B. Dudley, Handbook of Photography, 1st Ed., New York:
Whittlesey House, 1939, p. 37.
[3] Ohara Corporation Glass Catalog Data.
Available online: http://www.oharacorp.com/catalog.html
[4] Schott Inc. Optical Glass Data Sheets.
Available online: http://www.schott.com
[5] M. Laikin, “Wide Angle Lens Systems”, International Lens Design Conference,
Vol. 530, pp. 530–533, 1980.
[6] R. E. Fischer, B. Tadic-Galeb, and P. R. Yoder, Optical System Design, 2nd Ed.,
New York: McGraw-Hill, 2008, pp. 90-91.
[7] J. Mallon, and P. F. Whelan, “Calibration and Removal of Lateral Chromatic
Aberration in Images”, Pattern Recognition Letters, Vol. 28, No. 1,
pp. 125-135, January 2007.
[8] Matrix Vision mvBlueFOX-MLC202d USB 2.0 Single Board Camera.
Specification available online:
http://www.matrix-vision.com/USB2.0-single-board-camera-mvbluefox-mlc.html, as of October 2015.
[9] “Industrial Cameras: Spectral Sensitivity”, The Imaging Source White Paper,
2013. Available online:
http://www.theimagingsource.com/en_US/publications/whitepapers, as of June 2015.
[10] Delrin® is an acetal polymer produced by DuPont. http://www.dupont.com
[11] Wells Research, Lincoln, MA. http://www.wellsresearch.com
[12] OmniVision OV10633 720p Wide-Dynamic Range Image Sensor Product
Brief. Available online: http://www.ovt.com
[13] Aptina MT9M034 Product Flyer. Available online:
http://www.aptina.com, as of January 2015.
[14] Sunex Inc., Carlsbad, CA. http://www.optics-online.com
Chapter 5
WIDE ANGLE LENS DESIGN WITH AN ASPHERICAL ELEMENT
In parallel with the design and implementation of the wide angle compound
lens based on only spherical surfaces presented in Chapter 4, various other wide
angle lens designs were developed with the goal of designing a miniature,
well-corrected wide angle lens system. With proper design and optimization,
incorporation of one or more aspherical surfaces can help with reducing and
balancing the aberrations in a lens system [1].
Figure 5-1. The sag (z) of an aspherical surface is defined as the deviation from
a planar surface measured from the vertex.
Typically, the surface of an aspherical lens is defined by a mathematical
relationship. The sag (z) of an aspherical lens surface (as shown in Figure 5-1) is the
deviation from a planar surface measured from the vertex, and for a rotationally
symmetric aspherical surface is defined as:
z = \frac{c r^{2}}{1 + \sqrt{1 - (1 + k)\, c^{2} r^{2}}} + A r^{4} + B r^{6} + C r^{8} + \cdots     (5-1)
in which r is the radial distance from the optical axis, c is the curvature at the vertex,
k is the conic constant of the surface, and A, B, C, and so on are the higher order
aspherical surface coefficients [1]. A spherical surface is defined by the surface
radius, which is the inverse of the surface curvature (1/c). When the conic constant
and the aspherical coefficients in Eq. 5-1 are zero, the surface sag reduces to that of a
spherical surface.
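As a direct Matlab transcription of Eq. 5-1 (the function name and interface are arbitrary), the sag of such a surface can be evaluated as follows; with k = 0 and an empty coefficient vector, the result reduces to the sag of a sphere of curvature c:

    function z = asphereSag(r, c, k, coeffs)
    % Sag (Eq. 5-1) of a rotationally symmetric aspherical surface.
    %   r:      radial distance(s) from the optical axis
    %   c:      curvature at the vertex (inverse of the vertex radius)
    %   k:      conic constant
    %   coeffs: higher order coefficients [A B C ...] for r^4, r^6, r^8, ...
    z = (c .* r.^2) ./ (1 + sqrt(1 - (1 + k) .* c.^2 .* r.^2));
    for m = 1:numel(coeffs)
        z = z + coeffs(m) .* r.^(2*m + 2);
    end
    end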
5.1 Lens Design Approach
One traditional design approach for wide angle lenses is to incorporate a
negative front element (or a lens group), followed by a lens group that covers a small
field of view. In this case, the negative front element increases the field of view of the
latter group, allowing for an overall wide field of view system [2].
Following these lens design principles, a wide angle compound lens was
designed around a COTS aspherical lens element, the LightPath Technologies 355150
lens [3]. This new design is based on a COTS aspherical lens, rather than starting with
a fully custom design as in Chapter 4. This is because COTS aspherical lenses are
typically optimized for a specific application, and there are many parameters (i.e.,
aspherical surface coefficients, in addition to the lens thickness and surface radii of
curvature that are the key parameters of spherical lenses) that make a particular lens
unique. Therefore, when starting with a custom designed aspherical lens element, it
is very difficult to find a matching COTS aspherical lens element. However, the
performance that can be achieved with a fully custom three lens design comprising a
single custom aspherical lens element and two custom spherical lens elements is also
presented below for comparison.
The selection process of the central aspherical lens was as follows:
(i) The lens should have viable operation over the visible spectrum. This can
be achieved by having a lens material that has high transmission over the
visible spectrum, and the lens should not have surface coatings that block
the visible spectrum. Lenses that are designed for the visible spectrum,
and some lenses operating in the near infrared spectrum typically meet
these criteria.
(ii) The lens should be designed and well corrected for infinite conjugate
operation, in which either the image or the object is at infinity and the
other is at a finite distance. Lenses designed for imaging with the object
placed at infinity, or lenses designed for collimation with the source placed
at the back focal plane are examples of infinite conjugate lenses.
(iii) The lens focal length should provide a wide field of view with the image
sensor array considered. With a 1/3” format image sensor array (6 mm
diagonal image sensor size), a lens with a focal length of less than 3 mm
provides more than a 90° diagonal field of view (see the worked
relationship following this list).
(iv) The diameter of the lens should be large enough for high light throughput.
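For a distortion-free (rectilinear) lens, criterion (iii) follows from the standard field of view relationship; a wide angle lens with barrel distortion covers an even larger field for the same focal length:

    \mathrm{FOV}_{\mathrm{diag}} = 2 \arctan\left(\frac{d}{2f}\right) = 2 \arctan\left(\frac{6\ \mathrm{mm}}{2 \times 3\ \mathrm{mm}}\right) = 90^{\circ}

so that any focal length below 3 mm yields more than a 90° diagonal field of view on a 6 mm diagonal image sensor array.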
Figure 5-2. Schematic diagram of the LightPath Technologies 355150 lens
(modeled over the visible wavelength range). Large field angle rays are focused
closer to the lens than the paraxial rays. Therefore, in order to achieve the
smallest spot sizes for all field angles without field flattening, a curved image
surface would need to be incorporated. The curved image surface drawn with
the dashed orange line shows this curved surface for the drawn set of rays. The
reference wavelength for the diagram is 540 nm.
The LightPath Technologies 355150 aspherical lens element (as shown in
Figure 5-2) has two identical aspherical surfaces (with the same radius of
curvature and aspherical surface coefficients) and an effective focal length of
2 mm at λ = 780 nm; it meets the above requirements as described below.
(i) The LightPath Technologies 355150 lens is fabricated from CDGM
D-ZLaF52LA glass, which has a refractive index of 1.81 at λ = 587.6 nm, and
high light transmission for the wavelength range from 450 nm to 2 µm [3].
Various anti-reflection coatings are available for this lens, including a
visible range anti-reflection coating.
(ii) This lens was designed for the specific purpose of laser diode collimation
at 780 nm, a standard wavelength in optical communications. As such, this
lens is well corrected for collimation from a small near infrared light
source to a corresponding narrow range of paraxial field angles. Due to the
optical “Principle of Reversibility” [4], a lens that is designed for
collimation from a light source can be used in reverse to image an object at
infinity to a focal plane at the back focal length. As the lens has identical
surfaces, physical reversal of the lens is not needed.
(iii) The focal length of the lens (~2 mm for the visible wavelengths) is in the
correct range for the image sensor format considered (a 1/3” optical
format with a 6 mm diagonal image sensor array), and would provide
about a 110° diagonal field of view.
(iv) The lens has a 3 mm diameter with a 2.2 mm clear aperture. If the full clear
aperture of the lens is used, an f-number of 0.9 can be achieved (as
computed below). A smaller physical aperture can be incorporated to reduce
lens aberrations, at the cost of reduced light throughput.
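The f-number quoted in criterion (iv) follows directly from the effective focal length and the clear aperture diameter:

    N = \frac{f}{D} = \frac{2.0\ \mathrm{mm}}{2.2\ \mathrm{mm}} \approx 0.9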
As it meets these requirements, the LightPath Technologies 355150 lens element
was selected as the candidate central element in the design of a wide angle
compound lens that includes several other lens elements.
The optical performance of the LightPath lens was analyzed by modeling it in
CODE V. Even though this lens was designed for a near infrared wavelength,
analyzing it over the visible wavelength range showed that it still has excellent
performance over a limited field of view. Spot diagrams obtained with the lens
modeled at different wavelengths are shown in Figure 5-3, and modulation transfer
function (MTF) curves for different wavelengths are shown in Figure 5-4. The
monochromatic performance of the lens is almost the same at the specified 780 nm
(near infrared) and 540 nm (green) wavelengths. The monochromatic MTF plots
show that the lens has near diffraction-limited performance on-axis.
Figure 5-3. Spot diagrams for the LightPath Technologies 355150 lens operating
at f/2, for four field angles: (Left) at 780 nm wavelength; (Center) at 540 nm
wavelength; (Right) with three wavelengths (red: 630 nm; green: 540 nm; and
blue: 470 nm).
However, optical modeling also showed that this lens exhibits severe field
curvature at large field angles, as evidenced by the increasing spot sizes in
Figure 5-3 and by the field curves in Figure 5-6.
Field curvature is one of the five Seidel aberrations in an optical system. For a
lens with field curvature, the best-focused image is not planar but is instead a curved
surface. As shown in Figure 5-2, in order to obtain optimal imaging up to a 55° half
field of view (110° full field of view), the significantly curved image surface shown
with the dashed orange line must be accommodated. If a planar image surface, such
as a digital image sensor array, is used to capture the image, then at large field angles
the light rays diverge after coming to a focus before the image plane. This results in
large spot sizes and increased blur at large field angles [5], as shown in Figure 5-3.
Figure 5-4. Modulation transfer function (MTF) plots for the LightPath
Technologies 355150 lens operating at f/2, for four field angles: (Top Left) at
780 nm wavelength; (Top Right) at 540 nm wavelength; (Bottom) with three
wavelengths (red: 630 nm; green: 540 nm; and blue: 470 nm).
Incorporation of a negative element before the aperture stop reduces the field
curvature, and as a consequence increases the usable field of view of the lens.
Therefore, in order to increase the field of view of the LightPath Technologies
aspherical lens, a custom negative lens element (placed in front of the aperture stop)
was designed and optimized, as shown in Figure 5-5, Left. The lens specifications for
this two lens system are listed in Table 5-1.
Figure 5-5. (Left) Schematic diagram of the LightPath 355150 lens with a
custom spherical (negative) element added to increase the usable field of view.
(Right) Schematic diagram of the LightPath Technologies 355150 lens with two
custom spherical (negative) elements added (LP-Custom lens) to increase the
usable field of view and reduce aberrations over the wide field of view. The
reference wavelength for the diagrams is 540 nm.
In order to increase the size of the image to match the size of the image sensor
format used (1/3” optical format), as well as to further reduce the residual field
curvature, a second custom negative lens element was added between the aspherical
lens and the image plane. The surface curvatures and thicknesses of both the first
lens element and the last lens element were then set variable in CODE V, and the
entire compound lens was reoptimized. The lens materials for the two custom lenses
were optimally selected using the Glass Expert tool in CODE V. The resulting three
lens design with the LightPath aspherical element and two custom negative lens
elements is shown in Figure 5-5, Right. The lens specifications for this three lens
system are listed in Table 5-2. This lens will be referred to as the LP-Custom lens in
the rest of this thesis. The reduction in field curvature with the incorporation of the
two custom lens elements is shown in Figure 5-6.
Table 5-1. Specifications of the custom two lens wide field of view optical
system (as shown in Figure 5-5, Left). All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and the following
surface.
Optical Element    Radius of Curvature   Thickness   Glass Material   Semi-Diameter
First Lens         −3.707                0.650       NFK58-SCHOTT     1.5
                   4.372                 0.747                        1.5
Aperture Stop      Infinity              0.025                        0.37
LightPath 355150   2.6671                1.92        DZLAF52LA-CDGM   1.5
                   −2.6671               2.267                        1.5
Image Plane        Infinity                                           1.67
The custom negative front lens element and the custom negative rear lens
element in the LP-Custom lens (as shown in Figure 5-5, Right) were later replaced
with their closest matching COTS alternatives to produce a compact, wide field of
view lens with only COTS lens elements, as described in detail later in this section. As
the key central element in this COTS lens system is the 355150 lens from LightPath
Technologies, this COTS lens system will be referred to as the LP-Lens in the rest of
this thesis.
Table 5-2. Specifications of the LP-Custom lens, a custom wide field of view lens
with a central COTS aspherical lens element. All dimensions are in mm. The
thickness parameter is the distance between the surface indicated and the
following surface.
Optical Element    Radius of Curvature   Thickness   Glass Material    Semi-Diameter
First Lens         −3.063                0.833       NLASF31A-SCHOTT   1.5
                   −8.514                0.853                         1.5
Aperture Stop      Infinity              0.025                         0.43
LightPath 355150   2.6671                1.92        DZLAF52LA-CDGM    1.5
                   −2.6671               1.038                         1.5
Third Lens         −4.464                0.5         NSF66-SCHOTT      1.5
                   23.230                0.893                         1.5
Image Plane        Infinity              0                             1.934
Field curves for an optical system show the displacement of the optimal focal
surface for the radial and tangential sets of rays with field angle. The field curves for
the single LightPath aspherical lens, for the two lens system after incorporation of the
custom negative front lens element (as shown in Figure 5-5, Left), and for the
LP-Custom lens are shown in Figure 5-6. The field curves for the LP-Lens are also
shown in this figure. Ideally, both the tangential and radial curves would be straight
vertical lines at zero displacement. The results shown in this figure demonstrate that
the field curvature is minimized in the LP-Custom lens as compared to both the single
aspherical lens and two lens designs. A modest increase in field curvature for the
LP-Lens as compared to the LP-Custom lens results from the substitution of COTS lens
elements for custom lens elements, as expected.
Figure 5-6. Tangential (T) and radial (R) field curves for the single aspherical
lens, the two lens design (as shown in Figure 5-5 Left), the LP-Custom lens (as
shown in Figure 5-5 Right), and the LP-Lens (as shown in Figure 5-9).
Both the LP-Lens and the LP-Custom lens are designed around a central,
aspherical COTS lens element, the 355150 lens from LightPath Technologies. In order
to evaluate the performance that can be achieved with a full custom lens design
having a similar lens structure and field of view, a three lens design was developed.
This particular full custom three lens design comprises a custom spherical front lens
element, a custom aspherical central lens element, and a custom spherical rear lens
element. The full custom lens was optimized with three reference wavelengths
covering the visible spectrum, 630 nm, 540 nm, and 470 nm. The full custom lens
specifications are listed in Table 5-3, and the lens is shown in Figure 5-7.
Figure 5-7. Schematic diagram of a full custom wide field of view lens design.
The first and the last lens elements have spherical surfaces, and the central lens
element has aspherical surfaces. The reference wavelength for the diagram is
540 nm.
In order to evaluate the performance difference between the LP-Lens, the
LP-Custom lens, and the full custom lens (as shown in Figure 5-7), imaging of the
ISO 12233 target was simulated with each lens in CODE V. The image simulation
results for each lens were dewarped in Matlab with three look-up tables (one for each
color channel, generated separately for each lens in CODE V) to correct for both
distortion and lateral chromatic aberration. The look-up tables for the red channel,
the green channel, and the blue channel were generated with 630 nm, 540 nm, and
470 nm as the reference wavelengths, respectively. Later the SFR was calculated (in
Matlab) at the center, edge, and corner regions (as defined in Section 2.8.2) of the
dewarped images. For this particular case of a 55° half field of view (110° full field of
view) lens, the edge region corresponds to a 45° field angle, and the corner region
corresponds to a 50° field angle. The SFR comparison plot is shown in Figure 5-8.
This plot shows that the performance in the central field of view is similar for all lens
designs. The full custom lens has the best performance at the edge and corner
regions. The LP-Lens has slightly worse performance at the edge and corner regions
as compared to the LP-Custom lens, as expected.
Table 5-3. Specifications of the full custom wide field of view lens design shown
in Figure 5-7. The first and the last lens elements have spherical surfaces, and
the central lens element has aspherical surfaces. All dimensions are in mm.
The thickness parameter is the distance between the surface indicated and the
following surface.
Optical Element   Radius of Curvature   Thickness   Glass Material    Semi-Diameter
First Lens        −3.148                0.5         NLASF31A-SCHOTT   1.5
                  −2.960                0.2                           1.5
Aperture Stop     Infinity              0.146                         0.32
Aspherical Lens   −2.536                0.934       NLAF21-SCHOTT     0.45
  (Aspherical coefficients: k = 18.187, A = −0.254, B = −0.375, C = 1.910, D = −31.890, E = −0.0003)
                  −0.918                0.629                         0.85
  (Aspherical coefficients: k = 0.074, A = −0.072, B = −0.111, C = 0.237, D = −0.158, E = 0.0003)
Third Lens        −2.207                0.5         NSF66-SCHOTT      1.5
                  −18.869               1.294                         1.5
Image Plane       Infinity              0                             2.03
Figure 5-8. Comparison of the SFR plots obtained from simulated images with
the LP-Lens, the LP-Custom lens, and the full custom lens after dewarping each
image with three corresponding look-up tables (one for each color channel).
The simulated images have a 55° half field of view (110° full field of view), the
edge region corresponds to a 45° field angle, and the corner region corresponds
to a 50° field angle. Image simulations of the ISO 12233 test target (used in the
SFR calculations) were performed with the individual color channel wavelength
weights listed in Table 4-3.
Using the COTS lens elements, the total length of the LP-Lens is 6.92 mm.
However, as can be seen from Figure 5-9, only the central region of the first COTS lens
element is needed for proper optical operation. Furthermore, the edges of this first
lens element block some of the large field angle rays (shown with dashed lines in
Figure 5-9, Left), and therefore limit the field of view of the LP-Lens. In order to
both reduce the total length of the compound LP-Lens and also prevent rays from
being blocked, the edge thickness of the first COTS lens element can be reduced from
2.82 mm to 1.5 mm, reducing the total length of the LP-Lens package to only 5.6 mm,
as shown in Figure 5-9, Right.
[Figure 5-8 plot: Spatial Frequency Response (0–1) versus Spatial Frequency (0–125 lp/mm); curves: LP-Lens, LP-Custom, and Full Custom at the Center, Edge, and Corner regions.]
Figure 5-9. (Left) Schematic diagram of the LP-Lens. (Right) Schematic
diagram of the LP-Lens after reducing the edge thickness of the first COTS lens
element. The reference wavelength for the diagram is 540 nm.
A custom lens housing for the LP-Lens was designed and fabricated out of
black Delrin® [6] material to minimize internal reflections. The housing outer thread
was set to be M12 × 0.5, which is a common, standard lens thread for miniature
cameras, chosen in this instance to allow easy mounting for experiments. This thread
has a 12 mm diameter; however, the largest individual lens diameter in the LP-Lens
is only 6 mm. This allows for a much smaller footprint for the lens housing in future
custom camera integration solutions. The individual COTS lens elements with visible
wavelength range anti-reflection coatings were acquired, and the LP-Lens was then
integrated into the designed housing. As explained above, the edge thickness of the
first lens element was reduced by sanding with ultra fine grit sandpaper. The
resulting LP-Lens (Figure 5-9, Right) has a 2.44 mm effective focal length, and
operates at f/3.5. The total length of the lens package is only 5.6 mm. The
specifications for the designed and implemented LP-Lens are listed in Table 5-4.
Table 5-4. Specifications of the LP-Lens, a custom wide field of view lens with
COTS lens elements. All dimensions are in mm. The thickness parameter is the
distance between the surface indicated and the following surface.
Optical Element       Radius of Curvature   Thickness   Glass Material   Semi-Diameter
Linos G314001         −3.101                0.5         BK7-SCHOTT       3
                      Infinity              0.575                        3
Aperture Stop         Infinity              0.025                        0.4
LightPath 355150      2.6671                1.92        DZLAF52LA-CDGM   1.5
                      −2.6671               0.8175                       1.5
Opto-Sigma 015-0024   −6.8                  0.8         LASFN9-SCHOTT    2
                      Infinity              0.976                        2
Image Plane           Infinity              0                            1.953
5.2 Lens Testing and Verification
The optical performance of the as-implemented LP-Lens was tested using a
commercial optical test bench, the OS-400-25 from Wells Research [7]. The distortion
simulation and the corresponding measurements are shown in Figure 5-10, from
which it can be seen that the distortion measurement closely matches the distortion
profile generated from the lens model. Modulation transfer function (MTF)
measurements were also performed both on-axis and also at various field angles. The
resulting MTF plots were compared with the MTF plots generated with the CODE V
model of the lens, with results shown in Figure 5-11. Overall, the agreement is quite
good, with minor discrepancies between the simulation and measurement results for
the radial and tangential MTF plots. As an example, the simulation result for the
tangential MTF at the 30° field angle is somewhat better than what is achieved with
the measurement. However, the measurement result for the radial MTF at the same
field angle is somewhat better than what is expected by the simulation result. This
discrepancy probably results from focusing the lens at slightly different image planes
for the simulation and the measurement. If the average of the tangential and radial
MTF curves is plotted, the measurement results closely match the simulation results,
as shown in Figure 5-12.
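The averaging used to produce Figure 5-12 is straightforward; a minimal sketch follows (the curves below are placeholders only; in practice the tangential and radial data would be exported from the OS-400-25 test bench and from CODE V at matching spatial frequencies):

import numpy as np

# Placeholder MTF curves standing in for exported measurement and
# simulation data, sampled at the same spatial frequencies (lp/mm).
freqs = np.linspace(0, 125, 26)
mtf_tangential = np.exp(-freqs / 60.0)
mtf_radial = np.exp(-freqs / 80.0)

# The average MTF at each spatial frequency, as plotted in Figure 5-12.
mtf_average = 0.5 * (mtf_tangential + mtf_radial)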
Figure 5-10. Comparison of measured distortion profile of the LP-Lens to the
distortion profile obtained from the CODE V model.
Figure 5-11. Tangential MTF (Top) and radial MTF (Bottom) curves for the
designed and implemented LP-Lens. Measurement results as well as the
simulation results obtained from the CODE V model of the lens are presented.
The MTF measurements were performed with the green LED of the OS-400-25
system, with the spectrum as shown in Figure 2-15. CODE V results are also
obtained with wavelength weights derived from the spectrum of the green LED
as shown in Figure 2-15.
Figure 5-12. Average modulation transfer function (MTF) curves for the
LP-Lens, showing the average of the radial and tangential MTF curves at a given
field angle. Measurement results show close agreement with the results
obtained from the CODE V model of the lens.
5.3 Comparison of the Two Lens Designs with COTS Lens Elements
The wide field-of-view lens design presented and implemented in Chapter 4
(the EO-Lens) and the wide field-of-view lens design presented and implemented in
this chapter (the LP-Lens), are both designed to work with a 1/3” format image
sensor array. However, they have different effective focal lengths and provide
different fields of view. The key parameters of the two lenses are summarized and
compared in Table 5-5 below.
In order to compare the theoretical imaging performance of these two lenses,
imaging of the ISO 12233 target was simulated with each lens in CODE V. The image
simulation results were dewarped in Matlab with three look-up tables per lens (one
for each color channel) to correct for both distortion and lateral chromatic
aberration. The SFR was then calculated (in Matlab) at the center, edge, and corner
regions (as defined in Section 2.8.2) of both dewarped images. The comparison plot is shown in Figure 5-13.
As shown in the figure, both lenses have similar SFR performance. The EO-Lens
provides a slightly narrower field of view, but with somewhat higher light throughput
due to its lower f/# as shown in Table 5-5. It also has a more uniform brightness at
the image plane as compared to the LP-Lens.
Figure 5-13. Comparison of the SFR plots obtained from simulated images with
the LP-Lens and the EO-Lens after dewarping with three look-up tables (one for
each color channel). Image simulations of the ISO 12233 test target (used in the
SFR calculations) were performed with the individual color channel wavelength
weights listed in Table 4-3.
Table 5-5. Comparison of the specifications of the two designed and
implemented lenses, the spherical surface EO-Lens (Chapter 4) and the
aspherical surface LP-Lens (Chapter 5).
                                  EO-Lens         LP-Lens
Lens Surfaces                     All Spherical   Two Aspherical
Focal Length                      3.24 mm         2.44 mm
Field of View                     97°             110°
f/#                               3.24            3.52
Package Size                      8.25 mm         5.6 mm
Relative Illumination at Corner   75%             45%
Figure 5-14. Lateral color plot for the LP-Lens and the EO-Lens.
The lateral color plot for an optical system typically shows the difference in
the lateral position of the focal point corresponding to a blue wavelength and the
lateral position of the focal point corresponding to a red wavelength as a function of
field angles (plotted on the y-axis). Figure 5-14 shows the lateral color plot for both
the LP-Lens and the EO-Lens, obtained from the models of the lenses in CODE V, with
the blue wavelength set at 470 nm and the red wavelength set at 630 nm. As shown
in Figure 5-14, the LP-Lens has significantly less lateral chromatic aberration than
the EO-Lens. Using an image sensor array with 3.75 µm × 3.75 µm pixels (such as the
Aptina MT9M034), the image of a high contrast scene captured with the LP-Lens
would have about 6.5 pixels of lateral shift between the red and the blue channels at
the corner of the image, as compared to about 16 pixels of lateral shift for the
EO-Lens. As such, the LP-Lens would be more suitable for a camera implementation
in which dewarping is performed with a single look-up table, such as integration
with the Leopard Imaging LI-USB30-M034WDR camera module, which has an
integrated circuit that provides on-chip dewarping but does not allow for the
correction of LCA through separate dewarping of the color channels. Even so, the
amount of residual LCA in the dewarped image could be tolerable in real-life
situations. As explained in detail in Chapter 7, for the scene camera in the
eye-tracked extraocular camera, digital binning of the image sensor pixels is needed
(in order to reduce the resolution of the digital image) to match the resolution of the
implanted electrode array, which renders the lateral shift in the lower resolution
images imperceptible.
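As a quick consistency check, the conversion from the lateral color shift of Figure 5-14 to image sensor pixels is a single division; in the sketch below the micrometer values are approximate corner readings, chosen to be consistent with the pixel shifts quoted above:

# Corner lateral color shift (µm, approximate) converted to pixels of
# the Aptina MT9M034 image sensor array (3.75 µm pixel pitch).
pixel_pitch_um = 3.75
corner_shift_um = {"LP-Lens": 24.4, "EO-Lens": 60.0}
for lens, shift_um in corner_shift_um.items():
    print(f"{lens}: {shift_um / pixel_pitch_um:.1f} pixels red-blue shift")
# LP-Lens: 6.5 pixels, EO-Lens: 16.0 pixels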
5.4 Imaging Performance
The implemented LP-Lens was integrated with an Aptina MT9M034 wide
dynamic range image sensor array. Rather than capturing the image and performing
dewarping operations on a desktop computer, a small camera module (the
LI-USB30-M034WDR) that contains both the image sensor array and also the
corresponding image signal processor chip, the AP0100, was used for testing. This
camera module was purchased from Leopard Imaging [8], with specifications
introduced in Section 2.7. The AP0100 image signal processor has a built-in Spatial
Transform Engine (STE) for on-chip dewarping of barrel distortion. The image signal
processor chip was programmed to dewarp the specific barrel distortion profile of
the LP-Lens. Captured test images are shown in Figure 5-15. With the image signal
processor programmed for full distortion correction, the diagonal field of view of the
dewarped image is 100°. With the image signal processor programmed to allow for
minimal remaining distortion, the field of view can be increased to 110°.
The Aptina AP0100 image signal processor performs a range of image
processing functions, including adaptive tone mapping of the captured images to
generate a wide dynamic range image and also some image sharpening [9]. This
results in SFR performance of the as-implemented camera that is significantly better
than the SFR obtained from simulated images. This sharpening effect is most
significant at the center of the image, and is negligible at the corners. The SFR plot
showing a comparison of contrast in both captured images and simulated images is
shown in Figure 5-16. If a modulation of 0.3 (derived from the SFR plot obtained
using captured images shown in Figure 5-16) is considered for resolution estimation,
the total (optical) resolution would be 928 × 696. The regions for estimating the
resolution and the method used for estimation were defined in Chapter 4.
Figure 5-15. (Top) Image captured with the LP-Lens integrated with the
LI-USB30-M034WDR camera module. The camera module is programmed to
fully correct for the barrel distortion. The image has a 100° (measured)
diagonal field of view. (Bottom) Image captured with the LP-Lens integrated
with the LI-USB30-M034WDR camera module. In this case, the camera module
is programmed to correct for most of the barrel distortion, with minimal
uncorrected barrel distortion at the corners. The image has a 110° (measured)
diagonal field of view.
The designed system, not untypically for wide field of view lenses, exhibits a
relative illumination variation at the image plane that is more apparent in captured
images after dewarping is performed. Analysis of these captured and dewarped
images shows that the illumination level at the corners is about 45% of the
illumination level at the center. This effect can be reversed in software through
flat-field correction. The final image after flat-field correction (of the image shown
in Figure 5-15, Top) is shown in Figure 5-17. The significant difference in relative
illumination results in noise amplification in the flat-field correction step, apparent
in the corner regions of the image shown in Figure 5-17.
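A minimal sketch of such a flat-field correction step is shown below (the function name is illustrative; the flat frame is assumed to be an image of a uniformly illuminated white target captured through the same lens, which encodes the corner falloff):

import numpy as np

def flat_field_correct(image, flat):
    # Per-pixel gain that is greater than one in the dim corners;
    # this gain is also what amplifies noise there, as noted above.
    flat = flat.astype(np.float64)
    gain = flat.mean() / np.clip(flat, 1e-6, None)
    corrected = image.astype(np.float64) * gain
    return np.clip(corrected, 0, 255).astype(np.uint8)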
5.5 Summary
In this chapter, the methodology used for the design of a wide field of view
compound lens (referred to as the LP-Lens) that is based on a well-corrected narrow
field-of-view aspherical lens is presented. Initially, the COTS aspherical lens
(LightPath Technologies 355150) was modeled in optical design software. A negative
lens element was added as the front element, both to reduce the incoming wide field
angle rays to the narrow field angle range that the COTS aspherical lens element was
optimized for, and to reduce the significant field curvature at large field angles.
Another negative element was added as the last element to both further reduce the
field curvature and also to increase the size of the image. The lens was then
redesigned with the closest matching COTS elements and implemented.
Figure 5-16. Comparison of SFR plots for both captured and simulated images
with the LP-Lens after dewarping, with the object placed at infinity. Image
simulation of the ISO 12233 test target (used in the SFR calculations) was
performed with the individual color channel wavelength weights listed in
Table 4-3.
The resulting lens with only COTS elements is a low-cost, miniature wide field
of view lens with excellent resolution. This lens was implemented, verified, and
integrated with a wide dynamic range image sensor array to produce a wide field of
view, wide dynamic range camera. The chosen camera module has built-in
dewarping functionality that was programmed to correct for the barrel distortion of
the lens. The output of the camera module is a rectilinear image with a 110°
dewarped diagonal field of view. As explained in Chapter 7, this integrated camera
was used as the scene camera in the prototype eye-tracked extraocular camera
implementation.
Figure 5-17. Image of an ISO 12233 target captured with the LP-Lens integrated
with the LI-USB30-M034WDR camera module. The image is flat-field corrected
to remove relative illumination variations.
Chapter 5 References
[1] R. E. Fischer, B. Tadic-Galeb, and P. R. Yoder, Optical System Design, 2nd Ed.,
New York: McGraw-Hill, 2008, Chap. 7.
[2] R. E. Fischer, B. Tadic-Galeb, and P. R. Yoder, Optical System Design, 2nd Ed.,
New York: McGraw-Hill, 2008, p. 137.
[3] LightPath 355150 Support Documents. Available online:
http://lightpath.com/displayLens.php?lensNumber=355150
[4] E. Hecht, Optics, 4th Ed., Boston: Addison Wesley, 2001, p. 110.
[5] W. J. Smith, Modern Optical Engineering, 3rd Ed., New York: McGraw-Hill,
2000, pp. 69-71.
[6] Delrin® is an acetal polymer produced by DuPont. http://www.dupont.com
[7] Wells Research, Lincoln, MA. http://www.wellsresearch.com
[8] Leopard Imaging Inc., Milpitas, CA. http://www.leopardimaging.com
[9] Aptina AP0100 Product Flyer. Available online:
http://www.aptina.com, as of January 2015.
Chapter 6
MINIATURE WIDE FIELD OF VIEW CAMERA BASED ON WAFER LEVEL OPTICS
Integration of miniature lenses with image sensor arrays at the wafer level is
a promising method of producing low cost (~$1) cameras [1]. Several commercial
wafer level integrated cameras are available on the market, mainly aimed at cell
phones and other mobile devices. Although very compact in size, these cameras
typically lack the wide field of view required for the wearable visual aid camera
system, as well as for the eye-tracked extraocular camera system. In order to address
this issue, and potentially leverage the availability of wafer level integrated cameras,
a miniature lens add-on module to extend the field of view of a commercially available
wafer level camera (the OVM7692 from OmniVision Technologies, as shown below in
Figure 6-1) was successfully designed and implemented.
The OmniVision OVM7692 is a complete camera in a miniature package. It has
VGA (640 × 480) resolution with a 64° diagonal field of view. The camera has a
footprint of only 2.5 mm × 3 mm, as well as a height of only 2.5 mm [2].
Figure 6-1. OmniVision OVM7692 wafer level camera shown next to a U.S.
penny for size comparison.
Table 6-1. Optical specifications for the OmniVision OVM7692 wafer level
camera [2].
Active Array Size 640 × 480 pixels
Pixel Size 1.75 µm × 1.75 µm
Diagonal Field of View 64°
Focal Length 1.15 mm
f/# 3.0
The exact optical specifications of the chosen wafer level camera have not been
released. However, a publicly available product brief [2] reveals several key optical
properties of the camera, as summarized in Table 6-1. A multi-element lens that
exactly meets the given properties and also has RMS spot sizes that are less than the
given pixel size (1.75 µm × 1.75 µm) across the field of view was designed to have an
idealized model of the camera in hand, as shown in Figure 6-2 and with the lens
specifications listed in Table 6-2. Having a camera model that is pixel size limited
ensures that the integrated camera and attachment will reflect the performance of
the attachment, rather than being limited by the performance of the wafer level
camera.
Table 6-2. Specifications of the idealized, multi-element model for the
OVM7692 wafer level camera optical system. All dimensions are in mm. The
thickness parameter is the distance between the surface indicated and the
following surface.
Optical Element   Radius of Curvature   Thickness   Glass Material    Semi-Diameter
Aperture Stop     Infinity              0.001                         0.192
First Lens        0.716                 0.139       SF56A-SCHOTT      0.25
                  −178.98               0.044                         0.25
Second Lens       −1.04                 0.188       NSF66-SCHOTT      0.3
                  1.13                  0.06                          0.3
Third Lens        −2.20                 0.1         NLASF31-SCHOTT    0.4
                  −0.616                0.001                         0.4
Fourth Lens       1.46                  0.65        NLASF31-SCHOTT    0.5
                  1.22                  0.01                          0.5
Fifth Lens        1.06                  0.10        NSK11-SCHOTT      0.6
                  1.53                  0.48                          0.6
Image Plane                                                           0.72
Figure 6-2. (Left) Schematic diagram of the idealized, multi-element model for
the OVM7692 wafer level camera optical system. The reference wavelength for
the diagram is 540 nm. (Right) Spot diagram for the lens shown on the left, for
five field angles and with three wavelengths (red: 630 nm; green: 540 nm; and
blue: 470 nm).
6.1 Afocal Attachment
Wafer level cameras comprise an image sensor array and lens (or lenses) that
are integrated at the wafer level with a fixed distance between the lens and the image
sensor array, and as such there is no way to adjust this distance or refocus the camera.
Therefore, any attachment designed for these cameras should be designed such that
light rays are properly focused on the same image plane once the attachment is
installed.
The largest depth of field for a camera can be achieved if the camera is focused
at the hyperfocal distance (s_hf). In this case, objects located from infinity up to half
the hyperfocal distance will be in acceptable focus [3]:

s_hf ≈ f² / (d · (f/#))    (6-1)

in which f is the effective focal length of the system, and d is the diameter of the circle
of confusion.
The OVM7692 camera is assumed to be focused at the hyperfocal distance,
such that the depth of field of the camera is maximized. The model for the camera is
also designed such that the camera has pixel-size-limited performance for object
distances ranging from infinity to half the hyperfocal distance.
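As a worked example of Eq. 6-1 (a sketch only: the circle of confusion is assumed here to be a single 1.75 µm pixel, a choice that is not published by the manufacturer), the hyperfocal distance of the OVM7692 can be estimated as:

f_mm = 1.15        # effective focal length (Table 6-1)
f_number = 3.0     # f/# (Table 6-1)
d_mm = 0.00175     # circle of confusion: one 1.75 µm pixel (assumption)

s_hf_mm = f_mm**2 / (d_mm * f_number)
print(f"hyperfocal distance ≈ {s_hf_mm:.0f} mm")   # ≈ 252 mm

Under this assumption, objects from roughly 126 mm (half the hyperfocal distance) to infinity would be in acceptable focus.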
Figure 6-3. Schematic diagram of a Galilean telescope design.
An afocal attachment for an imaging system is an optical system in which
parallel light rays entering the afocal system emerge parallel. Thus the attachment
does not alter the focal properties of the integrated system, but instead changes
certain imaging properties.
In order to expand the field of view of a camera, a reverse Galilean telescope
afocal attachment can be used. A Galilean telescope consists of two lenses, a positive
front element and a negative rear element (as shown in Figure 6-3). In this design,
the back focal points of the two lenses are coincident; as such, collimated light rays
incident on the positive element will converge towards the back focal plane. The
negative element will diverge these convergent rays so that the output light is again
collimated, however with a smaller beam diameter. Collimated light rays incident on
the longer focal length positive lens at an angle with respect to the optical axis will
emerge from the shorter focal length negative lens still collimated but at a larger angle
with respect to the optical axis. The reverse Galilean design maintains the
requirement for coincident focal points, but the negative element is the front element
this time. In the reverse configuration, collimated light rays parallel to the optical axis
will have a larger diameter exiting the afocal attachment. Collimated light rays
incident on the shorter focal length negative lens at an angle with respect to the
optical axis will emerge from the longer focal length positive lens of the afocal
attachment still collimated but at a smaller angle with respect to the optical axis [4].
This key feature effectively transforms wide angle inputs into narrower angle outputs
that can be imaged by the optical system of the wafer level camera.
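The angular magnification of such an attachment can be estimated from numbers quoted in this chapter; the sketch below (a paraxial approximation only) computes it two ways, and the difference between the two estimates reflects the fact that a real attachment is not distortion free:

import math

f_camera_mm = 1.15   # wafer level camera focal length (Table 6-1)
f_system_mm = 0.72   # system focal length with the attachment (Sec. 6.2)
m_focal = f_system_mm / f_camera_mm                 # ≈ 0.63

half_in = math.radians(100 / 2)    # expanded diagonal half-angle
half_out = math.radians(64 / 2)    # native diagonal half-angle
m_field = math.tan(half_out) / math.tan(half_in)    # ≈ 0.52

print(m_focal, m_field)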
Since the base camera (without the afocal attachment) was integrated at the
wafer level, it is not possible to test the performance of the isolated optical system. In
order to test the camera system performance, the spatial frequency response (SFR)
method was used [5, 6]. An ISO 12233 test target was imaged by the wafer level
camera, and slanted edge regions of the target were used to calculate the spatial
frequency response. The spatial frequency response of the OmniVision OVM7692
wafer level camera is shown in Figure 6-4 for both center and edge regions. Even
though the wafer level camera has a single lens, it can provide high contrast images.
The SFR at the center is greater than 0.25 up to the Nyquist frequency, and the SFR in
the edge regions is greater than 0.2 up to about 180 line pairs/mm. While capturing
the test images, default on-chip image sharpening was enabled, which increases the
modulation of some spatial frequencies (from about 50 line pairs/mm to about
250 line pairs/mm).
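A greatly simplified sketch of the slanted-edge SFR computation is given below for illustration only; a conforming ISO 12233 implementation fits the edge angle and projects pixels into a 4× oversampled edge spread function, whereas this version simply averages rows, which is only reasonable for a nearly vertical edge:

import numpy as np

def simple_sfr(edge_roi, pixel_pitch_mm):
    esf = edge_roi.astype(np.float64).mean(axis=0)  # edge spread function
    lsf = np.diff(esf)                              # line spread function
    lsf *= np.hanning(lsf.size)                     # suppress edge noise
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                   # normalize to DC
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)  # lp/mm
    return freqs, mtf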
Figure 6-4. Spatial frequency response (SFR) analysis of the OmniVision
OVM7692 wafer level camera in the center and edge regions.
6.2 Custom Afocal Attachment Design
A custom reverse Galilean afocal attachment for the OVM7692 that expands
the field of view of the wafer level camera from a 64° diagonal field of view to a 100°
diagonal field of view was designed. This afocal attachment consists of one negative
and one positive element, both with spherical surfaces. During optical design
optimization of this afocal attachment, three sample wavelengths were used to cover
the visible spectrum, 630 nm for red, 540 nm for green, and 470 nm for blue. The
final focal length of the system (the afocal attachment integrated with the wafer level
camera) is only 0.72 mm, and the optical system operates at f/3.
The total length of the afocal attachment is just 2.8 mm, and the lenses in the
afocal attachment have a 2 mm diameter. The system (the combination of the afocal
attachment and the pixel size limited lens model of the OVM7692 wafer level camera)
has close to diffraction limited performance for all field angles, as shown in
Figure 6-5 (Right). The specifications for the custom afocal attachment are listed in
Table 6-3.
Table 6-3. Specifications of the custom afocal attachment designed for the
OVM7692 wafer level camera. All dimensions are in mm. The thickness
parameter is the distance between the surface indicated and the following
surface.
Optical Element   Radius of Curvature   Thickness   Glass Material    Semi-Diameter
First Lens        −3.0462               0.5         NLAK33B-SCHOTT    1.0
                  1.8271                0.2                           1.0
Second Lens       3.2351                1.2963      LF5-SCHOTT        1.0
                  −1.9670                                             1.0
Figure 6-5. (Left) Schematic diagram of the custom afocal attachment designed
for the OVM7692 wafer level camera. Note that the exit rays are parallel for each
field angle, as desired. The reference wavelength for the diagram is 540 nm.
(Right) Modulation transfer function (MTF) plot for the combination of the
custom afocal attachment and the pixel-size-limited optical model of the
OVM7692 wafer level camera. The reference wavelengths for the MTF plot are
630 nm, 540 nm, and 470 nm. The MTF contribution at 540 nm is weighted twice
as much as the 630 nm and 470 nm MTF contributions in the calculation.
6.3 Custom Afocal Attachment Design with COTS Elements
A similar afocal lens to the one shown in Figure 6-5 was designed and
implemented, but this time using COTS elements, as shown in Figures 6-6 and 6-7.
This afocal design expands the field of view of the wafer level camera from a 64°
diagonal field of view to a 100° diagonal field of view as before, and consists of one
equiconcave and one plano-convex element as shown in Figure 6-6. The final focal
length of the system (the afocal attachment integrated with the wafer level camera)
is only 0.75 mm, and the optical system operates at f/3.
The total length of the afocal attachment implemented with COTS lens
elements is just 2.4 mm, and both COTS lenses in the afocal attachment have a 2 mm
diameter. The integrated system has a 4.9 mm total length, allowing the overall
camera system to be highly compact. The specifications for the afocal attachment
with COTS elements are listed in Table 6-4.
Figure 6-6. (Left) Schematic diagram of the custom designed afocal attachment
with COTS elements. The reference wavelength for the diagram is 540 nm.
(Right) Modulation transfer function (MTF) plot for the combination of the
custom afocal attachment with COTS elements and the pixel-size-limited optical
model of the OVM7692 wafer level camera. The reference wavelengths for the
MTF plot are 630 nm, 540 nm, and 470 nm. The MTF contribution at 540 nm is
weighted twice as much as the 630 nm and 470 nm MTF contributions in the
calculation.
Table 6-4. Specifications of the custom afocal attachment with COTS optical
elements. All dimensions are in mm. The thickness parameter is the distance
between the surface indicated and the following surface.
Optical Element         Radius of Curvature   Thickness   Glass Material   Semi-Diameter
OptoSigma SLM-02B-02N   −3.5                  0.7         LASF9-SCHOTT     1.0
                        3.5                   0.15                         1.0
OptoSigma SLM-02-03P    Infinity              1.3         LASF9-SCHOTT     1.0
                        −2.55                                              1.0
Figure 6-7. (Left) As-fabricated custom afocal attachment with COTS elements
integrated in a housing for field of view expansion of a wafer level camera.
(Right) The afocal attachment integrated with the OVM7692 wafer level
camera development kit. The afocal attachment is placed on top of the wafer
level camera and is held in place behind the round silver mount shown in the
photo.
In order to compare the imaging quality of the camera with and without the
COTS wide angle attachment, the same ISO 12233 target was imaged, and the same
regions were used to extract SFR information following dewarping when integrated
with the afocal attachment (as shown in Figure 6-8). While capturing the test images,
default on-chip image sharpening was enabled, which increases the modulation of
some spatial frequencies.
This SFR analysis shows that integration of the afocal attachment with the
wafer level camera widens the field of view but only slightly degrades the image
quality in the center of the field of view and also at the edges. A set of images captured
by the OmniVision OVM7692 wafer level camera both with and without the custom
reverse Galilean afocal attachment is provided in Figures 6-9 and 6-10. The
expansion of the field of view to a much wider field of view is evident, as is the
preservation of the high image quality.
Figure 6-8. Spatial frequency response (SFR) analysis of the wafer level camera
with and without the afocal attachment.
If a spatial frequency response of 0.3 is considered (extracted from the SFR
plot shown in Figure 6-8), then the (optical) resolution of the camera is estimated to
be 332 × 249 pixels. However, as shown by the SFR plot in Figure 6-8, the camera
performance is mostly preserved with the addition of the afocal attachment. As a
consequence, higher resolution images could be achieved if a higher resolution wafer
level camera is used as the base system for the afocal attachment.
Figure 6-9. Image of ISO 12233 test target captured with the OVM7692 wafer
level camera. The image is contrast enhanced to show detail.
6.4 Summary
Wafer level cameras, in which miniature lenses and image sensor arrays are
integrated at the wafer level, allow for very-low-cost miniature imaging systems.
Afocal lens attachments can be designed to modify the imaging characteristics of such
fixed-lens cameras. In this chapter, a custom reverse Galilean afocal attachment was
designed to expand the field of view of a wafer level camera. This custom design was
then converted to a design using commercial off-the-shelf lens elements, and this
latter design was implemented. With this expander, a very-low-cost ultraminiature
wide angle camera (4.9 mm camera height including the image sensor array) was
realized. Such a camera can be easily integrated on eyeglasses for wearable visual aid
applications, and also as the scene camera in an eye-tracked extraocular camera
system. The designed afocal adapter can be used with other miniature cameras with
existing lenses that have similar sized image sensor arrays and similar fields of view
to the wafer level camera used. As explained in Chapter 7, this field of view expander
was also integrated with a second off-the-shelf camera module to implement a
miniature eye-tracking system.
Figure 6-10. (Top) Image of the same target as shown in Figure 6-9 captured
with the designed afocal attachment integrated with the wafer level
camera. (The test target is brought closer to the camera to capture the whole
test target, as the field of view has been significantly increased). (Bottom) The
image on the top following software dewarping. The images are contrast
enhanced to show detail.
Chapter 6 References
[1] R. Fraux, “OmniVision's VGA Wafer-Level Camera”, 3D Packaging, Yole
Development, February 2012, pp. 26-27, 2012.
[2] OmniVision OVM7692 640 × 480 CameraCubeChip™ Product Brief. Available
online: http://www.ovt.com
[3] M. Laikin, Lens Design, 3rd Ed. New York: Marcel Dekker, Inc., p.37, 2001.
[4] W. J. Smith, Modern Optical Engineering, 3rd Ed., New York: McGraw-Hill,
p. 470, 2000.
[5] Photography - Electronic Still Picture Cameras - Resolution Measurements,
ISO Standard 12233:2000.
[6] P. Burns, “Slanted-Edge MTF for Digital Camera and Scanner Analysis”,
Proceedings of IS&T Image Processing, Image Quality, Image Capture, Systems
Conference, pp. 135-138, 2000.
Chapter 7
EYE-TRACKED EXTRAOCULAR CAMERA
One method to restore foveation for retinal prosthesis patients is with an
eye-tracked extraocular camera (ET-EOC) [1-6], as described in Chapter 1. In the
ET-EOC approach, an image subregion corresponding to the direction of gaze
(determined by the eye-tracking system) can be extracted from the wide field of view
scene image captured by the extraocular camera, and then used to form a
corresponding microelectrode array stimulation pattern.
Figure 7-1. Retinal prosthesis with an eye-tracked extraocular camera;
modified after [1].
The eye-tracked extraocular camera has three main hardware components, as
shown in Figure 7-1: (1) a wide field of view scene camera mounted on eyeglasses,
(2) a miniature eye-tracking camera, operating in the near infrared region and
pointed towards the eye, and (3) a battery-operated, belt-worn computer for gaze
extraction and scene camera region selection. A prototype eye-tracked extraocular
camera was designed, fabricated, and integrated with custom associated software,
using the wide field of view imaging systems described in previous chapters in
conjunction with commercially available components.
7.1 Wide Field of View Scene Camera
Two COTS wide field of view lens designs were previously developed and
implemented, as presented in Chapter 4 (EO-Lens) and Chapter 5 (LP-Lens). The
LP-Lens was selected as the lens for the scene camera in the ET-EOC due to its more
compact size and wider field of view. This lens provides a 110° diagonal field of view,
and was integrated with a wide dynamic range camera module from Leopard
Imaging, the LI-USB30-M034WDR [7]. The integrated wide field of view scene
camera (as shown in Figure 7-2) outputs a rectilinear image after dewarping is
implemented on the image processing chip (Aptina AP0100) that is part of the camera
module.
Figure 7-2. Scene camera used in the designed and implemented eye-tracked
extraocular camera prototype. The scene camera comprises the
LI-USB30-M034WDR camera module from Leopard Imaging [7] and the custom
designed and implemented LP-Lens, as described in Chapter 5.
The current generation Argus II retinal prosthesis device includes a 6 × 10
electrode array covering about a 20° diagonal field of view in the central region of the
retina [8]. Next generation higher resolution implants will likely have 12 × 20 or
24 × 40 electrode arrays, again covering about a 20° diagonal field of view. The
ET-EOC is designed to provide a subregion of the wide field of view scene camera
image in the direction of the patient’s gaze to the implanted electrode array. The field
of view of the selected subregion should therefore match the field of view of the
implanted electrode array, namely approximately a 20° diagonal field of view for the
Argus II or next generation retinal prosthesis devices. The digital resolution of the
subregion image should also be adjusted to match the resolution of the implanted
electrode array.
The ET-EOC scene camera outputs a digital image with a 1280 × 960 pixel
resolution and a 110° (full) diagonal field of view. After 6 × 6 binning of the pixels of
the captured scene image, an output scene image of 213 × 160 pixel resolution can be
achieved, in which a 40 × 24 pixel subregion (matching the resolution of next
generation high resolution retinal prostheses) corresponds to about a 20° diagonal
field of view. In order to achieve the reduced resolution of the scene camera for this
demonstration, the output resolution of the scene camera was first reduced to
640 × 480, and further binning was achieved through software processing.
Additional pixel binning can also be implemented for the current generation lower
resolution retinal prostheses.
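A minimal sketch of this software binning step (assuming a grayscale image; the function name is illustrative) follows:

import numpy as np

def bin_image(image, factor):
    # Crop so each dimension divides evenly, then average each
    # factor x factor block into one super pixel. A 1280 x 960 frame
    # binned 6 x 6 (or a 640 x 480 frame binned 3 x 3) yields 213 x 160.
    h = image.shape[0] - image.shape[0] % factor
    w = image.shape[1] - image.shape[1] % factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3)).astype(image.dtype)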
In order to achieve near real time frame rates, the image processing functions
performed for the captured scene images were minimized in order to reduce the CPU
load. Flat-field correction was performed only for the subregion corresponding to the
gaze direction. Lateral chromatic aberration correction was not performed, as
captured test images in real-life environments show that the effect of lateral
chromatic aberration is only significant at the corners, as shown in Figure 7-3.
Figure 7-3. (Left) An indoor hallway scene captured with the ET-EOC scene camera.
(Right) An outdoor scene captured with the ET-EOC scene camera.
7.2 Eye-Tracking Camera
Of the many different eye-tracking techniques that have been reported to date
[9], probably the least obtrusive technique is based on capturing the location of the
eye with a video camera. A miniature eye-tracking camera was built by relying on the
different camera designs that we have developed, rather than using a bulky
off-the-shelf eye-tracking system. The miniature field of view expander that was
presented in Chapter 6 (implemented with COTS lens elements, as shown in Figures
6-6 and 6-7) was tested with a wafer level camera development kit. However, a
miniature camera module with this particular wafer level camera and easy computer
interfacing options (i.e., either USB or FireWire) could not be found. Therefore, an
off-the-shelf USB camera module, an HBV-1306 from Huiber Vision Technology Co.
[10], was modified and integrated as the eye-tracking camera for the ET-EOC system.
The HBV-1306 camera module is a single miniature printed circuit board containing
a 1600 × 1200 resolution image sensor array, the HM2057 from Himax Imaging [11],
a miniature screw-on lens, and all necessary support and interface electronics. With
a custom cable, this module can be directly connected to a computer through the USB
interface (USB 2.0). The module measures 60 mm by 8 mm and is only 5 mm thick,
as shown in Figure 7-4. The integrated lens on the module provides a 50° horizontal
field of view.
Figure 7-4. HBV-1306 USB camera module shown next to a U.S. penny.
The eye-tracking camera module must be placed at a sufficient distance in
order to capture the entire eye. In order to mount the camera close enough to the eye
for unobtrusive incorporation on a pair of eyeglasses and yet at the correct distance
to be able to capture the entire eye, the original 50° horizontal field of view of the
camera must be expanded.
In order to use this camera as the eye-tracking camera, several modifications
were made. Initially, the infrared-block filter in the lens was removed. The reverse
Galilean field of view expander presented in Chapter 6 (implemented with COTS lens
elements, as shown in Figures 6-6 and 6-7) was integrated on top of the module’s
screw-on lens, with the last lens of the field of view expander approximately touching
the front surface of the camera screw-on lens. With this field of view expander, the
horizontal field of view was expanded to 92°.
Typically the pupil of the eye has the highest contrast relative to the surrounding
areas when imaged in the near infrared wavelength region. Even though the
eye-tracking camera has a color image sensor array (the HM2057), without an
infrared-block filter the Bayer color filters typically provide some light transmission
at near infrared wavelengths (as can be seen in Figure 4-4 for wavelengths longer
than approximately 750 nm for a typical color image sensor array). In order for the
eye-tracking camera to operate in the near infrared region only, an infrared-pass
filter with the passband starting at 730 nm was installed. With ambient infrared
illumination, this allows the pupil of the eye to appear black, with high contrast
relative to surrounding areas. The eye-tracking camera was operated at 640 × 480
resolution to achieve high frame transfer rates with the computer interface.
7.3 Belt Worn Computer
A small, energy-efficient, battery-powered computer is needed for both
ET-EOC tasks: gaze estimation and region selection from the scene camera image.
Instead of a custom solution, a newly-introduced COTS computer, the Ainol Mini PC
(shown in Figure 7-5), was selected. The Ainol Mini PC has a 64-bit, quad-core Intel®
processor (the Z3735F) with a 1.33 GHz CPU clock frequency, 2 GB of RAM, and
32 GB of flash memory storage. In addition, it comes with Windows 8.1 preinstalled.
The computer measures 115 mm × 146 mm × 14 mm, weighs only 335 grams, and
has both a built-in 7000 mAh battery and a fanless passive cooling system [12].
Figure 7-5. The Ainol Mini PC, a small computer with a built-in battery that was
used as the real time image processor in the eye-tracked extraocular camera.
Even though this computer incorporates one of Intel’s lower-end Atom™
processors, the small package, built-in battery, fanless operation, and availability of
the Windows platform for easy programming made this computer an attractive
choice as the central processing element for the first ET-EOC prototype.
7.4 Software Development
The software for the ET-EOC was developed in the Python programming
language. For the video capture and image processing functions, OpenCV [13] and
SimpleCV [14] libraries were used. The software is composed of two parts,
calibration and operation.
In the calibration phase, the (sighted) user wearing the ET-EOC system is
seated in front of a computer display at sufficient distance such that the computer
display fills most of the field of view of the scene camera. No physical headrest is used
during calibration, and therefore the user is asked to keep their head as still as
possible. The user is initially shown a gaze target (a red circle with a yellow circle in
the middle) on the computer display and is prompted to fixate on the center of the
gaze target, as shown in Figure 7-6. The scene camera captures the image of the
display in the field of view, and the pixel location of the yellow circle in the captured
scene image is extracted in the software. The image from the eye-tracking camera is
also captured, and the gaze direction is estimated by finding the pixel location of the
center of the dark pupil in the eye-tracker image, as shown in Figure 7-6. The pixel
location of the pupil’s center is estimated by intensity thresholding the green channel
of the captured (color) eye-tracker image, and finding the center location of the
largest black circle in the image. The green channel of the eye-tracker image was
experimentally determined to have the highest contrast of the eye pupil to the
surrounding structures as compared to the other color channels in the image.
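A minimal sketch of this dark-pupil detection step, using the OpenCV library employed in the ET-EOC software, is shown below; the threshold value is illustrative and would be tuned to the ambient infrared illumination level, and the centroid of the largest dark blob stands in for the center of the largest black circle:

import cv2

def find_pupil_center(eye_frame_bgr, threshold=40):
    green = eye_frame_bgr[:, :, 1]
    # Pupil pixels are dark: keep everything below the threshold.
    _, mask = cv2.threshold(green, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)   # largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])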
Figure 7-6. Eye-tracked extraocular camera (ET-EOC) images captured during
the calibration phase. (Left) Green channel of the captured eye-tracking
camera image. The location of the pupil is correctly identified by the software,
as shown with an overlaid red square around the pupil. (Right) Captured scene
camera image. The location of the target is correctly identified by the software,
as shown with an overlaid red square.
The user is then shown, and prompted to fixate on, eight additional gaze
target locations. With each different gaze target location, the gaze target pixel
location in the scene camera image and the corresponding fixated pupil pixel location
in the eye-tracker image are determined. One of the gaze target locations is at the
center of the computer display. This gaze target pixel location in the scene camera
image and the corresponding fixated pupil pixel location in the eye-tracker image are
defined as the scene camera calibration center (sc_x, sc_y) and the eye-tracker
calibration center (pc_x, pc_y), respectively. These calibration centers define the four
separate calibration quadrants (shown as Q1 through Q4 in Figure 7-7) in the
respective images.
Figure 7-7. Schematic representation of the eye-tracker and the scene camera
images. The coordinates p_x and p_y are the x and y pixel locations of the pupil
center in the eye-tracker image (shown with the green dot), while pc_x and pc_y
are the x and y pixel locations of the eye-tracker calibration center in the
eye-tracker image (shown with the red dot). The coordinates s_x and s_y are the x
and y pixel locations of the gaze target center in the scene camera image (shown
with the purple dot), while sc_x and sc_y are the x and y pixel locations of the scene
camera calibration center in the scene camera image (shown with the blue dot).
Four quadrants in both images are also shown.
Once all of the calibration gaze target pixel locations in the scene camera image
and the corresponding fixated pupil pixel locations in the eye-tracker image are
collected, a 2nd order polynomial fit is determined relating the x and y pixel
locations of the gaze targets in the scene camera images to the x and y pixel locations
of the pupil centers in the eye-tracker images, as given in Eq.'s 7-1 and 7-2:

(s_x − sc_x) = a_2 (p_x − pc_x)² + a_1 (p_x − pc_x) + a_0    (7-1)

(s_y − sc_y) = b_2 (p_y − pc_y)² + b_1 (p_y − pc_y) + b_0    (7-2)

in which s_x and s_y are the x and y pixel locations of the gaze target center in the scene
camera image, sc_x and sc_y are the x and y pixel locations of the scene camera
calibration center in the scene camera image, p_x and p_y are the x and y pixel locations
of the pupil center in the eye-tracker image, and pc_x and pc_y are the x and y pixel
locations of the eye-tracker calibration center in the eye-tracker image. During
calibration sc_x, sc_y, pc_x, and pc_y are determined, as well as eight sets of values for
s_x and s_y with corresponding values for p_x and p_y. This data is utilized to calculate
the polynomial coefficients (a_2, a_1, and a_0 for the x pixel locations, and b_2, b_1, and
b_0 for the y pixel locations). A separate polynomial is calibrated for each individual
quadrant (Q1 through Q4) set by the calibration centers, as shown in Figure 7-7.
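A sketch of the per-quadrant fit using NumPy's polynomial routines is given below (the function and variable names are illustrative; the production ET-EOC code may differ in detail):

import numpy as np

def fit_quadrant(pupil_xy, scene_xy, pc, sc):
    # pupil_xy, scene_xy: N x 2 arrays of corresponding calibration
    # points falling in one quadrant; pc, sc: calibration centers.
    px = pupil_xy[:, 0] - pc[0]
    py = pupil_xy[:, 1] - pc[1]
    sx = scene_xy[:, 0] - sc[0]
    sy = scene_xy[:, 1] - sc[1]
    a = np.polyfit(px, sx, 2)   # a_2, a_1, a_0 of Eq. 7-1
    b = np.polyfit(py, sy, 2)   # b_2, b_1, b_0 of Eq. 7-2
    return a, b

def predict_scene_point(p, pc, sc, a, b):
    # Map a pupil pixel location p to its scene camera pixel location.
    return (np.polyval(a, p[0] - pc[0]) + sc[0],
            np.polyval(b, p[1] - pc[1]) + sc[1])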
In the operation phase, the gaze direction is first estimated from the
eye-tracking camera. The corresponding pixel location in the captured scene image
is calculated using the 2nd order polynomials obtained during the calibration
phase, with Eq.'s 7-1 and 7-2. The rectangular 40 × 24 pixel region of the scene
camera image (after resizing the scene camera image as described in Section 7.1)
around the calculated pixel location is then identified.
The proof-of-concept ET-EOC, as shown in Figure 7-8, was developed for quick
testing and benchmarking with a sighted individual. However, the system can easily
be adapted for retinal prosthesis patients. During the calibration phase, a system that
can provide localized sound or tactile feedback [4] as the gaze target could be
developed instead of providing a visual gaze target on a display. The system output
(a portion of the scene image in the direction of gaze) could then be directly interfaced
to the existing image processing functions that are used to properly condition the
video stream for the implanted microstimulator array.
Figure 7-8. Integrated eye-tracked extraocular camera (ET-EOC), showing both
the scene camera and the eye-tracking camera mounted on a pair of clear safety
glasses.
7.5 Testing and Benchmarking
The developed ET-EOC system was tested and benchmarked with software
running both on an iMac desktop computer (running Windows 8.1 Pro) and also on
the Ainol Mini PC. For the Ainol Mini PC implementation, a 23” LCD screen was used
as the display, which was placed close to the user such that the computer display fills
most of the field of view of the scene camera. Specifications for the two ET-EOC
computers used for benchmarking are listed in Table 7-1.
Table 7-1. Specifications for the Ainol Mini PC and the iMac desktop computer
used for benchmarking the ET-EOC software.
                                Ainol Mini PC               iMac Computer
Central Processing Unit (CPU)   Intel Atom Z3735F           Intel Core i5 3470
CPU Clock Frequency             1.33 GHz (up to 1.83 GHz)   3.2 GHz (up to 3.6 GHz)
CPU Cores                       4                           4
Random Access Memory            2 GB                        24 GB
Display Size for Testing        23"                         27"
Operating System                Windows 8.1 Pro             Windows 8.1 Pro
Eye-tracking systems typically have integrated infrared light sources for
sufficient illumination of the eye. These light sources must be carefully designed to
meet eye safety standards. However, for the prototype ET-EOC, infrared light sources
were not integrated into the eyeglasses, as the light sources would need to be
extensively tested for safety. Therefore, two desk lamps with incandescent light bulbs
were used in order to achieve sufficient ambient infrared illumination.
Benchmarking results with both the Ainol Mini PC and the iMac desktop
computer are summarized in Table 7-2. With the Ainol Mini PC, the system runs at
19 frames per second (fps), and it runs at 31 fps on the iMac desktop computer. The
system overhead is primarily due to capturing and transferring of both scene camera
and eye-tracking camera video frames to the computer, and in particular is limited by
the slower frame transfer rate of the eye-tracking camera. Testing the frame rate of
the scene camera (capturing and displaying frames on the computer screen, not with
the ET-EOC software) shows that frame rates up to 48 fps can be achieved. The
maximum frame rate for the eye-tracking camera (under the same testing conditions)
is only 33 fps. This limitation could be alleviated by using an eye-tracking camera that can
achieve higher frame rates with a faster interface (e.g., USB 3.0). However, as
explained above, due to the miniature package size and the standard (USB 2.0)
interface, the HBV-1306 camera was selected for the eye-tracking module.
Table 7-2. Benchmarking results for the eye-tracked extraocular camera
(ET-EOC) software running on a belt worn computer and on a desktop
computer.
                                     Ainol Mini PC   iMac Computer
Frame Capture (total transfer time
  for the scene camera and the
  eye-tracking camera)               31.3 msec       31.3 msec
Processing                           20.3 msec       1 msec
Total time per frame                 51.6 msec       32.3 msec
Frames per Second                    19 fps          31 fps
After the calibration step, the accuracy of the system was tested by presenting
four new gaze target locations to the user. The center locations of the gaze targets in
the captured scene camera images as predicted by the system, the real center
locations of the gaze targets in the captured scene camera images, and the estimation
errors are presented in Table 7-3. An average error of 14.4 pixels in predicting the
center locations of the targets was achieved with the ET-EOC system, corresponding
to a 2.9° average error within the 110° (full) diagonal field of view of the scene image.
These values were obtained for the 640 × 480 pixel resolution scene image (i.e.,
before further resizing in software).
Table 7-3. Accuracy testing results for the eye-tracked extraocular camera
(ET-EOC). Pixel locations for the centers of the targets, locations predicted by
the software after calibration, and the differences in image coordinates
between these two values are listed, as well as the distance between the two
pixel locations (Error). All values are in units of pixels.
           Center            Location Predicted   Difference   Error
           Location (x, y)   by ET-EOC (x, y)     (x, y)
Target 1   415, 364          405, 352             10, 12       15.6
Target 2   489, 225          497, 215             8, 10        12.8
Target 3   197, 222          195, 207             2, 15        15.1
Target 4   314, 364          307, 352             7, 12        13.9
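The Error column is the Euclidean distance between the actual and predicted center locations, and can be reproduced directly from the table entries:

import numpy as np

actual = np.array([[415, 364], [489, 225], [197, 222], [314, 364]])
predicted = np.array([[405, 352], [497, 215], [195, 207], [307, 352]])
errors = np.linalg.norm(actual - predicted, axis=1)
print(errors.round(1))          # [15.6 12.8 15.1 13.9]
print(round(errors.mean(), 1))  # 14.4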
7.6 Summary
A prototype wearable eye-tracked extraocular camera system for retinal
prostheses was developed and implemented. This system provides one means of
allowing for the restoration of foveation capability in retinal prosthesis patients.
Optical system designs for both the wide field of view scene camera and the wide field
of view eye-tracking camera were developed, implemented, and integrated into the
prototype wearable system. Associated software for the designed hardware system
was also developed and benchmarked to show real time performance with a desktop
computer and near real time performance with a belt-mounted computer.
The prototype ET-EOC system uses off-the-shelf imaging components (camera
modules) for quick and low-cost proof-of-concept demonstration. Typically, the
optical system is the bulkiest element in a digital camera. In this particular case,
miniaturization of both cameras is achieved through custom designed and
implemented miniature optical systems. Further miniaturization of the total package
can be achieved by designing custom printed circuit boards containing the image
sensor array and interface electronics, both for the eye-tracking camera and also for
the scene camera. Flexible printed circuit boards could also be designed for optimal
packaging form factors.
Figure 7-9. The implemented prototype eye-tracked extraocular camera
(ET-EOC) as worn for field testing. The camera feeds are routed through two
separate USB cables to the belt-mounted computer.
Chapter 7 References
[1] N. R. B. Stiles, B. P. McIntosh, P. J. Nasiatka, M. C. Hauer, J. D. Weiland, M. S.
Humayun, and A. R. Tanguay, Jr., “An Intraocular Camera for Retinal
Prostheses: Restoring Sight to the Blind”, Chapter 20 in Optical Processes in
Microparticles and Nanostructures, Advanced Series in Applied Physics,
Volume 6, A. Serpenguzel and A. Poon (Eds.), Singapore: World Scientific,
2010, pp. 385-429.
[2] F. E. Sahin, B. P. McIntosh, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “Design of a Compact Wide-Field-of-View Camera for
Retinal Prostheses”, 2013 Annual Meeting of the Association for Research in
Vision and Ophthalmology (ARVO), Seattle, Washington, May 5, 2013;
Investigative Ophthalmology and Visual Science, Vol. 54, No. 15, p. 1068,
June 2013.
[3] F. E. Sahin, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and A. R. Tanguay, Jr.,
“Optimal Design of Miniature Wide-Angle Computational Cameras for Retinal
Prostheses and Wearable Visual Aids”, 2014 OSA Frontiers in Optics, Art. No.
FTu5F.1, Tucson, AZ, October 2014.
[4] B. P. McIntosh, “Intraocular and Extraocular Cameras for Retinal Prostheses:
Effects of Foveation by Means of Visual Prosthesis Simulation”, Ph.D. Thesis,
University of Southern California, 2015.
[5] F. E. Sahin, B. P. McIntosh, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “Eye-Tracked Extraocular Camera for Retinal Prostheses”,
2015 OSA Frontiers in Optics, Art. No. FTu2C.3, San Jose, CA, October 2015.
[6] A. Caspi, A. Roy, G. Consedai, R. Greenberg, A. Safran, and J.-A. Sahel, “Retinal
Prosthesis − Steering the Line of Sight with Eye Movements”, 36th Annual
International Conference of the IEEE Engineering in Medicine and Biology
Society, 2014.
[7] Leopard Imaging LI-USB30-M034WDR. http://www.leopardimaging.com
[8] J. D. Dorn, A. K. Ahuja, A. Caspi, L. da Cruz, G. Dagnelie, J. Sahel, R. J. Greenberg,
M. J. McMahon, and Argus II Study Group, “The Detection of Motion by Blind
Subjects with the Epiretinal 60-Electrode (Argus II) Retinal Prosthesis”, JAMA
Ophthalmology, Vol. 131, No. 2, pp. 183-189, February 2013.
[9] A. Duchowski, Eye Tracking Methodology: Theory and Practice, Springer
Science & Business Media: London, Part 2, 2003.
[10] Huiber Vision Technology Co. Ltd. http://www.hbvcamera.com
[11] Himax Imaging Inc. http://himaximaging.com
[12] Ainol Mini PC. http://www.ainol-novo.com/ainol-mini-pc-black.html
[13] OpenCV (Open Source Computer Vision). http://opencv.org
[14] SimpleCV. http://simplecv.org
Chapter 8
ULTRAMINIATURE INTRAOCULAR CAMERAS
One of several possible methods to achieve natural foveation with retinal
prosthesis patients is to implant a miniature camera in the crystalline lens sac of the
patient. In this case, the output of the camera will drive the microelectrode array on
the retina after video processing on a belt worn visual processing unit (VPU). Being
in the crystalline lens sac, the intraocular camera (IOC) will move with the foveation
of the eye, and the stimulation on the retina will automatically match the gaze
direction. Extensive research and development on several generations of intraocular
cameras has been done at the Optical Materials and Devices Laboratory (OMDL) of
Prof. Armand R. Tanguay, Jr.
The optical design of the current generation camera prototype was initially
carried out by Michelle C. Hauer. In this fourth generation camera, a lightweight
polymer lens with two aspherical surfaces was designed and fabricated. This polymer
lens is 2.8 mm in diameter with a 2.22 mm center thickness and a 2.1 mm effective
focal length (when placed behind the cornea). The main motivation for having a
polymer lens instead of a higher index glass lens was the large mass difference (10 mg
vs. 75 mg) of comparable size lenses. It was not possible to meet the mass goal of
75 mg for the entire IOC with such a glass lens [1].
The image sensor array that has been used for the evaluation of the optical
system of the fourth generation IOC is the OmniVision Technologies OV6930 [2]. This
image sensor array has a 400 × 400 pixel resolution with a pixel size of 3 µm × 3 µm.
The Nyquist spatial sampling frequency corresponding to the 3 µm pixel size is
167 line pairs/mm. The lens of the fourth generation IOC was optimized to have
better than 50% modulation at 25 lp/mm, corresponding to a 32 × 32 resolution at
the image sensor, and covering an area of 0.74 mm × 0.74 mm on the image sensor
array.
For this fourth generation IOC prototype, the pixel size of the OV6930 image
sensor array (3 µm) is much smaller than the optical spot sizes (< 30 µm) at typical
field angles, and therefore the optical blur at the image plane is oversampled at the
sensor resolution. An image sensor array with larger pixels (as large as 20 µm pixels,
which would result in a Nyquist spatial sampling frequency of 25 lp/mm) could be
used, or a subset of image sensor array pixels could be digitally binned (to form a
super pixel) in order to match the resolution of the implanted electrode array.
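The Nyquist values quoted above follow directly from the pixel pitch (one line pair spans two pixels), as the following short check shows:

def nyquist_lp_per_mm(pixel_um):
    # f_Nyquist = 1 / (2 * pixel pitch), converted from µm to mm.
    return 1000.0 / (2.0 * pixel_um)

print(nyquist_lp_per_mm(3.0))    # 166.7 lp/mm for the OV6930
print(nyquist_lp_per_mm(20.0))   # 25.0 lp/mm for a 20 µm pixel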
8.1 Design Strategy for Ultraminiaturization
The fourth generation IOC design was based on the constraint to make the IOC
housing short enough (approximately 4.5 mm) to fit entirely within the crystalline
lens sac. This in turn dictated the lens diameter (in order to provide an f-number as
close to one as possible for high optical throughput) and housing diameter. The
resolution constraint on imaging was set to 32 × 32 in order to allow for multiple
future generations of intraocular retinal prosthesis microstimulator arrays,
irrespective of aspect ratio. This constraint in turn led to the specification of a spot
size of approximately 30 µm, and a corresponding 50% modulation at 25 lp/mm.
The initial concept of an ultraminiature IOC originated with a back of the
envelope calculation that there did not appear to be a physical constraint that would
limit reducing the size of the IOC based on the pixel sizes of available image sensor
arrays, which have been getting progressively smaller. Having a smaller overall
footprint for the IOC will make surgical implantation of the camera much easier. In
particular, we became interested in investigating the possibility of designing an
ultraminiature intraocular camera that could potentially be mounted within an
intraocular lens (IOL), thus further minimizing surgical complications during
implantation of the intraocular retinal prosthesis.
In order to achieve an ultraminiature IOC that provides a smaller overall
footprint as compared to the fourth generation IOC prototype, an ultraminiature IOC
that provides the required resolution for a given field of view with no or minimal
binning could be designed, instead of binning multiple pixels of the image sensor
array. As compared to the fourth generation IOC, this would result in a smaller image
on the image sensor plane, thereby reducing the size of the required image sensor
array, and also in optical designs with shorter effective focal lengths. Typically,
shorter focal length optical designs allow for shorter overall intraocular camera
lengths while still providing a similar resolution.
It should be noted that one advantage of employing super pixels (as in the
fourth generation IOC) is noise reduction through averaging and superior low light
performance, which would potentially be compromised to a degree with the minimal
binning designs. As such, the noise level and low light performance of an
ultraminiature IOC should be evaluated.
In the case of a color image sensor array with a Bayer color filter (red, green,
and blue), a single grayscale pixel value can be extracted by averaging the value from
four color pixels (Bayer color filters typically have repeating arrays of color filters,
with each array element consisting of one red pixel, one blue pixel, and two green
pixels). Therefore, a minimal binning of four color pixels is required on the image
plane to achieve single grayscale pixels, as current intraocular retinal prostheses are
designed for grayscale image sensors.
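A minimal sketch of this minimal-binning step is given below; it assumes an even-sized raw mosaic with a 2 × 2 (RGGB-style) filter arrangement, and a plain average, which weights the two green sites together with the red and blue sites:

import numpy as np

def bayer_to_gray(raw):
    # Average each 2 x 2 Bayer cell (one R, two G, one B) into a
    # single grayscale pixel; odd rows/columns are cropped.
    h = raw.shape[0] - raw.shape[0] % 2
    w = raw.shape[1] - raw.shape[1] % 2
    cells = raw[:h, :w].astype(np.float64)
    return cells.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))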
The fourth generation IOC has a single polymer (Zeonex®, n = 1.53 at
λ = 587.6 nm [1]) lens. In order to achieve the required optical power for imaging
with this lens without resorting to extreme curvatures, a high index difference at both
lens interfaces is required. This implies that the lens should operate with air on both
sides, and therefore an optical window is required to separate the lens from the
aqueous humor of the eye. The thickness of the optical window (manufactured from
fused silica) was specified to be 250 µm [1]. The optical window can be eliminated if
a lens that provides sufficient optical power with the first surface in contact with the
aqueous humor can be designed instead. This in turn is possible by using a high-index
lens material for the lens, such that the index difference at the first interface is large
even when in contact with the aqueous humor.
With ultraminiaturization, the physical size of the designed lenses can also be
much smaller as compared to the size of the fourth generation IOC lens. Such small
lenses will still be lightweight even if they are fabricated from a high density material.
This in turn allows the use of high index (typically high density) glasses as lens
materials, while still keeping the overall mass within IOC mass requirements
(approximately 75 mg for the entire IOC).
In addition to providing a high index difference, the fused silica window in the
fourth generation IOC also serves the purpose of providing biocompatibility for
long-term ocular implantation [1, 3]. A biocompatible, optically transparent, and
hermetically-sealed front element is still required for next generation IOC designs, as
certain lens materials might have non-biocompatible additives. However, with an
optical design that does not require the optical window for optical operation, a
biocompatible, optically transparent thin film coating could be applied to the front
surface of the IOC lens for the purpose of ensuring biocompatibility. Novel
biocompatible materials for hermetic packaging of implantable devices represent an
active area of research within the Optical Materials and Devices Laboratory, and have
the potential to produce optically transparent biocompatible thin film coatings.
As an alternative, if the first surface of the IOC lens is designed to be planar, a
fused silica window could potentially be optically contacted or cemented to this first
surface to provide the required biocompatibility. In this case, significant additional
mechanical support would be provided by the IOC lens, and therefore a much thinner
fused silica window could be used. Optical contacting and cementing of two or more
optical elements are common procedures in the manufacturing of multielement
doublet or triplet lenses.
The manufacturing of miniature glass lenses has been greatly advanced in
recent years. For example, Alps Electric Co. has announced the availability of
aspherical lenses as small as 1 mm × 1 mm (square in shape) for fiber collimation and
coupling applications [4]. Precision Optics Corporation can manufacture custom
spherical lenses down to 0.37 mm in diameter for microendoscopes by combining
precision grinding and polishing [5].
Given that glass lenses can be fabricated in such small sizes, the lens mass will
easily be within the requirements for the IOC. With the use of high index glasses, the
(optical) need for an optical window can be eliminated, and a much smaller overall
camera footprint (as compared to the fourth generation IOC) can be achieved.
A typical IOL is held in place by two haptic arms, with the center of the IOL
located 4.6 mm behind the posterior surface of the cornea. Various ultraminiature
IOC designs presented in this chapter are optimized such that the distance from the
posterior surface of the cornea to the center of the IOC is also 4.6 mm. The following
analysis is based on a 24 × 40 electrode arrangement of 960 electrodes for next
generation retinal prostheses (instead of a 32 × 32 electrode arrangement of 1024
electrodes as in the fourth generation IOC analysis), as this is a more likely extension
of the current 6 × 10 electrode array geometry of the Argus II [6].
8.2 Alternative Image Sensor Arrays
As mentioned above, the OmniVision OV6930 was selected as the image
sensor array for characterization of the fourth generation IOC prototype optical
system. With the potential to design smaller optical systems by using precision
machined and polished glass lenses, image sensor arrays with smaller package and
pixel sizes can be considered for further ultraminiaturization of the overall package
size of the IOC. Image sensor arrays with smaller pixels will result in camera designs
that cover the same field of view with minimal binning, shorter effective focal lengths,
and shorter overall lengths as compared to cameras designed with the OV6930 as the
image sensor array.
The Awaiba NanEye image sensor array is a digital color image sensor array
designed for endoscopy and other medical applications. This image sensor array has
a 250 × 250 resolution with a pixel size of 3 µm × 3 µm (the same pixel size as that of
the OV6930). The total package size of the camera is only 1 mm × 1 mm, and the package has only four pins, which is also ideal for IOC hermetic packaging requirements. The quoted power consumption of the camera is only 4 mW [7].
Fujikura Ltd.'s G2 is a color image sensor array designed for miniature
endoscopes, with a 170 × 170 pixel resolution and a (smaller) pixel size of
2.2 µm × 2.2 µm. The total package size of the image sensor is only
0.74 mm × 0.79 mm × 0.3 mm [8].
Another miniature image sensor array is the recently released OmniVision
OV6946. This image sensor array has a 400 × 400 resolution with a pixel size of
1.75 µm × 1.75 µm [9].
With these smaller pixel sizes, the Fujikura G2 or the OmniVision OV6946
image sensor arrays could be strong candidates for next generation ultraminiature
IOC prototypes. The Fujikura G2 image sensor array is currently offered with an integrated lens only as part of an endoscope camera package. After extensive discussions with professionals at Fujikura, we learned that the lens is integrated during the manufacturing process; it is neither removable post-packaging, nor can bare die image sensor arrays be provided. The OmniVision OV6946 had just been announced at the time of writing of this thesis. As such, the next generation ultraminiature IOC designs are based on minimal binning with 3 µm × 3 µm color pixels. Even though the Awaiba NanEye image sensor array could provide a smaller overall camera package than the OV6930 due to its smaller package size, the optical design of an ultraminiature IOC for the Awaiba NanEye is identical to one for the OmniVision OV6930, as the two sensors share the same pixel size. Therefore, the existing OV6930 platform was used for testing of the designed optical systems. A comparison of key specifications for the OmniVision
OV6930, the Awaiba NanEye, the Fujikura G2, and the OmniVision OV6946 image
sensor arrays is provided in Table 8-1.
Table 8-1. Comparison of specifications for the OmniVision OV6930, the Awaiba NanEye, the Fujikura G2, and the OmniVision OV6946 image sensor arrays.

Sensor               Resolution   Pixel Size          Package Size
OmniVision OV6930    400 × 400    3 µm × 3 µm         1.815 mm × 1.815 mm
Awaiba NanEye        250 × 250    3 µm × 3 µm         1.0 mm × 1.0 mm
Fujikura G2          170 × 170    2.2 µm × 2.2 µm     0.74 mm × 0.79 mm
OmniVision OV6946    400 × 400    1.75 µm × 1.75 µm   0.95 mm × 0.94 mm
8.3 Ultraminiature IOC with a Glass Lens
In order to minimize the size of the IOC for a given choice of image sensor array
pixel size and pitch, a lens with the shortest possible focal length that still satisfies the
resolution and field of view requirements of the implanted electrode array should be
designed. This can be achieved by covering the ±10° diagonal field of view that is
currently used for intraocular retinal prostheses with no oversampling at the image
plane. Therefore, an array of 48 × 80 color pixels is required on the image plane to
stimulate a matching set of 24 × 40 grayscale electrodes on the retina with minimal
binning, corresponding to ±8.6° horizontal and ±5.2° vertical (±10° diagonal) fields of
view. If a color image sensor array such as the OmniVision OV6930 (or the Awaiba
NanEye) with 3 µm × 3 µm pixels is considered, this corresponds to 240 µm
horizontal and 144 µm vertical image dimensions. The effective focal length for the
optical system (camera placed inside the eye, as shown in Figure 8-1) should then be
approximately 790 µm in order to map the desired angular field of view to the spatial
dimensions of the image sensor array.
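The required focal length follows directly from this mapping of field angle to image size; a minimal sketch of the calculation, using the values quoted above, is shown below:

```python
import math

pixel_pitch_um = 3.0   # OV6930 / NanEye color pixel pitch
h_px, v_px = 80, 48    # color pixels covering the electrode field

# Image dimensions on the sensor, and the corresponding half diagonal.
width_um = h_px * pixel_pitch_um     # 240 um
height_um = v_px * pixel_pitch_um    # 144 um
half_diag_um = math.hypot(width_um, height_um) / 2   # ~140 um

# Effective focal length (in the eye, including the optical power of the
# cornea) that maps the 10 degree half diagonal field onto this image.
efl_um = half_diag_um / math.tan(math.radians(10.0))
print(round(efl_um))   # ~794, i.e., approximately 790 um
```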
Optical designers typically aim to have a modulation of better than 0.5 up to
the Nyquist frequency for a good imaging system [10]. In the case of the OmniVision
OV6930 image sensor array with minimal binning, the Nyquist frequency is 83 lp/mm
(for a 6 µm × 6 µm grayscale pixel size). After multiple optimization steps, an
ultraminiature lens with two conic surfaces that meets the design goal was achieved,
as shown in Figure 8-1. A conic surface is a surface for which the conic constant (k)
is nonzero but the aspherical coefficients are zero, as defined in Eq. 5-1. The designed
lens has a modulation of better than 0.5 over the entire ±10° diagonal field of view up
to the Nyquist frequency, as shown in Figure 8-2. The lens specifications for this
ultraminiature IOC optical system design are listed in Table 8-2.
Figure 8-1. Schematic diagram of the glass lens (with two conic surfaces)
designed for the ultraminiature intraocular camera (IOC), shown in the Liou-
Brennan [11] optical model of the eye. The cover glass above the image sensor
array is also shown. The reference wavelength for the diagram is 540 nm.
Figure 8-2. Modulation transfer function (MTF) plot for the designed
ultraminiature intraocular camera lens with two conic surfaces, in the Liou-
Brennan optical model of the eye and with the object at infinity. The lens has a
modulation of 0.5 or greater up to the Nyquist frequency (83 lp/mm) at all field
angles. The reference wavelengths for the MTF plot are 630 nm, 540 nm, and
470 nm. The MTF contribution at 540 nm is weighed twice as much as the
630 nm and 470 nm MTF contributions in the calculation.
Table 8-2. Specifications of the designed custom ultraminiature IOC lens with conic surfaces. All dimensions are in mm.

Radius of Curvature   Conic Constant   Thickness   Glass Material    Semi-Diameter
0.4149                −0.7489          0.784       S-LAL18 (Ohara)   0.34
−0.8679               −16.4213         —           —                 0.34
Another desirable feature of the IOC is a very large depth of field, so that
objects at different distances will be in focus, thereby eliminating the need for a bulky
and power consuming mechanical focusing system. A large depth of field has the
added benefit of providing good images over a range of different magnifications. A
patient implanted with an intraocular retinal prosthesis and an associated IOC could
bring an object close to the eye in order to obtain a magnified yet still focused image. This would be especially useful for discerning objects with small details (e.g., numbered keys).
The largest depth of field for a camera can be achieved if the camera is focused
at the hyperfocal distance (s_hf). In this case, objects located from infinity down to half the hyperfocal distance will be in acceptable focus [12]. The hyperfocal distance s_hf is given by

s_hf ≈ f² / [d · (f/#)]                    (8-1)

in which f is the effective focal length of the system, d is the diameter of the circle of confusion, and f/# is the f-number.
The hyperfocal distance for the designed system, with the ultraminiature IOC
(with two conic surfaces) placed inside the eye (including the optical power of the
cornea) for which f = 0.79 mm, f/# = 1, and d = 6 µm, is calculated to be 10.4 cm. An
optimized lens design with better than 0.5 modulation from infinity down to half the
hyperfocal distance (5.2 cm) could not be achieved. Therefore, the performance at
short object distances was slightly sacrificed and the lens was optimized to have
better than 0.5 modulation from an object distance of 6 cm up to infinity.
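A quick check of the quoted hyperfocal distance, applying Eq. (8-1) with the parameters given above, is sketched below:

```python
def hyperfocal_mm(f_mm, f_number, coc_mm):
    """Hyperfocal distance s_hf ~ f^2 / (d * f/#), per Eq. (8-1)."""
    return f_mm ** 2 / (coc_mm * f_number)

s_hf = hyperfocal_mm(f_mm=0.79, f_number=1.0, coc_mm=0.006)
print(round(s_hf, 1))      # ~104.0 mm, i.e., 10.4 cm
print(round(s_hf / 2, 1))  # ~52.0 mm: nearest acceptably focused distance
```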
The final optimized lens is only 680 µm in diameter and 784 µm thick with an
effective focal length of 517 µm (in air), as shown in Figure 8-1. The lens is fabricated
from S-LAL18, a high index glass from Ohara Corp. with an index of refraction of 1.729
at λ = 587.6 nm. This glass is a member of Ohara's S-family of environmentally
friendly glasses, and does not contain any lead, arsenic, or other chemicals harmful to
the environment. The density of this particular glass is 4.18 g/cm
3
, and the total mass
of the lens is calculated from its shape to be only 1.19 mg [13].
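As a rough plausibility check on the quoted mass (the exact figure must be computed from the full lens sag profile), the lens can be bounded above by a solid cylinder of the same diameter and center thickness:

```python
import math

diameter_mm = 0.68          # lens diameter
thickness_mm = 0.784        # center thickness
density_mg_per_mm3 = 4.18   # S-LAL18: 4.18 g/cm^3 = 4.18 mg/mm^3

# A cylinder of the center thickness upper-bounds the volume of a lens
# whose edge is thinner than its center.
volume_mm3 = math.pi * (diameter_mm / 2) ** 2 * thickness_mm
print(round(volume_mm3 * density_mg_per_mm3, 2))  # ~1.19 mg
```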
This lens was optimized to interface directly with the aqueous humor in the
human eye, and unlike the fourth generation IOC prototype does not require a
window to separate it from the aqueous humor for proper optical imaging. This helps
reduce the overall length of the optical system to 1.3 mm, measured from the front
vertex of the lens to the image plane, including the image sensor cover glass thickness.
Although this particular optical glass does not contain any harmful chemicals, either
extensive long-term biocompatibility testing of the integrated IOC must eventually be
performed to ensure safety, or a thin-film biocompatible coating can be applied, as
explained above. The lens diameter was designed to match the size of the aperture
stop for the optical system design that is located on the front surface of the lens, thus
eliminating the need for external physical apertures that might add bulk to the
camera.
Figure 8-3. Portion of the USAF 1951 resolution target used for image
simulation of ultraminiature IOC designs. The target has a 6:10 aspect ratio,
matching the aspect ratio of the implanted electrodes of the Argus II intraocular
retinal prosthesis. Next generation retinal implants are expected to have up to
24 × 40 electrodes, which can support at most 12 line pairs/vertical height (i.e., 12 pairs of black and white lines over the vertical field of view).
The modulation transfer function (MTF) of the designed ultraminiature
intraocular camera lens (with two conic surfaces) for the object at infinity is shown
in Figure 8-2. The lens exhibits a modulation of greater than 0.5 at all field angles,
and at all spatial frequencies up to the Nyquist frequency (83 lp/mm). The IOC optical
system designs presented in this chapter provide a narrow field of view and do not
exhibit significant barrel distortion, as opposed to the wide field of view lenses
presented in previous chapters. As such, the MTF plots for optical designs are
provided instead of the SFR plots for the entire imaging system.
The USAF 1951 resolution target was used for image simulation, as shown in
Figure 8-3, with the resulting simulated image for the object placed at infinity shown
in Figure 8-4, corresponding to a diagonal field of view of ±10°. The simulated image
after resampling at a sensor resolution of 48 × 80 pixels is shown in Figure 8-5, and
the grayscale image after resampling at the electrode resolution of 24 × 40 pixels is
shown in Figure 8-6. Excellent resolution at 12 line pairs/vertical height (field of
view) is evident, as would be expected for a vertical resolution of 24 image sensor
array elements.
Figure 8-4. Image shown in Figure 8-3 simulated through the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-Brennan optical
model of the eye with the object placed at infinity. The diagonal field of view is
20°. The reference wavelengths for the image simulation are 630 nm, 540 nm,
and 470 nm, with equal weights.
Figure 8-5. Image simulation result shown in Figure 8-4 following resampling
at an image sensor resolution of 48 × 80 pixels.
Figure 8-6. Image simulation result shown in Figure 8-4 following resampling
at an electrode array resolution of 24 × 40 grayscale pixels.
Figure 8-7. Modulation transfer function (MTF) plot for the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-Brennan optical
model of the eye, with the object placed at 20 cm (approximately twice the
hyperfocal distance). The reference wavelengths for the MTF plot are 630 nm,
540 nm, and 470 nm. The MTF contribution at 540 nm is weighed twice as much
as the 630 nm and 470 nm MTF contributions in the calculation.
Figure 8-8. Image shown in Figure 8-3 simulated through the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-Brennan optical
model of the eye with the object placed at 20 cm (approximately twice the
hyperfocal distance). The diagonal field of view is 20°. The reference
wavelengths for the image simulation are 630 nm, 540 nm, and 470 nm, with
equal weights.
Figure 8-9. Image simulation result shown in Figure 8-8 following resampling
at an image sensor resolution of 48 × 80 pixels.
Figure 8-10. Image simulation result shown in Figure 8-8 following resampling
at an electrode array resolution of 24 × 40 grayscale pixels.
The modulation transfer function (MTF) of the designed ultraminiature IOC
lens (with two conic surfaces) for the object at 20 cm (approximately twice the
hyperfocal distance) is shown in Figure 8-7, and simulated images of the USAF test
target are shown in Figures 8-8, 8-9, and 8-10. In this case, the lens exhibits a
modulation of greater than 0.7 at all field angles, and at all spatial frequencies up to
the Nyquist frequency. An object distance of 20 cm was selected to represent a typical close working distance. The corresponding MTF and simulated images
for the object at 6 cm (approximately half the hyperfocal distance) are shown in
Figures 8-11, 8-12, 8-13, and 8-14. As in the case with the object at infinity, the lens
exhibits a modulation of greater than 0.5 at all field angles, and at all spatial
frequencies up to the Nyquist frequency.
Figure 8-11. Modulation transfer function (MTF) plot for the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-Brennan optical
model of the eye, with the object placed at 6 cm (approximately half the
hyperfocal distance). The reference wavelengths for the MTF plot are 630 nm,
540 nm, and 470 nm. The MTF contribution at 540 nm is weighed twice as much
as the 630 nm and 470 nm MTF contributions in the calculation.
This ultraminiature intraocular camera lens design has only a single glass lens
with two conic surfaces, is small enough to allow incorporation into an intraocular
lens for surgical implantation in the eye, and has sufficient resolution to support not
only current generation 6 × 10 microelectrode arrays, but also next generation
12 × 20 (240 electrode) and 24 × 40 (960 electrode) arrays. Spot diagrams for the
designed intraocular camera lens with two conic surfaces are shown in Figure 8-15
for three object distances.
Figure 8-12. Image shown in Figure 8-3 simulated through the designed
ultraminiature IOC lens (with two conic surfaces) in the Liou-Brennan optical
model of the eye, with the object placed at 6 cm (approximately half the
hyperfocal distance). The diagonal field of view is 20°. The reference
wavelengths for the image simulation are 630 nm, 540 nm, and 470 nm, with
equal weights.
Figure 8-13. Image simulation result shown in Figure 8-12 following
resampling at an image sensor resolution of 48 × 80 pixels.
Figure 8-14. Image simulation result shown in Figure 8-12 following
resampling at an electrode array resolution of 24 × 40 grayscale pixels.
Figure 8-15. Spot diagrams for the designed ultraminiature IOC lens (with two conic surfaces) in the Liou-Brennan optical model of the eye, with the object placed at infinity (left), at 20 cm, approximately twice the hyperfocal distance (center), and at 6 cm, approximately half the hyperfocal distance (right). The reference wavelengths for the spot diagrams are 630 nm, 540 nm, and 470 nm. The RMS spot diagram contribution at 540 nm is weighed twice as much as the 630 nm and 470 nm RMS spot diagram contributions in the calculation.
8.4 Alps Electric Miniature Lenses
Alps Electric Co. has announced the availability of ultraminiature circular as
well as square aspherical lenses (the latter only 1 mm × 1 mm in size) for optical fiber
coupling and collimation [4]. Although these lenses were designed for a different
purpose and for different wavelengths (1310 nm and 1550 nm; wavelengths in the
near infrared instead of the visible wavelength range of 450 nm to 600 nm), they
were evaluated as potential off-the-shelf components for the IOC, and for quick testing of the fifth generation (ultraminiature glass lens) IOC concept.
Several suitable fiber collimation lenses from Alps Electric Co. were modeled
in CODE V in the visible spectrum, and their potential for IOC operation both with and
without a glass window was examined. As the lenses are designed for fiber to
free-space collimation, a window to separate the lens from the aqueous humor is
required so that the lens will have air on both sides, thereby providing sufficient focal
power in the visible as well as for imaging at finite conjugates [4].
The Alps Electric FLAS0SG01A is a fiber collimation lens with a 1.12 mm
effective focal length at 1310 nm, as shown in Figure 8-16 below. This ultraminiature
lens has two aspherical surfaces, and is fabricated from Schott P-SK57 glass (n = 1.587
at λ = 587.6 nm) [14]. The lens is mounted inside a metal housing that has a 1.8 mm
outer diameter and is only 1.4 mm long. Modeling of this lens as a potential lens for
the intraocular camera operating over the visible wavelength range revealed
impressive results, as indicated by the MTF shown in Figure 8-17. Lenses with visible
range antireflection coatings were acquired. Laboratory testing of this lens with the
OmniVision OV6930 image sensor array showed equally promising results, as shown
in Figure 8-18 below.
Figure 8-16. The Alps Electric FLAS0SG01A aspherical lens shown next to a
U.S. penny.
The Alps Electric FLAS0SG01A lens was simulated (in CODE V) in the Liou-
Brennan optical model of the eye [11] with an optical window to separate it from the
aqueous humor. The total length of the optical system, including the 0.25 mm thick
fused silica optical window, is only 2 mm.
If this lens were integrated with the OV6930 image sensor array, the central
region of the integrated intraocular camera image would have a ±10.3° diagonal field
of view corresponding to the central 72 × 120 pixels. In this case, 9 (3 × 3) pixels at a
time can be digitally binned to match the resolution of next generation retinal
prostheses (24 × 40). As such, the Nyquist spatial sampling frequency is 56 lp/mm.
As shown in Figure 8-17, the camera has a greater than 0.5 modulation for almost all
fields up to 56 lp/mm when operated at f/1. Even though the Alps Electric FLAS0SG01A lens was designed for a different wavelength range and purpose, it would perform surprisingly well as the lens in an ultraminiature intraocular camera.
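The quoted sampling limits follow directly from the binned (grayscale) pixel pitch; a minimal sketch:

```python
def nyquist_lp_per_mm(pitch_um):
    """Nyquist spatial frequency for a given grayscale pixel pitch."""
    return 1000.0 / (2.0 * pitch_um)

print(round(nyquist_lp_per_mm(9.0)))  # 3 x 3 binned 3 um pixels -> ~56 lp/mm
print(round(nyquist_lp_per_mm(6.0)))  # 2 x 2 binned 3 um pixels -> ~83 lp/mm
```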
Figure 8-17. (Top) Schematic diagram of the Alps Electric FLAS0SG01A lens
shown in the Liou-Brennan optical model of the eye [11]. The reference
wavelength for the diagram is 540 nm. (Bottom) MTF plot for the Alps Electric
FLAS0SG01A lens in the Liou-Brennan optical model of the eye, with the object
placed at infinity. The reference wavelengths for the MTF plot are 630 nm,
540 nm, and 470 nm. The MTF contribution at 540 nm is weighed twice as much
as the 630 nm and 470 nm MTF contributions in the calculation.
Laboratory testing of the Alps Electric FLAS0SG01A lens also showed good
results. The lens was mounted to the OV6930 image sensor array and test images
were captured. The lip of the metal housing of the lens acts as the aperture stop of
the integrated optical system, and therefore the optical system operates at f/0.84.
Captured images exhibit good resolution over the central ±10° field of view, a result
that also agrees with the image simulation output as shown in Figure 8-18.
Figure 8-18. (Top) Image simulation through the Alps Electric FLAS0SG01A
aspherical lens operating in air at f/0.84, and with a 200 × 400 pixel image
sensor resolution. The reference wavelengths for the image simulation are
630 nm, 540 nm, and 470 nm. (Bottom) Captured image with the Alps Electric
FLAS0SG01A lens and the OV6930 image sensor array also operating in air at
f/0.84, and with a 200 × 400 pixel image sensor resolution. The central box
outlined in red dashes represents the central 20° diagonal field of view.
Even though the Alps Electric FLAS0SG01A lens performance is not quite as good as that of the custom designed lens with conic surfaces presented in Section 8.3, and the lens is not ideal for the IOC application in its currently available commercial form due to its non-removable metal housing, physical testing showed promising results. This lens is also one of the first precision-fabricated ultraminiature aspherical glass lenses in the industry, and signals a bright future for fabrication technologies for custom designed glass IOC lenses.
8.5 Ultraminiature Intraocular Camera with Spherical Lenses
Fabrication of custom designed ultraminiature aspherical glass lenses in small
quantities is not common, and as such these lenses are also very expensive. As a
low-cost, more readily available alternative, the design of ultraminiature intraocular
cameras with spherical lens elements was also examined.
Initial attempts at designing a single lens camera with spherical surfaces did
not produce good results. Either the f/# of the system had to be increased, thereby
sacrificing light throughput, or the lens did not provide sufficient resolution for next
generation, high resolution intraocular retinal prostheses.
Therefore, a custom two lens next generation ultraminiature intraocular
camera with miniature spherical lenses was designed. In order to keep lens
fabrication simple and cost-effective, each lens was limited to only one spherical
surface while keeping the other surface planar. The optical design was optimized for
the OmniVision OV6930 image sensor array.
Figure 8-19. Schematic diagram of the designed ultraminiature two lens IOC
optical system, shown in the Liou-Brennan optical model of the eye [11]. The
cover glass above the image sensor array is also shown. The reference
wavelength for the diagram is 540 nm.
Table 8-3. Specifications of the designed custom ultraminiature IOC optical system with plano convex lens elements, as shown in Figure 8-19. All dimensions are in mm.

Optical Element       Radius of Curvature   Thickness   Glass Material    Semi-Diameter
Aperture Stop         Infinity              0.025       —                 0.36
First Lens Element    Infinity              0.304       S-LAL18 (Ohara)   0.4
                      −1.941                0.01        —                 0.4
Second Lens Element   0.745                 0.626       S-LAL18 (Ohara)   0.4
                      Infinity              —           —                 0.4
The designed lens has a 0.736 mm effective focal length (in air) and provides
a ±10° diagonal field of view, covering a 48 × 80 pixel resolution at the image plane of
the OV6930 image sensor array. The lens operates at f/1 for high light throughput,
and has better than 0.46 modulation up to the Nyquist frequency of 83 lp/mm
(considering a 24 × 40 pixel resolution and a 6 µm × 6 µm grayscale pixel size) with
the object placed at infinity, as shown in Figure 8-20. Images of the USAF resolution
target simulated through the designed custom lens covering a 20° diagonal field of
view are shown in Figures 8-21, 8-22, and 8-23. The diameters of the individual lens
elements are 0.8 mm, and the total length from the front surface of the first lens to the
image plane is only 1.4 mm. Both lens elements are fabricated from S-LAL18, a high
index glass from Ohara Corp. with an index of refraction of 1.729 at λ = 587.6 nm [13].
The lens specifications are given in Table 8-3. The total mass for the lens is calculated
to be 2.9 mg. This custom two lens next generation ultraminiature intraocular
camera designed with only spherical surfaces exhibits a modulation of greater than
0.46 up to the Nyquist frequency and at all field angles up to ±10°. Its performance is
nearly as good as that of the custom aspherical lens shown in Figure 8-1. The success
of this simple design motivated the design of similar two lens ultraminiature intraocular cameras based on COTS elements, as described in detail below.
Figure 8-20. Modulation transfer function (MTF) plot for the designed
ultraminiature two lens IOC optical system (as shown in Figure 8-19) in the
Liou-Brennan optical model of the eye and with the object at infinity. The lens
has a modulation of 0.46 or greater up to the Nyquist frequency (83 lp/mm) at
all field angles. The reference wavelengths for the MTF plot are 630 nm,
540 nm, and 470 nm. The MTF contribution at 540 nm is weighed twice as much
as the 630 nm and 470 nm MTF contributions in the calculation.
Figure 8-21. Image shown in Figure 8-3 simulated through the designed
ultraminiature two lens IOC optical system (as shown in Figure 8-19) in the
Liou-Brennan optical model of the eye and with the object placed at infinity. The
diagonal field of view is 20°. The reference wavelengths for the image
simulation are 630 nm, 540 nm, and 470 nm, with equal weights.
Figure 8-22. Image simulation result shown in Figure 8-21 following
resampling at an image sensor resolution of 48 × 80 pixels.
Figure 8-23. Image simulation result shown in Figure 8-21 following
resampling at an electrode array resolution of 24 × 40 grayscale pixels.
8.6 Two Lens Intraocular Cameras with COTS Spherical Lenses
8.6.1 Initial Design
After designing the custom two lens system presented in the previous section,
a similar optical imaging system with the closest matching COTS elements was also
designed. The designed optical imaging system incorporates two plano convex lenses
from Edmund Optics, both 1 mm in diameter. The first lens has a 1.5 mm focal length,
incorporates a VIS 0° antireflection coating, and is fabricated out of Schott N-LASF9
glass (n = 1.850 at λ = 587.6 nm). The second lens is an uncoated half ball lens with a
0.97 mm focal length, and is fabricated out of Schott N-BK7 glass (n = 1.517 at
λ = 587.6 nm) [14]. The lens specifications are listed in Table 8-4, and the optical
imaging system is shown in Figure 8-24. The total length of the designed optical
imaging system (with COTS lens elements) is 1.6 mm from the front planar surface to
the image plane. The IOC with this two lens optical system was designed to directly
interface with the aqueous humor.
Figure 8-24. Schematic diagram of the designed two lens ultraminiature IOC
optical system with COTS lens elements shown in the Liou-Brennan optical
model of the eye [11]. The cover glass above the image sensor array is also
shown. The reference wavelength for the diagram is 540 nm.
Table 8-4. Specifications of the ultraminiature IOC optical system with COTS spherical lens elements, as shown in Figure 8-24. All dimensions are in mm.

Optical Element            Radius of Curvature   Thickness   Glass Material     Semi-Diameter
Aperture Stop              Infinity              0.025       —                  0.29
Edmund Optics 65275        Infinity              0.8         N-LASF9 (Schott)   0.5
                           −1.28                 0           —                  0.5
Edmund Optics 49568        0.5                   0.5         N-BK7 (Schott)     0.5
                           Infinity              0.014       —                  0.5
Image Sensor Cover Glass   Infinity              0.3         SiO2               —
Image Plane                Infinity              0           —                  0.164
The effective focal length of the optical system is only 0.66 mm (including the
optical power of the cornea), and the optical system operates at f/1. The ±10°
diagonal field of view corresponds to a diagonal image size of 233 µm at the image
plane. This means that in order to achieve 48 × 80 pixel resolution to match the
24 × 40 resolution of the implanted electrode array, a color image sensor array with
slightly smaller (2.5 µm × 2.5 µm) pixels is required. The Nyquist frequency for the
corresponding grayscale pixels (5 µm × 5 µm) is 100 lp/mm.
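The required color pixel pitch quoted above can be reproduced from the effective focal length and the field of view; a minimal sketch:

```python
import math

efl_um = 660.0                # effective focal length in the eye (cornea included)
diag_px = math.hypot(48, 80)  # color pixels along the image diagonal (~93.3)

# Diagonal image size for a +/-10 degree diagonal field of view.
diag_um = 2 * efl_um * math.tan(math.radians(10.0))
print(round(diag_um))               # ~233 um
print(round(diag_um / diag_px, 2))  # ~2.5 um required color pixel pitch
```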
The designed IOC compound lens with COTS lens elements has higher than
0.45 modulation at all field angles up to ±10°, as shown in the MTF plot of Figure 8-25.
Image simulation results with the designed lens are shown in Figures 8-26, 8-27, and
8-28.
Figure 8-25. Modulation transfer function (MTF) plot for the designed two lens
ultraminiature IOC optical system with COTS lens elements (as shown in Figure
8-24) in the Liou-Brennan optical model of the eye and with the object at
infinity. The lens has a modulation of 0.45 or greater up to the Nyquist
frequency (100 lp/mm) at all field angles. The reference wavelengths for the
MTF plot are 630 nm, 540 nm, and 470 nm. The MTF contribution at 540 nm is
weighed twice as much as the 630 nm and 470 nm MTF contributions in the
calculation.
The back focal length of this lens (0.21 mm) is shorter than those of the custom designed lenses described above (0.31 mm or longer). A typical image sensor array cover glass has an index of refraction of approximately 1.5, and therefore a lens with a back focal length of 0.21 mm can support a cover glass thickness of up to approximately 0.315 mm (= 0.21 × 1.5). The OV6930 image sensor array has a thicker cover glass, and therefore the lens cannot be properly focused with this image sensor array. Even though the lens was implemented and verified with the Wells Research OS-400-25 optical testing system, test images could not be captured. An extensive search was made for an appropriate, commercially available image sensor array offered without a cover glass, but without success.
Figure 8-26. Image shown in Figure 8-3 simulated through the designed two
lens ultraminiature IOC optical system with COTS lens elements (as shown in
Figure 8-24) in the Liou-Brennan optical model of the eye with the object placed
at infinity. The diagonal field of view is 20°. The reference wavelengths for the
image simulation are 630 nm, 540 nm, and 470 nm, with equal weights.
Figure 8-27. Image simulation result shown in Figure 8-26 following
resampling at an image sensor resolution of 48 × 80 pixels.
Figure 8-28. Image simulation result shown in Figure 8-26 following
resampling at an electrode array resolution of 24 × 40 grayscale pixels.
A custom lens housing was designed and fabricated for this lens design.
Individual lens elements were acquired and the lens was integrated. The
performance of the as-implemented lens was tested with the Wells Research
OS-400-25 lens testing system, with results as shown in Figure 8-29. The measured
MTF curves closely match the MTF performance predicted by the CODE V model of
the lens.
Figure 8-29. Tangential MTF (Top) and radial MTF (Bottom) curves for the
designed and implemented compound IOC lens with two COTS lens elements,
as shown in Figure 8-24. The MTF measurements were performed with four
LEDs covering the visible spectrum, as shown in Figure 2-15. CODE V results
were also obtained with wavelength weights derived from the plot shown in
Figure 2-15. The measurement results show close agreement to the analysis
results obtained from the CODE V model of the lens at all field angles.
[Plots: modulation (0 to 1) versus spatial frequency (0 to 100 lp/mm) for the tangential MTF (Top) and radial MTF (Bottom); simulation and measurement curves are shown on-axis and at 5° and 10° field angles.]
8.6.2 Elongated IOC Design
As described above, a suitable image sensor array for testing of the imaging
performance of the COTS design was not available. In order to have a prototype
design that would prove the viability of the two lens IOC with spherical surfaces, a
lens with a longer back focal length was designed. This elongated lens was again
implemented with COTS lens elements. The effective focal length of this lens is longer than that of the previous design with spherical COTS lens elements, and the total length is correspondingly longer. However, this particular lens design is again a very low-cost
design that can be implemented and integrated with existing and readily available
COTS image sensor arrays.
The elongated IOC optical imaging system incorporates two plano convex
lenses from Edmund Optics. The first lens is 2 mm in diameter, and the second lens
is 1 mm in diameter. The first lens has a 3 mm focal length, and the second lens has a
1.5 mm focal length. Both lenses are fabricated out of Schott N-LASF9 glass (n = 1.850
at λ = 587.6 nm) [14], and include VIS 0° antireflection coatings. The lens
specifications are listed in Table 8-5, and the optical imaging system is shown in
Figure 8-30. The effective focal length of the elongated IOC lens design is only
1.13 mm (including the optical power of the cornea), and the optical system operates
at f/1. The total length of the designed optical imaging system is only 2.3 mm from
the front planar surface to the image plane. The elongated IOC (with two spherical
COTS lens elements) was designed to directly interface with the aqueous humor in
the human eye.
Table 8-5. Specifications of the elongated ultraminiature IOC optical system with COTS spherical lens elements, as shown in Figure 8-30 below. All dimensions are in mm.

Optical Element       Radius of Curvature   Thickness   Glass Material     Semi-Diameter
Aperture Stop         Infinity              0.025       —                  0.5
Edmund Optics 65282   Infinity              0.8         N-LASF9 (Schott)   1
                      −2.55                 0.07        —                  1
Edmund Optics 65275   1.28                  0.8         N-LASF9 (Schott)   0.5
                      Infinity              0.154       —                  0.5
Using the OV6930 as the image sensor array and with 3 × 3 binning, a ±10° diagonal field of view corresponds to a grayscale resolution of 23 × 38 pixels, which almost perfectly matches the resolution of the envisioned next generation implanted electrode array; a slightly larger ±10.5° diagonal field of view corresponds to a grayscale resolution of 24 × 40 pixels, a perfect match. The Nyquist frequency for the corresponding grayscale pixels (9 µm × 9 µm) is 56 lp/mm.
The elongated IOC compound lens with COTS spherical lens elements has
better than 0.28 modulation up to the Nyquist frequency at all field angles, as shown
in the MTF plot of Figure 8-30. A custom lens housing was designed and fabricated
for this lens design. Individual lens elements were acquired and the lens was
integrated. The performance of the as-implemented lens was tested with the Wells
Research OS-400-25 lens testing system, with results as shown in Figure 8-31. The
measured MTF curves substantially match the MTF performance predicted by the
CODE V model of the lens, with only a slight degradation in performance. The
observed slight degradation possibly results from imperfect alignment and setting of
the distance between the lens elements during manual assembly into the
lens housing.
Figure 8-30. (Top) Schematic diagram of the elongated IOC lens design (with
two COTS spherical lens elements) shown in the Liou-Brennan optical model of
the eye [11]. The reference wavelength for the diagram is 540 nm. (Bottom)
Modulation transfer function (MTF) plot of the same lens in the Liou-Brennan
optical model of the eye with the object placed at infinity, and evaluated at field
angles up to ±10° (a 20° diagonal field of view). The lens has a modulation of
0.28 or greater up to the Nyquist frequency (56 lp/mm) at all field angles. The
reference wavelengths for the MTF plot are 630 nm, 540 nm, and 470 nm. The
MTF contribution at 540 nm is weighed twice as much as the 630 nm and
470 nm MTF contributions in the calculation.
Figure 8-31. Tangential MTF (Top) and radial MTF (Bottom) curves for the
elongated IOC compound lens designed and implemented with two COTS lens
elements, as shown in Figure 8-30. The MTF measurements were performed
with four LEDs covering the visible spectrum, as shown in Figure 2-15. CODE V
results were also obtained with wavelength weights derived from the plot
shown in Figure 2-15. The measurement results show substantial agreement
with the analysis results obtained from the CODE V model of the lens.
[Plots: modulation (0 to 1) versus spatial frequency (0 to 50 lp/mm) for the tangential MTF (Top) and radial MTF (Bottom); simulation and measurement curves are shown on-axis and at 5° and 10° field angles.]
Image testing of the elongated COTS lens design also showed results similar to those predicted by image simulations. The lens was mounted to the OV6930 image
sensor array and test images were captured. The captured images exhibit good
resolution over the central ±10° field of view, a result that also agrees with the image
simulation output as shown in Figure 8-32.
Figure 8-32. (Top) Image simulation through the elongated compound IOC
lens with COTS spherical elements (as shown in Figure 8-30) operating in air, and
with a 200 × 400 pixel image sensor resolution. The reference wavelengths for
the image simulation are 630 nm, 540 nm, and 470 nm. (Bottom) Captured
image with the lens (in air) and the OV6930 image sensor array operating at a
200 × 400 pixel image sensor resolution. The central box outlined in red dashes
represents the central 20° diagonal field of view.
8.7 Summary
Current generation FDA-approved retinal prosthesis devices do not allow for
foveation. The importance of restoring foveation for retinal prosthesis patients, and the feasibility of implanting an intraocular camera to achieve it, have been demonstrated within the Optical Materials and Devices Laboratory at USC through visual prosthesis simulation and several generations of intraocular camera designs.
In this chapter, novel ultraminiature intraocular camera optical system
designs are presented. These designs support high resolution with minimal pixel
binning of off-the-shelf image sensor arrays and incorporate glass lens element(s). As
an added feature, these lens designs eliminate the (optical) need for an optical
window and can be directly interfaced with the aqueous humor of the eye.
Initially, a custom designed miniature single glass lens with conic surfaces was
presented. The integrated package of the ultraminiature IOC with this lens and the
OV6930 image sensor array would be approximately 2 mm long and 3 mm in
diameter. As such, this ultraminiature IOC would fit substantially inside an
intraocular lens [15, 16]. This particular IOC also has a very large depth of field, as
objects from infinity down to about a 6 cm object distance are in acceptable focus.
The feasibility of an ultraminiature IOC with a single glass lens was verified by
implementing and testing a similar camera using an ultraminiature fiber collimation
lens from Alps Electric Co. operating in the visible spectrum.
In addition to custom designs with single aspherical glass lenses, custom
multi-lens IOC designs with spherical glass lens elements were also explored.
Ultraminiature custom optical designs to support next generation high resolution
retinal prostheses have been developed. First, a custom lens with two plano convex
spherical lens elements was designed. The integrated package of the ultraminiature
IOC with this compound lens and the OV6930 image sensor array would be
approximately 2.1 mm long and 3 mm in diameter. As such, this ultraminiature IOC
would also fit substantially inside an intraocular lens.
Second, two compound lenses based on COTS plano convex lens elements were developed and implemented to demonstrate the feasibility and the high quality imaging that can be achieved. The first lens has two elements with a 1 mm diameter
compound lens is only 1.6 mm long. The performance of the as-implemented lens
was verified experimentally with MTF testing.
An elongated lens design, again implemented with COTS plano convex lenses
was also developed to allow for image acquisition with COTS image sensor arrays that
incorporate cover glasses. The first lens element in the design has a 2 mm diameter
and the second lens element has a 1 mm diameter. The total length of the compound
lens is only 2.3 mm. The performance of the as-implemented lens was verified
experimentally with both MTF testing and image testing with the OV6930 image
sensor array.
A comparison of key specifications and RMS spot sizes for the designed
ultraminiature IOC optical systems is provided in Table 8-6.
All of these ultraminiature IOC designs are capable of supporting either a
48 × 80 pixel resolution in color or a 24 × 40 pixel resolution in grayscale if a 48 × 80
color image sensor array is employed. As such, these novel ultraminiature IOC
designs have high enough resolution to support the next two generations (12 × 20,
24 × 40) of intraocular retinal prostheses, and at the same time can be implemented
in packages small enough to fit substantially within and be supported by standard
intraocular lenses (IOLs).
Table 8-6. Comparison of specifications for four custom ultraminiature IOC lenses presented in this chapter, with each optical imaging system modeled in the Liou-Brennan optical model of the eye [11]. RMS spot sizes for three field angles are listed. The reference wavelengths for the RMS spot sizes are 630 nm, 540 nm, and 470 nm. The RMS spot size contribution at 540 nm is weighed twice as much as the 630 nm and 470 nm RMS spot size contributions in the calculation.

Lens                              Effective      Image Size        RMS Spot    RMS Spot   RMS Spot
                                  Focal Length   (±10° dFOV)       (On-Axis)   (7°)       (10°)
Single custom lens with
two conic surfaces                0.79 mm        144 µm × 240 µm   3.06 µm     3.94 µm    5.95 µm
Two lens design, with custom
plano convex lens elements        0.82 mm        144 µm × 240 µm   7.0 µm      7.5 µm     13.7 µm
Two lens design, with COTS
plano convex lens elements        0.66 mm        120 µm × 200 µm   6.0 µm      4.8 µm     5.5 µm
Elongated two lens design, with
COTS plano convex lens elements   1.13 mm        207 µm × 342 µm   11.1 µm     16.1 µm    22.0 µm
Chapter 8 References
[1] M. C. Hauer, “Intraocular Camera for Retinal Prostheses: Refractive and
Diffractive Lens Systems”, Ph.D. Thesis, University of Southern California,
2009.
[2] OmniVision OV6930 Product Brief. Available online: http://www.ovt.com
[3] “FDA Summary of Safety and Effectiveness Data for the Implantable Miniature
Telescope (IMT),” VisionCare Ophthalmic Technologies, Inc., PMA P050034,
2006.
[4] “ALPS Develops and Commences Mass Production of “FLGS3 Series” Lead-
Free Aspherical Glass Lens for Optical Communication Using Wide Angle
Laser Diodes”, ALPS Electric News Release, 2011.
[5] Precision Optics Corporation Micro Lenses Brochure, 2013. Available Online:
www.poci.com/files/MICRO lens.pdf
[6] M. S. Humayun, J. D. Dorn, L. da Cruz, G. Dagnelie, J. A. Sahel, P. E. Stanga,
A. V. Cideciyan, J. L. Duncan, D. Eliott, E. Filley, A. C. Ho, A. Santos, A. B. Safran,
A. Arditi, L. V. Del Priore, and R. J. Greenberg, “Interim Results from the
International Trial of Second Sight’s Visual Prosthesis”, Ophthalmology, vol.
119, no. 4, pp. 779–88, May 2012.
[7] Awaiba, GmbH, “NanEye Camera System”, 2012.
Available Online: http://www.awaiba.com/download/docs/NanEye/NanEye_Camera_system_Spec_v1.0.14_web.pdf
[8] Fujikura CMOS Imaging Sensor (G2) Product Features. Available Online:
http://www.fujikura.co.uk/products/medical-industrial-optical-fibre/cmos-modules
[9] OmniVision OV6946 Product Brief. Available online: http://www.ovt.com
[10] R. Liang, Optical Design for Biomedical Imaging, 1st Ed., Washington: SPIE,
2010, p. 48.
[11] H. L. Liou and N. A. Brennan, “Anatomically Accurate, Finite Model Eye for
Optical Modeling”, Journal of the Optical Society of America A, Vol. 14, No. 8,
pp. 1684-1695, 1997.
[12] M. Laikin, Lens Design, 3rd Ed. New York: Marcel Dekker, Inc., p. 37, 2001.
[13] Ohara Corporation Glass Catalog Data.
Available online: http://www.oharacorp.com/catalog.html
[14] Schott Inc. Optical Glass Data Sheets.
Available online: http://www.schott.com/advanced_optics/us/abbe_datasheets/schott_datasheet_all_us.pdf
[15] K. Naeser and E. V. Naeser, “Calculation of the Thickness of an Intraocular
Lens”, Journal of Cataract and Refractive Surgery, Vol. 19, No. 1, pp. 40-42,
1993.
[16] U. Kugelberg, C. Zetterström, B. Lundgren, and S. Syrén-Nordqvist,
“Intraocular Lens Thickness and Ocular Growth in Newborn Rabbits”, Acta
Ophthalmologica Scandinavica, Vol. 75, No. 3, pp. 272-274, 1997.
Chapter 9
SUMMARY AND FUTURE RESEARCH DIRECTIONS
9.1 Summary
Scientific and engineering advances can be leveraged to improve healthcare. One particularly important area is improving the quality of life of blind and low-vision patients through advanced wearable visual aid systems and implantable retinal prosthesis devices.
In this thesis, the design principles as well as several novel designs for the visual input front ends of such devices are presented: the design and implementation of wide angle, wide dynamic range scene cameras for eye-tracked extraocular cameras and wearable visual aid applications, as well as of ultraminiature intraocular cameras (IOC).
Several system-level constraints on these optical systems make the design
challenging. In order to design these unusual optical systems, non-traditional
approaches had to be employed.
The wide angle scene cameras (both for the eye-tracked extraocular camera
and also the wearable visual aid system) are mounted on eyeglasses, and therefore
need to be as unobtrusive as possible for wide acceptance by the patients. These
devices also need to operate under varying degrees of ambient illumination, and
therefore the image sensor arrays in these cameras need to provide imaging over a
wide dynamic range of illumination conditions. While having wide fields of view of
about 100° or more, the camera outputs need to be rectilinear for proper object
recognition either by the retinal prosthesis patient or by the computer vision
algorithms of the wearable visual aid system. In addition, due to cost concerns, the
designed optical systems were limited to comprise either lens elements with
spherical surfaces or commercial off-the-shelf (COTS) lens elements. Such extreme
constraints confined the optical design space, and sufficient correction of optical
aberrations through an optimized optical design was challenging.
Initially, a design framework for optimal design of wide angle computational
cameras was developed, and its efficacy was demonstrated through the design of a
miniature, wide angle camera. This approach relies on optimizing the entire imaging
system chain, including the optical, image capture, and post-processing systems.
Images were simulated through the designed lenses, the simulated images were
dewarped in software to correct for the varying degrees of barrel distortion in the
wide angle lenses, and the relative illumination variations at the image plane were
corrected in software through flat-field correction. The final images were analyzed
to determine the overall quality of the imaging system. The effects of image sensor
noise for different stages of image processing were also analyzed. A final quality
metric was then used to compare different designs to determine the optimal wide
angle imaging system that met a given set of system constraints and also provided
rectilinear images. With this approach, the optimal degree of lens barrel distortion
for the optical system of a miniature wide angle camera was determined.
The incorporation of a wide angle camera in a wearable visual aid system
allows for detection and identification of objects over a wide field of view without
mechanically scanning the scene with the head. For this purpose, a wide angle lens
with only spherical surfaces was designed. This design was later turned into a
compound lens with COTS lens elements for low-cost implementation. Changes in
image quality as the design progressed were analyzed. During the optimization of
this lens, another aberration characteristic of wide angle lenses, lateral chromatic
aberration (LCA), was relaxed in the optical design and later corrected in software by
dewarping of the three color channels (red, green, and blue) of the captured image
separately. The final design was implemented with the closest matching COTS
elements, and the implemented lens provides a 97° field of view with a package size
that is only 8 mm long.
The scene camera in the intraocular retinal prosthesis system can be replaced
with a wide angle camera, and together with the incorporation of an eye-tracking
system an eye-tracked extraocular camera (ET-EOC) can be developed to restore
foveation. For this purpose, a different wide angle lens with a larger field of view
(110° diagonal) was designed and implemented. In this case the design was based on
a narrow field of view aspherical COTS lens element. With the addition of custom
designed lenses, the field of view of the aspherical lens was expanded, and
aberrations were corrected over the wide field of view. The custom lenses were later
replaced with their closest matching COTS elements to produce a compact and
low-cost wide field of view lens. This compound lens (the LP-Lens) fits in a package
that is only 5.6 mm long. Due to the larger field of view required for restoring
foveation in an intraocular retinal prosthesis system, this lens was designed
specifically for the ET-EOC system.
Miniature wide angle cameras based on expanding the field of view of an
integrated wafer level camera were also explored. A miniature reverse Galilean field
of view expander was designed and later implemented using COTS elements. This
field of view expander expands the field of view of a wafer level camera from 64° to
100°. In combination, this provides the prospect of highly miniaturized wide angle
imaging systems based on commercially available and compact wafer level cameras.
A proof-of-principle eye-tracked extraocular camera (ET-EOC) system was
designed and implemented. This system includes a custom designed wide angle lens
(the LP-Lens) integrated with a wide dynamic range image sensor array as the scene
camera. As the eye-tracking camera, a modified miniature USB webcam module was
used. The cameras were mounted on a pair of eyeglasses and the camera outputs
were routed to a battery-powered, belt worn computer. Accompanying software for
image capture, gaze extraction, and region selection was developed and
benchmarked. The implemented system was demonstrated to function as designed,
and allows for remote testing of ET-EOC operation.
As an alternative method for restoring foveation for retinal prosthesis
patients, the ongoing research on and development of intraocular cameras (IOC) was
further advanced. Ultraminiature optical systems that allow for significant reduction
in size as compared to earlier IOC designs were designed with high-index optical glass
lens elements. Several of these designs were implemented to demonstrate excellent
imaging performance. Ultraminiature IOCs with single aspherical lenses and also
ultraminiature IOCs with two plano convex spherical lenses were developed. These
ultraminiature IOC optical system designs would fit in a package of approximately
2 mm thickness and 3 mm diameter. As such, these ultraminiature IOCs could fit
substantially in and be supported within the crystalline lens sac by implantable
intraocular lenses.
9.2 Future Research Directions
One area of future research for advancement of the presented concepts and
designs would be the integration, interfacing, and real-life testing of the designed
cameras with the other components of the respective devices for blind and low vision
patients.
The images captured by the designed and implemented wide angle camera (as
presented in Chapter 4) for the wearable visual aid system would be processed by the
computer vision algorithms developed for the system. The performance of the
integrated camera and computer vision algorithms in either identifying objects or
stereo depth mapping (with two cameras) under real-life situations with different
levels of ambient illumination and head motion can be characterized.
The prototype eye-tracked extraocular camera (ET-EOC) system was designed
and implemented for initial testing with sighted individuals. Calibration procedures
and the associated software can be developed to adapt the ET-EOC for retinal
prosthesis patients. The output from the developed ET-EOC system, which is a
subregion of the wide field of view scene image in the direction of the patient’s gaze,
can be interfaced with the current generation retinal prosthesis visual processing unit
[1], and the complete system can be tested with retinal prosthesis patients to
demonstrate efficacy.
The designed ultraminiature IOC lenses can be incorporated into the ongoing
efforts on the packaging and integration of intraocular cameras. The effects of image
sensor noise on IOC operation in low-light settings can be investigated.
Characterization and testing of new off-the-shelf image sensor arrays for IOCs can
also be accomplished. For example, the recently announced OmniVision OV6946
image sensor array [2] is currently undergoing in-house testing within the Optical
Materials and Devices Laboratory.
Novel packaging and hermetic coating technologies for chronic implantation
of ultraminiature intraocular cameras and other biomedical devices can be
investigated. As explained in Chapter 8, an optically transparent, biocompatible,
hermetic, and long-term viable thin-film coating that can be applied to the first lens
surfaces of the IOC lens designs would constitute a major advance, allowing additional
design flexibility.
The knowledge and experience gained in designing, implementing, and testing
ultraminiature IOCs can also be leveraged to develop implantable cameras for other
biomedical applications, such as post-surgery wound monitoring. As a starting point
for this research, a pair of the designed IOC lenses can be mounted back to back to
obtain a one-to-one (unit magnification) imaging system, as shown in Figure 9-1. This
would allow for a quick demonstration of the viability of the implantable camera
concept using the already implemented and characterized IOC lens designs. New
optical systems can be designed for optimized performance for each specific
application.
Figure 9-1. Schematic diagram of a one-to-one imaging system for implantable
cameras, based on the designed ultraminiature IOC, as shown in Figure 8-24.
The reference wavelength for the diagram is 540 nm.
The developed framework for the optimal design of wide angle computational
cameras with digital distortion and lateral chromatic aberration correction can also
be extended to the miniature projection devices that are currently widely used in
augmented and virtual reality applications [3, 4]. In a process that is conceptually the
reverse of dewarping barrel-distorted captured images, the display image can be
prewarped. The projection optical system would be designed to have the exact opposite
distortion profile, so that the final projected image is rectilinear [5].
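As a conceptual sketch of such prewarping (not the framework's actual implementation), the Python fragment below warps a target display frame through a simple single-coefficient radial distortion model, so that projection optics with the opposite profile would render it rectilinear. The coefficient value, the normalization convention, and the filename are assumptions for illustration only.

    import cv2
    import numpy as np

    def prewarp(target, k1):
        """Prewarp a display image for projection optics modeled by the
        radial distortion r_out = r * (1 + k1 * r^2). For each display pixel
        we evaluate where the optics would project it, and sample the
        desired (rectilinear) target image at that location."""
        h, w = target.shape[:2]
        cx, cy = w / 2.0, h / 2.0
        x, y = np.meshgrid(np.arange(w), np.arange(h))
        norm = np.hypot(cx, cy)            # r = 1 at the image half-diagonal
        xn, yn = (x - cx) / norm, (y - cy) / norm
        scale = 1.0 + k1 * (xn * xn + yn * yn)
        map_x = (xn * scale * norm + cx).astype(np.float32)
        map_y = (yn * scale * norm + cy).astype(np.float32)
        return cv2.remap(target, map_x, map_y,
                         interpolation=cv2.INTER_LINEAR)

    # Barrel-distorting optics (k1 < 0) call for a pincushion prewarp, and
    # vice versa; k1 = -0.18 and the filename are placeholders, not fitted
    # values for any particular projection system.
    frame = cv2.imread("display_frame.png")
    prewarped = prewarp(frame, k1=-0.18)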
Chapter 9 References
[1] M. S. Humayun, J. D. Dorn, L. da Cruz, G. Dagnelie, J. A. Sahel, P. E. Stanga,
A. V. Cideciyan, J. L. Duncan, D. Eliott, E. Filley, A. C. Ho, A. Santos, A. B. Safran,
A. Arditi, L. V. Del Priore, and R. J. Greenberg, “Interim Results from the
International Trial of Second Sight’s Visual Prosthesis”, Ophthalmology,
Vol. 119, No. 4, pp. 779–788, May 2012.
[2] OmniVision OV6946 Product Brief. Available online: http://www.ovt.com
[3] J. P. McGuire Jr., “Next-Generation Head-Mounted Display”, Proceedings of
SPIE 7618, Art. No. 761804, February 2012.
[4] D. Cheng, Y. Wang, H. Hua, and M. M. Talha, “Design of an Optical See-Through
Head-Mounted Display with a Low f-number and Large Field of View Using a
Freeform Prism”, Applied Optics, Vol. 48, No. 14, pp. 2655-2668, May 2009.
[5] A. M. Bauer, S. Vo, K. Parkins, F. Rodriguez, O. Cakmakci, and J. P. Rolland,
“Optical Distortion Correction Using Radial Basis Function Interpolation”,
2012 OSA Frontiers in Optics, Art. No. FTu2E-4, Rochester, NY, October 2012.
BIBLIOGRAPHY
A. Adebiyi, N. Mante, C. Zhang, F. E. Sahin, G. G. Medioni, A. R. Tanguay, Jr., and
J. D. Weiland, “Evaluation of Feedback Mechanisms for Wearable Visual Aids”, 2013
IEEE International Conference on Multimedia and Expo Workshops, pp. 1–6, July 2013.
M. P. Barry and G. Dagnelie, “Use of the Argus II Retinal Prosthesis to Improve Visual
Guidance of Fine Hand Movements”, Investigative Ophthalmology & Visual Science,
Vol. 53, No. 9, pp. 5095–5101, January 2012.
M. Bass, C. DeCusatis, J. Enoch, V. Lakshminarayanan, G. Li, C. MacDonald, V. Mahajan,
and E. Van Stryland, Handbook of Optics, Volume I: Geometrical and Physical
Optics, Polarized Light, Components and Instruments, 3rd Ed., USA: McGraw-Hill, 2010.
A. M. Bauer, S. Vo, K. Parkins, F. Rodriguez, O. Cakmakci, and J. P. Rolland, “Optical
Distortion Correction Using Radial Basis Function Interpolation”, 2012 OSA Frontiers
in Optics, Art. No. FTu2E-4, Rochester, NY, October 2012.
J.-Y. Bouguet, Camera Calibration Toolbox for Matlab. Available online:
http://www.vision.caltech.edu/bouguetj/calib_doc/
P. Burns, “Slanted-Edge MTF For Digital Camera and Scanner Analysis”, Proceedings
of IS&T Image Processing, Image Quality, Image Capture, Systems Conference,
pp. 135-138, 2000.
P. Burns, “sfrmat3: SFR analysis for digital cameras and scanners”, Available online:
http://losburns.com/imaging/software/SFRedge/index.htm
A. Caspi, A. Roy, G. Consedai, R. Greenberg, A. Safran, and J.-A. Sahel, “Retinal
Prosthesis − Steering the Line of Sight with Eye Movements”, 36th Annual
International Conference of the IEEE Engineering in Medicine and Biology Society,
2014.
J. S. Chahl and M. V. Srinivasan, “Reflective Surfaces for Panoramic Imaging”, Applied
Optics, Vol. 36, No. 31, pp. 8275-8285, November 1997.
D. Cheng, Y. Wang, H. Hua, and M. M. Talha, “Design of an Optical See-Through
Head-Mounted Display with a Low f-number and Large Field of View Using a
Freeform Prism”, Applied Optics, Vol. 48, No. 14, pp. 2655-2668, May 2009.
O. Cossairt and S. Nayar, “Spectral Focal Sweep: Extended Depth of Field from
Chromatic Aberrations”, 2010 IEEE International Conference on Computational
Photography, pp. 1–8, March 2010.
O. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel Computational Imaging”, 2011 IEEE
International Conference on Computational Photography, pp. 1-8, April 2011.
O. Cossairt, “Tradeoffs and Limits in Computational Imaging”, Ph.D. Thesis, Columbia
University, 2011.
M. Dahl, J. Heinisch, S. Krey, S. M. Bäumer, J. Lurquin, and L. Chen, “Ultra-Fast MTF
Test for High-Volume Production of CMOS Imaging Cameras”, SPIE Annual Meeting,
Optical Science and Technology, pp. 293-300, August 2003.
A. Darmont, High Dynamic Range Imaging: Sensors and Architectures, 1st Ed.,
Washington: SPIE Press, 2012, p. 37.
J. D. Dorn, A. K. Ahuja, A. Caspi, L. da Cruz, G. Dagnelie, J. Sahel, R. J. Greenberg,
M. J. McMahon, and Argus II Study Group, “The Detection of Motion by Blind Subjects
with the Epiretinal 60-Electrode (Argus II) Retinal Prosthesis”, JAMA Ophthalmology,
Vol. 131, No. 2, pp. 183-189, February 2013.
E. R. Dowski and W. T. Cathey, “Extended Depth of Field through Wave-Front Coding”,
Applied Optics, Vol. 34, No. 11, pp. 1859–1866, April 1995.
A. Duchowski, Eye Tracking Methodology: Theory and Practice, London: Springer
Science & Business Media, 2003.
R. E. Fischer, B. Tadic-Galeb, and P. R. Yoder, Optical System Design, 2nd Ed., New York:
McGraw-Hill, 2008.
R. Fraux, “OmniVision's VGA Wafer-Level Camera”, 3D Packaging, Yole Développement,
pp. 26-27, February 2012.
J. Gauvin, M. Doucet, M. Wang, S. Thibault, and B. Blanc, “Development of New Family
of Wide-Angle Anamorphic Lens with Controlled Distortion Profile”, Proceedings of
SPIE, Vol. 5874, August 2005.
J. A. Gohman, “Wide Angle Lens System Having a Distorted Intermediate Image”, U.S.
Patent 7,009,765, 2006.
J. W. Goodman, Introduction to Fourier Optics, 3rd Ed., Englewood, Colorado: Roberts
& Company Publishers, 2004.
A. Goshtasby, Image Registration: Principles, Tools and Methods, 1st Ed., London:
Springer Science & Business Media, 2012.
A. Gupta, N. Joshi, C. L. Zitnick, M. Cohen, and B. Curless, “Single Image Deblurring
Using Motion Density Functions”, Computer Vision–ECCV 2010, pp. 171-184, 2010.
M. C. Hauer, “Intraocular Camera for Retinal Prostheses: Refractive and Diffractive
Lens Systems”, Ph.D. Thesis, University of Southern California, 2009.
E. Hecht, Optics, 4th Ed., Boston: Addison Wesley, 2001.
F. Heide, M. Rouf, M. B. Hullin, B. Labitzke, W. Heidrich, and A. Kolb, “High-Quality
Computational Imaging through Simple Lenses”, ACM Transactions on Graphics (TOG)
Vol. 32, No. 5, Art. No. 149, September 2013.
J. Heikkilä and O. Silvén, “A Four-Step Camera Calibration Procedure with Implicit
Image Correction”, Proceedings of IEEE Computer Society Conference on Computer
Vision and Pattern Recognition (CVPR, ‘97), pp. 1106 -1112, 1997.
K. Henney and B. Dudley, Handbook of Photography, 1st Ed., New York: Whittlesey
House, 1939, p. 37.
R. A. Hicks and R. K. Perline, “Equiresolution Catadioptric Sensors”, Applied Optics,
Vol. 44, No. 29, pp. 6108-6114, October 2005.
M. S. Humayun, E. de Juan, J. D. Weiland, G. Dagnelie, S. Katona, R. Greenberg, and
S. Suzuki, “Pattern Electrical Stimulation of the Human Retina”, Vision Research,
Vol. 39, No. 15, pp. 2569–2576, July 1999.
M. S. Humayun, “Intraocular Retinal Prosthesis”, Transactions of the American
Ophthalmological Society, Vol. 99, pp. 271–300, January 2001.
M. S. Humayun, J. D. Dorn, L. da Cruz, G. Dagnelie, J. A. Sahel, P. E. Stanga,
A. V. Cideciyan, J. L. Duncan, D. Eliott, E. Filley, A. C. Ho, A. Santos, A. B. Safran,
A. Arditi, L. V. Del Priore, and R. J. Greenberg, “Interim Results from the International
Trial of Second Sight’s Visual Prosthesis”, Ophthalmology, Vol. 119, No. 4, pp. 779–
788, May 2012.
Q. Huynh-Thu and M. Ghanbari, “Scope of Validity of PSNR in Image/Video Quality
Assessment”, Electronics Letters, Vol. 44, No. 13, pp. 800-801, 2008.
E. Kee, S. Paris, S. Chen, and J. Wang, “Modeling and Removing Spatially-Varying
Optical Blur”, 2011 IEEE International Conference on Computational Photography,
pp. 1-8, April 2011.
R. Kingslake, A History of the Photographic Lens, 1st Ed., Boston: Academic Press,
1989.
U. Kugelberg, C. Zetterström, B. Lundgren, and S. Syrén-Nordqvist, “Intraocular Lens
Thickness and Ocular Growth in Newborn Rabbits”, Acta Ophthalmologica
Scandinavica, Vol. 75, No. 3, pp. 272-274, 1997.
M. Laikin, “Wide Angle Lens Systems”, International Lens Design Conference, Vol. 530,
pp. 530–533, 1980.
M. Laikin, Lens Design, 3rd Ed. New York: Marcel Dekker, Inc., 2001.
A. Levin, P. Sand, T. S. Cho, F. Durand, and W. T. Freeman, “Motion-Invariant
Photography”, ACM Transactions on Graphics (TOG), Vol. 27, No. 3, Art. No. 71,
August 2008.
R. Liang, Optical Design for Biomedical Imaging, 1st Ed., Washington: SPIE, 2010.
H. L. Liou and N. A. Brennan, “Anatomically Accurate, Finite Model Eye for Optical
Modeling”, Journal of the Optical Society of America A, Vol. 14, No. 8, pp. 1684-1695,
1997.
R. Lukac (Ed.), Computational Photography: Methods and Applications, Boca
Raton: CRC Press, 2015.
V. N. Mahajan, Optical Imaging and Aberrations: Part 1. Ray Geometrical Optics,
Bellingham, WA: SPIE, 1998.
J. Mallon and P. F. Whelan, “Calibration and Removal of Lateral Chromatic Aberration
in Images”, Pattern Recognition Letters, Vol. 28, No. 1, pp. 125-135, January 2007.
D. L. Marks, S. S. Hui, J. Kim, and D. J. Brady, “Engineering a Gigapixel Monocentric
Multiscale Camera”, Optical Engineering, Vol. 51, No. 8, Art. No. 083202, August 2012.
J. P. McGuire Jr., “Next-Generation Head-Mounted Display”, Proceedings of SPIE 7618,
Art. No. 761804, February 2012.
B. P. McIntosh, P. J. Nasiatka, N. R. B. Stiles, J. D. Weiland, M. S. Humayun, and A. R.
Tanguay, Jr., “The Importance of Foveation in Retinal Prostheses: Experiments with
a Visual Prosthesis Simulator,” Neural Interfaces Conference 2010, Long Beach,
California, June 2010.
B. P. McIntosh, N. R. B. Stiles, M. S. Humayun, and A. R. Tanguay, Jr., “Visual Prosthesis
Simulation: Effects of Foveation on Visual Search”, Investigative Ophthalmology &
Visual Science, Vol. 54, No. 15, p. 1057, June 2013.
B. P. McIntosh, N. R. B. Stiles, M. S. Humayun, and A. R. Tanguay, Jr., “Effects of
Foveation on Visual Search Task with Visual Prosthesis Simulation”, Journal of Vision,
Vol. 13, No. 9, p. 685, July 2013.
B. P. McIntosh, “Intraocular and Extraocular Cameras for Retinal Prostheses: Effects
of Foveation by Means of Visual Prosthesis Simulation”, Ph.D. Thesis, University of
Southern California, 2015.
K. Naeser and E. V. Naeser, “Calculation of the Thickness of an Intraocular Lens”,
Journal of Cataract and Refractive Surgery, Vol. 19, No. 1, pp. 40-42, 1993.
M. Reiss, “The cos⁴ Law of Illumination”, Journal of the Optical Society of America,
Vol. 35, No. 4, pp. 283-288, 1945.
M. D. Robinson, G. Feng, and D. G. Stork, “Spherical Coded Imagers: Improving Lens
Speed, Depth-of-Field, and Manufacturing Yield through Enhanced Spherical
Aberration and Compensating Image Processing”, Proceedings of SPIE 7429, Art. No.
74290M, August 2009.
M. M. Roosinov, “Wide Angle Orthoscopic Anastigmatic Photographic Objective”,
U.S. Patent US2516724, 1950.
S. P. Sadoulet, “Optics Testing: MTF Quickly Characterizes the Performance of
Imaging Systems”, Laser Focus World, March 2006.
F. E. Sahin, B. P. McIntosh, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “Design of a Compact Wide-Field-of-View Camera for Retinal
Prostheses”, Investigative Ophthalmology & Visual Science, Vol. 54, No. 15, p. 1068,
June 2013.
F. E. Sahin, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and A. R. Tanguay, Jr., “Optimal
Design of Miniature Wide-Angle Computational Cameras for Retinal Prostheses and
Wearable Visual Aids”, 2014 OSA Frontiers in Optics, Art. No. FTu5F.1, Tucson, AZ,
October 2014.
F. E. Sahin, B. P. McIntosh, P. J. Nasiatka, J. D. Weiland, M. S. Humayun, and
A. R. Tanguay, Jr., “Eye-Tracked Extraocular Camera for Retinal Prostheses”, 2015
OSA Frontiers in Optics, Art. No. FTu2C.3, San Jose, CA, October 2015.
F. E. Sahin, P. J. Nasiatka, and A. R. Tanguay, Jr., “Lateral Chromatic Aberration
Optimization in Wide-Field-of-View Computational Cameras”, 2015 OSA Frontiers in
Optics, Art. No. FTh1F.4, San Jose, CA, October 2015.
C. J. Schuler, M. Hirsch, S. Harmeling, and B. Schölkopf, “Non-Stationary Correction of
Optical Aberrations”, 2011 IEEE International Conference on Computer Vision,
pp. 659-666, November 2011.
J. A. Seibert, J. M. Boone, and K. K. Lindfors, “Flat-Field Correction Technique for
Digital Detectors”, Proceedings of SPIE: Medical Imaging, pp. 348-354, July 1998.
W. J. Smith, Modern Optical Engineering, 3rd Ed., New York: McGraw-Hill, 2000.
W. J. Smith, Modern Lens Design, 2nd Ed., New York: McGraw-Hill, 2005.
J. S. Stahl, “Eye-Head Coordination and the Variation of Eye-Movement Accuracy with
Orbital Eccentricity”, Experimental Brain Research, Vol. 136, No. 2, pp. 200-210,
January 2001.
I. Stamenov, I. P. Agurok, and J. E. Ford, “Optimization of Two-Glass Monocentric
Lenses for Compact Panoramic Imagers: General Aberration Analysis and Specific
Designs”, Applied Optics, Vol. 51, No. 31, pp. 7648-7661, October 2012.
I. Stamenov, A. Arianpour, S. J. Olivas, I. P. Agurok, A. R. Johnson, R. A. Stack,
R. L. Morrison, and J. E. Ford, “Panoramic Monocentric Imaging Using Fiber-Coupled
Focal Planes”, Optics Express, Vol. 22, No. 26, pp. 31708-31721, December 2014.
N. R. B. Stiles, B. P. McIntosh, P. J. Nasiatka, M. C. Hauer, J. D. Weiland, M. S. Humayun,
and A. R. Tanguay, Jr., “An Intraocular Camera for Retinal Prostheses: Restoring Sight
to the Blind”, Chapter 20 in Optical Processes in Microparticles and Nanostructures,
Advanced Series in Applied Physics, Volume 6, A. Serpenguzel and A. Poon, Eds.,
Singapore: World Scientific, 2010, pp. 385-429.
N. R. B. Stiles, B. P. McIntosh, A. R. Tanguay, Jr., and M. S. Humayun, “Retinal
Prostheses: Functional Use of Monocular Depth Perception in the Low Resolution
Limit”, Investigative Ophthalmology & Visual Science, Vol. 54, No. 15, p. 1042, June
2013.
A. R. Tanguay, Jr., N. R. B. Stiles, B. P. McIntosh, and M. S. Humayun, “Functional Use of
Monocular Depth Perception in the Low Resolution Limit”, Journal of Vision, Vol. 13,
No. 9, p. 1182, July 2013.
K. A. Thakoor, S. Marat, P. J. Nasiatka, B. P. McIntosh, F. E. Sahin, A. R. Tanguay, Jr.,
J. D. Weiland, and L. Itti, “Attention Biased Speeded Up Robust Features (AB-SURF):
A Neurally-Inspired Object Recognition Algorithm for a Wearable Aid for the
Visually-Impaired”, 2013 IEEE International Conference on Multimedia and Expo
Workshops, pp. 1–6, July 2013.
S. Thibault, J. Gauvin, M. Doucet, and M. Wang, “Enhanced Optical Design by
Distortion Control”, Proceedings of SPIE 5962, October 2005.
E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, “Design and Scaling of
Monocentric Multiscale Imagers”, Applied Optics, Vol. 51, No. 20, pp. 4691-4702,
July 2012.
O. Whyte, J. Sivic, A. Zisserman, and J. Ponce, “Non-Uniform Deblurring for Shaken
Images”, International Journal of Computer Vision, Vol. 98, No. 2, pp. 168-186,
June 2012.
G. Wolberg, Digital Image Warping, 1st Ed., Los Alamitos: IEEE Computer Society
Press, 1990.
Z. Zhang, “Flexible Camera Calibration by Viewing a Plane from Unknown
Orientations”, Proceedings of IEEE International Conference on Computer
Vision, pp. 666-673, 1999.
C. Zhou and S. K. Nayar, “Computational Cameras: Convergence of Optics and
Processing”, IEEE Transactions on Image Processing, Vol. 20, No. 12, pp. 3322-3340,
December 2011.
“ALPS Develops and Commences Mass Production of “FLGS3 Series” Lead-Free
Aspherical Glass Lens for Optical Communication Using Wide angle Laser Diodes”,
ALPS Electric News Release, 2011.
Aptina AP0100 Product Flyer. Available online:
http://www.aptina.com, as of January 2015.
Aptina MT9M034 Product Flyer. Available online:
http://www.aptina.com, as of January 2015.
Awaiba, GmbH, “NanEye Camera System”, 2012. Available Online:
http://www.awaiba.com/download/docs/NanEye/NanEye_Camera_system_Spec_v1.0.14_web.pdf
CODE V® 10.7 Reference Manual, Synopsys Optical Solutions Group, Pasadena, CA,
2014.
Delrin is produced by DuPont. http://www.dupont.com
“FDA Summary of Safety and Effectiveness Data for the Implantable Miniature
Telescope (IMT),” VisionCare Ophthalmic Technologies, Inc., PMA P050034, 2006.
Fujikura CMOS Imaging Sensor (G2) Product Features. Available Online:
http://www.fujikura.co.uk/products/medical-industrial-optical-fibre/cmos-modules
Geo Semiconductor Inc. GW3100 Product Brief. Available online:
http://www.geosemi.com
Himax Imaging Inc. http://himaximaging.com
Huiber Vision Technology Co. Ltd. http://www.hbvcamera.com
ImmerVision Panamorph Lenses,
http://www.immervisionenables.com/panomorph-technology
“Industrial Cameras: Spectral Sensitivity”, The Imaging Source White Paper, 2013.
Available online: http://www.theimagingsource.com/en_US/publications/whitepapers,
as of June 2015.
Leopard Imaging Inc. http://www.leopardimaging.com
Lightpath 355150 Support Documents. Available online:
http://lightpath.com/displayLens.php?lensNumber=355150
Ohara Corporation Glass Catalog Data.
Available online: http://www.oharacorp.com/catalog.html
OmniVision OV6930 Product Brief. Available online: http://www.ovt.com
OmniVision OV6946 Product Brief. Available online: http://www.ovt.com
OmniVision OVM7692 640 × 480 CameraCubeChip™ Product Brief. Available online:
http://www.ovt.com
OmniVision OV10633 720p Wide-Dynamic Range Image Sensor Product Brief.
Available online: http://www.ovt.com
OmniVision Technologies, Press Release “OmniVision Adds Powerful Electronic
Distortion Correction Solution to Its Automotive Product Line”, October 2012.
Available online: www.ovt.com/news/presskit.php?ID=102
OpenCV (Open Source Computer Vision). http://opencv.org
Photography - Electronic Still Picture Cameras - Resolution Measurements, ISO
Standard 12233:2000.
Precision Optics Corporation Micro Lenses Brochure, 2013. Available Online:
www.poci.com/files/MICRO lens.pdf
Schott Inc. Optical Glass Data Sheets. Available Online: http://www.schott.com
SimpleCV. http://simplecv.org
Sunex Inc., Carlsbad, CA. http://www.optics-online.com
Theia Technologies SY125A Specification Sheet. Available Online:
http://www.theiatech.com/files/specs/SY125_spec_sheet_web.pdf
TriOptics GmbH. http://www.trioptics.com
United States Social Security Administration, “Title XVI - Supplemental Security
Income for the Aged, Blind, and Disabled”, Section 1614: “Meaning of Terms: Aged,
Blind, or Disabled Individual”, in Compilation of the Social Security Laws Volume 1,
Act 1614, as amended through January 2007.
Wells Research: http://www.wellsresearch.com
World Health Organization, “Global Data on Visual Impairments 2010”.
Available online: http://www.who.int/blindness/GLOBALDATAFINALforweb.pdf
ABSTRACT
The development of advanced wearable visual aid systems and implantable retinal prosthesis devices can enable quality of life improvements for millions of blind and low-vision patients.
In this thesis, both designs and design principles of cameras for these devices are presented, such as the design and implementation of wide angle, wide dynamic range scene cameras for eye-tracked extraocular cameras and wearable visual aid applications, as well as ultraminiature intraocular cameras (IOCs). Unique optical designs for miniature wide angle lenses are presented, as well as the implementation and testing of these lenses using closest matching commercial off-the-shelf lens elements.
A design framework for the optimal design of wide angle computational cameras is also presented. In this hybrid optical/digital approach, the optical system produces an image with an optimal degree of barrel distortion and lateral chromatic aberration, and these are then corrected in software post-processing.
In order to restore foveation in retinal prosthesis patients, a proof-of-principle eyeglass-mounted eye-tracked extraocular camera design with a custom designed and implemented wide angle camera, an integrated eye-tracking system, and software for image capture, gaze extraction, and region selection is presented.
Ultraminiature optical systems that allow for a significant reduction in size compared to earlier intraocular camera designs are also presented, and excellent imaging performance is demonstrated.