
2000-2006 technical documentation

BEST OF
APPLICATION NOTES

© COPYRIGHT 2006 EDMUND OPTICS INC. ALL RIGHTS RESERVED

revised edition: includes 5 new articles

www.edmundoptics.com
Edmund Optics Inc. USA | UK | Germany | Japan | Singapore | China

TABLE OF CONTENTS

Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
BEST OF EDMUND OPTICS™ APPLICATION NOTES

Off-the-Shelf Optics Offer Speed and Economy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2


Micro-Optics and Fiber Optic Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .6
Keys to Cost Effective Optical Design & Tolerancing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .10
Incorporating Aspheres into Your Designs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .14
Pushing Optical Coating Technology to New Limits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .18
The Complexities of Creating High-Power Optical Coatings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .23
Detectors: A User’s Guide . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .26
Using Filters in Machine Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .28
Contrast Enhancement Through Filtering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .31
Imaging Lens Selection Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .33
Optics and Machine Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .35
10 Lens Specifications You Must Know for Machine Vision Optics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .41
Need to Know Optics for Machine Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .45

How to Choose the Correct Optics for Your Vision System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .48
Lens Selection and MTF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
Using MTF in a Production Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .55
Correcting Perspective Errors with Telecentricity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .57
Manipulating Distortion Out of Your Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .59
To Zoom or Not to Zoom . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .61
Camera Choice: Color Versus Monochrome . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .63
Choosing a USAF Target . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .65
How to Reduce the Cost of Configuring a Vision System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .69
Set Your Sights on Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .74
Keeping a Tight Focus on Optics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .78
Designing a Vision System to Meet Your Space Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .81
Infrared Imaging: Thermal Versus Near-IR . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .84
Machine Vision for Automated Food Manufacturing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .86
Using Light to Read the Code of Life . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .90
Splitting Images Solves Dual Magnification Dilemma . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .95
IR Vision: More than Meets the Eye . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .98
3D Measurements with Telecentric Lenses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .100
Optics That Focus on Manufacturing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .102
Confocal Microscope Lenses: Sharpen Your Sights . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .104


INTRODUCTION
BEST OF EDMUND OPTICS™ APPLICATION NOTES

A Reading Assignment You’ll Be Glad About!

Put on your thinking caps — because Edmund Optics is re-presenting its Best Application Notes ever.

These application notes cover a wide variety of topics, many in great detail. The notes focus on the things that Edmund Optics knows best:
machine vision, the choice between custom vs. off-the-shelf optics, and
applications integration. All of the notes are written by in-house
Edmund engineers, the very same engineers that are waiting to answer
your Technical Support calls and e-mails.

And make sure you don’t miss our Tech Tips scattered throughout! Want
to know how to make an inexpensive light stop? Want to know how to
tell the different axes of a polarizer? These little nuggets of wisdom have
been passed along from our Applications Engineers to you.

So study up — because these notes will make your design work easier
and more problem free. But don't take our word for it — get reading and find out for yourself!

John Stack
President and COO

CONTACT INFO
PHONE
800-363-1992

FAX
856-573-6295

MAIL
Edmund Optics, Inc.
Order Department PD006
101 East Gloucester Pike
Barrington, NJ 08007-1380

E-MAIL
sales@edmundoptics.com

WEBSITE
www.edmundoptics.com


OFF-THE-SHELF OPTICS OFFER SPEED AND ECONOMY

Off-the-shelf optics are usually much less expensive and easier to use than custom optics, though not in every case. This article takes a look at when off-the-shelf optics should be used, and how to use them. Off-the-shelf optics are continually produced in large quantities, and kept in stock by manufacturers and distributors. These stock optics are typically designed in a wide variety of sizes and focal lengths from which to choose (see Figure 1).
Customers frequently struggle with the decision of when to use catalog optics and when to buy lenses custom made for their application.
As a manufacturer, we have often been asked to quote prices for custom
optics in volumes at which they are not as economical as off-the-shelf
elements. Because many different customers use the same lens, off-the-
shelf optics allow an economy of scale, even when one customer needs
only a few lenses. As a general rule of thumb, custom lenses make eco-
nomic sense only when one needs thousands of lenses. But, as with any
rule of thumb, there are always exceptions.

Off-the-shelf advantages
For many reasons, off-the-shelf components are more economical than
custom components. The first and most obvious is that economy of
scale can be gained by using off-the-shelf. To understand why volume
is important, one must understand how most lenses are made.
FIGURE 1: Stock lenses have a wide variety of popular diameters, each with a wide variety of focal lengths, to provide customers with many choices to fit their application.

The vast majority of lenses are produced the same way today as they were made during World War II. This involves blocking many lenses onto one tool and grinding and polishing with pitch (see Figure 2).
Several tools can grind or polish at the same time on a single machine.
Making one lens takes as long as making several hundred. And because
material often is a small portion of the cost of manufacturing common
glass lenses, making one lens costs about the same as making 50.

Deterministic grinding and polishing machines can be used to man-
ufacture lenses one at a time. These machines have their own associat-
ed expenses. This method is usually used for low volume manufactur-
ing. Tooling is more of a consideration with deterministic polishing.
This is still fairly new technology and not as prevalent as the conventional pitch polishing.
Once the lenses are polished, they must be tested. Test plates are typ-
ically used to test a lens. If the lens has a radius that is not currently
being used in the optical shop, then the costs of manufacturing the lens
increases because a specific test plate must be made for each radius
used. Special tooling may also be necessary for custom lenses, further
increasing cost.
In addition to volume, off-the-shelf optics are (by definition) avail-
able more quickly than custom lenses. A common request from cus-
tomers who do not understand that lenses are made in batches is to
receive just the first few lenses that are made, assuming this will save time. The first few custom lenses in a batch might be available a day earlier, due to the time taken for testing. Manufacturing a simple lens can take on the order of one to three months. If test plates must be manufactured as well, then expect to add another month or two. For quick lead times, off-the-shelf optics cannot be beaten.

FIGURE 2: Many elements are blocked onto one tool and ground and polished at the same time.
Lead time is an obvious consideration for prototypes, but also needs
to be understood for production. Sometimes a customer using a custom
lens suddenly has a dramatic increase in business and needs to have
twice as many lenses as they forecasted. If this sudden increase in
demand cannot be filled, it could shut down their assembly line.
Although some manufacturers try to keep a safety stock of lenses, such
stocks are more difficult to maintain for custom lenses than for off-the-
shelf lenses.

Off-the-shelf disadvantages
Off-the-shelf optics, however, also have disadvantages that should be considered. Customers often want to buy off-the-shelf optics to insert in their own optical designs. Ideally, off-the-shelf optics should be incorporated in the initial design. Altering a finished design can be costly. Changing the lens inevitably means that the mounting must be changed to accommodate any changes in focus. Even lenses with identical focal lengths can mount differently because a change in radius alters where the lens is mounted. Also, the optical design must be redone. Engineering for these kinds of changes will have associated costs that may outweigh the savings of an off-the-shelf lens. If FDA or similar approvals are required, the validation involved in a design change also can lead to severe costs.
Another consideration is the effect on tolerances. If one uses more
elements to correct aberrations without using custom optics, then the
stack-up of tolerances can decrease performance. Also, some designs
require a specific tolerance for a specific element, which may not be
standard to off-the-shelf optics. Sometimes a very specific focal length
is required, or a specific lens form, such as a meniscus lens, to correct
aberrations — these may not be available off the shelf. Special coatings
are a popular reason for a custom lens. Sometimes designs require very
low reflectance at a specific wavelength or an antireflection coating in
the UV or near-IR wavelengths. Sometimes there is no off-the-shelf
solution and a custom lens is unavoidable.
Options are available to customize off-the-shelf elements, including
edging down a lens or a custom coating. Edging down, or changing the diameter of a stock lens, can be done quickly and often inexpensively.
This is useful for mounting in an existing housing or accommodating
space limitations. Another simple customization is applying a special
coating on an uncoated stock lens. The cost for custom coating a batch
of lenses can be quite low, and the lead time very short.
Designing in-stock lenses
Designing with off-the-shelf optics can be made easier in several ways.
The first is to design using these elements. Most design software pack-
ages have off-the-shelf lenses preloaded into them. Software such as
Zemax, Code V, Oslo, OLIVE, and others all include complete catalogs
of off-the-shelf lenses.
The sooner off-the-shelf options are worked in, the better. Typical
design software will give a starting point with custom lenses when one
optimizes all surfaces. Then one can force the software to replace the
custom lenses with the closest off-the-shelf matches and allow air
spaces to compensate. The best time to do this is before starting on the
mechanical design.
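To make that substitution step concrete, here is a minimal Python sketch of the idea, using a made-up mini-catalog and made-up optimized focal lengths (real design codes such as Zemax, Code V, and OSLO do this against their built-in vendor catalogs, not a hand-typed list):

```python
# Hedged sketch: replace optimized (custom) focal lengths with the closest
# stock values from a small, invented catalog. All numbers are illustrative.

def closest_stock(target_efl_mm, catalog_mm):
    """Return the catalog focal length nearest to the optimized value."""
    return min(catalog_mm, key=lambda efl: abs(efl - target_efl_mm))

stock_catalog = [9.0, 12.0, 15.0, 18.0, 20.0, 25.0, 30.0, 35.0, 40.0, 50.0]  # hypothetical
optimized = [13.7, 22.4, 31.9]  # example focal lengths an optimizer might produce

for efl in optimized:
    match = closest_stock(efl, stock_catalog)
    print(f"custom {efl:5.1f} mm -> stock {match:5.1f} mm "
          f"(error {100 * (match - efl) / efl:+.1f}%)")
```

The residual focal-length errors are what the air spaces then have to absorb during re-optimization.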
Many design tricks can be used to produce superior performance
with off-the-shelf components. Consider a laser objective, which is used
to focus a laser beam to a small spot. The lens system can be designed
in two ways. A custom solution would use a “best form lens” — a sin-
gle lens with two different curvatures selected to reduce spherical aber-
rations.
An off-the-shelf solution would use two identical PCX (plano-con-
vex) lenses in place of the single, more expensive custom lens (see
Figure 3). Two 30-mm focal length PCX lenses close together can, for
example, replace a best-form lens with a 15.5-mm focal length. The two
PCX lenses yield a smaller spot (around six times smaller), because four
surfaces bend the light instead of only two. Two PCX lenses also can be
cheaper than a single best form, because only one radius is being man-
ufactured. The radii of the lenses in the off-the-shelf design are also
longer, which often will be cheaper as well because it is easier to man-
ufacture.
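The 15.5-mm figure can be sanity-checked with the thin-lens combination formula; the sketch below assumes a 2 mm air gap and ignores lens thickness, so it illustrates the arithmetic rather than reproducing a ray-traced result.

```python
# Minimal sketch: effective focal length of two thin lenses separated by a
# small air gap, 1/f = 1/f1 + 1/f2 - d/(f1*f2).
# Thin-lens approximation only; the 2 mm spacing is an assumed value.

def combined_efl(f1_mm, f2_mm, gap_mm):
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm - gap_mm / (f1_mm * f2_mm))

f = combined_efl(30.0, 30.0, 2.0)
print(f"Two 30 mm PCX lenses ~2 mm apart act like a {f:.1f} mm lens")
# Roughly 15.5 mm, consistent with the best-form lens they replace.
```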

Volume makes the difference
Volume is the key to deciding when to use off-the-shelf or custom optics. Low volume will always favor off-the-shelf elements, but as volume goes up the advantages diminish and other factors take over.
Custom is almost always out of the question for prototypes and
proof of concept. When one needs a single or a small run of prototypes,
off-the-shelf elements should be used whenever possible. In these quantities, custom lenses are astronomically expensive and require long lead times. Deterministic grinding and polishing can make this more cost effective, but also will lead to high prices. Off-the-shelf lenses provide a major benefit by being available quickly, because speed is critical for most prototypes. Also, if the prototype shows that the design must be changed, the customer must repeat the one-time expenses associated with custom lenses.

FIGURE 3: Two stock lenses can replace a single custom lens and provide good performance for less money. Design software is Zemax.
When a low volume of between 100 and 1,000 pieces is needed,
economy of scale still causes off-the-shelf to be the more economical
option. At this volume, off-the-shelf optics can save the customer from
having to commit to a supply for a year or more. Stock lenses offer the
considerable advantage of allowing the customer to buy on demand and
have stock available in case volume increases. However, if a custom
solution is necessary it can be done at a reasonable cost at this volume.

For moderate volumes of 1,000 to 100,000 pieces, both custom and
off-the-shelf elements are viable options. Lenses are generally not
stocked in these volumes unless a need for them is forecasted for a spe-
cific customer. Increases in volume are still easier to accommodate with
an off-the-shelf option, because there is less risk in overstocking a stock
lens than in overstocking a custom lens. The savings that might occur
by using custom lenses start to be important at these volumes.
For high volumes above 100,000 units, custom lenses are almost
always used. If elements can be eliminated, custom is almost exclusively
used. This volume provides the customer with economies of scale for the
custom lenses. The cost per piece to manufacture 200,000 pieces is not
significantly less than the cost per piece of 100,000 pieces.

Saving time with stock lenses


Here’s an example of the design process that demonstrates many of the
trade-offs between stock and custom lenses. The initial request to
Edmund Optics was to redesign an imaging lens system for use in iris
identification. This large optical system was reduced to a package about
the size of a baseball.
The redesign used custom lenses, a stock PCX (plano-convex) lens
and two custom best-form lenses. The system required monochromatic
near-IR illumination. The working distance of the lens system needed to
be short and ideally would use as few elements as possible. Due to a
large production volume, a custom design was chosen to improve image
quality.
A prototype run of about 50 pieces was manufactured. This was very
expensive, but necessary for proof of design. OptoTech lens grinders
continued >

www.edmundoptics.com 800.363.1992
Edmund Optics Inc. USA | UK | Germany | Japan | Singapore | China

and polishers made the prototypes more quickly and less expensively
than traditional pitch polishing. Tooling for four separate radii was nec-
essary, increasing both the cost and lead time.
After testing the prototypes, the customer changed the design speci-
fication radically. The second redesign used off-the-shelf optics to
reduce lead time. In the first prototype run, the centering tolerance on
the two PCXs had driven up the cost of the metal housing. A cemented
achromat doublet greatly eased the tolerances. The second prototype
was manufactured using items that were in stock. C-Mount tubes were
used as the mounting platform.


The total redesign took about one week from specification to final
prototype. The costs of the achromat were less than the cost of the two
best-form lenses, and the cost of the housing was reduced. The customer
approved the off-the-shelf solution, even for volumes up to 10,000.

TECH TIP ON CLEANING LENSES
Dust is the most common contaminant and can usually be removed using pressurized gas. If
more cleaning is necessary, hold the lens in lens tissue and apply a few drops of reagent-grade
acetone or lens cleaning solution. Slowly turn the lens while applying pressure in the center
and working outward, to pull dirt off the lens instead of redistributing it on the surface.
Fingerprints on a coated lens should be cleaned as soon as possible to avoid staining or dam-
aging the optic. Larger dirt particles, however, should be removed with a dust-free blower
before attempting to clean the optic with lens tissue. Larger particles trapped under the cloth
will scratch the surface you are attempting to clean. If the lens is still dirty after using acetone
— for instance, if the oil was just redistributed and not cleaned off the optic — then a mild
soap solution can be used to gently wash the lens. Repeat the procedure with acetone to elim-
inate streaks and soap residue.
Micro-optics may also be cleaned using acetone but, due to their extremely small size,
they require special handling and care. Delicate tweezers may be used to securely hold a
micro-lens by its edge, or a vacuum pick-up tool may be used.
Also, choosing the proper cleaning supplies and using the proper techniques are as impor-
tant as cleaning the component itself. Using improper cleaning practices can damage polished
surfaces or specialized coatings that have been used on a substrate or lens. Always check with
the manufacturer of the component to determine proper care and cleaning procedures.


MICRO-OPTICS AND FIBER OPTIC SYSTEMS

Micro-optics are critical to fiber optic systems as they help connect the fiber to the components responsible for manipulating light. For this reason, an understanding of micro-optics is important not only to optical engineers but to all designers working with fiber optic systems, so that they can specify the right micro-optics for their application.
Because network designers, manufacturers, and optical engineers think very differently, we need to define some terms before we can talk constructively across these disciplines. For example, what is throughput? Is it the bandwidth of data, the number of widgets that a production line makes in a day, or the intensity of light that passes through a lens? For the purposes of this article, we will use the terms of optical engineers – throughput is the intensity of light through a lens. Light intensity is an important factor because a lack of intensity creates a noisy or weak signal, which is not useful for carrying data.

Flavors of micro-optics
FIGURE 1: Micro-optics of various sizes and shapes are used in fiber-coupling and collimating applications.

Micro-optics are lenses, mirrors, prisms, windows, and other elements, used to manipulate light, that have dimensions between 0.5 and 3 mm (see Figure 1). Among the host of lens types are PCX (with one planar
and one convex side), DCX (with two convex sides), ball, drum, and
gradient index (GRIN) lenses. The latter are popular because they can
be made to guide light toward their axis, which can be very useful for
guiding light into a fiber core. There is more to the micro-optics for
fiber than GRIN lenses, however (see “Battle of the Lenses”, next
page).
Nearly all of these manipulative elements perform one of two basic
functions: they either collimate light or couple light from one device to
the next. Collimating optics catch and reshape the spreading beam that
emerges from a laser diode. Coupling optics have more varied jobs:
they are employed where the beam magnification needs to change, which is typical any time light moves between a fiber and another com-
ponent (for example, multiplexers/demultiplexers, circulators, gratings,
or switches).
Micro-optics are made of either glass or plastic. The most typical
glass is BK7, and elements made from this material have standard char-
acteristics. By using materials with higher refractive indexes, a lens
with the same radius can have a shorter focal length and a higher numer-
ical aperture. High-index materials include LASF9 and cubic zirconia.
Plastics, such as PMMA, SMMA, or polycarbonate, have consider-
ably lower indices of refraction than BK7, but are used because they can
be made easily by molding and are less expensive than glass.
Furthermore, the molding process allows manufacturers to incorporate
mechanical structures or aspheric surfaces into the elements.
Plastics, however, are difficult to coat, and coatings are essential for
fiber applications. Standard vapor-deposition coating methods cannot
be used on plastics because the materials cannot withstand the temper-
atures. Plastics can be coated by dipping techniques, but such coatings
are not as complex as those possible with vapor deposition. Coatings
maximize throughput, reduce reflection, and filter stray light.
More complex coatings can filter for polarization as well; this func-
tion cannot be done on plastic. For collimators, however, where narrow
passbands are not needed, plastic microlenses work well.

Using micro-optics
Micro-optics increase throughput by fighting back-reflection and align-
ment errors. Reflections not only reduce efficiency but can cause feed-
back in the laser. For example, when coupling two fibers with plane
faces or collimating light from a laser, one can reduce feedback by using
discrete elements or coatings, or both. Antireflection coatings cut down
on the amount of reflection at each surface.
Fiber coupling is subject to three types of misalignment (see Figure 2): separation, offset, and tilt. In separation, the fibers may not be close enough together: if there is an unplanned-for distance along the z-axis between them, light from one fiber core will spread out and lose much of its intensity. When offset, the fiber cores may be displaced laterally along the x-axis, so that light from one core hits the cladding layer of the second fiber, also reducing the light throughput. Finally, one fiber may be tilted (rotated around the x- and z-axis) so that the light will hit the cladding of the second fiber when launched.

FIGURE 2: Different types of misalignment in fiber couplers are caused by separation, offset, and tilt (longitudinal, lateral, and angular misalignment of the coupling optic).

The mechanical effects and tolerancing of the way the fibers are held certainly prompt alignment errors. Optical tolerances apply to mounting devices as well as to the optics; they share the total error budget. If the mounts are made of molded plastic, it is hard to hold tight tolerances to the mounts, and the optics have to be much more precise in order to stay within the budget.
The gross error in molded plastic housings is sizeable. If you have ever pulled the cover off of patch panels, you can see that the efficiency of typical connectors needs improvement. To compensate for mechanical errors, optical tolerances are driven very hard. A better solution would be to improve the accuracy of molded plastic connectors, or at least to evaluate where more improvement can be made to meet the requirement for an application.
With optical fiber, axial positioning tolerances make a big difference. If you have a 50µm fiber core, and the beam entering the fiber is decentered by 10 to 30µm, the system may still work, but it will lose throughput.
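To get a rough feel for what lateral decenter costs, the following sketch estimates coupling as the geometric overlap of two equal circles, a crude stand-in for a uniformly filled 50µm multimode core. It ignores angular (NA) mismatch and Fresnel losses, so the percentages are order-of-magnitude only and not the article's measured data.

```python
import math

# Rough geometric estimate: fraction of a uniformly illuminated core that
# still lands on the receiving core when the two are decentered by `offset`.
# Ignores angular (NA) mismatch and Fresnel losses; illustrative only.

def overlap_fraction(core_diameter_um, offset_um):
    r = core_diameter_um / 2.0
    s = offset_um
    if s >= 2 * r:
        return 0.0
    # Area of overlap of two equal circles with center separation s.
    area = 2 * r**2 * math.acos(s / (2 * r)) - (s / 2.0) * math.sqrt(4 * r**2 - s**2)
    return area / (math.pi * r**2)

for offset in (0, 10, 20, 30):
    print(f"{offset:2d} um offset -> ~{overlap_fraction(50.0, offset):.0%} geometric overlap")
```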

Battle of the lenses
Everyone who works with optical fiber seems to love gradient index
(GRIN) lenses. They do have some excellent characteristics: flat faces
and no spherical aberration on axis. They are expensive, however, and
at times PCX or ball lenses can work just as well at a lower cost.
For a 50µm core fiber, one can use other types of lenses and get the
same efficiency. Consider the following options:
• A drum lens, which is an edged-down ball lens (see Figure 3).
Drum lenses are readily available and comparable to the
performance of GRIN lenses.
• A ball lens has the same effect as a drum lens, is compact, and
the focal length is the diameter. Within the past year, ball lenses
that are coated uniformly on all sides have been commercialized.
The lens can be dropped into a system without worrying about an
axis or uncoated region, and because of its shape, mounting is
straightforward.
• PCX lenses, which are effectively half of a ball lens, work fine for
efficiency in a lot of moderate bandwidth systems, such as OC1 or less.
They are available off-the-shelf, and their biggest drawback is spherical
aberration. For a 50µm core fiber, however, one can achieve 90% to
95% efficiency with a PCX lens at lower bandwidths. By keeping track
of numerical aperture effects within the system, systems designers can
maintain that efficiency.
FIGURE 3: Drum lenses can be used for many fiber applications, with fiber-coupling efficiencies comparable to GRIN lenses. Drum lenses are easily available and inexpensive.

In the 1980s, an analysis of three types of lenses by A. Nicia showed them comparable in fiber-coupling efficiency.1 In experiments, ball lenses were shown to reach the theoretical expectations. The efficiency of these devices really depends on the packaging.

Tolerancing
What about tolerancing in the micro-optics themselves? Glass toleranc-
ing is well understood and built into popular optical design programs.
Plastics are not as well characterized. After being molded, plastics
shrink tremendously – for example, to get a 7mm diameter element, a
10mm mold may be needed. This limits the feasible tolerances of plas-
tic parts and certainly includes a different set of issues than for glass.
Plastic's material qualities also limit surface accuracy and centering (see the table below).

TYPICAL TOLERANCES FOR MICRO-OPTICS
Tolerance          Glass        Plastic
CT to ET ratio     2:1          4:1
Surface Quality    20-10        40-20
Dimensional        0.03mm       0.015mm
Centering          20min        20min
EFL                2%           0.50%
Power              2 fringes    2 fringes
Irregularity       1/2 fringes  1/2 fringes

The issues apply to elements other than lenses. Windows have many of the same tolerancing and coating issues. Microprisms are manufactured and coated differently: special tools must be created to grind, polish, and coat the right-angle prisms used for switching. To some extent, the tolerances depend on the tooling.
As the size of the optic shrinks from macroscopic to micro-sizes, two conflicting tolerancing issues occur. First, the error budget is smaller because the system and its components are smaller. But small size also works for micro-optics. For example, consider a wedge: the total
indicator runout for this part is the product of the angle times the diam-
eter. Because the diameter is so small, the wedge is less sensitive to
errors in the angle. Tolerancing issues do not translate directly from
macro-optics.
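A quick numerical illustration of the wedge point, with assumed diameters and an assumed 3-arcminute wedge angle (the values are invented, not taken from the article):

```python
import math

# Illustrative only: edge runout (TIR) scales roughly with the product of
# wedge angle and diameter, so the same angular error produces far less
# runout on a micro-optic than on a macro-optic.

def runout_um(diameter_mm, wedge_arcmin):
    angle_rad = math.radians(wedge_arcmin / 60.0)
    return diameter_mm * math.tan(angle_rad) * 1000.0  # mm -> um

for d in (25.0, 2.0):  # assumed macro and micro diameters
    print(f"{d:4.1f} mm element, 3 arcmin wedge -> ~{runout_um(d, 3.0):.1f} um runout")
```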
Consider surface roughness. Imagine that a lens is specified to have
a quarter-wave roughness. This specification means that at some points
on the surface, the roughness may be as large as a quarter wave. As the
aperture gets smaller, the smaller area is more likely to be within spec
because it is less likely to contain one of the roughest spots. The prob-
ability of having a deviation within the field is smaller, for a smaller field. For a micro-optic, one might achieve the same quality by speci-
fying only a half-wave surface roughness.
The scratch-dig surface requirements work the same way.
Specifying a quality of 20-10 should not change the price. As the diam-
eter decreases, so does the difficulty of holding quality over that small-
er area. Obtaining a surface quality of 10-5 is more difficult.

Conclusion
For your application, consider what kind of optics and coatings you
need to get the performance you want. As the sidebar suggests, PCX
lenses are inexpensive and readily available in many diameters and can
solve many of the same problems as GRIN lenses.
When you specify the elements, pay some attention to the tolerancing: if you can inject some intelligence into the specifications to make them fit your application, you may bypass some expensive manufacturing problems that are not strictly necessary. Although the size of micro-optics suggests that they be more precise than macro-optics, the reality is that the tooling is more difficult and tighter tolerances may not be necessary.

REFERENCES
1 A. Nicia, "Lens coupling in fiber-optic devices: efficiency limits," Appl. Opt. 20 (18), pp. 3136-3145 (15 Sep 1981).

TECH TIP ON BALL LENSES


Ball lenses are great tools for improving signal coupling between fibers, emitters and detectors.
The effective focal length of a ball lens is very simple to calculate (Figure 1) since there are only two variables: the ball lens diameter, D, and the index of refraction, n. The effective focal length is measured from the center of the lens. Therefore, the back focal length can also be easily calculated:

EFL = nD / (4(n - 1))
BFL = EFL - D/2

FIGURE 1: Ball lens geometry, showing the ball diameter D, the input beam diameter d, the EFL (measured from the center of the lens), and the BFL.

Numerical Aperture (NA)
The Numerical Aperture, NA, of a ball lens is dependent on the focal length of the ball and the input diameter, d. Since spherical aberration is inherent in ball lenses, the following equation begins to fall off as d/D increases:

NA = 2d(n - 1) / (nD)

You can see from the graph (Figure 2) how the NA changes as input beam diameter increases. Please note that this graph includes the effects of spherical aberration.

FIGURE 2: Numerical Aperture versus d/D for BK7, SF8, Sapphire, and LaSFN9 ball lenses.

When coupling light from a laser into a fiber, the choice of the ball is dependent on the NA of the fiber and the diameter of the laser beam. The diameter of the laser beam is used to determine the NA of the ball lens. The NA of the ball lens must be less than or equal to the NA of the fiber in order to couple all of the light into the fiber. The fiber should be placed at the focal point of the ball lens as shown in Figure 3.

FIGURE 3: Coupling laser light into a fiber: the fiber is placed at the focal point of the ball lens.

To couple light from one fiber to another, use two ball lenses that match the NA of their respective fibers (Figure 4).

FIGURE 4: Fiber-to-fiber coupling using two ball lenses.
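As a quick numeric check of the relations above, the sketch below evaluates EFL, BFL, and the paraxial NA for an assumed 2 mm BK7 ball (n ≈ 1.517) with a 0.5 mm input beam, and compares the result to a typical 0.22 NA multimode fiber. All values are assumed for illustration, and the NA expression degrades as d/D grows, as noted above.

```python
# Minimal sketch of the ball-lens relations quoted above:
#   EFL = n*D / (4*(n-1)),  BFL = EFL - D/2,  NA ~= 2*d*(n-1) / (n*D)
# The 2 mm BK7 ball (n ~ 1.517), 0.5 mm beam, and 0.22 fiber NA are assumed
# example values; the NA formula is paraxial and falls off as d/D increases.

def ball_lens(n, ball_diameter_mm, beam_diameter_mm):
    efl = n * ball_diameter_mm / (4.0 * (n - 1.0))
    bfl = efl - ball_diameter_mm / 2.0
    na = 2.0 * beam_diameter_mm * (n - 1.0) / (n * ball_diameter_mm)
    return efl, bfl, na

efl, bfl, na = ball_lens(n=1.517, ball_diameter_mm=2.0, beam_diameter_mm=0.5)
print(f"EFL = {efl:.2f} mm, BFL = {bfl:.2f} mm, NA = {na:.3f}")
print("Couples fully into an NA 0.22 fiber:", na <= 0.22)
```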


KEYS TO COST EFFECTIVE OPTICAL DESIGN AND TOLERANCING

As most designers know, optical design software can be a powerful tool. But it's just that, a "tool". The proper interpretation of the optimized results is just as important as the information inputted. This is why experienced designers will weigh the advantages and disadvantages of using one lens design code over another prior to any actual design. But with growing industry demands, designers need to incorporate all aspects of production into their design in order to ensure that the final product will be brought successfully to market. They need not only to be aware of the nuances of fabrication, assembly, coating, etc., but also to know how to integrate cost with the demands of the intended application. Unfortunately, no software program provides a subroutine to assure that costs are minimized.
This introduces the concept of designing with off-the-shelf catalog
lenses, which have the dual advantage of being inexpensive (compared
to a small custom production run) and immediately available. Clever
designers can often integrate stock lenses into custom multi-element
designs; by sacrificing marginal performance issues, a significant cost
saving can be achieved. Even though stock lenses may not be practical
for a required application, they may be suitable for fast prototyping
requirements. In addition, the readily available prescription data for most lenses, and even many multi-element lenses, are encouraging many to use stock lenses (see Figure 1).

FIGURE 1: Zemax Optical Design Software.
This article will attempt to clarify some typical optical manufactur-
ing practices and emphasize the need to monitor costs during the design
process. With a keen knowledge of manufacturing practices, lens
designers can guide the optimization to an economical solution. By
investing some time at a local optics shop, designers can experience
firsthand fabrication techniques employed by an optician. Choices
made during the design stage that appear to have no effect on production could eventually prove otherwise.
As an example, the simple act of making elements equi-convex or
equi-concave could eliminate problems in a seemingly unrelated
process such as assembly. Ask any assembler how they feel about lens-
es that have nearly the same radii on their outer surfaces, and they will
tell you horror stories of multiple tear-downs to correct for lenses
mounted in the wrong direction. In fact, selecting symmetrical lenses
can often introduce cost savings by reducing the cost of test plates and
production time.
Any design starts with a given application, and thus some known
values. It’s the designer’s job to solve for the unknowns, typically set-
ting such lens specifications as the radii as variables and constraining
others by initially pre-selecting them; namely, the diameter, center
thickness, and glass material.

Selecting the diameter


Once clear apertures have been determined, it is important that design-
ers understand how the lens will be mounted, as well as ground and pol-
ished. The final lens diameter should be chosen to accommodate the
lens mounting (see Figure 2).
When mounting on a mechanical inner diameter (based on contact
points with the radii), glare may be introduced from light reflecting off
of a spacer, retainer ring or mounting seat/shelf. In comparison, light
that reflects off of a larger inner diameter (I.D.) will be cut-off by the
aperture of the system. If the element is coated, the diameter of the coat-
ing area should be larger than the mounting I.D. in order to avoid exposure of uncoated lens surface areas. Typically, a diameter 3mm larger than the clear aperture diameter is needed for elements in the 20 – 40mm diameter range.
In order to produce repeatable lenses, manufacturers often use lens
blanks (glass in pre-fabricated state) that are typically 2mm larger than
the selected lens diameter. This method of “oversizing” allows the opti-
cian to remove defects during the final centering process. One common
defect, called “edge-roll,” (see Figure 3) is a surface deformation that
results from excessive wear that the polishing tool exerts on the edge of
the lens blank.
Another defect, often referred to as "wedge", occurs when the optical and mechanical axes of an element do not coincide. This centration error can be corrected by aligning the centerline of the lens surfaces with a spindle that rotates about the mechanical axis. The blank is then ground down to the final lens diameter, while being aligned with the optical axis. This in turn defines the diameter tolerance. The deviation angle specification is used to limit the amount of centration error. It is important for a designer to consider this value when reviewing the effect of the compounding errors on the alignment of a multi-element system. Not only must each lens be axially aligned to each other, but the optical assembly must also be aligned to the housing.

FIGURE 2: Mechanical mounting considerations (mount, retainer, mount seat, mechanical I.D., clear aperture diameter, and non-coated areas).

The main consequence of working with oversized blanks is that the edge thickness of a bi-convex or plano-convex element will be smaller than at the final lens diameter. The designer can incorporate this knowledge into the design process by using lens diameters that are typically 10% – 20% larger than the final diameters and include a minimum edge thickness operand in the merit function of their chosen software program.
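The edge-thickness penalty of working at an oversized diameter can be estimated from the spherical sag of each surface. The sketch below does this for a hypothetical equi-convex element (100 mm radii, 4 mm center thickness) at its 25 mm final diameter and at a 20% oversized working diameter; the prescription is invented purely for illustration.

```python
import math

# Hedged sketch: edge thickness of an equi-convex lens from spherical sag,
#   sag = R - sqrt(R^2 - (d/2)^2),  ET = CT - sag_front - sag_back.
# Radii, center thickness, and diameters below are assumed example values.

def sag(radius_mm, diameter_mm):
    return radius_mm - math.sqrt(radius_mm**2 - (diameter_mm / 2.0) ** 2)

def edge_thickness(radius_mm, ct_mm, diameter_mm):
    return ct_mm - 2.0 * sag(radius_mm, diameter_mm)  # equi-convex: two equal sags

for d in (25.0, 30.0):  # final diameter vs. a 20% oversized working diameter
    print(f"diameter {d:4.1f} mm -> edge thickness {edge_thickness(100.0, 4.0, d):.2f} mm")
```

The thinner edge at the working diameter is why a minimum edge thickness operand is worth carrying in the merit function.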

Selecting the center thickness
Typically, a designer will steer designs away from large center thickness values in order to control the material volume, and thus the weight of the final product. Usually as a result of color correction, design software will favor thin lenses with high diameter:center thickness ratios. If kept below 10:1, the diameter:center thickness ratio rarely affects cost. When the ratio approaches 15:1, costs begin to rise for low power lenses with longer radii, as well as meniscus lenses. These types of lenses exhibit "springing" during conventional and high-speed manufacturing. In conventional polishing, lenses are placed on a blocking tool with hot sticky pitch. After polishing, the lenses are removed from the polishing block by chilling the pitch to a brittle state, allowing easy separation from the lens surfaces. Surfaces can deform when stress, introduced in the blocking process, is removed. For high-speed manufacturing, the effect is manifested differently. Increased speed and pressure causes the lens to oscillate, resulting in deformities and making it difficult to control the irregularity (surface shape).

FIGURE 3: Interferogram of a PCX lens showing "edge-roll" to be removed at the finished diameter.
The effect of the diameter:center thickness ratio on cost can vary due
to the lens shape and is actually less cost sensitive for large negative
power lenses. In addition, these lenses have large edge thickness values
that provide support to handle pressures and stress.

Selecting the glass material


There is almost as much selection in types of glass materials as there is
in cost. For example, using a relative price comparison with the most
commonly used optical grade BK7 glass as a value of 1, then SF11 is 5 times more expensive, while LaSFN30 is almost 25 times more expensive. Properties of a material that can drive up costs include high stain-
ing and softness, which are often difficult to work with and require care-
ful handling. It is important to note that these characteristics can affect
production during both fabrication and coating procedures.
Many design software programs provide an option to "model" a glass type, allowing the index and dispersion values to vary continuously. Although this will usually produce quicker results, caution should be used. If this modeling option is selected, the designer must diligently monitor the design to steer it away from expensive and difficult-to-control glass types. Many optical designers will use a personalized glass catalog, usually containing glass types that are less expensive, readily available and possess other desirable characteristics. This method, although slower, may provide for an easier means to produce an inexpensive design.

Using tolerancing schemes
Once the initial design is completed, the designer's next task is to assign appropriate tolerances for the various parameters. Diameter, wedge, power/irregularity and center thickness tolerances all need to be assigned for each element. Design performance will be more sensitive to some of these tolerances, while others have little effect at all (see Figure 4). The designer can limit the use of tight tolerances to the sensitive areas and permit them to broaden or loosen in others. Additionally, many optical shops have varying degrees of success controlling specific tolerances. By getting to know the strengths and weaknesses of various optical shops, as well as the associated costs, designers can streamline the process by directing designs to appropriate vendors.

FIGURE 4: The effect of relative costs are shown for various parameter and tolerance specifications. The value 100 represents the cost of a basic element. (The chart panels plot Relative Cost against Diameter Tolerance (mm), Center-Thickness Tolerance (mm), Diameter-to-Thickness Ratio, Power/Irregularity of Fringes, and Surface Finish, Scratch-Dig.) Source: See Reference #2

Tolerancing runs performed by most design software programs assume Gaussian distribution, with errors equally distributed about the nominal value. However, some parameters tend to be skewed either to the plus or minus end of the scale during manufacturing. Opticians tend to polish lenses on the plus side of a center thickness tolerance. By leaving extra material, the optician can rework lenses should they be damaged during later stages of fabrication.
Another trend is the practice of polishing surfaces on the "low" side. When using a test glass to monitor the power tolerances, the optician will avoid center contact in favor of edge contact in order to prevent scratching the polished surface, as well as the test glass (see Figure 5). As a result, the power tolerance is cut in half and thus convex/concave surfaces will be flatter/sharper than the nominal value.

FIGURE 5: Polishing on the "low side": center contact versus edge contact of the test glass on concave and convex surfaces.

Finally, the presentation of the tolerancing must be interpretable by opto-mechanical designers. By emphasizing the sensitive areas of a design, a designer can help ensure a successful opto-mechanical design. Emphasizing axial position over individual spacing tolerances, for instance, can better control fixed flange distance requirements that may suffer due to the "stacking" of individual errors.
There are several other topics that have not been discussed due to the scope of this article, but nonetheless should be addressed. They should include but not be limited to coating, surface accuracy (power/irregularity), and surface quality (scratch-dig).
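Returning to the skewed-distribution point above: the gap between the centered (Gaussian) errors a tolerancing run assumes and the plus-biased center thickness an optician actually delivers can be visualized with a quick Monte Carlo. The nominal value, tolerance, and four-element assembly below are invented purely for illustration.

```python
import random

# Illustrative Monte Carlo: tolerancing runs usually assume errors centered
# on the nominal value, but opticians tend to hold center thickness on the
# plus side of the tolerance. All numbers below are made up.

NOMINAL_CT_MM = 4.000
TOL_MM = 0.100          # +/- tolerance assumed by the tolerancing run
N_TRIALS = 10_000
ELEMENTS = 4            # a hypothetical four-element assembly

def mean_stackup(sample_one):
    """Mean total thickness error of the assembly over many random builds."""
    total = 0.0
    for _ in range(N_TRIALS):
        total += sum(sample_one() - NOMINAL_CT_MM for _ in range(ELEMENTS))
    return total / N_TRIALS

gaussian = lambda: random.gauss(NOMINAL_CT_MM, TOL_MM / 3.0)                 # centered
plus_biased = lambda: NOMINAL_CT_MM + abs(random.gauss(0.0, TOL_MM / 3.0))   # skewed high

print(f"Gaussian assumption:   mean stack-up {mean_stackup(gaussian):+.3f} mm")
print(f"Plus-biased polishing: mean stack-up {mean_stackup(plus_biased):+.3f} mm")
```

The systematic shift in the second case is the kind of error that shows up in a fixed flange distance when individual spacings "stack."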


Conclusion
The goal of this article was to bring to light some of the key factors that
affect cost after a design has been completed. By being aware of what
goes on after a design is handed off, a designer can be better prepared
to integrate the relevant issues before and during the actual design. This
results in less redesigning and optimization and should lead to a better
final product.

REFERENCES
1 Tech Spec Bulletin Article, "Understanding Optical Specifications," Issue 5, Volume 4, Winter 1998-99
2 Russell Hudyma and Michael Thomas, "Reasonable Tolerancing Aids Cost-effective Manufacture of Optics," Laser Focus World, May 1991, pp. 183-193
3 Warren J. Smith, "Optical Component Specifications: Avoiding Pitfalls in Setting Tolerances for Optical Components," The Photonics Design and Applications Handbook 1999, pp. 346-349

TECH TIP ON SURFACE QUALITY

Surface quality refers specifically to the cosmetic condition of the surface of an optical ele-
ment. During the grinding and polishing stages of fabrication, small defects can occur, such
as scratches and digs. A scratch is any mark or tear and a dig is any pit or divot in the sur-
face. The specification used for the maximum allowable flaws is denoted by a combination
of numbers, the scratch number followed by the dig number; for example 60-40. The lower
the number, the higher the level of quality. For example, a 60-40 value is common for
research and industrial applications, whereas a 10-5 value represents a high quality standard
for laser applications.
It is important to note that both the scratch and dig numbers do not actually correspond
to a specific number of defects. Instead, they reflect the quality of an optical surface by means
of visual comparison to a precisely manufactured set of standards. This is in accordance with
MIL-Spec scratch and dig evaluation, as defined by the U.S. Military Specification for
the Inspection of Optical Components, MIL-O-13830A.
There is no direct correlation between the scratch number and the actual size of the
scratch. As a common reference, the scratch number relates to the “apparent” width size of
an acceptable scratch. However, there is some ambiguity since it also includes the total length
and number of allowable scratches. Dig numbers do relate to a specific size. For example, a
40 dig number relates to a 400µm (or 0.4mm) diameter pit. Coating quality inspection is also
held to the same Scratch-Dig specification as the surface of an optic.
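Since the dig number maps directly to a pit size (as in the 40-to-0.4mm example above), the conversion is a one-liner; the dig values below are just illustrative.

```python
# Dig number corresponds to the allowable pit diameter in hundredths of a
# millimeter, e.g. dig 40 -> 0.40 mm, per the example above.

def dig_diameter_mm(dig_number):
    return dig_number / 100.0

for dig in (40, 10, 5):
    print(f"dig {dig:2d} -> {dig_diameter_mm(dig):.2f} mm "
          f"({dig_diameter_mm(dig) * 1000:.0f} um) pit")
```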
Surface Quality inspection typically includes additional criteria, such as staining and
edge chips. Overall cosmetic inspection also includes defects within the material, such as
bubbles and inclusions, including striae. Imperfections of this nature can contribute to scat-
tering (i.e., in systems involving lasers) and image defects (if at or near the image plane).
Inspection for surface accuracy and quality specifications are limited to within the compo-
nent’s clear aperture.


INCORPORATING ASPHERES INTO YOUR DESIGNS

There are many advantages to incorporating aspheric lenses into your optical designs (see Figure 1). Aspheres can improve performance of a low-f-number lens, for instance, or of a lens in which weight constraints limit the number of optical elements. All too often the difficulties present in aspheric surface manufacturing and metrology have prevented or discouraged them from being used. In the last decade, there have been advances in both asphere manufacturing and testing that make implementation of aspheres into optical design more feasible. Although some manufacturing and testing challenges remain, these hurdles can be overcome with some added care in the design process.

Using aspheres in your design


For over twenty years, optical design software packages have given
designers the freedom to add aspheric surfaces to their designs.
Computing power has long been sufficient to optimize the aspheric
coefficients to attempt millions of options to determine an optimal solu-
tion. With this freedom the software will converge to a solution using
the aspheric terms in almost every case. Unfortunately, the indiscriminate use of all the degrees of freedom rarely leads to a manufacturable design. If a designer sends such an asphere to manufacturing, then the design is usually either cancelled or the specifications are reduced so that an asphere is no longer used.

FIGURE 1: Lenses that do not have spherical surfaces can solve design problems such as the need for a low f/# or a limited number of lenses in a system.
This doesn’t have to happen. With a better understanding of what
causes the lens to be expensive or unmanufacturable, a designer can
work around the problems and make the asphere a feasible option. The
designer must limit the software’s freedom to only find a solution that
is feasible to produce. Typically two aspheric surfaces are plenty to pro-
vide sufficient performance—one near the stop to correct for the spher-
ical aberration, and one away from the stop to correct for field-based aberrations.
Adding appropriate constraints and pushing the design to the opti-
mum solution allows a designer to quickly turn a difficult part into a
viable solution. To properly constrain the software, however, the
designer must know how the part will be produced and what concerns
manufacturing will have.

FIGURE 2: Micro video lenses often contain aspheric elements.

Manufacturing aspheres
Manufacturing technologies for aspheres have improved significantly
over the last decade. Molding technology has become more precise and
cost effective and options for machined aspheres have increased great-
ly in the last 10 years as manufacturing facilities have begun using mag-
netorheological finishing (MRF), and computer-controlled high-speed
grinding. Very accurate optical surfaces can be molded with exception-
al quality, but tooling costs are still rather high compared to conven-
tional optics, which limits the economical use of molding to high vol-
umes. Typical mold costs range from $25,000 to $50,000 depending on
the size and number of cavities, with manufacturing volumes typically
on the order of hundreds of thousands of pieces. Molded aspheres are
used primarily in commercial optics such as digital cameras that are
produced in volumes reaching hundreds of thousands or even millions
of units (see Figure 2).
In terms of machining, diamond turning has long been an option for
crystalline materials and nonferrous metals, but new methods like high-
speed computer-controlled small-tool or disc grinding have enabled
machining methods to produce aspheric surfaces in glass and other opti-
cal materials. Computer-controlled high-speed grinding can fine-grind
an aspheric shape by using the edge of a disc-grinding tool and tracing


out the aspheric shape as the lens spins axially. This does not yield an
optically smooth surface, so an additional polishing step is necessary.
Typically, polishing is done with a soft conforming tool that produces
optical smoothness while maintaining the ground shape.
In addition, magnetorheological finishing (MRF) can be used to
correct and smooth a finished asphere, or to induce a mild aspheric
departure in a spherical polished lens (see Figure 3). The MRF method
uses a fluid that changes shape and consistency when a magnetic field
is applied, which allows the computer to vary the removal rate of the
glass while it’s polishing different areas. Diamond turning has been
used for small volume infrared optics for years, and surface roughness
has limited its practical use on visible optics. MRF can also finish a
diamond-turned optic to remove high-frequency diamond-turning
marks, which makes diamond turning more useful for visible optics. It
is important to remember, however, that MRF is a finishing process: it
requires a polished part such as a machined asphere, a molded asphere,
or a polished sphere.

FIGURE 3: Magnetorheological finishing can be used to correct and smooth a finished asphere.

Since all these machining methods are computer-controlled, they
use a feedback loop in which the surface is tested and the motion across
the part is adjusted to achieve the designed shape. This feedback
requires accurate metrology to work properly.

Metrology of aspheres
Testing of aspheres has advanced in concert with the manufacturing.
Because of the deterministic nature of machining aspheres, testing is
required to correctly machine a surface. For conventional spherical
optics, interferometry has been the standard means of testing since
Newton’s time. For aspheres though, interferometry is much more complex. Computer-generated holograms (CGHs) have long been used
by high volume manufacturing facilities to generate the reference
wavefront, but the significant tooling expense effectively restricts the
use of CGHs in low volume situations.
New developments in commercial interferometry, such as the use of
high-resolution cameras and the stitching of subapertures may eventu-
ally allow analysis of larger departures from spherical wavefronts, but
these technologies are currently limited to relatively small departures
from spherical surfaces. The promise of non-contact aspheric metrolo-
gy, while not fully realized, is coming close to fruition. Also, surface
profilometers have become more accurate and now give accurate pro-
files without special tooling.
Profilometers are relatively fast so they are more viable for small
volume testing. It is important to note, however, that they give profiles,
not surface measurements, so they only contain data from a single
plane. Several profiles are needed to get an accurate map of the surface.
Using a profilometer is best when it can be assumed that only rotation-
ally symmetric errors are present in the part.

What drives cost?


The cost drivers differ for molding versus machining aspheres.
Molding drivers include quantity, accuracy, thickness, and surface
accuracy. Tooling costs can make molding prohibitively expensive for
anything less than high volume manufacturing. Tooling costs also
increase as the required accuracy of the shape is increased, due to the
need for iterative correction of the mold—several iterations can be

required to get the mold to the final shape. The high temperatures
involved in molding can cause significant shrinking when a lens cools.
The larger the coefficient of thermal expansion and the greater the dif-
ference in thickness from the center to the edge of the part, the more
the lens will deform when it cools.
The main cost driver for machining an asphere is machine time,
which is affected by accuracy and size. For most methods, the greater
the departure from a sphere the more machine time is required and the

more difficult the testing. The desired accuracy of the surface also
affects machine time, because finishing with MRF may be necessary to
get a high quality surface. Finally, the size of a lens also affects the
machining time. A 4-inch diameter lens takes longer and is more
expensive than a 1-inch lens. Tooling is usually only an issue during
testing, so economy of scale is not as important for machining as it is
for molding.
The optical designer must also consider the limits of machining.
Machining lenses with a short localized radius is often a problem, for
instance, because how short a concave radius can be is limited by the
available tool sizes. A slope change from convex to concave will make
a part difficult (maybe impossible) to machine, depending on the rate
of slope change. Very small parts can be impossible to machine
because of the relative size of the tool.

Designing for manufacturability


The optimal time to reduce the cost of the part is during the design. If
engineers consider the cost drivers and add the appropriate constraints
to the software, manufacturing can produce a part that meets the
requirements at a minimized cost.
The designer must first consider the production volume and design



for the appropriate manufacturing and testing methods to be used. The
designer can then tailor the design constraints for the manufacturing.
If parts will be molded, the designer can select glasses with low
melting points and low thermal expansion to minimize costs, because
the shrinkage of the part will be less. Controlling the difference
between the center thickness and the edge thickness of the part will
also reduce the deformation due to shrinkage.
When machining an asphere, controlling the departure from a
sphere is often the most significant cost improvement. Designers must
consider the departure at the edge of the lens. The lens design software,
based on where it traces rays, generates the equation describing the
asphere. When including an asphere in a design, it is important to trace
more rays than are typically traced in an all-spherical design. To do
this, add more field points and increase the grid density of the rays
traced for each field point.
The equation the code generates will be defined by these points
within the clear aperture of the lens; however, manufacturing will have
to make the shape all the way to the edge. There can be many equations
that will describe identical sags over the clear aperture, yet very differ-
ent sags outside the clear aperture. It is not unusual for an aspheric
equation to have a slope inversion that is outside the clear aperture, but
would fall within the actual diameter of the lens. These dramatic
changes from a best-fit sphere can cause large problems in manufac-
turing.
Tracing rays outside the clear aperture, but within the diameter of
the part, can correct this problem. To do this, add field points outside

the actual field of view and slightly increase the aperture stop diameter.
Be sure to put less weight onto these new field points when optimizing so
they do not dramatically change the correction in the part. Also, the soft-
ware can calculate the best-fit sphere to the aspheric equation and force
the sag outside the clear aperture to be similar to the best-fit sphere's sag.
Almost all the design codes allow for the surface sag at any height to be
used as an operand in the merit function.
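As a rough illustration of checking sag behavior beyond the clear aperture (the surface prescription, coefficients, and helper names below are hypothetical and are not taken from any particular design code), a short script can evaluate an even-asphere sag and compare it with a best-fit sphere out to the physical edge of the part:

```python
import numpy as np

def asphere_sag(r, R, k, coeffs):
    """Even-asphere sag: conic base term plus polynomial departure terms (A4*r^4 + A6*r^6 + ...)."""
    conic = r**2 / (R * (1 + np.sqrt(1 - (1 + k) * r**2 / R**2)))
    poly = sum(a * r**(2 * (i + 2)) for i, a in enumerate(coeffs))
    return conic + poly

def sphere_sag(r, R):
    """Sag of a sphere of radius R at height r."""
    return R - np.sqrt(R**2 - r**2)

# Hypothetical surface: 25 mm base radius, mild conic, small 4th/6th-order terms (all in mm)
R, k, coeffs = 25.0, -0.5, [1e-5, -2e-8]
clear_semi_dia, full_semi_dia = 8.0, 10.0   # clear aperture vs. physical edge

r = np.linspace(0.0, full_semi_dia, 200)
z = asphere_sag(r, R, k, coeffs)

# Best-fit sphere chosen to match the sag at the edge of the clear aperture
edge_sag = asphere_sag(clear_semi_dia, R, k, coeffs)
R_fit = (clear_semi_dia**2 + edge_sag**2) / (2 * edge_sag)
departure = z - sphere_sag(r, R_fit)

inside = r <= clear_semi_dia
print(f"Max departure inside clear aperture: {abs(departure[inside]).max()*1000:.2f} um")
print(f"Max departure out to physical edge:  {abs(departure).max()*1000:.2f} um")
```

The same kind of check, run before releasing a prescription, flags surfaces whose departure grows rapidly between the clear aperture and the physical edge.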
The added field points and larger aperture stop will also help to desen-

sitize the design to lens decentration. Decentration of the lens will cause
rays to hit outside of the clear aperture. Adding more control over the sur-
face outside the clear aperture reduces the negative effects of decentering.
As in the case of molded optics, placing the asphere where it will have
a smaller diameter can also reduce the machining time of the part. The
smallest radius that can be cut into the mold limits how small a mold can
be made. Optimal mold sizes tend to be between 10 and 25mm.
Due to the advances in manufacturing and testing, designers now have
more tools available for their optical designs. With a fundamental under-
standing of the choices and limits in manufacturing, it is possible to use
aspheres to make economically producible designs.

PUSHING OPTICAL COATING TECHNOLOGY TO NEW LIMITS

Applying optical coatings is labor and time intensive — after a techni-
cian sets specifications in the coatings chamber, the coating process fol-
lows a series of nine separate steps, each critical to producing a quality
coating. And the more complex the coatings are, the more effort and
expertise coating production will require. The acceleration of optical
technology has challenged coating vendors to create increasingly elab-
orate coatings. These vendors must understand the capabilities of the
chamber and the coatings, utilize coating monitoring technology, and
consider the costs involved in custom vs. off-the-shelf choices.


Market forces are pushing the performance of optics to their limits.
Optical components must be developed to provide the best possible
combination of manufacturability, performance, and price. One vital
step to success in creating optics lies in a discipline that is often over-
looked or misunderstood — coating engineering.
Coating requirements for optical components that require complex
and difficult-to-achieve wavelength accuracy and durability are driving
the development of coating technology.
FIGURE 1: Three-layer broadband antireflection coating (MgF2, n=1.38; Ta2O5, n=2.15; Al2O3, n=1.70, on a substrate of index Ns; QWOT-HWOT-QWOT) uses two quarter-wave optical thickness layers and one half-wave, or absentee, layer.

How coatings work
Coatings use constructive and destructive interference in thin films to
create a specific spectral response (which can be a mirror, a partially
reflecting mirror, an antireflection coating, or a filter) over the spectral
region of interest. The coatings are thin dielectric films deposited on
glass. Dielectric materials are non-absorbing (in other words, exhibit
very high transmission) from the UV through the visible and well into
the IR.
To understand interference, consider light as a sine wave. When the
lightwave encounters an interface, the reflected portion of this wave
changes phase. The total phase change is a result of the combination of



phase changes at interface reflections and phase
changes due to the optical path length the light travels. The phase
change is related to the thickness of the interface layer. Typically,
dielectric layers are deposited on the surface of the component in alter-
nating high and low refractive indexes of quarter-wave optical thickness
(QWOT).
The QWOT is prevalent throughout optical coating designs because
it produces the maximum change in phase for any single dielectric layer.
Layers of half-wave optical thickness (HWOT), also known as absentee
layers, do not alter the performance at the design wavelength but may
be used to modify transmission away from the design wavelength. The
resultant added wavefront from all the reflections presents either an
additive or subtractive effect. An additive effect is that of a high reflec-
tor, and a subtractive effect results in an antireflection coating (see
Figure 1).
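A minimal sketch of this interference effect, assuming a single quarter-wave MgF2 layer on a glass substrate at normal incidence and using the standard characteristic-matrix method (the indices and design wavelength here are assumed values, not taken from the article):

```python
import numpy as np

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of one non-absorbing dielectric layer at normal incidence."""
    delta = 2 * np.pi * n * d / wavelength          # phase thickness of the layer
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def reflectance(layers, n_substrate, wavelength, n_incident=1.0):
    """Reflectance of a dielectric stack; layers = [(refractive index, physical thickness), ...]."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, wavelength)
    B, C = M @ np.array([1.0, n_substrate])
    r = (n_incident * B - C) / (n_incident * B + C)
    return abs(r) ** 2

design_wl = 550e-9                  # assumed design wavelength
n_mgf2, n_glass = 1.38, 1.52        # assumed indices for MgF2 and the glass substrate
qwot = design_wl / (4 * n_mgf2)     # quarter-wave optical thickness -> physical thickness

print(f"Bare glass reflectance:           {reflectance([], n_glass, design_wl):.2%}")
print(f"Single quarter-wave MgF2 coating: {reflectance([(n_mgf2, qwot)], n_glass, design_wl):.2%}")
```

The single quarter-wave layer drops the surface reflectance from roughly 4% to just over 1%, which is the subtractive (antireflection) case described above; stacking more layers in the same framework produces the additive (high-reflector) case.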

A coating chamber
It is easier to describe how a coating is made if one understands the
parts of a coating chamber. A typical optical coating chamber is 24 to 40
inches in interior diameter and contains an array of components. The
coating chamber includes several subsystems (see Figure 2). In this arti-
cle we focus on vapor deposition, a common coating method.
The first subsystem holds and rotates the components being coated.
It is either a planetary dual rotation or calotte single rotation mechani-
cal structure. Planetary tooling is preferred if precision and uniformity
are critical; the calotte is used if tight tolerances are not specified, and
provides more parts per coating run. The planetary spins the compo-
nents. Each tool includes a set of standard diameter holes that hold cus-
tom inserts, which in turn hold the components being coated. These
inserts are made, if not already available, for each type of component
being coated.
Moving down in the chamber, the next subsystem is the element
heaters. These are placed along the perimeter of the chamber to aid in
heating the chamber and specifically the substrate or components being
coated. The chamber is typically heated to between 250°C and 300°C.
Next is the focus point of the chamber: an electron-beam gun vapor-

izes a target, held in a crucible, to create the vapor that fills the cham-
ber and deposits onto the components (as well as all the other surfaces
in the chamber). A complex system of crucibles and shutters allows the
correct material to be vaporized for the correct amount of time. These
crucibles are loaded into a rotating wheel. The coating machine or the
operator moves the correct material in front of the gun at the correct
time to deposit the next layer. The shutter stops vaporization after the
correct material thickness is deposited.
In some systems an ion gun is used to add energy to the material as
it is vaporized for better control of the process. This ion-assisted depo-
sition (IAD) method increases the density, or packing factor, of the coat-
ing. This in turn decreases the voids in the coating and the opportunity
for moisture to commingle with the layer. Moisture changes the effective
index of a thin film and causes the coating's properties to shift. Moisture
in the coating limits the accuracy possible in a coating.
The layers are required to be a specific thickness, on the order of
1/10 of a wavelength of light. Two primary measuring methods are
quartz crystal frequency monitoring and optical monitoring.
Crystal monitoring is based on the film being deposited on the crystal
in the same way as on the components of interest. As the thickness builds up,
the characteristics of the crystal change accordingly. This change can be
monitored and directly related to the thickness of the film. The second
method uses the same concept, with the exception that it uses an optical
detection basis. All of these systems work in concert to deposit very
accurate layers of dielectric films to produce the result of the coating
design. The chambers can create complex coating structures in excess
of 100 layers.
FIGURE 2: Coating chamber subsystems, from top to bottom, may include planetary tooling that holds the components, a quartz crystal deposition rate monitor, substrate heaters, ion-beam gun, e-beam system for vaporizing material from targets and depositing the material onto the components, and an optical deposition rate detector (including the light source and detector).

Nine-step process
Coating a single surface takes nine separate steps, and a two-sided com-
ponent takes 16 steps. Each step is labor- and time-intensive (see Figure
3). A typical broadband antireflective (BBAR) coating can take more
than three hours of machine cycle time.
It takes the same coating time to coat an entire chamber full of parts
as it does to coat a single part. The nine-step process involves:
Prepare the tooling inserts for the coating run. If these inserts do not
exist for the specific parts, they must be machined. The machining
process can take up to several days depending on complexity of the
components to be coated, and the number that can fit into an insert.
Clean and load the components into the tooling. Depending on the size
of the part, and the number of them to coat, this process can take from
seconds per part to minutes.
Prepare the coating chamber for the run. The chamber needs to be put
through a series of checks to make sure all systems are functioning and
all necessary surfaces in the chamber are covered. Load the planetary
tools into the coating chamber.
FIGURE 3: A technician sets specifications on a coating chamber. A labor- and time-intensive process, coating deposition involves nine separate steps.


Evacuate the chamber down to 2 x 10⁻⁵ Torr, and heat the chamber
to between 250°C and 300°C. The vacuum removes airborne contami-
nants and moisture from the chamber as well as allowing more mobil-
ity to the material being vaporized.
Deposit the coating onto the component. Depending on the complex-
ity of the coating, this process can take from half an hour to days. In com-
plex filtering technologies, multiple hours to days is the standard.
Cool and vent the chamber back to room temperature and pressure.
Remove the components from the chamber and test the witness sample.

The witness sample is a window that is coated along with the compo-
nents. This window is the piece that will go into the spectrometer to
determine the spectral response of the coating. This window is neces-
sary because the spectrometer cannot test a part with a curved surface.
In addition to spectral testing, most coatings are checked for adhesion
and abrasion resistance. Depending on the application, coatings may
also be required to pass other environmental tests such as high humidi-
ty, high/low temperature cycling, salt spray and resistance to various
solvents. Inspect and package the components.

Challenges to repeatability
Designing and making a coating is not an exact science. The design of
a coating is highly dependent on the deposition chamber in which it will
be made. The designer and operator must know and understand the
nature of the calibration of the machine, as well as any issues with the
performance of the individual subsystems being used. All the factors
contribute to the accuracy and repeatability of the coating from run to
run. In coatings that require a large number of layers, effective monitoring
of the process becomes more difficult and the risk of error goes up.
Coatings expected to function for more than 20 years can be made



in typical coating chambers equipped for ion-assisted deposition. This
is particularly helpful for coatings that require tight position accuracy of
the center wavelength. Notch or edge filter coatings also benefit great-
ly from this enhanced technology.

Cost factors
In many cases, tolerances are the key to the simplicity or complexity of
manufacture. The engineer who specifies the coating can reduce his
company’s costs and improve the coating yield by asking for realistic
performance. If the standard offerings from coating vendors will not
meet the customer’s need, the customer will do well to keep his require-
ments as close to the standard versions as his application will allow.
Better yet, call the coating company and discuss your requirement with
a coating designer. Working with the coating vendor during the design
stage can save money, time, and headaches during production.
As with any other product, however, off-the-shelf coatings are less
expensive than custom coatings. Any standard coating eliminates cost-
ly development and should be available at a shorter lead time. Using
tried and tested processes also reduces the probability of failure in the
coating chamber.
Coating failures do happen, and no one wants to see several weeks
— or months — worth of precision-manufactured glass tossed out
because it has a bad coating. Designs made on the computer always
claim that a coating is manufacturable, but the execution in the coating
chamber can be a different story.


Monitoring coating deposition ensures correct wavelength positioning
Some applications demand precision wavelength positioning. Picking
out one wavelength and rejecting the others requires that the center
wavelength of the coating's pass or stop band be highly accurate. The accu-
racy of wavelength positioning depends on how carefully the deposition
is monitored.
For typical edge filters, the accuracy can be ±1.0%, and deposition
is monitored with a quartz crystal. For very narrow filters, however,
accuracy can be as fine as ±0.002%, and deposition is optically moni-
tored with state-of-the-art equipment.
Conventional optical monitoring charts the reflectance (or transmit-
tance) at a preselected wavelength during the deposition of the layer. As
the layer approaches quarter-wave optical thickness (QWOT), the per-
centage reflection or transmission reaches a turning point on the chart
(because the QWOT produces the maximum change in R or T). The
turning point may be used as a layer termination trigger or as a calibra-
tion level from which to calculate the eventual termination value.
Optical monitoring can be highly effective in producing coatings of
tight tolerance provided the core design is a regular quarter wave stack.
This is a result of a highly effective error compensation feature inherent
in the turning point detection and layer termination method: if, during
the coating process, the turning point is over- or undershot, it is
compensated by terminating the deposition at the turning point of the
next layer. This compensating effect minimizes the cumulative error in
the multilayer stack and can result in very accurate filter wavelength
positioning, as demonstrated by “successful” production of narrowband
transmission filters.
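A self-contained sketch of why the turning point falls at the QWOT (the layer index, substrate index, and monitoring wavelength below are assumed values, not from any real process):

```python
import numpy as np

def single_layer_R(n_layer, d, wavelength, n_sub=1.52, n_inc=1.0):
    """Normal-incidence reflectance of one non-absorbing layer on a substrate
    (standard single-layer characteristic-matrix result)."""
    delta = 2 * np.pi * n_layer * d / wavelength
    B = np.cos(delta) + 1j * np.sin(delta) * n_sub / n_layer
    C = 1j * n_layer * np.sin(delta) + np.cos(delta) * n_sub
    r = (n_inc * B - C) / (n_inc * B + C)
    return abs(r) ** 2

wl = 550e-9                                          # assumed monitoring wavelength
n_high = 2.15                                        # a Ta2O5-like high-index layer (assumed)
thickness = np.linspace(0, wl / (2 * n_high), 400)   # grow the layer up to a half-wave
R = np.array([single_layer_R(n_high, d, wl) for d in thickness])

qwot = wl / (4 * n_high)
print(f"R at the start of the layer: {R[0]:.2%}")
print(f"R at the QWOT:               {single_layer_R(n_high, qwot, wl):.2%}")
print(f"Thickness of maximum R:      {thickness[R.argmax()]*1e9:.1f} nm  (QWOT = {qwot*1e9:.1f} nm)")
```

The reflectance climbs monotonically to its extremum at exactly one quarter-wave of optical thickness and then falls back, which is the turning point the optical monitor watches for and terminates on.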
Quartz crystal monitoring measures the physical thickness of the



depositing material, and, therefore, does not involve any turning point
methodology. While this technique does not provide any error compen-
sation, it is very useful when monitoring layers that are significantly
thinner than one quarter wave (layers less than QWOT have no turning
point and therefore present difficulties for optical monitoring). This
quartz crystal methodology is favored in the production of designs such
as broadband antireflection coatings, where layers as thin as 10 nm are
common. Precision multilayer coatings such as edge filters can be pro-
duced using quartz crystal monitoring; however, without the compensa-
tory effects of optical monitoring, accurate material characterization
and very tight process control are necessary to achieve specification.
Most coatings do not exhibit any significant polarization effects at
angles of incidence less than 20°. At higher angles, the S and P states
behave quite differently. This is a consequence of their effective angu-
lar refractive index, given by: Ns = N cos(θ) and Np = N/cos(θ). At 45°,
the variation in S and P performance can be very significant. A 50:50
beamsplitter (random polarization) may transmit 75% P and only 25%
S. Polarization insensitive coatings can be produced at high angles for
single-wavelength operation. Achieving nonpolarization over a broad
waveband, however, presents a difficult challenge to both thin-film
designers and engineers.
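A quick numeric illustration of how the effective S and P indices separate with angle (the layer index used here is an assumed value):

```python
import numpy as np

def effective_indices(n, aoi_deg, n_inc=1.0):
    """Tilted effective indices (Ns, Np) inside a medium of index n for light
    arriving from a medium of index n_inc at the given angle of incidence."""
    theta_inc = np.radians(aoi_deg)
    theta = np.arcsin(n_inc * np.sin(theta_inc) / n)   # Snell's law inside the layer
    return n * np.cos(theta), n / np.cos(theta)        # S and P effective indices

for aoi in (0, 20, 45):
    ns, np_eff = effective_indices(2.15, aoi)          # a Ta2O5-like layer (assumed index)
    print(f"AOI {aoi:2d} deg:  Ns = {ns:.3f},  Np = {np_eff:.3f}")
```

At normal incidence the two values coincide; by 45° they have split noticeably, which is why the S and P spectral responses of a coating diverge at high angles.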


TECH TIP ON S & P POLARIZATION



S & P polarization refers to the plane in which the electric field of a light wave is oscillating.
S-Polarization is the plane of polarization perpendicular to the page in the figure below. P-
polarization is the plane of polarization parallel to the page in the figure below.

(Figure: incident, reflected, and transmitted light at a surface; the S-polarization comes out of the page and the P-polarization is parallel to the page.)



The axis of a linear polarizer determines the plane of polarization that the polarizer pass-
es. There are two ways of finding the axis of a polarizer. A simple method is to start with a
known polarizer with a marked axis. Place both the known and unknown polarizer together
and transmit light through them. Rotate the unknown polarizer until no light passes through
the pair of polarizers. In this orientation, the unknown polarizer's axis is 90° from the axis of
the known polarizer.
If a known polarizer with a marked axis cannot be found, the axis can be found by taking
advantage of the Brewster effect. When light reflects at glancing incidence off of a non-metal-
lic surface, the S-polarization is reflected more than the P-polarization (see figure above). A
quick way to do this is to look at the glare off of a tiled floor or another non-metallic surface.
Rotate the polarizer until the glare is minimized. In this position, the polarizer is oriented so
that the axis is vertical.

THE COMPLEXITIES OF CREATING HIGH-POWER OPTICAL COATINGS

High-power thin-film optical coatings are typically required for optics that
must handle sustained high-power output from lasers. These coatings can be
reflecting, transmitting, polarizing, or beamsplitting; it is important to note
that “high power” may have different meanings depending on the applica-
tion. A reasonable definition is that the term “high power” applies to any
coating that requires special attention and processing to avoid damage dur-
ing irradiation. As a rule of thumb, any design drawing that includes a power
specification (that is, for which the standard processing is insufficient) is
considered a high-power coating.


The optical coating is generally the limiting factor in the output of a
high-power laser system. The most common failure mode of high-power
laser coatings results from the presence of absorption sites within the coat-
ing or at the coating's interface with the substrate or air. These absorption
sites are usually in the form of gross defects that absorb laser energy, result-
ing in generation of heat that causes localized melting or thermal stress frac-
tures. Failure by this mechanism is usually catastrophic (see Figure 1).
Noncatastrophic failure, such as plasma burn, is typically the result of
unoxidized 1 to 5µm metallic nodules within the coating. (Some manufac-
turers will intentionally subject their coated elements to powers sufficient to
trigger plasma burns to remove the defect nodules.) Finally, intrinsic mate-
rial properties determine the laser threshold that an otherwise defect-free
film will sustain.
Before deposition
Making high-power laser coatings requires tight control of every aspect of
production, from initial substrate manufacture to final packing. Before the
substrate even reaches the coating chamber, its surface quality and cleanli-
ness must be assured. A clean coating chamber, appropriate choice of thin-
film materials, and good control of process parameters are also essential.
FIGURE 1: Coatings suffer catastrophic damage when defects absorb laser energy, generate heat, and cause melting or thermal stress fractures. A coating fails at relatively low thresholds of 11.77 (top left), 12.92 (top right), and 14.3 J/cm2 (bottom left) for 20ns pulses at a 1064nm wavelength, due to poor coating process control. A coating fails at 73.3 J/cm2 due to a coating defect (bottom right).
After deposition, coating makers must control contamination; even at this
stage, surface contamination may cause the element to fail when subjected
to high powers. For this reason, meticulous cleaning procedures are also
required at the assembly stage, typically under strict cleanroom operating
conditions.
Substrates for use with high-power laser coatings must be made of high-
quality materials. This is particularly important for transmitting optics—
these substrates must demonstrate extremely low intrinsic absorption at the
relevant wavelengths. Surface defects are potential damage sites and surface
quality is specified in terms of a scratch and dig value (scratch numbers do
not directly correlate to scratch size; dig numbers are in units of 0.01 mm).
High-power laser optics typically specify 20-10 or 10-5 scratch-dig surface
values.
The substrates must also be pristine. Any organic or particulate residue
from polishing or cleaning may absorb the laser energy and is therefore a
potential damage site. For this reason, the substrate and coating interface is
a critical area in achieving high damage thresholds. Mirror elements, how-
ever, reflect most of the laser energy from the layers closest to the incident
media (normally air), and as a result are less sensitive to the presence of
defect sites at the substrate surface than transmissive elements.

A clean room helps


Either way, this sensitivity to organic or particulate residue tests the clean-
ing process. Cleanroom conditions help because there is less risk of recont-
amination after cleaning the substrate. Most coatings companies use lint-
free wipes without silicone constituents when cleaning manually in their


final clean process. Solvents used are of extremely high purity—typically


methanol, isopropanol, or acetone.
Ultrasonic cleaning, when it works efficiently, can be useful and is
more effective at dislodging residual polishing compounds than cleaning
by hand. Certainly, it is less prone to error.
A typical multistage manual process includes a surfactant wash, sever-
al wipes with an ammonia solution, and on the final stage a drag-wipe tech-
nique using high-purity solvents. The drag-wipe technique produces very

high shear forces, resulting in the removal of any remaining contaminants


from the surface.
Contaminants from several parts of a coating chamber can migrate onto
the optical surfaces. If the tooling is not meticulously kept clean, it can
contaminate the glass. Backstreaming can occur with an inefficient diffusion
pump, resulting in organic contamination.
Finally, the walls of the chamber itself can contribute to contamination
of the glass. Material evaporated from a target deposits on both the sub-
strate and on the walls of the chamber. After several runs, the material on
the walls builds up until it begins to flake off. During the pump-down
sequence, loose particulates can be transferred from the walls of a dirty
chamber onto the optic.
The solution is to maintain the cleanliness of the chamber. Many cham-
bers are lined with aluminum foil (made by rolling without oil), while other
coaters prefer to use removable steel liners. Cleaning the chamber consists
largely of replacing the foil or liners and removing coating buildup from
any uncovered areas within the chamber.

Design
For high-power applications, coating designers must choose materials with
intrinsically low absorption at the relevant wavelengths, which leaves the
designer with only a few material choices in each of the spectral regions.
Coatings for use with high-power ultraviolet (UV) light are made of differ-
ent materials from those for use in the visible and near-infrared (IR).
Materials for use in mid- and far-IR coatings are a third group.
Dielectric metal oxides are preferred materials for UV, visible, and near-
IR laser applications. Silicon dioxide (SiO2) is the generally accepted and
ubiquitous choice for low-index layers. Choosing a material for high-index
layers is not as straightforward: oxides of titanium, tantalum, zirconium,
hafnium, scandium, and niobium are popular high-index materials.
FIGURE 2: The normalized electric-field intensity (EFI) squared within a reflecting quarter-wave dielectric stack shows peak EFI at layer interfaces and the highest EFIs occurring at the layers closest to the air boundary (top). For clarity, the total number of layers shown is less than a typical high-reflector design. The thickness of the four layers closest to air in a nine-layer stack is modified to reduce EFI in the high-index layers (bottom).
The design of a coating can significantly alter the damage threshold. In
the case of high-reflection coatings, the core structure is typically a repeat-
ing stack of high- and low-index layers, each a quarter-wavelength thick.
Simply adding a half-wave of low-index material (normally SiO2) as the
final layer can result in measurably higher damage thresholds.
According to some groups, laser-damage thresholds can be increased
even further by manipulating the coating layers in at least one of several
ways. The electric-field distribution can be averaged across several layers,
thereby avoiding a high electric-field concentration within a relatively small
number of layers. The high-intensity resonant peaks can be shifted from
layer interfaces to locations within the film continuum (see Figure 2). The
highest-intensity resonant peaks can be positioned within the layers of the
thin-film material demonstrating the highest damage threshold. Reported
results for these techniques, however, are mixed.


Process control
Many parameters play critical roles in the deposition of a high-power laser
coating, including the rate of deposition, substrate temperature, oxygen
partial pressure (used in designs including dielectric metal oxides), thick-
ness calibration, material-melt preconditioning, and electron-gun sweep. A
poorly controlled evaporation process produces spatter from the source,
resulting in particulate condensates on the substrate surface and within the
depositing coating. These condensates are potential damage defect sites.
Fabricating high-power thin-film optical coatings challenges every step of the manufacturing process and requires great attention to cleanliness.
Unfortunately, some materials that can be used for high-damage-threshold
coatings are difficult to deposit smoothly. The settings applied to the elec-
tron-gun sweep can be the difference between the production of a clear,
high-damage-threshold coating or the production of a high-scatter coating
with a much lower power capability.
The rate of deposition, substrate temperature, and oxygen partial
pressure (for dielectric oxides) determine the stoichiometry of the
growing film, which significantly affects the metal oxide chemistry in
the depositing film. These parameters must be optimized and con-
trolled to produce a homogeneous layer with the desired metal-oxygen
content and structure.
In producing antireflection coatings, thickness accuracy of the deposit-
ing films is an important factor in meeting the desired low reflectance.
Mirrors are generally less sensitive to small thickness errors as a result of
the relatively broad reflectance band afforded by the refractive index ratio
of the high and low index layers. Deep-UV mirrors are an exception, how-
ever, because material limitations in this spectral range produce relatively
narrowband reflectors.

Ion beams
Ion-beam technology is now a recognized and widely used tool in the man-



ufacture of thin-film coatings, either as an enhancement to thermal evapo-
ration (ion-assisted deposition) or as a sputtering technology (ion-beam
sputtering). While these methods produce more compact and durable films
with properties closer to those of the bulk materials, conclusive evidence
may not exist that ion-beam technology produces higher damage thresh-
olds.

TECH TIP ON ANTIREFLECTION COATINGS


As light passes through an uncoated glass substrate, approximately 4% will be reflect-
ed at each surface. This results in a total transmission of only 92% of the incident light.
Antireflection coatings are especially important if the system contains many transmitting
optical elements. Coating each component will increase the throughput of the system
and reduce hazards caused by reflections traveling backwards through the system (ghost
images). Many low-light systems incorporate AR coated optics to allow for an efficient
use of the light.
The transmission properties of a coating are dependent upon the wavelength of
light being used, the index of refraction of the substrate, the index of refraction of the
coating, the thickness of the coating, and the angle of the incident light.
The coating is designed so that the relative phase shift between the beam reflect-
ed at the upper and lower boundary of the thin film is 180°. Destructive interference be-
tween the two reflected beams occurs, cancelling both beams before they exit the sur-
face. The optical thickness of the coating must be an odd number of quarter wave-
lengths (λ/4, where λ is the design wavelength or wavelength which is being optimized
for peak performance), in order to achieve the desired path difference of one half wave-
length between the reflected beams, which leads to their cancellation.
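A small worked example of that thickness rule (the design wavelength and coating index here are assumed values, not specified by the tech tip):

```python
def qwot_physical_thickness(design_wavelength_nm, n_coating, odd_multiple=1):
    """Physical thickness giving an odd number of quarter-waves of optical thickness."""
    return odd_multiple * design_wavelength_nm / (4 * n_coating)

# Assumed example: a single-layer MgF2 (n ~ 1.38) AR coating designed for 550 nm
for m in (1, 3, 5):
    t = qwot_physical_thickness(550, 1.38, m)
    print(f"{m} x quarter-wave: {t:.1f} nm physical thickness")
```

Any of these odd-multiple thicknesses gives the half-wave path difference between the two reflected beams; the thinnest (one quarter-wave) is normally used because it stays closest to the design condition over a broad band of wavelengths and angles.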


DETECTORS: A USER’S GUIDE

When selecting an appropriate detector for an application, there are a num-
ber of factors to consider. Here we will primarily focus on photodiode (sili-
con and InGaAs) selection, but we will also point out when it would be more
appropriate to use an avalanche photodiode (APD) or a photomultiplier tube
(PMT). We will conclude with a discussion of the use of readout amplifiers
to convert the photodiode current to a voltage.

Wavelength and Responsivity


One of the first factors to consider in detector selection is the wavelength of
the light source. You will want to select a detector with the highest respon-
sivity (or quantum efficiency) for the wavelength(s) of interest. For wave-
lengths ranging from 350nm to 1100nm, silicon detectors can be used,
whereas InGaAs detectors would be suitable from 900nm to 1700nm.

Light Level
The second factor to consider is the light level. In particular, you will want
to select a detector that will provide the highest signal-to-noise ratio for a
given light level.

The signal-to-noise ratio of an Si or InGaAs photodiode is given by:

SNR = Is / sqrt( 4kTB/Rsh + 2q(Id + Is)B )

where k is Boltzmann’s constant, T is the photodiode temperature, B is the
measurement bandwidth, Rsh is the shunt resistance, q is the electron charge,
and Id and Is are the dark and signal currents respectively.



This expression shows that for a simple detector, you can improve the sig-
nal-to-noise ratio by: 1) increasing the light level, 2) cooling the detector, 3)
operating at a narrower bandwidth, 4) decreasing the dark current, and 5)
selecting a photodiode with a high Rsh and low capacitance (small active
area). A high shunt resistance will result in a very low dark current.
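A hedged numeric sketch of this expression (the component values are invented for illustration and do not describe any particular photodiode):

```python
import math

K_BOLTZMANN = 1.380649e-23    # J/K
Q_ELECTRON = 1.602176634e-19  # C

def photodiode_snr(i_signal, i_dark, r_shunt, bandwidth, temperature=295.0):
    """SNR = Is / sqrt(4kTB/Rsh + 2q(Id + Is)B) for a simple photodiode."""
    thermal = 4 * K_BOLTZMANN * temperature * bandwidth / r_shunt
    shot = 2 * Q_ELECTRON * (i_dark + i_signal) * bandwidth
    return i_signal / math.sqrt(thermal + shot)

# Illustrative values: 10 nA signal, 100 pA dark current, 500 Mohm shunt, 1 kHz bandwidth
snr = photodiode_snr(i_signal=10e-9, i_dark=100e-12, r_shunt=500e6, bandwidth=1e3)
print(f"SNR = {snr:.0f}  ({20 * math.log10(snr):.1f} dB)")

# Narrowing the bandwidth by 10x improves SNR by ~sqrt(10), since both noise terms scale with B
print(f"SNR at 100 Hz bandwidth = {photodiode_snr(10e-9, 100e-12, 500e6, 100):.0f}")
```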

Because the readout amplifier affects the sensitivity of your signal meas-
urement, in practice you will want to consider the signal-to-noise ratio of the
photodiode as well as the amplifier that follows it.

If the light level is high, then a photodiode would be the most suitable detec-
tor choice. While noise is not a problem when light levels are high, it does
present a challenge when measuring a low-light level fluorescence or laser
range finder signal. When light levels become comparable to the amount of
noise generated by a photodiode and its signal processing electronics, then
a PMT or an APD would be a more suitable detector choice. A general rule
of thumb to remember is that photodiodes should be used with light levels
of µWs to mWs, avalanche photodiodes from nWs to µWs and PMTs from
fWs to nWs. There are exceptions to this rule. For instance, with the proper
amplifier, special low-noise photodiodes have been used to detect light lev-
els as small as 30pW with reasonable signal-to-noise ratio.
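As a rough helper that encodes the rule of thumb above (the boundaries are approximate, and the function name is ours):

```python
def suggest_detector(power_watts):
    """Rule-of-thumb detector choice by optical power level, per the guidelines above.
    Boundaries are approximate; special low-noise photodiodes can reach ~30 pW."""
    if power_watts >= 1e-6:       # microwatts to milliwatts
        return "photodiode (Si or InGaAs)"
    if power_watts >= 1e-9:       # nanowatts to microwatts
        return "avalanche photodiode (APD)"
    return "photomultiplier tube (PMT)"   # femtowatts to nanowatts

for p in (1e-3, 5e-7, 2e-12):
    print(f"{p:.0e} W -> {suggest_detector(p)}")
```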

Light Source Geometry and Active Area


Any time you are dealing with a diffuse source, a large active area detector
will ensure maximum energy capture. The cost increases with active area,
however, as does the noise level generated by the photodiode. In fact, for
low-light levels, you don’t necessarily want to choose too large an active area
because the noise level is proportional to the square root of the active area.
For laser sources, the active area of the detector is typically chosen based on
ease of alignment and/or laser beam diameter.

Measurement Bandwidth
The noise generated by a silicon detector depends on the speed at which it is
being operated. Very often, it is preferred to reverse bias a detector in order

to speed up its response. Reverse biasing a silicon detector decreases its


response time, but the downside is that the dark current noise increases. As
measurement bandwidth increases, the total noise due to the detector and the
noise due to the readout amplifier increase. Consequently, for high-speed
and/or low-light level applications, it is usually best to use an avalanche pho-
todiode or a PMT. PMTs can provide nearly noise-free gains on the order of
10⁶ and response times on the order of nano- to micro-seconds depending on
the device. Avalanche photodiodes are limited to gains below 200, but
because of their internal gain mechanism (avalanche multiplication), they are
able to operate at high speeds and do not generate nearly as much noise as
would a photodetector connected to an external amplifier.
Readout Amplifier
Typically a transimpedance amplifier is used to amplify the signal from a
photodiode. Figure 1 shows how one would typically connect a photodiode
to such an amplifier. Here, the output is shown to be:

VOUT = -IP RF = -S(λ) PIN RF

where PIN is the incident light intensity, S(λ) is the responsivity of the photo-
diode and RF is the feedback resistance of the amplifier. Figure 2 shows how
one would reverse bias the photodetector. In Figure 2,

VOUT = IP RF = S(λ) PIN RF

Note that the voltage becomes more negative as the incident light level
increases in Figure 1, whereas the voltage and light intensity move in the same
direction in the case of Figure 2.
FIGURE 1: Photovoltaic Mode
FIGURE 2: Photoconductive Mode
Ideally the photodetector and amplifier should be kept as close as possi-
ble to each other, since cable capacitance and noise pickup can degrade the
signal-to-noise ratio and sensitivity of your measurement. In this way, noise
pickup and amplifier generated noise are both kept to a minimum.
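A minimal sketch of the two output relationships above (the responsivity, light level, and feedback resistance are illustrative values only):

```python
def transimpedance_output(p_in_watts, responsivity_a_per_w, r_feedback_ohms, photovoltaic=True):
    """Vout = (-)S(lambda) * Pin * Rf for the two amplifier configurations described above."""
    v = responsivity_a_per_w * p_in_watts * r_feedback_ohms
    return -v if photovoltaic else v

# Illustrative numbers: 1 uW of light at 0.5 A/W responsivity into a 1 Mohm feedback resistor
for mode, flag in (("photovoltaic (Figure 1)", True), ("photoconductive (Figure 2)", False)):
    print(f"{mode}: Vout = {transimpedance_output(1e-6, 0.5, 1e6, flag):+.2f} V")
```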


USING FILTERS IN MACHINE VISION

Using filters in a machine vision system often provides a simple and
economical means of enhancing the application’s speed, accuracy, or
repeatability. When deciding whether filters will be useful in the appli-
cation, the entire system needs to be taken into consideration, including
the subject being imaged, illumination, optics, sensor, output, and final
viewing conditions. Filters can enhance or control density, focus, con-
trast and glare. They can also be used in fluorescent, UV and IR appli-
cations.
Filters work by decreasing portions of the incident radiation by
absorption or reflection, so that only radiation of the desired spectral
quality reaches the optics and sensor. Filters cannot add any radiation to
a system; a red filter cannot transmit red light if there is no red light in
the radiation falling on the filter. However, they are very useful for
eliminating radiation from a system, selectively determining what illu-
mination reaches the sensor, thus enhancing a system’s signal-to-noise
ratio.
There are many different kinds of filters, but in general they either
remove incident radiation non-selectively by wavelength (neutral densi-
ty (ND) filters), selectively by wavelength (color, interference, dichroic
and notch filters), or selectively by angle of polarization. The amount of
transmitted radiation is determined by the optical density of the filter.
Higher density (dark) filters block more and transmit less radiation.
Filters are commonly designed to have high optical density in one spec-
tral region (to eliminate noise) while maintaining high transmission in
another spectral region (to enhance the signal). However, filters may also
be designed simply to balance colors, in which case lesser degrees of
blocking and/or transmission are necessary.
FIGURE 1: Maxwell's Triangle
Color filters, also known as bandpass filters, are commonly used to
control contrast in black and white images. By knowing what colors you



would like to differentiate or distinguish in your subject, you can choose
lighting and filters to control contrast by using the Maxwell triangle
(see Figure 1). The triangle illustrates which colors cancel one another
(red & cyan, green & magenta, yellow & blue) and which colors com-
bine to produce another color (for example, red & green make yellow,
and yellow & magenta make red). The general rule is that a filter will
lighten subject colors that are the same color as the filter or adjacent to
that color on the Maxwell triangle. To highlight a red subject, you
would use a light source containing red light, and a red filter. Magenta
and yellow subjects would also be lightened whereas blue, cyan and
green subjects would be dark (see Figure 2). If yellow subjects need to
be lightened and red subjects darkened simultaneously, a green filter
would be used (it is adjacent to yellow, but not red in the Maxwell tri-
angle).
An industry that has driven several advancements in filter technolo-
gy is biotechnology, where one very common application is fluores-
cence imaging. Fluorescence imaging, in the biotechnology field,
makes use of light-sensitive fluorophores in order to mark, identify and
process cellular content. Machine vision has taken note of the advan-
tages of this technology, and is increasingly reaping the rewards of the
advancement of filter performance in inspection applications.
FIGURE 2: The left image is a bottle of cleaning fluid that is backlit for label inspection. Simply adding a blue color filter in front of the lens greatly increases the contrast between the yellow printing and the blue liquid, as shown in the right image.
right image.
most useful signal information from a product. Many defects or areas of
interest do not stick out clearly from the rest of the product or other sur-
roundings. This can cause problems not only for visual inspections, but
systems designed to take measurements. Image processing algorithms
are based on changes in density. Having low contrast scenes may
require more difficult and time-consuming image processing programs


or even make them unusable/unreliable.
Trying to inspect white or clear glue on white paper, for example,
can be very difficult. Color filters cannot improve contrast since the
objects are the same color. Even if there is a slight difference in hue or
value, the color fidelity and tonal sensitivity of the camera system may
not be fine enough to distinguish the difference. By calculating the
effects of metamerism, a phenomenon in which spectrally different
materials are indistinguishable to a particular sensor or viewer, a par-
ticular lighting scheme could help to distinguish a slight color differ-
ence not visible under most lighting conditions. However, this may be
time consuming and hard to control. Another approach that is finding
greater industry acceptance is using fluorescence. In the glue and paper
example, the glue or paper may have brighteners in them that fluoresce
under particular lighting. Adding a fluorophore (or fluorescent whiten-
ing agent) to products is an option if they are not already fluorescent.
Many textiles, paints, plastics, ceramics, and cleaners contain fluo-
rophores, often called an “optical brightener.”
In order to take advantage of fluorescent materials, filters are need-
ed. Filters can be used to control the incident excitation light and the
fluorescence emitted from the subject. Usually an exciter filter is used
over the radiation source and a barrier filter (or emitter filter) is used
over the camera lens. The exciter filter only passes wavelengths that
will cause the product to fluoresce. UV light sources can be used for
strong illumination. In order to obtain the desired contrast, a filter is
placed over the camera lens to transmit the longer wavelengths pro-
duced by the fluorescent material.
Filter lifetime, acceptance angle, and overall transmission are several
inherent problems with traditional narrow bandpass interference filters



that have prevented their acceptance into industrial machine vision appli-
cations. A new generation of “hard-coated” fluorescence filters, however,
is now available with extremely durable and efficient coatings.
Traditional narrow bandpass interference filters consist of up to 50
layers of dielectric materials, several glass substrates, and metallic thin
film coatings. This traditional method leads to the following problems:
first, the multiple layers reduce transmission at all wavelengths,
including the design passband wavelengths. Second, the multilayered
construction of the filters makes them difficult to use in imaging appli-
cations, because accurate ray tracing through the many layers is not pos-
sible and repeatable construction of the filters is difficult and costly.
Loose tolerancing on the overall thickness and on the center wavelength
is common. Finally, because the epoxy and often the coating materials
themselves have poor mechanical and optical durability and are prone
to humidity-induced swelling and degradation, the overall reliability of
the filters is limited.
"Hard" ion-beam sputtered (IBS) coating eliminates the problems
inherent to traditional interference filters. Dielectric films of arbitrary
thickness are deposited on a single substrate, eliminating the need for
complex cavity designs and epoxy layers. With arbitrary film thicknesses
and capabilities to deposit up to 200 layers, excellent transmission and
broad spectrum blocking can be achieved entirely by the dielectric coat-
ings applied to the substrate. Transmissions typically exceed 90% at the
design wavelength, with much steeper edges and deeper blocking. The
combination of improved signal transmission and ambient illumination
rejection simultaneously yields brighter images and darker backgrounds,


greatly improving system signal-to-noise ratio.


Thus, the use of fluorescence filters, originally designed for biotech-
nology, can greatly enhance the capabilities of a machine vision system.
The improved signal-to-noise greatly simplifies the needed complexity
of the imaging processing algorithm, which greatly improves yield and
throughput of the overall system. While these filters are certainly much
more expensive than standard color, dichroic, and polarizing filters tra-
ditionally used in vision systems, their increased durability and superior
performance pay for themselves almost immediately.


CONTRAST ENHANCEMENT THROUGH FILTERING

Application requirements
During packaging, pharmaceutical pills of different colors need to be
sorted. An automated imaging system, which distinguishes between the
different colored pills, is essential in increasing production efficiency.
In such a system, pills are inspected for specific characteristics as they
travel down a trough-like conveyor belt prior to sorting. A minimum of
60% contrast is needed for the software to be able to differentiate
between the different pills.

System Requirements Given by Customer:


Working Distance: ~350-450mm
Field of View: ~70mm
Minimum Contrast: 60%

Component selection
The 35mm MVO® Double Gauss imaging lens, used with a ¹⁄₂" CCD for-
mat camera, yields an appropriate field of view and working distance for
this application. The Sony XC-ST50 high resolution monochrome CCD
camera offers a suitable amount of resolution and dynamic range
(grayscales). A fiber optic area backlight is placed underneath the slotted
trough to diffusely illuminate the pills. A capture board is used to digi-
tize the camera signal for further image processing. In order to meet the
minimum contrast level, filtering is required. The process of the filter
selection is shown below.
FIGURE 1: An application where multi-colored pills are sent down a conveyor belt and sorted via an imaging system.

Effects of filtering
Monochrome cameras cannot inherently discriminate between different
colors. In this example, both the red and green pills appear nearly iden-
tical when imaged with the Sony XC-ST50 (see Figure 2). Filtering can

FIGURE 2: A close-up of the colored pills (above) and the grayscale values of those pills using various filters (below). Over the indicated sampling area, the grayscale profiles span roughly 119 to 100 with no filter, 217 to 62 with a red filter, and 166 to 12 with a green filter.

be used to improve the contrast between pills of different colors and


enables the system to differentiate between them. The images, along
with their associated grayscale profile curves, are illustrated in Figure 2.
All curves are generated only for the sampling area indicated.

Calculating contrast
A visual interpretation of the images and grayscale profile curves can be
quite subjective. However, a contrast value can be calculated from the
curves to determine which filter offers the highest contrast (see Figure 3).

% Contrast = (Imax - Imin) / (Imax + Imin)

where Imax is the maximum intensity and Imin is the minimum intensity.

NO FILTER: CONTRAST = (119 - 100) / (119 + 100) = 8.7%
RED FILTER: CONTRAST = (217 - 62) / (217 + 62) = 55.6%
GREEN FILTER: CONTRAST = (166 - 12) / (166 + 12) = 86.5%
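The same calculation expressed as a short script (the grayscale values are those read from Figure 2; the helper name is ours):

```python
def percent_contrast(i_max, i_min):
    """Contrast of a grayscale profile, (Imax - Imin)/(Imax + Imin), as a percentage."""
    return 100.0 * (i_max - i_min) / (i_max + i_min)

samples = {"no filter": (119, 100), "red filter": (217, 62), "green filter": (166, 12)}
for name, (i_max, i_min) in samples.items():
    c = percent_contrast(i_max, i_min)
    flag = "meets" if c >= 60 else "fails"
    print(f"{name:12s}: {c:5.1f}%  ({flag} the 60% requirement)")
```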

Conclusion
In order to differentiate between the colored pills, the software needs a
minimum of 60% contrast. A grayscale profile can be generated from
the sample area in order to calculate the contrast. The original mono-
chrome image only has an 8.7% contrast difference between the red and
green pills. The contrast can be increased beyond the minimum require-
ment by ~25% by attaching a green filter to the front of the lens. This
allows the user’s customized software to operate on a go/no-go princi-
ple and accurately sort the pills.



FIGURE 3: An example of calculating contrast using the
equation given above.

TECH TIP ON USING FILTERS
For best results, point the coated, or “mirror-like” surface towards your light source. This
will minimize any thermal effects resulting from the absorption of the heat by the glass on
the other side. Placing the filter in the opposite direction will still work, though it will cut
down your throughput and you will not get the maximum desired effect. Also, having the
“mirror-like” side facing away from the source will cause an interference pattern when the
source is a coherent beam of light. The coated surface is easily determined by looking at the
edge of the substrate, from the direction of the center of the filter at a slight angle, so that you
are looking at the inside edge. If you can see the actual edge (thickness) of the glass, then the coat-
ing is on the other side. From the coated side, the edge is not visible. This is more difficult
to check on coatings that transmit in the visible, but the edge can still be detected by view-
ing the filter at a steep angle.
Also, be aware of the tilt of the filter. For filters in general, as the angle of incidence (the
angle your source light hits the filter) increases, a filter’s transmission curve will shift to
lower wavelengths. The effect of large angles from the center of the optical system is the
same as tilting a filter from a perpendicular position to an optical system. As the angle of tilt
gets larger, the curve will start to change shape; this typically means the transmission will
steadily drop and the slopes in the curve will start to change. Most filters are designed for a
0º angle of incidence, but some filters (such as Hot Mirrors) can be designed for other angles
of incidence. Keep this in mind when specifying a filter.
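A rough sketch of the blue shift that tilt produces, using the common approximation λ(θ) = λ0·sqrt(1 − (sinθ/neff)²); the effective index neff is an assumed value here and varies from filter design to filter design:

```python
import numpy as np

def shifted_cwl(cwl_normal_nm, aoi_deg, n_eff=2.0):
    """Approximate center-wavelength shift of an interference filter with tilt.
    n_eff is the filter's effective index (an assumed value; it depends on the design)."""
    return cwl_normal_nm * np.sqrt(1 - (np.sin(np.radians(aoi_deg)) / n_eff) ** 2)

# A filter centered at 550 nm at normal incidence shifts to shorter wavelengths as it is tilted
for aoi in (0, 10, 20, 45):
    print(f"AOI {aoi:2d} deg: center wavelength ~ {shifted_cwl(550.0, aoi):.1f} nm")
```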


IMAGING LENS SELECTION BASICS

A number of factors must be considered when selecting a lens for a par-
ticular imaging application. In straightforward applications, only basic
system parameters need to be considered when making the lens selec-
tion. As more-complex applications arise, increased demands are being
placed on imaging systems. Therefore, a range of factors must be taken
into account to build a system that will perform with high levels of accu-
racy and reliability.
The following fundamental parameters of a system need to be speci-
fied to solve almost any imaging application.

Field of View (FOV)


The FOV is the viewable area of the object under inspection. In other
words, it is the portion of the object that fills the camera’s sensor. The
FOV can be specified as a range, such as 10-50mm in the case of a zoom
or varifocal lens; as an angular specification, such as 25° for lenses that
work over a range of working distances; or as a fixed number, such as
60mm in the case of fixed-magnification lenses. However, all of these
specifications are subject to change if different sized imaging arrays are
used.
A variety of array sizes are available on the market, and the physical
dimensions of the array must be known to select the right lens for your
application. As the arrays increase in size, the FOV increases; converse-
ly, as the sensor gets smaller, so does the FOV. Using the magnification
of the lens, you can specify the FOV to take into account various sensor
sizes. The magnification of the lens is the primary magnification of the
system and is referred to as PMAG.

FIGURE 1: Illustration of primary magnification and the relationship
between sensor size and FOV.

Primary Magnification (PMAG)


PMAG describes how much of an object will be seen on a given sensor
array and can be calculated using the following equation:

PMAG = sensor size (mm) / FOV (mm)

or

FOV (mm) = sensor size (mm) / PMAG
Ultimately, you need a lens that meets your FOV requirements when
coupled with your selected camera. The formula can be solved to deter-
mine what magnification lens is required or what FOV can be obtained
as the magnification or the sensor size varies. Again, you must know
your sensor size, also called chip or sensor format, to use these formu-
las.
Every lens has a PMAG or PMAG range associated with it regardless
of the sensor size that is used, since PMAG is a property of the lens (see
Figure 1). It should be noted that lenses focusing at distances approach-
ing infinity have PMAGs approaching zero. In these cases, angular field
of view values can more easily be used to determine FOV. Angular FOV
also changes as the sensor size changes:
FOV = 2 x working distance x tan(θ/2)

where θ = the angular FOV of the lens for a specific imager size
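Both relationships are easy to script when trading off sensors, lenses, and working
distances. A small sketch, with illustrative values only (a fixed-magnification lens
for the first calculation, an angular specification for the second):

    import math

    def fov_from_pmag(sensor_size_mm, pmag):
        """Fixed-magnification lens: FOV = sensor size / PMAG."""
        return sensor_size_mm / pmag

    def fov_from_angle(working_distance_mm, theta_deg):
        """Angular specification: FOV = 2 x working distance x tan(theta / 2)."""
        return 2.0 * working_distance_mm * math.tan(math.radians(theta_deg) / 2.0)

    print(fov_from_pmag(6.4, 0.5))      # 6.4mm sensor at 0.5X PMAG -> 12.8mm FOV
    print(fov_from_angle(200.0, 25.0))  # 25 deg lens at 200mm WD   -> ~88.7mm FOV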

Imaging-Sensor Formats
Most imaging arrays have a 4:3 (horizontal:vertical) aspect ratio (see
Figure 2). Note that the sensor format size is not equivalent to the sen-
sor’s active area. In general, imaging lenses can be used with any cam-
era if the lens design format is larger than or equal to that of the camera. If the
sensor is too large, vignetting (tunnel vision) will occur. For this reason,
most lenses will specify a maximum format, or maximum diagonal, with
which the lens can be used.

Working Distance
The working distance is the distance from the front of the lens to the object
under inspection (see Figure 3). Some lenses, such as objective lenses,
have a fixed working distance; many lenses have a working distance
range. While this is probably the most straightforward parameter to spec-
ify, one must consider a few details when determining the best working
distance for an application. Overall system size, moving parts, flying
debris, and lighting all must be taken into account when determining your
required working distance. Additionally, the higher the magnification of
the system, the longer the lens system will be. If high magnification is
required at a long working distance, the system size can increase even
more.

FIGURE 2: Imaging sensors come in several standard format sizes. As the
arrays increase in size, the FOV increases; conversely, as the sensor gets
smaller, so does the FOV.

Resolution

The minimum feature size of the object that can be distinguished by the
imaging system is the resolution. This specification drives many of the
decisions regarding the selection of the camera and lens for a particular
application because the resolution of the system in many ways constrains
the system's accuracy and repeatability. Resolution and contrast are direct-
ly linked, and all lenses are not created equally in terms of their ability to
produce resolution and contrast. While two lenses may have identical
PMAG/FOV, working distance, dimensions, and so forth, they may not be
able to identically reproduce contrast levels and resolution detail from the
object.



Depth of Field (DOF)
The depth of field (DOF) of a lens is its ability to maintain a desired
amount of image quality as the object is positioned closer to and farther
from best focus. Depth of field applies particularly to objects with depth,
since high DOF lenses can image the whole object clearly. As the object
is placed closer or farther than the working distance, it goes out of focus,
and both the resolution and the contrast suffer. For this reason, DOF only
makes sense if it is defined with an associated resolution and contrast.
A standard industry practice is to specify the DOF with a single value
calculated from the diffraction limit. Making a genuine comparison
between lenses is difficult, however, because many imaging lenses are not
diffraction limited. Although two lenses may have the same f/# (that is,
equal diffraction limit), they do not necessarily have similar performance
or comparable DOF. For applications where depth of field is a highly crit-
ical requirement, contact the lens manufacturer for the performance capa-
bilities of the lens you intend to use.

FIGURE 3: The working distance is the distance from the front of the lens
to the object under inspection. The minimum feature size of the object that
can be distinguished by the imaging system is the resolution. The depth of
field of a lens is its ability to maintain a desired amount of image quality
as the object is positioned closer to and farther from best focus.


OPTICS AND MACHINE VISION

Image quality in a machine vision system is determined primarily by the
quality of the system's components, such as lenses and frame grabbers.
And, image quality can be measured and specified fairly easily. You
can build a machine vision system by trial and error — by picking lens-
es, a CCD, and electronics at random and hoping they will work togeth-
er and provide an image quality sufficient for your application. Many
prototype systems are, in fact, built this way and many require consid-
erable troubleshooting to get them working.
There is a better, faster way to build imaging systems that often can
yield a cheaper system than you get by guesswork. By starting with an
understanding of image quality, you can choose components that fit the
application and complement one another. And none of the components
will be more expensive than necessary.

The first step is to understand how image quality is specified. Next, by
considering the relationship between resolution and contrast you will
understand the tremendously useful modulation transfer function. Third,
you take into account other factors related to image quality, including the
relationship between f-number and resolution; the diffraction limit; and
aberrations, depth of field, distortion, and perspective error.

Fundamentals

The fundamental parameters of imaging systems (see Figure 1) include:
• Field of View (FOV). The viewable area of the object under
inspection, i.e., the portion of the object that fills the camera's sensor.
• Working Distance. The distance from the front of the lens to the
object under inspection.
• Resolution. The minimum feature size of the object under inspection.
• Depth of Field (DOF). The maximum object depth that can be
maintained entirely in focus. The DOF is also the amount of
object movement (into and out of focus) allowable while main-
taining an acceptable focus.
• Sensor Size. The size of a camera sensor's active area, typically
specified in the horizontal dimension. This parameter is impor-
tant in determining the proper lens magnification required to
obtain a desired field of view.

FIGURE 1: The fundamental parameters of an imaging system include the
resolution of the object, the field of view, and the depth of field the user
wishes to image. The sensor size and the working distance from the object
to the lens are also important. The primary magnification is the sensor
size divided by the field of view.

Another useful descriptor of the system, the primary magnification
of the lens, is the ratio between the sensor size and the field of view. It
is not a fundamental parameter:

magnification = sensor size (mm) / field of view (mm)
In addition to resolution and depth of field, image quality is also a
combination of three other properties: image contrast, perspective
errors, and distortion (see Figure 2).

The primary purpose of any imaging system is to obtain enough
image quality to allow the extraction of necessary information. A sys-
tem that works for one application might not for another. Furthermore,
the use of over-specified components might do little more than increase
system cost. With this in mind, let's take a closer look at the components
that determine image quality.

FIGURE 2: A variety of factors contribute to the overall image quality,
including resolution, image contrast, depth of field, perspective errors,
and geometric errors.

Resolution

Resolution is a measurement of the imaging system's ability to repro-
duce object detail. Imagine, for example, a pair of black squares on a
white background. If the squares are imaged onto neighboring pixels,

they appear to be one large black rectangle. To distinguish one from the
other, a certain amount of space must exist between them. Determining
the minimum necessary space yields the limiting resolution of the sys-
tem. This relationship between alternating black and white squares is
often described as a line pair. The resolution is typically defined by the
frequency measured in line pairs per millimeter (lp/mm).
Two different but related resolutions are in play here: the resolution
in object space (the size of elements in the object that can be resolved)
and image space resolution (a combination of lens resolution and cam-
era resolution). The sensor’s line pair resolution can be no more than
half the number of pixels on the sensor because a minimum of two pix-
els is required to discern a black and white area. The image and object
space resolutions (described in lp/mm) are related by the primary mag-
nification:

image space resolution = (object space resolution) / (primary magnification)
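That relation, together with the two-pixels-per-line-pair limit of the sensor noted
above, gives a quick first estimate of the finest object detail the sensor side of the
system can support. A sketch under those assumptions (the sensor values are
illustrative, not taken from a specific camera):

    def sensor_limiting_resolution_lp_mm(pixels_across, sensor_width_mm):
        """Sensor limit: one line pair needs at least two pixels."""
        pixel_size_mm = sensor_width_mm / pixels_across
        return 1.0 / (2.0 * pixel_size_mm)

    def object_space_resolution_lp_mm(image_space_lp_mm, pmag):
        """Rearranging the relation above: object space = image space x PMAG."""
        return image_space_lp_mm * pmag

    image_res = sensor_limiting_resolution_lp_mm(640, 6.4)  # ~50 lp/mm at the sensor
    print(object_space_resolution_lp_mm(image_res, 0.5))    # ~25 lp/mm at the object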
The limiting resolution of the system can often be found by imaging
a test target (see Figure 3). A bar target consists of line pairs with vary-
ing frequencies; a star target consists of wedges with a continuum of
frequencies. The orthogonal lines in a bar target are useful because they
allow an operator to test the system for astigmatic errors, which are
errors that show up differently in the X and Y planes of an image. Bar
targets, however, are limited by having a finite number of steps in fre-
quency. Star targets do not have this drawback but can be more difficult
to interpret.

Contrast
Although the resolution and the contrast of an image can be defined
individually, they are closely related. We have already examined reso-

lution as an independent parameter that describes object detail. Let’s
now consider contrast independently before relating the two concepts.
Contrast, which describes how effectively the differences between
boundary areas on the image are reproduced relative to one another, can
often be defined in terms of grayscale or signal-to-noise. For an image
to appear well defined, the black details must appear black and the
white details, white (see Figure 4). The greater the difference in inten-
sity between a light and a dark line, the better the contrast. This is intu-
itively obvious, but more important than might first appear. The contrast
is the separation in intensity between blacks and whites:
% contrast = (Imax – Imin) / (Imax + Imin)

Reproducing object contrast is as important as reproducing object
detail, which is essentially resolution. The lens, sensor, and illumination
all play key roles in determining the resulting image contrast. The lens
contrast is typically defined in terms of the percentage of the object con-
trast that is reproduced. A sensor's ability to reproduce contrast is usu-
ally specified in terms of decibels in analog cameras and bits in digital
cameras.

FIGURE 3: Two test targets: a bar target and a star target allow users to
measure resolution and astigmatic errors. The orthogonal lines in the bar
and the radial pattern in the star allow users to test the system for
astigmatic errors. The star target's wedges have continuous frequencies
that can be calculated by radial distance, unlike the finite number of
steps in frequency offered by the bar target.

Linking resolution and contrast


Resolution and contrast are closely linked. In fact, resolution is often
meaningless unless defined at a specific contrast. Similarly, contrast
depends on resolution frequency. Consider two dots placed close to
each other and imaged through a lens (see Figure 5). Because of the
nature of light, even a perfectly designed and manufactured lens cannot
accurately reproduce an object’s detail and contrast. At best, if the lens

is operating at the diffraction limit (which we will discuss later), the
edges of the dots will be blurred in the image.
When the dots are far apart (i.e., at a low frequency), they are dis-
tinct; as they approach one another, the blurs overlap until the dots can
no longer be distinguished. The resolution depends on the imaging sys-
tem’s ability to detect the space separating one dot from another. System
resolution therefore depends on the blur caused by diffraction and other
optical errors, the dot spacing, and the system’s ability to detect con-
trast.

Because contrast and resolution are so closely related, it is often ben-
eficial to specify a contrast level at a specific resolution. The result of a
range of frequencies being measured is the modulation transfer function
(MTF) curve.

FIGURE 4: Contrast is the difference in intensity between blacks and
whites. For an image to appear well-defined, black details must appear
black and white details must appear white. The greater the difference in
intensity between a black and white line, the better the contrast. The
human eye can see a contrast of as little as 1-2%. A typical limiting
contrast of 10 to 20% is often used to define the resolution of a CCD
imaging system.

Modulation Transfer Function

Suppose we imaged a target of black and white parallel lines. Consider
the effect of progressively increasing the line spacing frequency of a tar-
get and how this might affect contrast. As we might expect, the contrast
will decrease as the frequency increases. The MTF is plotted by taking
the contrast values produced by a series of different line pairs. The
curve drawn from these points shows the modulation (i.e., the contrast)
at all resolutions, not just the limit resolution.

For many images, having a high contrast at a lower frequency is
more important than the limit resolution. Many high-speed systems fail
because the designers don't understand this.

There is another way to think about MTF. Instead of plotting the
contrast in the frequency domain, suppose we look at the intensity in the
spatial domain. We said earlier that no optical system can reproduce an
object's detail and contrast, and we discussed how an image of a dot has
blurred edges. If, instead of a dot, a single point of light is imaged
through a lens, it also spreads out. This can be measured by the point-
spread function (PSF) of a lens, which is a function of the intensity vs.
the linear distance across the image of the spot.

The MTF is the Fourier transform of the point spread function.
Because the PSF is a single cross section of the spot, we look at two
orthogonal PSFs to get a more complete picture of the image. This leads
to two (a sagittal and a tangential) MTF curves for each image point. The
two curves are sometimes averaged to make a clearer graph.

It is necessary to understand the PSF because of its relevance to real
MTF measurements. In fact, it is a point source and not a target that is
most often used to determine MTF values. Because MTF curves repre-
sent only a single point on the image, it is necessary to show multiple
field points or curves to accurately define the full image. For example,
sample points taken on the optical axis, at 0.7 the full field and at the
full field, will yield a very accurate representation of image MTF. (The
0.7 full field is used because it represents half the area of the full field.)

FIGURE 5: Contrast is not constant; it depends on frequency. The dots at
the top of the figure can be imaged through a lens. They blur slightly. If
we moved the spots closer, their blurs overlap and contrast decreases.
When the spots are close enough that the contrast becomes limiting, that
spacing is our resolution.

Using MTF to choose a lens

Each component of an imaging system has an MTF curve associated
with it — even the non-optical components such as capture boards and
cables. The MTF for each device describes the relationship between the
contrast and resolution (measured in frequency) for that component. By
understanding your needs and choosing components with the curves
you require, you can integrate a system without paying for components
with unnecessary performance.


Consider, for example, a machine vision system set up to look at an
object on an assembly line. Assuming that the information needed from
the image is not a small detail, the integrator can concentrate on maxi-
mizing contrast (increased signal to noise) at low resolutions. This will
allow the imaging system to capture the necessary data while allowing
the user to run the assembly line faster than with a high-resolution but
low-contrast imaging system.
This is also a chance to save money. When we pick a lens for the sys-
tem, our goal is to maximize contrast. Assume we look at the MTFs of
two lenses, one designed for 35mm film cameras, and the other
designed to work with CCDs (see Figure 6). The CCD lens is less
expensive than the 35mm lens, and doesn't offer usable contrast at high
resolutions. However, if we consider the MTFs of both lenses, we can
see that at the low frequencies of interest to us for this application, the
CCD lens outperforms the more costly 35mm lens by providing higher
contrast.

Assuming that the other components are also chosen with an eye to
enhancing contrast at low resolutions, the final system MTF — which
is a combination of the component MTFs — will provide the desired
performance for a high-speed assembly line application. Remember: the
choice depends on the application. If, as in the case outlined above, high
contrast at low frequency is important, then pay more attention to the
left side of the MTF curve.

FIGURE 6: Each system component has its own MTF (lens, camera,
cables, capture board, and monitor). Multiplying each MTF yields an
overall system MTF. [The charts plot % contrast versus image resolution
(lp/mm), on-axis and 4mm off-axis, for a standard 25mm F4 fixed focal
length lens (item #39-084) and a 25mm F4 Double Gauss lens; each lens
MTF is multiplied by a typical Sony XC-75 camera response to give the
system contrast.]

Using MTF
In traditional system integration, a rough estimate of system resolution
is often made by assuming it is limited by the component with the low-
est resolution. Although this approach is useful for quick estimations, it
is flawed because every component in the system contributes error to
the image, yielding poorer image quality than the component with the
lowest resolution. A more accurate system resolution can be calculated
by combining the MTF of each component.
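Here, combining means multiplying: as the caption to Figure 6 notes, the product of
the component MTFs at each frequency gives the system MTF. A minimal sketch,
assuming every component's curve has already been sampled at the same frequencies
(the numbers are made up for illustration):

    frequencies = [5, 10, 20, 30, 40]            # lp/mm
    lens_mtf    = [0.95, 0.90, 0.75, 0.55, 0.35]
    camera_mtf  = [0.90, 0.80, 0.60, 0.40, 0.20]
    cable_mtf   = [0.99, 0.98, 0.95, 0.92, 0.90]

    # System contrast at each frequency is the product of the component contrasts.
    system_mtf = [l * c * k for l, c, k in zip(lens_mtf, camera_mtf, cable_mtf)]

    for f, m in zip(frequencies, system_mtf):
        print(f, round(100.0 * m, 1))  # % system contrast vs. lp/mm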
In addition to the lens, every component in an imaging system also
has an MTF associated with it: the cameras, capture boards, cables,
monitor, and user's eyes all have MTFs. When looking at the MTF
curves, which part of the curve matters most depends on the application and the detector. If the
limiting resolution is important, then you want a curve with the
10%–20% contrasts as far to the right as possible. If, as is often the case,
high contrast at low frequencies is important, then you would pay more
attention to the left side of the MTF curve.

Lenses and apertures


MTF describes contrast and resolution, but what about other image
quality factors such as depth of field and geometrical errors? These are
consequences of dealing with lenses.
The diffraction of light limits the performance of a lens. The dif-
fraction limit of a lens is affected by the size of the aperture. The aper-
ture is inversely proportional to the f-number, which describes the light-
gathering ability of an imaging lens. As the lens aperture decreases, the
f-number increases. The diffraction limit dictates that the smallest spot
that can be imaged through a lens is proportional to the f-number.
Often, however, the limiting factor for a lens is not diffraction;
instead, optical errors and manufacturing tolerances limit performance.
When this is the case, lens performance can often be improved by
increasing the f-number.
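The text only states that the smallest resolvable spot grows in proportion to the
f-number. For a rough feel of the scale involved, the standard Airy-disk approximation
(spot diameter of about 2.44 x wavelength x f/#, an assumption of ours rather than a
figure from this article) can be scripted:

    def airy_spot_diameter_um(wavelength_um, f_number):
        """Approximate diffraction-limited spot (Airy disk) diameter in microns."""
        return 2.44 * wavelength_um * f_number

    # Green light (0.55 um) at two aperture settings
    print(airy_spot_diameter_um(0.55, 4.0))   # ~5.4 um at f/4
    print(airy_spot_diameter_um(0.55, 11.0))  # ~14.8 um at f/11: a larger minimum spot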


Using telecentric lenses to overcome perspective errors


Perspective error, also known as parallax, is part of our everyday expe-
rience in gauging distance: we expect closer objects to appear larger
than objects that are the same size but farther away. Perspective also
exists in conventional imaging systems in which the magnification of
the object changes with its distance from the lens. While this is useful
for estimating the distance of objects with known sizes, it gets in the
way of measuring objects of an unknown size. Perspective is most trou-
blesome in measurement applications involving objects with depth or
objects moving relative to the lens.

Telecentric lenses are designed to minimize perspective error. They
optically correct for perspective, and objects remain the same perceived
size, independent of their location within a depth of field and field of
view defined by the lens (see Figure 7).

While telecentric lenses do not inherently have more depth of field
than conventional designs, the images tend to blur symmetrically.
Because the center of the blur corresponds to the center of the object,
however, no error is introduced in measuring the center-to-center sepa-
ration of objects. This is true even if the object is not in focus. The field
of view of a telecentric lens is limited by the front diameter of the lens.
Because the magnification is constant for a telecentric lens, different
lenses are necessary for different fields of view.

Depth of field
As discussed earlier, the depth of field (DOF) is one of the fundamen-
tal parameters of image quality. The DOF of a lens describes its ability
to maintain a desired amount of image quality as the object is moved
closer to and farther from the best focus position. As the object moves
closer or farther than the working distance, both the contrast and reso-
lution suffer. The DOF therefore makes sense only when defined at both
a specific contrast and resolution. Lenses used at higher f-numbers have
larger depths of field.

Although the DOF can be calculated at the diffraction limit, the tech-
nique isn't useful if the lens is limited by other factors, which is often
the case. (This also means that although two lenses may have the same
f-number and thus the same diffraction limit, they do not necessarily
offer the same DOF.) Instead, the DOF can be measured at a specific
contrast and resolution for an application.

FIGURE 7: Like railroad tracks that appear to converge at the horizon,
perspective error makes the square of dots at the longest working distance
appear to be closer together than the square of dots closest to the camera
(left). A telecentric lens corrects this error within a range of working
distances and over a certain field of view (right).
Depth of field is tested using a target with a regularly marked sur-
face that slopes at a 45° angle. Testers can either eyeball the image to
see where the image blurs or can calculate the contrast from looking at
something called the line-spread function. If the lens includes an iris, its
f-number can be raised by closing the iris. In this case, a user who needs
more depth of field may be able to gain it by raising the f-number, but
at the expense of resolution.

Distortion
Distortion also limits the image quality. There are a host of optical aber-
rations that cause the lens to change magnification at different points in
the image. The magnification changes with distance from the center of
the field. One important point to remember about distortion is that no
information is lost — it is merely misplaced.
All lenses have some distortion, which is worst at the edges of the
field. The difference between the actual (distorted image) and predicted
(non-distorted object) position can be expressed in terms of a percent-
age from the center of the field. Distortion can often be fairly well cor-
rected, although it is more difficult to correct for this aberration in short
focal length lenses, such as wide-angle or fisheye lenses.
Distortion is troublesome for measurement applications, but it can
be corrected. Because no information is lost, once the distortion has
been measured (using a distortion target), it can be factored into the cal-
culation of measurements. Furthermore, the images can be corrected by
software.
The amount of distortion that is acceptable depends, again, on the
application. If the distortion at the edge of the sensor is less than the size
of a pixel, it will not have any effect on the image. If the distortion is
less than ~2%, the human eye will not perceive it.

Conclusion
If you are building a machine vision system, you need to understand the
characteristics of image quality that we have discussed. Once you
understand the tradeoffs associated with the optical system, you can
build an efficient system that works for your application. With this
information, you can specify a lens that fits the needs of the measure-
ment, without compromising performance or paying for features you
don’t need. You can also optimize the overall system for your applica-
tion before prototyping and thus cut development time dramatically.
We’ve seen some of our customers reduce their time-to-market by half.
In the process, you might also reduce overall cost.



TECH TIP ON CHOOSING A CAMERA
When choosing a camera for an industrial application, many system specifiers instinctively
select color because they feel a monochrome image is inferior. That, however, is incorrect.
Monochrome cameras have higher resolution, better signal-to-noise ratio, increased light sen-
sitivity, and greater contrast than similarly priced color cameras. Although color imaging may
be preferable, the eye perceives spatial differences more clearly in gradients of black and
white. In addition, industrial applications requiring a computer interface typically operate
with a black and white camera, since a color image requires more processing time and does
not yield significantly more information about the object.
When a high resolution color image is necessary, it is beneficial to use a 3-chip (also
called 3-CCD or RGB) camera. By utilizing three CCD sensors, these cameras offer the best
of both worlds — yielding greater spatial resolution and dynamic range than single chip color
cameras. The image is directed to each sensor using a prism and is then filtered to provide
independent red, green and blue signals. The RGB output from a three chip camera is con-
sidered to be superior to the standard NTSC and Y-C formats, because the color information
is on three separate signals.

[Comparison images: using a single chip color camera vs. using a black and white camera.]

10 LENS SPECS YOU MUST KNOW FOR MACHINE VISION OPTICS

Machine vision integrators and designers, faced with challenging hard-
ware, software, and electronic issues, often overlook optical perform-
ance specifications. Without understanding how to assess the optics,
however, the task of choosing among machine vision lenses quickly
becomes overwhelming. By understanding these ten specifications,
integrators and users can select the appropriate lens to optimize their sys-
tem's performance.
The four most basic parameters in specifying the optics for a vision
system are field of view, resolution, working distance, and depth of
field (see Figure 1). Other specifications to consider in advanced inte-
gration are the f/#, maximum chip format, distortion, zoom/focus
features, design conjugate, and telecentricity.
The big four

Simply put, the field of view should be the size of the object you need
to inspect. Many engineers tasked with specifying a machine vision
system think in terms of magnification. Magnification, however, is a
relative specification and depends on the size of the image sensor and
the size of the display device -- it has no real meaning in terms of field
of view or resolution. For example, a system with a 50X magnification
can have a field of view of 5.3 mm (if the system uses a 1/2-in. CCD
and 13-in. monitor) or a field of view of 15.2 mm (1-in. CCD, 19-in.
monitor). You must specify the field of view to ensure that the vision
system can inspect the entire region of interest.
Specifying the field of view rather than the magnification also
ensures that the system will have the appropriate resolution. The reso-
lution of the system is the minimum distinguishable feature size of the
object under inspection. In most instances, the smaller the field of view,
the better the resolution. The resolution of the system is determined by
the modulation transfer function (MTF) of the optics, camera, cabling,
and display hardware. MTF qualifies the overall imaging performance
of a component in terms of resolution and contrast.

FIGURE 1: Basic optical parameters of a machine vision system include
the field of view, working distance, resolution, and depth of field.
(Magnification is not a basic parameter.)

Too often, the MTF of the optics is ignored, and the resolution of
the system is calculated based on primary magnification and camera
pixel size. This approximation assumes perfect optics and generally
leads to under-specifying the lens and degrading the performance of the
system. Knowing how accurately the lens transfers data from the object
onto the camera chip allows the integrator to maximize the system's
field of view while maintaining appropriate resolution for the task at
hand. (Bonus tip: MTF is not just for lenses. Even the non-optical com-
ponents in the system have associated MTF curves that contribute to
the system MTF: you can avoid over- or under-specifying the perform-
ance of these components by making sure the MTFs of all the compo-
nents complement each other.)
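For reference, the pixel-only approximation described above looks like the following
in practice. It is a reasonable first pass only if you remember that it assumes perfect
optics; the values used here are illustrative:

    def pixel_limited_object_resolution_um(pixel_size_um, pmag):
        """Smallest object feature implied by the sensor alone:
        two pixels per line pair, perfect optics assumed."""
        return 2.0 * pixel_size_um / pmag

    # 7.4 um pixels at 0.25X primary magnification
    print(pixel_limited_object_resolution_um(7.4, 0.25))  # ~59 um feature, ignoring the lens MTF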
Sometimes mechanical constraints dictate difficult optical con-
straints. The working distance is the distance from the front of the lens
to the object under inspection. The longer the required working dis-
tance, the more difficult and more costly it becomes to maintain a small
field of view. Often, a small field of view will be specified out of neces-
sity with a fairly long working distance specified out of convenience.
This configuration, however, greatly increases cost and typically
reduces the resolution and light collection ability of the optics, unnec-
essarily degrading the system’s overall imaging performance. Where
mechanical constraints exist (for example, imaging a reaction inside a
vacuum chamber), this configuration may be necessary. If a long work-
ing distance isn't necessary, though, don't complicate matters needlessly.
If the objects to be imaged are three dimensional, then you must
also consider the depth of field. The depth of field of a lens is its abil-
ity to maintain a desired resolution as the object is positioned closer to
and farther from best focus. A large depth of field can simplify mount-
ing constraints, because precision movement is not necessary to posi-
tion the object at the nominal working distance of the lens. However,
keep in mind that although the lens will maintain the minimum resolu-
tion over the specified depth of field, the lens won't necessarily main-
tain the same field of view over that depth. This change in magnifica-
tion can have disastrous results on machine vision measurement appli-
cations. (Telecentric lenses – discussed below – minimize this problem.)

Choose an off-the-shelf lens based on the required working distance and
field of view. Then determine whether it provides the desired resolution,
contrast, and depth of field.

Important subtleties

Specifying field of view, resolution, working distance, and depth of
field is enough to choose an appropriate lens for your machine vision
system. By considering other factors as well, including illumination
integration, CCD format, operator error, and software development,
you can reduce setup costs and system downtime while optimizing reli-
ability and repeatability.

Depth of field, to a great extent, is controlled by the f/# of the lens.
The f/# is the ratio of the focal length of the lens to the diameter of the
aperture stop. In an ideal lens design, the f/# is the limiting factor in
system resolution. Common machine vision optics integrate an
adjustable iris into the design, allowing the user to adjust for varying
light levels and to control the depth of field. Increasing the size of the
aperture decreases the depth of field, but will often increase the resolu-
tion of the lens (see Figure 2). Decreasing the size of the aperture (com-
monly referred to as "stopping down" the lens) increases the depth of
field, but decreases the effective diffraction limit of the lens. This
degrades overall system performance.
Note that f/#, resolution, and depth of field are interrelated. Given a
42 required resolution and depth of field, the manufacturer of your
machine vision optics will be able to determine the ideal aperture set-
ting of your lens. In other words, if you plan to integrate an off-the-
shelf lens into your machine vision system, the initial lens selection
should be based on required working distance and field of view. The
optics manufacturer should then work with you to determine whether
the lens you’ve selected will be able to achieve the desired resolution,
at the necessary contrast level, with the appropriate depth of field.
The maximum CCD format is an often-overlooked and misunder-
stood specification of machine vision lenses – partly because manufac-
turers are not all measuring the same thing. To some, the maximum
CCD format is the length of the diagonal of a common CCD chip that
most closely matches the diameter of the image the lens will produce
without vignetting. Other manufacturers, however, specify the maxi-
mum CCD format as the largest diagonal the lens will cover while
maintaining specified resolution and distortion characteristics. As
Figure 2 shows, the resolution of the lens degrades as the image moves
off axis. If the first definition (coverage without vignetting) of maxi-
mum CCD format is used, your system will maintain the specified res-
olution only in the center of the chip. If the object under inspection has
critical details towards the outer edges of the image, those details may
not be resolved if your system incorporates the maximum specified
sensor format. It's important to know how the optics manufacturer is
specifying the maximum format to avoid losing critical information
about your object.

FIGURE 2: Increasing the size of the aperture decreases the depth of
field, but will often increase the resolution of the lens.

Distortion
Distortion is an optical error that causes differences in magnification of
the object at different points on the image. The information about the
object is not lost, merely misplaced, so distortion can be calculated out
of the final image. Some integrators elect to develop software to
remove the distortion, rather than specify optics that have inherently
low distortion. This method, however, leads to increased costs in over-
head, as the software takes time to develop, specific test targets must be
purchased to determine levels of distortion, and the targets and optics
need to be periodically recalibrated to ensure system accuracy. Well-
designed, long focal length optics inherently minimize distortion and
typically prove to be a more economical and more reliable long-term
solution, though they do so at a cost of increased working distances. If
system constraints require a short working distance and a large field of
view, often an off-the-shelf solution is not available and, other than a
custom lens, a software solution may be the only reasonable fix.
FIGURE 3: Illustration of positive (Pincushion) and negative (Barrel)
distortion. [The figure compares a non-distorted image with barrel- and
pincushion-distorted images and labels the actual distance (AD) and
predicted distance (PD) from the image center.]

Increased distortion can also arise when machine vision lenses are
designed to be too modular. Features including focus knurls, adjustable
irises, and zoom functions greatly increase design and manufacturing
costs while limiting overall performance. The enhanced features allow
the manufacturer to market the lenses for a variety of applications and
end-user markets, but the increased flexibility generally decreases res-
olution and throughput, increases distortion, and creates user error.
When the software of the machine vision system is calibrated to a spe-
cific field of view and aperture setting, any adjustment of the iris,
zoom, or focus of the lens requires recalibration (leading to system
downtime) or else compromises reliability. For an OEM application,
custom optics designed specifically for your field of view, working dis-
tance, resolution, and depth of field requirements would eliminate these
43 problems, but long lead-times and high design and manufacturing costs
only make the custom solution practical for large volume requirements.
Integrators for small volume requirements should be careful to select
lenses that minimize the risks associated with these flexible features by
selecting lenses without them, or by selecting lenses with lockable
focus, iris, and zoom functions. However, these functions are very use-
ful for prototype and proof-of-concept work, as they can assist the inte-
grator in determining precise field of view and depth of field settings.

Lens housings that include threads for screwing on filters can
enhance the system's performance. Filter threads simplify the addition
of color filters, neutral density filters, or polarizers to a machine vision
system (which can enhance contrast levels, reduce glare, and improve
system accuracy). The filter threads can also be used to easily integrate
illumination, thus simplifying mounting and reducing custom machin-
ing costs.

THE FOLLOWING FORMULA CALCULATES DISTORTION:

% Distortion = (AD - PD) / PD x 100

AD = Actual Distance
PD = Predicted Distance
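The formula translates directly into code once AD and PD have been measured from a
distortion target; the measured distances below are placeholders:

    def percent_distortion(actual_distance, predicted_distance):
        """% Distortion = (AD - PD) / PD x 100, per the formula above."""
        return 100.0 * (actual_distance - predicted_distance) / predicted_distance

    # A point predicted at 10.00mm from the image center lands at 9.85mm
    print(percent_distortion(9.85, 10.00))  # -1.5%, i.e. barrel (negative) distortion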

Know the design

While most machine vision optics manufacturers are unwilling to
release the optical design of their lenses, it's important to get as much
information as possible about the design criteria. One common mistake
integrators make is to ignore the design conjugate of the optics (i.e., the

optimized distance from the object plane to the lens) and to use a lens
at a short working distance when it is designed to focus at infinity. You
can force such a lens to focus at very short distances by adding spac-
ers, but the overall performance is likely to suffer: An otherwise well-
designed lens may exhibit increased distortion, chromatic and spherical
aberrations, reduced depth of field, non-uniform illumination, and
decreased light gathering ability. These problems become more preva-
lent as the lens is forced further and further from the situation for which
it was designed.

One popular method for reducing magnification changes over dif-
ferent depths of field is to use lenses designed to be telecentric.
Maintaining constant magnification is very important in machine
vision based measurement systems. Using telecentric lenses, move-
ment of the object toward or away from the lens (e.g., bottles bouncing
down a conveyor belt) will not result in the image getting bigger or
smaller, and an object which has depth or extent along the optical axis
will not appear as if it is tilted. Calibrated software can then directly
measure the size of the object.

Apply smarts and reduce costs

The more information a machine vision optics manufacturer can pro-
vide you about the lens for your system, the more likely your project is
to succeed. For simple inspection and go/no-go systems, choosing a
lens with the proper field of view at the proper working distance may
be all you need. For more demanding applications, however, the choice
of optics becomes critical to the system’s success. Working with an
optics manufacturer capable of providing tested MTF data, meaningful
depth of field information, and knowledgeable support about how
the lens performs in the application removes the guess-work from the
system integration and allows the integrator to concentrate on the more
time consuming hardware and software issues.


NEED TO KNOW OPTICS FOR MACHINE VISION

The four most basic parameters in specifying the optics for a vision sys-
tem are field of view, resolution, working distance, and depth of field.
Other specifications to consider in advanced integration are the f/#,
maximum chip format, distortion, zoom/focus features, design conju-
gate, and telecentricity.

Field of view and resolution
Many engineers tasked with specifying a machine vision system think
of field of view in terms of magnification. Magnification, however, is a
relative specification and depends on the size of the image sensor and
the size of the display device - it has no real meaning in terms of field
of view or resolution.

For example, a system with a 50X magnification can have a field of
view of 5.3mm (if the system uses a 1/2-in. CCD and 13-in. monitor) or
a field of view of 15.2mm (for a 1-in. CCD and 19-in. monitor). You
must specify the field of view to ensure that the vision system can
inspect the entire region of interest.

The resolution of the system is determined by the modulation trans-
fer function (MTF) of the optics, camera, cabling, and display hard-
ware. MTF qualifies the overall imaging performance of a component
in terms of resolution and contrast.

Knowing how accurately the lens transfers data from the object onto
the camera chip allows the integrator to maximize the system's field of
view while maintaining appropriate resolution for the task at hand.
(Bonus tip: MTF is not just for lenses. Even the nonoptical components
in the system have associated MTF curves that contribute to the system
MTF. You can avoid over- or under-specifying the performance of these
components by making sure the MTFs of all the components comple-
ment each other.)

FIGURE 1: Basic optical parameters of a machine vision system include
the field of view, working distance, resolution, and depth of field.

Sometimes mechanical constraints dictate difficult optical con-
straints. The working distance is the distance from the front of the lens
to the object under inspection.

The longer the required working distance, the more difficult and
more costly it becomes to maintain a small field of view. Where
mechanical constraints exist (for example, imaging a reaction inside a
vacuum chamber), this configuration may be necessary.

Depth of field
If the objects to be imaged are three dimensional, then you must also
consider the depth of field. The depth of field of a lens is its ability to
maintain a desired resolution as the object is positioned closer to and
further from best focus.
A large depth of field can simplify mounting constraints, because
precision movement is not necessary to position the object at the nomi-
nal working distance of the lens. Although the lens will maintain the
minimum resolution over the specified depth of field, the lens won't
necessarily maintain the same field of view over that depth. This change
in magnification can have disastrous results on machine vision meas-
urement applications. (Telecentric lenses - discussed later - minimize
this problem.)

Important subtleties
Specifying field of view, resolution, working distance, and depth of
field is enough to choose an appropriate lens for your machine vision
system. By considering other factors as well, including illumination

integration, CCD format, operator error, and software development, you
can reduce setup costs and system downtime while optimizing reliabil-
ity and repeatability.
In an ideal lens design, the f/# is the limiting factor in system reso-
lution. Common machine vision optics integrate an adjustable iris into
the design, allowing the user to adjust for varying light levels and to
control the depth of field. Increasing the size of the aperture decreases
the depth of field, but will often increase the resolution of the lens.
Decreasing the size of the aperture (commonly referred to as "stop-
ping down" the lens) increases the depth of field, but decreases the
effective diffraction limit of the lens. This degrades overall system per-
formance.

Note that f/#, resolution, and depth of field are interrelated. Given a
required resolution and depth of field, the manufacturer of your
machine vision optics will be able to determine the ideal aperture set-
ting of your lens.

The image format and distortion

To some manufacturers, the maximum image format is the length of the
diagonal of a common imaging chip that most closely matches the
diameter of the image the lens will produce without vignetting. Other
manufacturers, however, specify the maximum image format as the
largest diagonal the lens will cover while maintaining specified
resolution and distortion characteristics.
The resolution of the lens degrades as the image moves off axis. If
the first definition (coverage without vignetting) of maximum CCD for-
mat is used, your system will maintain the specified resolution only in
the center of the chip.
If the object under inspection has critical details towards the outer
edges of the image, those details may not be resolved if your system
incorporates the maximum specified sensor format. It's important to
know how the optics manufacturer is specifying the maximum format
to avoid losing critical information about your object.
Some integrators elect to develop software to remove distortion,
rather than specify optics that have inherently low distortion. This
method, however, leads to increased costs in overhead, as the software
takes time to develop, specific test targets must be purchased to deter-
mine levels of distortion, and the targets and optics need to be periodi-
cally recalibrated to ensure system accuracy.
Well-designed long-focal-length optics inherently minimize distor-
tion and typically prove to be a more economical and more reliable
long-term solution, though they do so at a cost of increased working dis-
tances. If system constraints require a short working distance and a
large field of view and an off-the-shelf solution is not available, a soft-
ware solution may be the only reasonable fix.
When the software of the machine vision system is calibrated to a
specific field of view and aperture setting, any adjustment of the iris,
zoom, or focus of the lens requires recalibration (leading to system
downtime) or else compromises reliability. Custom optics would elimi-
nate these problems, but at an increase in cost.

Know the design


It is important to get as much information as possible about the design
of the lens. One common mistake is to ignore the design conjugate of
the optics (the optimized distance from the object plane to the lens) and

to use a lens at a short working distance when it is designed to focus at
infinity.
You can force such a lens to focus at very short distances by adding
spacers, but the overall performance is likely to suffer: An otherwise
well-designed lens may exhibit increased distortion, chromatic and
spherical aberrations, reduced depth of field, nonuniform illumination,
and decreased light gathering ability. These problems become more
prevalent as the lens is forced further and further from the situation for
which it was designed.
One popular method for reducing magnification changes over dif-
ferent depths of field is to use lenses designed to be telecentric.
Maintaining constant magnification is very important in machine vision
based measurement systems.
Using telecentric lenses, movement of the object toward or away
from the lens (for example, bottles bouncing down a conveyor belt) will
not result in the image getting bigger or smaller, and an object that has
depth or extent along the optical axis will not appear as if it is tilted.
Calibrated software can then directly measure the size of the object.

HOW TO CHOOSE THE CORRECT OPTICS FOR YOUR VISION SYSTEM

When building a vision system, you must consider the application, res-
olution, illumination, depth of field (DOF), field of view (FOV), pro-
cessing speed, and other elements. But all too often, systems are built
that either fail to meet performance expectations or utilize components
that are overspecified. Both pitfalls are expensive because an under-
specified system that fails must be redesigned until it works; an over-
specified system contains components that are more expensive than
needed.
For every vision system application, the optical system is critical to
overall image quality, accuracy, speed and repeatability.

Because a vision system extracts necessary information from an
image, the application determines the required image quality. A system
with sufficient image quality for one application may not be sufficient
for another. The opposite can also be true, with many applications using
overspecified components that do little more than increase cost.

The imaging ability of a system is the result of the imaging ability of
the components. Every vision system needs illumination, a lens, a cam-
era, and either a monitor or a computer/capture board to analyze the
images. You should choose components to fit the application and com-
plement each other. By avoiding overspecifying the quality on some
parts of the system, you ensure that none of the components is more
expensive than necessary.

To specify the appropriate lens for your application, there are a vari-
ety of parameters and concepts that need to be understood. These con-
cepts include FOV, working distance, resolution, contrast, telecentrici-
ty, and lighting rolloff:

Field of view (FOV): The viewable area of the object under inspec-
tion is the FOV; that is, the portion of the object that fills the sensor in
the camera.



Working distance: The distance from the front of the lens to the
object under inspection is the working distance.

Resolution: The minimum feature size of the object under inspec-


tion is its resolution.
Depth of field (DOF): The maximum object depth that can be main-
tained entirely in focus is the DOF. It is also the amount of object move-
ment (in and out of focus) allowable while maintaining an acceptable
focus.

Primary magnification (PMAG): PMAG relates how much of an object will be seen on a given sensor array. For example, if a lens' PMAG is 2X and the size of the array is 6.4mm, then a 3.2mm FOV on the object will be seen on the array itself. Conversely, if the magnification of the lens is 0.5X, then the FOV will be 12.8mm. PMAG can also be used to calculate resolution in a system, as it relates how much of the FOV is spread across the pixels in the array: PMAG = sensor size/FOV (see the short calculation sketch after these definitions).

Sensor size: The size of the active area of the sensor in a camera,
typically specified in the horizontal dimension, is important in deter-
mining the proper lens magnification required to obtain a desired FOV.
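
The PMAG relationships above reduce to one line of arithmetic. A minimal sketch in Python, reproducing the 2X and 0.5X numbers from the PMAG definition:

    def pmag(sensor_size_mm, fov_mm):
        """Primary magnification = sensor size / field of view."""
        return sensor_size_mm / fov_mm

    def fov(sensor_size_mm, pmag_value):
        """Field of view on the object for a given sensor size and PMAG."""
        return sensor_size_mm / pmag_value

    print(fov(6.4, 2.0))    # 3.2 mm FOV on a 6.4 mm array at 2X
    print(fov(6.4, 0.5))    # 12.8 mm FOV at 0.5X
    print(pmag(6.4, 12.8))  # 0.5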

Resolution
By considering the relationship between resolution and contrast, one can understand the tremendously useful modulation transfer function (MTF). Resolution is a measurement of the ability of the imaging system to reproduce object detail. For example, imagine a pair of black squares on a white background. If the squares are imaged onto neighboring pixels, they appear to be one large black rectangle in the image. To distinguish them, a certain amount of space is needed between them. Determining the minimum distance needed to see the two squares yields the limiting resolution of the system. This relationship between alternating black-and-white squares is often described as a line pair.

Typically the resolution is defined by the frequency measured in line pairs per millimeter (lp/mm). There are two different but related resolutions in play here: the resolution in object space (the size of elements in the object that can be resolved) and image-space resolution (a combination of the lens resolution and camera resolution). The sensor's line-pair resolution can be no more than half the number of pixels across the sensor, because a pair of pixels is the minimum required to discern a black-and-white area. The image- and object-space resolutions (described in lp/mm) are related by the primary magnification of the system. The limiting resolution of the system can be determined experimentally by imaging a test target (see Figure 1).
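
A minimal sketch of the sensor-limited resolution relationships just described; the pixel size and magnification values here are assumptions for illustration, not figures from the text:

    def sensor_limit_lp_mm(pixel_size_um):
        """A line pair needs at least two pixels, so the sensor-limited
        image-space frequency is 1 / (2 x pixel size)."""
        return 1000.0 / (2.0 * pixel_size_um)

    def object_space_lp_mm(image_lp_mm, pmag):
        """Object- and image-space frequencies are related by the primary
        magnification: f_object = f_image x PMAG."""
        return image_lp_mm * pmag

    pixel_um, pmag = 8.3, 0.5                       # assumed values
    image_limit = sensor_limit_lp_mm(pixel_um)      # ~60 lp/mm at the sensor
    print(image_limit, object_space_lp_mm(image_limit, pmag))  # ~60, ~30 lp/mm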
Contrast describes how well the blacks can be distinguished from the
whites. In the real world, black-and-white lines will blur into grays to
some degree. Noise and blurring of edges will cause the contrast to
decline. How effectively the differences between boundary areas on the
image are reproduced relative to one another is often defined in terms
of gray scale or signal-to-noise ratio. For an image to appear well
defined, the black details must appear black, and the white details must
appear white (see Figure 2).



The greater the difference in intensity between a light and dark line, the better the contrast. Although intuitively obvious, contrast is more important than it may first appear. Contrast is the separation in intensity between blacks and whites. Reproducing object contrast is as important as reproducing object detail, which is essentially resolution. The lens, sensor, and illumination all play key roles in determining the resulting image contrast.

FIGURE 1: A bar target consists of line pairs with varying frequencies, whereas a star target consists of wedges with a continuum of frequencies. The orthogonal lines in a bar target are useful because they allow users to test the system for errors that show up differently in the x and y planes of an image (astigmatic errors). Bar targets are limited by having a finite number of steps in frequency. Star targets do not have this drawback; however, they can be more difficult to interpret.

Contrast
The lens contrast is typically defined in terms of the percentage of the object contrast that is reproduced. The resolution and contrast of an image can be defined individually, but they are also closely related. In fact, resolution is often meaningless unless defined at a specific contrast. Similarly, contrast depends on resolution frequency.

FIGURE 2: The greater the difference in intensity between a black-and-white line, the better the contrast. The human eye can see a contrast of as little as 1%–2%. A typical limiting contrast of 10% to 20% is often used to define the resolution of a CCD imaging system.

Consider two dots placed close to each other and imaged through a lens (see Figure 3). Because of the nature of light, even a perfectly designed and manufactured lens cannot accurately reproduce the detail and contrast of an object. When the lens is operating at the diffraction limit, the edges of the dots will still be blurred in the image. When the dots are far apart (in other words, at a low frequency), they are distinct, but as they approach each other, the blurs overlap until the dots can no longer be distinguished. The resolution depends on the imaging system's ability to detect the space between the dots. Therefore, the resolution of the system depends on the blur caused by diffraction and other optical errors, the dot spacing, and the ability of the system to detect contrast.
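
The Imax and Imin labels in Figure 2 point to the usual way this separation is quantified, contrast = (Imax - Imin)/(Imax + Imin). A minimal sketch with illustrative 8-bit intensity values (not taken from the text):

    def contrast(i_max, i_min):
        """Michelson contrast: (Imax - Imin) / (Imax + Imin)."""
        return (i_max - i_min) / (i_max + i_min)

    print(round(contrast(230, 25), 2))   # 0.8 -> well-separated blacks and whites
    print(round(contrast(140, 115), 2))  # 0.1 -> near the 10% limiting contrast noted in Figure 2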

Optical engineers usually specify a contrast level at a specific resolution. When a plot is made of contrast at a range of frequencies, you have an MTF curve. Suppose we imaged a target of black-and-white parallel lines. Consider the effect of progressively increasing the line-spacing frequency of the target and how this might affect contrast. As one might expect, the contrast will decrease as the frequency increases. Taking the contrast values produced by a series of different line pairs plots the MTF.

The curve drawn from these points shows the modulation (in other words, the contrast) at all resolutions, not just at the limiting resolution (see Figure 4). It is important to note that the high-resolution end of the curve is not always the most important part of an MTF. For many applications, high contrast at a low frequency is more important than the limit of resolution. For such applications, a higher-resolution lens will not improve the overall system, although it could very well increase the cost. Instead, more balanced or brighter illumination may be all that is needed.

In addition, if you are able to obtain better contrast at the desired resolution for your application, you can actually decrease the system's processing time. This occurs because finding and measuring objects and edges is done more quickly when the image has higher contrast.
FIGURE 3: Contrast is not constant; it depends on frequency. The dots at the top of the figure can be imaged through a lens. They blur slightly. If the spots are moved closer, their blurs overlap and contrast decreases. When the spots are close enough that the contrast becomes limiting, that spacing is the resolution.

Resolution and contrast are not the same for every point in the field. Basically, the farther out from the center of the image you go, the more resolution and contrast will fall off. This is not always a significant issue, since many lenses can outperform the sensor that they are coupled with at all points across the FOV. However, in many applications, if this is not considered, then the accuracy of the measurement taken can suffer at the edges of the field. This can lead to rejected parts being passed or good parts failed. Again, this information can be expressed in the form of an MTF curve.
There are other things that can be done to enhance the performance of a lens within the system. If you use only one color, then chromatic aberration is no longer an issue. If the system does not need to be color-corrected over the entire spectrum, the lens design can be simpler. Using a monochromatic design may also simplify the illumination system, since monochromatic LEDs use less power and create less heat than white-light incandescent bulbs. This effect can also be achieved by using color filters with a white-light source. Filters can be a very low-cost way of greatly improving the capabilities of a system. Additionally, monochromatic light sources and filters can perform color analysis.

FIGURE 4: Modulation transfer function (MTF) curves represent contrast at a given frequency (resolution). Since the MTF can vary at different positions within the field, multiple curves need to be plotted to see the true performance of the lens. The graph shows the MTF at three positions within the field: the middle, 0.7 of the field, and the full field in the corner of the image. The application requirements should be met at all points of the field, not just the center, to guarantee system accuracy. (Axes: % contrast vs. image resolution in lp/mm; curves: on-axis, 0.7-field, and full-field MTF.)

Distortion
Distortion is a geometric optical error (aberration) in which information about the object is misplaced in the image but not actually lost. Distortion can come in a few different forms. One is monotonic distortion, which is distortion that is consistently positive or negative from the center of the image out to the edges. Monotonic distortion comes in two forms: barrel (negative) and pincushion (positive).

Distortion that is not monotonic goes back and forth between negative and positive distortion as you work your way from the middle of the field to the edges. Nonmonotonic distortion can occur from efforts made during the design of the lens to reduce overall distortion or from factors specifically related to the design type. In both monotonic and nonmonotonic designs, distortion is not linearly correlated with the distance from the center of the field.


Whether the distortion is monotonic or not, software can factor it out so that accurate measurements can be made of the image. Using measurement software and a dot target of known size, you can measure the distortion at different distances from the center of the image (see Figure 5). Once this is done, distortion can be either processed out of the image or taken into account during measurement. Removing distortion from an image and redrawing the image can be a processor-intensive operation. Most often, simply using the distortion calculations is all that is required and will help to shorten processing time.

Distortion (%) = [Actual Distance (AD) - Predicted Distance (PD)] / Predicted Distance (PD) x 100

FIGURE 5: The distortion in this figure has a negative value because the edge of the field is closer to the center of the image than it should be.
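
A minimal sketch of the distortion formula above; the dot-target distances are illustrative only:

    def distortion_percent(actual_mm, predicted_mm):
        """Distortion (%) = (AD - PD) / PD x 100."""
        return (actual_mm - predicted_mm) / predicted_mm * 100.0

    # Illustrative measurement: a dot predicted at 10.0 mm from the image
    # center is actually imaged at 9.2 mm (barrel distortion pulls it inward).
    print(distortion_percent(9.2, 10.0))   # -8.0 (%)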
Telecentricity
Perspective errors, also called parallax, are part of everyday human experience. In fact, parallax is what allows the brain to interpret the three-dimensional world. We expect closer objects to appear relatively larger than those farther away. This phenomenon is also present in conventional imaging systems, in which the magnification of an object changes with its distance from the lens. Telecentric lenses optically correct for this occurrence so that objects remain the same perceived size independent of their distance over a range defined by the lens.

So, why is telecentricity desirable? What are its advantages, disadvantages, and limitations? For many applications, telecentricity is desirable because it provides nearly constant magnification over a range of working distances, virtually eliminating perspective angle error. This means that object movement does not affect image magnification. In a system with object-space telecentricity, movement of the object toward or away from the lens will not result in the image getting bigger or smaller, and an object that has depth or extent along the optical axis will not appear as if it is tilted (see Figure 6).
In systems with image-space telecentricity, image-plane movements to focus or intentionally defocus the system will not change the image size. This property is fundamental to the microlithography industry, where tolerances on feature size are routinely less than 0.1µm. Additionally, image-space telecentricity can lead to extremely uniform image-plane illumination. The normal cos⁴θ falloff (defined below) in image-plane illumination from the optical axis to the edge of the field is removed, since all chief rays have an angle of 0° with respect to the optical axis.
There are a number of qualities inherent in telecentric lenses, however, that may be considered disadvantages. First, the optical elements in the region of telecentricity (image side or object side) tend to grow in size. In the case of a doubly telecentric design (telecentric in both object and image space), both the front and rearmost lens groups need to be bigger than the object and image, respectively. For example, if the object is a 100mm square and needs to be inspected with a telecentric lens, then the front element or elements of the lens system need to be significantly larger than the diagonal of the part to provide an unvignetted field of view of the object. The diagonal of this object is almost 6 in., leading to a lens that is more than 6 in. in diameter. Such a lens would be very large and very heavy and would require special attention to mounting. Its size must be accounted for before building a machine into which it may be placed.

FIGURE 6: A cylindrical object whose cylindrical axis is parallel to the optical axis will appear to be circular in the image plane of a telecentric lens. Using a nontelecentric lens, this same object will appear to lean; the top of the object will appear to be elliptical, not circular, and the sidewalls will be visible.

A common misconception concerning DOF and telecentricity is that telecentric lenses have a larger DOF than ordinary lenses. Realistically, telecentricity does not imply large DOF, which depends only on F-number and resolution. With telecentric lenses, objects still blur farther away from best focus, but they blur symmetrically, which can be used to advantage. As long as the features of the object are within the telecentric working distance, the magnification will not change. In other words, features closer to the lens do not appear larger than those farther away.
Cos⁴θ rolloff must be considered in applications where a large sensor format or linescan camera is used. This rolloff can also be an issue in applications where large FOVs are obtained at fairly short working distances. Essentially, cos⁴θ rolloff is the relative difference in the amount of light that makes it to the center of the image as opposed to the edges. It is determined by taking the cos⁴ of the angle created by the chief ray at the center of the image and the chief ray at the edge of the image. In many applications this is not much of an issue, but when these angles become larger than 30° it can be problematic. This is especially true in linescan applications, in which systems may already be light starved because of short exposure times. Pushing a system too far can have adverse results:

θ = 30° => Relative Illumination = 0.56
θ = 45° => Relative Illumination = 0.25
θ = 60° => Relative Illumination = 0.06
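
A minimal sketch that reproduces the relative-illumination values listed above directly from the cos⁴θ law:

    import math

    def relative_illumination(theta_deg):
        """Relative illumination at a field point whose chief ray makes an
        angle theta (degrees): cos^4(theta)."""
        return math.cos(math.radians(theta_deg)) ** 4

    for theta in (30, 45, 60):
        print(theta, round(relative_illumination(theta), 2))
    # 30 0.56, 45 0.25, 60 0.06 -- matching the values above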

Rolloff must be kept in mind for systems with very short working
distances that try to achieve relatively large FOVs because this design
can produce large angles on the image side of the lens regardless of the
sensor size. Rolloff can be controlled by designing the lens to be image-
space telecentric. There are a limited number of image-space-telecentric
lenses on the market, so a custom solution may be required.
Another option for offsetting rolloff is to compensate by creating unbalanced illumination on the object itself. Additional lights can be deployed closer to the edges of the object, or neutral-density filters can be added in front of the light source to reduce the relative illumination at the center of the object.

Conclusion
Ultimately you are building a complete system, but each element needs
to be understood on its own to achieve the desired results. The optics
used can greatly affect the overall image quality, help ensure accuracy
and repeatability, and increase the overall speed of the system. The suc-
cess of an application depends on fully recognizing that no one lens can
solve all application issues.


LENS SELECTION AND MTF
Demonstrating the relationship between contrast and resolution

The performance of an imaging system is determined by its ability to provide images of a given quality. Image quality requirements vary depending on the application and are determined by the amount of information that is needed about an object in the image. The variables comprising image quality are resolution, contrast, distortion, and perspective errors. There is also a measurement that combines resolution and contrast into a single specification.

A low-resolution image contains blurry scenes in which objects lack detail. A high-resolution image provides crisp edges and includes much detail. Contrast also factors into image quality because it expresses how well an image differentiates between an object's shades of gray. An image with low contrast will appear "washed out" because it lacks vivid blacks and whites.

Resolution and contrast are closely related. To understand this, think of imaging a target with alternating equal-width black-and-white lines (Figure 1a). This target represents 100% contrast. No lens, not even a perfect one, at any resolution can fully transfer this contrast information to the image because of the inherent diffraction limit dictated by physics.

Now imagine that the width of the line pairs on the target decreases (that is, the frequency increases). As the frequency increases, the lens is less and less able to transfer the contrast, so the resulting image has less and less contrast (Figure 1b). (A line pair is one black and one white line of equal width. The "frequency" of these line pairs is often defined as the number of line pairs per millimeter, or lp/mm.)

FIGURE 1: At increasing frequencies, optical information passing through a lens loses contrast. (a) The 100% contrast information becomes 90% contrast. (b) The higher-frequency information that starts at 100% contrast becomes 20% contrast after passing through the same lens.

MTF incorporates resolution and contrast
When you must characterize the resolution and the contrast provided by a lens, you can refer to the modulation transfer function (MTF) supplied by the manufacturer for a specific lens. You do not have to measure the MTF for a lens. The MTF describes the ability to transfer contrast at a particular resolution (frequency) from an object to an image. In other words, the MTF indicates how much of the object's original contrast gets lost as the frequency in the object being imaged increases. In this way, the MTF combines resolution and contrast in a single specification.

Manufacturers measure the relationship between contrast and resolution and then plot the results as shown in Figure 2 for two lenses. The points on the lines provide the MTF values. Specifically, the graphs plot the percentage of transferred contrast vs. the frequency (lp/mm) of the lines. As mentioned above, the contrast in the image decreases with increased frequency. The MTF illustrated in Figure 2 was measured both on axis (at the center of the image) and for the full field (toward the corner edges of the field, or off axis). These measurements tell you how well the lens can resolve features throughout a field of view. Also, notice that the plot includes both horizontal and vertical performance. The difference between these two measurements indicates the amount of astigmatism present in the image.

FIGURE 2: The MTF graphs for a 25-mm fixed focal length lens and a 25-mm double-Gauss lens show how contrast varies with the image resolution of each lens. Multiplying the worst-case MTF curves for a lens by the MTF curve for a camera yields an MTF curve for the system (the lens-camera combination). The MTF for the camera is equal in horizontal and vertical directions. (Panels: a typical 25mm F4 fixed focal length lens, item #39-084, and a typical 25mm F4 double-Gauss lens, item #46-094, each multiplied by a typical Sony XC-75 response; on-axis and 4mm off-axis, horizontal and vertical curves, plotted as % contrast vs. image resolution in lp/mm.)

To understand the importance of the MTF specification, consider a conventional technique used to predict a system's performance. For a typical machine-vision system, a designer might estimate the system's performance using the "weakest link" rule of thumb. The rule holds that the system's resolution depends mainly on the component with the lowest resolution. This approach proves useful for quick estimates, but systems tend to have lower resolution than predicted by this rule of thumb, because all of the optical and electronic system components reduce resolution to some extent. And the quick estimate includes no consideration of contrast, which is also critical to image quality.
To accurately predict the image quality of the optical system, you must combine the effects of each component to determine how the overall system will affect resolution and contrast. Within a system, every component (the lens, the camera, the cables, the capture board, and so on) has an MTF. The system MTF is the product of all of the component MTF curves.


To accurately determine whether a particular lens provides suffi-
cient image quality, you must multiply its MTF by the MTF for each
component in the system. You can observe how MTF affects system
performance by comparing the resulting MTF for two different lenses
used with the same camera. The examples in Figure 2 compare a 25-
mm fixed focal-length lens with a 25-mm double-Gauss lens, each
mounted on a Sony XC-75 CCD monochrome camera. (This example
simplifies the “system” to cover just the camera and the lenses to illus-
trate how lens MTFs can affect performance.) By analyzing the lens-
camera MTF curves for each combination, you can determine which
combination will yield sufficient performance for a specific machine-
vision application. For this application, assume that you require a min-
imum contrast of 35% for an image resolution of 30 lp/mm. The dou-
ble-gauss lens is the better choice.
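
A minimal sketch of that bookkeeping: the MTF values below are hypothetical (they are not read off Figure 2), but the method, multiplying component MTFs at each frequency and checking the product against the required contrast, is the one described above:

    from math import prod

    # Hypothetical component MTFs: {frequency in lp/mm: contrast fraction}
    lens_mtf   = {10: 0.92, 20: 0.78, 30: 0.60, 40: 0.42}
    camera_mtf = {10: 0.95, 20: 0.85, 30: 0.70, 40: 0.50}

    def system_mtf(*components):
        """System MTF = product of the component MTFs at each frequency."""
        freqs = set.intersection(*(set(c) for c in components))
        return {f: prod(c[f] for c in components) for f in sorted(freqs)}

    system = system_mtf(lens_mtf, camera_mtf)
    # Example requirement from the text: at least 35% contrast at 30 lp/mm.
    print(round(system[30], 2), system[30] >= 0.35)   # 0.42 True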

Watch lens MTF specs
Lens manufacturers often can provide theoretical or nominal MTF graphs for lenses. Although this information can be helpful for planning purposes, it doesn't indicate the actual performance of a manufactured lens. Manufacturing always introduces some imperfections that degrade the performance of a lens. Accurate MTFs can be obtained either from software (as long as it takes the manufacturing tolerances into consideration) or by measuring the actual MTF of the lens after manufacturing.1 Not all lens manufacturers can provide accurate MTF measurements, however. Be sure to ask for measured MTF data when you're evaluating lenses for machine-vision systems.

REFERENCES
1. The Web site for the Research Libraries Group (Mt. View, CA) provides more information about how to measure MTF: www.rlg.org/preserv/diginews/diginews21.html


USING MTF IN A PRODUCTION ENVIRONMENT

Though resolution is a key factor in determining image quality, contrast also plays a critical role. Resolution and contrast are inherently linked; specifying the limiting resolution of an imaging system without a notion of contrast makes little sense.

The modulation transfer function (MTF) is an elegant way to characterize how well an imaging system transfers contrast as a function of object detail, i.e., resolution. As the object details get smaller, it becomes increasingly difficult for the system to transfer its contrast to the image. This property is typically represented as a plot of contrast (modulation) versus object detail (frequency). This graph gives a clear indication of the lens' performance within different frequency ranges (see Figure 1).
The advantage of specifying image quality in terms of MTF is that it is an objective measurement. Commonly used visual tests are subjective. The dependence on human perception has limitations in production environments where more hard data and recordkeeping have become necessary. Beyond this objectivity, MTF is extremely useful in determining how component performance affects system performance and application requirements.

FIGURE 1: A modulation transfer function curve clearly indicates how well the lens performs at various frequency ranges.

Many components of an imaging system have an associated MTF and, as a result, contribute to the overall MTF of the system. This includes the imaging lens, sensor, capture boards, and cables, for instance. The resulting MTF of the system can be obtained by multiplying those of all the individual components (see Figure 2). By analyzing the system MTF curve, a designer can predict which combination will yield sufficient performance. Knowing the MTF curves of components allows an integrator to make the appropriate selection to optimize the system for a particular resolution.



In a production environment
As lens production volumes rise, fast and flexible techniques to determine the quality of the lenses are required. MTF does present some drawbacks, specifically in terms of the metrology necessary to support it in production environments. Speed of measurement, for instance, has been problematic: it can take 15 to 20 minutes to characterize a lens. Stability of MTF measurement instruments also has been an issue because hardware developed for laboratory environments was deployed onto the production floor. Calibration and correlation of instruments continue to be challenging. The inherent sensitivity of MTF and the large amount of data necessary to represent it correctly make it one of the hardest optical specifications to correlate between instruments.

FIGURE 2: By multiplying MTF curves of the individual components, a designer can predict how a system will perform. In this example, the curves for a typical 25mm, f/4 fixed focal length lens are multiplied by those for a typical CCD camera to determine system performance. Resolution is given in line pairs per millimeter. (Panels: lens, camera, and system (lens x camera), with sagittal and tangential curves plotted as % contrast vs. resolution in lp/mm.)

Despite these problems, MTF presents overwhelming advantages for integrators and, therefore, has become a preferred method for designers to specify the quality of an optical element coming off the production line. Thus, manufacturers are forced to deal with this metrology in production environments, motivating many of them to develop internal production-ready instrumentation that strikes a proper balance among speed, repeatability, and data management.

Speed is vital in high-volume lens production environments. In these applications, the instruments must perform a full lens characterization within a matter of seconds. The need for this speed stems from the fact that one MTF instrument typically is used to control the output quality from numerous assemblers. In these applications, trays of lenses are often loaded and unloaded semi-automatically. The new generation of instrumentation is a major departure from traditional MTF lab equipment (see Figure 3).


Speed is not the sole requirement for production-ready instrumen-
tation, however. Accuracy and repeatability are also critical, with the
latter being significantly more important in volume production. Getting
repeatable results is crucial for observing trends of quality during pro-
duction. Having the proper controls to actively monitor quality is key
to running a production environment and ensures that the customer gets
consistent performance throughout the lifetime of the product.

Calibration
Some calibration techniques still present basic issues. The need for correlation between different instruments is absolutely critical and not trivial. It is essential to consider lens orientation, field weights and definition, and frequency definition for the proper "best focus" position, as well as the spectral content and stability of the source. It can be easier to gather all through-focus data so that the MTF can be correlated off-line; this becomes a significant task in terms of data management.

FIGURE 3: On the production floor, MTF instruments must provide fast and repeatable measurements. Data management is not a trivial undertaking.

A great amount of data is necessary to fully characterize a lens' MTF. Multiple measurements must be taken at several field points (typically at full field, 0.7 field, and on-axis) and should be taken in the four quadrants of the field to identify asymmetric response. The sagittal and tangential responses are important to identify and should be plotted together. This represents 18 curves for a specific focus position (see Figure 4). To find the best focus, this must be done in small increments. Data management clearly becomes important in communicating the through-focus MTF data of a lens.

FIGURE 4: Up to 18 locations are required to measure the through-focus MTF of a component.

MTF is a powerful tool for designers and system integrators to specify image quality. Although there are many drawbacks associated with testing it in volume production, the objective nature of the testing is important. Devising instruments that are fast and repeatable, that correlate easily, and that have well-developed data management platforms is key in production environments.

CORRECTING PERSPECTIVE ERRORS WITH TELECENTRICITY

Application requirements
In this example, a system is required to inspect the prototype of a hardware computer key connector to verify the placement of its pins. This is a laboratory setup requiring no automation. A precise measurement between each pin is determined using measurement software.

System Requirements Given by Customer:
Expected Vertical Pin Separation (center-to-center): ~2.5mm
Number of Pins Viewable Simultaneously: ~7
Object Resolution to Meet Measurement Accuracy: 36µm

FIGURE 1: An application where computer key connectors need to be analyzed in off-line inspection.

System parameter calculations
In order to accommodate the simultaneous inspection of multiple pins, the minimum field of view (FOV) should be about 18mm. By using some basic equations (see the System Parameter Equations box below), we can specify the parameters of our system and pick a suitable CCD camera.

Our system requirements dictate an 18mm field of view and a 36µm object resolution. Using these values, Eqn. 4.0 can be reduced to a ratio (Eqn. 5.0). This ratio can be used to compare the resolution of different cameras for a specific field of view (while factoring in the sensor size). We can calculate this ratio, pixel size (µm) divided by sensor size (mm), for some high-resolution digital and analog CCD monochrome cameras:

Redlake MEGAPLUS ES 1.0: 9.0/9.07 = 0.99
Sony XC-ST30: 6.4/4.8 = 1.3
Sony XC-ST50: 8.4/6.4 = 1.3

The Redlake MEGAPLUS ES 1.0 camera is the best match to the desired ratio. Using Eqn. 2.0, we calculate PMAG = 0.5X for the imaging lens. And from Eqn. 1.0, the camera's resolution is 18µm. If we assume that the lens is not the limiting factor in the resolution of the system, the corresponding object resolution is 36µm (Eqn. 3.0).

Note: Although the Sony XC-ST30 camera has a higher resolution (13µm), it only yields about 50µm object resolution because it has a smaller sensor.
SYSTEM PARAMETER EQUATIONS

Equation 1.0:
CCD Res. (µm) = 2 x CCD Pixel Size (Horiz, µm)

Equation 2.0:
Primary Mag. (PMAG) = Sensor Size (Horiz, mm) / FOV (Horiz, mm)

Equation 3.0:
Object Res. (µm) = CCD Resolution (µm) / PMAG

Combining all of these expressions with the given values yields:

Equation 4.0:
Object Res. (µm) = [2 x Pixel Size (µm) x FOV (mm)] / Sensor Size (mm)

Equation 5.0 (this example only):
Pixel Size (µm) / Sensor Size (mm) = 1

Component selection
Since the camera has been selected, a 0.5X PMAG imaging lens needs to be decided on. Conventional lens designs suffer from perspective errors, which are noticeable when imaging objects with significant height/depth, as in this example. Telecentric lenses optically correct this problem, as illustrated in the images on the next page.
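
A minimal sketch that reproduces the camera-selection arithmetic (Eqns. 1.0 to 5.0) for the three candidate cameras listed above:

    CAMERAS = {
        # name: (pixel size in um, horizontal sensor size in mm)
        "Redlake MEGAPLUS ES 1.0": (9.0, 9.07),
        "Sony XC-ST30": (6.4, 4.8),
        "Sony XC-ST50": (8.4, 6.4),
    }
    FOV_MM = 18.0    # required horizontal field of view

    for name, (pixel_um, sensor_mm) in CAMERAS.items():
        ratio = pixel_um / sensor_mm              # Eqn. 5.0 target is 1
        ccd_res_um = 2 * pixel_um                 # Eqn. 1.0
        pmag = sensor_mm / FOV_MM                 # Eqn. 2.0
        object_res_um = ccd_res_um / pmag         # Eqn. 3.0 (equivalently Eqn. 4.0)
        print(f"{name}: ratio {ratio:.2f}, PMAG {pmag:.2f}X, object res {object_res_um:.0f} um")
    # Redlake ES 1.0 -> ratio 0.99, PMAG 0.50X, ~36 um object resolution
    # Sony XC-ST30   -> ratio 1.33, PMAG 0.27X, ~48 um (the "about 50 um" noted above)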


Measurement software
Edge detection analysis at low depth of field (F2.8) was used to determine the center of the pins. The telecentric design (Image 2) maintains a symmetrical blurring within the pin diameter. The result is an accurate circular fit to the pin by the measurement software. On the other hand, the conventional design results in a perspective blur, which yields an elliptical fit. This introduces error into the prediction of the pin center and also other measurements. For example, the conventional system measures 3.51mm center-to-center separation between two diagonally adjacent pins. The telecentric system measures a 3.21mm separation. The actual pin separation is 3.16mm.

TELECENTRIC LENS - Image 1: High DOF (F16); Image 2: Low DOF (F2.8)
CONVENTIONAL LENS - Image 3: High DOF (F16); Image 4: Low DOF (F2.8)

Conclusion
The Redlake MEGAPLUS camera, model ES 1.0 digital camera, offers the best combination of high resolution and sensor size to meet the measurement accuracy requirement for this application. The telecentric lens was selected because it corrects for the perspective errors by maintaining constant magnification over the depth of field. Since the center of the object does not shift as it blurs, the telecentric lens offers a huge advantage when measuring center-to-center separation. In this example, the telecentric lens, in combination with the Redlake MEGAPLUS camera, improves the overall measurement accuracy by 86%.

TECH TIP ON CCD SENSOR SIZE


The size of the sensor’s active area is important in determining the system’s field of view. Given a fixed pri-
mary magnification (determined by the lens), larger sensors yield greater fields-of-view. The nomenclature of
these standards dates back to the Vidicon vacuum tubes used for television, so it is important to note that the
actual dimensions of the chips differ. All of these standards maintain a 4:3 (horizontal:vertical) aspect ratio.
Units: mm (horizontal x vertical; diagonal)
1/4 Inch: 3.2 x 2.4; 4.0 diagonal
1/3 Inch: 4.8 x 3.6; 6.0 diagonal
1/2 Inch: 6.4 x 4.8; 8.0 diagonal
2/3 Inch: 8.8 x 6.6; 11.0 diagonal
1 Inch: 12.8 x 9.6; 16 diagonal
Redlake MEGAPLUS ES 1.0: 9.1 x 9.2; 12.9 diagonal

Another issue is the ability of the lens to support certain CCD chip sizes. If the chip is too large for the lens
design, the resulting image may appear to fade away and degrade towards the edges because of vignetting (extinc-
tion of rays which pass through the outer edge of the lens). This is commonly referred to as the “tunnel” effect,
since the edges of the field become dark. Smaller chip sizes do not yield such problems.


MANIPULATING DISTORTION OUT OF YOUR IMAGE

Application requirements
Precise measurements of plastic mesh fencing are needed during production runs to ensure that all dimensions fall within the specified tolerances. In this situation, the space reserved for the imaging system is extremely limited. The housing for the CCD and lens is integrated into the mounting of the machinery.

System Requirements Given by Customer:
Working Distance: ~50mm
Horizontal Field of View: ~50mm
Measurement Tolerance: ±0.3mm
Component Housing: <40mm cube

FIGURE 1: An application where plastic mesh fencing is measured to ensure all dimensions meet standards.

Component selection
Although their minimum working distance is longer than desired, the MVO® Micro Video lenses are compact, making them ideal for this application. The 4.3mm focal length MVO® Micro Video lens has a 60° angular field of view under normal conditions. By introducing 0.25mm of space between the lens and the camera, the horizontal field of view is reduced to 50mm at a 50mm working distance. A high-resolution monochrome board camera offers the appropriate resolution and size. The illumination is provided by a fiber optic illuminator with a dual-branch flexible light guide. Due to the macro configuration and wide angle of the lens, distortion has been introduced into the image. This distortion must be taken into account in order to make accurate measurements.

FIGURE 2: Assembly of the CCD camera (board camera, C-mount adapter, and Micro Video lens) used in the above inspection example.

Calculating distortion
Distortion is a geometric optical error (aberration) in which information about the object is misplaced in the image, but not actually lost. Using measurement software and a dot target of known size, we can measure the distortion at different distances from the center of the image. Note: Distortion is not linearly correlated to the distance from the center of the field.

Distortion (%) = [Actual Distance (AD) - Predicted Distance (PD)] / Predicted Distance (PD) x 100

FIGURE 3: Calculating the Actual Distance (AD) and Predicted Distance (PD) for barrel distortion (distorted versus undistorted image).

Factoring distortion out
Once the amount of distortion is calculated, it can be factored out in order to yield an undistorted image. In this example, -16% (barrel) distortion is measured at the edges of the field. The distortion has a negative value because the edge of the field is closer to the center of the image than it should be. Since we are using a 1/3" format CCD camera (6mm diagonal sensor size), the corner of the sensor is 3mm from the center. Based on the amount of distortion, this point would actually be located at a distance of 3.57mm in an undistorted image. Since distortion must be measured for each point on the image, repeated calculations are required. Once this is done, distortion can either be processed out of the image (as shown on the next page) or taken into account during measurement. We can also calculate the corresponding positions on the object by dividing the image distances by the primary magnification (PMAG = 0.096). The edge of the field of view, the part of the mesh measured to be 31.3mm (= 3/0.096) from the center mark, is actually 37.2mm (= 3.57/0.096) away.
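
A minimal sketch that reproduces the correction arithmetic above, inverting the distortion formula to recover the undistorted (predicted) distance and then mapping it to object space:

    def undistorted_distance(measured_mm, distortion_pct):
        """Invert Distortion % = (AD - PD) / PD x 100 to recover PD from AD."""
        return measured_mm / (1.0 + distortion_pct / 100.0)

    PMAG = 0.096                 # primary magnification from this example
    ad_corner_mm = 3.0           # corner of the 1/3" sensor, measured in the image
    pd_corner_mm = undistorted_distance(ad_corner_mm, -16.0)

    print(round(pd_corner_mm, 2))            # 3.57 mm in the undistorted image
    print(ad_corner_mm / PMAG)               # 31.25 mm on the object (~31.3 mm above)
    print(round(pd_corner_mm / PMAG, 1))     # 37.2 mm on the object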
DISTORTED IMAGE: This is an initial distorted captured image in which the contrast is not ideal (the measured mesh heights of 16.5mm and 18.0mm appear in this image).

BINARY IMAGE: A binary image (black and white, no grays) can be generated through image processing. Note: It is not necessary to convert the image into binary to subtract distortion.

CORRECTED IMAGE: Having measured the distortion accurately, it can be removed through image manipulations. The resulting image is a precise representation of the original object (measured mesh height 18.1mm ±0.2mm).

Conclusion
The MVO® Micro Video lens and board level camera offer an ideal solu-
tion for this space-limited application. There is a high degree of distor-
tion within the lens because of its large angular field of view. Once
measured, this distortion can be factored out of the image in order to
obtain more accurate measurements. In this example, we are interested
in measuring the height of the mesh (center row). Without taking distortion into consideration, the height fluctuates from 16.5 to 18.0mm. Once
the distortion is taken into account, we realize that the range of the
height of the mesh is actually 17.9-18.3mm, well within the ±0.3mm tol-
erance.

TECH TIP ON SIGNAL FORMATS
There are four basic signal types used in CCD cameras: Composite (NTSC, EIA), Y-C (S-
video), RGB and Digital (RS-422). NTSC (RS-170A/Color) and EIA (RS-170/Monochrome)
signals are the most common and will accommodate most applications. Y-C and RGB sepa-
rate the image into components and therefore provide superior image quality for video record-
ing and image analysis. Digital cameras provide a level of performance that make them
unique. Used in conjunction with image capture boards, digital cameras do not suffer from the
visual constraints imposed by video formats. The result is greater flexibility in image acquisi-
tion and quality. In any electronic system, the signal format should be constant. Any accesso-
ries added to the common camera-monitor system are added directly after the camera.
(Connectors pictured: Y-C, RCA, and BNC.)

Each video signal format corresponds to specific cable connectors, as shown above.
Composite signals can use either BNC or RCA type connectors. Y-C uses a four-pin DIN type connector, and RGB uses four BNC connectors.


TO ZOOM OR NOT TO ZOOM

A zoom lens can overcome many hurdles faced by machine vision applications, whether they require a vastly changing field-of-view or precise magnification. But is it the best choice for every application?

Benefits
Zoom lenses run the gamut of FOV and magnification ranges. Some
are designed for high magnification applications commonly found in
semiconductor inspection, while others are designed for the larger
FOVs used in security applications. What they all have in common is
the ability to change the field without varying the working distance.
With a fixed focal length lens, the distance between the lens and the
object must be physically changed in order to achieve a different FOV.
This means that zoom lenses offer some great advantages. Imagine if,
in a security application, a camera and lens had to move from a safe
distance to within a few feet of an object or individual in order to see
more detail. It would not be practical, let alone cost effective.
Zoom lenses offer the ability to change magnification without
changing working distance. Additionally, many zoom lenses can per-
form over a range of working distances while providing the same mag-
nification. To achieve the same result without a zoom lens, a system
would require multiple fixed focal length lenses.

Drawbacks
The drawbacks to using a zoom lens should also be considered. The optical and opto-mechanical designs of zoom lenses are more complicated than the designs for most fixed focal-length lenses. This results
in higher costs for materials, machining, and assembly. Generally
speaking, zoom lenses cost much more than their fixed focal length
counterparts.
Another concern is that while zoom lenses can provide many different working distances, the overall image quality produced at any given setting may not be sufficient for the application. Fixed focal length lenses designed to meet specific working distance requirements may suit an application better.

Manufacturing issues can be another drawback for zoom lenses. When a zoom lens is assembled, the centering of components and bore sighting require a high degree of accuracy. Inaccurate assembly can lead to images that do not stay centered in the FOV throughout the zoom range. This can result in zoom lenses that are insufficient for the application. Assembly of a vision system using a zoom lens also may require centering the imaging array in relation to the camera's lens mount, an additional complication. These centering difficulties can become problematic, and they are much more pronounced in lenses with very large zoom ranges.

If an application must analyze two different FOVs, then the time the system requires to zoom between them using one lens reduces the system's efficiency. Unfortunately, utilizing two different fixed focal length lenses presents the same problem, since the object itself would most likely need to be moved. Additionally, applications like these require the zoom lens to be driven by a motor, which both increases the cost of the lens and raises concerns about the system's repeatability.

ONE OBJECTIVE, TWO FIELDS-OF-VIEW
A dual magnification system achieves images at two different field sizes at the same time. Its design allows one objective lens to be employed while utilizing two different tube lens and camera assemblies to visualize the two different FOVs simultaneously. This technique can reduce processing time and guarantee magnification repeatability. It can also increase image quality and allow for more customization (such as filtering) to be done within each individual image path.

FIGURE 1: Dual magnification system.


Other solutions may be more appropriate for applications of this nature, such as a dual magnification system (see the sidebar above).
Finally, size does matter. While most lenses are designed to be as
compact as possible, zoom lenses are larger than most fixed focal
length lenses. This may not be much of an issue in a lab setting, but size
can be a major drawback in manufacturing environments.

Summary
For many applications, zoom lenses have a lot to offer and are a great choice, especially when either the FOV or working distance is not easily defined. However, zoom lenses might not be the ideal choice in your application. All of the issues outlined above need to be carefully weighed before making a final selection.

FIELD-OF-VIEW, SENSOR SIZE AND MAGNIFICATION

A fixed focal length lens is a lens system in which the focal length is not varied; thus the lens maintains a fixed angular field.

A zoom lens is a lens system in which some lens elements move in order to vary the focal length, producing a range of achievable angular fields.

FOV and magnification are directly related. Because the FOV equals the sensor size divided by the magnification (FOV = sensor size/magnification), as the FOV increases, the magnification decreases, and vice versa. Note: if the size of the imaging array changes, then the FOV will also change, but the optical magnification stays the same.

Example: An optical system with a sensor size of 4.8mm and magnification of 1X will have a FOV of 4.8mm. If the sensor size is increased to 6.4mm with the same optical magnification, the FOV would now be 6.4mm. However, if optics that provide a magnification of 2X are used, then the FOV with a 4.8mm sensor would be 2.4mm, and with the 6.4mm sensor the resulting FOV would be 3.2mm.
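
A minimal sketch reproducing the sidebar's FOV arithmetic (FOV = sensor size / magnification):

    def fov_mm(sensor_size_mm, magnification):
        """FOV = sensor size / optical magnification."""
        return sensor_size_mm / magnification

    for sensor, mag in [(4.8, 1.0), (6.4, 1.0), (4.8, 2.0), (6.4, 2.0)]:
        print(f"{sensor} mm sensor at {mag}X -> FOV {fov_mm(sensor, mag)} mm")
    # 4.8 mm @ 1X -> 4.8 mm;  6.4 mm @ 1X -> 6.4 mm
    # 4.8 mm @ 2X -> 2.4 mm;  6.4 mm @ 2X -> 3.2 mm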


CAMERA CHOICE: COLOR VERSUS MONOCHROME

Color or monochrome
When putting together a machine vision system, the designer must inevitably decide, "Should I choose a color or monochrome camera?" System designers too often make this decision without understanding how the choice will affect the performance of their machine vision system. Many vision systems are designed with color cameras simply because end users are more comfortable with color images than with black-and-white images. In this case, the system is being built for aesthetics rather than performance. At other times the designers specify a monochrome camera assuming that it will cost less than a color camera, which is not always the case. Neither approach considers the strengths and weaknesses of the different camera technologies, and both prevent end users from gaining access to a system with the greatest value for their specific application.

Three options when you need high resolution
A single pixel can only yield intensity information about all the wavelengths of light that fall on it. Most color cameras determine the color of a pixel by using a combination of filters (such as the Bayer filter shown in Figure 1) and interpolation from the intensity values of the neighboring pixels. This interpolation inevitably leads to some loss in resolution. For example, the arrangement of the blue and red pixel sites in the Bayer filter makes single-imager color cameras prone to horizontal and vertical artifacts, especially when imaging objects with straight edges that follow a row or column. For this reason, monochrome cameras are often preferred when resolution is of the utmost importance.

FIGURE 1: The Bayer filter uses a combination of filters (a checkerboard of alternating R-G and G-B rows of red, green, and blue pixel sites) to determine the color of a pixel.

For applications that require both color and the highest possible resolution, another option is required. These needs are met by a color imaging system that does not involve interpolation: instead, these cameras use three separate sensors to gather three color channels of information. A separating prism is mounted in front of the three sensors so that the red, green, and blue light is directed to the appropriate chip. While the obtainable resolution of a three-chip color camera rivals that of a monochrome camera, a three-chip color camera is much more expensive because of the cost of the prism and additional sensors. The power requirements of three-chip cameras are also greater than those of single-chip cameras.
While a three-chip color camera may rival the resolution of a single
chip monochrome sensor of the same size, large format 3-chip cameras
are not readily available. Often, one can achieve the same resolution as a
three-chip camera on a large-format single-chip color camera by choos-
ing video lenses with an appropriately adjusted magnification.

Inspection example
These considerations play a part in deciding what kind of camera best
suits the needs of a particular machine vision application. For example,
consider an inspection system that must determine whether fuses are
properly placed in an electrical harness. Fuses are color coded, so one's
first inclination would be to use a color camera. Figure 2 shows an
image of four fuses that are either red or green in color. If the applica-
tion required a visual inspection of images on an analog monitor, using
a color camera would be a good option: The machine operator would
be viewing images that reflect the real-world situation in the same way
that he experiences it (i.e., in color).


If, instead, a computer will inspect the fuses, and only a few colors
must be differentiated, then a color camera is probably not the optimal
camera to use. Most image processing algorithms process pictures
with a pixel depth of 8-bits. If a color camera were being used, the
three color planes would have to be extracted from each image and the
analysis performed on three separate color channels. The result of each
analysis would then have to be weighed against the others before a final
decision could be made on the part. The lengthy processing time could
slow the inspection process, limiting system throughput. A mono-
chrome camera can speed the image processing time. For an applica-
tion that requires high resolution and large amounts of image process-
ing, a monochrome camera serves best.
A monochrome camera, however, doesn't differentiate colors well.
The monochrome image of the four fuses in our example yields very
poor contrast between the two colors (see Figure 3). Variations in color
within batches are significant and can result in erroneous results.
Adding a color filter to the system typically improves contrast. Figure
4 was obtained by a monochrome camera with a red color filter, and the
increased ease of distinguishing the red and green fuses in Figure 4
over Figure 3 is qualitatively obvious.
FIGURE 2: Fuses viewed using a color camera.

If the application involves additional colors, however, a monochrome camera would probably not be sufficient. In this more complicated case, a color camera would be most appropriate. The imaging process that we first considered for Figure 2 could be used, with each fuse being defined as a ratio between the mean pixel values in the three different color planes. The processing time involved in doing this comparison would be considerably greater than in the two-color example. Therefore, when color is the only differentiating factor and the number of colors present is beyond the capacity of filtered monochrome cameras, the best choice is a color camera. Applications that cannot compromise on resolution or color need to utilize three-chip color cameras.
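
A minimal sketch (using NumPy, with a hypothetical region of interest and a simple dominant-plane rule that is not spelled out in the text) of the color-plane comparison described above:

    import numpy as np

    def classify_fuse(roi_rgb):
        """roi_rgb: H x W x 3 array cropped around one fuse. Compare the mean
        pixel value of each color plane and label the fuse by the dominant plane."""
        r, g, b = (float(roi_rgb[..., i].mean()) for i in range(3))
        if r > g and r > b:
            return "red"
        if g > r and g > b:
            return "green"
        return "unknown"

    # Dummy 10 x 10 mostly-red patch as a stand-in for a cropped fuse image:
    patch = np.zeros((10, 10, 3))
    patch[..., 0] = 200
    patch[..., 1] = 40
    print(classify_fuse(patch))   # "red"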


FIGURE 3: Fuses viewed using a monochrome camera.

REFERENCES
John Titus, "What makes a camera work?", Test &
Measurement World, December 2002, p.31.

FIGURE 4: Fuses viewed using a monochrome camera with a red filter.


CHOOSING A USAF TARGET

Target Description
The United States Air Force 1951 Target was originally developed
according to MIL-STD-150A. It has proven to be a versatile and valuable
testing standard in many industries, setting the standard for most
optical resolution tests.
A variety of standard and specialty USAF targets are available, but
they all feature the same standard, well-recognized pattern. The pattern
itself is a series of groups of varying frequencies. Each frequency is rep-
resented by a Group and an Element. Each Group has six elements. Each
element consists of 3 horizontal and 3 vertical bars. The actual frequency
of each group and element can be determined by the following equation:

    Frequency (lp/mm) = 2 ^ [Group + (Element - 1)/6]

Based on this equation, it can be seen that the number of lp/mm doubles
with every group. The line width and line length can also be determined:

    Line Width (mm) = 1 / (2 ^ [Group + 1 + (Element - 1)/6])

    Line Length (mm) = 2.5 / (2 ^ [Group + (Element - 1)/6])

FIGURE 1: Negative Target

Based on these equations, it can be seen that the Line Length is 5 times
the Line Width. The Line Width is simply:

    Line Width (mm) = [1 / (2 ^ [Group + (Element - 1)/6])] / 2

Sometimes “pattern 0,1” is used to represent Group 0, Element 1.
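The equations above can be wrapped in a small helper for convenience. The sketch below is a straightforward Python implementation; the example patterns echo values quoted elsewhere in this note (pattern 6,6 at roughly 114 lp/mm and pattern 9,3 at roughly 645 lp/mm).

```python
# A small helper implementing the USAF 1951 target equations above.
def usaf_pattern(group: int, element: int):
    """Return (frequency lp/mm, line width mm, line length mm) for a USAF pattern."""
    frequency = 2 ** (group + (element - 1) / 6.0)
    line_width = 1.0 / (2.0 * frequency)   # same as 1 / 2^(Group + 1 + (Element - 1)/6)
    line_length = 2.5 / frequency          # always 5x the line width
    return frequency, line_width, line_length

for g, e in [(0, 1), (6, 6), (9, 3)]:
    f, w, l = usaf_pattern(g, e)
    print(f"pattern {g},{e}: {f:.1f} lp/mm, line width {w*1000:.1f} um, line length {l:.3f} mm")
```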


Choosing a USAF Target
A variety of USAF targets are available to fit particular applications &
lighting conditions. The following is a list of considerations and the tar-
gets to choose from.

1) Positive or Negative
If the object is opaque or front lighting is being used, a Positive (reflective)
target should be used. If the object is being backlit or darkfield illumination
is being used, a Negative (transparent) target should be used.
Positive targets are white with a black pattern, and negative targets are
black with a clear pattern etched in them.

FIGURE 2: Positive Target

2) High resolution
Standard resolution, glass USAF charts have resolutions printed up to
pattern 6,6 (114 lp/mm). Our high resolution targets include up to pat-
tern 9,3, allowing resolutions up to 645 lp/mm to be tested. If finer res-
olution is needed, consider using a High Precision Ronchi Ruling as the
test target. It should be noted that low resolution applications are not
well suited to glass targets. Situations requiring measurements below
0.25lp/mm are accomplished more cost effectively with photographic
paper.


3) Substrate
Glass
Glass is easy to keep in prime condition. It can be easily cleaned and will
not warp, tear or discolor over time. Glass can be made in either Negative
or Positive patterns. For Negative glass patterns, the spectral transmit-
tance of the glass type determines what wavelengths can be passed
through the target. Glasses such as UV fused silica will pass lower wave-
lengths than typical glass, which is often necessary for some applications.
If a wavelength less than 400nm is being used, UV fused silica should be
considered.

Photographic paper
Photographic paper can be produced in large sizes and is less expensive


than other materials. Multiple patterns can be printed on a large sheet
with varying contrast levels or varying colors to test chromatic aberra-
tions. The maximum resolution that can be achieved is pattern 4,3. The
pattern is printed with film emulsion.



Electroformed Nickel (clear optical path)


Electroformed Nickel targets are extremely thin, and are only available in
a negative pattern. The pattern is cut through the nickel substrate, so any
wavelength can be passed through the target. This target is ideal for far
UV and IR applications, including thermal tests. Up to pattern 3,6
can be measured.

FIGURE 3: Variable Contrast USAF Target on Photo Paper

4) Coating
Fluorescent coatings are extremely useful to test any fluorescent system.
Lighting does not have to be changed for the resolution test. The most
accurate measurements can be taken when the lighting being used in the

actual setup is also used for the testing. Negative and Positive setups are
available. The coating is applied to glass so high resolutions can be
achieved.

Using the Target to Test a Digital Imaging System


The main goal of the test is to determine the smallest resolvable features
of the imaging system. This can be done by imaging the target, finding the
smallest Element that can be resolved, taking a simple line profile through
the various Elements of the target, and comparing the contrast.
Figure 4 is an example of an image taken of a glass, negative USAF
target. It was used to evaluate the resolution of a system comprised of a
diffuse backlight, a microscope objective, and a monochrome FireWire
camera. The purpose of this test was to determine if the system could
measure 2mm features on a circuit board to 0.025mm accuracy.

FIGURE 4: Image of a glass, negative USAF target illuminated with a
diffuse backlight, taken with a monochrome FireWire camera and a
microscope objective.


The table below shows the image analysis that was done to compare the
obtainable contrast at different frequencies. For each pattern a line profile
was taken across the bars, and the contrast was calculated as
(Imax - Imin) / (Imax + Imin):

    Pattern               Contrast Calculation
    Group 3, Element 2    Contrast = (255 - 29) / (255 + 29) = 0.80
    Group 4, Element 2    Contrast = (255 - 41) / (255 + 41) = 0.72
    Group 5, Element 2    Contrast = (255 - 94) / (255 + 94) = 0.46
    Group 6, Element 1    Contrast = (234 - 151) / (234 + 151) = 0.22

From these tests, it can be seen that pattern 5,2 (which is 36 lp/mm)
produces sufficient contrast (46%) to distinguish 13 micron features. The
goal of the system was to measure 2mm to a 25 micron accuracy, so this
system will have no problem doing that.
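For a digital system, the contrast calculation in the table can be automated once a line profile has been extracted. A minimal sketch is shown below; the sample gray levels are illustrative values chosen only so the Group 6, Element 1 entry reproduces the 0.22 result above.

```python
# Minimal sketch of the contrast calculation used in the table above, assuming
# a line profile is available as a 1-D array of gray levels (extracted from the
# image with any imaging library). The sample values are illustrative only.
import numpy as np

def profile_contrast(profile) -> float:
    """Contrast = (Imax - Imin) / (Imax + Imin) for a line profile across the bars."""
    p = np.asarray(profile, dtype=float)
    imax, imin = p.max(), p.min()
    return (imax - imin) / (imax + imin)

# Reproducing the Group 6, Element 1 entry from the table:
print(round(profile_contrast([234, 200, 160, 151, 180, 225]), 2))  # -> 0.22
```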
The disadvantage of using the USAF target is that it gives you reso-
lution values at a particular area on the image (usually on axis). If you
were to place Group 6, element 6 in the corner of the image instead of the
center, you may get less contrast depending on the illumination and the
optics.
The contrast of the object needs to be considered, since many objects
have less contrast than the black and white lines of the target. The densi-
ty difference between the light and dark areas is greater than 2.00 on the
high contrast USAF targets that we offer.
Illumination will have a large impact on the contrast of the object. Refer
to our illumination and filter primers for information on obtaining the
maximum contrast while obtaining high frequency information in your
images. Monochromatic light is a simple solution to decrease the effects
of ambient light and chromatic aberrations.
Using the Target to Test a Visual System


Testing a visual system with the USAF target is somewhat subjective.
The viewer determines the smallest element that can be resolved. The
horizontal limit is found using the vertical bars, and the vertical limit is
found using the horizontal bars. The viewer must be able to resolve all
larger elements also. Due to optical anomalies, low frequency patterns
may look more blurred than high frequency patterns.
To relate the resolvable frequency to the system resolution of some
devices (such as a telescope), angular resolution in cycles per milliradian
(cy/mr) can be determined. The frequency (lp/mm) is multiplied by the
distance from the target (in meters):

R=FxD

where R = Resolution of the system in cycles per milliradian (cy/mr)


F = Frequency of the pattern in line pairs per millimeter (lp/mm)
D = Distance from viewer to target in meters (m)
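A quick worked example of R = F x D follows; the 20 m viewing distance is an illustrative assumption.

```python
# Worked example of R = F x D. The viewing distance is an assumed value;
# F comes from the smallest resolvable pattern on the target.
frequency_lp_per_mm = 1.0     # F, e.g. Group 0, Element 1 (1 lp/mm)
distance_m = 20.0             # D, viewer-to-target distance (assumed)
resolution_cy_per_mrad = frequency_lp_per_mm * distance_m
print(resolution_cy_per_mrad)  # -> 20.0 cycles per milliradian
```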

HOW TO REDUCE THE COST OF CONFIGURING A VISION SYSTEM

When building a vision system, one must consider the application, resolution,
illumination, depth of field, field of view, processing speed, and
other elements. But all too often, systems are built that either fail to meet
performance expectations or utilize components that are over-specified.
Both pitfalls are expensive in the long run because an under-specified
system that fails must be redesigned until it works, and an over-specified
system contains components that are more expensive than needed.
To avoid these pitfalls, pay attention to specifications. In this article we
describe the parameters of a vision system so that you can specify a system
that meets your needs. We also suggest some specific cost-saving
strategies.
Because the purpose of a vision system is to extract necessary information
from an image, the application determines the required image
quality. A system with sufficient image quality for one application may
not be sufficient for another. The opposite can also be true, with many
applications using over-specified components that do little more than
increase cost. But what is image quality? There are two complementary
ways of looking at the issue: first, the image quality of a system is the
result of the image quality of the components; second, image quality is
specified not by a single number, but by several factors discussed below.

Equipment basics and the application
The imaging ability of a system is the result of the imaging ability of the
components. Any vision system needs illumination, a lens, a camera,
and either a monitor or a computer/capture board to analyze the images.
Even the electronics cables and the user’s eyes affect the entire system’s
image quality.
It does no good to specify a high-resolution camera for use with a
low-resolution monitor. Ideally, one chooses components to fit the application
and complement each other. By avoiding over-specifying the
quality on some parts of the system, one ensures that none of the components
is more expensive than necessary.
The needs of the application determine image quality — does the
vision system have to capture images quickly? (That also affects the processing
speed of the system.) Does it need to check the orientation or
color or size of the workpiece, or just detect its presence? How large is
the smallest necessary detail? How much contrast is necessary?

FIGURE 1: Fundamental parameters of an imaging system include the
resolution of the object, the field of view, and the depth of field that the
user wishes to image. The working distance, from the object to the lens,
is also important, as is the sensor size. The primary magnification is the
sensor size divided by the field of view.
In order to talk about the needs of the application, we need a vocab-
ulary for image quality.

Image quality
Image quality consists of a number of fundamental parameters (see
Figure 1):
• Field Of View (FOV): The viewable area of the object under
inspection. In other words, this is the portion of the object that
fills the camera’s sensor.
• Working Distance: The distance from the front of the lens to the
object under inspection.
• Resolution: The minimum feature size of the object under inspec-
tion.
• Depth Of Field (DOF): The maximum object depth that can be
maintained entirely in focus. The DOF is also the amount of
object movement (in and out of focus) allowable while main-
taining an acceptable focus.


• Sensor Size: The size of a camera sensor’s active area, typi-
cally specified in the horizontal dimension. This parameter is
important in determining the proper lens magnification required
to obtain a desired field of view.
In addition to resolution and depth-of-field, as mentioned above,
image quality is also a combination of three other properties: image
contrast, perspective errors, and distortion.

Resolution, contrast, and MTF curves


By considering the relationship between resolution and contrast one can
understand the tremendously useful modulation transfer function, or MTF.
Resolution is a measurement of the imaging system’s ability to
reproduce object detail. For example, imagine a pair of black squares on
a white background. If the squares are imaged onto neighboring pixels,
then they appear to be one large black rectangle in the image. In order
to distinguish them, a certain amount of space is needed between them.
Determining the minimum distance needed to see the two squares yields
the limiting resolution of the system. This relationship between alternating
black and white squares is often described as a line pair.
Typically the resolution is defined by the frequency measured in line
pairs per millimeter (lp/mm).
There are two different but related resolutions in play here: the res-
olution in object space (the size of elements in the object that can be
resolved), and image space resolution (a combination of the lens reso-
lution and camera resolution). The sensor’s line pair resolution can be
no more than half the number of pixels across on the sensor because a
pair of pixels is the minimum required to discern a black and white area.
The image and object space resolutions (described in lp/mm) are relat-

ed by the primary magnification of the system.
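The relationship between object-space and image-space resolution can be checked with a few lines of Python. The sensor width, pixel count, and field of view below are illustrative assumptions; the sensor limit uses the two-pixels-per-line-pair rule described above.

```python
# Sketch relating object-space and image-space resolution via the primary
# magnification (PMAG = sensor size / field of view). The sensor width, pixel
# count and field of view below are illustrative assumptions.
sensor_width_mm = 6.4        # e.g. a 1/2" format sensor (assumed)
pixels_across = 640          # assumed pixel count
field_of_view_mm = 64.0      # assumed field of view

pmag = sensor_width_mm / field_of_view_mm                      # 0.1
sensor_limit_lp_mm = pixels_across / (2.0 * sensor_width_mm)   # Nyquist: 2 pixels per line pair
object_space_limit_lp_mm = sensor_limit_lp_mm * pmag           # object = image x PMAG

print(pmag, sensor_limit_lp_mm, object_space_limit_lp_mm)      # 0.1, 50.0 lp/mm, 5.0 lp/mm
```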
The limiting resolution of the system can be determined experimen-
tally by imaging a test target (see Figure 2). A bar target consists of line
pairs with varying frequencies, whereas a star target consists of wedges
with a continuum of frequencies. The orthogonal lines in a bar target are
useful because they allow users to test the system for errors that show
up differently in the x and y planes of an image (in other words, astig-
matic errors). Bar targets, however, are limited by having a finite num-
ber of steps in frequency. Star targets do not have this drawback, how-
ever they can be more difficult to interpret.
FIGURE 2: Two test targets: a bar target and a star target allow users to
measure resolution and astigmatic errors.

Contrast describes how well the blacks can be distinguished from
the whites. In real life black and white lines will blur to some degree
into grays. Noise and blurring of edges will cause the contrast to go
down. How effectively the differences between boundary areas on the
image are reproduced relative to one another is often defined in terms
of grayscale or signal-to-noise. For an image to appear well-defined, the
black details need to appear black and the white details must appear
white (see Figure 3). The greater the difference in intensity between a
light and dark line, the better the contrast. This is intuitively obvious,
but it is more important than it may first appear. The contrast is the
separation in intensity between blacks and whites.

    Image Space Resolution = (Object Space Resolution) / (Primary Magnification)

Reproducing object contrast is as important as reproducing object
detail, which is essentially resolution. The lens, sensor, and illumination
all play key roles in determining the resulting image contrast. The lens
contrast is typically defined in terms of the percentage of the object con-
trast that is reproduced. A sensor’s ability to reproduce contrast is usu-


ally specified in terms of decibels in analog cameras and bits in digital


cameras.
The resolution and contrast of an image can be defined individually,
but they are also closely related. In fact, resolution is often meaningless
unless defined at a specific contrast. Similarly, contrast depends on res-
olution frequency. Consider two dots placed close to each other and
imaged through a lens (see Figure 4). Because of the nature of light,
even a perfectly designed and manufactured lens cannot accurately
reproduce an object’s detail and contrast. Even when the lens is operat-
ing at the diffraction limit, the edges of the dots will be blurred in the
image.
When they are far apart (in other words, at a low frequency), the dots
are distinct, but as they approach each other, the blurs overlap until the
dots can no longer be distinguished. The resolution depends on the
imaging system’s ability to detect the space between the dots.
Therefore, the resolution of the system depends on the blur caused by
diffraction and other optical errors, the dot spacing, and the system’s
ability to detect contrast.
Optical engineers usually specify a contrast level at a specific resolution.
When a plot is made of contrasts at a range of frequencies, you
have a Modulation Transfer Function (MTF) curve.
Suppose we imaged a target of black and white parallel lines.
Consider the effect of progressively increasing the line spacing frequency
of a target and how this might affect contrast. As one might
expect the contrast will decrease as the frequency increases. The
Modulation Transfer Function (MTF) is plotted by taking the contrast
values produced by a series of different line pairs. The curve drawn
from these points shows the modulation (in other words, the contrast) at
all resolutions, not just at the limit resolution.
The high-resolution end of the curve is not always the most important
part of an MTF! For many applications, a high contrast at a low frequency
is more important than the limit of resolution. For such applications,
a higher-resolution lens (for example, one designed to work with
film rather than with CCDs) will not improve the overall system —
although it will increase the cost. Instead, brighter illumination may be
all that is needed.

    % Contrast = (Imax - Imin) / (Imax + Imin)

    where Imax is the maximum intensity and Imin is the minimum intensity

FIGURE 3: Contrast is the difference in intensity between blacks and
whites. For an image to appear well-defined, black details must appear
black and white details must appear white. The greater the difference in
intensity between a black and white line, the better the contrast. The
human eye can see a contrast of as little as 1-2%. A typical limiting
contrast of 10 to 20% is often used to define the resolution of a CCD
imaging system.
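Tabulating the measured contrast against frequency gives the MTF curve described above. The short sketch below interpolates between a handful of measured points; the frequency and contrast values are illustrative assumptions, not measured data.

```python
# A minimal sketch of building an MTF curve from contrast measurements at a
# series of line-pair frequencies. The frequency/contrast pairs are illustrative
# assumptions, not measured data from the article.
frequencies_lp_mm = [5, 10, 20, 40, 60, 80]
contrasts = [0.95, 0.90, 0.72, 0.46, 0.28, 0.12]   # contrast = (Imax-Imin)/(Imax+Imin)

def contrast_at(freq, freqs=frequencies_lp_mm, c=contrasts):
    """Linearly interpolate the measured MTF to estimate contrast at any frequency."""
    if freq <= freqs[0]:
        return c[0]
    for (f0, f1), (c0, c1) in zip(zip(freqs, freqs[1:]), zip(c, c[1:])):
        if f0 <= freq <= f1:
            return c0 + (c1 - c0) * (freq - f0) / (f1 - f0)
    return c[-1]

# Example: interpolate the curve between measured points.
print(contrast_at(30))   # estimated contrast at 30 lp/mm
```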

Other strategies for reducing costs


There are some other specific strategies that designers can apply to
reduce costs. The overriding theme is to eliminate unnecessary complexity:
keep the system as simple as possible. Here are four strategies
that can be used together or separately:
• eliminate colors
• fix apertures
• eliminate folds
• use off-the-shelf optics
Do you need white light or would monochromatic illumination work
just as well? If you only use one color, then chromatic aberration is no
longer an issue. If the system does not need to be color-corrected over
the entire spectrum, the lens design is simpler. Going monochromatic
may also simplify the illumination system since monochromatic LEDs
use less power and create less heat than white light incandescent bulbs.
If you can fix the system apertures, while maintaining the ability to
focus the system, this also simplifies the optical design and can reduce
the number of elements in the system. All other things being equal,
fewer elements mean less cost.

FIGURE 4: Contrast is not constant! It depends on frequency. The dots
at the top of the figure can be imaged through a lens. They blur slightly.
If we moved the spots closer, their blurs overlap and contrast decreases.
When the spots are close enough that the contrast becomes limiting,
that spacing is our resolution.


Folds in the optical path introduce aberrations. If at all possible, lay
out the path in a straight line. Also, you avoid the cost of the folding
mirrors.
Finally, use off-the-shelf system elements when possible. Unless the
vision system will be produced in quantities of several hundreds or
thousands, off-the-shelf optics will be cheaper than custom-made
optics. They will certainly be faster to obtain than custom optics.
Most design software packages have off-the-shelf lenses preloaded
into them. The sooner off-the-shelf options are worked in, the better.
Typical design software will give a starting point with custom lenses
when one optimizes all surfaces. Then one can force the software to
replace the custom lenses with the closest off-the-shelf matches and
allow air spaces to compensate. The best time to do this is before start-
ing on the mechanical designs.

Off-the-shelf solutions and design


All of these parameters can lead to exacting specifications for a lens.
This often leads integrators to want a custom lens, feeling that an off-
the-shelf lens could not fit all of the parameters correctly. However,
finding an off-the-shelf lens is often easier than it may seem. An
off-the-shelf solution will generally be more cost effective than a custom
solution; in optics this is especially true, because it takes high volumes
to manufacture lenses efficiently, and costly, time-consuming design is
often necessary to come up with a lens that will work properly.
With the tremendous growth in machine vision, off-the-shelf video
lenses are more common. We at Edmund carry a very wide selection of
lenses designed specifically for machine vision applications. Usually a
little bit of flexibility on one or more parameter allows for an easy selec-

tion of an off-the-shelf lens. A simple and cost effective way to work
an off-the-shelf lens into a design is to choose a lens before the mechan-
ical design of the entire system is finished. The majority of times I walk
into a customer’s facility where they are trying to solve a vision problem,
the lens was the last thing to be considered. The delay in considering
it often leads to very difficult mechanical constraints to deal with.
The housings usually could have been changed early on, but by the time
the lens is integrated it is too difficult.
There are times when custom makes sense: for example, when you have
very tight requirements for working distance and packaging. If a
specific field of view is necessary, a custom fixed lens that gets the
correct field of view may still be cheaper than a zoom lens that happens
to reach the right field of view with a lot of unused adjustability. For
very high volume, the costs of design and setup can be amortized, and
repeating costs such as the number of lenses and mechanical adjustments
can often be minimized with a custom design.
The disadvantages of custom can also be avoided by using off-the-
shelf elements in the design. Lead times for manufacturing lenses are
a significant factor to consider: production of a custom lens can easily
take 8-10 weeks. The costs associated with test plates and tooling can
be high as well. By using off-the-shelf elements you avoid the setup
fees and the lead times.
Most catalog lenses are already in popular design packages, which
makes designing with them easier. This also allows for a quick proto-
type that helps in proving the concept of the design before an invest-
ment in design and manufacturing. Many applications can easily be


solved with simple telephoto or reverse telephoto designs using two


achromats.
Finally, if the design is very demanding, a ground-up custom design may be
necessary. We see many applications each month where there is no
other option than a custom design. In these cases we always make sure
the customer knows that there will be a significant cost and lead time
for a design. Though we can often produce a design in a rather short time,
the prototypes will always take some time to produce if we cannot use
off-the-shelf components.
When we go to a custom design, the trade-off for the longer lead
time and the higher cost of design is that we can often save money
in the long run for high volume. One way to do this is that, since we are
designing the lens for a known application, we can remove some of the
adjustments that are normally built into an off-the-shelf lens. If the
illumination is going to be constant, we can fix the iris at a single
setting. We can design the lens to get the performance that is required
without over-designing it beyond the needs of the application.
It is also good to have the manufacturer be part of the design
process. We do many custom designs because we also manufacture the
lenses. We can often save time and money designing to tooling and test
plates we already have. Also we understand our tolerancing and manu-
facturing better than anyone so we can design specifically to the manu-
facturing abilities.

Conclusion
The first step in an efficient vision system is to properly specify the nec-
essary requirements and the degrees of freedom. The more degrees of free-
dom, the easier it will be to find an off the shelf solution. If a custom
design is necessary look first to solutions using off-the-shelf components

to make design, prototyping, and production faster and cheaper.
The basic cost-saving strategy for vision systems is to specify what
you need, and no more. Apply as much intelligence as possible when
specifying the system, and use some common sense tips to reduce costs
during design. If you do, then your system will fill your needs efficiently.

TECH TIP ON CHOOSING MONITORS


Video systems require video compatible monitors rather than computer monitors.
Monitor specifications, such as signal format, component level signals, and relative
resolution, must match the input device. Based on the resolution of the human eye,
the following equation may be used to yield the maximum monitor viewing distance:

    viewing distance ≤ [4148 x Diagonal Screen (in inches)] / TVL
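A quick evaluation of the tech-tip formula is shown below; the monitor diagonal and TVL values are illustrative assumptions, and the result comes out in the same units as the diagonal.

```python
# Evaluating the tech-tip formula. The monitor diagonal and TVL value are
# illustrative assumptions; the result is in the same units as the diagonal.
diagonal_in = 17.0      # monitor diagonal, inches (assumed)
tvl = 480               # relative resolution in TV lines (assumed)
max_viewing_distance = 4148 * diagonal_in / tvl
print(round(max_viewing_distance, 1))   # maximum recommended viewing distance
```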


SET YOUR SIGHTS ON VISION

Semiconductor equipment such as wire bonders and surface profiling
equipment requires integrated sensors that can monitor a process or
locate material, and these sensors are often optical imaging systems.
Despite this fact, many semiconductor equipment manufacturers
which employ entire groups of mechanical, electrical and software
engineers have only a single engineer in charge of optical systems.
And yet, the need for integrating optics into the machinery has
never been greater. And because space is always at a premium in a fab
cleanroom, system designers have little elbow room. Often, integrat-


ing a vision system means snaking the optical system through the
equipment without interfering with the primary process, be it wire-
bonding, die packaging, aligning wafers, or lining up registration
marks before lithography or metrology — many of the common fabrication
processes can benefit from using optics.
If you can’t afford a large optical engineering department, you can
apply a design strategy for implementing imaging systems within the
tight space constraints of your equipment. The steps are straightforward:
1. Define the image quality you need
2. Determine whether the quality is feasible
3. Prototype
4. Place the lighting
5. Make it fit
6. Reduce production costs

FIGURE 1: Fundamental parameters of an imaging system.

Image quality
The primary purpose of any imaging system is to obtain sufficient image
quality to extract necessary information. There is no single number that
determines image quality. Before you can specify your vision system

needs, spend some quality time with the object you want to view.
For the fundamental parameters of an imaging system, see Figure 1
and the sidebar, at left.
Another useful descriptor of the system, the primary magnification of
the lens, is the ratio between the sensor size and the field of view. It is
not typically used as a fundamental parameter.
In addition to resolution and depth-of-field (see sidebar, at left) image
quality is also a combination of three other properties: image contrast,
perspective errors, and distortion (see Figure 2).
The point of considering all these factors is to determine the minimum
acceptable image quality. Defining the minimum image quality is crucial.
Tightly packed optical systems all have one thing in common: they
sacrifice lots of image quality to accommodate for mechanical constraints.
In addition, truly understanding image quality requirements can
mean huge savings in both time and money. Know your minimums!

FUNDAMENTAL PARAMETERS OF AN IMAGING SYSTEM
Field Of View (FOV): The viewable area of the object under inspection.
In other words, this is the portion of the object that fills the camera’s sensor.
Working Distance: The distance from the front of the lens to the object
under inspection.
Resolution: The minimum feature size of the object under inspection.
Depth Of Field (DOF): The maximum object depth that can be maintained
entirely in focus. The DOF is also the amount of object movement (in and
out of focus) allowable while maintaining an acceptable focus.
Sensor Size: The size of a camera sensor’s active area, typically specified
in the horizontal dimension. This parameter is important in determining the
proper lens magnification required to obtain a desired field of view.

Will it fit?
Once you have nailed down your basic parameters — what you
absolutely must have — it is time to crunch some numbers to find a
combination of focal lengths and object/image distances that will work
— or to determine that no feasible system can fit the requirements
you’ve come up with.
The bad news is that this usually involves working through thin
lens equations, which you probably saw last in a college physics textbook.
In addition, those equations can lead to very misleading results.
The good news is that you don’t have to do it yourself. Get on the


phone and start calling optical companies. Any modern optical compa-
ny worth working with has optical design software that can quickly and
easily provide a preliminary solution. The even-better news: Unless
your problem is extremely complex, this service is usually free.
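For a first-pass feasibility check before calling in design software, the thin lens equation can be solved directly. The sketch below assumes a simple single-thin-lens layout; the magnification and track length are illustrative assumptions.

```python
# A rough feasibility check using the thin lens equation (1/f = 1/o + 1/i).
# The magnification and available track length are illustrative assumptions.
def thin_lens_layout(magnification: float, track_mm: float):
    """Given |m| = i/o and o + i = track, return (object distance, image distance, focal length)."""
    o = track_mm / (1.0 + magnification)   # object distance
    i = track_mm - o                       # image distance
    f = 1.0 / (1.0 / o + 1.0 / i)          # thin lens equation
    return o, i, f

o, i, f = thin_lens_layout(magnification=0.25, track_mm=300.0)
print(f"object {o:.1f} mm, image {i:.1f} mm, focal length {f:.1f} mm")  # 240, 60, 48
```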

Prototype
Prototype using off-the-shelf components. You will find that off-the-
shelf prototyping is fast, inexpensive, and it allows you to confirm
image quality requirements. Even the best optical designer cannot per-
fectly predict the effects of illumination and object surface characteristics.
Another word of hard-won wisdom: set up your initial prototype in
a straight line. The final optical system will, no doubt, have a number
of bends and twists in it, but at this point, you need to understand the
basic effects of lenses, apertures, CCDs and illumination. If you don’t
become familiar with these characteristics of your system at this point,
debugging later can become a nightmare. Even if you have to make
special metal fixtures to hold the straight-line system, it is well worth the time.
Finally, make sure your aperture sizes are realistic. Chances are, you
have chosen lenses with diameters that won’t fit in the mechanical space
you’ve allotted for the optics. Use apertures to simulate the diameters
that you can realistically expect.

FIGURE 2: A variety of factors contribute to the overall image quality,
including resolution, image contrast, depth of field, perspective errors,
and geometric errors.
Illuminate!
Most imaging systems that fail do so because the objects in the field of
view are improperly illuminated. Without sufficient illumination, the
system’s contrast suffers, and the image quality therefore suffers as
well. Contrary to popular belief, contrast is more important than reso-
lution in many imaging systems.

Let’s repeat that: contrast is more important than resolution in many
imaging systems.
For an image to appear well-defined, the black details need to appear
black and the white details must appear white. The greater the difference
in intensity between them, the better the contrast. For the imaging
system to have a chance of transmitting a good contrast image, however,
the object has to be illuminated in a way that provides good contrast
to begin with.
Know your angles: illumination is all about geometry. Consider the
relationship between lighting geometry and surface features in these
examples (see Figure 3):
Diffuse light from the front can be provided by fluorescent linear
or ring lamps and minimizes shadows and specular reflections, but it
also makes surface features less distinct.
Single-directional glancing incidence lighting, such as from
fiberoptic light guides, goes to the opposite extreme: it shows surface
defects and topology very well, but also causes extreme shadows and
bright spots.
Directional illumination, provided by one or more fiberoptic light
guides, offers more moderate properties: strong, relatively even lighting
but with some shadows and glare.
Ring lights, provided by fiberoptic or LED ring light guides, reduce
shadows and provide relatively even illumination, but can sometimes be
difficult to mount and can sometimes create a circular glare problem
from highly reflective surfaces.


Polarized lighting, provided by a regular light source with a filter


attached, provides even illumination but offers less intensity through the
polarizer.
Diffuse axial lighting can be offered by LED axial illuminators or
fiberoptic-driven axial adapters, and offers shadow-free even illumina-
tion with little glare, but requires an internal beam splitter which
reduces the intensity.
Structured light, which can be provided by a line-generating laser
diode or a fiberoptic line lightguide, is very useful for extracting surface
features, but the disadvantage of using a laser is that some colors may
absorb the intense light and heat up.
Optical engineers love LEDs. The use of monochromatic LEDs
solves a lot of imaging problems and simplifies optical designs: the
main benefit is that if you use only one color of light, then chromatic
aberration simply isn’t a factor. As with most things, however, there is
a price to pay: LED illumination can be uneven and not provide enough
energy where you need it. To fit your purpose, LED-generated light
may need to be reshaped, diffused or directed by a lens.
Debugging illumination can be tricky. Two tools you should not be
without are a flat mirror and chrome ball bearing. These two surfaces
accurately show the location and intensity of your illumination sources
regardless of object surface characteristics.

Making it fit
Now that you’ve ironed out the basic optical path and illumination, you
get to make the system fit in the space allotted to it. When you start
adding folds and combining optical paths, you start earning your keep.
This looks easy on paper, but it can be a tolerancing and debugging hell.
While we can’t make this easy, we can mention some details to think

about:
Mirror thickness has a direct effect on image quality. While it may
be tempting to specify ultra thin mirrors and beamsplitters, doing so
makes it impossible for optical manufacturers to guarantee surface flat-
ness and thus image quality. Just holding the mirror can deform its
FIGURE 3: What you can see depends on the light you shed on the object.
Different types of illumination can solve or create problems for your
imaging system.

shape. If you need surface flatness of ¹⁄₄ wave or less, a good rule of
thumb is to use a 6:1 ratio between surface size and thickness. If you
have to specify thinner optics, take a great deal of care when mounting
the parts to avoid deforming them due to strain in the mechanical fixtures
or from bonding. One last point on mirrors: Mounting them from
the front can alleviate the need for tight thickness tolerances.
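The 6:1 rule of thumb is easy to apply directly; a tiny helper is shown below with an illustrative 50mm mirror as the example.

```python
# The 6:1 size-to-thickness rule of thumb from the text. The example mirror
# size is an illustrative assumption.
def min_mirror_thickness(surface_size_mm: float, ratio: float = 6.0) -> float:
    """Minimum thickness suggested by a size:thickness ratio (default 6:1)."""
    return surface_size_mm / ratio

print(round(min_mirror_thickness(50.0), 1))   # a 50mm mirror -> about 8.3mm thick
```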
Your system may be using infrared LEDs, which is great, but good
IR mirrors take a little getting used to. They are often gold mirrors,
which are soft and easily damaged. Have a talk about these issues with
both your supplier and the production people before your design hits the
manufacturing floor.
A wise designer allows for adjustments. Long optical paths can be
very sensitive to centering, boresight and angular tolerances. Folding
the optical path multiplies this problem by a factor of three. What
works well in the lab may fail on the production floor. If possible,
design gimbal adjustments for all of your folds. An x-y adjustment in
the CCD plane can help to adjust for boresight errors. If you can’t do
that, do a stringent geometric tolerance analysis to minimize the effects
listed above.
Don’t ignore back reflections. When you use a beamsplitter to com-
bine the illumination and imaging optics into the same path, only 20 to


40% of the illumination is used. The rest of the light passes out of the
system. But when the stray light hits a piece of metal in the machine and
reflects back into the optical system, you get problematic back reflec-
tions. Even black surfaces can reflect light! Prevent this problem by
baffling the excess light. Threaded barrels can really make a difference.
Or, you can make your own light stop (see Tech Tip below).

Production
Finally, your design is nearing production. Although you started with
off-the-shelf components, most systems end up with some custom components.
Talk to your supplier: what should you expect in terms of price
and delivery of custom components?
Some common customizations, which won’t break your budget,
include edging lenses (to a smaller diameter) and resizing mirrors and
beamsplitters.
If you are considering ordering custom lenses, consider the quantity
you need. If you need at least 500 pieces of a single lens or doublet,
then custom lenses may make sense. If you need fewer pieces, off-the-
shelf lenses will probably be more economical.
Determining whether to order custom or off-the-shelf compound
CCD lenses is more complicated. If you need more than 250 pieces, a
custom lens can really make sense. One of the advantages of a custom
lens is that you can eliminate adjustable aperture stops and helical
focus, which reduces costs dramatically. In addition, most off-the-shelf
lenses turn out to be larger than if they were designed for a specific
application. So if you are working in a tight space (which this article
assumes), a custom lens can be a real advantage. If you have done your
homework, a good optical company can make ordering a compound
CCD lens painless and cost effective.
In addition to adding gimbals for adjusting mirrors, make the lenses
adjustable too. The more leeway you can provide in terms of focus
and alignment, the easier life on the production floor will be. At the
very least, you must provide for focus adjustments!

Tricks of the trade: When debugging your illumination prototype, keep
a flat mirror and chrome ball bearing on hand. These two surfaces
accurately show the location and intensity of your illumination sources
regardless of object surface characteristics.
Conclusion
While designing an optical system into the tight constraints of semi-
conductor equipment is rarely easy, we’ve provided both a strategy and
some tactics that will make it possible.

TECH TIP ON MAKING A LIGHT STOP


Many optical applications call for a non-reflective light stop to be inserted into the system.
A quick solution to this problem is to run the same piece of paper through the copy machine
multiple times with the document cover open and nothing on the glass. This produces a black
piece of paper. After several passes through, the buildup of toner will provide an excellent,
inexpensive, non-reflective light stop.


KEEPING A TIGHT FOCUS ON OPTICS

Recently a semiconductor capital equipment maker was designing a new
wire-bonding machine that included a vision system. Engineers there
knew the demands on the vision system were not particularly strenuous,
so they concentrated most of their efforts on the electronics and motion
control. They did, however, leave some space for the camera and optics.
That's when my company got involved. The OEM asked us to design
a vision system with certain multiple magnifications that could be
changed by the user in the field and which, of course, would fit in the
space available.
Problem: The engineers hadn't left enough space for the optics. Too
bad, because with better planning, we could have provided the required
magnifications with ease using optics that were both inexpensive and off-
the-shelf. As it was, the only alternative was a custom system that came
at some cost. Moreover, some of the specs had to be relaxed simply
because of the mechanical constraints.
Though the engineers involved had the right idea, they hadn't consid-
ered that the working distance for lower magnifications tends to be longer
than that for higher magnifications. And lack of sufficient room compli-
cated the focusing and compensation methods.
The irony was that optical specifications were not, in general, unreason-
able. They would have been easily met had we been able to place the lens-
es at our discretion.
The situation these engineers ran into is not at all uncommon in semi-
conductor manufacturing equipment. Few manufacturers in this area have
more than a single engineer in charge of optical systems. This despite the
fact that it may be unusually difficult to design these systems because they
must fit into the design after (and around) other systems.
Often, the integration of a vision system means snaking the optical
system through the equipment without interfering with the primary

process. This is particularly true for operations that include wirebonding,
die packaging, aligning wafers, and lining up registration marks before
lithography or metrology.
Optics are often the last systems that engineers design into their equipment,
but they can be crucial. The most cost-effective way to buy optics
depends both on volume and specifications of the system. The application
and not the manufacturer should determine these specifications. If you
take some time, early in the design process, to consider the requirements of
your system, you can fulfill them with the best balance of performance,
cost and yield available. The application dictates the system performance
requirements, which dictate the specifications. The system requirements
should be identified and established at the same time as the packaging
is being decided upon. Allowing the required envelope will reduce optical
performance and cost problems later on in the project.

Off-the-shelf or Custom?
The Design
With basic parameters nailed down, the next step is to work out a combi-
nation of focal lengths and object/image distances. The bad news is that
this usually involves calculating thin lens equations, which you probably
saw last in a college physics textbook. The good news is that you don't
have to do it yourself. Most modern optical companies have optical design
software that can quickly and easily provide a preliminary solution.
Unless the problem is extremely complex, this service is usually free.
This is the point before the design is finalized at which to consider


whether you need lenses custom-made or if off-the-shelf optics will do.


Custom lenses are almost always used to correct aberrations and/or meet
package requirements. But sometimes correction is not important, or
a combination of off-the-shelf and custom lenses will work as well.

Off-the-shelf vs. Custom


Economy of scale is everything when it comes to the price of optics,
because of how lenses are made. Low volume will always favor off-the-
shelf elements. But the advantages diminish as volume rises and other factors
take over. As a general rule of thumb, the custom approach makes
economic sense only when one needs several hundreds to thousands of
lenses. (As with any rule of thumb, there are always exceptions.) But if a
custom approach is absolutely necessary, it can be done at a reasonable cost
for quantities from 100 to 1,000 pieces and up.
Off-the-shelf optics are made in quantity, in continuing production,
and kept in stock by suppliers. These stock lenses are typically designed
into standard matrixes in a wide variety of sizes and focal lengths.
Prototype using off-the-shelf components: It is fast, inexpensive,
and lets you confirm image quality requirements. Moreover, custom
lenses are astronomically expensive in the small quantities normally
used for prototyping. Custom lenses made by traditional methods may
require long lead times. If one uses lenses made with deterministic
grinding and polishing, the lead-time is less but the cost will be high.
Of course, if prototyping shows the design must change, any benefit
from the expenses is lost.
Where are the breaks?
Both custom and off-the-shelf elements are viable options for between
1,000 and 100,000 pieces. No one stocks off-the-shelf lenses at such a
large volume without a forecast from a specific customer. But it's easier to
handle increases in volume with an off-the-shelf option, because there is
less risk in overstocking these lenses than in overstocking a custom lens.
However, if a custom lens can reduce the number of elements in the
design, this approach becomes even more cost-effective at such volumes.
Custom lenses are almost always the choice above 100,000 pieces. At
these volumes, the benefits of eliminating elements become more pronounced.
There are real economies of scale. And the 100,000-piece level
is a significant breakpoint: The cost per piece for 200,000 lenses is not
significantly less than that for 100,000.

FIGURE 1: Fundamental parameters of an imaging system.

Other factors
Additional factors that govern the choice between stock and custom lenses
include whether or not designs need minimal weight, small size, tight
tolerances, or strenuous specifications for the focal length or aberration
correction. If the design is already complete and assumes custom lenses,
changes to incorporate off-the-shelf lenses may be expensive. Changing
the lens inevitably means different mounting to accommodate any
changes in focus. Even lenses with identical focal lengths can mount differently
because a change in radius alters where the lens must sit. All in
all, the cost for engineering these changes may outweigh the savings of an
off-the-shelf lens.
Though custom lenses can reduce the number of elements in a design,
this may or may not cut costs. Additional elements, however, inevitably
add weight to the system. Most off-the-shelf lenses are larger than if
designed specifically for an application, as the components are meant to

cover a broad base of applications.


In a tight space, a custom lens can be a real advantage. Another con-
sideration is tolerancing. Tolerance stack-up can accompany the use of
numerous elements (rather than custom optics) to correct aberrations. A
marked decrease in performance results. In addition, some results require
that a certain element have a specific tolerance that may not be standard
for off-the-shelf versions.
There are also situations that demand a very specific focal length and
there is just no easy way to get around it. Some designs may need a spe-
cific form of optics, such as a meniscus lens, to correct aberrations; these
types of lenses are often not available off-the-shelf.
Finally, there are many ways of customizing off-the-shelf elements in
lieu of going full custom. Some of the most widely used include edging
down a lens component, cutting it to a specific size, or custom coating
it.
The easiest customization is changing the diameter of a stock lens ele-
ment. It is easy to edge down or cut a lens element even in small vol-
ume. This is often important for mounting in an existing housing or in
cramped quarters. “Edge downs” can be quick and inexpensive. Special
coatings are frequently a motivation for a custom lens. Sometimes designs
require low reflectance at a specific wavelength or an antireflection coat-
ing in the UV or near-IR range. Lens suppliers are accustomed to fielding
requests for special coatings on batches of uncoated lenses. As with edge
downs, the costs are usually quite low and the lead time is short. The costs
are reasonable depending on the lot size and the turnaround time
required.
Before signing a PO, take some time to think about your system’s min-
imal requirements, and how you can fulfill them in the most effective
manner. This includes some design considerations, an understanding of
imaging in general and an understanding of your specific application
needs. This understanding of the design elements, in turn, will affect the
decision to design or buy with custom-made or off-the-shelf optics. These
decisions may have weighted factors and need to be evaluated early in the
design cycle to be effective.

FIGURE 2: An assortment of custom and off-the-shelf optics and optical
assemblies available from Edmund Optics.
DESIGNING A VISION SYSTEM TO MEET YOUR SPACE CONSTRAINTS

Despite the fact that imaging systems are critically important for many
types of machinery, optical design is often addressed only after the
mechanical and other system designs have been completed.
Consequently, imaging systems have to be fit into the available volume,
which can often be tiny and/or awkwardly shaped.
Because integrating optical systems on paper always seems to be
easier than it is in actual applications, many systems are simply put
together by trial and error. Although this may work in the lab, it can be
disastrous when the system is integrated into a piece of equipment.


This article outlines a series of steps that allow system designers to
integrate a workable imaging system into a too-small box without rig-
orous engineering.
The six steps are:
1. Define your mechanical constraints.
2. Define your fundamental parameters.
3. Lay out the straight-line imaging system.
4. Place the illumination and determine minimum f/#.
5. Compare optical design with mechanical constraints.
6. Bend the system.
Consider a typical imaging system that requires integration into a
confined space. For example, Figure 1 shows the object that needs to be
imaged. The current picture is not sufficient because it doesn't show
enough of the object to register indentation locations.

FIGURE 1: The final imaging system must be able to discriminate between
acceptable and unacceptable characteristics of objects, such as the one
shown here.
Our goal is to build a system using off-the-shelf parts (to keep costs
down) and have as few bends in the system as possible (for simplicity
and to minimize the number of components). The resolution of the final
system must be sufficient to discriminate between acceptable and unac-
ceptable characteristics. It must also measure the location and size of
indentations in shiny objects that are roughly 20mm in diameter.

Building custom components can be considered after this is complete.

Constraints & Parameters


Designers must ask themselves a number of questions in order to determine
the imaging systems' mechanical constraints and fundamental
parameters. How big is the box? What space, exactly, is available for the
imaging system?
Several numbers must be determined at this point. For instance, the
available track, or the length of the space allotted for the optics. The
length of the camera, lens, and cables should be included in this measurement.
For most systems, room for illumination will also need to be
incorporated. Also, estimate how many bends are needed in the system.
An example box is shown in Figure 2.
The fundamental parameters (see Figure 3) of any imaging system
include:
• Object Field of View
• Working Distance
• Object Resolution
• Sensor Size
• Depth of Field

FIGURE 2: The example imaging system must fit within this oddly shaped box.

When looking at fundamental parameters don't forget the basics. Be
sure to keep in mind that the working distance often depends on
mechanical constraints. When selecting the object resolution, ask your-
self what size defect the system will be required to measure. Remember

continued >

www.edmundoptics.com 800.363.1992
Edmund Optics Inc. USA | UK | Germany | Japan | Singapore | China

that both horizontal and vertical measurements are important for deter-
mining sensor size.
For this example, we’re looking at a 20 x 25mm shiny metal object.
We are measuring location and size of indentations. High resolution is
necessary to maximize accuracies, ideally 10 line pairs per millimeter
(lp/mm). We need only a fairly narrow depth of field of about 5mm to
accommodate the depth of the indentations.
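The arithmetic behind these numbers is easy to sanity-check. The short sketch below is a hedged sizing aid that uses only the values quoted above and the usual Nyquist rule of at least two pixels per line pair; the Nyquist factor is an assumption, not a figure from this article.

```python
# Back-of-the-envelope pixel budget for the example object. The Nyquist
# factor of two pixels per line pair is a common sampling assumption,
# not a figure taken from the article.

def min_pixels(field_of_view_mm, object_resolution_lp_per_mm, nyquist_factor=2):
    """Minimum pixel count along one axis to sample the stated resolution."""
    return field_of_view_mm * object_resolution_lp_per_mm * nyquist_factor

fov_h_mm, fov_v_mm = 25.0, 20.0     # the 20 x 25mm shiny metal object
target_lp_per_mm = 10.0             # desired object resolution

print(min_pixels(fov_h_mm, target_lp_per_mm))   # 500.0 pixels horizontally
print(min_pixels(fov_v_mm, target_lp_per_mm))   # 400.0 pixels vertically
```

Any sensor of roughly VGA resolution or better comfortably covers this budget, which is consistent with the modest camera chosen for the example below.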

Straight Line & Illumination Layout


With the optical parameters in hand, it is time to start designing the optical system. This article does not cover the basics of choosing components, but it does assume that the design uses off-the-shelf components.
First, lay out the system in a straight line and check that the design works. Are the fundamental parameters of the system equal to your needs? Where do you need to place the illumination? Determine the clear apertures. Be ready to repeat this step as bends in the system are introduced. Also, determine how sensitive the system is to adjustment. A system that is sensitive to focus and alignment can add cost and complexity later on.
For the example system, we found parts and decided that a camera with a 1/2-inch-format sensor could be used. The lenses have a focal length of 50mm, working distance of 250mm, field of view of 28.5mm, object resolution of 11 lp/mm, and a depth of field of less than 5mm (see Figure 4). The fact that it does not fit into the box provided can be ignored for the moment.

FIGURE 3: This diagram illustrates the five fundamental parameters of an imaging system.

Now is the time to determine the minimum f/#. The f/# is a measure of the light-gathering ability of an optical system. For a lens, the f/# is the focal length divided by the aperture diameter.
Usually changing the aperture is the simplest way to adjust the f/#, but the aperture also alters the resolution and the amount of light getting through. Changing the aperture can also affect the system's depth of field. Sometimes, changing the aperture will allow you to switch to a significantly smaller lens.
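As a rough illustration of those trade-offs, the sketch below applies thin-lens definitions only: f/# as focal length over aperture diameter, light throughput scaling as 1/(f/#)^2, and an approximate depth of field that grows with f/#. The 0.02mm blur criterion and the example aperture values are assumptions for illustration, not values from the article.

```python
# A minimal sketch of the f/# bookkeeping, assuming thin-lens relations only.

def f_number(focal_length_mm, aperture_diameter_mm):
    return focal_length_mm / aperture_diameter_mm

def relative_light(f_no, reference_f_no):
    """Light throughput relative to a reference aperture setting."""
    return (reference_f_no / f_no) ** 2

def approx_depth_of_field_mm(f_no, magnification, blur_spot_mm=0.02):
    """Symmetric thin-lens DOF approximation: 2 * N * c * (1 + m) / m^2."""
    return 2 * f_no * blur_spot_mm * (1 + magnification) / magnification ** 2

n = f_number(50.0, 12.5)                      # a 50mm lens with a 12.5mm aperture -> f/4
print(n)                                      # 4.0
print(relative_light(8.0, n))                 # stopping down to f/8 passes only 1/4 the light
print(approx_depth_of_field_mm(8.0, 0.25))    # DOF estimate (mm) at f/8 and 0.25x magnification
```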
The f/# is also intertwined with the illumination. If the illumination must be brighter because the aperture has decreased, will the light bulb be driven too hard, and will the resulting lifetime be too short? Meanwhile, what sort of illumination is required: point-source, diffuse, or ring light, and normal or glancing? Some objects can be lit from behind, providing a bright field. Others may benefit from illumination by line generators or other types of structured light.
For our example, we chose diffuse illumination. To reduce specularity in the image we need to make sure the image of the illuminator appears larger than the object; in other words, the illumination must cover the entire object.

FIGURE 4: After determining the fundamental parameters of the system, a working distance, sensor size, and field of view can be determined.

Reality Check

Now that we have a working design, it's time to compare the optical system to the space available for it. Does it fit? (Almost never.)
Before trying to bend the optical path, there are three tactics you can try. First, try reducing the camera size. Either a board-level camera or a remote-head camera can ease the space constraints. Next, consider whether there is a different combination of focal lengths and working distances that could be used. If the system uses fiber optic cables to deliver the illumination, can they be bent at the connection joint? Or, perhaps, compact but typically monochromatic LED illumination would work?

Bending the System


Finally, there is no choice but to bend the system. There are a number of considerations at this point. Where will the bends be located along the optical path? Clear apertures must be defined, as well as whether to use prisms and mirrors, beamsplitters, or baffles. The mounting and adjustment of each of these elements should also be considered now in order to avoid expense later.
Locating bends is relatively straightforward. Look at the straight-line design, and locate the bends where there is space for a mirror or prism as well as a need to relocate components, such as a lamp or camera.
The straight-line system can also help define clear apertures. Make sure to calculate entrance and exit apertures if prisms are used to bend the system rather than mirrors. Using elliptical or rectangular clear apertures (rather than circular ones) may save space.
When choosing between mirrors and prisms, consider their strengths and drawbacks. Mirrors offer high reflectivity, wide spectral range, minimal image degradation (if mounted properly), and a low cost versus size ratio. However, mirrors can be difficult to clean and align, and susceptible to mounting tension. Precision mounting is costly.
Prisms, on the other hand, are easy to mount, durable, can be designed for easy alignment, and can isolate the optical system from the environment. However, they also cost more (for their size) than mirrors, and are heavy. Other drawbacks include that the image is degraded by the glass's thickness, and that the prism faces reflect light. Reflected light can be removed using baffling.
Similar considerations accompany the different types of beamsplitters: cube, mirror, or pellicle. In our example, we chose a beamsplitter to direct the illumination onto the object while allowing the image to pass through (see Figure 5).

FIGURE 5: The final system fits within the given box while utilizing a minimum number of bends.

As illustrated in Figure 5, light from the diffuse axial illuminator (shown in yellow) is reflected by the beamsplitter, illuminating the object. Reflected light carrying the image (shown in blue) passes through the beamsplitter. The image light bounces off a front-surface mirror and arrives at the camera. The entire system fits within the box, employing a minimum number of bends. In the end, there is usually adequate room in an overall design for the optical system – even if it does not seem like it at the beginning of the process.


INFRARED IMAGING: THERMAL VERSUS NEAR-IR

How do you see the unseen? For some machine vision applications, the objects that can be seen with the naked eye aren't sufficient. Low light conditions or poor contrast can be reasons to consider vision systems that image beyond the visible spectrum. Certain types of data can be obtained by imaging in the IR that cannot be seen by standard vision systems that operate in the visible, such as heat emission and some materials defects. Extending into the IR can offer a number of advantages over purely visual imaging, but the different parts of this region offer different benefits and demand different requirements.
The near-IR and thermal IR regions both lie within the infrared spectrum, but their wavelengths are an order of magnitude apart. For perspective, consider that the entire visible spectrum covers only wavelengths from about 400 to 650nm. All the wavelengths from 650nm to 14,000nm, however, are considered infrared (see Figure 1).

FIGURE 1: The electromagnetic spectrum includes the small range of wavelengths visible to the human eye, as well as the larger range of the infrared, which includes the near-IR (near to the visible) and the thermal IR.

This article presents a primer on these differences: the unique advantages provided by imaging in the near-IR or the thermal band; the lenses, light sources, and detectors suited to each; and the engineering concerns specific to these two parts of the spectrum.
Machine vision applications
The thermal IR lets you see heat. Very few objects glow “white hot” or
“red hot” – ie, with enough thermal radiation to be seen with the naked
eye. But all objects give off some heat, and imaging systems sensitive
to thermal IR can see the blackbody radiation of objects in the common
range of temperatures.
Thermal IR cameras can distinguish between a sun-warmed sidewalk and cool grass, between a car with a hot engine and one that has been parked for hours. For machine vision applications, thermal cameras have been put to use to distinguish hot-spots on PC boards, monitor ambient temperatures in applications that demand stability, and monitor process control applications in which heat emission is important. Solutions have included machine vision systems for the automotive industry, in which thermal cameras have been used to inspect heat/humidity controlled seating. In the steel industry, where high temperature environments are common, thermal imaging has been utilized to monitor ladles, boiler tubes, and transformers to keep them from overheating. Thermal imaging systems have even served as an important part of the poultry industry, where monitoring of embryos in eggs can determine the difference between a dead egg and a live egg. Thermal imaging is a growing field, aided by developments in imaging and computing equipment, as well as their falling prices.
Near-IR imaging, on the other hand, is more like an extension to vis-
ible imaging. Instead of the imaging range stopping in the red around
600nm, it extends out to as far as 1600nm. Near-IR imaging can be used
in low-light environments, to image materials that fluoresce in the near-
IR but not in the visible, or for other reasons. A pharmaceutical manu-
facturer, for example, used near-IR imaging to determine whether its
pills were in their packaging, because the near-IR contrast proved bet-
ter than visible contrast. Near-IR imaging has seen an increase in use
especially in the agriculture and food industries. It has proven to be a
useful resource when inspecting for rotten fruit and vegetables. By
using cameras that combine both the visible and near-IR spectrums, the
contrast between a bruised and clean surface can be improved dramati-

cally. Toxins and defects can also be found in meat, such as poultry.
A number of applications are found through trial-and-error, by users
testing to see whether near-IR imaging would provide advantages over
visible-only imaging. The barriers to moving into the near-IR are low.

Equipment
Thermal IR uses materials beyond the range of standard optical materials such as fused silica or sapphire. Instead, thermal IR systems must use crystalline optics like germanium, calcium fluoride, or magnesium fluoride (see Figure 2). The detectors for IR cameras tend to be vanadium oxide focal plane arrays. In the past, the detectors needed to be cooled, but now both cooled and uncooled versions are available. In fact, the capabilities of thermal IR cameras are becoming more and more like those of visible cameras, including onboard menus, trigger I/Os, and analog/digital video outputs. Thermal IR systems typically have no need for additional light sources, because they detect radiation from objects in the field of view.

FIGURE 2: Optics for thermal IR cameras use crystalline optics such as germanium, calcium fluoride or magnesium fluoride. Users have fewer choices in optics than they do with near-IR systems.

Near-IR cameras can use many of the same detectors and optics made for visible applications, although filtering out the visible light becomes important. In fact, some monochrome CCDs can be manufactured to have higher efficiencies in the near-IR, through around 1µm. Some detectors with phosphor coatings enhance response farther into the IR, up to 1.6µm, at the expense of resolution. InGaAs focal plane arrays are sometimes used for near-IR imaging, although these are geared more towards laser-beam profiling applications. Light sources tend to be LEDs in the range from 870nm to 1.5µm. Diode lasers are occasionally used for structured illumination. Many of the optics designed for visible applications can be used for near-IR with an appropriate anti-reflection coating (see Figure 3).


Building systems
The ability to use many visible-light components means that optics for near-IR imaging systems are relatively inexpensive, and many options are available for different requirements. For example, high-magnification lenses are available for the near-IR, and in a pinch one can use a lens designed for the visible, although with corresponding loss of efficiency.

FIGURE 3: An appropriate antireflection coating allows the optics made for visible imaging applications to be used for near-IR imaging. (The plot shows percent reflection versus wavelength, 200-1600nm, for Telecom-NIR on LaSFN9, NIR II on LaKN22, NIR I on LaSFN9, and VIS-NIR on BK7 coatings.)

Thermal IR fills a different need than near-IR, thus justifying the use of equipment that is more expensive for applications that need to image in that region. Technical advances have resulted in both price and size reductions in recent years, so that now one can put together a thermal IR system, including a camera about the same size as a visible camera, for less than $10,000. Because of the limited materials available for the lenses, users don't have as many choices in optics as they do for near-IR systems. In some cases, lenses may need to be custom-designed. For example, high-magnification lenses for thermal-IR systems are uncommon.
In use, thermal IR systems are more sensitive to environmental con-
ditions, such as humidity and temperature, than near-IR systems due to
the materials involved in the components. Manufacturers, however,
have already addressed most of these concerns through advances in
housing and packaging.
When a machine vision application requires seeing the unseen,
either near-IR or thermal-IR systems may be the answer, and both can
be cost-effective solutions.

MACHINE VISION FOR AUTOMATED FOOD MANUFACTURING

Machine vision has evolved with - and arguably driven - automated food manufacturing. From material analysis to content control, vision systems comprised of cameras, lighting and software have streamlined production and reduced costs throughout the industry. Vision technology has become so pervasive, in fact, that manufacturers cannot gain a competitive edge simply by integrating cameras in their production line. Instead, they must decipher which vision systems will deliver the best cost/performance ratio for their particular operation.
This can intimidate manufacturers unfamiliar with the jigsaw puzzle of component technologies that shape these systems. Cameras, light sources and optics of varying stripes present a dizzying range of choices that become more complicated when one realizes that selection of one component often influences and is influenced by the selection of another. The food industry's diversity of processing and packaging applications doesn't help matters. There are, however, some rules of thumb guiding most selection processes that ensure a completed system's performance and cost will closely match an application's needs:
• Identify the application's tolerances - those details that determine whether an object passes or fails inspection.
• Select the most critical component first and buy only the performance you need.
• Select components that are compatible with each other. It is wasteful, for instance, to apply a large-format CCD camera with a short focal length/low-magnification lens that may not fill the entire format of the CCD, failing to take full advantage of the camera's resolution.
• Plan ahead. If a product, process or packaging undergoes frequent
changes, select components that easily adapt to new parameters or
switch out with modular alternatives.
Despite the variety of both components and food manufacturing applications, most vision systems confront three basic challenges: analyzing portions and/or quality, distinguishing flaws from acceptable features, and controlling image quality to ensure the accuracy of inspection data. Again, there is no single universal solution to these challenges, but three applications, examined hereafter, illustrate these problems and how machine vision engineers approach them.

FIGURE 1: Backlighting demonstrates how a simple rearrangement of a vision system's illumination source can create sharp improvements in imaging performance. By positioning the light source and camera on opposite sides of an object, backlighting enhances edge contrast and helps software edge tools detect misaligned caps or substandard fill levels.

Portioning and analysis

Portioning applications range from very simple vision operations, such as inspecting precut contents of packaged lunch trays, to very complex
analysis, such as ensuring chocolate chip cookies have a certain size,
shape, color and chocolate content without appearing to have been too
"manufactured."
Fortunately, most of these applications are simple enough that rela-
tively low-cost monochrome cameras serve the purpose. Monochrome
cameras deliver better resolution, signal-to-noise ratio, light sensitivity
and contrast than similarly priced color cameras. Their sensor element
compiles grayscale images from thousands of pixels that assign numer-
ic values to the amount of incident light they collect - zero representing
black, the highest number representing white and every number in
between representing a shade of gray. Also, due to the way that single-
chip color cameras interpolate color information, monochrome cameras
deliver 10 percent more resolution than comparable single-chip color
cameras.
A simple histogram of a high-contrast monochrome image can easily confirm that the sections of a dinner tray are full, empty or somewhere in between. Furthermore, straightforward software analysis can easily determine the size and shape of objects in the image.
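A minimal sketch of that kind of histogram-style check is shown below; the brightness threshold and the full/empty fractions are illustrative placeholders, not values from this article.

```python
# A hedged illustration of a histogram-based fill check: count how much of a
# tray-section region falls above a brightness threshold.
import numpy as np

def fill_fraction(gray_region, threshold=128):
    """Fraction of pixels brighter than the threshold in an 8-bit image region."""
    return float(np.mean(gray_region > threshold))

def classify_section(gray_region, full_above=0.9, empty_below=0.1):
    frac = fill_fraction(gray_region)
    if frac >= full_above:
        return "full"
    if frac <= empty_below:
        return "empty"
    return "partial"

# Synthetic 100 x 100 section: roughly 60% bright "food" pixels on a dark tray.
rng = np.random.default_rng(0)
section = np.where(rng.random((100, 100)) < 0.6, 200, 30).astype(np.uint8)
print(classify_section(section))    # -> "partial"
```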
Filtering either the light source or the camera optics can further
enable monochrome imagers to perform simple color differentiation, as
long as only one color needs to be separated out. Slices of red pepper-
oni, for instance, are distinguishable from portions of yellow cheese and
pizza shells, even if all of them are of similar size and shape. Capturing
monochrome images through a color filter will make the red pepperoni
appear black and the yellowish foods white. It can also help increase
contrast in the image.
Larger challenges arise when a vision system must identify two or
more colors, or distinguish two different shades of a single color.
Examples include foods that discolor over time, multicolored products
and packaging.
Using two monochrome cameras with separate filters is often more
cost-effective than applying color imaging. Single-chip color cameras
serve well with help from more advanced algorithms that separate mul-
tiple colors within an image. This is extremely useful for separating red
from orange, blue from purple, and even variations of shades of a spe-
cific color. However, ensuring strict color accuracy or inspecting multi-
colored items could require a three-chip camera (also called 3-CCD or
RGB cameras).
These devices separate image data into red, green and blue signals,
and send each to a separate chip within the camera. This produces bet-
ter color depth in an image than the interpolated color information pro-
duced by a single-chip camera. Three-chip cameras offer the best com-
bination of both color and resolution, yielding excellent spatial resolu-
tion and dynamic range. This allows colors of interest to be analyzed at
finer levels of detail and can detect slight variations that alternate approaches cannot accurately capture. Their performance, however,
comes at a higher price than single-chip color cameras based on
NTSC/PAL and Y-C formats and applies largely to more demanding
applications.
FIGURE 2: Low-cost monochrome cameras deliver higher resolution and faster processing than comparably priced color cameras, making black and white images more than capable of handling most portioning inspection applications. In these images, software analysis easily determines the size, shape and placement of objects in the image, enabling simple histograms to confirm that the sections of a dinner tray are full, empty or packed incorrectly. With the addition of color filters, monochrome cameras can even single out objects of a specific color.

Features and flaws

Fruit inspection has become increasingly automated thanks to a number of vision innovations - innovations that also further illustrate color analysis applications. Many fruits, for example, display natural shadings and color variations as well as features such as stems and navels that are difficult to distinguish from bruises, blemishes and punctures.
Detecting blemishes on uniformly colored fruit, such as oranges, is simpler. The filtering techniques described earlier - specifically, applying filters or light of a specific wavelength - disclose most blemishes and some punctures in a monochrome image. In any case, the camera should be capable of at least 640 x 480-pixel resolution, which would detect defects measuring roughly 1 percent of the orange's surface.
Inspecting apples or multicolored fruit for discolorations is more
problematic. Here, three-chip cameras and high-intensity, white light
sources are preferable, but this approach may require pattern recogni-
tion software capable of analyzing complex color distributions. Another
possibility is to use near-infrared cameras and illumination, which can
sometimes distinguish blemishes from colorations visible only in the
visible spectrum. Both of these solutions, although effective, will raise
the system's overall cost.
Distinguishing small punctures from natural features is another task.
Since navels and stems appear in the same general area on a fruit and often display a similar size and shape, standard monochrome cameras again might provide an inexpensive solution with help from basic histograms.
As diverse and adaptable as cameras and software are, however, they
cannot solve all imaging problems. As even amateur photographers
know, good cameras can deliver bad images, and the most common cul-
prit in vision applications is poor illumination.
Reflective materials, for example, can pose trouble for even high-end vision systems, whether the camera is inspecting food portions or examining fruit. Food trays may be wrapped in cellophane or made of
foil. Fruit may be wet or transported on a metallic belt. In all events,
poor illumination of reflective elements can cause blooming, hot spots
or shadowing in an image and hide important information or cause false
edge calculations. Non-uniform lighting can also detract from signal-to-
noise ratios and make tasks such as thresholding more difficult. On the
positive side, however, skillful illumination schemes can raise the over-
all cost/performance of a vision system, without the need for high-end
detectors, imaging lenses and software.

Bottle stoppers
Good illumination is essential in bottling applications, where vision
systems inspect bottle integrity, ensure that labels are properly applied
and check cap alignment, fill levels and content purity. Given the dif-
ferent needs of each step, these functions would likely be divided into
three stages. All could use standard cameras but would require different
optical and illumination configurations.
Preventing good product from going into bad bottles requires vision
systems to inspect bottles for chipped tops and cracks. This is within the
capability of standard cameras equipped with telecentric lenses and

positioned over the bottles. However, lighting can become very com-
plex depending on the design, shape and material from which the bottle
is constructed. Many of these applications benefit from diffuse light
sources or oblique lighting techniques. Often, direct lighting should be
avoided around reflective surfaces or clear surfaces that can produce
glare. But directing light along the same axis as the camera's view actu-
ally induces a consistent reflection that highlights defects in the image
of a bottle top. This approach allows blob analysis to identify defects as
irregular white or dark areas.
Standard cameras equipped with standard or telecentric optics can also check cap alignment and fill level and detect foreign objects. Diffuse, backlit illumination also serves, although the light source and camera are typically positioned on opposite sides of the bottles. This backlighting arrangement enhances edge contrast in the image and aids software edge tools in detecting misaligned caps or substandard fill levels.

FIGURE 3: Darkfield imaging is another example of effective illumination. Angling the light source slightly off the camera's axis enables clear images of highly reflective objects or objects under cellophane, as demonstrated by the image on the bottom. If not angled correctly, however, the light source may cause blooming, such as what appears in the image on the top.

Verifying correct alignment and application of labels is also possible with standard cameras and optics, plus a strobed LED light source to freeze objects in motion at high rates of speed. If the labels appear metallic, then a diffuse light source may serve better.
Highly reflective bottles or labels, however, might require darkfield
illumination. More of an approach than a product, this illumination
scheme simply angles the lighting so that it reflects at an angle the cam-
era does not capture, making items such as edges or defects appear
brightly lit against a dark background. Besides imaging bottles, dark-
field illumination is also effective when examining food through cello-
phane or other reflective films.

Finding a supplier
The principles of specifying and implementing machine vision are sim-
pler to grasp than the details, which is why selecting a quality supplier is
essential to getting the performance you need on the factory floor.
It is important to work with established suppliers that can either
demonstrate proven commercial experience in food manufacturing or a
broad enough range of experience to undertake new applications in this
sector. Mature products - those that have been on the market for at least
a year or two - are more reliable. However, applications that require


higher performance or a longer service life should consider the newest
generation of component technology.
The most effective suppliers have active research and development
departments to keep up with rapid advances and emerging applications
in machine vision. Suppliers should also offer technical support after
the sale is made.

TECH TIP ON USING ILLUMINATION


There are many things to consider from an optical perspective when designing the illumination for a machine
vision system. Some are as simple as the use of low-cost optical filters to increase contrast or to reduce unwant-
ed lighting effects. Others stem from radiometric principles related to the lens design used. Doing some upfront
legwork will increase system capability and reliability, while helping to reduce the likelihood of undesirable
results and delays in start-ups or rollouts.
One of the simplest ways to increase a system's capability is to employ color or polarizing filters on either
the light source or the lens itself. The goal is to boost the contrast levels between objects under inspection, even
with gray-scale imaging systems, and thus to yield an image that is easier to process.
Now consider higher-level optical issues that can affect the illumination strategy. Over the past few years,
the availability of higher-resolution cameras with larger formats for both area- and line-scan applications has
steadily grown. As camera sensor formats increase in size, it can become more and more difficult to cover the
imaging chip with even, relative illumination.
This effect is seen most dramatically when it is necessary to obtain large fields of view at relatively short
working distances, which frequently is the case with applications such as high-speed web inspection. Basically,
what occurs is a change in intensity of lighting, starting in the middle of the image and rolling off to the edges
of the field of view, so that edges and corners of the image appear darker than the center. This can lead to pro-
cessing problems if the light levels dip too low to produce good relative contrast throughout the image.
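One common first-order model for this roll-off is the cos^4 falloff of relative illumination with field angle. The sketch below only illustrates that model; it is not a description of any particular lens or camera.

```python
# Relative illumination versus field angle under the cos^4 model (illustrative).
import math

def relative_illumination(field_angle_deg):
    return math.cos(math.radians(field_angle_deg)) ** 4

for angle in (0, 10, 20, 30):
    print(f"{angle:2d} deg off-axis: {relative_illumination(angle):.2f} of center brightness")
# A 30-degree half field - typical of a large field of view at a short working
# distance - already loses over 40% of the center illumination under this model.
```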
The good news is that several compensation strategies are available – some more costly than others. One
option is to use a camera that can automatically adjust gain across the field by turning up the intensity in areas
where the light level may be lower. The potential problem here is that turning up the gain usually increases sys-
tem noise. It also may be possible to adjust the lighting levels across the object to match the lens characteris-
tics. Another technique is to filter the original source so that lighting in the center of the image is lowered, and
the illumination is increased going toward the edges. A final option would be to use what is known as an image
space telecentric lens to help optically compensate for these issues.
Ultimately, in many applications the goal should be to examine the lighting and optical systems as early
as possible when designing the inspection project. This will not only boost machine vision performance, but
also eliminate illumination problems that could later stop a project in its tracks.

USING LIGHT TO READ THE CODE OF LIFE

The DNA sequencing field is competitive and fast-moving, partly because so much data must be obtained, and partly because the stakes in the biomedical industry are so high. An optical imaging system is an integral part of most sequencing systems. An incredible amount of proprietary work focuses on optimizing setups for efficiency, accuracy, and cost effectiveness. However, most DNA sequencing systems work on the same general principles.
DNA is a long double-helix molecule made of four nucleotides. In order to "read" the DNA, researchers must figure out the sequence in which the nucleotides are arranged. We describe how DNA is fragmented, tagged with fluorescent molecules, and how the fragments race across gel lanes. We describe the basic optical system required to excite and collect fluorescence from gels. As with most optical systems, the DNA sequencers require robust design and tradeoffs between speed, price, and accuracy.
sequencers require robust design and tradeoffs between speed, price, and
accuracy.

Intro
The newest advances in biotechnology require the ability to read DNA -
or rather, to read the sequence of the four bases (aka nucleotides) that
make up DNA. The ability to sequence segments of DNA accurately and
quickly provides a clear advantage to the researcher. Therefore, the mak-
ers of DNA sequencers put an incredible amount of proprietary work into
optimizing setups for efficiency, accuracy, and cost effectiveness.
DNA can be sequenced in a number of ways. However, most DNA
sequencing systems work on the same general principles. To read the
sequence of bases, the DNA molecule is fragmented and tags are added to
identify the ending. The fragments are separated by size and then the tags
are read.
Before sequencing, if there is only a small sample of DNA available,

it may be amplified, using a polymerase chain reaction (PCR). An enzyme
(DNA polymerase) uses single-stranded DNA as a template to make a
new, complementary strand from a stew of available nucleotides. The
process can be repeated over and over again.
DNA SEQUENCING – A General Overview (diagram): 1) DNA Sampling, 2) Gel Generation (Amplification/Electrophoresis), 3) Data Acquisition, 4) Data Analysis (Software/Algorithm), 5) Data Output.

Similarly, PCR-based sequencing can copy the original DNA strand - with one important difference. Along with the pure nucleotides, the solution in which the process takes place contains dideoxynucleotides. These dideoxynucleotides lack a hydroxyl group, which means that no more nucleotides can attach to the strand. Each of the four dideoxynu-
cleotides (one for each type of nucleotide) also has a fluorescent mole-
cule attached. Because each of the four different kinds of dideoxynu-
cleotides fluoresces a different color, the base at the end of the chain can
be identified.
After this reaction ends, the solution is full of DNA fragments, of many
different lengths and with four different endings. If one shone a low-energy
laser at the solution and imaged the resulting fluorescence, four different
colors would be apparent, but the image still wouldn't provide information
about the sequence of the nucleotides - for that, the fragments must be sort-
ed by size. This can be done by taking advantage of the molecules' electri-
cal charge.
Electrophoresis
The solution is introduced onto an agarose gel with an electric potential. The negatively charged DNA migrates across the gel toward the positive terminal. Not surprisingly, the smallest segments move the fastest and farthest down the gel "lane". This process takes some time, but it does sort the fragments by length. The gels can be "read" by lab technicians under ultraviolet light, but it is more efficient to automate the reading. If one shines the laser at the gel and captures the image of the fluorescent tags, the spatial sequence of the colored tags correlates to the sequence of the bases in the DNA (see Figure 1).
Ideally, the system would capture an image quickly that could distinguish fragments very close together with inexpensive imaging equipment. As with most optical systems, however, the design of DNA sequencers balances tradeoffs between accuracy, speed, and price.

FIGURE 1: Fundamental parameters of an imaging system include the resolution of the object, the field of view, and the depth of field that the user wishes to image. The working distance, from the object to the lens, is also important, as is the sensor size. The primary magnification is the sensor size divided by the field of view.

Optics
Every imaging system has the following elements:
• Field of View: The portion of the gel that fills the camera's sensor.
• Working Distance: The distance from the front of the lens to the gel.
• Resolution: The minimum feature size of interest on the gel. The resolution should be sufficient to differentiate between wells and the spaces between fragments separated by electrophoresis.
• Depth of Field (DOF): The maximum object depth that can be maintained entirely in focus. The DOF is also the amount of object movement (in and out of focus) allowable while maintaining an acceptable focus. The DOF is not typically critical for these systems.
• Numerical Aperture (NA): The light-gathering ability of an imaging lens - important because this is a low-light application. The NA is inversely proportional to the f/# of the lens.
• Sensor Size: The size of the sensor's active area. This parameter is important in determining the proper lens magnification to obtain the desired field of view.

In the most general terms, the imaging system must also:


• provide a light source (a laser, in this case)
• be able to move and focus
• collect, detect and output optical data to a computer

Algorithms
Once you've collected the raw data, how do you process it? Algorithms are
tailored to specific systems, but all have to tangle with the same issues:
• Resolving background and peaks
• Contrast (and resolving adjacent bases that are identical)

• Pattern recognition
Basically, the algorithm must be able to distinguish fluorescing tags from
the background. It must also be able to separate one tag from the next: when
the tags are for different bases, they are easier to distinguish, but what about
when a base repeats? The tags are imaged as intensity peaks - in short, they
are somewhat fuzzy colored dots.
The optics of the system, and the algorithm used to interpret the
image, dictate the minimum distance between tags (at the time of imag-
BEST OF EDMUND OPTICS™ APPLICATION NOTES

ing) for consecutive nucleotides. This, in turn, dictates how long the
electrophoresis will take: the closer the tags can be, the shorter the time.
And the shorter the time until an acceptably accurate result is obtained,
the better.
Contrast is often used as a normalized metric to describe the limit
beyond which the system cannot resolve a signal from a background.
Contrast is measured at different resolutions. The system must be able to
determine whether a block of a single color indicates one, two, or sev-
eral peaks.
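A minimal sketch of one such normalized contrast measure (Michelson contrast, a common choice) applied to a one-dimensional line profile with two tag peaks is shown below; the peak spacing, widths, and background level are illustrative, not sequencer data.

```python
# Michelson contrast between a peak and the valley separating two nearby peaks.
import numpy as np

def michelson_contrast(i_max, i_min):
    return (i_max - i_min) / (i_max + i_min)

x = np.linspace(0.0, 20.0, 401)
profile = 0.1 + np.exp(-(x - 8.0) ** 2 / 2.0) + np.exp(-(x - 12.0) ** 2 / 2.0)

peak = profile.max()
valley = profile[(x > 9.5) & (x < 10.5)].min()   # dip between the two peaks
print(michelson_contrast(peak, valley))          # falls toward 0 as the peaks merge
```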
PEAK PROXIMITY EFFECTS ON CONTRAST (diagram): The effect of lowering the peak separation only becomes evident as the peaks are blurred by non-ideal components within the system. In an ideal system, identical contrast levels result as the peaks are placed closer together. In a non-ideal system, the contrast level is comparable to the ideal case when the peaks are far apart, but as the peaks come closer together they blend into each other; this breaks down the overall contrast and can make the central peak impossible to discern.

THRESHOLD TECHNIQUES FOR IMAGES (diagram; grayscale values inverted): Before analysis, the thresholds of the original image can be determined and the original image turned into a series of binarized images (for example, grayscale bands 0-124 and 124-187). Binary images can be stored compactly and are easily analyzed.

One can improve the contrast by filtering out unwanted wavelengths. There are some other methods that can also minimize the contrast threshold and improve the peak separation. The peaks can be accentuated by an algorithm that takes the derivative of the image line profile and plots the resulting slope. Noise, however, can cause problems with this approach.
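A hedged sketch of that derivative technique follows: differentiate the line profile so that peak centers show up as positive-to-negative zero crossings of the slope, and smooth first, because raw differentiation amplifies exactly the noise just mentioned. The profile and noise level are synthetic.

```python
# Derivative-based peak accentuation on a synthetic line profile.
import numpy as np

x = np.linspace(0.0, 20.0, 401)
clean = np.exp(-(x - 8.0) ** 2 / 2.0) + np.exp(-(x - 12.0) ** 2 / 2.0)
noisy = clean + np.random.default_rng(1).normal(0.0, 0.02, x.size)

smoothed = np.convolve(noisy, np.ones(9) / 9, mode="same")  # simple boxcar smoothing
slope = np.gradient(smoothed, x)

# Positive-to-negative sign changes of the slope mark candidate peak centers.
candidates = x[1:][np.diff(np.sign(slope)) < 0]
print(candidates)   # clusters near x = 8 and x = 12, plus any noise-induced crossings
```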
Another option is to set up the software as a pattern recognition system.
This requires active calibration tools, but because the general shape of the
signal profile is fairly consistent, it enables quick analysis of patterns.
Finally, converting the data to a binary representation can simplify the algorithms -- but only if one sets the threshold with some care.
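As a simple illustration, the sketch below binarizes an image into the two grayscale bands shown in the threshold figure above (0-124 and 124-187); those bands are borrowed from that figure purely as example break points.

```python
# Two-band binarization of a grayscale image (illustrative thresholds).
import numpy as np

def binarize(gray, low, high):
    """Return 1 where a pixel falls inside [low, high), 0 elsewhere."""
    return ((gray >= low) & (gray < high)).astype(np.uint8)

gray = np.random.default_rng(2).integers(0, 256, size=(32, 32), dtype=np.uint8)
band_a = binarize(gray, 0, 124)
band_b = binarize(gray, 124, 187)
print(int(band_a.sum()), int(band_b.sum()))   # pixel counts falling in each band
```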
FIGURE 2: A simplified overview of the various components that affect speed and accuracy. On the algorithm side these include peak separation, algorithmic techniques, and calibration; on the optomechanical side, speed, control over axes, and minimization of movement. The signal-to-noise ratio is set by the signal (source, lens speed/performance, detector sensitivity, AR coatings) and the noise (chromatic filtering, polarization filters, baffling techniques, electronic noise).

Calibration: active or upfront?
A critical component to any imaging analysis system is calibration. A system must be suitably calibrated in order to trust the results that are generated. There are two methods to obtain a calibrated system: active and upfront calibration. Both of these methods can ease tolerances and reduce the manufacturing cost of the system.
For example, one method of active calibration is to keep the optical boresight in check across focus movements using calibration software and calibration marks in the electrophoresis gel. However, this and any other active calibration method requires computation time that will slow down the data analysis.
For values that will not drift over time or changing environmental conditions, upfront calibration is more cost-effective and does not slow down processes significantly.

Designing for speed and accuracy


As mentioned earlier, DNA sequencers are designed to offer faster and
more accurate readings than manual sequencing or competing sequencers.
Most commercial systems measure performance by throughput. This is

measured in tens of thousands of base-calls per run, hundreds of sequenc-
ing lanes per day, or thousands of fragment analyses per day, each within
some percentage of accuracy.
Speed and accuracy are closely linked precisely because one must
often be sacrificed to improve the other. If one can increase the signal-
to-noise ratio (SNR), however, then both speed and accuracy can be
increased. In this case, the signal is the fluorescence from the tags,
while the noise is the background. Improving SNR inherently eases the
computational component of the system, and thus affects the through-
put. The goal is to maximize the signal by optimizing the light source,
the transfer of light through the lenses and lens coatings, and the detec-
tor sensitivity.
Because fluorescent energy is typically very low, the signal starts out
low. To a certain extent, one can increase it by increasing the power of the
laser that excites the fluorescent molecule. If one uses illumination-shap-
ing optics, one can match the laser illumination with the field of view of
the imaging system -- this also helps to maximize the signal.
The lens must also pick up enough light to be read by the system.
Although lens performance is not directly related to the amount of signal
incident on the detector, it does determine whether the system is imaging
the energy in the correct location and within a minimum amount of area.
By accounting for the fluorescent emission pattern as well as the coating
and performance of the lenses, the radiometric transfer can be calculated
from the numerical aperture. Coatings on the optics both help increase
throughput and reduce noise by reducing lens flare and ghost reflections.
Noise can also be reduced using baffles and filters. Baffles isolate stray

light from the detector. Proper modeling can predict the problem surfaces
and stray light sources. Then the opto-mechanical design can be altered to
minimize the problems by including features like strategically placed baf-
fle threads and light-absorbing finishes.
Above, we mentioned filtering out unwanted colors. There are
many types of wavelength-differentiating filters. One must balance the
absolute throughput with the throughput of the desired wavelength range.
In general, the narrower the wavelength range you choose to sense, and
the sharper the filter's boundaries, the less absolute throughput you
receive. One can also use polarizing filters to suppress stray light, but
again you must find a balance: Polarizers can be extremely powerful at
reducing noise, but they also cause significant fading of the signal.
When selecting a detector, sensitivity is not the only criterion, but again, because the signal level is low, sensitivity should play a significant role. CCDs produce a linear response and offer high quantum efficiency,
which leads to good sensitivity across the spectral band of interest. Speed
can sometimes be an issue in arrays, although scanning systems are
emerging as an alternate solution.

Conclusion
Design issues for DNA sequencers are similar to many optical imaging
systems, in that they involve trade-offs between speed, cost, and accura-
cy. On the other hand, the technology and economics of DNA sequencing
result in systems that push the limits of the imaging systems - and the
ingenuity of their designers.

TECH TIP ON WORKING WITH SPACERS

Fixed focal length lenses are an economical solution to many machine vision inspection
applications. One drawback of these lenses, however, is that they are typically
designed and optimized for an infinite conjugate, which leads to long minimum work-
ing distances. The required long distances make mounting these lenses in a bench-top
application impractical, and provide fields of view much too large for close inspection.
Spacers address these issues by reducing the specified minimum working distance
and by increasing the magnification, which decreases the field of view.
(Diagram: a fixed focal length lens imaging an object onto a CCD camera, shown with and without a spacer between the lens and the camera.)

It is important to keep in mind, however, that adding spacers forces the lens to
focus much closer than its optimized design. This may cause an otherwise well-
designed lens to exhibit increased distortion, chromatic and spherical aberrations,
reduced depth of field, illumination non-uniformity, and decreased light gathering abili-
ty. These problems become more prevalent as additional spacers are introduced, and
the lens is forced further and further from its design.
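For a rough feel of what a spacer does, the thin-lens relations give a magnification of m = d/f and an object distance of about f + f^2/d when an extension d is added behind a lens that was focused at infinity. The sketch below applies those textbook relations to an assumed 25mm lens; real multi-element lenses deviate from this idealization, which is exactly why the aberrations described above appear.

```python
# Thin-lens estimate of how a spacer trades working distance for magnification.

def with_spacer(focal_length_mm, spacer_mm):
    magnification = spacer_mm / focal_length_mm
    object_distance = focal_length_mm + focal_length_mm ** 2 / spacer_mm
    return magnification, object_distance

for spacer in (0.5, 1.0, 2.5, 5.0):
    m, o = with_spacer(25.0, spacer)   # an assumed 25mm fixed focal length lens
    print(f"{spacer:>4.1f}mm spacer: magnification {m:.2f}x, object distance ~{o:.0f}mm")
```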

SPLITTING IMAGES SOLVES DUAL MAGNIFICATION DILEMMA

A number of vision systems suppliers provide turn-key solutions for common imaging problems. But when standard solutions are insufficient for an application, one must customize a solution to fit. In our case, an inspection system was needed that offered both high magnification and a large field of view. The system achieved this by splitting one image into two paths, with the first path achieving a much higher magnification than the second. The first path needed a 0.15 mm field of view while the second required a 1 mm field of view.
Typically, high magnification implies a small field of view, but a


small field of view makes it difficult to see which part of a larger object
the system is imaging. This problem pops up in a number of applica-
tions, including semiconductor and electronics inspection, as well as in
biological imaging (see Figure 1). Our application required that we see
a fairly large field of view (for alignment and object location) while
still being able to resolve extremely fine detail on exceptionally small
components. The detail required was so small that we pushed the dif-
fraction limits of resolution. In this case, the detail and field of view
requirements far exceeded the capabilities of standard off-the-shelf
assemblies.

Standard solutions
When dealing with the need for different magnifications in a system
there are some standard solutions available. Our first thought was to
use a zoom lens to allow the user to change magnification. Zoom lens-
es, however, could not provide the necessary resolution at extremely
high magnifications. They also take time to zoom from one magnifica-
tion to another, which reduces the system's overall efficiency and
increases production costs. This is time that would be better spent actu-
ally inspecting and gauging the components.

Removing zoom lenses left us with one real alternative: fixed mag-
nification lens systems like microscope objectives. Microscope objec-
tives, however, are designed for specific magnifications and yield only
one size field of view in standard configurations. (This is why micro-
scopes are generally designed with rotating turrets: to allow for a vari-
ety of magnifications.) We ran into the same problem as with zoom lenses: the system loses time while changing magnifications. Also, switching from low to high magnifications introduced problems with aligning the different objectives.

FIGURE 1: Higher and lower magnification images of the same object yield different information. High magnification images (bottom) have smaller fields of view, which can make it difficult to keep track of what part of the object is in view. A dual-magnification system provides users with two images at different magnifications of the exact same area.

Custom solutions

At this point, the designers went back to the drawing board, looking for
something a little different and somewhat creative. When the applica-
tion was broken down into its most basic elements, what was really
needed was a system that looks at two specific fields of view at the
same time. As stated earlier, one image needed to have extremely high
magnification, resolution, and contrast. The other image needed mag-
nification of some factor lower.
Many microscope objective designs offer some flexibility in the
magnification that they can produce. For example, video couplers take
the images transferred by the objectives (which are almost all designed
to be used in visual-based instrumentation -- ie, designed to be viewed
with eyepieces) and send them to CCD or film cameras. Because visu-
al based instruments seldom produce the same image sizes that a stan-
dard CCD camera is designed to capture, video couplers for most
microscope systems are available in different magnifications, designed

to best match the image size to a specific camera array size.


That is very similar to what this application required: the same
objective yielding different magnifications. Although the application
required more range of magnification than is seen in microscope video
couplers, the concept helped us move towards a solution.
With this concept (which is not exclusive to microscopes) we can
change the total magnification of a system by changing the lenses on
the back end while using the same objective on the front. We found it was simple to change the lens on the back end of the system if we used an infinity-corrected objective on the front end. These objectives produce an image at infinity, rather than at some finite distance, and require a tube lens to focus the field onto the image plane. The total magnification of a system using an infinity-corrected objective is found by dividing the focal length of the tube lens by the focal length of the objective:

Magnification = (FL of tube lens)/(FL of objective lens)

Varying either of the values alters the magnification of the system. For our application we use the same objective lens and two different tube lenses to achieve the magnifications needed. But we also wanted to view both magnifications at the same time. How could we manage that?
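The relation above is easy to apply to the two paths. The focal lengths in the sketch below are illustrative placeholders, chosen only to give roughly the 5x ratio between the two fields of view described later, not the actual values used in the system.

```python
# Total magnification of each path: tube lens focal length / objective focal length.

def system_magnification(tube_fl_mm, objective_fl_mm):
    return tube_fl_mm / objective_fl_mm

objective_fl = 20.0                                       # one objective feeds both paths
wide_path = system_magnification(40.0, objective_fl)      # lower magnification path
narrow_path = system_magnification(200.0, objective_fl)   # higher magnification path
print(wide_path, narrow_path, narrow_path / wide_path)    # 2.0 10.0 5.0
```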

Splitting the image


The answer to the dual magnification issue lies in being able to split the
image after the objective lens. By sending the image to two tube lens-
es of different focal lengths, two different final magnifications of the
same scene arrive at two different cameras. Therefore, the system need-

ed to incorporate a beamsplitter (see Figure 2). The question becomes:
how easy is this, and what are the optical limitations on the system?
Luckily, due to their design, infinity-corrected microscope objec-
tives offer a great deal of flexibility. Many other optical components
can be placed behind such an objective with theoretically little change
in the optical performance of the objective. Components behind the
objective may include beamsplitters, filters, mirrors, and prisms.
Note that if too many components are added between the objective
and tube lens the overall image quality may suffer: Placing compo-
nents in the system will extend the distance between the objective and
the tube lens. As this distance increases, rays coming from the off-axis
positions of the object may no longer pass through the aperture of the
tube lens and will not make it to the image plane (see Figure 3). This
reduction in rays ultimately will lead to lower light levels at the edges
of the image and even the inability to view a portion of the object. For
our particular application this did not become an issue.
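A hedged geometric estimate of that vignetting effect: in the collimated space behind an infinity-corrected objective, the bundle from an edge-of-field point is tilted by roughly the field height divided by the objective focal length, so it walks off the tube-lens aperture as the spacing grows. All dimensions in the sketch below are illustrative, not values from this system.

```python
# Rough check of when the edge-of-field bundle starts missing the tube lens.
import math

def edge_ray_height_mm(field_height_mm, objective_fl_mm, spacing_mm, pupil_diameter_mm):
    """Height of the outermost ray of the edge-of-field bundle at the tube lens."""
    tilt = math.atan2(field_height_mm, objective_fl_mm)
    return spacing_mm * math.tan(tilt) + pupil_diameter_mm / 2.0

tube_lens_semi_aperture = 10.0   # mm, assumed clear-aperture radius of the tube lens
for spacing in (50, 100, 150, 200):
    h = edge_ray_height_mm(1.0, 20.0, spacing, 12.0)
    status = "vignetting" if h > tube_lens_semi_aperture else "clear"
    print(f"{spacing:3d}mm objective-to-tube-lens spacing: edge ray at {h:.1f}mm -> {status}")
```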
The final system utilized two beamsplitters after the objective. One
beamsplitter was used to integrate inline illumination into the system,
while the other split the return image and directed it to the two tube
lenses. The final design included an optical system with no moving
components for imaging the dual field, which basically eliminated the
issue of alignment. (The objective moved for autofocus, but this is not
related to the dual field issues.) The larger-view system shows a field of view around a millimeter in size, while the second one has a roughly 5 times higher magnification and correspondingly smaller field of view. Because the users view two fields simultaneously, they can measure in both fields at the same time.

FIGURE 2: In order to obtain two images with different magnifications through a single objective lens, a beamsplitter was integrated into the system.


The first version of this system was made, essentially, from compo-
nents taken off the shelf. Later, the mounting components were cus-
tomized to fit the needs of the machine into which the vision system
was incorporated.

Outside the box, off the shelf


A number of lessons in creating vision systems were illustrated by this example. Making this type of solution work required us to think both inside and outside of "the box": we had to accept the constraints
imposed by the application, but we also needed to find unconventional
ways of using standard components. This process is helpful for cus-
tomizing systems in general.
First, both system obstacles and parameters had to be clearly specified,
including field of view, resolution, depth of field, space constraints,
and other physical limitations (which can determine the working
distance and overall size of the system).
Often, the list of obstacles and parameters is fairly short. However,
when trying to integrate vision systems into applications that have pre-
existing equipment, harsh environment changes, high concentrations of
particulates in the air, many moving parts, or just limited space, these
factors can seem insurmountable. Interestingly, sometimes things that
appear to be constraining factors can be used to a system's advantage.
Managing the trade-offs between different parameters is a critical
part of designing these systems. The designer must find the balance
particular to that application. In this situation, the system had a fixed
relationship between the wide and narrow fields of view. This was fine
for our application, but it could be completely unacceptable in a differ-
ent application.



We customized a fairly standard system, rather than building a
completely custom solution. This allowed us to use relatively inexpensive
off-the-shelf components: lenses, mirrors, prisms, filters, and
beamsplitters that can be easily mounted or inexpensively integrated into
systems. Because most of these components are available for rapid
delivery, the system could be up and running within weeks, days, or in
many cases even the next day.
Be open minded about alternative ways of solving your application. The
most cost-effective solution to the problem may look very different
than what was originally envisioned.
Finally, do not stop searching for a solution. Many companies are
producing components and lens systems that allow for multiple
configurations and are designed to be much more flexible than earlier
products. Companies are also designing more application-specific lenses
and systems to meet the diverse demands of the electronic imaging
marketplace. Because customer demand is the driving force behind these
changes and innovations, you should never stop asking for that solution!

FIGURE 3: As the distance between the objective and the tube lens
increases, less light reaches the image plane. Light rays coming from
off-axis positions of the object may no longer pass through the aperture
of the tube lens. This results in lower light levels at the edges of the
image.


IR VISION: MORE THAN MEETS THE EYE

Machine vision experts are increasingly discovering the benefits of
including the infrared spectrum in their toolbox for certain diagnostic
applications.

With infrared detector technology becoming more cost-effective, IR
has developed as a viable way to augment the traditional machine
vision toolbox. Infrared detectors complement the traditional visible-
light machine vision applications by imaging the emission characteristics
of the parts under inspection. These resulting images can then be
analyzed directly or be calibrated to measure heat or temperature
differences. Direct analysis of the image can lead to very interesting
techniques for inspection because surface substructures will inherently
change the emission characteristics of the surface. This enables you to
measure subsurface defects or structural properties of the part. If the
system is calibrated, temperature can then be used to discriminate
defects for part inspection.
One advantage of using the IR spectrum is that the system can often
operate without a traditional light source as is required in the visible
spectrum with conventional machine vision cameras. This is because
imaging in the IR is inherently more emission based than reflection
based. This property of IR systems means that they can be used to
detect both faulty parts and properly operating components on a pow-
ered-up, functioning unit within an enclosure (assuming that the system
has been properly calibrated.) For example, an integrated circuit pack-
age sitting just below the surface of a keyboard is known to display
a certain temperature signature when operating properly, but when the
IC is faulty, the energy leakage may cause it to be hotter. The additional
heat energy changes the keyboard's surface temperature characteristics,
which can easily be detected with an IR system. In other applications,
unique thermal patterns can be captured and stored to indicate cracks
and inclusions in an object, due to the thermal resistance to heat flow
through or around these kinds of faults.

FIGURE 1: An infrared image of a motor housing indicates a uniform
temperature, while the end bell is cooler and the output shaft is hotter
(near-white color).
Sometimes, complex visible machine vision systems can be
replaced with a simpler IR system. For example, some visible systems
are set up to detect surface-finish blemishes on reflective materials. But
stray light and reflections may wash out the image, lowering its
contrast, which in turn makes it difficult for the algorithm to detect the
defects. An IR system, on the other hand, needs no external source; the
defect gives a specific emission signature on its own. This means that
there is little washout, which increases the signal-to-noise ratio, and the
defect can therefore be detected easily and quickly.
Although the IR spectrum can add valuable intelligence to the
process, it isn't necessarily intended to replace visible machine vision
systems entirely. IR may be added to an existing system and share some
or most of the major components (for instance, the software, mechan-
ics, etc.). In most cases, the cameras and lenses typically need to be
upgraded to handle this new spectrum of interest. Interestingly enough,
in many applications, the ideal solution may be a hybrid system which
involves visible and IR imaging.
Key parameters to consider when selecting an IR system include
field of view, spatial resolution, thermal resolution, spectral range, and
availability. The field of view (FOV) is the viewable area of the object
under inspection, or the portion of the object that fills the camera's sen-
sor. Spatial resolution is a measure of the IR system's ability to repro-
duce object detail. This is typically specified in terms of a number of
pixels; however, keep in mind pixel interpolation. Within the commer-
cial realm, sensor resolutions range from about 160x120 to 640x480
pixels. Thermal resolution is a measure of the IR system's ability to
distinguish changes in the temperature in the FOV. This is similar to the


notion of dynamic range in visible imaging systems. Noise can be
problematic. Keep in mind that in some cases, the systems are sensitive
to room temperature, which means that any heat source in the
environment, including the optics, metal housings, and even the sensor
itself, can be a potential source of noise. The spectral range is the band
of wavelengths detectable by the sensor. Sensors are categorized as near
infrared (NIR), mid-wave (MWIR), and long wave (LWIR) based on
their individual spectral ranges. In terms of commercial availability, IR
systems leave something to be desired. The choice of detectors and
lenses is very limited, particularly if you are used to visible applications.
Beyond the lack of choices in lenses, keep in mind that these are much
costlier than their visible counterparts.
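
As a rough back-of-the-envelope sketch of what those sensor formats
mean on the object, the field of view can be divided by the pixel count
to estimate the footprint of a single pixel. The 100 mm FOV below is
purely illustrative, and the estimate ignores the pixel interpolation
mentioned above.

    def pixel_footprint_mm(fov_mm: float, pixels_across: int) -> float:
        """Approximate object-space size covered by one pixel: FOV / pixel count."""
        return fov_mm / pixels_across

    fov_mm = 100.0  # assumed horizontal field of view on the object
    for width_px in (160, 640):  # the two ends of the commercial format range
        print(f"{width_px} px wide -> {pixel_footprint_mm(fov_mm, width_px):.2f} mm per pixel")
    # 160 px wide -> 0.62 mm per pixel
    # 640 px wide -> 0.16 mm per pixel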
Many applications are an ideal fit for IR systems when the parameter
of interest hinges on temperature-related effects. For example:

• A thermally induced defect, such as monitoring the temperature
gradient across a web of material before it reaches the spool.

• Thermal expansion and contraction of ceramic materials as they
are being heated or cooled.

• Measuring the liquid level in a steel tank.

• Observing the differences in materials of an assembly.

• Observing the specific heat of water in ceiling tile.



• Electric motor heating.

• Monitoring the temperature of an open-top freezer.



3D MEASUREMENTS WITH TELECENTRIC LENSES

If you expect your machine vision system to accurately gauge objects
with bosses and dings, you might want to consider using telecentric
lenses.

A common stumbling block in setting up a machine vision measurement
system is trying to gauge three-dimensional objects. Standard
lenses, when coupled with the appropriate electronics and software, do
an excellent job of accurately measuring flat objects. However, when
depth is introduced to the system requirements, standard machine
vision lenses fail miserably. Magnification variations with depth of
field lead to inaccurate and unreliable results. Because telecentric
lenses eliminate this magnification error, they are useful for a number
of applications.
The formal definition of a telecentric lens is an optical system in
which the entrance pupil and/or the exit pupil is located at infinity. The
most common design, referred to as "object space telecentricity",
locates the entrance pupil at infinity, which causes the chief rays for all
points across the object to be collimated. In practice, what this means
is that the object will remain the same perceived size regardless of its
distance from the lens. There are limitations to this distance, known as
the telecentric range of the lens, though this range is typically fairly
broad.

Object Space Telecentricity

Object space telecentricity offers several inherent advantages over
non-telecentric lenses, including imaging three-dimensional objects and
simplifying mounting.
Imaging three-dimensional objects. Using the naked eye or conven-
tional lenses, objects far away appear smaller than the same size objects
nearby. Like railroad tracks that appear to converge at the horizon, an
object with depth along the optical axis will appear tilted with a non-
telecentric imaging system (see Figure 1). For automated processes,
this tilt can cause several problems. Alignment systems that use simple
edge-detection may indicate that the tops of the parts are not
aligned with the bottoms, causing the system to reject good parts. In a
measurement system, the change in magnification with depth leads to
measurement errors.
Simplifying mounting. Even more troublesome than change in mag-
nification for some systems is a change in focus. In a manufacturing
environment, the position of an item on a conveyor system can rarely
be guaranteed. Whether the system is imaging bottles bouncing down
the conveyor belt, tall objects resting on their sides, or multi-level fea-
tures of a complex component, a change in depth is hard to avoid.
While a telecentric lens does not inherently have a greater depth of
field than a non-telecentric system, most telecentric lenses are designed
to allow for three-dimensional objects. Thus, an inability to position the
item accurately does not inhibit the reliability of the system.
The true advantage of the telecentric lens, however, is seen when the
two situations are combined: Imagine the scenario of the bottles bouncing
down the conveyor system. A quality assurance process requires a
vision system to check the fill levels of the bottles and to measure the
tops to guarantee the caps will fit. A camera system directly overhead
measures the tops, while a system mounted at 45° overhead checks the
fill levels. Analysis of the speed of the conveyor system indicates the
working distance between the measurement camera system and bottles
can change by as much as 10 mm. A standard design lens could
undoubtedly maintain this 10 mm depth of field, but the magnification
of the top of the bottle will increase as the bottle moves closer to the
lens. Depending on when the measurement is triggered, the bottle may
pass or fail, regardless of its true dimensions.

FIGURE 1: 3-D objects appear tilted when imaged using a non-telecentric
lens. The apparent tilt can cause problems if the application requires
measuring that edges are parallel.
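
To put a number on that effect, apparent size with a conventional lens
scales roughly as the inverse of the working distance. Only the 10 mm
variation comes from the example above; the 200 mm nominal working
distance below is an assumption made for illustration.

    def apparent_size_change(nominal_wd_mm: float, shift_toward_lens_mm: float) -> float:
        """Fractional growth in apparent size for a non-telecentric lens when the
        object moves closer (apparent size scales roughly as 1 / working distance)."""
        return nominal_wd_mm / (nominal_wd_mm - shift_toward_lens_mm) - 1.0

    change = apparent_size_change(200.0, 10.0)
    print(f"~{change * 100:.1f}% larger")  # ~5.3% larger: enough to flip a pass/fail call

A telecentric lens, by contrast, reports the same diameter across the
entire 10 mm of travel, provided the motion stays within its telecentric
range.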


The fill-level scenario offers a separate challenge. The vision sys-
tem is programmed to do an edge detection between the liquid level
and the manufacturer’s name on the side of the bottle, necessitating a
45° mounting configuration. Because of this, the relative working dis-
tance between the bottle and camera system varies along the bottle’s
length. The manufacturer’s name is imprinted with very small charac-
ters and the system is expecting to see a high contrast edge. Again, a
standard lens arrangement could accommodate this depth, but this lens
may not have the necessary resolution over that depth.
The resolution of a lens is a function of its object space numerical
aperture, which means the resolution degrades much more quickly
towards the lens than it does farther from the area of best focus for a
standard lens design. For a telecentric design, however, the resolution
degrades equally over the depth of field. Even though the manufactur-
er’s name is at the shorter working distance, the resolution at that dis-
tance will be equal to the resolution on the bottom of the bottle, pro-
viding a significant increase in contrast to the vision system and allow-
ing for dependable results.
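
For readers who want to attach numbers to the link between resolution
and numerical aperture, a common rule of thumb is the Rayleigh
criterion, roughly 0.61 times the wavelength divided by the NA. The
wavelength and NA values below are illustrative only.

    def rayleigh_resolution_um(wavelength_um: float, numerical_aperture: float) -> float:
        """Diffraction-limited resolution (Rayleigh criterion): 0.61 * wavelength / NA."""
        return 0.61 * wavelength_um / numerical_aperture

    wavelength_um = 0.55  # green light
    for na in (0.05, 0.10):  # two example object-space NA values
        print(f"NA {na:.2f} -> ~{rayleigh_resolution_um(wavelength_um, na):.1f} um")
    # NA 0.05 -> ~6.7 um
    # NA 0.10 -> ~3.4 um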

Image Space Telecentricity


A less common design is an "image space telecentric lens", where a
change in the sensor location along the optical axis will not change the
image size. This design is critical in microlithography, where toler-
ances on feature size are routinely below one tenth of a micron. In
machine vision, a distinct advantage of the image space telecentric
design is that it can lead to extremely uniform image plane illumina-
tion. This design is especially useful with 3-CCD cameras, in which
the location of each of the image planes is difficult to define. When the
three separate images are combined to form the single color image,
however, each image must be of equal size.
The most robust design is that of a doubly-telecentric lens, which
incorporates the considerable advantages of both object and image
space telecentricity.
Applications involving machine vision measurement, metrology
equipment, and microlithographic camera systems commonly incor-
porate telecentric designs. The significant size and complexity of
telecentric lens designs, however, have made it difficult for these lenses
to penetrate other markets. As integrators and system designers become
more aware of the advantages, and these lenses are specified for more
volume requirements, decreased manufacturing costs should lead to
telecentric lens designs finding their way into an increasing number of
applications.

FIGURE 2: When viewed at 45 degrees with a non-telecentric lens, jumper
pins on a circuit board appear tilted toward the center of the field. A
telecentric lens eliminates this tilt and provides an accurate measurement
(panels: telecentric lens vs. conventional lens).


OPTICS THAT FOCUS ON MANUFACTURING

It has become a truism that designs today must be readily manufacturable.
But the road to manufacturability may not be obvious when it comes to
multi-element lenses. There are often subtleties involved in such devices.
One consequence is that it may not be easy to see whether a working
prototype will present problems when it hits production.
A working design, by itself, is not enough. An optical design is just a
plan created to meet certain requirements: how large the dimensions can
be and the wavelengths at which it must work. But a design doesn't tell
you how much the product will cost to make.


The key to design-for-manufacturing and design-for-cost comes out of
an understanding of how glass optics are made, the mechanical structure
of the device, and how the pieces are aligned, assembled, and tested. It's
best to consider such key issues from the beginning because design opti-
mization is a series of tradeoffs. A prototype design may not be the easi-
est to manufacture or the least costly solution possible.
Many designers aren't well-versed in the production process and don't
understand the criteria that affect price. And designers can get locked into
the technical details of the design and forget to ask whether the $500/lb
glass really is five times as good a choice as the $100/lb glass. When making
hundreds or thousands of units of a complex optomechanical assembly, a
change in materials and surfaces can be costly but it also can save a lot of
time and money.
Or consider the choice of metal. Beryllium (a material frequently used
in the aerospace industry), for example, is super light and has amazing
structural properties. This makes it an attractive material for mechanical
designers. However, most commercial products don't require these prop-
erties. The cost of the raw beryllium and the cost of machining it would
wreck most budgets -- if, in fact, the manufacturing shop can handle it at
all. Aluminum and steel cost much less and are easier to manufacture than

© COPYRIGHT 2006 EDMUND OPTICS INC. ALL RIGHTS RESERVED


beryllium yet still provide the level of quality needed for most optome-
chanical applications.
Similarly, a simplistic view of design suggests that the fewer ele-
ments, the better. More optical elements do raise cost. But if additional
elements can loosen up tolerances, the approach may reduce costs, not
raise them. Would you rather make nine easy-to-manufacture parts or six
difficult ones?

Tolerancing tiny dimensions


Those of us who work on optical components spend 50% or more of our
design time working on manufacturability and cost control. We do this
largely by setting appropriate tolerances and reducing the design's
sensitivity to tolerances. For example, a design that requires a tight toler-
ance on the centering of the optics will be more labor-intensive and costly
to make than one that allows more generous tolerances. But tolerances can
be much tighter in optics than in many other mechanical systems. A system
that can tolerate an optic shifting the width of a hair (75 to 100µm) is con-
sidered to have a loose tolerance. A tight tolerance in optics would be about
20 to 35µm. If the manufacturing shop can only cut a seat to a certain accu-
racy, then the tolerances must be looser than that accuracy. Thus part of the
work involved in design-for-manufacturing goes into changing the design
so it accommodates the tolerances of the manufacturing plant.
Manufacturing and assembly are easier, for example, if the design staggers
the size of optics by a few millimeters, with the biggest optic on the out-
side. Each lens is retained within the housing. This approach provides bet-
ter precision than could be had using spacers.


Eyeing a fisheye
A recent project serves as a good example of design-for-manufacturing
principles at work. Users of 3D CAD and virtual reality systems want
to be immersed in a realistic environment, whether they use the system
for training or for fun. So when the manufacturer of an immersive 3D
projection system, Elumens Corp. (Durham, N.C.), designed its
VisionStation product, one requirement was to project an image over
more than 180° on the inside of a dome.
The project was interesting because the technical challenges of creating
such a device are considerable. The system needed a special very-
wide-angle fisheye lens. Readers who have looked through a fisheye
lens installed in a door may recall how objects near the center of the
field (say, a visitor's face) appear fairly normal, but objects near the
edge of the field (say, a receding hallway) appear weirdly distorted.
Such distortion in the projector would make the image look unrealistic
and was unacceptable for the 3D system. So the fisheye lens had to
offer relatively low distortion over the entire field of view. Also, the
lens had to mount on a standard LCD projector.
The Elumens team created a good prototype design for the projec-
tor lens but they were unsure if it was economical for the number of
units the company expected to sell. Working with Elumens engineers
we redesigned both the optical and the mechanical system with an eye
toward mass production. Our main strength was our familiarity with
manufacturing issues. The final design for the lens used inexpensive
glass without sacrificing performance, which saved money in the end.
The redesign of the lens also included changes to the mechanical inter-
face between the lens and projector that provided better focusing abil-
ity than the original design. The 3D VisionStations are now in produc-
tion and over two hundred projectors are now on the market. The final
design hit cost targets partly because it kept optical and mechanical
manufacturing capabilities in mind.



CONFOCAL MICROSCOPE LENSES: SHARPEN YOUR SIGHTS

Confocal microscopes produce sharper images, but be ready to accept
some trade-offs to match your specific application.

Confocal scanning microscopes were invented almost five decades ago
with the intention of producing sharper images than were possible with
conventional microscopes. One reason for their failure to become more
than just a casual curiosity over the years had been the relatively high
cost to produce the special optics needed. Yet, despite this cost, confocal
microscopes substantially benefited many science and technology
firms, particularly the biologics industry, where observing specific
pathologies within a precise depth of field is critical. But with newer
tools, materials, and techniques now available, confocal microscopes
are less expensive to design and manufacture. The cost has gradually
dropped to a level more in line with the budgets of leading microscope
manufacturers.

How it works

Conventional microscopes project thin but visible extraneous images
just above and below the principal plane being observed.
Unfortunately, these undesirable images are out of focus and tend to
blur the preferred image. But confocal microscopes eliminate this
interference and project the planes buried deep within a medium with
equal sharpness. The principle of operation lies in a precisely located pinhole
in front of the microscope’s imager, which removes most of the blur
from the out-of-focus planes. The pinhole allows only those light rays
emitted from the desired plane of the object to pass on to the imager
with higher intensity. The area immediately above and below the
desired layer will come into focus slightly in front of or behind the pin-
hole. Therefore, most of the light rays transmitted from these areas
can’t pass through the aperture and don’t reach the imager. The light
that does pass through the pinhole comes from the desired plane. The
desired image now comes from an extremely thin section of the sample
because of the small depth of field. Consequently, numerous thin sec-
tions of the sample can be scanned and the imager can construct high-
resolution 2D and 3D images that are otherwise difficult or impossible
to obtain.
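
The article does not specify a pinhole size, but a common sizing
convention in confocal work is to make the pinhole roughly one Airy
unit in diameter at the imager, which scales with the wavelength, the
objective's numerical aperture, and the total magnification. The values
below are illustrative assumptions, not specifications.

    def airy_unit_um(wavelength_um: float, objective_na: float, total_magnification: float) -> float:
        """Approximate diameter of one Airy unit at the pinhole plane:
        1.22 * wavelength / NA, scaled up by the system magnification."""
        return 1.22 * wavelength_um / objective_na * total_magnification

    # Illustrative values: 488 nm laser line, 0.75 NA objective, 60x total magnification
    print(f"~{airy_unit_um(0.488, 0.75, 60.0):.0f} um pinhole diameter")  # ~48 um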
Besides the pinhole, the next most critical component is the objec-
tive lens. It determines the overall resolution and capability of the sys-
tem. However, selecting a suitable lens for a particular application is
not as easy as for a conventional microscope. Designers need more than
a fundamental knowledge of optical parameters; they also have to
understand how to control specific types of aberrations. These are
errors that show up in the final image, not necessarily present in the
original object, and tend to diminish the image quality.
Aberrations in a system come from numerous sources, including the
method of designing and manufacturing the optical and mechanical
components, the effect of other optical components which may have
been added to the system after the major design work had been com-
pleted, and even the materials and the composition of the media to be
inspected. Aberrations are unavoidable, and although some can be
corrected, they are inherent in all optical systems. Unfortunately, a
perfect optical system does not exist.

FIGURE 1: Layout of a confocal microscope. The major features that make
the confocal microscope stand apart from conventional microscopes are
the pinhole and the objective lens. Only those rays from the best focus
point on the specimen pass through the pinhole and hit the imager. The
light source is usually a laser with a wavelength best suited for
illuminating the features of a specific specimen, such as different
fluorescent dyes used in the biologics and medical areas.

Chromatic aberrations

The list of possible aberrations is long, but chromatic aberrations and
spherical aberrations usually are the worst. Chromatic aberration
comes from the change in the index of refraction of a lens material as
the wavelength (color) of the light passing through it changes. The
image cannot remain in focus within a fixed image distance as the


wavelength changes. For axial or longitudinal chromatic aberrations,
different wavelengths focus at different distances from the lens. The
focal length of the lens changes with the wavelength. For
example, blue light focuses at a point closer to the lens than does green
and red, so when attempting to obtain the best focus for a polychro-
matic system, a colored halo forms around a brighter focused spot. The
halo usually appears purple because the blue and red portions of the
spectrum defocus and combine.
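
As a concrete, though simplified, illustration of this axial focal
shift, the thin-lens lensmaker's equation can be evaluated with the
refractive index at several wavelengths. The surface radii and the
approximate crown-glass indices below are assumptions for the sketch,
not data from the article.

    def thin_lens_focal_length_mm(n: float, r1_mm: float, r2_mm: float) -> float:
        """Thin-lens lensmaker's equation: 1/f = (n - 1) * (1/R1 - 1/R2)."""
        return 1.0 / ((n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm))

    # Approximate indices of a common crown glass at three visible wavelengths
    indices = {"blue (486 nm)": 1.522, "green (588 nm)": 1.517, "red (656 nm)": 1.514}
    for color, n in indices.items():
        f = thin_lens_focal_length_mm(n, r1_mm=50.0, r2_mm=-50.0)
        print(f"{color}: f = {f:.2f} mm")
    # For this ~48 mm lens, blue focuses about 0.75 mm closer to the lens than red.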
Although the pinhole aperture is the feature that ultimately produces
the crisp images, the intensity of the light diminishes at its focus
position in proportion to the wavelength. In the final image, the
perceptible intensity is less than the actual intensity, because not all
the light reaches the sensing area. This is not a serious problem in
monochromatic systems or those that use a small range of wavelengths. But
for a system looking at fluorescence across the blue, green, and red
spectrums, such as in medical applications, the effect can seriously
deteriorate the image.

FIGURE 2: Chromatic aberrations, caused by the change in index of
refraction of the lens as the wavelength changes, focus the image at
different points along the axis in both the axial and lateral portions
of the system.
Similar difficulties appear as the image moves off axis, farther away
from the center of the image. These are called lateral chromatic aberra-
tions. Different wavelengths focus to different spot sizes farther across
the image. They introduce different levels of intensity and displace
color information from the actual position on the object under analysis;
lateral chromatic aberrations produce magnification errors. Slight
changes in magnification affect the quality of the image and reduce the
reliability of critical measurements.
But objective lenses are now available specifically to compensate
for these shortcomings. Achromatic and apochromatic objective lenses
use a combination of different types of glass which allow multiple
wavelengths to focus to the same position. An achromatic objective
lens works for at least two different wavelengths, while apochromatic
objective lenses handle four wavelengths.
Even with apochromatic objectives, some systems depend on other
optical components such as an eyepiece to correct for chromatic aber-
rations. Although correcting for multiple wavelengths is important,
aberrations become more difficult to correct over a range of wavelengths
that reach deeper into the UV part of the spectrum.

Laser lights
Lasers usually provide the illumination for indirect imaging. Compared
to multi-chromatic light, laser wavelength and intensity are controlled
to provide more predictable results. It’s also required for imaging in the
human biological sciences where fluorescent dyes are used to highlight
pathologies.
The objective lens focuses the laser onto the object plane and
images the fluorescent dye. Consequently, the laser’s wavelength also
needs to be considered. System efficiency may be affected under two
possible conditions: When chromatic aberrations prevent the objective
lens from focusing the laser into the object plane, and when materials
in the objective lens reduce the overall laser emission or transmission.
But adding or changing optical components at the laser can compensate
for some of these problems.
Reflective objective lenses also compensate for chromatic aberrations;
they are not sensitive to different wavelengths. But their resolution
is generally lower than that of traditional refractive lenses due to
inherent design constraints, particularly where resolution levels are near
theoretical design limits, such as in high-magnification applications.

Spherical aberrations
Although better lenses compensate for many aberrations, spherical
aberrations don’t always come from optics. Spherical aberrations are
similar to chromatic aberrations. They arise axially as a function of the
distance from the lens and laterally, across the image. The difference
comes from the fact that spherical aberrations are not wavelength
dependent, although the amount of spherical aberration can vary with
wavelength, an effect known as spherochromatism.
Spherical aberrations relate to the size of the lens aperture and the
position of the object and image. When rays from the object pass
through points farther from the center of the lens, the focus point moves
farther away from the best possible focus point of the lens. Thus, not
all the light intensity from a given point of the object gets through the
pinhole.
However, these aberrations can be corrected. The design of an opti-
mal objective lens depends upon the relative positions of the object and
the desired image. Variations in either adversely affect the visible
image because the system is working slightly outside its design param-
eters. This may not seem like an issue, but factors outside the objective
lens can introduce such problems.
On the object side, the cover slip, media, and any solution or mate-
rial under inspection all have an index of refraction different than air.
The material that the light passes through (located between the object
and the first lens element) is considered when designing an objective
lens, whether air, water, oil, or another medium. A different medium
put between the object and the objective lens looks like one more optical
component. This moves the best focus point and introduces more
spherical and other aberrations.
Users also must understand how to prepare and inspect the sample
so they can make a more informed decision when choosing the best
optics. Changes in solution concentration, temperature, and quality of
cover slips and immersion oils must be controlled, or repeatability may
be compromised.

FIGURE 3: In a finite conjugate system, the addition of optical components
such as beamsplitters and filters can adversely affect the optical
performance.

Finite and infinite conjugates


Other factors considered on the image side of the objective lens include
the beam splitter that introduces the laser lighting into the system and
other optical components. If not considered initially, they introduce
aberrations and reduce image quality. In a system using a finite conju-
gate objective (one that focuses the image to a specific distance with-
out other optical components), the negative effects are easily seen. That
is, light passing through a beam splitter, the filters, and other compo-
nents, passes through yet another index change, and not all rays pass
through at the same angle. The rays deviate slightly and introduce aber-
rations which were not found in the original objective lens design.
Some filters and beam splitters also reduce the intensity of the light
rays where the amount lost depends on the wavelength and the angle.
The angle at which a ray of light at a particular wavelength goes
through one of these optical components can affect the transmission
level of that ray and reduce certain wavelength transmissions. This can
be detrimental to light-starved systems.
One solution is an infinity corrected objective lens. The difference
between an infinity corrected objective lens and one designed for a


finite conjugate is that the former cannot focus an image at a particular
distance without a second set of optics called a tube lens.
The rays exiting infinity corrected objective lenses are parallel.
When they pass through a beam splitter or a filter, all rays pass at the
same angle and maintain their relative positions. This reduces or elim-
inates aberrations, even those introduced by the tube lens, and it allevi-
ates the problem caused by the angle.
Yet another solution is objective lenses designed at a finite conjugate,
with optics added to the system immediately following the objective
lens to make an infinite conjugate. This can work well, but the optics
might increase the magnification and develop more aberrations if not
properly matched to the objective lens.
