US6184974B1 - Apparatus and method for evaluating a target larger than a measuring aperture of a sensor - Google Patents

Apparatus and method for evaluating a target larger than a measuring aperture of a sensor

Info

Publication number
US6184974B1
US6184974B1 (application US09/340,502)
Authority
US
United States
Prior art keywords
wavefront
sensor
light
subregion
wavefront sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US09/340,502
Inventor
Daniel R. Neal
Ron R. Rammage
Darrell J. Armstrong
William T. Turner
Justin D. Mansell
Current Assignee
AMO Development LLC
Original Assignee
Wavefront Sciences Inc
Priority date
Filing date
Publication date
Application filed by Wavefront Sciences Inc filed Critical Wavefront Sciences Inc
Priority to US09/340,502 priority Critical patent/US6184974B1/en
Assigned to WAVEFRONT SCIENCES INC. reassignment WAVEFRONT SCIENCES INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MANSELL, JUSTIN D., ARMSTRONG, DARRELL J., NEAL, DANIEL R., RAMMAGE, RON R., TURNER, WILLIAM T.
Priority to AU59100/00A priority patent/AU5910000A/en
Priority to JP2001508019A priority patent/JP4647867B2/en
Priority to DE60001280T priority patent/DE60001280T2/en
Priority to AT00945111T priority patent/ATE231609T1/en
Priority to EP00945111A priority patent/EP1192433B1/en
Priority to KR1020017016960A priority patent/KR100685574B1/en
Priority to PCT/US2000/018262 priority patent/WO2001002822A1/en
Priority to MYPI20003027A priority patent/MY128215A/en
Publication of US6184974B1 publication Critical patent/US6184974B1/en
Application granted granted Critical
Assigned to BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT reassignment BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT INTELLECTUAL PROPERTY SECURITY AGREEMENT Assignors: WAVEFRONT SCIENCES, INC.
Assigned to AMO WAVEFRONT SCIENCES, LLC reassignment AMO WAVEFRONT SCIENCES, LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: WAVEFRONT SCIENCES, INC.
Assigned to AMO WAVEFRONT SCIENCES, LLC; FORMERLY WAVEFRONT SCIENCES, INC. reassignment AMO WAVEFRONT SCIENCES, LLC; FORMERLY WAVEFRONT SCIENCES, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: BANK OF AMERICA, N.A. AS ADMINISTRATIVE AGENT
Anticipated expiration legal-status Critical
Assigned to AMO DEVELOPMENT, LLC reassignment AMO DEVELOPMENT, LLC MERGER (SEE DOCUMENT FOR DETAILS). Assignors: AMO WAVEFRONT SCIENCES, LLC
Expired - Lifetime legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J9/00 Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 Photometry, e.g. photographic exposure meter
    • G01J1/02 Details
    • G01J1/04 Optical or mechanical part supplementary adjustable parts
    • G01J1/0407 Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings
    • G01J1/0422 Optical elements not provided otherwise, e.g. manifolds, windows, holograms, gratings using light concentrators, collectors or condensers

Definitions

  • the present invention is directed to an apparatus and method for evaluating an object, particularly an object larger than an aperture of a sensor.
  • the use of wavefront sensors, including Shack-Hartmann wavefront sensors, is a known technique for measuring the wavefront of light.
  • the features of a surface such as a wafer, an optic, etc., may be measured by reflecting light from the surface and directing it to the wavefront sensor.
  • Wavefront sensors determine wavefront error through slope measurement.
  • a plurality of lenslets arranged in an array are used to sample the wavefront.
  • Each lenslet creates a corresponding sub-aperture.
  • the resulting array of spots, which may be interpreted as a physical realization of an optical ray trace, is focused onto a detector.
  • the position of a given focal spot is dependent upon the average wavefront slope over the sub-aperture.
  • the direction of propagation, or wavefront slope, of each of the samples is determined by estimating the focal spot position shift for each lenslet.
  • the wavefront may then be reconstructed from the detected image in a number of known manners.
  • the resolution and sensitivity of the sensor are determined by the lenslet array.
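As a concrete sketch of the slope measurement described above (array sizes and the pixel-unit focal length are illustrative assumptions, not values from the patent), one lenslet's focal spot can be located by an intensity-weighted centroid and its displacement converted to a wavefront slope:

```python
import numpy as np

def spot_centroid(sub_image):
    """Intensity-weighted centroid (x, y) of one sub-aperture image, in pixels."""
    total = sub_image.sum()
    ys, xs = np.indices(sub_image.shape)
    return (xs * sub_image).sum() / total, (ys * sub_image).sum() / total

def wavefront_slope(sub_image, ref_xy, focal_length_px):
    """Average wavefront slope over one sub-aperture.

    The focal spot's displacement from its calibrated reference position,
    divided by the lenslet focal length (both in pixels here), gives the
    local slopes dW/dx and dW/dy.
    """
    cx, cy = spot_centroid(sub_image)
    return (cx - ref_xy[0]) / focal_length_px, (cy - ref_xy[1]) / focal_length_px
```

A spot displaced two pixels from its reference, with a 100-pixel focal length, yields a slope of 0.02; larger displacements per unit focal length mean steeper local wavefront tilt.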
  • There are several applications of the Shack-Hartmann wavefront sensor. Several of these applications have been extensively developed, with specific devices developed for adaptive optics, measurement of pulsed lasers and laser beam quality, ocular adaptive optics and measurement, and a wide variety of metrology applications. For some applications, the Shack-Hartmann sensor is advantageously applied, since it is relatively insensitive to vibration, independent of source light wavelength, and can be arranged in a simple, compact and robust assembly. A summary of uses of Shack-Hartmann wavefront sensors is set forth in D. R. Neal et al. “Wavefront Sensors for Control and Process Monitoring in Optics Manufacture,” Lasers as Tools for Manufacturing II, SPIE Volume 2993 (1997).
  • An example of such a metrology application is the measurement of a silicon wafer.
  • the key result is the determination of surface defects that affect the fabrication of small features on the silicon wafer.
  • the minimum feature size for microelectronic circuits has steadily decreased since their inception. Where 0.35 µm features are currently the norm, the next generation of circuits will need 0.18 µm or even 0.13 µm. Fabrication of these small features requires the detection (and elimination) of ever smaller size defects. At the same time, the wafer size is getting larger.
  • the current generation of 200 mm wafers is rapidly being supplanted by the 300 mm wafer, with 450 mm wafers planned for the near future. The need for ever better resolution, combined with larger wafers places extremely difficult demands upon the metrology tools.
  • the present invention is therefore directed to a method and apparatus for evaluating the surface of an object which is larger than an aperture of the sensor which substantially overcomes one or more of the problems due to the limitations and disadvantages of the related art.
  • a very good reference flat may be obtained, and hence an extremely accurate measurement may be made.
  • Off the shelf cameras and lenslet array technology may be employed. A number of adjacent and overlapping regions are measured using this technique over the whole surface of interest. In order to measure a large area, in accordance with the present invention, these regions are then “stitched” together with an appropriate algorithm that may take advantage of the slope information to provide a characterization of the whole surface.
  • the term “stitching” means assembling a wavefront from the derivatives of wavefronts in the overlapping regions. In this way, high resolution yet large area measurements may be made without the need for extremely large optics or detectors.
  • the method is scalable to any size that may be measured with appropriate translation devices.
  • At least one of these and other objects may be realized by providing a method for reconstructing a wavefront from a target having a plurality of subregions, including illuminating a subregion, delivering light from the subregion to a lenslet array, detecting positions of focal spots from the lenslet array, determining a wavefront for the subregion from the detected focal spot positions, repeating the preceding steps until all subregions have been measured, and stitching the wavefronts together, thereby reconstructing the wavefront from the target.
  • the target may ideally be a flat surface.
  • the method may include calibrating using a reference surface.
  • the repeating may include moving the object and a system providing said illuminating, delivering and detecting relative to one another.
  • the moving may include moving by an integral number of lenslets.
  • the moving may result in a 10-50% overlap of adjacent measurements.
  • the moving may include moving in a single direction orthogonal to a first direction for complete measurement of the subregions.
  • the stitching may include setting a wavefront in an overlap region having more than one wavefront from the determining associated therewith equal to an average of the wavefronts for the overlap region.
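A minimal sketch of this averaging step, for two one-dimensional wavefront segments (the 1-D layout and piston-removal step are illustrative assumptions):

```python
import numpy as np

def stitch_average(w1, w2, overlap):
    """Join two 1-D wavefront segments whose last/first `overlap` samples
    cover the same physical region of the surface.

    The piston (constant offset) between the two measurements is removed
    by matching means over the overlap, and the overlap itself is set to
    the average of the two estimates, as described above.
    """
    w2 = w2 - (w2[:overlap].mean() - w1[-overlap:].mean())
    joined_overlap = 0.5 * (w1[-overlap:] + w2[:overlap])
    return np.concatenate([w1[:-overlap], joined_overlap, w2[overlap:]])
```

Because each wavefront is only known up to a constant, the offset between neighboring measurements must be resolved before the overlap values can be meaningfully averaged.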
  • the target may be one of a wafer, a flat panel display, and a large optic.
  • the light delivered from the object may be reflected by or transmitted from the object.
  • the illuminating of a subregion may occur only once for each subregion. Of course, multiple illuminations may be employed for increased accuracy.
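The measure-move-stitch loop described in the preceding steps can be simulated end to end; the function names and the 1-D surface below are hypothetical stand-ins for the hardware operations:

```python
import numpy as np

def split_into_subregions(surface, width, overlap):
    """Split a 1-D surface profile into overlapping subregion views,
    mimicking the repeated illuminate/measure/translate steps above."""
    step = width - overlap
    return [surface[i:i + width] for i in range(0, len(surface) - width + 1, step)]

def stitch(subregions, overlap):
    """Reassemble the measured subregions, averaging each overlap seam
    after removing the piston offset between neighbors."""
    full = np.array(subregions[0], dtype=float)
    for w in subregions[1:]:
        w = np.asarray(w, float) - (w[:overlap].mean() - full[-overlap:].mean())
        seam = 0.5 * (full[-overlap:] + w[:overlap])
        full = np.concatenate([full[:-overlap], seam, w[overlap:]])
    return full
```

With noiseless measurements the stitched profile reproduces the portion of the surface covered by the scan exactly; with real data, the overlap averaging suppresses seam artifacts.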
  • At least one of these and other objects may be realized by providing a metrology system for analyzing an object larger than an aperture of the system including a light source, a wavefront sensor, an optical system for delivering light from the light source onto a portion of the object being measured and for delivering light from the object to the wavefront sensor, a translator for adjusting a relative position of the object and the system, and a processor for stitching together wavefronts measured by the wavefront sensor for different portions of the object measured at positions provided by the translator.
  • the system may include a reference surface for calibrating the optical system.
  • the translator may include a translation stage on which the object is mounted.
  • the reference surface for calibrating the optical system may be mounted on the translation stage.
  • the wavefront sensor may be a linear wavefront sensor extending along an entire dimension of the object.
  • the translator may adjust the relative position only in one dimension.
  • the system may include a position sensor which measures a position of the light from the object in the optical system.
  • the system may include a translatable surface which directs light from the object to the wavefront sensor which is controlled in accordance with the position measured by the position sensor.
  • the object may be one of a wafer, a flat panel display, a large optic, and other surfaces.
  • the optical system may deliver light reflected and/or transmitted by the object to the wavefront sensor.
  • the translator may adjust the relative position after the optical system delivers light to the portion of the object being measured only once.
  • FIG. 1 a is a schematic diagram of the metrology system of the present invention.
  • FIG. 1 b is a schematic diagram of another configuration of the metrology system of the present invention.
  • FIG. 1 c is a schematic diagram of still another configuration of the metrology system of the present invention.
  • FIG. 1 d is a schematic diagram of yet another configuration of the metrology system of the present invention.
  • FIG. 1 e is a schematic diagram of yet another configuration of the metrology system of the present invention.
  • FIG. 2 is a schematic diagram of a Shack-Hartmann sensor.
  • FIG. 3 illustrates the stitching of the wavefronts in accordance with the present invention.
  • FIG. 4 is a schematic diagram of a linear Shack-Hartmann sensor.
  • FIG. 5 is a schematic diagram illustrating the operation of the linear Shack-Hartmann sensor of FIG. 4 .
  • A schematic diagram of the metrology system of the present invention in use with an object, e.g., a wafer, is shown in FIG. 1 a .
  • a light source 10 supplies light via an optical fiber 12 to a collimating lens 14 .
  • the light source 10 is preferably a broad band source, i.e., a source having low temporal coherence, so that the effect of any cross talk between lenslets of the wavefront sensor is minimized, but can be any source of light, including a laser.
  • the light source 10 is also preferably pulsed, to reduce the sensitivity of the system to vibrations.
  • the apparent source image size on the focal plane of the wavefront sensor should be arranged to provide adequate sampling and separation of the focal spots, consistent with the desired dynamic range.
  • the collimated light is delivered to a beam splitter 16 , which directs the light onto optics 18 which image the light onto a surface 20 of the object to be evaluated.
  • the surface to be evaluated 20 may be provided on a chuck 22 to minimize any bow or warp.
  • the chuck 22 is in turn mounted on a translation stage 24 .
  • the light reflected by the surface 20 is re-imaged by the optics 18 and passes through the beam splitter 16 to a wavefront sensor 26 , preferably a Shack-Hartmann wavefront sensor, including a lenslet array.
  • the optics 18 are designed such that all lenslets of the lenslet array of the wavefront sensor 26 are filled.
  • the surface 20 and the wavefront sensor 26 are preferably positioned at conjugate image planes, so no diffraction effects should be present.
  • a zoom lens 28 for magnifying the image of the surface 20 on the sensor 26 is provided. Increasing the magnification of the image increases the sensitivity of the system.
  • the sensor outputs image information to a processor 30 , which may be internal to the sensor, external to the sensor or remote from the sensor.
  • the processor 30 processes the sensor data to evaluate the desired profile, e.g. flatness, of the object.
  • FIGS. 1 b - 1 e illustrate alternative embodiments of the system. As can be seen in these configurations, the relative placement of the elements is not critical as long as the desired optical paths between the light and the surface and between the reflections from the surface and the detector are maintained.
  • In FIG. 1 b , a compact configuration is illustrated.
  • Two thin wedges 34 , 35 are provided which are positioned in orthogonal planes.
  • the first thin wedge 34 introduces astigmatism in a first direction of the beam, and the second thin wedge 35 adds the same amount of astigmatism in a second direction, orthogonal to the first direction.
  • the astigmatism may be simply compensated for by altering the distance at which the beam impinges on the detector 26 .
  • FIG. 1 c illustrates that the light delivery portion and the detecting portion do not have to be parallel to one another. Indeed, these portions do not have to be in the same plane.
  • a prism or wedge 34 may be used as the beam splitter for directing the light from the light source 10 to the surface 20 and from the surface 20 to the sensor 26 . This allows separation and filtering of secondary reflections by an aperture 37 .
  • a steerable mirror 31 may be provided, controlled by a position detector 33 , to ensure that light is being properly directed to the sensor 26 .
  • a prism or wedge 35 is used to split off a portion of the light returned from the surface 20 to the position detector 33 and is arranged to compensate for aberrations introduced by the beamsplitter prism 34 .
  • any appropriate beam splitter arrangement may be employed.
  • the mirror 31 may then be adjusted until the position detector 33 indicates the beam is in the center of the optical system.
  • This position detecting is particularly important when using an optical system 36 employing an aperture stop 37 .
  • when an aperture stop 37 is used, if the beam is off center of the optical axis of the system, the sensor 26 will not receive an accurate signal.
  • This position sensing including an adjustable mirror may be used in conjunction with any of the configurations noted above.
  • the elements for directing the light to the surface 20 and to the sensor 26 may be eliminated.
  • the desired directing is realized by positioning the light delivery system and the sensor 26 at an oblique angle to the surface 20 .
  • a reference surface 32 is also mounted on the translation stage 24 . Since only a portion of the target surface is imaged at a time in accordance with the present invention, this reference surface may be readily constructed. Indeed, the amount of the target surface to be measured at a time by the wavefront sensor is in part determined by the largest available reference flat having an acceptable accuracy. Very high accuracy, i.e., better than 1/200th of a wavelength, reference surfaces are currently achievable at up to three inches in diameter and are available from, e.g., REO, Burleigh and Zygo Inc. By measuring the reference surface using the system, the system may be calibrated.
  • This calibration allows errors in the optical system to be subtracted from any subsequent measurements, thereby reducing the quality requirements on the optics.
  • the system may be re-calibrated as often as desired.
  • the relative motion of the system and the target may be continuous or discontinuous.
  • FIGS. 1 a - 1 e only a portion of the surface 20 is imaged onto the sensor by the metrology system.
  • the metrology system and the surface 20 are moved relative to one another using the translation stage 24 , with an image being taken at each position, possibly with a slight overlap.
  • These multiple images are then stitched together to form the full image, as set forth below.
  • a number of algorithms may be used to perform this stitching, including a least square fit and simulated annealing, etc.
  • a key feature of the present invention is the use of the direct slope information from adjoining overlapping fields. Previously, separate images of the surface would be adjusted until their edges lined up.
  • any resulting difference at the edges of the images is due to stage tilt.
  • the stitching of the present invention does not rely on the assumption that the stage is perfect. Such an assumption, required by previous methods, can lead to erroneous results, both in accepting surfaces that merely appear flat and in rejecting surfaces that are actually acceptable.
  • FIG. 2 is a schematic diagram of the basic elements of a two-dimensional embodiment of a Shack-Hartmann wavefront sensor for use as the wavefront sensor 26 .
  • a portion of an incoming wavefront 40 from the surface 20 is incident upon a two-dimensional lenslet array 42 .
  • the lenslet array 42 dissects the incoming wavefront 40 into a number of small samples.
  • the smaller the lenslet, the higher the spatial resolution of the sensor.
  • However, the diffraction-limited spot size of a small lenslet limits the focal length which may be used, which in turn leads to lower sensitivity.
  • these two parameters must be balanced in accordance with the desired measurement performance. Extremely low-noise cameras, preferably with at least 12-16 bit dynamic range, are now available, which aid in increasing the sensitivity of the overall wavefront sensor and allow this balance to be achieved.
  • Each sample forms a focal spot 44 on a detector 46 .
  • the detector 46 for example, is a low noise, high pixel count charge coupled device (CCD) camera, e.g., SMD-2K manufactured by Silicon Mountain Designs.
  • the processor 30 performs centroid estimation to determine positions of the focal spots. A position of a focal spot depends upon the average wavefront over the sample. Thus, the direction of propagation of each of these samples is determined by the location of the focal spot on the detector 46 .
  • the processor 30 compares the focal spot positions against a reference set of positions. This reference is established during calibration of the system.
  • the processor 30 then divides the difference between the measured focal spot and the reference position by the focal length of the lenslet to convert the difference into a wavefront slope.
  • the processor 30 then integrates the wavefront slope in two dimensions to form the wavefront of the beam for the portion of the object being measured.
  • the processor 30 determines any deviations of the wavefront from the calibration wavefront to assess the flatness of the test object.
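The slope-to-wavefront step of this pipeline can be sketched as follows. The simple path integration here stands in for the reconstruction method, which the patent leaves unspecified (least-squares zonal or modal reconstructors are common in practice), and the grid spacing is an assumed parameter:

```python
import numpy as np

def reconstruct_wavefront(slopes_x, slopes_y, spacing=1.0):
    """Integrate x- and y-slope maps into a wavefront surface.

    Path integration: accumulate the x-slopes along the first row, then
    the y-slopes down every column. The result is defined up to a
    constant (piston), taken as zero at the top-left sample.
    """
    ny, nx = slopes_x.shape
    w = np.zeros((ny, nx))
    w[0, 1:] = np.cumsum(slopes_x[0, :-1]) * spacing            # first row
    w[1:, :] = w[0, :] + np.cumsum(slopes_y[:-1, :], axis=0) * spacing
    return w
```

For a uniformly tilted wavefront (constant slopes), the reconstruction is an exact plane, which is also the case used to verify a reconstructor against calibration data.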
  • the lenslet array 42 of FIG. 2 actually is a two-dimensional array, having a plurality of lenses in two directions, in accordance with the dimensions of the reference flat.
  • any convenient method may be used for creating a plurality of focal spots on a detector. This may include using a lenslet array, an array of holes, a grating array, a prism array, etc.
  • a linear Shack-Hartmann sensor as shown in FIG. 4, may be provided for which the sensor may be either longer than the longest dimension of the part of interest, or a portion of the part may be measured and stitched as described below. Large parts may be measured at an oblique angle, as shown in FIG. 1 e , to allow a small diameter sensor to measure a large diameter part. Reference data may still be obtained using a small reference flat.
  • a one dimensional (1-D) wavefront sensor for measuring the wavefront along a single line is disclosed in U.S. Pat. No. 5,493,391 to Neal et al., which is hereby incorporated by reference in its entirety for all purposes.
  • This type of measurement scheme has advantages in the bandwidth of the measurement because fewer camera pixels must be acquired for the same measurement.
  • the sensor bandwidth scales as R/N where R is the camera pixel rate (pixels/sec) and N is the number of pixels (or pixel clocks) across one line of the sensor.
  • For a conventional two-dimensional sensor, the bandwidth instead scales as R/N², which, for the same pixel rate R, can greatly decrease the effective bandwidth of the system.
  • the 1D bandwidth is 19.5 kHz, while the 2D bandwidth is 38 Hz.
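The quoted figures are consistent with, for example, a 512-pixel line read out at 10 Mpixels/s; these specific values are an inference from the stated bandwidths, not given in the text:

```python
# Bandwidth scaling of a wavefront sensor readout. The pixel rate and
# line width below are assumptions chosen to reproduce the figures
# quoted in the text; they are not stated in the patent.
R = 10e6   # camera pixel rate, pixels/s (assumed)
N = 512    # pixels across one line of the sensor (assumed)

bw_1d = R / N       # one line read out per measurement
bw_2d = R / N**2    # a full N x N frame read out per measurement

print(f"1-D bandwidth: {bw_1d / 1e3:.1f} kHz")   # prints 19.5 kHz
print(f"2-D bandwidth: {bw_2d:.1f} Hz")          # prints 38.1 Hz
```

The factor-of-N penalty for reading a full frame is what motivates the linear sensor described next.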
  • the sensor disclosed in this patent only measures the x-derivatives, and can only provide some information about y- or cross derivatives through inference.
  • FIG. 4 illustrates a linear wavefront sensor 60 that has most of the speed advantages of the previous 1-D sensor while allowing measurement of both x- and y- derivatives.
  • the linear wavefront sensor 60 includes a lenslet array 62 having a series of individual spherical (or near-spherical) lenses 66 arranged in a line with their edges touching and a detector 64 .
  • the f/number of the individual lenses should be fairly large. This provides maximum sensitivity for the measurement.
  • the f/number of the lenses is chosen to create the optimum spot size that is appropriate for the specific detector 64 .
  • the linear wavefront sensor 60 may be realized by using a two dimensional CCD which is clocked out to maximize the measurement bandwidth as the detector 64 . This can be accomplished by providing extra horizontal clock pulses to the CCD camera or a vertical synch pulse. An area of interest is defined on a conventional 2-D CCD and clock pulses are sent to reset the frame after the first few lines of data are read out. The spot positions may then be obtained using a centroid algorithm, a matched filter algorithm, fast Fourier phase shift algorithm, or any other appropriate algorithm.
  • One advantage of this technique is that, as long as the camera control electronics and the frame grabber used for data acquisition are both controlled by software in the same computer, it is possible to extend the dynamic range by tracking the spots as they move.
  • An alternative to using electronic control to realize the linear wavefront sensor 60 is shown in FIG. 5.
  • a three-line CCD 68 serves as the detector 64 .
  • Three-line CCD detectors were originally developed to provide color operation of line-scan cameras.
  • the three-line CCD 68 is modified by leaving off the color filters.
  • the three lines provide some sensing in the vertical direction, as well as in the horizontal, as illustrated schematically in FIG. 5, in which the focal spots 69 from the lenslet array 62 are shown over the various lines of the detector 68 .
  • a spot can be located through its centroid in both axes. This provides the necessary information for measuring both the x- and the y-derivative.
  • the drawback of this scheme is that the dynamic range in the y-direction (normal to the detector array line) is reduced because only three measurements are made in this direction.
  • the spot shape can be adjusted by varying the aspect ratio of the lenslet design to compensate for this problem.
  • the high data rate of such a system can be used to compensate for any loss of dynamic range in this direction by acquiring more data.
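A sketch of the centroid computation for a spot straddling the three detector lines (the 3-row intensity layout is an illustrative assumption about how the readout is arranged):

```python
import numpy as np

def three_line_centroid(lines):
    """Centroid (x, y) of one focal spot on a three-line detector.

    `lines` is a (3, n) intensity array: the three detector rows under a
    single lenslet. The x-centroid uses the full row length, while the
    y-centroid is estimated from only three samples, which is why the
    dynamic range normal to the array line is limited.
    """
    lines = np.asarray(lines, dtype=float)
    total = lines.sum()
    cx = (lines.sum(axis=0) * np.arange(lines.shape[1])).sum() / total
    cy = (lines.sum(axis=1) * np.arange(3)).sum() / total
    return cx, cy
```

A spot centered on the middle line gives cy = 1.0; intensity shifted toward the top line pulls cy below 1, which is the coarse y-derivative signal described above.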
  • the linear wavefront sensor 60 of the present invention is similar in some respects to other one-dimensional wavefront sensor implementations, e.g., U.S. Pat. No. 4,227,091 to Sick entitled “Optical Monitoring Apparatus.” However, the linear wavefront sensor 60 of the present invention is significantly simpler than other concepts that can measure both derivatives.
  • the electronic control of the present invention described above uses off-the-shelf components and relies on software, camera control electronics and frame grabber to acquire area of interest information.
  • the physical three line configuration of the present invention described above has a restricted dynamic range, but is easy to implement.
  • the linear wavefront sensor 60 of the present invention is particularly useful when it is impractical to mount and/or translate the object.
  • the wavefront sensor 26 (or linear wavefront sensor 60 ) is moved relative to the object being measured to provide wavefront slopes which are then stitched together to form an overall wavefront.
  • the term “stitching” means reconstructing the wavefront using the derivatives of the wavefront in overlap regions. While only one dimension is shown in FIG. 3, it is to be understood that this translation may occur along both dimensions of the surface of the object. Further, while only a one-dimensional array is shown, it is to be understood that a two-dimensional array may be employed in an analogous manner.
  • the lenslet array 42 ( 62 ) is in a first position relative to the full incoming wavefront 48 . Only a portion of the full incoming wavefront 48 is imaged by the lenslet array 42 ( 62 ). In the middle illustration, the lenslet array 42 ( 62 ) has been shifted relative to the incoming wavefront 48 to a second position. An overlap region 50 between the first position and the second position is stitched together by forcing the average slopes of the overlapped region 50 to match. This match may be achieved by heuristic convergence and may result in a discontinuity in the first derivative, as opposed to previous methods which forced the first derivative to be continuous. This discontinuity will represent translation stage errors that may then be simply subtracted to leave only information about the surface.
  • the lenslet array 42 ( 62 ) has again been shifted relative to the incoming wavefront 48 to a third position.
  • An overlap region 52 between the second and third position is stitched together by forcing the average slopes of the overlap region 52 to match, as set forth above. After all the shifting and stitching has been completed, a full image of the wavefront is obtained.
  • an error minimization algorithm may be used.
  • each wavefront image is taken so that a region on the edges of the image overlaps with the previous image.
  • the average slope in the x- and y-direction in each overlapped region of each wavefront image is then calculated. Since the tip and tilt of the test object and the wavefront sensor relative to one another could change slightly during translation between acquisition of wavefront images, the tip and tilt of each of the wavefront images is adjusted to match up the tip and tilt of the overlapped region of the images.
  • an error function may be defined as the sum of the absolute value of the difference of the tip and tilt in the overlapped region.
  • an iterative search algorithm may be employed, e.g., simulated annealing. The above algorithms are only one example of compensating for any tip or tilt error between adjacent images.
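One way to realize the tip/tilt matching described above is sketched below for a single 1-D overlap, where the mismatch can be removed directly rather than by an iterative search such as simulated annealing; the least-squares line fit used to estimate the overlap tilt is an implementation choice, not specified in the text:

```python
import numpy as np

def match_tilt(w_prev, w_next, overlap, spacing=1.0):
    """Adjust tilt and piston of the 1-D segment `w_next` so that its
    overlap with `w_prev` agrees in average slope and value.

    A least-squares line fit measures the tilt of each overlap; the
    difference is removed from `w_next`, driving the tilt-mismatch error
    function to zero for this single-overlap 1-D case.
    """
    a, b = w_prev[-overlap:], w_next[:overlap]
    x = np.arange(overlap) * spacing
    tilt_a = np.polyfit(x, a, 1)[0]
    tilt_b = np.polyfit(x, b, 1)[0]
    w_adj = w_next + (tilt_a - tilt_b) * (np.arange(len(w_next)) * spacing)
    w_adj = w_adj + (a.mean() - w_adj[:overlap].mean())   # remove piston
    return w_adj
```

Here the tilt correction absorbs the stage error accumulated between acquisitions, leaving only surface information in the stitched result.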
  • the wavefront sensor is translated in one direction at a time by an integer multiple of lenslets. While typically the translation is of the same amount for each shift, the final shift required for imaging the entire object in the direction of the shift may only be far enough to image the edge of the object.
  • the resulting overlap is on the order of 10%-50% of the aperture of the lenslet array.
  • the size of the overlap region may be adjusted to provide a desired stitching accuracy. For a one-dimensional sensor, the overlap may actually be an even larger percentage, since data may be acquired at higher rates than for a two-dimensional sensor.
  • the wavefront sensing of the present invention collects all data needed for a specific region in a single frame.
  • the wavefront sensing of the present invention allows continuous scanning, rather than translating the system and stopping for a series of measurements. This further increases the speed with which the analysis can be performed. This is a significant advantage over the interferometry techniques, which must stop and measure 3-6 frames for each position.
  • high resolution measurements of slopes of a wavefront of sub-apertures may be stitched together to form a high resolution measurement of an entire aperture.
  • a reference corresponding to the area of the sub-aperture may be measured to provide calibration of the system. While the above description is related to measuring wafer flatness, the metrology system of the present invention may be used to provide high resolution wavefront measurements over areas which are larger than the aperture of the system for many different objects or optical systems, and to measure profiles which are not flat.

Abstract

A Shack-Hartmann wavefront sensor having an aperture which is smaller than the size of an object being measured is used to measure the wavefront for the entire object. The wavefront sensor and the object are translated relative to one another to measure the wavefronts at a plurality of subregions of the object. The measured wavefronts are then stitched together to form a wavefront of the object. The subregions may overlap in at least one dimension. A reference surface may be provided to calibrate the wavefront sensor.

Description

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention is directed to an apparatus and method for evaluating an object, particularly an object larger than an aperture of a sensor.
2. Description of Related Art
The use of wavefront sensors, including Shack-Hartmann wavefront sensors, is a known technique for measuring the wavefront of light. The features of a surface, such as a wafer, an optic, etc., may be measured by reflecting light from the surface and directing it to the wavefront sensor. Wavefront sensors determine wavefront error through slope measurement.
In a Shack-Hartmann test, a plurality of lenslets arranged in an array are used to sample the wavefront. Each lenslet creates a corresponding sub-aperture. The resulting array of spots, which may be interpreted as a physical realization of an optical ray trace, is focused onto a detector. The position of a given focal spot is dependent upon the average wavefront slope over the sub-aperture. The direction of propagation, or wavefront slope, of each of the samples is determined by estimating the focal spot position shift for each lenslet. The wavefront may then be reconstructed from the detected image in a number of known manners. The resolution and sensitivity of the sensor are determined by the lenslet array.
There are several applications of the Shack-Hartmann wavefront sensor. Several of these applications have been extensively developed, with specific devices developed for adaptive optics, measurement of pulsed lasers and laser beam quality, ocular adaptive optics and measurement, and a wide variety of metrology applications. For some applications, the Shack-Hartmann sensor is advantageously applied, since it is relatively insensitive to vibration, independent of source light wavelength, and can be arranged in a simple, compact and robust assembly. A summary of uses of Shack-Hartmann wavefront sensors is set forth in D. R. Neal et al. “Wavefront Sensors for Control and Process Monitoring in Optics Manufacture,” Lasers as Tools for Manufacturing II, SPIE Volume 2993 (1997).
However, there are a number of metrology applications where the size of the target is a limiting factor in the application of wavefront or other metrology technology. Examples include large mirrors or optics, commercial glass, flat-panel displays and silicon wafers. While some previous methods have been developed, e.g., U.S. Pat. No. 5,563,709 to Poultney, which is hereby incorporated by reference in its entirety for all purposes, these suffer from a loss of spatial resolution when applied to large elements and from difficulties in size and calibration.
An example of such a metrology application is the measurement of a silicon wafer. In such a measurement, the key result is the determination of surface defects that affect the fabrication of small features on the silicon wafer. The minimum feature size for microelectronic circuits has steadily decreased since their inception. Where 0.35 μm features are currently the norm, the next generation of circuits will need 0.18 μm or even 0.13 μm. Fabrication of these small features requires the detection (and elimination) of ever smaller defects. At the same time, the wafer size is getting larger. The current generation of 200 mm wafers is rapidly being supplanted by the 300 mm wafer, with 450 mm wafers planned for the near future. The need for ever better resolution, combined with larger wafers, places extremely difficult demands upon the metrology tools.
The current generation of metrology methods is clearly not scalable to the needs of these new processes. Such scaling to larger sizes requires extremely large optics with their associated high cost, large footprint and difficulty of fabrication. Furthermore, the required resolution cannot reasonably be obtained with such methods. The Shack-Hartmann method requires at least four pixels per lenslet. Thus, the resolution over a given aperture is limited. Scaling to larger areas with methods such as disclosed in Poultney requires the use of cameras with an extremely large number of pixels. While the interferometry methods may be applied to larger areas with less loss in resolution, modern practical methods require the acquisition of 4-6 frames of data. This leads to difficulties in automated inspection in a clean-room environment because of vibration and to throughput reduction when analyzing a large object.
Other applications may be even more stressing than the wafer analysis discussed above. While silicon wafers may be scaled to 300 mm or even 450 mm, flat panel displays are currently being fabricated at 1500×600 mm. Scaling of existing metrology tools for single aperture measurement is clearly impractical. Automotive or commercial glass is manufactured in even larger areas, with 4 m wide segments not uncommon. Clearly an alternative technique is needed.
As the feature size to be analyzed decreases, the size of tolerable distortions decreases, and high resolution measurements must be made to ensure sufficient surface flatness. This high resolution requirement is incompatible with making measurements over a large area. Further, the calibration of a system for measuring flatness over a large area in a single measurement requires a reference of similar dimensions, which is difficult to produce.
While some solutions, such as those set forth in U.S. Pat. No. 4,689,491 to Lindow et al., U.S. Pat. No. 4,730,927 to Ototake et al. and U.S. Pat. No. 5,293,216 to Moslehi disclose point by point analysis of surfaces, the analyzing disclosed in these patents is very time consuming.
SUMMARY OF THE INVENTION
The present invention is therefore directed to a method and apparatus for evaluating the surface of an object which is larger than an aperture of the sensor which substantially overcomes one or more of the problems due to the limitations and disadvantages of the related art.
It is therefore an object of the present invention to combine the advantages of the Shack-Hartmann sensor (namely insensitivity to vibration, direct measurement of surface slope, and invariance with wavelength) when applied to measure a small area with a large area measurement. For a small area, a very good reference flat may be obtained, and hence an extremely accurate measurement may be made. Off the shelf cameras and lenslet array technology may be employed. A number of adjacent and overlapping regions are measured using this technique over the whole surface of interest. In order to measure a large area, in accordance with the present invention, these regions are then “stitched” together with an appropriate algorithm that may take advantage of the slope information to provide a characterization of the whole surface. As used herein, the term “stitching” means assembling a wavefront from the derivatives of wavefronts in the overlapping regions. In this way high resolution, yet large area measurements may be made without the need for extremely large optics or detectors. The method is scalable to any size that may be measured with appropriate translation devices.
At least one of these and other objects may be realized by providing a method for reconstructing a wavefront from a target having a plurality of subregions including illuminating a subregion, delivering light from the subregion to a lenslet array, detecting positions of focal spots from the lenslet array, determining a wavefront from the subregion from detected focal spot positions, repeating the preceding steps until all subregions have been measured, and stitching together wavefronts, thereby reconstructing the wavefront from the target.
The target may ideally be a flat surface. The method may include calibrating using a reference surface. The repeating may include moving the object and a system providing said illuminating, delivering and detecting relative to one another. The moving may include moving by an integral number of lenslets. The moving may result in a 10-50% overlap of adjacent measurements. When each subregion extends along an entire first direction of the target, the moving may include moving in a single direction orthogonal to the first direction for complete measurement of the subregions. The stitching may include setting a wavefront in an overlap region having more than one wavefront from the determining associated therewith equal to an average of the wavefronts for the overlap region. The target may be one of a wafer, a flat panel display, and a large optic. The light delivered from the object may be reflected by or transmitted from the object. The illuminating of a subregion may occur only once for each subregion. Of course, multiple illuminations may be employed for increased accuracy.
At least one of these and other objects may be realized by providing a metrology system for analyzing an object larger than an aperture of the system including a light source, a wavefront sensor, an optical system for delivering light from the light source onto a portion of the object being measured and for delivering light from the object to the wavefront sensor, a translator for adjusting a relative position of the object and the system, and a processor for stitching together wavefronts measured by the wavefront sensor for different portions of the object measured at positions provided by the translator.
The system may include a reference surface for calibrating the optical system. The translator may include a translation stage on which the object is mounted. The reference surface for calibrating the optical system may be mounted on the translation stage. The wavefront sensor may be a linear wavefront sensor extending along an entire dimension of the object. The translator may adjust the relative position only in one dimension. The system may include a position sensor which measures a position of the light from the object in the optical system. The system may include a translatable surface which directs light from the object to the wavefront sensor which is controlled in accordance with the position measured by the position sensor. The object may be one of a wafer, a flat panel display, a large optic, and other surfaces. The optical system may deliver light reflected and/or transmitted by the object to the wavefront sensor. The translator may adjust the relative position after the optical system delivers light to the portion of the object being measured only once.
These and other objects of the present invention will become more readily apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, aspects and advantages will be described with reference to the drawings, in which:
FIG. 1a is a schematic diagram of the metrology system of the present invention;
FIG. 1b is a schematic diagram of another configuration of the metrology system of the present invention;
FIG. 1c is a schematic diagram of still another configuration of the metrology system of the present invention;
FIG. 1d is a schematic diagram of yet another configuration of the metrology system of the present invention;
FIG. 1e is a schematic diagram of yet another configuration of the metrology system of the present invention;
FIG. 2 is a schematic diagram of a Shack-Hartmann sensor;
FIG. 3 illustrates the stitching of the wavefronts in accordance with the present invention;
FIG. 4 is a schematic diagram of a linear Shack-Hartmann sensor; and
FIG. 5 is a schematic diagram illustrating the operation of the linear Shack-Hartmann sensor of FIG. 4.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
While the present invention is described herein with reference to illustrative embodiments for particular applications, it should be understood that the present invention is not limited thereto. Those having ordinary skill in the art and access to the teachings provided herein will recognize additional modifications, applications, and embodiments within the scope thereof and additional fields in which the invention would be of significant utility without undue experimentation.
A schematic diagram of the metrology system of the present invention in use with an object, e.g., a wafer, is shown in FIG. 1a. A light source 10 supplies light via an optical fiber 12 to a collimating lens 14. The light source 10 is preferably a broad band source, i.e., a source having low temporal coherence, so that the effect of any cross talk between lenslets of the wavefront sensor is minimized, but can be any source of light, including a laser. The light source 10 is also preferably pulsed, to reduce the sensitivity of the system to vibrations. The apparent source image size on the focal plane of the wavefront sensor should be arranged to provide adequate sampling and separation of the focal spots consistent with the desired dynamic range. The collimated light is delivered to a beam splitter 16, which directs the light onto optics 18 which image the light onto a surface 20 of the object to be evaluated. The surface to be evaluated 20 may be provided on a chuck 22 to minimize any bow or warp. The chuck 22 is in turn mounted on a translation stage 24.
The light reflected by the surface 20 is re-imaged by the optics 18 and passes through the beam splitter 16 to a wavefront sensor 26, preferably a Shack-Hartmann wavefront sensor, including a lenslet array. Preferably, the optics 18 are designed such that all lenslets of the lenslet array of the wavefront sensor 26 are filled. The surface 20 and the wavefront sensor 26 are preferably positioned at conjugate image planes, so no diffraction effects should be present. Advantageously, a zoom lens 28 for magnifying the image of the surface 20 on the sensor 26 is provided. Increasing the magnification of the image increases the sensitivity of the system. The sensor outputs image information to a processor 30, which may be internal to the sensor, external to the sensor or remote from the sensor. The processor 30 processes the sensor data to evaluate the desired profile, e.g. flatness, of the object. There are a number of algorithms that can be used to process the information and construct an overall surface map from the individual measurements.
FIGS. 1b-1e illustrate alternative embodiments of the system. As can be seen in these configurations, the relative placement of the elements is not critical as long as the desired optical paths between the light and the surface and between the reflections from the surface and the detector are maintained.
In FIG. 1b, a compact configuration is illustrated. Two thin wedges 34, 35 are provided which are positioned in orthogonal planes. Thus, while the first thin wedge 34 introduces astigmatism in a first direction of the beam, the second thin wedge 35 will add the same amount of astigmatism in a second direction, orthogonal to the first direction, of the beam. Thus, the astigmatism may be simply compensated for by altering the distance at which the beam impinges on the detector 26.
FIG. 1c illustrates that the light delivery portion and the detecting portion do not have to be parallel to one another. Indeed, these portions do not have to be in the same plane.
As shown in FIG. 1d, a prism or wedge 34 may be used as the beam splitter for directing the light from the light source 12 to the surface 20 and from the surface 20 to the sensor 26. This allows separation and filtering of secondary reflections by an aperture 37. Also shown in FIG. 1d is a steerable mirror 31 which may be controlled by a position detector 33 to ensure that light is being properly directed to the sensor 26. A prism or wedge 35 is used to split off a portion of the light returned from the surface 20 to the position detector 33 and is arranged to compensate for aberrations introduced by the beamsplitter prism 34. Of course, any appropriate beam splitter arrangement may be employed. The mirror 31 may then be adjusted until the position detector 33 indicates the beam is in the center of the optical system. This position detecting is particularly important when using an optical system 36 employing an aperture stop 37. When an aperture stop 37 is used, if the beam is off center of the optical axis of the system, the sensor 26 will not receive an accurate signal. This position sensing including an adjustable mirror may be used in conjunction with any of the configurations noted above.
As shown in FIG. 1e, the elements for directing the light to the surface 20 and to the sensor 26 may be eliminated. In this embodiment, the desired directing is realized by positioning the light delivery system and the sensor 26 at an oblique angle to the surface 20.
Advantageously, as shown in FIGS. 1a and 1c-1e, a reference surface 32 is also mounted on the translation stage 24. Since only a portion of the target surface is imaged at a time in accordance with the present invention, this reference surface may be readily constructed. Indeed, the amount of the target surface to be measured at a time by the wavefront sensor is in part determined by the largest available reference flat having an acceptable accuracy. Very high accuracy, i.e., better than 1/200th of a wavelength, reference surfaces are currently achievable at up to three inches in diameter and are available from, e.g., REO, Burleigh and Zygo Inc. By measuring the reference surface using the system, the system may be calibrated. This calibration allows errors in the optical system to be subtracted from any subsequent measurements, thereby reducing the quality requirements on the optics. By mounting the reference surface on the translation stage, the system may be re-calibrated as often as desired. The relative motion of the system and the target may be continuous or discontinuous.
As can be seen in FIGS. 1a-1e, only a portion of the surface 20 is imaged onto the sensor by the metrology system. In order to obtain a full image of the surface 20, the metrology system and the surface 20 are moved relative to one another using the translation stage 24, with an image being taken at each position, possibly with a slight overlap. These multiple images are then stitched together so as to form the full image as set forth below. A number of algorithms may be used to perform this stitching, including a least square fit and simulated annealing, etc. A key feature of the present invention is the use of the direct slope information from adjoining overlapping fields. Previously, any separate images of the surface would be adjusted until the edges line up. In contrast, by preserving the direct slope information, any resulting difference at the edges of the images is due to stage tilt. Thus, the stitching of the present invention does not rely on the assumption that the stage is perfect. Such an assumption, required by previous methods, can lead to erroneous results, both in accepting surfaces which merely appear flat and in rejecting surfaces that are acceptable.
Two-Dimensional Wavefront Sensor
FIG. 2 is a schematic diagram of the basic elements of a two-dimensional embodiment of a Shack-Hartmann wavefront sensor for use as the wavefront sensor 26. A portion of an incoming wavefront 40 from the surface 20 is incident upon a two-dimensional lenslet array 42. The lenslet array 42 dissects the incoming wavefront 40 into a number of small samples. The smaller the lenslet, the higher the spatial resolution of the sensor. However, the spot size from a small lenslet, due to diffraction effects, limits the focal length which may be used, which in turn leads to lower sensitivity. Thus, these two parameters must be balanced in accordance with desired measurement performance. Extremely low-noise cameras, preferably with at least 12-16 bit dynamic range, are now available, which aid in increasing the sensitivity of the overall wavefront sensor and allow this balance to be achieved.
Each sample forms a focal spot 44 on a detector 46. The detector 46, for example, is a low noise, high pixel count charge coupled device (CCD) camera, e.g., SMD-2K manufactured by Silicon Mountain Designs. The processor 30 performs centroid estimation to determine positions of the focal spots. A position of a focal spot depends upon the average wavefront over the sample. Thus, the direction of propagation of each of these samples is determined by the location of the focal spot on the detector 46. The processor 30 compares the focal spot positions against a reference set of positions. This reference is established during calibration of the system. The processor 30 then divides the difference between the measured focal spot and the reference position by the focal length of the lenslet to convert the difference into a wavefront slope. The processor 30 then integrates the wavefront slope in two dimensions to form the wavefront of the beam for the portion of the object being measured. The processor 30 then determines any deviations of the wavefront from the calibration wavefront to assess the flatness of the test object.
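The processing chain just described — centroid estimation, conversion of spot displacement to slope, and integration of the slopes — can be sketched as below. This is a minimal illustration under stated assumptions, not the patented implementation: the window size, pixel pitch, and simple row-wise cumulative integration are all choices made here for brevity.

```python
import numpy as np

def spot_centroid(image, cx, cy, win):
    """Centroid of one focal spot inside a window around the nominal
    lenslet position (cx, cy), in pixel coordinates."""
    sub = image[cy - win:cy + win, cx - win:cx + win]
    ys, xs = np.mgrid[cy - win:cy + win, cx - win:cx + win]
    total = sub.sum()
    return (xs * sub).sum() / total, (ys * sub).sum() / total

def wavefront_slopes(centroids, reference, pixel_pitch, focal_length):
    """Spot displacement from the calibrated reference position, scaled to
    meters and divided by the lenslet focal length, gives the average
    wavefront slope over each sub-aperture (radians)."""
    return (np.asarray(centroids) - np.asarray(reference)) * pixel_pitch / focal_length

def integrate_rows(slopes_x, lenslet_pitch):
    """Crude reconstruction: cumulative integration of x-slopes along each
    row. Practical reconstructors use a 2-D least-squares fit of both the
    x- and y-slopes instead."""
    return np.cumsum(slopes_x, axis=1) * lenslet_pitch
```

A full reconstructor would also fit and subtract the calibration wavefront, as described above, before assessing flatness.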
While the above illustration in FIG. 2 shows only a single line of lenses, it is to be understood that the lenslet array 42 of FIG. 2 actually is a two-dimensional array, having a plurality of lenses in two directions, in accordance with the dimensions of the reference flat. Further, any convenient method may be used for creating a plurality of focal spots on a detector. This may include using a lenslet array, an array of holes, a grating array, a prism array, etc.
One-Dimensional Wavefront Sensor
Alternatively, a linear Shack-Hartmann sensor, as shown in FIG. 4, may be provided for which the sensor may be either longer than the longest dimension of the part of interest, or a portion of the part may be measured and stitched as described below. Large parts may be measured at an oblique angle, as shown in FIG. 1e, to allow a small diameter sensor to measure a large diameter part. Reference data may still be obtained using a small reference flat.
A one dimensional (1-D) wavefront sensor for measuring the wavefront along a single line is disclosed in U.S. Pat. No. 5,493,391 to Neal et al., which is hereby incorporated by reference in its entirety for all purposes. This type of measurement scheme has advantages in the bandwidth of the measurement because fewer camera pixels must be acquired for the same measurement. For a 1-D sensor, the sensor bandwidth scales as R/N where R is the camera pixel rate (pixels/sec) and N is the number of pixels (or pixel clocks) across one line of the sensor. For a 2D sensor, this scales as R/N², which, for the same pixel rate R, can greatly decrease the effective bandwidth of the system. For example, given that R is 10 MHz, and N is 512 pixels, the 1D bandwidth is 19.5 kHz, while the 2D bandwidth is 38 Hz. This is a great advantage for inspection of moving systems, flow, turbulence, or other dynamic systems. However, the sensor disclosed in this patent only measures the x-derivatives, and can only provide some information about y- or cross derivatives through inference.
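The bandwidth figures quoted above follow directly from the scaling argument and can be checked numerically (the function name here is mine, not the patent's):

```python
def sensor_bandwidth_hz(pixel_rate_hz, n_pixels, two_d=False):
    # A 1-D sensor reads one line of N pixels per measurement: R / N.
    # A 2-D sensor reads an N x N frame per measurement: R / N**2.
    return pixel_rate_hz / (n_pixels ** 2 if two_d else n_pixels)

# R = 10 MHz, N = 512 pixels, as in the example in the text
bw_1d = sensor_bandwidth_hz(10e6, 512)              # ~19.5 kHz
bw_2d = sensor_bandwidth_hz(10e6, 512, two_d=True)  # ~38 Hz
```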
For the present invention, measurements of both the x- and y-derivatives are needed. FIG. 4 illustrates a linear wavefront sensor 60 that has most of the speed advantages of the previous 1-D sensor while allowing measurement of both x- and y-derivatives.
The linear wavefront sensor 60 includes a lenslet array 62 having a series of individual spherical (or near-spherical) lenses 66 arranged in a line with their edges touching and a detector 64. For optimum use with the present invention, the f/number of the individual lenses should be fairly large. This provides maximum sensitivity for the measurement. The f/number of the lenses is chosen to create the optimum spot size that is appropriate for the specific detector 64.
The linear wavefront sensor 60 may be realized by using a two dimensional CCD which is clocked out to maximize the measurement bandwidth as the detector 64. This can be accomplished by providing extra horizontal clock pulses to the CCD camera or a vertical synch pulse. An area of interest is defined on a conventional 2-D CCD and clock pulses are sent to reset the frame after the first few lines of data are read out. The spot positions may then be obtained using a centroid algorithm, a matched filter algorithm, fast Fourier phase shift algorithm, or any other appropriate algorithm. One advantage of this technique is that, as long as the camera control electronics and the frame grabber used for data acquisition are both controlled by software in the same computer, it is possible to extend the dynamic range by tracking the spots as they move.
An alternative to using electronic control to realize the linear wavefront sensor 60 is shown in FIG. 5. A three-line CCD 68 serves as the detector 64. Three-line CCD detectors were originally developed to provide color operation of line-scan cameras. For use with the present invention, the three-line CCD 68 is modified by leaving off the color filters. The three lines provide some sensing in the vertical direction, as well as in the horizontal, as illustrated schematically in FIG. 5, in which the focal spots 69 from the lenslet array 62 are shown over the various lines of the detector 68. Thus, a spot can be located through its centroid in both axes. This provides the necessary information for measuring both the x- and the y-derivative. The drawback of this scheme is that the dynamic range in the y-direction (normal to the detector array line) is reduced because only three measurements are made in this direction. However, the spot shape can be adjusted by varying the aspect ratio of the lenslet design to compensate for this problem. In addition, the high data rate of such a system can be used to compensate for any loss of dynamic range in this direction by acquiring more data.
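A centroid computation for the three-line geometry just described might look like the following sketch. The window size and array shapes are assumptions made here for illustration; a practical implementation would add background subtraction and thresholding before centroiding.

```python
import numpy as np

def three_line_centroid(lines, x0, win):
    """Locate one focal spot on a three-line detector.
    `lines` is a (3, N) intensity array. The x-centroid has full sub-pixel
    resolution along the array, while the y-centroid is coarse because only
    three samples exist in that direction, as noted in the text."""
    sub = lines[:, x0 - win:x0 + win]
    total = sub.sum()
    xs = np.arange(x0 - win, x0 + win)
    ys = np.arange(3)
    cx = (sub.sum(axis=0) * xs).sum() / total  # fine position along the line
    cy = (sub.sum(axis=1) * ys).sum() / total  # coarse position across the 3 lines
    return cx, cy
```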
The linear wavefront sensor 60 of the present invention is similar in some respects to other one-dimensional wavefront sensor implementations, e.g., U.S. Pat. No. 4,227,091 to Sick entitled “Optical Monitoring Apparatus.” However, the linear wavefront sensor 60 of the present invention is significantly simpler than other concepts that can measure both derivatives. The electronic control of the present invention described above uses off-the-shelf components and relies on software, camera control electronics and frame grabber to acquire area of interest information. The physical three line configuration of the present invention described above has a restricted dynamic range, but is easy to implement. The linear wavefront sensor 60 of the present invention is particularly useful when it is impractical to mount and/or translate the object.
Wavefront Reconstruction
Once data is acquired using either embodiment of the wavefront sensor, the data must be processed to form the wavefront for the entire object. As shown in FIG. 3, the wavefront sensor 26 (or linear wavefront sensor 60) is moved relative to the object being measured to provide wavefront slopes which are then stitched together to form an overall wavefront. In accordance with the present invention, the term “stitching” means reconstructing the wavefront using the derivatives of the wavefront in overlap regions. While only one dimension is shown in FIG. 3, it is to be understood that this translation may occur along both dimensions of the surface of the object. Further, while only a one-dimensional array is shown, it is to be understood that a two-dimensional array may be employed in an analogous manner.
In the top illustration, the lenslet array 42 (62) is in a first position relative to the full incoming wavefront 48. Only a portion of the full incoming wavefront 48 is imaged by the lenslet array 42 (62). In the middle illustration, the lenslet array 42 (62) has been shifted relative to the incoming wavefront 48 to a second position. An overlap region 50 between the first position and the second position is stitched together by forcing the average slopes of the overlapped region 50 to match. This match may be achieved by heuristic convergence and may result in a discontinuity in the first derivative, as opposed to previous methods which forced the first derivative to be continuous. This discontinuity will represent translation stage errors that may then be simply subtracted to leave only information about the surface.
In the bottom illustration, the lenslet array 42 (62) has again been shifted relative to the incoming wavefront 48 to a third position. An overlap region 52 between the second and third position is stitched together by forcing the average slopes of the overlap region 52 to match, as set forth above. After all the shifting and stitching has been completed, a full image of the wavefront 48 is obtained.
To process the collected data, i.e., to combine the individual wavefront images, an error minimization algorithm may be used. During data acquisition, each wavefront image is taken so that a region on the edge of the image overlaps with the previous image. The average slope in the x- and y-direction in each overlapped region of each wavefront image is then calculated. Since the tip and tilt of the test object and the wavefront sensor relative to one another could change slightly during translation between acquisition of wavefront images, the tip and tilt of each of the wavefront images is adjusted to match up the tip and tilt of the overlapped region of the images. To accomplish this matching, an error function may be defined as the sum of the absolute value of the difference of the tip and tilt in the overlapped region. To minimize this error function, an iterative search algorithm may be employed, e.g., simulated annealing. The above algorithm is only one example of compensating for any tip or tilt error between adjacent images.
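As one concrete, deliberately simplified variant of this stitching, the piston and tilt of the second sub-wavefront can be fit in closed form from the overlap difference rather than by an iterative search such as simulated annealing. Everything below is an illustrative sketch of that idea in one dimension of translation, not the patented algorithm itself:

```python
import numpy as np

def stitch_pair(w1, w2, overlap):
    """Join two sub-aperture wavefront maps (height values) whose last and
    first `overlap` columns cover the same region of the surface.  Piston
    and x-tilt of w2 are adjusted so the two overlap estimates agree in the
    least-squares sense, then the overlap is averaged."""
    a = w1[:, -overlap:]          # overlap as seen by the first measurement
    b = w2[:, :overlap]           # overlap as seen by the second measurement
    x = np.arange(overlap)
    # Fit piston + tilt to the column-averaged height difference.
    diff = (a - b).mean(axis=0)
    tilt, piston = np.polyfit(x, diff, 1)
    cols = np.arange(w2.shape[1])
    w2c = w2 + piston + tilt * cols
    # Average the two estimates in the overlap; concatenate the rest.
    blended = 0.5 * (a + w2c[:, :overlap])
    return np.hstack([w1[:, :-overlap], blended, w2c[:, overlap:]])
```

In this sketch the fitted piston and tilt play the role of the stage errors described above: they are removed from the second map before the overlap regions are merged.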
Preferably, the wavefront sensor is translated in one direction at a time by an integer multiple of lenslets. While typically the translation is of the same amount for each shift, the final shift required for imaging the entire object in the direction of the shift may only be far enough to image the edge of the object. Preferably, the resulting overlap is on the order of 10%-50% of the aperture of the lenslet array. The size of the overlap region may be adjusted to provide a desired stitching accuracy. For a one-dimensional sensor, the overlap may actually be an even larger percentage, since data may be acquired at higher rates than for a two-dimensional sensor.
When using the Shack-Hartmann wavefront sensor, which measures the slope of the wavefront, slight differences, e.g., on the order of nanometers, in the vertical direction, i.e., the separation between the wavefront sensor and the object being measured, will not affect the measurement. When other types of sensors, e.g., interferometric sensors, are employed, such a difference in the vertical direction could significantly affect the flatness measurement. Thus, while the translation stage must be able to accurately position the wavefront sensor and the object relative to one another so that a known region is overlapped, the stage does not have to be extremely precise in vertical runout. Further, while interferometry typically requires between 3-6 frames to gather enough data for the analysis, the wavefront sensing of the present invention collects all data needed for a specific region in a single frame. Finally, since only a single image is required and each image contains a relatively small amount of data, the wavefront sensing of the present invention allows continuous scanning, rather than translating the system and stopping for a series of measurements. This further increases the speed with which the analysis can be performed. This is a significant advantage over the interferometry techniques, which must stop and measure 3-6 frames for each position.
Thus, in accordance with the present invention, high resolution measurements of slopes of a wavefront of sub-apertures may be stitched together to form a high resolution measurement of an entire aperture. A reference corresponding to the area of the sub-aperture may be measured to provide calibration of the system. While the above description is related to measuring wafer flatness, the metrology system of the present invention may be used to provide high resolution wavefront measurements over areas which are larger than the aperture of the system for many different objects or optical systems, and to measure profiles which are not flat.
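One minimal way to realize the stitching described above, for two one-dimensional sub-aperture measurements, is to solve for the unknown constant of each segment from the overlap region (the least-squares offset is simply the mean difference there) and then average the overlapped samples. This sketch is illustrative; the function name is an assumption, not terminology from the patent:

```python
import numpy as np

def stitch_1d(w1, w2, overlap):
    """Stitch two wavefront segments reconstructed from measured slopes.
    Each segment is known only up to an arbitrary constant (piston), so the
    shared `overlap` samples fix the relative offset of the second segment."""
    # least-squares piston for w2: mean difference over the overlap region
    piston = np.mean(w1[-overlap:] - w2[:overlap])
    w2 = w2 + piston
    # average the two estimates in the overlap region, keep the rest as-is
    blended = 0.5 * (w1[-overlap:] + w2[:overlap])
    return np.concatenate([w1[:-overlap], blended, w2[overlap:]])
```

In two dimensions the same idea generalizes to a least-squares fit of piston (and, if needed, tilt) over each overlap region, which is one form of the error-minimizing stitching contemplated here.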
Although preferred embodiments of the present invention have been described in detail herein above, it should be clearly understood that many variations and/or modifications of the basic inventive concepts taught herein, which may appear to those skilled in the art, will still fall within the spirit and scope of the present invention as defined in the appended claims and their equivalents. For example, while all of the configurations have been illustrated with all elements in the same plane for ease of illustration, elements may be in different planes.

Claims (29)

What is claimed is:
1. A method for reconstructing a wavefront from a target having a plurality of subregions comprising:
a) illuminating a subregion;
b) delivering light from the subregion to a spot generator;
c) detecting positions of focal spots from the spot generator;
d) determining a wavefront from the subregion from detected focal spot positions;
e) repeating steps a)-d) until all subregions have been measured, said repeating including illuminating a subregion such that adjacent subregions have an overlapping region; and
f) stitching together wavefronts determined by step d) including using derivatives of the wavefront in the overlapping regions, thereby reconstructing the wavefront from the target.
2. The method of claim 1, wherein the target is ideally a flat surface.
3. The method of claim 1, further comprising calibrating the method by performing steps a)-d) on a reference surface.
4. The method of claim 1, wherein said repeating includes moving the object and a system providing said illuminating, delivering and detecting relative to one another.
5. The method of claim 4, wherein said spot generator is a lenslet array and said moving includes moving by an integral number of lenslets.
6. The method of claim 4, wherein said moving results in a 10-50% overlap of adjacent measurements.
7. The method of claim 4, wherein, when each subregion extends along an entire first direction of the target, said moving includes moving in a single direction orthogonal to the first direction for complete measurement of said subregions.
8. The method of claim 4, wherein said moving includes continuously moving the object and the system relative to one another.
9. The method of claim 1, wherein said stitching includes setting a wavefront in an overlap region having more than one wavefront from said determining associated therewith equal to an average of the wavefronts for the overlap region.
10. The method of claim 1, wherein the target is one of a wafer, a flat panel display, and a large optic.
11. The method of claim 1, wherein light delivered from the object is reflected by the object.
12. The method of claim 1 wherein light delivered from the object is transmitted by the object.
13. The method of claim 1, wherein said illuminating of a complete subregion occurs only once for each subregion.
14. The method of claim 1, wherein said illuminating includes illuminating the subregion with pulsed light.
15. The method of claim 1, wherein said stitching includes determining a wavefront value in an overlap region using an error minimizing algorithm.
16. A metrology system for analyzing an object larger than an aperture of the system comprising:
a light source;
a wavefront sensor;
an optical system for delivering light from the light source onto a portion of the object being measured and for delivering light from the object to the wavefront sensor;
a translator for adjusting a relative position of the object and the system, the translator adjusting the relative positions such that adjacent measurements have an overlap region; and
a processor for stitching together wavefronts measured by the wavefront sensor for different portions of the object measured at positions provided by the translator, including using derivatives of wavefronts in overlap regions.
17. The system according to claim 16, further comprising a reference surface for calibrating the optical system.
18. The system according to claim 16, wherein said translator includes a translation stage on which the object is mounted.
19. The system according to claim 18, further comprising a reference surface for calibrating the optical system, the reference surface being mounted on the translation stage.
20. The system according to claim 16, wherein said wavefront sensor is a linear wavefront sensor extending along an entire dimension of the object.
21. The system according to claim 16, wherein said translator adjusts the relative position only in one dimension.
22. The system according to claim 16, wherein said system further comprises a position sensor which measures a position of the light from the object in the optical system.
23. The system according to claim 22, wherein said system further comprises a translatable surface, controlled in accordance with the position measured by the position sensor, which directs light from the object to the wavefront sensor.
24. The system according to claim 16, wherein the object is one of a wafer, a flat panel display, and a large optic.
25. The system according to claim 16, wherein the optical system delivers light reflected by the object to the wavefront sensor.
26. The system according to claim 16, wherein the optical system delivers light transmitted by the object to the wavefront sensor.
27. The system according to claim 16, wherein the translator adjusts the relative position after the optical system delivers light to the portion of the object being measured only once.
28. The system according to claim 16, wherein the translator continuously adjusts the relative position.
29. The system according to claim 16, wherein the light source is pulsed.
US09/340,502 1999-07-01 1999-07-01 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor Expired - Lifetime US6184974B1 (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
US09/340,502 US6184974B1 (en) 1999-07-01 1999-07-01 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor
KR1020017016960A KR100685574B1 (en) 1999-07-01 2000-06-30 Apparatus and method for evaluating a large target relative to the measuring hole of the sensor
JP2001508019A JP4647867B2 (en) 1999-07-01 2000-06-30 Apparatus and method used to evaluate a target larger than the sensor measurement aperture
DE60001280T DE60001280T2 (en) 1999-07-01 2000-06-30 DEVICE AND METHOD FOR EVALUATING A TARGET BIGGER THAN THE MEASURING APERTURE OF A SENSOR
AT00945111T ATE231609T1 (en) 1999-07-01 2000-06-30 APPARATUS AND METHOD FOR EVALUATION OF A TARGET LARGER THAN THE MEASURING APERTURE OF A SENSOR
EP00945111A EP1192433B1 (en) 1999-07-01 2000-06-30 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor
AU59100/00A AU5910000A (en) 1999-07-01 2000-06-30 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor
PCT/US2000/018262 WO2001002822A1 (en) 1999-07-01 2000-06-30 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor
MYPI20003027A MY128215A (en) 1999-07-01 2000-07-03 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/340,502 US6184974B1 (en) 1999-07-01 1999-07-01 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor

Publications (1)

Publication Number Publication Date
US6184974B1 true US6184974B1 (en) 2001-02-06

Family

ID=23333644

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/340,502 Expired - Lifetime US6184974B1 (en) 1999-07-01 1999-07-01 Apparatus and method for evaluating a target larger than a measuring aperture of a sensor

Country Status (9)

Country Link
US (1) US6184974B1 (en)
EP (1) EP1192433B1 (en)
JP (1) JP4647867B2 (en)
KR (1) KR100685574B1 (en)
AT (1) ATE231609T1 (en)
AU (1) AU5910000A (en)
DE (1) DE60001280T2 (en)
MY (1) MY128215A (en)
WO (1) WO2001002822A1 (en)

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6376819B1 (en) * 1999-07-09 2002-04-23 Wavefront Sciences, Inc. Sub-lens spatial resolution Shack-Hartmann wavefront sensing
US20020159048A1 (en) * 2001-02-23 2002-10-31 Nikon Corporation Wavefront aberration measuring method and unit, exposure apparatus, device manufacturing method, and device
US6480267B2 (en) * 1999-12-28 2002-11-12 Kabushiki Kaisha Topcon Wavefront sensor, and lens meter and active optical reflecting telescope using the same
US6548797B1 (en) * 2000-10-20 2003-04-15 Nikon Corporation Apparatus and method for measuring a wavefront using a screen with apertures adjacent to a multi-lens array
US6577447B1 (en) 2000-10-20 2003-06-10 Nikon Corporation Multi-lens array of a wavefront sensor for reducing optical interference and method thereof
US20030137654A1 (en) * 2000-12-22 2003-07-24 Nikon Corporation Wavefront aberration measuring instrument, wavefront aberration measuring method, exposure apparautus, and method for manufacturing micro device
WO2003068057A2 (en) * 2002-02-11 2003-08-21 Visx, Inc. Method and device for calibrating an optical wavefront system
US6624896B1 (en) * 1999-10-18 2003-09-23 Wavefront Sciences, Inc. System and method for metrology of surface flatness and surface nanotopology of materials
US6630656B2 (en) 2000-07-14 2003-10-07 Adaptive Optics Associates, Inc. Method and apparatus for wavefront measurement that resolves the 2-π ambiguity in such measurement and adaptive optics systems utilizing same
US6714282B2 (en) 2000-12-25 2004-03-30 Nikon Corporation Position detecting method optical characteristic measuring method and unit, exposure apparatus, and device manufacturing method
US6724479B2 (en) 2001-09-28 2004-04-20 Infineon Technologies Ag Method for overlay metrology of low contrast features
US6724464B2 (en) 2000-12-27 2004-04-20 Nikon Corporation Position detecting method and unit, optical characteristic measuring method and unit, exposure apparatus, and device manufacturing method
US20040090606A1 (en) * 2001-02-06 2004-05-13 Nikon Corporation Exposure apparatus, exposure method, and device manufacturing method
US20040223214A1 (en) * 2003-05-09 2004-11-11 3M Innovative Properties Company Scanning laser microscope with wavefront sensor
US6819414B1 (en) 1998-05-19 2004-11-16 Nikon Corporation Aberration measuring apparatus, aberration measuring method, projection exposure apparatus having the same measuring apparatus, device manufacturing method using the same measuring method, and exposure method
US20040260275A1 (en) * 2003-04-09 2004-12-23 Visx, Incorporated Wavefront calibration analyzer and methods
US20040257530A1 (en) * 2003-06-20 2004-12-23 Visx, Inc. Wavefront reconstruction using fourier transformation and direct integration
US20050024585A1 (en) * 2003-06-20 2005-02-03 Visx, Incorporated Systems and methods for prediction of objective visual acuity based on wavefront measurements
US20050046865A1 (en) * 2003-08-28 2005-03-03 Brock Neal J. Pixelated phase-mask interferometer
US20050098707A1 (en) * 2000-07-14 2005-05-12 Metrologic Instruments, Inc. Method and apparatus for wavefront measurement that resolves the 2-pi ambiguity in such measurement and adaptive optics systems utilizing same
US20050131398A1 (en) * 2003-11-10 2005-06-16 Visx, Inc. Methods and devices for testing torsional alignment between a diagnostic device and a laser refractive system
US20060152710A1 (en) * 2003-06-23 2006-07-13 Bernhard Braunecker Optical inclinometer
US7088457B1 (en) * 2003-10-01 2006-08-08 University Of Central Florida Research Foundation, Inc. Iterative least-squares wavefront estimation for general pupil shapes
US7168807B2 (en) 2003-06-20 2007-01-30 Visx, Incorporated Iterative fourier reconstruction for laser surgery and other optical applications
US7187815B1 (en) * 2004-10-01 2007-03-06 Sandia Corporation Relaying an optical wavefront
US20070058132A1 (en) * 2005-09-02 2007-03-15 Visx, Incorporated Calculating Zernike coefficients from Fourier coefficients
US7268937B1 (en) 2005-05-27 2007-09-11 United States Of America As Represented By The Secretary Of The Air Force Holographic wavefront sensor
US20070222948A1 (en) * 2006-03-23 2007-09-27 Visx, Incorporated Systems and methods for wavefront reconstruction for aperture with arbitrary shape
US20070236702A1 (en) * 2006-04-07 2007-10-11 Neal Daniel R Geometric measurement system and method of measuring a geometric characteristic of an object
US20080073525A1 (en) * 2006-03-14 2008-03-27 Visx, Incorporated Spatial Frequency Wavefront Sensor System and Method
US20080100850A1 (en) * 2006-10-31 2008-05-01 Mitutoyo Corporation Surface height and focus sensor
US20080100829A1 (en) * 2006-10-31 2008-05-01 Mitutoyo Corporation Surface height and focus sensor
US20090152453A1 (en) * 2005-12-13 2009-06-18 Agency For Science, Technology And Research Optical wavefront sensor
US20090152440A1 (en) * 2007-11-16 2009-06-18 Mitutoyo Corporation Extended range focus detection apparatus
US20100274233A1 (en) * 1999-08-11 2010-10-28 Carl Zeiss Meditec Ag Method and device for performing online aberrometry in refractive eye correction
US20120062708A1 (en) * 2010-09-15 2012-03-15 Ascentia Imaging, Inc. Imaging, Fabrication and Measurement Systems and Methods
CN102435420A (en) * 2011-09-20 2012-05-02 浙江师范大学 Method for detecting intermediate frequency errors of optical element
US20120147377A1 (en) * 2009-06-24 2012-06-14 Koninklijke Philips Electronics N.V. Optical biosensor with focusing optics
US20130083245A1 (en) * 2011-09-30 2013-04-04 Stmicroelectronics, Inc. Compression error handling for temporal noise reduction
JP2013148427A (en) * 2012-01-18 2013-08-01 Canon Inc Method and apparatus for measuring wavefront inclination distribution
US8596787B2 (en) 2003-06-20 2013-12-03 Amo Manufacturing Usa, Llc Systems and methods for prediction of objective visual acuity based on wavefront measurements
CN104198164A (en) * 2014-09-19 2014-12-10 中国科学院光电技术研究所 Focus detection method based on principle of Hartman wavefront detection
US8911086B2 (en) 2002-12-06 2014-12-16 Amo Manufacturing Usa, Llc Compound modulation transfer function for laser surgery and other optical applications
DE102014220583A1 (en) 2013-10-11 2015-04-16 Mitutoyo Corporation SYSTEM AND METHOD FOR CONTROLLING A TRACKING AUTOFOK (TAF) SENSOR IN A MECHANICAL VISIBILITY INSPECTION SYSTEM
DE102015219495A1 (en) 2014-10-09 2016-04-14 Mitutoyo Corporation A method of programming a three-dimensional workpiece scan path for a metrology system
US9534884B2 (en) 2012-01-03 2017-01-03 Ascentia Imaging, Inc. Coded localization systems, methods and apparatus
US9739864B2 (en) 2012-01-03 2017-08-22 Ascentia Imaging, Inc. Optical guidance systems and methods using mutually distinct signal-modifying
CN107179605A (en) * 2017-07-04 2017-09-19 成都安的光电科技有限公司 Telescope focusing system and method
US10126114B2 (en) 2015-05-21 2018-11-13 Ascentia Imaging, Inc. Angular localization system, associated repositionable mechanical structure, and associated method
US10132925B2 (en) 2010-09-15 2018-11-20 Ascentia Imaging, Inc. Imaging, fabrication and measurement systems and methods
CN110987377A (en) * 2019-12-18 2020-04-10 中国空间技术研究院 Optical axis angle measuring method of space optical camera
CN114543695A (en) * 2022-02-08 2022-05-27 南京中安半导体设备有限责任公司 Hartmann measuring device and measuring method thereof and wafer geometric parameter measuring device

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007085788A (en) * 2005-09-20 2007-04-05 Nikon Corp Hartmann sensor
DE102006062600B4 (en) 2006-12-29 2023-12-21 Endress + Hauser Flowtec Ag Method for commissioning and/or monitoring an in-line measuring device
FR2920536B1 (en) * 2007-08-29 2010-03-12 Thales Sa DEVICE FOR MEASURING THE MODULATION TRANSFER FUNCTION OF LARGE DIMENSIONAL OPTIC INSTRUMENTS
JP5416025B2 (en) * 2010-04-22 2014-02-12 株式会社神戸製鋼所 Surface shape measuring device and semiconductor wafer inspection device
JP2013002819A (en) * 2011-06-10 2013-01-07 Horiba Ltd Flatness measuring device
US8593622B1 (en) * 2012-06-22 2013-11-26 Raytheon Company Serially addressed sub-pupil screen for in situ electro-optical sensor wavefront measurement
JP6030471B2 (en) * 2013-02-18 2016-11-24 株式会社神戸製鋼所 Shape measuring device
JP6448497B2 (en) * 2015-08-03 2019-01-09 三菱電機株式会社 Wavefront sensor, wavefront measuring method, and optical module positioning method
DE102016210966A1 (en) 2016-06-20 2017-12-21 Micro-Epsilon Optronic Gmbh Method and device for measuring a curved wavefront with at least one wavefront sensor
CN110207932A (en) * 2019-05-15 2019-09-06 中国科学院西安光学精密机械研究所 A kind of high-speed wind tunnel schlieren focal spot monitoring shock-dampening method and system
US11231375B2 (en) * 2019-05-20 2022-01-25 Wisconsin Alumni Research Foundation Apparatus for high-speed surface relief measurement

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4141652A (en) * 1977-11-25 1979-02-27 Adaptive Optics Associates, Inc. Sensor system for detecting wavefront distortion in a return beam of light
US4737621A (en) * 1985-12-06 1988-04-12 Adaptive Optics Assoc., Inc. Integrated adaptive optical wavefront sensing and compensating system
US5164578A (en) * 1990-12-14 1992-11-17 United Technologies Corporation Two-dimensional OCP wavefront sensor employing one-dimensional optical detection
US5233174A (en) * 1992-03-11 1993-08-03 Hughes Danbury Optical Systems, Inc. Wavefront sensor having a lenslet array as a null corrector
US5287165A (en) * 1991-09-30 1994-02-15 Kaman Aerospace Corporation High sensitivity-wide dynamic range optical tilt sensor
US5294971A (en) * 1990-02-07 1994-03-15 Leica Heerbrugg Ag Wave front sensor
US5333049A (en) * 1991-12-06 1994-07-26 Hughes Aircraft Company Apparatus and method for interferometrically measuring the thickness of thin films using full aperture irradiation
US5493391A (en) * 1994-07-11 1996-02-20 Sandia Corporation One dimensional wavefront distortion sensor comprising a lens array system
US5563709A (en) * 1994-09-13 1996-10-08 Integrated Process Equipment Corp. Apparatus for measuring, thinning and flattening silicon structures
US5629765A (en) * 1995-12-15 1997-05-13 Adaptive Optics Associates, Inc. Wavefront measuring system with integral geometric reference (IGR)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3413420C2 (en) * 1984-04-10 1986-06-26 Messerschmitt-Bölkow-Blohm GmbH, 8012 Ottobrunn Sensor for determining image defects
JPH0814484B2 (en) * 1985-04-09 1996-02-14 株式会社ニコン Pattern position measuring device
US4689491A (en) * 1985-04-19 1987-08-25 Datasonics Corp. Semiconductor wafer scanning system
JP2892075B2 (en) * 1990-01-31 1999-05-17 末三 中楯 Measuring method of refractive index distribution and transmitted wavefront and measuring device used for this method
DE4003699A1 (en) * 1990-02-07 1991-08-22 Wild Heerbrugg Ag METHOD AND ARRANGEMENT FOR TESTING OPTICAL COMPONENTS OR SYSTEMS
US5293216A (en) * 1990-12-31 1994-03-08 Texas Instruments Incorporated Sensor for semiconductor device manufacturing process control
JPH0884104A (en) * 1994-09-09 1996-03-26 Toshiba Corp Radio communication equipment
JP3405132B2 (en) * 1997-07-02 2003-05-12 三菱電機株式会社 Wavefront measurement method for coherent light source


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Adaptive Optics: Theory and Application; Glenn A. Tyler; Apr. 15, 1998; pp. 1 and 15.

Cited By (107)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6819414B1 (en) 1998-05-19 2004-11-16 Nikon Corporation Aberration measuring apparatus, aberration measuring method, projection exposure apparatus having the same measuring apparatus, device manufacturing method using the same measuring method, and exposure method
US6376819B1 (en) * 1999-07-09 2002-04-23 Wavefront Sciences, Inc. Sub-lens spatial resolution Shack-Hartmann wavefront sensing
US8356897B2 (en) 1999-08-11 2013-01-22 Carl Zeiss Meditec Ag Method and device for performing online aberrometry in refractive eye correction
US20100274233A1 (en) * 1999-08-11 2010-10-28 Carl Zeiss Meditec Ag Method and device for performing online aberrometry in refractive eye correction
US8029136B2 (en) 1999-08-11 2011-10-04 Carl Zeiss Meditec Ag Method and device for performing online aberrometry in refractive eye correction
US6624896B1 (en) * 1999-10-18 2003-09-23 Wavefront Sciences, Inc. System and method for metrology of surface flatness and surface nanotopology of materials
US6480267B2 (en) * 1999-12-28 2002-11-12 Kabushiki Kaisha Topcon Wavefront sensor, and lens meter and active optical reflecting telescope using the same
US6630656B2 (en) 2000-07-14 2003-10-07 Adaptive Optics Associates, Inc. Method and apparatus for wavefront measurement that resolves the 2-π ambiguity in such measurement and adaptive optics systems utilizing same
US7161128B2 (en) 2000-07-14 2007-01-09 Adaptive Optics Associates, Inc. Optical instrument employing a wavefront sensor capable of coarse and fine phase measurement capabilities during first and second modes of operation
US20050098707A1 (en) * 2000-07-14 2005-05-12 Metrologic Instruments, Inc. Method and apparatus for wavefront measurement that resolves the 2-pi ambiguity in such measurement and adaptive optics systems utilizing same
US6649895B1 (en) 2000-07-14 2003-11-18 Adaptive Optics Associates, Inc. Dispersed Hartmann sensor and method for mirror segment alignment and phasing
US6577447B1 (en) 2000-10-20 2003-06-10 Nikon Corporation Multi-lens array of a wavefront sensor for reducing optical interference and method thereof
US6548797B1 (en) * 2000-10-20 2003-04-15 Nikon Corporation Apparatus and method for measuring a wavefront using a screen with apertures adjacent to a multi-lens array
US20030137654A1 (en) * 2000-12-22 2003-07-24 Nikon Corporation Wavefront aberration measuring instrument, wavefront aberration measuring method, exposure apparautus, and method for manufacturing micro device
US6975387B2 (en) 2000-12-22 2005-12-13 Nikon Corporation Wavefront aberration measuring instrument, wavefront aberration measuring method, exposure apparatus, and method for manufacturing micro device
US6714282B2 (en) 2000-12-25 2004-03-30 Nikon Corporation Position detecting method optical characteristic measuring method and unit, exposure apparatus, and device manufacturing method
US6724464B2 (en) 2000-12-27 2004-04-20 Nikon Corporation Position detecting method and unit, optical characteristic measuring method and unit, exposure apparatus, and device manufacturing method
US20040090606A1 (en) * 2001-02-06 2004-05-13 Nikon Corporation Exposure apparatus, exposure method, and device manufacturing method
US6914665B2 (en) 2001-02-06 2005-07-05 Nikon Corporation Exposure apparatus, exposure method, and device manufacturing method
US20020159048A1 (en) * 2001-02-23 2002-10-31 Nikon Corporation Wavefront aberration measuring method and unit, exposure apparatus, device manufacturing method, and device
US6724479B2 (en) 2001-09-28 2004-04-20 Infineon Technologies Ag Method for overlay metrology of low contrast features
US20030169402A1 (en) * 2002-02-11 2003-09-11 Visx, Inc. Method and device for calibrating an optical wavefront system
WO2003068057A3 (en) * 2002-02-11 2003-12-04 Visx Inc Method and device for calibrating an optical wavefront system
US7213919B2 (en) * 2002-02-11 2007-05-08 Visx, Incorporated Method and device for calibrating an optical wavefront system
WO2003068057A2 (en) * 2002-02-11 2003-08-21 Visx, Inc. Method and device for calibrating an optical wavefront system
US8911086B2 (en) 2002-12-06 2014-12-16 Amo Manufacturing Usa, Llc Compound modulation transfer function for laser surgery and other optical applications
US7355695B2 (en) 2003-04-09 2008-04-08 Amo Manufacturing Usa, Llc Wavefront calibration analyzer and methods
US20040260275A1 (en) * 2003-04-09 2004-12-23 Visx, Incorporated Wavefront calibration analyzer and methods
US20040223214A1 (en) * 2003-05-09 2004-11-11 3M Innovative Properties Company Scanning laser microscope with wavefront sensor
US7057806B2 (en) 2003-05-09 2006-06-06 3M Innovative Properties Company Scanning laser microscope with wavefront sensor
US7338165B2 (en) 2003-06-20 2008-03-04 Visx, Incorporated Systems and methods for prediction of objective visual acuity based on wavefront measurements
US7175278B2 (en) 2003-06-20 2007-02-13 Visx, Inc. Wavefront reconstruction using fourier transformation and direct integration
US7699470B2 (en) 2003-06-20 2010-04-20 Amo Manufacturing Usa, Llc. Systems and methods for prediction of objective visual acuity based on wavefront measurements
US20100103376A1 (en) * 2003-06-20 2010-04-29 Amo Manufacturing Usa, Llc Systems and methods for prediction of objective visual acuity based on wavefront measurements
US7168807B2 (en) 2003-06-20 2007-01-30 Visx, Incorporated Iterative fourier reconstruction for laser surgery and other optical applications
US20100179793A1 (en) * 2003-06-20 2010-07-15 AMO Manufacturing USA., LLC Iterative fourier reconstruction for laser surgery and other optical applications
US7731363B2 (en) 2003-06-20 2010-06-08 Amo Manufacturing Usa, Llc. Iterative fourier reconstruction for laser surgery and other optical applications
US7997731B2 (en) 2003-06-20 2011-08-16 Amo Manufacturing Usa Llc Systems and methods for prediction of objective visual acuity based on wavefront measurements
US8596787B2 (en) 2003-06-20 2013-12-03 Amo Manufacturing Usa, Llc Systems and methods for prediction of objective visual acuity based on wavefront measurements
US20080212031A1 (en) * 2003-06-20 2008-09-04 Amo Manufacturing Usa, Llc Iterative fourier reconstruction for laser surgery and other optical applications
US8228586B2 (en) 2003-06-20 2012-07-24 Amo Manufacturing Usa, Llc. Iterative fourier reconstruction for laser surgery and other optical applications
US20040257530A1 (en) * 2003-06-20 2004-12-23 Visx, Inc. Wavefront reconstruction using fourier transformation and direct integration
US20050024585A1 (en) * 2003-06-20 2005-02-03 Visx, Incorporated Systems and methods for prediction of objective visual acuity based on wavefront measurements
US20090021694A1 (en) * 2003-06-20 2009-01-22 Amo Manufacturing Usa, Llc Systems and Methods for Prediction of Objective Visual Acuity Based on Wavefront Measurements
US20060152710A1 (en) * 2003-06-23 2006-07-13 Bernhard Braunecker Optical inclinometer
US7649621B2 (en) * 2003-06-23 2010-01-19 Leica Geosystems Ag Optical inclinometer
US20050046865A1 (en) * 2003-08-28 2005-03-03 Brock Neal J. Pixelated phase-mask interferometer
US7230717B2 (en) * 2003-08-28 2007-06-12 4D Technology Corporation Pixelated phase-mask interferometer
US7088457B1 (en) * 2003-10-01 2006-08-08 University Of Central Florida Research Foundation, Inc. Iterative least-squares wavefront estimation for general pupil shapes
US20050131398A1 (en) * 2003-11-10 2005-06-16 Visx, Inc. Methods and devices for testing torsional alignment between a diagnostic device and a laser refractive system
US7187815B1 (en) * 2004-10-01 2007-03-06 Sandia Corporation Relaying an optical wavefront
US7268937B1 (en) 2005-05-27 2007-09-11 United States Of America As Represented By The Secretary Of The Air Force Holographic wavefront sensor
US20080140329A1 (en) * 2005-09-02 2008-06-12 Visx, Incorporated Calculating Zernike Coefficients from Fourier Coefficients
US7748848B2 (en) 2005-09-02 2010-07-06 Amo Manufacturing Usa, Llc Calculating Zernike coefficients from Fourier coefficients
US7331674B2 (en) 2005-09-02 2008-02-19 Visx, Incorporated Calculating Zernike coefficients from Fourier coefficients
US20070058132A1 (en) * 2005-09-02 2007-03-15 Visx, Incorporated Calculating Zernike coefficients from Fourier coefficients
US20090152453A1 (en) * 2005-12-13 2009-06-18 Agency For Science, Technology And Research Optical wavefront sensor
US8158917B2 (en) 2005-12-13 2012-04-17 Agency For Science Technology And Research Optical wavefront sensor and optical wavefront sensing method
US8129666B2 (en) 2006-03-14 2012-03-06 Amo Manufacturing Usa, Llc. Optical surface shape determination by mapping a lenslet array spot pattern to spatial frequency space
US7652235B2 (en) 2006-03-14 2010-01-26 Amo Manufacturing Usa, Llc. Spatial frequency wavefront sensor system and method
US20100090090A1 (en) * 2006-03-14 2010-04-15 Amo Manufacturing Usa, Llc Spatial Frequency Wavefront Sensor System and Method
US20080073525A1 (en) * 2006-03-14 2008-03-27 Visx, Incorporated Spatial Frequency Wavefront Sensor System and Method
US8445825B2 (en) 2006-03-14 2013-05-21 Amo Manufacturing Usa, Llc. Optical surface shape determination by mapping a lenslet array spot pattern to a spatial frequency space
US20070222948A1 (en) * 2006-03-23 2007-09-27 Visx, Incorporated Systems and methods for wavefront reconstruction for aperture with arbitrary shape
US7931371B2 (en) 2006-03-23 2011-04-26 Amo Manufacturing Usa, Llc. Systems and methods for wavefront reconstruction for aperture with arbitrary shape
US20100238407A1 (en) * 2006-03-23 2010-09-23 Amo Manufacturing Usa, Llc Systems and methods for wavefront reconstruction for aperture with arbitrary shape
US7780294B2 (en) 2006-03-23 2010-08-24 Amo Manufacturing Usa, Llc. Systems and methods for wavefront reconstruction for aperture with arbitrary shape
US20070236702A1 (en) * 2006-04-07 2007-10-11 Neal Daniel R Geometric measurement system and method of measuring a geometric characteristic of an object
EP2008058A2 (en) * 2006-04-07 2008-12-31 Advanced Medical Optics, Inc. Geometric measurement system and method of measuring a geometric characteristic of an object
US7623251B2 (en) 2006-04-07 2009-11-24 Amo Wavefront Sciences, Llc. Geometric measurement system and method of measuring a geometric characteristic of an object
US20070236703A1 (en) * 2006-04-07 2007-10-11 Neal Daniel R Geometric measurement system and method of measuring a geometric characteristic of an object
US20100277694A1 (en) * 2006-04-07 2010-11-04 Amo Wavefront Sciences, Llc. Geometric measurement system and method of measuring a geometric characteristic of an object
EP2008058A4 (en) * 2006-04-07 2011-02-16 Abbott Medical Optics Inc Geometric measurement system and method of measuring a geometric characteristic of an object
US20070236701A1 (en) * 2006-04-07 2007-10-11 Neal Daniel R Geometric measurement system and method of measuring a geometric characteristic of an object
US7969585B2 (en) 2006-04-07 2011-06-28 AMO Wavefront Sciences LLC. Geometric measurement system and method of measuring a geometric characteristic of an object
US7583389B2 (en) 2006-04-07 2009-09-01 Amo Wavefront Sciences, Llc Geometric measurement system and method of measuring a geometric characteristic of an object
US7616330B2 (en) 2006-04-07 2009-11-10 AMO Wavefront Sciences, LLP Geometric measurement system and method of measuring a geometric characteristic of an object
US7728961B2 (en) 2006-10-31 2010-06-01 Mitutoyo Corporation Surface height and focus sensor
US20080100850A1 (en) * 2006-10-31 2008-05-01 Mitutoyo Corporation Surface height and focus sensor
US20080100829A1 (en) * 2006-10-31 2008-05-01 Mitutoyo Corporation Surface height and focus sensor
EP1972886A2 (en) 2007-03-21 2008-09-24 Mitutoyo Corporation Method and apparatus for detecting a location of a workpiece surface using a surface height focus sensor means
US20090152440A1 (en) * 2007-11-16 2009-06-18 Mitutoyo Corporation Extended range focus detection apparatus
US7723657B2 (en) 2007-11-16 2010-05-25 Mitutoyo Corporation Focus detection apparatus having extended detection range
EP2202480A2 (en) 2008-12-29 2010-06-30 Mitutoyo Corporation Extended range focus detection apparatus
US20120147377A1 (en) * 2009-06-24 2012-06-14 Koninklijke Philips Electronics N.V. Optical biosensor with focusing optics
US8670123B2 (en) * 2009-06-24 2014-03-11 Koninklijke Philips N.V. Optical biosensor with focusing optics
US9212899B2 (en) * 2010-09-15 2015-12-15 Ascentia Imaging, Inc. Imaging, fabrication and measurement systems and methods
US10132925B2 (en) 2010-09-15 2018-11-20 Ascentia Imaging, Inc. Imaging, fabrication and measurement systems and methods
US20120062708A1 (en) * 2010-09-15 2012-03-15 Ascentia Imaging, Inc. Imaging, Fabrication and Measurement Systems and Methods
CN102435420A (en) * 2011-09-20 2012-05-02 浙江师范大学 Method for detecting intermediate frequency errors of optical element
US8774549B2 (en) * 2011-09-30 2014-07-08 Stmicroelectronics, Inc. Compression error handling for temporal noise reduction
US20130083245A1 (en) * 2011-09-30 2013-04-04 Stmicroelectronics, Inc. Compression error handling for temporal noise reduction
US10024651B2 (en) 2012-01-03 2018-07-17 Ascentia Imaging, Inc. Coded localization systems, methods and apparatus
US11499816B2 (en) 2012-01-03 2022-11-15 Ascentia Imaging, Inc. Coded localization systems, methods and apparatus
US9534884B2 (en) 2012-01-03 2017-01-03 Ascentia Imaging, Inc. Coded localization systems, methods and apparatus
US11092662B2 (en) 2012-01-03 2021-08-17 Ascentia Imaging, Inc. Optical guidance systems and methods using mutually distinct signal-modifying sensors
US9739864B2 (en) 2012-01-03 2017-08-22 Ascentia Imaging, Inc. Optical guidance systems and methods using mutually distinct signal-modifying sensors
JP2013148427A (en) * 2012-01-18 2013-08-01 Canon Inc Method and apparatus for measuring wavefront inclination distribution
DE102014220583A1 (en) 2013-10-11 2015-04-16 Mitutoyo Corporation SYSTEM AND METHOD FOR CONTROLLING A TRACKING AUTOFOCUS (TAF) SENSOR IN A MACHINE VISION INSPECTION SYSTEM
CN104198164B (en) * 2014-09-19 2017-02-15 中国科学院光电技术研究所 Focus detection method based on principle of Hartmann wavefront detection
CN104198164A (en) * 2014-09-19 2014-12-10 中国科学院光电技术研究所 Focus detection method based on principle of Hartmann wavefront detection
US9740190B2 (en) 2014-10-09 2017-08-22 Mitutoyo Corporation Method for programming a three-dimensional workpiece scan path for a metrology system
DE102015219495A1 (en) 2014-10-09 2016-04-14 Mitutoyo Corporation A method of programming a three-dimensional workpiece scan path for a metrology system
US10126114B2 (en) 2015-05-21 2018-11-13 Ascentia Imaging, Inc. Angular localization system, associated repositionable mechanical structure, and associated method
CN107179605A (en) * 2017-07-04 2017-09-19 成都安的光电科技有限公司 Telescope focusing system and method
CN110987377A (en) * 2019-12-18 2020-04-10 中国空间技术研究院 Optical axis angle measuring method of space optical camera
CN114543695A (en) * 2022-02-08 2022-05-27 南京中安半导体设备有限责任公司 Hartmann measuring device and measuring method thereof and wafer geometric parameter measuring device

Also Published As

Publication number Publication date
JP2003503726A (en) 2003-01-28
KR20020025098A (en) 2002-04-03
JP4647867B2 (en) 2011-03-09
WO2001002822A1 (en) 2001-01-11
DE60001280D1 (en) 2003-02-27
MY128215A (en) 2007-01-31
EP1192433B1 (en) 2003-01-22
KR100685574B1 (en) 2007-02-22
AU5910000A (en) 2001-01-22
EP1192433A1 (en) 2002-04-03
DE60001280T2 (en) 2004-01-22
ATE231609T1 (en) 2003-02-15

Similar Documents

Publication Publication Date Title
US6184974B1 (en) Apparatus and method for evaluating a target larger than a measuring aperture of a sensor
US7728961B2 (en) Surface height and focus sensor
US8922764B2 (en) Defect inspection method and defect inspection apparatus
EP0866956B1 (en) Wavefront measuring system with integral geometric reference (igr)
US6172349B1 (en) Autofocusing apparatus and method for high resolution microscope system
US7375810B2 (en) Overlay error detection
US5563709A (en) Apparatus for measuring, thinning and flattening silicon structures
CN110546487B (en) Defect inspection apparatus and defect inspection method
US6819413B2 (en) Method and system for sensing and analyzing a wavefront of an optically transmissive system
JPH0117523B2 (en)
US6552806B1 (en) Automated minimization of optical path difference and reference mirror focus in white-light interference microscope objective
JPH0578761B2 (en)
WO1996012981A1 (en) Autofocusing apparatus and method for high resolution microscope system
US4453827A (en) Optical distortion analyzer system
KR101826127B1 (en) optical apparatus for inspecting pattern image of semiconductor wafer
JP3228458B2 (en) Optical three-dimensional measuring device
US20220357285A1 (en) Defect inspection apparatus and defect inspection method
JP2001166202A (en) Focus detection method and focus detector
JP5217327B2 (en) Angle measuring method and angle measuring device
JPH04264205A (en) Interferometer
CN108663124B (en) Detection device and method of wavefront sensor
JPH0321806A (en) Interferometer
RU2078305C1 (en) Interference method of test of geometric positioning of lenses and interference device for its implementation

Legal Events

Date Code Title Description
AS Assignment

Owner name: WAVEFRONT SCIENCES INC., NEW MEXICO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NEAL, DANIEL R.;RAMMAGE, RON R.;ARMSTRONG, DARRELL J.;AND OTHERS;REEL/FRAME:010185/0768;SIGNING DATES FROM 19990628 TO 19990706

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS ADMINISTRATIVE AGENT, NO

Free format text: INTELLECTUAL PROPERTY SECURITY AGREEMENT;ASSIGNOR:WAVEFRONT SCIENCES, INC.;REEL/FRAME:019501/0287

Effective date: 20070402

AS Assignment

Owner name: AMO WAVEFRONT SCIENCES, LLC, NEW MEXICO

Free format text: CHANGE OF NAME;ASSIGNOR:WAVEFRONT SCIENCES, INC.;REEL/FRAME:020309/0383

Effective date: 20080101

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: AMO WAVEFRONT SCIENCES, LLC; FORMERLY WAVEFRONT SC

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:BANK OF AMERICA, N.A. AS ADMINISTRATIVE AGENT;REEL/FRAME:022320/0536

Effective date: 20090225

FPAY Fee payment

Year of fee payment: 12

AS Assignment

Owner name: AMO DEVELOPMENT, LLC, CALIFORNIA

Free format text: MERGER;ASSIGNOR:AMO WAVEFRONT SCIENCES, LLC;REEL/FRAME:053810/0830

Effective date: 20191217