WO2009120718A1 - Methods, systems, and media for controlling depth of field in images - Google Patents

Methods, systems, and media for controlling depth of field in images Download PDF

Info

Publication number
WO2009120718A1
Authority
WO
WIPO (PCT)
Prior art keywords
translating
image detector
image
integration period
detector
Prior art date
Application number
PCT/US2009/038140
Other languages
French (fr)
Inventor
Shree K. Nayar
Hajime Nagahara
Sujit Kuthirummal
Changyin Zhou
Original Assignee
The Trustees Of Columbia University In The City Of New York
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Trustees Of Columbia University In The City Of New York
Publication of WO2009120718A1 publication Critical patent/WO2009120718A1/en

Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/958 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959 Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/743 Bracketing, i.e. taking a series of images with varying exposure conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00 Still video cameras

Abstract

Methods, systems, and media for controlling depth of field in images are provided. In some embodiments, an image detector is translated so that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector. An image is then captured at the image detector during the integration period.

Description

METHODS, SYSTEMS, AND MEDIA FOR CONTROLLING DEPTH OF FIELD IN IMAGES
Cross Reference to Related Applications
[0001] This application claims the benefit of United States Provisional Patent
Applications Nos. 61/038,807, filed March 24, 2008, and 61/052,400, filed May 12, 2008, which are hereby incorporated by reference herein in their entireties.
Statement Regarding Federally Funded Research
[0002] This invention was made with government support under Grant No. N00014-08-1-0329 awarded by the Office of Naval Research. The government has certain rights in the invention.
Technical Field
[0003] The disclosed subject matter relates to methods, systems, and media for controlling depth of field in images.
Background
[0004] The depth of field (DOF) of an image is the range of scene depths that appear focused in the image. In virtually all applications of imaging, ranging from consumer photography to optical microscopy, it is desirable to control the DOF. Of particular interest is the ability to capture scenes with very large DOFs. The DOF of an image can be increased by making the aperture of the camera smaller. However, this reduces the amount of light received by the detector, resulting in greater image noise (lower signal-to-noise ratio (SNR)). For a dark scene, the aperture of the lens must be opened up to maintain the SNR, which causes the DOF to be reduced. This trade-off gets worse with increases in spatial resolution (decreases in pixel size). As pixels get smaller, the DOF of an image decreases because the defocus blur occupies a greater number of pixels. At the same time, each pixel receives less light and hence the SNR falls as well. This trade-off between the DOF and the SNR is one of the fundamental, long-standing limitations of imaging.
[0005] Another limitation on how DOFs in images can be realized is due to the fact that many current cameras have DOFs that correspond to a single slab that is perpendicular to the optical axis of the lenses of the cameras. Having such a perpendicular slab limits how DOFs in images can be realized for images taken of dominant scene planes from non-perpendicular angles (e.g., such as an image of the surface of a desk when taken from a non-perpendicular angle). While some cameras have an image detector that is tilted with respect to the lens, in many modern cameras (such as in very thin cameras), such physical tilting is impracticable or impossible.
Summary
[0006] Methods, systems, and media for controlling depth of field in images are provided. In some embodiments, an image detector is translated so that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector. An image is then captured at the image detector during the integration period.
Brief Description of the Drawings
[0007] FIG. 1 is a diagram showing the physical translation of an image detector relative to a lens in accordance with some embodiments.
[0008] FIG. 2 is a diagram of a spinning refractive element being used to provide effective translation of an image detector in accordance with some embodiments.
[0009] FIG. 3 is a diagram of an example of the basic geometry of a camera in accordance with some embodiments.
[0010] FIG. 4 is a diagram of an example camera in accordance with some embodiments.
[0011] FIG. 5 is a table showing some lens focal lengths and corresponding scene depth ranges and required detector translations in accordance with some embodiments.
[0012] FIGS. 6(a) and 6(b) are diagrams of calculated integrated point spread functions in accordance with some embodiments.
[0013] FIG. 7 is a diagram of an integrated point spread function measured using a prototype camera in accordance with some embodiments.
[0014] FIGS. 8(a) and 8(b) are diagrams of a camera taking a picture of a scene having front, middle, and rear portions, and varying the points of focus of the camera during integration time in order to capture an image with discontinuous depths of field in accordance with some embodiments.
[0015] FIG. 9 is a diagram of a scene and camera in which the dominant scene plane is inclined relative to the plane of the lens of the camera in accordance with some embodiments.
[0016] FIG. 10 is a diagram of hardware that can be used to implement some embodiments.
Detailed Description
[0017] Methods, systems, and media for controlling depth of field in images are provided. In accordance with some embodiments, the position and/or orientation of an image detector in a camera can be physically and/or effectively varied prior to and/or during the integration time of the capture of an image. As a result, the focal plane can be swept through a volume of a scene being captured, causing all points within it to come into and go out of focus while the image detector collects photons. For example, as shown in FIG. 1, a scene 102 can be projected through a lens onto an image detector that is consecutively moving through positions 106, 108, 110, 112, and 114. At position 110, the scene is focused on the image detector and at other times the scene is defocused. This varying of the position and/or orientation of a camera's image detector can enable the DOF of the camera to be controlled in various ways.
[0018] For example, in some embodiments, an extended depth of field (EDOF) can be effected in an image by moving an image detector with a global shutter (all pixels are exposed simultaneously and for the same duration) at a uniform (or nearly uniform) speed during image integration. In doing so, each scene point can be captured by the image detector under a continuous range of focus settings, including perfect focus. The captured image can then be deconvolved with a single, known blur kernel to recover an image with significantly greater DOF.
[0019] As another example, in some embodiments, a discontinuous depth of field can be effected in an image by moving a camera's global-shutter image detector non-uniformly. In doing so, images that are focused for certain specified scene depths, but defocused for other (e.g., in-between) scene regions, can be captured. As an illustration, consider a scene that includes a person in the foreground, a landscape in the background, and a dirty window in between the two. By focusing a camera's image detector on the nearby person for some duration of a camera's integration time and the far-away landscape for the rest of the integration time, an image in which both appear fairly well-focused can be produced, while the dirty window is blurred out and hence optically erased.
[0020] As yet another example, in some embodiments, a tilted depth of field can be effected in an image by uniformly translating an image detector with a rolling electronic shutter in which different rows or columns of the image detector are exposed at different time intervals. In this way, a tilted image detector can be emulated without the need to physically tilt the image detector with respect to the lens. As a result, an image can be captured with a tilted focal plane. Furthermore, by translating the image detector non-uniformly (varying speed), a non-planar image detector can be emulated. As a result, an image can be captured in which focus is maintained across a curved surface in the scene.
[0021] In some embodiments, the focal plane of a camera can be swept through a large range of scene depths with a very small physical translation of the image detector. For instance, with a 12.5 mm focal length lens, the focal plane can be swept from a distance of 450 mm from the lens to infinity by physically translating the detector 360 microns. Because an image detector only weighs a few milligrams, a variety of micro-actuators (e.g., solenoids, piezoelectric stacks, ultrasonic transducers, DC motors, etc.) can be used to move an image detector over the required distance within a very short integration time (e.g., less than a millisecond if required). Examples of suitable micro-actuators are already used in many consumer cameras for focus control, aperture control, and lens stabilization.
[0022] While physical translation of a camera's image detector can be used to control DOF, additionally or alternatively to physically translating a camera's image detector, manipulating the focus setting on a camera can be used to provide the same effect as physically translating the camera's image detector in some embodiments. More particularly, when the focus setting is changed, the distance between the image detector and the focal plane of the camera is also changed. Therefore, by changing the focus setting during image integration, translation of the detector along the optical axis can be emulated. In some embodiments, this change in focus setting can be achieved by controlling the electronics already present in most cameras and/or lenses to realize auto-focus. For example, the motors that enable the lens to change focus setting during auto-focusing can be programmed so that the lens sweeps the focal plane through the scene during the integration time of a photograph.
[0023] In some embodiments, in order to emulate translating the detector with uniform speed, the focal plane has to be swept through the scene at non-uniform speed due to the non-linearity of the thin lens law shown in equation 1 below.
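As a rough illustration of this non-linearity (not part of the original disclosure), the following Python sketch computes the in-focus scene depth u(t) = f*p(t)/(p(t) - f), which follows from the thin lens law of equation 1, for a detector-side distance p(t) that changes at uniform speed; the parameter values are assumptions chosen to match the 12.5 mm lens example above. Equal time steps produce very unequal steps in scene depth:

```python
# Minimal sketch, assuming a thin lens (equation 1): a uniform-speed sweep of
# the detector-side distance p(t) corresponds to a very non-uniform sweep of
# the in-focus scene depth u(t). All parameter values are illustrative.

def in_focus_depth(f_mm: float, p_mm: float) -> float:
    """Scene depth u (mm) in perfect focus when the focal plane sits at p (mm)."""
    return f_mm * p_mm / (p_mm - f_mm)

f = 12.5                      # focal length (mm)
p0, s = 12.86, -0.001         # start position (mm), speed (mm per msec)
for t in (0.0, 90.0, 180.0, 270.0, 350.0):   # msec; p -> f sends u -> infinity
    p = p0 + s * t
    print(f"t = {t:5.1f} ms   p = {p:.3f} mm   in-focus depth = {in_focus_depth(f, p):8.1f} mm")
```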
[0024] Additionally or alternatively to physically translating a camera's image detector and/or manipulating the focus setting on a camera, in some embodiments, a spinning refractive element of non-uniform thickness positioned between an imaging lens and an image detector of a camera can be used to provide the same effect as physically translating the camera's image detector. For example, as illustrated in FIG. 2, a refractive element 202 can be rotated about an axis 204 parallel to the lens' optical axis 206 - making several complete rotations within the integration time of a photograph. By completing a large number of rotations during a single frame's exposure, the rotation of the element does not need to be synchronized with the beginning and end of a frame's integration time, as errors arising from any asynchrony would likely be negligible. Additionally or alternatively to performing a large number of rotations, the refractive element can be synchronized with the integration timing using any suitable technique, such as by detecting markers on the refractive element using a suitable optical detector.
[0025] In order to appreciate how the refractive element can achieve the same effect as physically translating an image detector, assume that P 208 is a scene point whose image is formed by lens 210 at point p 212 as shown in FIG. 2. If element 202 is close to an image detector 216, rays originating from P 208 after refracting through lens 210 will be incident on a small region of element 202. The element in that small region can be approximated to be a parallel refractive slab, where the thickness of the slab varies as element 202 rotates. Refraction through the slab causes the rays to be offset and they (approximately) converge at point q 214, which is different from the original image point p 212. The location of q 214 depends on the thickness of the slab at a corresponding moment in time. In this way, spinning a refractive element with smoothly varying thickness can result in an image sweeping every scene point through a continuous range of distances along the optical axis. Thus, while the image detector is not physically translated, the effect is the same as physically translating the detector along the optical axis.
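A minimal sketch of the slab effect, assuming the standard paraxial result that a parallel slab of thickness t and refractive index n shifts the image axially by approximately t(1 - 1/n); this formula and the sample values are not given in the disclosure and are included only for illustration:

```python
# Paraxial axial image shift from a parallel refractive slab: delta = t*(1 - 1/n).
# As the element spins and the local thickness t varies, the image point q
# sweeps along the optical axis. Values here are illustrative assumptions.

def axial_shift_mm(t_mm: float, n: float) -> float:
    return t_mm * (1.0 - 1.0 / n)

n_glass = 1.5
for t in (0.5, 1.0, 1.5, 2.0):   # local slab thickness (mm)
    print(f"thickness {t:.1f} mm -> image shifted {axial_shift_mm(t, n_glass) * 1000:.0f} microns")
```

With n = 1.5, a 1 mm change in thickness moves the image by roughly 333 microns, comparable in scale to the 360 micron physical sweep discussed above.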
[0026] An advantage of using a spinning refractive element rather than physically translating an image detector is that the refractive element can be kept spinning in the same manner across multiple photographs (or frames of video), whereas physical movement of an image detector requires the detector to move alternately toward and away from the lens. When physically moving an image detector in such a case, some integration time is lost between photographs or frames due to the fact that the detector needs to slow down, come to a stop, and then accelerate in the opposite direction to reach the desired speed.
[0027] Additionally or alternatively to physically translating a camera's image detector, manipulating the focus setting on a camera, and/or using a spinning refractive element as described above, in some embodiments, effective translation of an image detector can be achieved by capturing multiple images of a scene at different focus settings relative to the image detector and then calculating a weighted average image from the multiple captured images. The different focus settings of the scene relative to the image detector can be realized by physically translating a camera's image detector, manipulating the focus setting on a camera, using a spinning refractive element as described above, and/or using any other suitable technique. The weights can be chosen to mimic changing the distance between the lens and the image detector at constant speed.
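A minimal sketch of such a weighted average, assuming captures indexed by their (physical or emulated) detector positions; the weighting rule (midpoint-rule interval widths) and all names are illustrative assumptions rather than the disclosed method:

```python
# Fuse a focus stack into one "swept focus" image by weighted averaging.
# To mimic a detector moving at constant speed, each capture stands in for the
# interval of detector positions nearest to it, so its weight is proportional
# to that interval's width (with uniform spacing, interior captures get equal
# weight and the two endpoint captures get half weight).
import numpy as np

def sweep_weights(p_positions):
    p = np.asarray(sorted(p_positions), dtype=float)
    edges = np.concatenate(([p[0]], (p[:-1] + p[1:]) / 2.0, [p[-1]]))
    w = np.diff(edges)
    return w / w.sum()

def fuse_stack(images, p_positions):
    w = sweep_weights(p_positions)
    return np.tensordot(w, np.asarray(images, dtype=float), axes=1)

stack = [np.random.rand(4, 4) for _ in range(5)]     # stand-in images
positions = [12.50, 12.59, 12.68, 12.77, 12.86]      # detector positions (mm)
print(sweep_weights(positions))                      # [0.125 0.25 0.25 0.25 0.125]
print(fuse_stack(stack, positions).shape)            # (4, 4)
```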
[0028] For the sake of simplicity and clarity, throughout this disclosure references to the translation of an image detector should be understood to include physical translation of the image detector and effective translation of the image detector (such as by manipulating the focus on the camera, using a refractive element, and/or calculating a weighted average of a set of images captured at different focus settings as described above) unless otherwise indicated. Similarly, characteristics of the translation, such as speed of translation, should be understood to include corresponding characteristics as applicable to the use of focus control, refractive elements, and the averaging of weighted images, such as the rate of change of focus.
[0029] Turning to FIG. 3, an example of the basic geometry 300 of a camera in accordance with some embodiments is illustrated. As shown, in this geometry, there is a scene point M 302, a camera aperture 304, a lens 306, a focal plane 308, and a translated image detector plane 310. Focal plane 308 is at a distance v 312 from lens 306, lens 306 has a focal length f, and camera aperture 304 has a diameter a 314. Scene point M 302 is imaged in perfect focus at m 316 if its distance u 318 from lens 306 satisfies the Gaussian lens law:
$$\frac{1}{f} = \frac{1}{u} + \frac{1}{v}. \qquad \text{(equation 1)}$$

As shown in FIG. 3, if an image detector plane 310 is shifted to a distance p 320 from the lens, scene point M 302 is imaged as a blurred circle (the circle of confusion) centered around m' 322. The diameter b 324 of this circle is given by

$$b = \frac{a}{v}\,\lvert v - p \rvert. \qquad \text{(equation 2)}$$
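The following short sketch evaluates equations 1 and 2 numerically; the function names and sample values are illustrative assumptions:

```python
# Equations 1 and 2: in-focus detector distance v from the Gaussian lens law,
# and blur-circle diameter b when the detector is displaced to p.

def v_in_focus(f_mm: float, u_mm: float) -> float:
    """Detector distance v satisfying 1/f = 1/u + 1/v (equation 1)."""
    return f_mm * u_mm / (u_mm - f_mm)

def blur_diameter(f_mm: float, u_mm: float, p_mm: float, a_mm: float) -> float:
    """Equation 2: b = (a / v) * |v - p|."""
    v = v_in_focus(f_mm, u_mm)
    return a_mm / v * abs(v - p_mm)

f = 12.5
a = 12.5 / 1.4                  # 12.5 mm lens at f/1.4 -> aperture of ~8.9 mm
b = blur_diameter(f, u_mm=550.0, p_mm=12.70, a_mm=a)
print(f"blur circle diameter: {b * 1000:.0f} microns")   # ~63 microns
```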
[0030] The distribution of light energy within the blur circle is referred to as the point spread function (PSF). The PSF can be denoted as P(r, u, p), where r is the distance of an image point from the center m' 322 of the blur circle. An idealized model for characterizing the PSF is the pillbox function:

$$P(r, u, p) = \frac{4}{\pi b^2}\,\Pi\!\left(\frac{r}{b}\right), \qquad \text{(equation 3)}$$

where $\Pi(x)$ is the rectangle function, which has a value 1 if $|x| < 1/2$ and 0 otherwise. In the presence of optical aberrations, the PSF deviates from the pillbox function and is then often modeled as a Gaussian function:

$$P(r, u, p) = \frac{2}{\pi g^2 b^2}\exp\!\left(-\frac{2 r^2}{g^2 b^2}\right), \qquad \text{(equation 4)}$$

where g is a constant (e.g., 1).
[0031] We now analyze the effect of translating the detector during an image's integration time. For simplicity, consider the case where the detector is translated along the optical axis, as in FIG. 1. Let p(t) denote the detector's distance from the lens as a function of time. Then the aggregate PSF for a scene point at a distance u from the lens, referred to as the integrated PSF (IPSF), is given by

$$IP(r, u) = \int_0^T P(r, u, p(t))\,dt, \qquad \text{(equation 5)}$$

where T is the total integration time. By programming the detector motion p(t) - its starting position, speed, and acceleration - we can change the properties of the resulting IPSF. This corresponds to sweeping the focal plane through the scene in different ways. The above analysis only considers the translation of the detector along the optical axis. In some embodiments, however, both the position and orientation of an image detector can be varied during image integration, and this analysis can be extended to more general detector motions.
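A minimal numerical sketch of equation 5 for the constant-speed case, using the pillbox PSF of equation 3; the discretization, function names, and parameter values are assumptions for illustration:

```python
# Numerically integrate the pillbox PSF over a constant-speed detector sweep
# p(t) = p0 + s*t to obtain the IPSF of equation 5 at radius r for depth u.
import numpy as np

def ipsf_pillbox(r, u, f, a, p0, s, T, steps=4000):
    t = np.linspace(0.0, T, steps)
    p = p0 + s * t                       # detector position p(t)
    v = f * u / (u - f)                  # in-focus distance for scene depth u
    b = a / v * np.abs(v - p)            # blur-circle diameter b(t), equation 2
    pill = np.zeros_like(b)
    inside = b >= 2.0 * r                # point at radius r lies inside the blur circle
    pill[inside] = 4.0 / (np.pi * b[inside] ** 2)
    return pill.mean() * T               # Riemann-sum approximation of the integral

f, a = 12.5, 12.5 / 1.4                  # mm
p0, s, T = 12.5, 1.0, 0.360              # mm, mm/sec, sec: a ~360 micron sweep
for u in (550.0, 1000.0, 1500.0):        # depths inside the swept range (mm)
    print(u, round(ipsf_pillbox(0.005, u, f, a, p0, s, T), 2))
```

The printed values come out nearly equal across depths, previewing the depth invariance discussed below.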
[0032] FIG. 4 shows an example of a camera 400 in accordance with some embodiments. As illustrated, camera 400 can include a lens 402, an image detector 404, and a micro-actuator 406. Lens 402 can be any suitable lens in some embodiments. For example, lens 402 can be a lens having f = 12.5 mm and f/# = 1.4. Image detector 404 can be any suitable image detector in some embodiments. For example, detector 404 can be a 1/3" SONY CCD with 1024x768 pixels and having a global shutter that can be used to implement extended DOF and discontinuous DOF. As another example, detector 404 can be a 1/2.5" Micron CMOS detector with 2592x1944 pixels and a rolling shutter that can be used to implement tilted and curved DOFs. Micro-actuator 406 can be any suitable micro-actuator in some embodiments. For example, micro-actuator 406 can be a PHYSIK INSTRUMENTE M-111.1DG translation stage. As also shown, detector 404 can be mounted to micro-actuator 406 to enable translation of the detector in a translation direction 408 aligned with the optical axis of lens 402. In some embodiments, micro-actuator 406 can include a DC motor actuator that can translate detector 404 through a 15 mm range at a top speed of 2.7 mm/sec and can position it with an accuracy of 0.05 microns.
[0033] FIG. 5 shows a table 500 illustrating examples of image detector translations 506 required to sweep a focal plane through various depth ranges 504 using lenses with two different focal lengths 502. As can be seen, the detector can sweep very large depth ranges when moved by very small distances. Using commercially available micro-actuators (such as that illustrated above), such translations can be achieved within typical image integration times (a few milliseconds to a few seconds).
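A short sketch of the arithmetic behind table 500 (the parameter values are assumed; the resulting ~357 micron figure matches the ~360 microns quoted in paragraph [0021]):

```python
# Detector travel needed to sweep focus between two scene depths, using the
# in-focus detector position p = f*u / (u - f) from equation 1.

def detector_position(f_mm: float, u_mm: float) -> float:
    return f_mm * u_mm / (u_mm - f_mm)

f = 12.5
travel = detector_position(f, 450.0) - detector_position(f, 2000.0)
print(f"450 mm .. 2000 mm: {travel * 1000:.0f} microns of travel")          # ~279
# For a sweep to infinity, p(inf) = f, so the travel is p(450) - f:
print(f"450 mm .. infinity: {(detector_position(f, 450.0) - f) * 1000:.0f} microns")  # ~357
```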
[0034] Consider a detector translating along the optical axis with constant speed s, i.e., p(t) = p(0) + st. If it is assumed that the PSF of the lens can be modeled using the pillbox function in equation 3, the IPSF in equation 5 simplifies to:

$$IP(r, u) = \frac{2v}{\pi a s}\left(\frac{\lambda_0 + \lambda_T}{r} - \frac{2\lambda_0}{b(0)} - \frac{2\lambda_T}{b(T)}\right), \qquad \text{(equation 6)}$$

where b(t) is the blur circle diameter at time t, and $\lambda_t = 1$ if $b(t) \geq 2r$ and 0 otherwise. On the other hand, if we use the Gaussian function in equation 4 for the lens PSF, we get

$$IP(r, u) = \frac{v}{\sqrt{2\pi}\, g\, a\, s\, r}\left(\operatorname{erfc}\!\left(\frac{\sqrt{2}\, r}{g\, b(0)}\right) + \operatorname{erfc}\!\left(\frac{\sqrt{2}\, r}{g\, b(T)}\right)\right). \qquad \text{(equation 7)}$$
[0035] FIGS. 6(a) and 6(b) show examples of IPSFs for five scene points from 450 to 2000 mm of an EDOF camera (with a lens having f = 12.5 mm and f/# = 1.4, p(0) = 12.5 mm, s = 1 mm/sec, and T = 360 msec) computed using equations 6 and 7, respectively. As can be seen, the IPSFs of the EDOF camera derived using both the pillbox (equation 6) and Gaussian (equation 7) PSF models look almost identical for all five scene depths (i.e., all five curves substantially overlap in the figures and hence almost look like a single curve), and, thus, the IPSFs are depth invariant.
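This near-invariance can also be checked from the closed form. The following minimal sketch evaluates equation 7 (as reconstructed above) at a fixed radius for several depths inside the swept range; the parameters follow the text's example, while the radius and depth samples are assumptions:

```python
# Evaluate the Gaussian-PSF IPSF of equation 7 for several scene depths; the
# values come out nearly equal, illustrating the depth invariance of the IPSF.
import math

def ipsf_gaussian(r, u, f, a, p0, s, T, g=1.0):
    v = f * u / (u - f)
    b0 = a / v * abs(v - p0)             # blur diameter at t = 0
    bT = a / v * abs(v - (p0 + s * T))   # blur diameter at t = T
    pref = v / (math.sqrt(2.0 * math.pi) * g * a * s * r)
    return pref * (math.erfc(math.sqrt(2.0) * r / (g * b0))
                   + math.erfc(math.sqrt(2.0) * r / (g * bT)))

f, a, p0, s, T = 12.5, 12.5 / 1.4, 12.5, 1.0, 0.360   # mm, mm, mm, mm/sec, sec
for u in (550.0, 1000.0, 1500.0):
    print(u, round(ipsf_gaussian(0.005, u, f, a, p0, s, T), 1))
```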
[0036] FIG. 7 is an example of an EDOF camera's IPSF measured for a 550 mm scene depth. In this example, the camera has f = 12.5 mm, f/# = 1.4, T = 360 msec, p(0) = 12.5 mm, and s = 1 mm/sec, which corresponds to sweeping the focal plane from a distance of 450 mm to 2000 mm.
[0037] Using the IPSF of FIG. 7 (or any other suitable IPSF), a captured image can be deconvolved to produce an image with an extended DOF. To make the deconvolution more robust and produce visually appealing results, sparse derivative priors, for example the one described in Olshausen, B.A., Field, D.J.: "Emergence of simple-cell receptive field properties by learning a sparse code for natural images," Nature (1996), pp. 607-609, which is hereby incorporated by reference herein in its entirety, can be used in some embodiments.
[0038] As mentioned above, in some embodiments, a discontinuous depth of field can be effected in an image by moving a camera's global-shutter image detector non-uniformly. For example, turning back to FIG. 4, micro-actuator 406 can be controlled to position image detector 404 at one position along translation path 408, stay there for some portion of a camera's integration time, then move to another position along translation path 408 and stay there for the remainder of the integration time. Any suitable combinations of movement can be used. For example, rather than stopping at two positions, in some embodiments, the micro-actuator can stop at any suitable number of positions.
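A minimal sketch of a two-dwell motion profile of the kind described in [0038]; the dwell fractions, positions, and the linear ramp between dwells are assumptions, not the disclosed control law:

```python
# Two-dwell detector trajectory p(t) for a discontinuous DOF: hold at the
# near-focus position, step quickly, hold at the far-focus position.

def dwell_trajectory(t, T, p_near, p_far, near_frac=0.45, step_frac=0.10):
    t1 = near_frac * T                   # end of the first dwell
    t2 = t1 + step_frac * T              # end of the rapid move
    if t <= t1:
        return p_near
    if t <= t2:
        return p_near + (p_far - p_near) * (t - t1) / (t2 - t1)
    return p_far

T = 0.360                                # integration time (sec)
for t in (0.0, 0.15, 0.17, 0.19, 0.30):
    print(f"t = {t:.2f} s   p = {dwell_trajectory(t, T, 12.86, 12.55):.3f} mm")
```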
[0039] As an example, consider scene 802 illustrated from above in FIG. 8(a). As can be seen, camera 804 observes this scene so that a star 806 and an arrow 808 are present in front of a scenic backdrop 810 with a wire mesh 812 in between. A normal camera with a small DOF may be able to capture either the star and arrow or the backdrop in focus, while eliminating the mesh via defocusing. However, because the normal camera's DOF is a single continuous volume, it cannot capture all of the star, arrow, and backdrop in focus and at the same time eliminate the mesh. In accordance with some embodiments, if a large aperture is used and the motion of a camera's image detector is controlled such that it first focuses on the star and arrow for a part of the integration time (as represented by period 814 in FIG. 8(b)), and then moves quickly to another location during period 816 in FIG. 8(b) to focus on the backdrop for the remaining portion of the integration time (as represented by period 818 in FIG. 8(b)), an image with all of the star, arrow, and backdrop in focus, and the mesh eliminated, can be obtained. While this image may include some blurring, it can capture the high frequencies in two disconnected DOFs - the foreground and the background - but almost completely eliminates the wire mesh in between. In some embodiments, this can be achieved without any post-processing. As mentioned above, in some embodiments, this approach is not limited to two disconnected DOFs; by pausing the detector at several locations during image integration, more complex DOFs can be realized.
[0040] As also mentioned above, in some embodiments, a tilted depth of field can be effected in an image by uniformly translating an image detector with a rolling electronic shutter, without the need to physically tilt the image detector with respect to the lens. When such a detector is translated with uniform speed s during the frame read-out time T of an image, a tilted image detector can be emulated. If this tilted detector makes an angle θ with the lens plane, then the focal plane in the scene makes an angle φ with the lens plane, where θ and φ are related by the well-known Scheimpflug condition:

$$\theta = \tan^{-1}\!\left(\frac{sT}{H}\right) \quad \text{and} \quad \varphi = \tan^{-1}\!\left(\frac{u}{v}\tan\theta\right), \qquad \text{(equation 8)}$$

where H is the height of the detector. Therefore, by controlling the speed s of the detector, the emulated tilt angle of the image detector can be controlled, and hence the tilt of the focal plane and its associated DOF.
[0041] FIG. 9 shows an example of a scene where the dominant scene plane - a table top 902 with a cup 904 and a block 906 - is inclined at an angle of 53 degrees with respect to a lens plane 908 of a camera 910. Typically, as a result, a normal camera is unable to focus on the entire plane. In accordance with some embodiments, by translating a rolling-shutter detector (e.g., a 1/2.5" Micron CMOS sensor with a 70 msec exposure lag between the first and last rows of pixels in the sensor) at 2.7 mm/sec, a detector tilt of 2.6 degrees can be emulated and a desired DOF tilt of 53 degrees can be realized based on equation 8. This results in the camera being able to capture the table top in focus. In some embodiments, by translating an image detector with varying speed, non-planar detectors that can focus on curved scene surfaces can additionally or alternatively be emulated.
[0042] In some embodiments, a camera can include a series of distance measuring mechanisms (e.g., laser range finders) that can be used to determine the dominant scene plane or curved surface so that an appropriate translation of the detector can be emulated. In order to measure complex curved surfaces, any suitable number of detectors, or one or more sweeping detectors, can be used.
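A short numerical sketch of equation 8; the sensor height and the object/image distances below are assumptions chosen to reproduce the 2.6 degree / 53 degree figures above:

```python
# Emulated detector tilt from a rolling-shutter sweep, and the resulting tilt
# of the focal plane (DOF) in the scene, per equation 8.
import math

def detector_tilt_deg(s_mm_per_s: float, T_s: float, H_mm: float) -> float:
    return math.degrees(math.atan(s_mm_per_s * T_s / H_mm))

def dof_tilt_deg(theta_deg: float, u_mm: float, v_mm: float) -> float:
    return math.degrees(math.atan((u_mm / v_mm) * math.tan(math.radians(theta_deg))))

theta = detector_tilt_deg(2.7, 0.070, 4.29)   # 2.7 mm/sec, 70 msec lag, ~4.3 mm sensor
print(f"emulated detector tilt: {theta:.1f} degrees")                      # ~2.5
print(f"focal plane (DOF) tilt: {dof_tilt_deg(theta, 390.0, 12.9):.0f} degrees")  # ~53
```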
[0043] FIG. 10 illustrates an example of hardware 1000 that can be used in some embodiments. As shown, hardware 1000 can include an image detector/shutter 1002, an analog to digital (A/D) converter 1004, a digital signal processor (DSP) 1006, a controller 1008, camera buttons/interface 1010, a digital to analog (D/A) converter 1012, a micro-actuator 1014, an auto-focus mechanism 1016, distance detector(s) 1018, and a memory/interface 1020. Image detector/shutter 1002 can be any suitable image detector, such as a 1/3" SONY CCD with 1024x768 pixels and having a global shutter, or a 1/2.5" Micron CMOS detector with 2592x1944 pixels and a rolling shutter. A/D converter 1004 can be any suitable mechanism for interfacing image detector/shutter 1002 to DSP 1006, such as an analog to digital converter. DSP 1006 can be any suitable device for processing (e.g., deconvolving) images received from image detector 1002, such as a digital signal processor, a microprocessor, a computer, a central processing unit, a programmable logic device, dedicated circuitry, etc. Controller 1008 can be any suitable device for controlling the operation of the remainder of hardware 1000 (e.g., controlling the movement of micro-actuator 1014, the operation of the auto-focus mechanism 1016, the spinning of a refractive element (not shown), the capturing of images by detector/shutter 1002, etc.), such as a digital signal processor, a microprocessor, a computer, a central processing unit, a programmable logic device, dedicated circuitry, etc. In some embodiments, DSP 1006 and controller 1008 can be combined or further broken down into sub-processors/controllers. Camera buttons/interface 1010 can be any suitable buttons or interface for receiving control input from users or remote devices. D/A converter 1012 can be any suitable mechanism for interfacing controller 1008 to micro-actuator 1014, such as a digital to analog converter. Micro-actuator 1014 can be any suitable device for physically translating image detector/shutter 1002, such as a PHYSIK INSTRUMENTE M-111.1DG translation stage. Auto-focus mechanism 1016 can be any suitable hardware and/or software for controlling the operation of the focus in a camera for any suitable purpose, for example, to provide effective translation of the image detector. Distance detector(s) 1018 can be any suitable mechanism for determining the distances to multiple points on a scene so that an angle of a scene plane or curve can be determined by controller 1008. For example, detector(s) 1018 can be laser range finders. Memory/interface 1020 can be any suitable mechanism for storing images after processing (if any) by DSP 1006. For example, memory/interface 1020 can be non-volatile memory, a disk drive, an interface to an external device (such as a thumb drive, memory stick, a network server, or other storage or target devices for image transfers), a display (e.g., a display on a camera, computer, telephone, etc.), etc.
[0044] In some embodiments, hardware 1000 can be implemented in any suitable device for capturing images and/or video, such as a portable camera, a video camera, a computer camera, a mobile telephone, a closed-circuit television camera, a security camera, an Internet Protocol camera, etc.
[0045] Although the invention has been described and illustrated in the foregoing illustrative embodiments, it is understood that the present disclosure has been made only by way of example, and that numerous changes in the details of implementation of the invention can be made without departing from the spirit and scope of the invention, which is only limited by the claims which follow. Features of the disclosed embodiments can be combined and rearranged in various ways.

Claims

What is claimed is:
1. A method for controlling depth of field in images, comprising: translating an image detector so that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector; and capturing the image at the image detector during the integration period.
2. The method of claim 1, wherein the translating is accomplished by physically translating the image detector.
3. The method of claim 1, wherein the translating is synchronized with the operation of a rolling shutter.
4. The method of claim 3, wherein the translating occurs at a varying rate.
5. The method of claim 1, further comprising deconvolving the captured image using an integrated point spread function.
6. The method of claim 5, wherein the deconvolving includes using a sparse derivative prior.
7. The method of claim 1, wherein the translating is accomplished by adjusting the focus of a lens.
8. The method of claim 1, wherein the translating is accomplished by spinning a refractive element between a lens and the image detector.
9. The method of claim 1, wherein the at least a portion of the integration period is the entire integration period and wherein the translating occurs at a constant rate.
10. The method of claim 1, wherein the translating stops during at least two periods of the integration period.
11. A system for controlling depth of field in images, comprising:
at least one processor that: causes an image detector to be translated so that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector; and causes the image at the image detector to be captured during the integration period.
12. The system of claim 11, wherein the translating is accomplished by physically translating the image detector.
13. The system of claim 11, wherein the translating is synchronized with the operation of a rolling shutter.
14. The system of claim 13, wherein the translating occurs at a varying rate.
15. The system of claim 11, wherein the processor also deconvolves the captured image using an integrated point spread function.
16. The system of claim 15, wherein the deconvolving includes using a sparse derivative prior.
17. The system of claim 11, wherein the translating is accomplished by adjusting the focus of a lens.
18. The system of claim 11, wherein the translating is accomplished by spinning a refractive element between a lens and the image detector.
19. The system of claim 11, wherein the at least a portion of the integration period is the entire integration period and wherein the translating occurs at a constant rate.
20. The system of claim 11, wherein the translating stops during at least two periods of the integration period.
21. A computer-readable medium containing computer-executable instructions that, when executed by a processor, cause the processor to perform a method for controlling depth of field in images, the method comprising: translating an image detector so that an image incident on the image detector changes focus during at least a portion of an integration period of the image detector; and capturing the image at the image detector during the integration period.
22. The medium of claim 21, wherein the translating is accomplished by physically translating the image detector.
23. The medium of claim 21, wherein the translating is synchronized with the operation of a rolling shutter.
24. The medium of claim 23, wherein the translating occurs at a varying rate.
25. The medium of claim 21, wherein the method further comprises deconvolving the captured image using an integrated point spread function.
26. The medium of claim 25, wherein the deconvolving includes using a sparse derivative prior.
27. The medium of claim 21, wherein the translating is accomplished by adjusting the focus of a lens.
28. The medium of claim 21, wherein the translating is accomplished by spinning a refractive element between a lens and the image detector.
29. The medium of claim 21, wherein the at least a portion of the integration period is the entire integration period and wherein the translating occurs at a constant rate.
30. The medium of claim 21, wherein the translating stops during at least two periods of the integration period.
PCT/US2009/038140 2008-03-24 2009-03-24 Methods, systems, and media for controlling depth of field in images WO2009120718A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US3880708P 2008-03-24 2008-03-24
US61/038,807 2008-03-24
US5240008P 2008-05-12 2008-05-12
US61/052,400 2008-05-12

Publications (1)

Publication Number Publication Date
WO2009120718A1 (en)

Family

ID=41114312

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/038140 WO2009120718A1 (en) 2008-03-24 2009-03-24 Methods, systems, and media for controlling depth of field in images

Country Status (1)

Country Link
WO (1) WO2009120718A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6445415B1 (en) * 1996-01-09 2002-09-03 Kjell Olsson Increased depth of field for photography
US6201899B1 (en) * 1998-10-09 2001-03-13 Sarnoff Corporation Method and apparatus for extended depth of field imaging
US6873446B2 (en) * 2000-11-29 2005-03-29 Geoffrey Donald Owen Refractive optical deflector
US7336430B2 (en) * 2004-09-03 2008-02-26 Micron Technology, Inc. Extended depth of field using a multi-focal length lens with a controlled range of spherical aberration and a centrally obscured aperture
US20060291844A1 (en) * 2005-06-24 2006-12-28 Nokia Corporation Adaptive optical plane formation with rolling shutter

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAPPEN M.F. ET AL.: "Exploiting the Sparse Derivative Prior for Super-Resolution and Image Demosaicing", THIRD INTERNATIONAL WORKSHOP ON STATISTICAL AND COMPUTATIONAL THEORIES OF VISION AT ICCV 2003, 2003, Retrieved from the Internet <URL:http://www.stat.ucla.edu/~yuille/meetings/2003_workshop.php> *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010131142A1 (en) * 2009-05-12 2010-11-18 Koninklijke Philips Electronics N.V. Camera, system comprising a camera, method of operating a camera and method for deconvoluting a recorded image
CN102422629B (en) * 2009-05-12 2015-04-29 皇家飞利浦电子股份有限公司 Camera, system comprising a camera, method of operating a camera and method for deconvoluting a recorded image
US8605202B2 (en) 2009-05-12 2013-12-10 Koninklijke Philips N.V. Motion of image sensor, lens and/or focal length to reduce motion blur
CN102422629A (en) * 2009-05-12 2012-04-18 皇家飞利浦电子股份有限公司 Camera, system comprising a camera, method of operating a camera and method for deconvoluting a recorded image
EP2511747A1 (en) * 2009-12-07 2012-10-17 Panasonic Corporation Imaging device and imaging method
EP2511747A4 (en) * 2009-12-07 2014-11-05 Panasonic Corp Imaging device and imaging method
US20110292275A1 (en) * 2009-12-07 2011-12-01 Takashi Kawamura Imaging apparatus and method of controlling the same
US8576326B2 (en) * 2009-12-07 2013-11-05 Panasonic Corporation Imaging apparatus and method of controlling the image depth of field
EP2390720A3 (en) * 2010-05-27 2012-03-07 Samsung Electro-Mechanics Co., Ltd Camera module
US8754975B2 (en) 2010-12-14 2014-06-17 Axis Ab Method and digital video camera for improving the image quality of images in a video image stream
CN102804751A (en) * 2011-01-31 2012-11-28 松下电器产业株式会社 Image restoration device, imaging device, and image restoration method
EP2672696A4 (en) * 2011-01-31 2015-07-08 Panasonic Corp Image restoration device, imaging device, and image restoration method
CN102804751B (en) * 2011-01-31 2016-08-03 松下电器产业株式会社 Image recovery device, camera head and image recovery method
WO2013162747A1 (en) * 2012-04-26 2013-10-31 The Trustees Of Columbia University In The City Of New York Systems, methods, and media for providing interactive refocusing in images
US10582120B2 (en) 2012-04-26 2020-03-03 The Trustees Of Columbia University In The City Of New York Systems, methods, and media for providing interactive refocusing in images
US10178321B2 (en) 2013-11-27 2019-01-08 Mitutoyo Corporation Machine vision inspection system and method for obtaining an image with an extended depth of field
WO2017144503A1 (en) * 2016-02-22 2017-08-31 Koninklijke Philips N.V. Apparatus for generating a synthetic 2d image with an enhanced depth of field of an object
US10623627B2 (en) 2016-02-22 2020-04-14 Koninklijke Philips N.V. System for generating a synthetic 2D image with an enhanced depth of field of a biological sample
DE102017220101A1 (en) 2016-11-23 2018-05-24 Mitutoyo Corporation Inspection system using machine vision to obtain an image with extended depth of field

Similar Documents

Publication Publication Date Title
WO2009120718A1 (en) Methods, systems, and media for controlling depth of field in images
Nagahara et al. Flexible depth of field photography
WO2017199556A1 (en) Stereo camera and stereo camera control method
CA2639527C (en) Security camera system and method of steering beams to alter a field of view
US7215882B2 (en) High-speed automatic focusing system
EP1466210B1 (en) Digital camera with viewfinder designed for improved depth of field photographing
TW201126453A (en) Autofocus with confidence measure
US7907205B2 (en) Optical apparatus with unit for correcting blur of captured image caused by displacement of optical apparatus in optical-axis direction
JP4874668B2 (en) Autofocus unit and camera
WO2006050430A2 (en) Optical tracking system using variable focal length lens
JP5938281B2 (en) Imaging apparatus, control method therefor, and program
EP1896891A1 (en) Adaptive optical plane formation with rolling shutter
JP2007228005A (en) Digital camera
CN109564376A (en) Time-multiplexed programmable view field imaging
US9100562B2 (en) Methods and apparatus for coordinated lens and sensor motion
JP2005321797A (en) Image-stabilization system and method
KR20100015320A (en) A device for providing stabilized images in a hand held camera
US20110158617A1 (en) Device for providing stabilized images in a hand held camera
JP6128109B2 (en) Image capturing apparatus, image capturing direction control method, and program
JP2014130131A (en) Imaging apparatus, semiconductor integrated circuit and imaging method
US20100128164A1 (en) Imaging system with a dynamic optical low-pass filter
KR20220058593A (en) Systems and methods for acquiring smart panoramic images
US8582016B2 (en) Photographing apparatus and focus detecting method using the same
JP2007228007A (en) Digital camera
JP5656507B2 (en) Shooting system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09724150

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09724150

Country of ref document: EP

Kind code of ref document: A1