USRE46239E1 - Method and system for image construction using multiple exposures - Google Patents

Method and system for image construction using multiple exposures

Info

Publication number
USRE46239E1
USRE46239E1 (application US 13/904,351)
Authority
US
United States
Prior art keywords
imaging system
exposure
image
movement
during
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US13/904,351
Inventor
Hannu Kakkori
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Technologies Oy
Original Assignee
Core Wireless Licensing SARL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Core Wireless Licensing SARL
Priority to US 13/904,351 (USRE46239E1)
Priority to US 15/351,832 (USRE48552E1)
Application granted
Publication of USRE46239E1
Assigned to CONVERSANT WIRELESS LICENSING S.A R.L.: change of name (see document for details). Assignors: CORE WIRELESS LICENSING S.A.R.L.
Assigned to NOKIA TECHNOLOGIES OY: assignment of assignors' interest (see document for details). Assignors: CONVERSANT WIRELESS LICENSING S.A R.L.
Legal status: Active
Expiration: Adjusted

Classifications

    • H04N 5/23232
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 Motion detection
    • H04N 23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/682 Vibration or motion blur correction
    • H04N 23/684 Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time
    • H04N 23/6845 Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the integration time by combination of a plurality of images sequentially taken
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/142 Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N 2007/145 Handheld terminals

Abstract

A motion sensor is used to sense the movement of the camera during an exposure period. The camera has an image sensor to form one or more exposures. When the movement is within a certain range, the exposures are used to provide one or more frames so that an image can be constructed based on the frames. In one embodiment, the exposure period is divided into several short intervals in order to capture several image frames and only the image frames captured when the position of the camera is within a predetermined range are used to form the final image. The exposure time for each frame is small in order to reduce the motion blur degradation of the individual frames. If the camera is stable and substantially stationary relative to the scene, then all or many of the shorter frames are used to form the final image.

Description

FIELD OF THE INVENTION
The present invention relates generally to image stabilization and, more particularly, to image stabilization by image processing.
BACKGROUND OF THE INVENTION
The problem of image stabilization dates back to the beginning of photography, and it is related to the fact that an image sensor needs a sufficient exposure time to form a reasonably good image. Any motion of the camera during the exposure time causes a shift of the image projected on the image sensor, resulting in a degradation of the formed image. This motion-related degradation is called motion blur. When holding a camera by hand while taking a picture, it is almost impossible to avoid unwanted camera motion during a reasonably long exposure or integration time. Motion blur is particularly likely to occur when the camera is set at a high zoom ratio, where even a small motion can significantly degrade the quality of the acquired image. One of the main difficulties in restoring motion-blurred images is that the motion blur differs from one image to another, depending on the actual camera motion that took place during the exposure time.
The ongoing development and miniaturization of consumer devices that have image acquisition capabilities increases the need for robust and efficient image stabilization solutions. The need is driven by two main factors:
1. The difficulty of avoiding unwanted motion during the integration time when using a small hand-held device (like a camera phone).
2. The need for longer integration times due to the small pixel area resulting from the miniaturization of image sensors in conjunction with the increase in image resolution. The smaller the pixel area, the fewer photons per unit time can be captured by the pixel, so a longer integration time is needed for good results.
Image stabilization is usually carried out using a technique called the single-frame solution. The single-frame solution is based on capturing a single image frame during a long exposure time. This is the classical case of image capturing, where the acquired image is typically corrupted by motion blur caused by the motion that has taken place during the exposure time. In order to restore the image, it is necessary to have very accurate knowledge of the motion that took place during the exposure time. Consequently, this approach may need quite expensive motion sensors (gyroscopes), which, apart from their cost, are also large in size and hence difficult to include in small devices. In addition, if the exposure time is long, the position information derived from the motion sensor output exhibits a bias drift error with respect to the true value. This error accumulates over time such that at some point it may significantly affect the outcome of the system.
In the single-frame solution, a number of methods have been used to reduce or eliminate the motion blur. Optical image stabilization generally involves laterally shifting the image projected on the image sensor in compensation for the camera motion. Shifting of the image can be achieved by one of the following four general techniques:
Lens shift—this optical image stabilization method involves moving one or more lens elements of the optical system in a direction substantially perpendicular to the optical axis of the system;
Image sensor shift—this optical image stabilization method involves moving the image sensor in a direction substantially perpendicular to the optical axis of the optical system;
Liquid prism—this method involves changing a layer of liquid sealed between two parallel plates into a wedge in order to change the optical axis of the system by refraction; and
Camera module tilt—this method keeps all the components in the optical system unchanged while tilting the entire module so as to shift the optical axis in relation to a scene.
In any one of the above-mentioned image stabilization techniques, an actuator mechanism is required to effect the change in the optical axis or the shift of the image sensor. Actuator mechanisms are generally complex, which means that they are expensive and large in size.
Another approach to image stabilization is the multi-frame method. This method is based on dividing a long exposure time into several shorter intervals and capturing several image frames of the same scene in those shorter intervals. The exposure time for each frame is small in order to reduce the motion blur degradation of the individual frames. After capturing all these frames, the final image is calculated in two steps:
    • Registration step: register all image frames with respect to one of the images chosen as reference, and
    • Pixel fusion: calculate the value of each pixel in the final image based on the corresponding values in all individual frames. One simple method of pixel fusion could be to calculate the final value of each pixel as the average of its values in the individual frames.
The main problems in a typical multi-frame image stabilization solution include:
1. Complex computation in image registration, and
2. Moving objects in the scene: if there are objects in the scene that are moving during the time the image frames are acquired, these objects are distorted in the final image. The distortion consists of multiple instances of the objects being pasted together.
It is desirable to provide a simpler method and system for image stabilization.
SUMMARY OF THE INVENTION
The present invention relates to the multi-frame method based on capturing a single image frame or several image frames of the same scene in shorter intervals. The number of captured frames is determined by the motion blur caused by the camera motion and the implementation of embodiments.
According to one embodiment of the present invention, a long exposure time is divided into several short intervals in order to capture a plurality of image frames and only the image frames that are captured when the position of the camera is within a predetermined range are used to form a final image. The exposure time for each frame is small in order to reduce the motion blur degradation of the individual frames. If the camera is stable and substantially stationary relative to the scene, then all or many of the shorter frames are used to form the final image. If the camera is not sufficiently stable, then one or a few shorter frames are used.
According to other embodiments, the duration of exposures to the image sensor is determined by the camera motion during the exposures. Multiple captured frames from multiple exposures may be used to form a final image. Alternatively, only a single frame is captured from the multiple exposures and that single frame is used to form the final image.
If multiple frames are used to form the final image, the pixel intensity values of the corresponding pixels in the frames are summed in order to obtain the final image. The summing process can be done in the image sensor or in a processor.
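A minimal sketch of this summing step, assuming the selected short frames are already available as equally sized arrays of linear pixel intensities (the helper name sum_frames is an editorial invention, not from the patent):
    import numpy as np

    def sum_frames(frames):
        # frames: list of numpy arrays of identical shape, one per selected short frame,
        #         holding linear (not gamma-corrected) pixel intensities
        acc = np.zeros_like(frames[0], dtype=np.float64)
        for frame in frames:
            acc += frame          # corresponding pixels are summed; no registration or shift
        return acc                # scale or clip to the output bit depth afterwards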
The present invention uses a motion sensor to sense the camera movement during the exposure time. If the camera movement exceeds a predetermined range relative to a reference point, then the shorter frames captured during this large movement period are discarded. Alternatively, the image sensor is effectively not exposed during a large movement period. The exposing light can be shut off by a mechanical shutter, by an optical valve or by an electronic circuit in the image sensor. With the frame selection or with the selective exposure method of the present invention, there is no need to optically or electronically shift the images captured in the shorter frames in the pixel fusion process.
Thus, it is a first aspect of the present invention to provide a method to stabilize an image acquired in an imaging system during an exposure period. The method comprises:
exposing a projected image on an image sensor of the imaging system at least part of the exposure period for attaining one or more exposures;
sensing movement of the imaging system during the exposure period for obtaining a movement amount relative to an initial position of the imaging system in the exposure period; and
constructing the acquired image based on one or more exposures attained when the movement amount is within a predetermined movement range in the exposure period.
According to one embodiment, the one or more exposures attained when the movement amount is within the predetermined movement range form a single image frame during the exposure period, and the method further comprises capturing the single image frame after the exposure period for constructing the acquired image.
According to another embodiment, one or more exposures attained when the movement amount is within the predetermined movement range separately form one or more image frames during the exposure period, and the method further comprises capturing the image frames at least during the exposure period for constructing the acquired image.
According to a different embodiment, the exposure period is divided into a plurality of shorter time periods and said one or more exposures attained during at least part of the exposure period form one or more image frames, each image frame for one shorter time period, said method further comprising capturing said one or more image frames at least during the exposure period; and selecting the captured image frames formed from the one or more exposures when the movement amount is within the predetermined movement range for constructing the acquired image.
It is a second aspect of the present invention to provide an imaging system which comprises:
an image sensor for attaining one or more exposures during an exposure period;
a movement sensor for sensing movement of the imaging system during the exposure period for obtaining a movement amount relative to an initial position of the imaging system in the exposure period; and
a processor, operatively connected to the image sensor, for constructing an image based on one or more exposures attained when the movement amount is within a predetermined movement range.
The imaging system further comprises an optical system for providing a projected image on the image sensor so as to allow the image sensor to attain the one or more exposures during the exposure period, and a shutter, positioned in relationship to the optical system, for preventing the projected image from reaching the image sensor when the movement amount is outside the predetermined movement range.
Alternatively, the imaging system further comprises an electronic circuit operatively connected to the image sensor for preventing the image sensor from attaining an exposure when the movement amount is outside the predetermined movement range. The electronic circuit can provide a signal to indicate whether the movement amount is within the predetermined movement range so as to allow the image sensor to attain said one or more exposures only when the movement amount is within the predetermined range.
It is a third aspect of the present invention to provide an image stabilization module for use in an imaging system, wherein the imaging system comprises an image sensor, an optical module for projecting an image on the image sensor so as to allow the image sensor to attain one or more exposures during an exposure period, and a processor, operatively connected to the image sensor, for constructing an image based on one or more exposures. The image stabilization module comprises:
a movement sensor for sensing movement of the imaging system during the exposure period; and
means, operatively connected to the movement sensor, for determining a movement amount of the imaging system relative to an initial position of the imaging system in the exposure period, and for providing a signal indicative of whether the movement amount is within a predetermined movement range to the processor, so that the processor attains the one or more exposures only when the movement amount is within the predetermined range.
It is possible that a light shutter is used for preventing the projected image from reaching the image sensor when the movement amount is out of the predetermined movement range.
The present invention will become apparent upon reading the description taken in conjunction with FIGS. 1 to 9.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 shows a shift in the image on the image sensor due to a linear movement of the camera.
FIG. 2 shows a shift in the image on the image sensor due to a rotational movement of the camera.
FIG. 3 shows the relationship of the distance of an image shift to the angular change of the image shift.
FIG. 4a illustrates a track of a projected image spot on the image plane due to the camera movement.
FIG. 4b illustrates the track of a projected image spot on the image plane and a different wanted exposure area.
FIG. 4c illustrates the track of a projected image spot on the image plane and another wanted exposure area.
FIG. 4d illustrates the track of a projected image spot on the image plane and yet another wanted exposure area.
FIG. 5 is a time-chart illustrating how the exposures are read out, according to one embodiment of the present invention.
FIG. 6 is a time-chart illustrating how the exposures are read out, according to another embodiment of the present invention.
FIG. 7 is a time-chart illustrating how the exposures are read out, according to yet another embodiment of the present invention.
FIG. 8 is a schematic representation showing the motion stabilization system, according to the present invention.
FIG. 9 is a flowchart illustrating the method of image stabilization, according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION
When using a small hand-held device, such as a camera phone, to take a picture, movement of the device relative to the scene is most of the time unavoidable. If the exposure time is long, image blur occurs. Image blur is the result of the image shift in the image plane. As shown in FIG. 1, an image point P on the image sensor is shifted to point P′ due to a linear movement of the camera relative to a point S in the scene. FIG. 2 shows the image shift due to a rotational movement of the camera relative to point S. If the image shift distance, D, between point P and point P′ is larger than three or four pixels, then the image quality may be poor. Thus, it is desirable to limit the camera movement such that the image shift is within a predetermined range, say one or two pixels. The image shift distance varies not only with the camera movement, but also with the focal distance, f, between the image plane and the lens. In a camera with a zoom lens, the image shift distance is greater when the lens is zoomed to a longer focal distance.
The image shift distance, D, can be related to a shift angle, α, as shown in FIG. 3. The shift angle, α, is approximately equal to D/f. With the same amount of camera movement, the shift angle, α, does not significantly change with the focal distance, f.
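Because α is approximately D/f, a wanted image shift expressed in pixels maps directly to an angular tolerance on the camera movement. A small worked example follows (the pixel pitch and focal distance are assumed values, not taken from the patent):
    # Convert a wanted exposure area of +/- 1 pixel into a wanted angular range.
    pixel_pitch_m = 1.4e-6       # assumed sensor pixel pitch: 1.4 micrometres
    focal_distance_m = 5.6e-3    # assumed focal distance f: 5.6 mm
    wanted_shift_px = 1.0        # allowed image shift D: one pixel

    # alpha ~= D / f, with D expressed in metres on the sensor
    wanted_angle_rad = (wanted_shift_px * pixel_pitch_m) / focal_distance_m
    # about 0.25 mrad, i.e. roughly 0.014 degrees of allowed camera rotation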
If the camera is not stable during the long exposure time, an image point P in the image plane may move around in response to the camera movement relative to the scene. In general, the user of the camera tries to keep the camera aimed at the scene. Thus, although the camera moves during the long exposure time, the image point does not wander very far from its initial position P. FIGS. 4a to 4d illustrate the track of an image point during the long exposure time. When the track crosses itself, this indicates that the camera has moved back to the same aiming direction or position after moving away from it. However, the track may or may not cross the initial image point P.
The image stabilization method, according to the present invention, relates to the multi-frame method based on capturing a single image frame or several image frames of the same scene in shorter intervals. The number of captured frames is determined by the motion blur caused by the camera motion and the implementation of embodiments.
According to one embodiment of the present invention, a long exposure time is divided into a plurality of short intervals in order to capture a plurality of image frames, and only the image frames captured when the position of the camera is within a predetermined range are used to form a final image. The exposure time for each frame is small in order to reduce the motion blur degradation of the individual frames. If the camera is stable and substantially stationary relative to the scene, then all or many of the shorter frames are used to form the final image. If the camera is not sufficiently stable, then one or a few shorter frames are used.
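As a rough illustration of how such a division might be chosen (an editorial sketch in Python, not part of the patent; the function name split_exposure and the assumed hand-shake rate are invented), the number of short intervals can be set so that the expected image shift within any single interval stays below about one pixel:
    import math

    def split_exposure(total_exposure_s, angular_rate_rad_s, focal_px, max_shift_px=1.0):
        # total_exposure_s  : total exposure time needed for adequate light
        # angular_rate_rad_s: assumed typical hand-shake angular rate (rad/s)
        # focal_px          : focal distance f expressed in pixel units
        # max_shift_px      : allowed image shift per short frame, in pixels
        shift_px_per_s = focal_px * angular_rate_rad_s        # D = f * alpha, per second
        max_interval_s = max_shift_px / shift_px_per_s        # longest blur-free interval
        n_frames = max(1, math.ceil(total_exposure_s / max_interval_s))
        return n_frames, total_exposure_s / n_frames

    # Example: 1/8 s total exposure, about 0.5 deg/s of hand shake, f equivalent to 4000 px
    # gives 5 short frames of 25 ms each.
    frames, interval = split_exposure(0.125, math.radians(0.5), 4000.0)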
According to other embodiments, the duration of exposures to the image sensor is determined by the camera motion during the exposures. Multiple captured frames from multiple exposures may be used to form a final image. Alternatively, only a single frame is captured from the multiple exposures and that single frame is used to form the final image.
As shown in FIG. 4a, although the track does not pass the image point P during a certain exposure time, it may pass through the pixel where the image point P is initially located. The pixel is indicated by the area defined by a dotted rectangle and the track passes through the pixel at t1. In this case, at least the initial shorter frame and the shorter frame at t1 can be used to form the final image. Let us call the area defined by the dotted rectangle a “wanted exposure area”.
According to the present invention, some or all of the shorter frames in which the track of an image point passes through the wanted exposure area are used to form the final image. The sharpness of the final image depends upon how large the wanted exposure area is. In a digital camera, the smallest wanted exposure area is a pixel. However, the wanted exposure area can be larger than a pixel. When the wanted exposure area is increased, it is more likely that the track passes through the wanted exposure area. As shown in FIG. 4b, the track passes the wanted exposure area again at t2. Thus, at least three shorter frames can be used to form the final image.
Alternatively, a wanted exposure angular range, instead of the wanted exposure area, can be used for selecting shorter frames in forming the final image. The wanted exposure angular range can be defined by the wanted exposure area divided by the focal distance, f, of the camera. In FIG. 4c, the wanted exposure angular range is bound by a dotted circle. In FIG. 4d, the wanted exposure angular range is bound by a dotted ellipse.
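One possible way to represent the two selection criteria is sketched below (an editorial illustration with invented names): a rectangular wanted exposure area tested in pixel coordinates, and a circular or elliptical wanted exposure angular range tested in angle coordinates.
    from dataclasses import dataclass

    @dataclass
    class WantedExposureArea:
        half_width_px: float        # half-dimensions of the dotted rectangle, in pixels
        half_height_px: float

        def contains(self, dx_px, dy_px):
            # dx_px, dy_px: shift of the tracked image point from its initial position P
            return abs(dx_px) <= self.half_width_px and abs(dy_px) <= self.half_height_px

    @dataclass
    class WantedExposureAngle:
        half_angle_x_rad: float     # semi-axes of the dotted circle or ellipse
        half_angle_y_rad: float

        def contains(self, ax_rad, ay_rad):
            # ax_rad, ay_rad: camera rotation about the two axes relative to the start
            nx = ax_rad / self.half_angle_x_rad
            ny = ay_rad / self.half_angle_y_rad
            return nx * nx + ny * ny <= 1.0     # inside the ellipse (a circle if the axes are equal)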
It should be noted that, with the same camera movement, there is more than one way to form a final image, as shown in FIGS. 5 to 7. The camera movement is shown in FIGS. 5(d), 6(d) and 7(d). As shown, some part of the camera movement is within a predetermined range depicted as the "wanted exposure area" (or angle). Only the exposures to the image sensor when the camera movement is within the predetermined range are used. The exposures start when the shutter button on the camera is activated, as shown in FIGS. 5(a), 6(a) and 7(a). In FIGS. 5(b) and 6(b), the image sensor is effectively exposed only when the camera movement is within the predetermined range. If the camera movement is outside the predetermined range, the exposing light is shut off by a mechanical or optical shutter, or by an electronic circuit or a plurality of electronic elements within the image sensor. In an image sensor such as a charge-coupled device (CCD), electric charges will accumulate in the pixels over an exposure period to form an image. In general, the accumulated charges in each pixel are read out as pixel intensity. After each exposure period, a frame is captured, as shown in FIG. 6(c). As shown in FIG. 6(d), the track of the camera movement moves out of the wanted exposure area three times and, therefore, there are three exposures after the shutter button is activated. Accordingly, three frames are separately and individually captured to be used in the final image. In this embodiment, the pixel intensities are summed in a processor operatively connected to the image sensor.
Alternatively, only a single frame is read out after the picture is taken, as shown in FIG. 5(c). This single frame effectively sums the pixel intensities in three different exposure periods.
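Numerically, gating the charge accumulation and reading out one frame at the end is equivalent to summing only those per-interval contributions acquired while the camera stayed inside the wanted range. A simplified model of this single-readout variant follows (the names are illustrative assumptions, not from the patent):
    import numpy as np

    def gated_single_frame(interval_images, enable_flags):
        # interval_images: light reaching the sensor during each short interval
        # enable_flags   : True while the movement amount is inside the wanted range
        acc = np.zeros_like(interval_images[0], dtype=np.float64)
        for image, enabled in zip(interval_images, enable_flags):
            if enabled:
                acc += image      # charge accumulates only while exposure is enabled
        return acc                # read out once, after the whole exposure period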
In a different embodiment, the long exposure period for taking a picture is divided into a plurality of short periods and a frame is captured for the exposure in each short period. The image for each captured frame is read out while the picture is taken. As shown in FIG. 7(c), only the frames captured for the exposures when the camera movement is within the predetermined range are used for summing. In FIG. 7(c), the used frames are labeled “OK” and the discarded frames are labeled “NG”. In this embodiment, the pixel intensities of the used frames are summed in a processor operatively connected to the image sensor. Although FIG. 7(b) shows an effective light shutter period, no shutter is needed for this embodiment.
In order to select the shorter frames for forming a final image, the present invention uses a motion sensor, such as a gyroscope or an accelerometer, to selectively shut off the image sensor when the camera motion is out of the wanted exposure angular range or out of the wanted exposure area in regard to the initial position. As shown in FIG. 8, the imaging system 10 of the present invention comprises one or more lenses 20 to project an image on the image sensor 30. An image/signal processor 40 is configured to read out the images formed on the image sensor. When the illumination is adequate, one short frame may be sufficient to capture the image of a scene. In a low light situation, many short frames are used to capture a plurality of short exposure images so that the pixel intensities in the short frames can be summed by the image/signal processor 40 to form a final image. A motion sensor 50, operatively connected to the image/signal processor 40, sends a signal to the processor 40 to effectively shut off the image sensor 30 when the camera movement is out of the wanted exposure angular range or the wanted exposure area. The exposing light can be shut off by a mechanical shutter or an optical valve 55, for example.
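A schematic sketch of this gating behaviour (editorial Python with invented names) assumes a gyroscope that reports angular rate at a fixed sample period; integrating the rate gives the rotation relative to the initial position, and the resulting flag could drive the shutter or optical valve 55, or disable charge accumulation in the image sensor 30:
    def exposure_enable_signal(gyro_rates_rad_s, dt_s, wanted_angle_rad):
        # gyro_rates_rad_s: angular-rate samples (rad/s) from the motion sensor,
        #                   starting when the shutter button is activated
        # dt_s            : sample period of the motion sensor
        # wanted_angle_rad: wanted exposure angular range (half-angle)
        angle = 0.0                        # rotation relative to the initial position
        for rate in gyro_rates_rad_s:
            angle += rate * dt_s           # simple rectangular integration
            yield abs(angle) <= wanted_angle_rad   # True: keep exposing; False: shut off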
In many imaging systems, the exposure time varies with the illumination. A longer exposure time is used when the illumination is lower. For that purpose, a light sensor 60 is used. In the imaging system, according to the present invention, the exposure time can also be dependent upon the illumination. Thus, in a low light situation, the exposure time can be increased so as to increase the chance for the track of an image point to pass through the wanted exposure area or angular range. However, it is also possible to increase the wanted exposure angular range or the wanted exposure area in a low light situation.
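As a rough illustration of this trade-off (the thresholds and scaling below are assumptions, not values from the patent), the reading from light sensor 60 could be used either to lengthen the exposure period or to widen the wanted exposure range:
    def adapt_to_illumination(lux, base_exposure_s=0.02, base_range_px=1.0,
                              reference_lux=200.0, widen_range_instead=False):
        # In low light, either lengthen the total exposure period (more short frames,
        # more chances for the track to re-enter the wanted area) or widen the wanted
        # exposure area itself; the reference level and cap are assumed values.
        scale = max(1.0, reference_lux / max(lux, 1.0))
        if widen_range_instead:
            return base_exposure_s, base_range_px * min(scale, 3.0)   # cap the widening
        return base_exposure_s * scale, base_range_px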
In sum, the present invention uses a single captured frame or a plurality of captured frames to form a final image. If multiple frames are used to form the final image, the pixel intensity values of the corresponding pixels in the frames are summed in order to obtain the final image. The summing process can be done in the image sensor or in a processor. The overall stabilization process is summarized in a flowchart as shown in FIG. 9. As shown in the flowchart 100 in FIG. 9, the exposures of a projected image to the image sensor start at step 110 when the shutter button on the camera is activated. The sensing of the camera movement starts immediately at step 120 in order to determine whether the camera movement is within a wanted exposure area or angle. The image frames are captured at step 130 either during the exposure period or after the exposure period. At step 140, the image frames formed from the exposures when the movement is within the wanted exposure area or angle are used to construct the final image. In one embodiment of the present invention, the projected image is prevented from reaching the image sensor when the movement exceeds the wanted exposure area or angle. It is advantageous that the movement of the camera is determined using only a subset of pixels on the image sensor.
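Putting steps 110 to 140 together for the variant of FIG. 7, in which every short frame is read out and the selection is made afterwards, a simplified model could look as follows (capture_short_frame and read_movement are hypothetical helpers standing in for the camera hardware):
    import numpy as np

    def stabilized_capture(n_frames, capture_short_frame, read_movement, wanted_range):
        # capture_short_frame(): returns one short-exposure frame as a numpy array
        # read_movement()      : returns the (x, y) movement amount relative to the
        #                        initial position during that short frame
        # wanted_range         : object with a contains(dx, dy) test, as sketched above
        selected = []                                       # frames labelled "OK"
        for _ in range(n_frames):                           # steps 110 and 130
            frame = capture_short_frame()
            if wanted_range.contains(*read_movement()):     # step 120
                selected.append(frame)                      # otherwise the frame is "NG"
        if not selected:
            return None                                     # no usable frame this time
        return np.sum(np.stack(selected), axis=0, dtype=np.float64)    # step 140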
The present invention uses a motion sensor to sense the camera movement during the exposure time. If the camera movement exceeds a predetermined range relative to a reference point, then the shorter frames captured during this large movement period are discarded. Alternatively, the image sensor is effectively not exposed during a large movement period. The exposing light can be shut off by a mechanical shutter, by an optical valve or by an electronic circuit in the image sensor. With the frame selection or with the selective exposure method of the present invention, there is no need to optically or electronically shift the images captured in the shorter frames in the pixel fusion process.
Thus, although the present invention has been described with respect to one or more embodiments thereof, it will be understood by those skilled in the art that the foregoing and various other changes, omissions and deviations in the form and detail thereof may be made without departing from the scope of this invention.

Claims (42)

What is claimed is:
1. A method comprising:
exposing a projected image on an image sensor of an imaging system during at least part of an exposure period for attaining at least two exposures;
sensing movement of the imaging system, during the exposure period, in order to determine an amount of movement of the imaging system relative to an initial position of the imaging system in the exposure period; and
constructing an image based on at least a first exposure and a second exposure, wherein the first exposure is attained during a first time period, within the exposure period, in which the determined amount of movement of the imaging system is within a predetermined movement range relative to the initial position of the imaging system, the second exposure is attained during a second time period, within the exposure period, in which the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, and between the first time period and the second time period there is a third time period in which the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
2. A method according to claim 1, wherein the first exposure and the second exposure, attained when the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, form a single image frame during the exposure period, said method further comprising:
capturing the single image frame after the exposure period for constructing the acquired image.
3. A method according to claim 1, wherein the first exposure and the second exposure, attained when the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, separately form at least two image frames during the exposure period, said method further comprising:
capturing the at least two image frames at least during the exposure period for constructing the acquired image.
4. A method according to claim 1, wherein the exposure period is divided into a plurality of shorter time periods and wherein the first exposure, attained during the first time period, forms a first image frame; and the second exposure, attained during the second time period, forms a second image frame, said method further comprising:
capturing said first and second image frames during the exposure period; and
selecting at least the first image frame and the second image frame for use in constructing the image.
5. A method according to claim 1, further comprising
preventing the projected image from reaching the image sensor, during the third time period, when the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
6. A method according to claim 1, wherein the image sensor comprises an array of pixels, each pixel having a pixel area, and the predetermined movement range is determined based on one or more pixel areas.
7. A method according to claim 1, wherein the imaging system comprises an optical system for providing the projected image on the image sensor, and the predetermined movement range is determined based on a focal distance of the optical system.
8. A method according to claim 1, wherein the predetermined movement range is determined based on brightness of at least part of light forming the projected image.
9. An imaging system comprising:
an image sensor configured to attain at least a first exposure and a second exposure during an exposure period;
a movement sensor configured to sense movement of the imaging system, during the exposure period, in order to enable an amount of movement of the imaging system to be determined, relative to an initial position of the imaging system in the exposure period; and
a processor configured to construct an image based on at least the first exposure and the second exposure, wherein the first exposure is attained during a first time period, within the exposure period, in which the determined amount of movement of the imaging system is within a predetermined movement range relative to the initial position of the imaging system, the second exposure is attained during a second time period, within the exposure period, in which the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, and between the first time period and the second time period there is a third time period in which the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
10. An imaging system according to claim 9, wherein the first exposure and the second exposure, attained when the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, form a single image frame during the exposure period, and wherein the image is constructed based on the single image frame captured after the exposure period.
11. An imaging system according to claim 9, wherein the first exposure and the second exposure, attained when the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, separately form at least two image frames during the exposure period, and wherein the image is constructed based on the at least two image frames captured at least during the exposure period.
12. An imaging system according to claim 9, wherein the image sensor is configured to divide the exposure period into a plurality of shorter time periods and the first exposure, attained during the first time period, forms a first image frame; and the second exposure, attained during the second time period, forms a second image frame, and wherein the processor is configured to capture the first and second and third image frames during the exposure period, configured to select the first image frame and the second image frame for use in constructing the image.
13. An imaging system according to claim 9, further comprising
an optical system for providing a projected image on the image sensor so as to allow the image sensor to attain the first exposure and the second exposure during the exposure period.
14. An imaging system according to claim 13, further comprising
a shutter, positioned in relationship to the optical system, configured to prevent the projected image from reaching the image sensor during the third time period, when the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
15. An imaging system according to claim 9, further comprising
an electronic circuit configured to prevent the image sensor from attaining a third exposure during the third time period, when the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
16. An imaging system according to claim 9, wherein the movement sensor provides a signal to indicate that the determined amount of movement of the imaging system is within the predetermined movement range so as to cause the image sensor to attain said first exposure during the first time period and so as to cause the image sensor to attain said second exposure during the second time period.
17. An imaging system, comprising:
means for sensing an image;
means for exposing a projected image on the image sensing means during an exposure period so as to allow the image sensing means to attain at least two exposures;
means for sensing movement of the imaging system, during the exposure period, in order to determine an amount of movement of the imaging system relative to an initial position of the imaging system in the exposure period; and
means for constructing an image based on at least a first exposure and a second exposure, wherein the first exposure is attained during a first time period, within the exposure period, in which the determined amount of movement of the imaging system is within a predetermined movement range relative to the initial position of the imaging system, the second exposure is attained during a second time period, within the exposure period, in which the determined amount of movement of the imaging system is within the predetermined movement range relative to the initial position of the imaging system, and between the first time period and the second time period there is a third time period in which the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
18. An imaging system according to claim 17, further comprising means for capturing at least two image frames for constructing the acquired image.
19. An imaging system according to claim 18, further comprising:
means for selecting the captured image frames for constructing the acquired image.
20. An imaging system according to claim 17, further comprising:
means for preventing the projected image from forming an exposure, during the third time period, when the determined amount of movement of the imaging system is outside the predetermined movement range relative to the initial position of the imaging system.
21. A method for constructing an image, comprising:
beginning an exposure cycle when a shutter button activation of an imaging system is detected;
sensing movement of the imaging system during the exposure cycle for obtaining an amount of movement of the imaging system relative to an initial position of the imaging system;
capturing a plurality of frames of a same scene during the exposure cycle;
for each captured frame of the plurality of captured frames:
determining whether a corresponding amount of movement of the imaging system during the capturing is within a predetermined movement range;
responsive to a determination that the corresponding amount of movement is within the predetermined movement range, labeling the captured frame of the same scene with a first identifier, the first identifier indicating that the captured frame is not to be discarded; and
responsive to a determination that the corresponding amount of movement is not within the predetermined movement range, labeling the captured frame of the same scene with a second identifier, the second identifier indicating that the captured frame is to be discarded; and
constructing an image of the same scene based on at least one captured frame of the same scene that is labeled with the first identifier after discarding at least one captured frame of the same scene that is labeled with the second identifier.
22. The method of claim 21, wherein a length of the exposure cycle varies with illumination.
23. The method of claim 21, wherein the amount of movement of the imaging system is determined using a subset of pixels on the image sensor.
24. The method of claim 21, wherein the amount of movement of the imaging system is determined by a motion sensing component.
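Claims 21-24 describe capturing several frames of the same scene, labeling each frame according to whether the measured movement stayed inside the allowed range, discarding the rest, and constructing the image from the kept frames. The sketch below is a hypothetical Python illustration of that flow; the KEEP/DISCARD labels standing in for the two identifiers, the Frame structure, and the use of simple averaging to combine the kept frames are assumptions, not the patented implementation.

```python
from dataclasses import dataclass
from typing import List
import numpy as np

KEEP = "keep"        # stands in for the claimed "first identifier"
DISCARD = "discard"  # stands in for the claimed "second identifier"

@dataclass
class Frame:
    pixels: np.ndarray   # raw pixel data of one captured frame
    movement: float      # movement of the imaging system measured during capture
    label: str = ""

def construct_image(frames: List[Frame], movement_range: float) -> np.ndarray:
    """Label each captured frame by whether the camera stayed within the
    allowed movement range while it was captured, discard the others, and
    combine the kept frames (here by simple averaging) into one image."""
    for frame in frames:
        frame.label = KEEP if abs(frame.movement) <= movement_range else DISCARD
    kept = [f.pixels.astype(np.float32) for f in frames if f.label == KEEP]
    if not kept:
        raise ValueError("no frame was captured within the movement range")
    return np.mean(kept, axis=0).astype(np.uint8)
```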
25. A method for constructing an image, comprising:
beginning an exposure cycle when a shutter button activation of an imaging system is detected;
sensing movement of the imaging system during the exposure cycle for obtaining an amount of movement of the imaging system relative to a fixed reference point;
controlling the imaging system such that an image sensor is effectively exposed when the amount of movement of the imaging system is within a predetermined movement range relative to the fixed reference point, the image sensor is not effectively exposed when the amount of movement of the imaging system is not within the predetermined movement range relative to the fixed reference point, and the image sensor is effectively re-exposed when the amount of movement of the imaging system is back within the predetermined movement range relative to the fixed reference point; and
constructing an image based on at least one exposure, each exposure obtained when the image sensor is one of exposed and re-exposed during the exposure cycle.
26. The method of claim 25, wherein a frame is captured for each exposure, and the image is constructed based on the captured frames.
27. The method of claim 25, wherein the image is constructed by accumulating pixel intensities that were accumulated during each exposure.
28. The method of claim 25, wherein the image sensor is not effectively exposed by controlling at least one of a mechanical shutter, an optical shutter, and at least one electronic element within the image sensor.
29. The method of claim 25, wherein a length of the exposure cycle varies with illumination.
30. The method of claim 25, wherein the amount of movement of the imaging system is determined using a subset of pixels on the image sensor.
31. The method of claim 25, wherein the amount of movement of the imaging system is determined by a motion sensing component.
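Claims 25-31 describe effectively pausing the exposure while the camera is outside the movement range and resuming it when the camera returns within range, with claim 27 constructing the image by accumulating the pixel intensities gathered during each effective exposure. A minimal sketch of that gating-and-accumulation idea follows; the function names, the use of short sub-exposures as inputs, and the 8-bit clipping are illustrative assumptions only.

```python
import numpy as np

def gated_exposure(sub_exposures, movements, movement_range: float) -> np.ndarray:
    """Accumulate pixel intensities only while the measured movement stays
    inside the allowed range; exposure effectively pauses outside the range
    and resumes when the camera drifts back (illustrative sketch only).

    sub_exposures: iterable of short sensor readouts (2-D arrays)
    movements:     movement measured for each readout
    """
    accumulated = None
    for sub, movement in zip(sub_exposures, movements):
        if abs(movement) > movement_range:
            continue                      # sensor not effectively exposed
        sub = sub.astype(np.float64)
        accumulated = sub if accumulated is None else accumulated + sub
    if accumulated is None:
        raise ValueError("camera never settled inside the movement range")
    return np.clip(accumulated, 0, 255).astype(np.uint8)
```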
32. An imaging system, comprising:
a movement sensor for sensing movement of the imaging system during an exposure cycle to obtain an amount of movement of the imaging system relative to an initial position of the imaging system, the initial position determined at a beginning of the exposure cycle, and the exposure cycle beginning when a shutter button activation of the imaging system is detected;
an image sensor for capturing a plurality of frames of a same scene during the exposure cycle;
a processor configured, for each captured frame of the plurality of captured frames, to:
determine whether a corresponding amount of movement of the imaging system during the capturing is within a predetermined movement range;
responsive to a determination that the corresponding amount of movement is within the predetermined movement range, label the captured frame of the same scene with a first identifier, the first identifier indicating that the captured frame is not to be discarded; and
responsive to a determination that the corresponding amount of movement is not within the predetermined movement range, label the captured frame of the same scene with a second identifier, the second identifier indicating that the captured frame is to be discarded; and
control an overall operation of the imaging system and construct an image of the same scene based on at least one captured frame of the same scene that is labeled with the first identifier after discarding at least one captured frame of the same scene that is labeled with the second identifier.
33. The imaging system of claim 32, wherein a length of the exposure cycle varies with illumination.
34. The imaging system of claim 32, wherein the amount of movement of the imaging system is determined using a subset of pixels on the image sensor.
35. The imaging system of claim 32, wherein the amount of movement of the imaging system is determined by a motion sensing component.
36. An imaging system, comprising:
a movement sensor for sensing movement of the imaging system during an exposure cycle to obtain an amount of movement of the imaging system relative to a fixed reference point, the exposure cycle beginning when a shutter button activation of the imaging system is detected;
a processor configured to:
control the imaging system such that an image sensor is effectively exposed when the amount of movement of the imaging system relative to the fixed reference point is within a predetermined movement range, the image sensor is not effectively exposed when the amount of movement of the imaging system relative to the fixed reference point is not within the predetermined movement range, and the image sensor is effectively re-exposed when the amount of movement of the imaging system relative to the fixed reference point is back within the predetermined movement range, and
construct an image based on at least one exposure, each exposure obtained when the image sensor is one of exposed and re-exposed during the exposure cycle.
37. The imaging system of claim 36, wherein a frame is captured for each exposure, and the processor constructs the image based on the captured frames.
38. The imaging system of claim 36, wherein the processor constructs the image by accumulating pixel intensities that were accumulated during each exposure.
39. The imaging system of claim 36, wherein the image sensor is not effectively exposed by controlling at least one of a mechanical shutter, an optical shutter, and at least one electronic element within the image sensor.
40. The imaging system of claim 36, wherein a length of the exposure cycle varies with illumination.
41. The imaging system of claim 36, wherein the movement sensor determines the amount of movement of the imaging system based on a subset of pixels on the image sensor.
42. The imaging system of claim 36, wherein the amount of movement of the imaging system is determined by a motion sensing component.
US13/904,351 2006-06-22 2013-05-29 Method and system for image construction using multiple exposures Active 2028-11-08 USRE46239E1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/904,351 USRE46239E1 (en) 2006-06-22 2013-05-29 Method and system for image construction using multiple exposures
US15/351,832 USRE48552E1 (en) 2006-06-22 2016-11-15 Method and system for image construction using multiple exposures

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/474,047 US7952612B2 (en) 2006-06-22 2006-06-22 Method and system for image construction using multiple exposures
US13/904,351 USRE46239E1 (en) 2006-06-22 2013-05-29 Method and system for image construction using multiple exposures

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/474,047 Reissue US7952612B2 (en) 2006-06-22 2006-06-22 Method and system for image construction using multiple exposures

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/474,047 Continuation US7952612B2 (en) 2006-06-22 2006-06-22 Method and system for image construction using multiple exposures

Publications (1)

Publication Number Publication Date
USRE46239E1 true USRE46239E1 (en) 2016-12-13

Family

ID=38833114

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/474,047 Ceased US7952612B2 (en) 2006-06-22 2006-06-22 Method and system for image construction using multiple exposures
US13/904,351 Active 2028-11-08 USRE46239E1 (en) 2006-06-22 2013-05-29 Method and system for image construction using multiple exposures
US15/351,832 Active 2028-11-08 USRE48552E1 (en) 2006-06-22 2016-11-15 Method and system for image construction using multiple exposures

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/474,047 Ceased US7952612B2 (en) 2006-06-22 2006-06-22 Method and system for image construction using multiple exposures

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/351,832 Active 2028-11-08 USRE48552E1 (en) 2006-06-22 2016-11-15 Method and system for image construction using multiple exposures

Country Status (6)

Country Link
US (3) US7952612B2 (en)
EP (2) EP2035891B1 (en)
JP (6) JP5284954B2 (en)
CN (2) CN101473266B (en)
ES (1) ES2523462T3 (en)
WO (1) WO2007148169A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE48552E1 (en) * 2006-06-22 2021-05-11 Nokia Technologies Oy Method and system for image construction using multiple exposures

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7548689B2 (en) * 2007-04-13 2009-06-16 Hewlett-Packard Development Company, L.P. Image processing method
US7734161B2 (en) 2007-04-19 2010-06-08 Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. Image stabilization with adaptive shutter control
JP2009017030A (en) * 2007-07-02 2009-01-22 Sony Corp Image imaging apparatus, and imaging control method
TWI381243B (en) * 2008-09-10 2013-01-01 E Ten Information Sys Co Ltd Portable electrical apparatus and operating method thereof
CN101685236B (en) * 2008-09-22 2012-01-25 倚天资讯股份有限公司 Portable electronic device and operating method thereof
US9628711B2 (en) 2011-12-15 2017-04-18 Apple Inc. Motion sensor based virtual tripod method for video stabilization
US20130176463A1 (en) * 2012-01-09 2013-07-11 Nokia Corporation Method and Apparatus for Image Scaling in Photography
US9055222B2 (en) * 2012-02-24 2015-06-09 Htc Corporation Electronic device and method for image stabilization
US9451163B2 (en) * 2012-05-11 2016-09-20 Qualcomm Incorporated Motion sensor assisted rate control for video encoding
JP5962974B2 (en) * 2012-06-04 2016-08-03 カシオ計算機株式会社 Imaging apparatus, imaging method, and program
JP6082274B2 (en) * 2012-06-08 2017-02-15 キヤノン株式会社 Imaging apparatus and control method thereof
JP5974913B2 (en) * 2013-01-25 2016-08-23 富士通株式会社 Imaging method, imaging apparatus, and imaging program
US9955084B1 (en) * 2013-05-23 2018-04-24 Oliver Markus Haynold HDR video camera
US9424598B1 (en) 2013-12-02 2016-08-23 A9.Com, Inc. Visual search in a controlled shopping environment
US9536161B1 (en) * 2014-06-17 2017-01-03 Amazon Technologies, Inc. Visual and audio recognition for scene change events
KR20160019215A (en) * 2014-08-11 2016-02-19 삼성전자주식회사 Photographing apparatus and photographing method thereof
CN104243819B (en) * 2014-08-29 2018-02-23 小米科技有限责任公司 Photo acquisition methods and device
US9843789B2 (en) * 2014-09-08 2017-12-12 Panasonic Intellectual Property Management Co., Ltd. Still-image extracting method and image processing device for implementing the same
KR102229152B1 (en) 2014-10-16 2021-03-19 삼성전자주식회사 Image photographing appratus
CN107211092B (en) * 2014-12-15 2020-07-21 Gvbb控股有限责任公司 Camera, camera system and method for generating image
CN108322655A (en) * 2015-06-12 2018-07-24 青岛海信电器股份有限公司 A kind of photographic method
CN105430265A (en) * 2015-11-27 2016-03-23 努比亚技术有限公司 Method and device for increasing imaging range of camera
BR112018068282B1 (en) 2016-03-11 2022-11-29 Apple Inc CAMERA, CAMERA ACTUATOR AND MOBILE MULTIFUNCTIONAL DEVICE WITH OPTICAL IMAGE STABILIZATION BY VOICE COIL MOTOR TO MOVE IMAGE SENSOR
US10437023B2 (en) 2016-03-28 2019-10-08 Apple Inc. Folded lens system with three refractive lenses
CN105898141B (en) * 2016-04-01 2017-09-19 广东欧珀移动通信有限公司 Control method, control device and electronic installation
CN105847686B (en) * 2016-04-01 2019-10-15 Oppo广东移动通信有限公司 Control method, control device and electronic device
US10306148B2 (en) * 2016-08-30 2019-05-28 Microsoft Technology Licensing, Llc Motion triggered gated imaging
US11190703B2 (en) * 2016-09-30 2021-11-30 Nikon Corporation Image-capturing apparatus, program, and electronic device that controls image sensor based on moving velocity
JP7118893B2 (en) * 2016-12-02 2022-08-16 ソニーセミコンダクタソリューションズ株式会社 Imaging device, imaging method, and electronic device
WO2018143658A1 (en) * 2017-02-06 2018-08-09 Samsung Electronics Co., Ltd. Apparatus of stabilizing shaking image and controlling method thereof
US10890734B1 (en) 2017-03-29 2021-01-12 Apple Inc. Camera actuator for lens and sensor shifting
US10863094B2 (en) 2017-07-17 2020-12-08 Apple Inc. Camera with image sensor shifting
US10462370B2 (en) 2017-10-03 2019-10-29 Google Llc Video stabilization
JP7023676B2 (en) * 2017-11-20 2022-02-22 キヤノン株式会社 Image pickup device and its control method
US10171738B1 (en) 2018-05-04 2019-01-01 Google Llc Stabilizing video to reduce camera and face movement
US11122205B1 (en) 2018-09-14 2021-09-14 Apple Inc. Camera actuator assembly with sensor shift flexure arrangement
JP7034052B2 (en) * 2018-11-02 2022-03-11 京セラ株式会社 Wireless communication head-up display systems, wireless communication devices, mobiles, and programs
JP7187269B2 (en) * 2018-11-05 2022-12-12 キヤノン株式会社 Imaging device and its control method
US10609288B1 (en) * 2019-03-04 2020-03-31 Qualcomm Incorporated Roll compensation and blur reduction in tightly synchronized optical image stabilization (OIS)
US11258951B2 (en) * 2019-06-27 2022-02-22 Motorola Mobility Llc Miniature camera device for stabilized video using shape memory alloy actuators
CN110531578B (en) * 2019-09-02 2021-04-13 深圳大学 Multi-frame framing imaging method, device and equipment
CN114631056A (en) * 2019-10-29 2022-06-14 富士胶片株式会社 Imaging support device, imaging support system, imaging support method, and program
US11190689B1 (en) 2020-07-29 2021-11-30 Google Llc Multi-camera video stabilization
KR20220037876A (en) * 2020-09-18 2022-03-25 삼성전기주식회사 Camera module
US11575835B2 (en) 2020-09-24 2023-02-07 Apple Inc. Multi-axis image sensor shifting system
CN112672059B (en) * 2020-12-28 2022-06-28 维沃移动通信有限公司 Shooting method and shooting device
CN115694089A (en) * 2021-07-27 2023-02-03 北京小米移动软件有限公司 Actuator, camera module and electronic equipment
CN115633262B (en) * 2022-12-21 2023-04-28 荣耀终端有限公司 Image processing method and electronic device

Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2253067A (en) 1991-02-20 1992-08-26 Asahi Optical Co Ltd Blur preventing camera
JPH095816A (en) 1995-06-20 1997-01-10 Olympus Optical Co Ltd Camera capable of reducing camera shake
US5771404A (en) * 1992-01-14 1998-06-23 Nikon Corporation Shake preventing apparatus in camera
JPH11317904A (en) 1998-05-01 1999-11-16 Canon Inc Image pickup device and its control method
US6044228A (en) * 1997-09-09 2000-03-28 Minolta Co., Ltd. Camera capable of shake correction
JP2001166351A (en) 1999-12-08 2001-06-22 Olympus Optical Co Ltd Electronic camera apparatus
US6345152B1 (en) * 1999-04-26 2002-02-05 Olympus Optical Co., Ltd. Camera with blur reducing function
JP2002118780A (en) 2000-10-05 2002-04-19 Ricoh Co Ltd Image pickup device having camera-shake correction function
JP2002116477A (en) 2000-10-11 2002-04-19 Ricoh Co Ltd Image pickup device
JP2002311471A (en) 2001-04-13 2002-10-23 Fuji Photo Optical Co Ltd Vibration-proof device
US6487369B1 (en) * 1999-04-26 2002-11-26 Olympus Optical Co., Ltd. Camera with blur reducing function
JP2003101862A (en) 2001-09-21 2003-04-04 Ricoh Co Ltd Image pickup device and image pickup method
EP1304872A1 (en) 2001-10-19 2003-04-23 Nokia Corporation An image stabiliser for a microcamera module of a handheld device and method for stabilising a microcamera module of a hand-held device
US20030151688A1 (en) * 2002-02-08 2003-08-14 Canon Kabushiki Kaisha Image processing apparatus
JP2004007220A (en) 2002-05-31 2004-01-08 Canon Inc Blur correction camera
US6731799B1 (en) * 2000-06-01 2004-05-04 University Of Washington Object segmentation with background extraction and moving boundary techniques
JP2004201247A (en) 2002-12-20 2004-07-15 Fuji Photo Film Co Ltd Digital camera
US20040160525A1 (en) 2003-02-14 2004-08-19 Minolta Co., Ltd. Image processing apparatus and method
JP2004312663A (en) 2002-12-27 2004-11-04 Sony Corp Recording method, recording apparatus, recording medium, reproducing method, reproducing apparatus, and imaging apparatus
US20040239775A1 (en) * 2003-05-30 2004-12-02 Canon Kabushiki Kaisha Photographing device and method for obtaining photographic image having image vibration correction
EP1501288A2 (en) 2003-07-22 2005-01-26 OmniVision Technologies, Inc. CMOS image sensor using high frame rate with frame addition and movement compensation
CN1638440A (en) 2004-01-06 2005-07-13 株式会社尼康 Electronic camera
US20050248660A1 (en) * 2004-05-10 2005-11-10 Stavely Donald J Image-exposure systems and methods
US20070014554A1 (en) * 2004-12-24 2007-01-18 Casio Computer Co., Ltd. Image processor and image processing program
WO2007031808A1 (en) 2005-09-14 2007-03-22 Nokia Corporation System and method for implementing motion-driven multi-shot image stabilization
US20080052090A1 (en) * 2003-09-04 2008-02-28 Jens Heinemann Method and Device for the Individual, Location-Independent Designing of Images, Cards and Similar
US20090175496A1 (en) * 2004-01-06 2009-07-09 Tetsujiro Kondo Image processing device and method, recording medium, and program

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6429895B1 (en) * 1996-12-27 2002-08-06 Canon Kabushiki Kaisha Image sensing apparatus and method capable of merging function for obtaining high-precision image by synthesizing images and image stabilization function
JP2002077706A (en) * 2000-08-29 2002-03-15 Nikon Corp Image pickup device and image pickup method
JP2002094839A (en) * 2000-09-20 2002-03-29 Matsushita Electric Ind Co Ltd Electronic still camera
JP2002112100A (en) * 2000-09-28 2002-04-12 Nikon Corp Image pickup apparatus
AU2002327612B2 (en) * 2001-09-07 2008-01-17 Intergraph Software Technologies Company Method, device and computer program product for demultiplexing of video images
US7212230B2 (en) * 2003-01-08 2007-05-01 Hewlett-Packard Development Company, L.P. Digital camera having a motion tracking subsystem responsive to input control for tracking motion of the digital camera
US7329057B2 (en) * 2003-02-25 2008-02-12 Matsushita Electric Industrial Co., Ltd. Image pickup processing method and image pickup apparatus
JP2005277812A (en) * 2004-03-25 2005-10-06 Canon Inc Digital camera device with hand fluctuation reducing device and image reproducer
JP2006109079A (en) * 2004-10-05 2006-04-20 Olympus Corp Electronic camera
US7639888B2 (en) * 2004-11-10 2009-12-29 Fotonation Ireland Ltd. Method and apparatus for initiating subsequent exposures based on determination of motion blurring artifacts
US20060182430A1 (en) * 2005-02-15 2006-08-17 Stavely Donald J Camera exposure program coupled to image stabilization capability
CN100389601C (en) * 2005-10-09 2008-05-21 北京中星微电子有限公司 Video electronic flutter-proof device
JP2007300595A (en) * 2006-04-06 2007-11-15 Winbond Electron Corp Method of avoiding shaking during still image photographing
EP2007133A2 (en) * 2006-04-11 2008-12-24 Panasonic Corporation Image pickup device
US7952612B2 (en) * 2006-06-22 2011-05-31 Nokia Corporation Method and system for image construction using multiple exposures

Patent Citations (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2253067A (en) 1991-02-20 1992-08-26 Asahi Optical Co Ltd Blur preventing camera
US5771404A (en) * 1992-01-14 1998-06-23 Nikon Corporation Shake preventing apparatus in camera
JPH095816A (en) 1995-06-20 1997-01-10 Olympus Optical Co Ltd Camera capable of reducing camera shake
US6044228A (en) * 1997-09-09 2000-03-28 Minolta Co., Ltd. Camera capable of shake correction
JPH11317904A (en) 1998-05-01 1999-11-16 Canon Inc Image pickup device and its control method
US6487369B1 (en) * 1999-04-26 2002-11-26 Olympus Optical Co., Ltd. Camera with blur reducing function
US6345152B1 (en) * 1999-04-26 2002-02-05 Olympus Optical Co., Ltd. Camera with blur reducing function
JP2001166351A (en) 1999-12-08 2001-06-22 Olympus Optical Co Ltd Electronic camera apparatus
US6731799B1 (en) * 2000-06-01 2004-05-04 University Of Washington Object segmentation with background extraction and moving boundary techniques
JP2002118780A (en) 2000-10-05 2002-04-19 Ricoh Co Ltd Image pickup device having camera-shake correction function
JP2002116477A (en) 2000-10-11 2002-04-19 Ricoh Co Ltd Image pickup device
JP2002311471A (en) 2001-04-13 2002-10-23 Fuji Photo Optical Co Ltd Vibration-proof device
JP2003101862A (en) 2001-09-21 2003-04-04 Ricoh Co Ltd Image pickup device and image pickup method
EP1304872A1 (en) 2001-10-19 2003-04-23 Nokia Corporation An image stabiliser for a microcamera module of a handheld device and method for stabilising a microcamera module of a hand-held device
US7307653B2 (en) 2001-10-19 2007-12-11 Nokia Corporation Image stabilizer for a microcamera module of a handheld device, and method for stabilizing a microcamera module of a handheld device
US20030151688A1 (en) * 2002-02-08 2003-08-14 Canon Kabushiki Kaisha Image processing apparatus
JP2004007220A (en) 2002-05-31 2004-01-08 Canon Inc Blur correction camera
JP2004201247A (en) 2002-12-20 2004-07-15 Fuji Photo Film Co Ltd Digital camera
JP2004312663A (en) 2002-12-27 2004-11-04 Sony Corp Recording method, recording apparatus, recording medium, reproducing method, reproducing apparatus, and imaging apparatus
US20040160525A1 (en) 2003-02-14 2004-08-19 Minolta Co., Ltd. Image processing apparatus and method
US20040239775A1 (en) * 2003-05-30 2004-12-02 Canon Kabushiki Kaisha Photographing device and method for obtaining photographic image having image vibration correction
US7209601B2 (en) 2003-07-22 2007-04-24 Omnivision Technologies, Inc. CMOS image sensor using high frame rate with frame addition and movement compensation
EP1501288A2 (en) 2003-07-22 2005-01-26 OmniVision Technologies, Inc. CMOS image sensor using high frame rate with frame addition and movement compensation
US20080052090A1 (en) * 2003-09-04 2008-02-28 Jens Heinemann Method and Device for the Individual, Location-Independent Designing of Images, Cards and Similar
JP2005197911A (en) 2004-01-06 2005-07-21 Nikon Corp Electronic camera
CN1638440A (en) 2004-01-06 2005-07-13 株式会社尼康 Electronic camera
US20090175496A1 (en) * 2004-01-06 2009-07-09 Tetsujiro Kondo Image processing device and method, recording medium, and program
US20050248660A1 (en) * 2004-05-10 2005-11-10 Stavely Donald J Image-exposure systems and methods
US20070014554A1 (en) * 2004-12-24 2007-01-18 Casio Computer Co., Ltd. Image processor and image processing program
WO2007031808A1 (en) 2005-09-14 2007-03-22 Nokia Corporation System and method for implementing motion-driven multi-shot image stabilization
US20090009612A1 (en) 2005-09-14 2009-01-08 Nokia Corporation System and method for implementation motion-driven multi-shot image stabilization

Also Published As

Publication number Publication date
CN102769718A (en) 2012-11-07
WO2007148169A1 (en) 2007-12-27
CN101473266A (en) 2009-07-01
JP2013211898A (en) 2013-10-10
USRE48552E1 (en) 2021-05-11
JP2013176160A (en) 2013-09-05
EP2035891A1 (en) 2009-03-18
US7952612B2 (en) 2011-05-31
JP2013034247A (en) 2013-02-14
ES2523462T3 (en) 2014-11-26
EP2035891A4 (en) 2009-12-16
US20070296821A1 (en) 2007-12-27
JP2013214096A (en) 2013-10-17
EP2035891B1 (en) 2014-09-10
JP2013062849A (en) 2013-04-04
EP2428838A1 (en) 2012-03-14
CN101473266B (en) 2012-06-20
JP2009542076A (en) 2009-11-26
JP5284954B2 (en) 2013-09-11

Similar Documents

Publication Publication Date Title
USRE48552E1 (en) Method and system for image construction using multiple exposures
US7424213B2 (en) Camera system, image capturing apparatus, and a method of an image capturing apparatus
US9531944B2 (en) Focus detection apparatus and control method thereof
US20090290028A1 (en) Imaging apparatus and imaging method
US10175451B2 (en) Imaging apparatus and focus adjustment method
US20070147814A1 (en) Image capturing apparatus, method of controlling the same, and storage medium
JP2011081271A (en) Image capturing apparatus
EP0435319B1 (en) Camera apparatus having drift detecting unit
JP2005252657A (en) Electronic still camera
US10187564B2 (en) Focus adjustment apparatus, imaging apparatus, focus adjustment method, and recording medium storing a focus adjustment program thereon
US20040036781A1 (en) Digital camera
JP2004221992A (en) Imaging device and program
JP2008310072A (en) Digital camera
JP2003101862A (en) Image pickup device and image pickup method
JP4876550B2 (en) Imaging apparatus, control method, and control program
JP2002040506A (en) Image pickup unit
JP2010122356A (en) Focus detector and imaging apparatus
JP2016057402A (en) Imaging device and method for controlling the same
JP4007794B2 (en) Imaging apparatus and exposure control method thereof
JP2007025129A (en) Exposure setting method and device
JP2002372664A (en) Moving body region discriminating system, method of discriminating moving body region and focusing device
JP2003224762A (en) Digital camera
JP2016045322A (en) Imaging device and method for controlling the same
JP2003058893A (en) Processor, image pickup device and control method therefor
JPH11119337A (en) Camera

Legal Events

Date Code Title Description
AS Assignment

Owner name: CONVERSANT WIRELESS LICENSING S.A R.L., LUXEMBOURG

Free format text: CHANGE OF NAME;ASSIGNOR:CORE WIRELESS LICENSING S.A.R.L.;REEL/FRAME:044242/0401

Effective date: 20170720

AS Assignment

Owner name: NOKIA TECHNOLOGIES OY, FINLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CONVERSANT WIRELESS LICENSING S.A R.L.;REEL/FRAME:046851/0302

Effective date: 20180416

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12