CN101427372A - Apparatus for multiple camera devices and method of operating same - Google Patents


Info

Publication number
CN101427372A
CN101427372A (publication); CN200580032374A / CNA2005800323740A (application)
Authority
CN
China
Prior art keywords
digital camera
photodetector array
array
lens
wavelength light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2005800323740A
Other languages
Chinese (zh)
Other versions
CN101427372B (en)
Inventor
理查德·扬·奥尔森
达里尔·L·萨托
博登·默勒
奥利韦拉·维托米罗夫
杰弗里·A·布拉迪
费里·古纳万
雷姆济·奥滕
孙风清
詹姆斯·盖茨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Newport Imaging Corp
Original Assignee
Newport Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Newport Imaging Corp filed Critical Newport Imaging Corp
Publication of CN101427372A publication Critical patent/CN101427372A/en
Application granted granted Critical
Publication of CN101427372B publication Critical patent/CN101427372B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N23/69 — Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • G02B3/0062 — Stacked lens arrays, i.e. refractive surfaces arranged in at least two planes, without structurally separate optical elements in-between
    • G02B13/0035 — Miniaturised objectives for electronic devices (e.g. portable telephones, webcams, PDAs, small digital cameras) characterised by the lens design having at least one aspherical surface, having three lenses
    • G02B3/0075 — Lens arrays characterized by non-optical structures, e.g. having integrated holding or alignment means
    • G02B7/04 — Mountings, adjusting means, or light-tight connections for lenses, with mechanism for focusing or varying magnification
    • G02B9/12 — Optical objectives characterised by the number and sign of their components, having three components only
    • G03B13/18 — Focusing aids
    • H01L27/14618 — Imager structures; structural or functional details thereof; containers
    • H01L27/14621 — Imager structures; coatings; colour filter arrangements
    • H01L27/14625 — Imager structures; optical elements or arrangements associated with the device
    • H01L27/14634 — Imager structures; assemblies, i.e. hybrid structures
    • H01L27/14645 — Photodiode arrays; MOS imagers; colour imagers
    • H01L27/1469 — Imager manufacture; assemblies, i.e. hybrid integration
    • H01L31/0232 — Optical elements or arrangements associated with the device
    • H01L31/02325 — Optical elements or arrangements, the optical elements not being integrated nor being directly associated with the device
    • H04N13/214 — Image signal generators using stereoscopic image cameras using a single 2D image sensor using spectral multiplexing
    • H04N13/254 — Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • H04N23/11 — Generating image signals from visible and infrared light wavelengths
    • H04N23/13 — Generating image signals from different wavelengths with multiple sensors
    • H04N23/16 — Optical arrangements associated therewith, e.g. for beam-splitting or for colour correction
    • H04N23/40 — Circuit details for pick-up tubes
    • H04N23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/73 — Compensating brightness variation in the scene by influencing the exposure time
    • H04N23/84 — Camera processing pipelines; components thereof for processing colour signals
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; control thereof
    • H04N25/41 — Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors
    • H04N25/531 — Control of the integration time by controlling rolling shutters in CMOS SSIS
    • H04N5/265 — Studio circuits; mixing
    • G02B3/0031 — Lens arrays characterised by the manufacturing method; replication or moulding, e.g. hot embossing, UV-casting, injection moulding
    • G02B3/0043 — Inhomogeneous or irregular lens arrays, e.g. varying shape, size, height
    • H01L2924/0002 — Indexing scheme for connecting/disconnecting semiconductor bodies; not covered by groups H01L24/00 and H01L2224/00
    • H04N2209/049 — Picture signal generators using solid-state devices having three pick-up sensors
    • H04N25/134 — Arrangement of colour filter arrays [CFA] based on three different wavelength filter elements

Abstract

There are many inventions described herein. In one aspect, what is disclosed is a digital camera including a plurality of arrays of photo detectors, including a first array of photo detectors to sample an intensity of light of a first wavelength and a second array of photo detectors to sample an intensity of light of a second wavelength. The digital camera may further include a first lens disposed in an optical path of the first array of photo detectors, wherein the first lens includes a predetermined optical response to the light of the first wavelength, and a second lens disposed in an optical path of the second array of photo detectors, wherein the second lens includes a predetermined optical response to the light of the second wavelength. In addition, the digital camera may include signal processing circuitry, coupled to the first and second arrays of photo detectors, to generate a composite image using (i) data which is representative of the intensity of light sampled by the first array of photo detectors, and (ii) data which is representative of the intensity of light sampled by the second array of photo detectors; wherein the first array of photo detectors, the second array of photo detectors, and the signal processing circuitry are integrated on or in the same semiconductor substrate.

Description

Apparatus for Multiple Camera Devices and Method of Operating Same
Related Applications
This application claims priority to the following applications: (1) U.S. Provisional Application 60/604,854, entitled "Solid State Camera", filed August 25, 2004; and (2) U.S. Provisional Application 60/695,946, entitled "Method and Apparatus for use in Camera and Systems Employing Same", filed June 1, 2005 (collectively, the "provisional applications"). The contents of the provisional applications are incorporated herein by reference.
Technical field
The field of the invention is digital imaging.
Background
The recent technological change from film to "electronic media" has stimulated rapid growth in the imaging industry, whose applications include still and video cameras, cell phones and other personal communication devices, surveillance equipment, automotive applications, computer-based video communication and conferencing, manufacturing and inspection devices, medical appliances, toys, and a variety of other, continually expanding applications. The lower cost and smaller size of digital cameras, whether as stand-alone products or embedded in other devices, is a primary driving force behind this growth and market expansion.
Although traditional component manufacturers continue to shrink their parts to exploit the advantages of electronic media, it is difficult to meet digital camera makers' increasingly severe demands for smaller size, lower cost, and higher performance. Several major issues remain: (1) the smaller the digital camera (for example, a digital camera in a cell phone), the poorer the image quality; (2) imaging the media at higher quality still requires complex "lenses", shutters, and flash units, which largely negates the size benefit that electronic media provide; and (3) the cost advantage offered by electronic media is negated to some extent by traditionally complex and expensive lens systems and other peripheral components.
Most applications continually seek higher performance (image quality), more features, smaller size, and/or lower cost, in whole or in some combination. These market demands often conflict with one another: higher performance usually requires larger size; improved features may require higher cost and larger size; and, conversely, reduced cost and/or size may come at the expense of performance and/or features. For example, consumers expect their cell phones to produce higher-quality images, but are unwilling to accept the size or cost associated with integrating stand-alone digital camera quality into a pocket-sized phone.
One driving force behind this challenge is the lens system used in digital cameras. As the number of photodetectors (pixels) increases, which increases image resolution, the lens must become larger to cover the increased size of the image sensor carrying the photodetectors. Keeping the image sensor and its optics at a constant size as pixel count increases reduces the pixel size, but pixel performance degrades (the light signal shrinks and crosstalk between pixels grows). Moreover, the desirable "zoom lens" feature adds extra movable optics, size, and cost to the lens system. Zoom performed by changing the focal length of the optics within the lens system, a highly desired feature, is called "optical zoom". Although these attributes (for example, increased pixel count in the image sensor, and optical zoom) benefit image quality and features, they can adversely affect camera size and cost. In some cases, such as cell phones or other devices where size and/or cost is critical, this approach is not the best way to obtain excellent image quality (high resolution and high sensitivity).
Digital camera suppliers have an advantage over traditional film suppliers in the area of zoom capability. Through electronic processing, a digital camera can provide "electronic zoom", which discards the perimeter of an image and then electronically magnifies the central region back to the original size, thereby providing a zoom capability. As with traditional enlargement, some resolution is lost when this process is performed. Moreover, because a digital camera captures discrete inputs to form a picture, unlike the continuous process of film, the lost resolution is more noticeable. Thus, although "electronic zoom" is a desirable feature, it is not a direct substitute for "optical zoom".
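The crop-and-magnify behaviour described above can be illustrated with a minimal sketch (plain Python, nearest-neighbour resampling; the patent does not prescribe any particular interpolation method):

```python
def electronic_zoom(image, zoom):
    """Crop the central 1/zoom region and resize it back to the original
    size by nearest-neighbour interpolation. Detail is lost, unlike
    optical zoom, which changes focal length before the scene is sampled.

    `image` is a list of rows of pixel values; `zoom` >= 1.
    """
    h, w = len(image), len(image[0])
    ch, cw = max(1, int(h / zoom)), max(1, int(w / zoom))   # crop size
    top, left = (h - ch) // 2, (w - cw) // 2                # centred crop
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    # Upscale the crop back to (h, w): each output pixel maps onto a
    # crop pixel, so crop pixels are simply repeated.
    return [[crop[int(y * ch / h)][int(x * cw / w)] for x in range(w)]
            for y in range(h)]
```

At `zoom=1` the function returns the image unchanged; at higher factors each retained pixel covers several output pixels, which is the resolution loss the paragraph describes.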
Traditional digital cameras typically use a single aperture and lens system to image a scene onto one or more image sensors. Color separation (if required), such as into red, green, and blue (RGB), is typically achieved by one of three methods: (1) a color filter array on a single integrated-circuit image sensor; (2) multiple image sensors with a color separation component (such as a prism) in the optical path; or (3) color separation within each pixel combined with the imager's ability to collect multiple signals. Each of these three color separation methods has limitations, as described below.
A color filter array, such as the frequently used Bayer pattern, alternates the incident color between neighboring pixels on the array, and color crosstalk occurs that hinders accurate color reproduction of the original image. Because the array interleaves pixels with different color capabilities, interpolation techniques are needed to create a proper color image. Color filter arrays can also reduce the optical signal level received and exhibit low, variable pixel-to-pixel transmittance, producing non-uniformity in the image.
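The interpolation step that a colour-filter-array sensor requires, and that the per-colour arrays described later avoid, can be sketched for an RGGB Bayer mosaic. Averaging each missing colour over a 3x3 neighbourhood is one common simple choice; it is an assumption for illustration, not something this document specifies (even image dimensions of at least 2x2 are assumed):

```python
def bayer_color(y, x):
    """Colour of site (y, x) in an RGGB Bayer mosaic."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

def demosaic(mosaic):
    """Simple demosaic: reconstruct all three colours at every pixel by
    averaging the nearest mosaic sites of each colour. Neighbouring
    sites of *other* colours leaking into the estimate is exactly the
    interpolation/crosstalk cost discussed in the text."""
    h, w = len(mosaic), len(mosaic[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            px = {}
            for c in 'RGB':
                vals = [mosaic[j][i]
                        for j in range(max(0, y - 1), min(h, y + 2))
                        for i in range(max(0, x - 1), min(w, x + 2))
                        if bayer_color(j, i) == c]
                px[c] = sum(vals) / len(vals)
            out[y][x] = (px['R'], px['G'], px['B'])
    return out
```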
Multiple imagers used with a color separation component such as a prism provide accurate color reproduction, but the optical assembly is large and expensive.
In-pixel color separation produces color crosstalk and inaccurate color reproduction. Because multiple color charge collection and readout devices are needed within each pixel, reduction of the pixel size is limited.
Technological advances in lens design and manufacturing, in the shrinking of integrated-circuit imager pixel sizes, and in digital post-processing have opened new possibilities for cameras and imaging systems whose form and function differ greatly from the long-established digital camera design. Using multiple camera channels (multiple optics, image sensors, and electronics) allows a single compact assembly to be made into a digital camera with improved image quality, reduced physical thickness, and increased imaging functionality, which cannot be achieved with the traditional single-aperture/optics digital camera architecture.
Summary of the invention
It should be understood that many inventions are described and illustrated herein. Indeed, the present inventions are not limited to any single aspect or embodiment, nor to any combination and/or permutation of such aspects and/or embodiments. Moreover, each aspect of the present inventions, and embodiments thereof, may be employed alone or in combination with one or more of the other aspects and/or embodiments. For the sake of brevity, many of these permutations and combinations are not discussed separately here.
In one aspect of the invention, an image sensor comprises independent first and second photodetector arrays and signal processing circuitry that combines the signals from the arrays to produce a composite image.
Preferred embodiments comprise three or more photodetector arrays, in which the signal processing circuitry processes the signal from each array and then combines the signals from all arrays to produce the composite image. Using multiple arrays in this way allows each array to be optimized in some respect, such as for the reception of a particular color. Thus, for example, the arrays can be optimized to detect light of different colors or other wavelengths. A "color" can be narrowband or broadband, such as red, green, or blue. The bands may even overlap.
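A minimal sketch of the combination step described above, assuming three arrays that each deliver a full-resolution plane for one colour band (the patent does not give an implementation; this merely stacks the planes):

```python
def composite_image(red_plane, green_plane, blue_plane):
    """Combine the outputs of three single-colour photodetector arrays
    into one RGB composite. Because every array samples its band at
    every site, no colour interpolation between neighbouring pixels is
    needed, which is the crosstalk advantage over a colour filter array.
    Illustrative sketch only; assumes same-sized, registered planes."""
    assert len(red_plane) == len(green_plane) == len(blue_plane)
    return [[(r, g, b) for r, g, b in zip(rrow, grow, brow)]
            for rrow, grow, brow in zip(red_plane, green_plane, blue_plane)]
```

In a real device the planes come from physically offset arrays, so registration/parallax correction would precede this step; that correction is omitted here.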
Optimization can be accomplished in any desired manner, including, for example, different average pixel depths, row logic, analog signal logic, black-level logic, exposure control, image processing techniques, and lens design and coloration.
A sensor with two or more different arrays can advantageously have a different lens over each array. Preferred lenses can employ die coating, dye diffused in the optical medium, substantially uniform color filters, and/or any other color filtering technique by which light is passed to the array below.
The processing circuitry can include any suitable mechanisms and/or logic. Of particular interest is circuitry that generates multiple independent images from the different arrays and then combines them to form a single image. In doing so, the signal processing circuitry advantageously performs image enhancement functions, such as handling saturation, sharpness, intensity, and hue, artifact removal, and correction of defective pixels.
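One of the listed enhancement functions, defective-pixel correction, can be sketched as neighbour-median replacement. This is an assumed, commonly used approach; the patent does not specify the algorithm:

```python
from statistics import median

def correct_defective_pixels(plane, defects):
    """Replace each known-defective photodetector's value with the
    median of its valid 8-neighbours, one of the enhancement steps the
    signal-processing circuitry may perform. `defects` is a set of
    (y, x) sites; defective neighbours are excluded from the median."""
    h, w = len(plane), len(plane[0])
    out = [row[:] for row in plane]
    for (y, x) in defects:
        neigh = [plane[j][i]
                 for j in range(max(0, y - 1), min(h, y + 2))
                 for i in range(max(0, x - 1), min(w, x + 2))
                 if (j, i) != (y, x) and (j, i) not in defects]
        if neigh:                     # leave the pixel if fully isolated
            out[y][x] = median(neigh)
    return out
```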
With regard to integration, it is desirable for each array to be physically located on the same chip. It is further desirable for a frame to be coupled to the chip and at least one lens to be coupled to the frame. The lenses can be positioned independently during manufacturing and then sealed to the frame using an encapsulant or other joining technique. This integration of elements is referred to as a "digital camera subsystem" (DCS).
Preferred image sensors comprise at least several hundred thousand photodetectors, with a total thickness, including the lenses and frame, of no more than 10, 15, or 20 mm. Such a small DCS device can be incorporated into a semiconductor "package" using wave soldering, chip-on-board, or other techniques, or attached directly to a circuit board ("packageless"). The DCS and/or board can then be incorporated into a camera or other device that has user interface elements, memory storing the images obtained from the arrays, and at least one power supply powering the system. A DCS, camera, or other device of the present invention can be used for any suitable purpose, including, among others, still and video imaging, calculating distances, and creating 3D effects.
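Distance calculation from two closely spaced camera channels is mentioned above as an application; a sketch using the standard pinhole-stereo relation Z = f * B / d follows. The formula is the conventional one, not taken from this document, and the parameter names are illustrative:

```python
def distance_from_disparity(focal_px, baseline_m, disparity_px):
    """Estimate object distance Z from the pixel disparity d observed
    between two camera channels separated by baseline B, with focal
    length f expressed in pixels: Z = f * B / d. Smaller disparities
    mean more distant objects; zero disparity is at infinity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For the short baselines of a compact multi-array device, usable depth resolution is limited to nearby objects, since disparity falls off with distance.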
In another aspect of the invention, a compact solid state camera (compact digital camera) comprises first and second camera channels disposed close to one another, each camera channel including its own optics, image sensor and signal processing circuitry. The output signals of the two camera channels (which may be identical or different) can be combined to form a composite image, or each camera channel can provide an independent image. Electronics for combining images from any combination of camera channels, or for displaying/storing/transmitting the channels individually or in combination, are included in this compact solid state camera (CSSC) assembly.
Other embodiments include three or more camera channels (identical or different), in which the signal processing circuitry processes the signal from each channel and then combines the signals from some or all of the channels to produce a composite image; alternatively, each camera channel can provide an independent image, alone or in combination with the composite image. The use of multiple camera channels allows each channel to be optimized in some respect (if desired), such as for imaging a particular color of incident light. Thus, for example, each array can be optimized to detect light of a different color or other wavelength. The "color" of each camera channel may be broadband or narrowband, such as red, green or blue. The bands may even overlap. Each camera channel may image one or more colors.
The optimization of each camera channel can be accomplished in any desired manner, including through the optics, image sensor and signal processing electronics, to obtain the desired imaging capability; for example, the optics can be optimized for a particular image sensor size, wavelength (color), focal length and f-number. The image sensor can be optimized by pixel count, pixel size, pixel design (photodetector and circuitry), frame rate, integration time and the peripheral circuitry outside the pixel circuitry. The signal processing circuitry can be optimized for color correction, image compression, defective pixel replacement and other imaging functions. The camera channels may be identical or unique; in all cases the camera channels are disposed close together.
If desired, color filters (or other color separation techniques) can be incorporated as a separate color filter layer in the optical material or on an optical surface, on the image sensor surface, or built by design into the pixel semiconductor. Each camera channel may have its own color imaging characteristics. An image sensor may have single-color or multi-color capability; the multi-color capability may exist within a single pixel and/or between neighboring pixels.
The processing circuitry may include any suitable mechanism and/or logic for optimizing image quality. Of particular interest is circuitry that produces independent images from the camera channels and then combines those independent images to form a composite single image. In this process, the signal processing circuitry advantageously performs image enhancement functions such as dynamic range management (automatic gain/level), image sharpening, intensity correction, hue correction, artifact removal, defective pixel correction and other imaging optimization functions. The processing circuitry can operate in analog or digital mode.
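One of the enhancement functions named above, defective pixel correction, can be sketched as follows. The patent does not fix a method; this is one common approach (replacing each flagged pixel with the median of its good neighbors), with all names invented for the example.

```python
import numpy as np

def correct_defective_pixels(img, defect_mask):
    """Replace pixels flagged in `defect_mask` with the median of the
    non-defective pixels in their 3x3 neighborhood, one simple form of
    defective-pixel correction."""
    out = img.astype(np.float64).copy()
    h, w = img.shape
    for y, x in zip(*np.nonzero(defect_mask)):
        y0, y1 = max(0, y - 1), min(h, y + 2)
        x0, x1 = max(0, x - 1), min(w, x + 2)
        patch = img[y0:y1, x0:x1].astype(np.float64)
        good = patch[~defect_mask[y0:y1, x0:x1]]
        if good.size:                       # leave pixel untouched if no good neighbor
            out[y, x] = np.median(good)
    return out.astype(img.dtype)
```

In a per-channel pipeline such a step would run independently in each camera channel, using that channel's own defect map.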
In terms of mechanical integration, it is desirable that the image sensors of the camera channels be physically located on the same integrated circuit (chip) to reduce manufacturing cost and to minimize electrical interconnect and size. It is further desirable that a mechanical frame be assembled onto the chip and that one or more lenses be coupled to the frame. The lenses can be positioned independently during manufacture and then sealed to the frame with an encapsulant or other joining technique. The integration of these elements is referred to as a "digital camera subsystem" (DCS). Other layers can be vertically integrated with the DCS (such as camera system electronics and even display capability) to form a compact solid state camera (CSSC).
A preferred camera channel comprises at least several hundred thousand photodetectors (pixels). Compared with a conventional camera system employing only a single optical assembly (at equal image resolution), the thickness of a camera channel (including the image sensor and optics) can be smaller. Such a small DCS device can be incorporated into a semiconductor "package" using wave soldering, chip-on-board or other techniques, or attached directly to a circuit board ("packageless"). The DCS and/or board can then be incorporated into a camera or other device having user interface elements, memory storing the images acquired from the arrays, and at least one power supply powering the system. The DCS, cameras and other devices of the present invention can be used for any suitable purpose, including still and video imaging.
It should be noted that, in some aspects, the digital camera subassembly includes two or more complete camera channels in a single-layer package containing all of the components (optics, mechanical structure and electronics) needed in a heterogeneous assembly or package.
In another embodiment, the digital camera subassembly has a multilayer laminated form.
In another embodiment, two or more of the camera channels include channel-specific optics, an optical alignment structure (mechanical frame), packaging, color filters and other optical elements, an image sensor, a mixed-signal interface, image and/or color processing logic, memory, control and timing logic, power management logic, and a parallel and/or serial device interface.
In another embodiment, each camera channel further includes one or more of the following: single-channel or multi-channel image compression logic and/or image output formatting logic, wired or wireless communication, and optical display capability.
In another embodiment, the output of each channel can provide a whole image, including a full-color or partial-color image, or a separately processed image.
In another embodiment, the camera channels are located close to one another on a common two-dimensional focal plane comprising one CSSC assembly layer, the closeness being constrained and limited by the number, type, position and optical diameter of the lens systems.
In another embodiment, each camera channel further includes an image sensor providing the photon sensing function, which forms part of an overall compact solid state camera using a semiconductor-based detection mechanism (no film). A single assembly can be formed from two or more assembly layers assembled in sequence in the vertical dimension (orthogonal to the focal plane).
In another embodiment, the assembly, comprising vertically integrated assembly layers and having multi-camera-channel capability, provides the camera system capability and performance achieved by a conventional camera system using a single camera channel.
In another embodiment, some or all of the vertically integrated assembly layers are formed by stacked assembly or wafer-scale integration methods, so that portions of many camera systems are produced simultaneously.
In another embodiment, the wafers or layers can include optical, mechanical and electrical components, electrical interconnects and other devices (such as displays).
In another embodiment, the electrical interconnects between assembly layers can be formed by photolithography and metallization, bump bonding or other methods. Organic or inorganic bonding methods can be used to join the assembly layers. The layered packaging process starts from a "host" wafer having the electronics for the whole camera and/or for each camera channel. Another wafer, or individual chips, are then aligned with and bonded to the host wafer. The transferred wafer or chips may have bumps for making the electrical interconnections, or the connections can be made after bonding and thinning. The support substrate of the second wafer or chips is removed, leaving only a few microns of material thickness, containing the transferred electronics, attached to the host wafer. Electrical connections between the bonded wafers, or between die and the host wafer or die, are then made (if needed) using standard integrated circuit techniques. This process can be repeated multiple times. The layers transferred in this manner can include electrical, mechanical or optical features/components. This process allows the formation of multiple layers of a heterogeneous assembly having the desired electrical, mechanical and optical capabilities in a compact solid state camera.
In another embodiment, a camera channel includes a linear or area array imager of any size, format, pixel count, pixel design or pixel pitch.
In another embodiment, the camera channels provide full-color, single-color, multi-color or monochrome (black and white) capability in any wavelength range from the ultraviolet (UV) to the infrared (IR). If desired, color filters can be on the image sensor and/or in the optical assembly layers. A camera channel can also exploit the semiconductor absorption characteristics within the pixels to provide color capability. For example, a pixel can provide one or more color capabilities through its optical absorption depth characteristics. The pixel color separation characteristics can also be combined with color filters in the optical path.
In another embodiment, high spatial image resolution can be achieved by using multiple camera channels that view the same field of view from slightly different viewpoints.
In another embodiment, two or more camera passages are observed identical visual fields, although because the spatial deviation between such camera passage and from different view.In some such embodiment, can be combined from two or more such camera channel image provides the image of high spatial resolution with generation.Adopting the parallax correction algorithm may be favourable to reduce and/or to eliminate the parallax effect.Interchangeable, can be combined so that the three-dimensional feature imaging to be provided from two or more camera channel image (having identical visual field but different visual angles).Thus, it may be favourable for example increasing and/or strengthen the parallax effect by " inversely " application parallax correction algorithm.The three-dimensional feature imaging can for example be used for fingerprint and/or retina characteristic imaging and/or analysis.Any parallax correction algorithm, no matter be at present known or later exploitation, use can combine with any embodiment here.Use that among the embodiment of front any can increase with parallax and/or parallax reduces to combine.
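A crude stand-in for the parallax correction the text leaves unspecified can be sketched as a global disparity search between two channel images. Real algorithms estimate disparity per pixel; this illustration assumes a single horizontal offset and invents all names.

```python
import numpy as np

def estimate_disparity(ref, other, max_shift=8):
    """Estimate one global horizontal disparity (in pixels) between two
    channel images of the same scene by minimizing the mean absolute
    difference over candidate shifts."""
    ref = ref.astype(np.float64)
    other = other.astype(np.float64)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(other, s, axis=1)
        # Score only the columns that did not wrap around.
        lo, hi = max(0, s), ref.shape[1] + min(0, s)
        cost = np.abs(ref[:, lo:hi] - shifted[:, lo:hi]).mean()
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

def align_channel(ref, other, max_shift=8):
    """Shift `other` so it registers with `ref` before combining."""
    return np.roll(other, estimate_disparity(ref, other, max_shift), axis=1)
```

Applying the estimated shift with the opposite sign would instead exaggerate the offset, in the spirit of running the correction "in reverse" to enhance parallax.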
In another embodiment, optical features can be added to the optical stack of one or more camera channels to provide additional imaging capability, such as single, dual or tunable color filters, wavefront modification for an increased depth of focus and autofocus depth, and polarizing filters for glare reduction. It should be noted that any optical feature, whether now known or later developed, can be incorporated into one or more camera channels to provide additional imaging capability.
In another embodiment, the optics portion may include one or more filters, for example color filters, to provide one or more wavelengths or one or more wavelength bands to the one or more associated sensor arrays. Such filters may, for example, be single, dual or tunable filters. In one embodiment, a tunable filter can be employed by the user, operator and/or manufacturer to control or determine the one or more wavelengths or wavelength bands.
In another embodiment, one or more filters are used in conjunction with one, some or all of the camera channels. Such filters may be the same as or different from one another. For example, the filters may provide different wavelengths or wavelength bands, or the same wavelength or wavelength band. In addition, some filters may be fixed while others are tunable.
In another embodiment, the optics portion includes, for example, a wavefront modification element to increase the depth of focus and/or to implement autofocus. In yet another embodiment, the optics portion may include one or more glare-reducing polarizing filters to polarize the light and thereby reduce "glare". Such filters can be used alone or in combination with any of the embodiments disclosed herein.
Any of the embodiments of the invention can include one or more illumination units to improve and/or enhance image acquisition by the one or more camera channels (in particular by the one or more sensor arrays), to assist in object range detection, object shape detection and converted imaging (i.e., imaging not observable by the human eye).
The illumination units can provide passive illumination (e.g., no illumination), active illumination (e.g., constant illumination), or active constant and/or gated illumination (e.g., processor-controlled predetermined pulsed illumination, user/operator-preset pulsed illumination and/or programmable pulsed illumination). The one or more illumination units can be disposed on, or integrated into, the substrate of the sensor arrays and/or the support frame. Indeed, the one or more illumination units can be disposed on, or integrated into, any element or component of the one or more camera channels.
In certain embodiments, an illumination unit is dedicated to one or more camera channels. In this case, operation of the illumination unit is "enabled" in concert with the one or more dedicated channels. In another embodiment, an illumination unit is shared by all of the camera channels; in this embodiment, operation of the illumination unit is enabled in concert with the camera channels. Indeed, in certain embodiments, one or more illumination units are dedicated to one or more camera channels while one or more other illumination units are shared by one or more camera channels (including those channels associated with the one or more dedicated illumination units). In this embodiment, operation of a dedicated illumination unit is "enabled" in concert with the one or more dedicated channels, and operation of a shared illumination unit is enabled in concert with all of the camera channels.
As noted above, the one or more camera channels can be optimized, modified and/or configured according to the predetermined, adaptively determined, anticipated and/or desired spectral response of those one or more camera channels. For example, the dimensions, characteristics, operation, response and/or parameters of the sensor arrays (and/or their pixels) and of the image processing circuitry can be configured, designed and/or tailored according to the predetermined, adaptively determined, anticipated and/or desired spectral response of the one or more camera channels. In this way, one or more aspects of the digital camera of the present invention can be configured, designed and/or tailored to provide a desired, suitable, predetermined and/or particular response in the environment in which the camera is to be employed.
In certain embodiments, each camera channel can be uniquely configured, designed and/or tailored. For example, one or more camera channels can be configured to have a field of view that differs from that of one or more other camera channels. In this way, one or more camera channels have a first field of view and one or more other camera channels have a second field of view, and the digital camera can capture images with different fields of view simultaneously.
The fields of view may be fixed or programmable (for example in situ). The field of view can be adjusted using a number of techniques or configurations, including adjusting or modifying the focal length of the optics and/or adjusting or modifying the effective size of the array. Indeed, any technique or configuration for adjusting the field of view, whether now known or later developed, is intended to fall within the scope of the present invention.
In addition, the digital camera of the present invention can include programmable (in situ or otherwise) or fixed integration times for one or more (or all) of the camera channels. In this way, the integration times of one or more camera channels can be configured, designed and/or tailored to facilitate capturing, for example, a large scene dynamic range. Thus, in this embodiment, single color-band camera channels can be used to produce a combined color imaging capability (including, for example, UV and IR if desired), with the integration time of each camera channel configured and/or designed to obtain the required signal collection within its wavelength band.
Moreover, two or more integration times can be implemented to capture illumination levels from low to high in the same image simultaneously. The combined dynamic range of multiple camera channels provides a larger dynamic range than that of a single camera channel (where one integration time applies to all channels). In this way, each camera channel image sensor or array can be configured and/or designed to operate with a particular (predetermined, default or programmable) range of integration times and illumination levels.
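The gain in dynamic range from two integration times can be sketched as follows. This is a simplified exposure-merge under stated assumptions (linear sensor response, a known saturation level, a single global scale per exposure); the patent does not prescribe a merge method, and all names are invented for the example.

```python
import numpy as np

def combine_exposures(short_img, long_img, t_short, t_long, saturation=250):
    """Merge two captures of the same scene taken with different
    integration times into one radiance-like map with extended dynamic
    range. Pixel values are normalized by integration time; pixels that
    clipped in the long exposure fall back to the short exposure."""
    short_rad = short_img.astype(np.float64) / t_short
    long_rad = long_img.astype(np.float64) / t_long
    use_short = long_img >= saturation      # long exposure clipped here
    return np.where(use_short, short_rad, long_rad)
```

Dark regions keep the low-noise long-exposure estimate, while bright regions that saturated the long exposure are recovered from the short one.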
It should be noted that the dimensions, characteristics, operation, response and/or parameters of the camera channels (for example the field of view, integration time, sensor array (and/or its pixels) and/or image processing circuitry) can be configured, designed and/or tailored (in situ or otherwise) according to a predetermined, adaptively determined, anticipated and/or desired response of the one or more camera channels. For example, the camera channels can be configured, designed and/or tailored to have different fields of view, each with the same or a different frame rate and/or integration time. In one embodiment, the digital camera of the present invention can thus include a first, wide field-of-view camera channel for acquiring an object and a second, narrow field-of-view camera channel for identifying the object. The resolutions of the first, wide field-of-view camera channel and the second, narrow field-of-view camera channel can also differ, for example to provide enhanced imaging or acquisition.
In addition, the sensor arrays and/or pixel size (pitch) can be configured, designed and/or tailored according to a predetermined, anticipated and/or desired response of the one or more camera channels. For example, the pixel size can be configured to optimize, enhance and/or obtain a particular response. In one embodiment, the pixel size of the associated sensor array can be selected to provide, enhance and/or optimize a particular response of the digital camera. Thus, if the sensor arrays comprise multiple camera channels (for example UV, B, R, G and IR), implementing different pixel sizes in the one or more sensor arrays (for example a pixel pitch increasing progressively from UV (smallest) to IR (largest)) can provide, enhance and/or optimize a particular response of the digital camera.
The pixel size can be based on a number of considerations, including providing a predetermined, adaptively determined, anticipated or desired resolution and/or obtaining predetermined, enhanced and/or suitable acquisition characteristics for a particular wavelength (or wavelength band); for example, reducing the pixel size (reducing the pitch) can enhance the acquisition of shorter-wavelength light. This may be advantageous when matched by a corresponding reduction in optical blur size. The pixel design and process sequence (a subset of the overall wafer process) can be selected and/or determined to optimize and/or enhance the photoresponse for the color of a particular camera channel. Moreover, the number of pixels on a sensor array can be adjusted, selected and/or determined to provide the same field of view even though the pixels of the several arrays differ in size.
In addition, the image processing circuitry (for example the image processing and color processing logic) can be configured, designed and/or tailored to provide a predetermined, adaptively determined, anticipated and/or desired response of the one or more camera channels. For example, the image processing and color processing logic can be configured to independently optimize, accelerate and/or reduce complexity by "matching" the optics, sensor and image processing as they are applied to each channel separately. Any final assembly of a full-color or partial-color image can in turn be greatly simplified, and its quality improved, by eliminating Bayer pattern interpolation.
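The point about eliminating Bayer interpolation can be made concrete: with one full-resolution sensor array per color band, a color image is formed by stacking the planes directly, with no demosaicing step, since every pixel site has a true sample for every band. A minimal sketch (names invented for the example):

```python
import numpy as np

def stack_full_resolution_planes(r_plane, g_plane, b_plane):
    """Assemble an RGB image from three full-resolution channel images.
    Unlike a Bayer mosaic, no color value is interpolated; each output
    sample was measured directly by one of the arrays."""
    assert r_plane.shape == g_plane.shape == b_plane.shape
    return np.dstack([r_plane, g_plane, b_plane])
```

In practice the planes would first be registered (parallax-corrected) before stacking.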
It should be noted that any digital camera channel (for example one with RGB capability or another color filter combination) can be combined with one or more full-color, dual-color, single-color or B/W camera channels. Combinations of camera channels can be used to provide increased wavelength range capability, simultaneous different fields of view, simultaneous different integration times, active and passive imaging capability, higher resolution using multiple camera channels with parallax correction, 3-D imaging (feature extraction) using the increased parallax of multiple camera channels, and increased color band capability.
In certain embodiments, camera channels of different colors share components, for example a data processor. Thus, in one embodiment, one camera channel can employ a sensor array that acquires data representing a first color image (for example blue) and a second color image (green). Other camera channels can employ sensor arrays dedicated to a particular/predetermined wavelength or wavelength band (for example red or green), or such camera channels can employ sensor arrays that acquire data representing two or more predetermined wavelengths or wavelength bands (for example (i) red and green, or (ii) cyan and green). The combination of these camera channels can provide full-color capability.
For example, in one embodiment, a first sensor array can acquire data representing first and second predetermined wavelengths or wavelength bands (for example the wavelengths associated with red and blue), and a second sensor array can acquire data representing a third predetermined wavelength or wavelength band (for example the wavelengths associated with green). In this embodiment, the combination of these camera channels can provide a full-color image with only two sensor arrays.
It should be noted that, in the exemplary embodiment above, it may be advantageous to employ a third sensor array to acquire IR. In this way a "true" YCrCb-output camera can be provided, while minimizing and/or eliminating the cost, complexity and/or power considerations of the conversion otherwise necessary in the digital image domain.
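For reference, the digital-domain conversion that a "true" YCrCb-output camera would avoid is a fixed linear transform of RGB. A minimal sketch using the common ITU-R BT.601 coefficients (the patent itself does not name a standard, so BT.601 is an assumption for illustration):

```python
def rgb_to_ycrcb(r, g, b):
    """Convert linear RGB samples to luma (Y) and chroma (Cr, Cb)
    using ITU-R BT.601 coefficients. This is the per-pixel arithmetic
    a sensor-level YCrCb camera would not need to perform."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.713 * (r - y)
    cb = 0.564 * (b - y)
    return y, cr, cb
```

For a neutral gray (r = g = b), both chroma terms vanish, which is one quick sanity check on the coefficients.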
If a sensor array acquires two or more predetermined wavelengths or wavelength bands, the pixels of that sensor array can be designed to collect photons at two or more depths or locations, within the pixels of the semiconductor array, associated with those two or more predetermined wavelengths or wavelength bands. The color "selection" of such a sensor array can thus be based on a pixel design that separates colors, and/or separates color bands, by optical absorption depth.
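The physical basis of absorption-depth color separation is that the absorption coefficient of silicon falls steeply with wavelength, so short-wavelength (blue) photons are absorbed near the surface while long-wavelength (red) photons penetrate deeper. A minimal Beer-Lambert sketch; the coefficient values below are rough, illustrative orders of magnitude only, not measured silicon data.

```python
import math

def absorbed_fraction(alpha_per_um, depth_um):
    """Beer-Lambert law: fraction of incident photons absorbed within
    the first `depth_um` microns of material, for absorption
    coefficient `alpha_per_um` (per micron)."""
    return 1.0 - math.exp(-alpha_per_um * depth_um)

# Assumed, order-of-magnitude coefficients for illustration only:
ALPHA_BLUE_APPROX = 2.0   # per micron
ALPHA_RED_APPROX = 0.3    # per micron
```

A junction placed within the first micron therefore responds mostly to blue, while a deeper junction in the same pixel captures the remaining, predominantly red, signal.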
Moreover, dual-color capability in one or more camera channels can be accomplished or provided using a color filter array disposed before the sensor array (for example in the optical assembly). It should be noted that, if desired, additional color band separation can be provided in the optical assembly layers.
It may be advantageous, in conjunction with a sensor array acquiring two or more predetermined or adaptively determined wavelengths or wavelength bands, to employ programmable (in situ or otherwise) or fixed integration techniques for one or more (or all) of the camera channels. Thus, the integration times of one or more camera channels can be configured, designed and/or tailored to facilitate capturing, for example, a plurality of predetermined wavelengths or wavelength bands, thereby enhancing, optimizing and/or providing an enhanced, adaptively designed, desired, determined and/or predetermined acquisition technique. It should be noted that any embodiment discussed herein concerning camera channel integration times can be combined with sensor arrays that acquire two or more predetermined wavelengths or wavelength bands. For the sake of brevity, that discussion is not repeated here.
The present invention can be implemented using three sensor arrays (each sensor array acquiring one or more predetermined wavelengths or wavelength bands, for example the wavelengths associated with red, blue and green). In this embodiment, the three sensor arrays can be arranged in a triangular configuration (for example a symmetric, asymmetric, isosceles, obtuse, acute and/or right triangle) to provide full-color (RGB) capability. A triangular arrangement provides symmetry of the parallax, thereby simplifying the algorithmic computation required to handle parallax. A triangular arrangement also provides a more compact, enhanced and/or optimized assembly and layout of the associated assembly layers for a three-image-sensor-array system/device.
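The parallax symmetry of an equilateral arrangement can be seen from the geometry alone: relative to the centroid, the three array centers sit at equal distances, 120 degrees apart, so the parallax shift of a distant object is equal in magnitude for all three channels and differs only in direction. A geometry-only sketch (the proportionality constant relating baseline to image shift, roughly focal length over object depth, is omitted; names are invented for the example):

```python
import math

def triangular_baselines(side):
    """Centers of three sensor arrays at the vertices of an equilateral
    triangle with the given side length, expressed relative to the
    centroid. Each array's parallax shift is proportional to its
    baseline vector from the centroid, so the three shifts share one
    magnitude and differ by 120-degree rotations."""
    r = side / math.sqrt(3.0)       # circumradius of an equilateral triangle
    angles = [90.0, 210.0, 330.0]   # degrees; absolute orientation is arbitrary
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for a in angles]
```

An algorithm can thus correct all three channels with one estimated magnitude and three fixed directions, which is the simplification the triangular layout affords.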
In the triangular configuration/layout embodiment, it may be advantageous to employ programmable (in situ or otherwise) or fixed integration techniques for one or more (or all) of the camera channels. Thus, the integration times of one or more camera channels can be configured, designed and/or tailored to facilitate capturing, for example, a plurality of predetermined wavelengths or wavelength bands, and to enhance, optimize and/or provide an enhanced, desired, adaptively designed, determined and/or predetermined acquisition technique. It should be noted that any embodiment discussed herein concerning camera channel integration times can be combined with the triangular configuration/layout. For the sake of brevity, that discussion is not repeated here.
As noted above, a digital camera according to the present invention can include two or more camera channels. In one embodiment, the digital camera includes a plurality of sensor arrays (for example more than 5 sensor arrays), each sensor array acquiring a narrow predetermined set of wavelengths or wavelength bands (for example the wavelengths associated with 4 to 10 color bands). In this way, the digital camera can simultaneously provide multispectral (for example 4-10 color bands) or hyperspectral (for example 10-100 color bands) imaging capability.
In another embodiment, the digital camera can employ black-and-white (B/W) sensor arrays that acquire a plurality of broadband black-and-white images. Combinations of B/W camera channels can be used to provide increased wavelength range capability, simultaneous different fields of view, simultaneous different integration times, active and passive imaging capability, higher resolution using multiple camera channels and parallax correction, and 3-D imaging (feature extraction) using the increased parallax of multiple camera channels. Indeed, a plurality of B/W camera channels can be combined with other camera channels to achieve full-color or partial-color capability. It should be noted that grayscale sensor arrays can be used in conjunction with, or in place of, the B/W sensor arrays described herein.
In another embodiment, the digital camera subsystem includes a display. The display can be disposed in a display layer and/or integrated on or in the sensor array substrate.
In yet another embodiment, the digital camera subsystem provides one or more interfaces for communicating with the digital camera subsystem.
In another embodiment, the digital camera subsystem includes the capability for wired, wireless and/or optical communication. In certain embodiments, the digital camera subsystem includes one or more circuits, or portions thereof, for such communication. The circuitry can be disposed in a layer dedicated to such communication and/or incorporated into one of the other layers (for example integrated in or on the sensor array substrate).
In one aspect of the invention, a "scene" is imaged onto a plurality of sensor arrays. The sensor arrays can be close together and can be processed on a single integrated circuit, or fabricated independently and fitted closely together. Each sensor array is disposed under an optical assembly or portion of an optical assembly. The optical assemblies can be obtained through sensor-subsystem wafer processing, applied onto the image wafer by separate wafer transfer, transferred individually by pick-and-place methods, or attached at the die level.
If color filters are employed, they can be built into the optical material, applied as a layer or coated deposit on the associated sensor array, applied in the optical assembly as a lens coating, or provided as separate color filters. If necessary, a color separation mechanism can also be provided over each imaging region by means of color filters or by in-pixel color separation mechanisms. Other optical features can be added to the optical system of each sensor array to provide additional imaging capability.
In certain embodiments, the design and electrical operation of each sensor array are optimized to sense the wavelengths of light incident on that sensor array. The use of multiple optical assemblies with individually optimized sensor arrays produces a compact camera that can have high resolution, high sensitivity and excellent color reproduction.
In one aspect, the present invention is a digital camera comprising a plurality of photodetector arrays, including: a first photodetector array to sample the intensity of light of, for example, a first wavelength (which may be associated with a first color); and a second photodetector array to sample the intensity of light of, for example, a second wavelength (which may be associated with a second color). The digital camera can include signal processing circuitry, coupled to the first and second photodetector arrays, to generate a composite image using (i) data representing the light intensity sampled by the first photodetector array and (ii) data representing the light intensity sampled by the second photodetector array. In this aspect of the present invention, the first photodetector array, the second photodetector array and the signal processing circuitry are integrated on or in the same semiconductor substrate.
The digital camera can also include a third photodetector array to sample the intensity of light of a third wavelength (which may be associated with a third color). In this embodiment, the signal processing circuitry is also coupled to the third photodetector array and generates the composite image using (i) data representing the light intensity sampled by the first photodetector array, (ii) data representing the light intensity sampled by the second photodetector array and (iii) data representing the light intensity sampled by the third photodetector array. The first, second and third photodetector arrays can be arranged relative to one another in a triangular configuration (for example an isosceles, obtuse, acute or right triangle configuration).
In certain embodiments, the first photodetector array may sample the intensity of the first-wavelength light for a first integration time, and the second photodetector array may sample the intensity of the second-wavelength light for a second integration time. If the digital camera includes a third photodetector array, the third photodetector array may sample the intensity of the third-wavelength light for the first integration time, the second integration time, or a third integration time.
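The effect of giving each array its own integration time can be sketched numerically. The following is an illustrative model only; the photon-flux values, integration times, and full-well capacity are hypothetical and are not taken from this disclosure:

```python
# Hypothetical sketch of per-array integration (exposure) times. Each
# photodetector array accumulates signal for its own integration time,
# so a dim channel can integrate longer than a bright one.

def integrate(photon_flux, integration_time, full_well=10_000):
    """Accumulated signal for one array, clipped at the full-well capacity."""
    return min(photon_flux * integration_time, full_well)

# Three arrays, each sampling one wavelength with its own integration time:
signals = {
    "first":  integrate(photon_flux=900, integration_time=5),
    "second": integrate(photon_flux=600, integration_time=8),
    "third":  integrate(photon_flux=300, integration_time=12),
}
```

Because each channel is clipped independently, a bright channel saturates at its full-well limit without forcing the dim channels to under-integrate.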
The digital camera may include a first array wherein each photodetector of the first array includes a semiconductor portion in which the light intensity is sampled. Likewise, each photodetector of the second array includes a semiconductor portion in which the light intensity is sampled. In certain embodiments, the semiconductor portion of each photodetector of the first array and the semiconductor portion of each photodetector of the second array are located at different depths relative to the surface of each photodetector.
The digital camera may also include a first lens disposed in and associated with the optical path of the first photodetector array, and a second lens disposed in and associated with the optical path of the second photodetector array. A substantially uniform color filter sheet may be disposed in the optical path of the first detector array. In addition, a first colored lens may be disposed in and associated with the optical path of the first detector array.
Notably, the digital camera may also include a first lens disposed in and associated with the optical path of the first photodetector array (which passes the first-wavelength light and filters the second-wavelength light), wherein the first photodetector array samples the intensity of the first-wavelength light and the second photodetector array samples the intensity of the second-wavelength light.
The digital camera may include a first photodetector array to sample the intensities of the first-wavelength light and the second-wavelength light, and a second photodetector array to sample the intensity of the third-wavelength light, wherein the first wavelength is associated with a first color, the second wavelength is associated with a second color, and the third wavelength is associated with a third color. Each photodetector of the first array may include a first semiconductor portion, in which the intensity of the first-wavelength light is sampled, and a second semiconductor portion, in which the intensity of the second-wavelength light is sampled; and each photodetector of the second array may include a semiconductor portion in which the intensity of the third-wavelength light is sampled; and wherein the first and second semiconductor portions of each photodetector of the first array and the semiconductor portion of each photodetector of the second array are located at different depths relative to one another and relative to the surface of each photodetector.
In this embodiment, the digital camera may also include a first lens disposed in and associated with the optical path of the first photodetector array and a second lens disposed in and associated with the optical path of the second photodetector array, wherein the first lens passes the first- and second-wavelength light and filters the third-wavelength light. Indeed, the digital camera may include a filter disposed in and associated with the optical path of the first photodetector array, wherein the filter passes the first- and second-wavelength light and filters the third-wavelength light. Moreover, the first photodetector array may sample the intensity of the first-wavelength light for a first integration time and the intensity of the second-wavelength light for a second integration time; and the second photodetector array may sample the intensity of the third-wavelength light for a third integration time.
The signal processing circuitry of the digital camera may generate a first image using data representative of the light intensity sampled by the first photodetector array and a second image using data representative of the light intensity sampled by the second photodetector array. Thereafter, the signal processing circuitry may generate the composite image using the first image and the second image.
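The two-stage processing described above (per-array images first, then a composite) can be sketched as follows. The normalization constant, pixel values, and list-based image representation are illustrative assumptions, and registration between the separate arrays is assumed to have been handled already:

```python
# Hypothetical sketch of the two-stage pipeline: one image per array,
# then a composite. Each "image" is a row-major list of normalized
# pixel intensities for one wavelength.

def make_image(samples, full_scale=255.0):
    """First stage: one normalized image from one array's raw samples."""
    return [s / full_scale for s in samples]

def make_composite(first_image, second_image, third_image):
    """Second stage: merge the per-wavelength images into per-pixel tuples."""
    if not (len(first_image) == len(second_image) == len(third_image)):
        raise ValueError("channel images must have the same pixel count")
    return list(zip(first_image, second_image, third_image))

# Two-pixel example: pixel 0 is bright in channel 1, pixel 1 in channel 2.
first = make_image([255, 0])
second = make_image([0, 255])
third = make_image([0, 0])
composite = make_composite(first, second, third)
```

In a real device the per-array images have a small parallax offset, so the second stage would also shift or warp the channels into alignment before merging.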
The digital camera may also include memory to store (i) data representative of the light intensity sampled by the first photodetector array and (ii) data representative of the light intensity sampled by the second photodetector array. The memory, the first photodetector array, the second photodetector array, and the signal processing circuitry may be integrated on or in the same semiconductor substrate.
In addition, timing and control logic may be included to provide timing and control information to the signal processing circuitry, the first photodetector array, and/or the second photodetector array. Communication circuitry (wired, wireless, and/or optical communication circuitry) may also be included to output data representative of the composite image. The communication circuitry, memory, first photodetector array, second photodetector array, and signal processing circuitry may be integrated on or in the same semiconductor substrate.
In any of the embodiments described above, the first photodetector array may include a first surface area and the second photodetector array a second surface area, wherein the first surface area is different from the second surface area. Moreover, the photodetectors of the first array may include a first active surface area and the photodetectors of the second array a second active surface area, wherein the first active surface area is different from the second active surface area.
In addition, in any of the embodiments described above, the first photodetector array may include a first surface area and the second photodetector array a second surface area, wherein the first surface area is substantially the same as the second surface area. The photodetectors of the first array may include a first active surface area and the photodetectors of the second array a second active surface area, wherein the first active surface area is different from the second active surface area.
In another aspect, the present invention is a digital camera comprising a plurality of photodetector arrays, including: a first photodetector array to sample the intensity of light of a first wavelength (which may be associated with a first color); and a second photodetector array to sample the intensity of light of a second wavelength (which may be associated with a second color). The digital camera may also include: a first lens disposed in the optical path of the first photodetector array (which may deliver the first-wavelength light onto the image plane of the photodetectors of the first array and may filter/attenuate the second-wavelength light), wherein the first lens includes a predetermined optical response to the first-wavelength light; and a second lens disposed in the optical path of the second photodetector array (which may deliver the second-wavelength light onto the image plane of the photodetectors of the second array and may filter/attenuate the first-wavelength light), wherein the second lens includes a predetermined optical response to the second-wavelength light. In addition, the digital camera may include signal processing circuitry, coupled to the first and second photodetector arrays, to generate a composite image using (i) data representative of the light intensity sampled by the first photodetector array and (ii) data representative of the light intensity sampled by the second photodetector array; wherein the first photodetector array, the second photodetector array, and the signal processing circuitry are integrated on or in the same semiconductor substrate.
The digital camera may also include a third photodetector array to sample the intensity of light of a third wavelength (which may be associated with a third color), and a third lens disposed in the optical path of the third photodetector array, wherein the third lens includes a predetermined optical response to the third-wavelength light. As such, the signal processing circuitry is coupled to the third photodetector array and generates the composite image using (i) data representative of the light intensity sampled by the first photodetector array, (ii) data representative of the light intensity sampled by the second photodetector array, and (iii) data representative of the light intensity sampled by the third photodetector array. The first, second, and third photodetector arrays may be arranged relative to one another in a triangular configuration (for example, an isosceles, obtuse, acute, or right-triangle configuration).
In one embodiment, the first lens filters the second- and third-wavelength light, the second lens filters the first- and third-wavelength light, and the third lens filters the first- and second-wavelength light.
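The per-channel filtering just described can be sketched as a simple transmission model. The transmission values below are made-up ideals for illustration, not measured responses of any lens in this disclosure:

```python
# Illustrative sketch of the three-channel filtering: each channel's lens
# passes its own wavelength band and strongly attenuates the other two.
TRANSMISSION = {
    "first_lens":  {"w1": 0.95, "w2": 0.02, "w3": 0.02},
    "second_lens": {"w1": 0.02, "w2": 0.95, "w3": 0.02},
    "third_lens":  {"w1": 0.02, "w2": 0.02, "w3": 0.95},
}

def light_reaching_array(lens, incident):
    """Per-wavelength light reaching the array behind the given lens."""
    return {w: incident[w] * TRANSMISSION[lens][w] for w in incident}

# Equal-energy scene: the array behind the first lens sees almost only w1.
scene = {"w1": 100.0, "w2": 100.0, "w3": 100.0}
at_first_array = light_reaching_array("first_lens", scene)
```

Because each array receives essentially one band, no per-pixel mosaic filter (and hence no demosaicing step) is needed for that channel.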
In one embodiment, the first photodetector array samples the intensity of the first-wavelength light for a first integration time, and the second photodetector array samples the intensity of the second-wavelength light for a second integration time. If the digital camera includes a third photodetector array, the third photodetector array samples the intensity of the third-wavelength light for a third integration time.
The digital camera may also include a housing, wherein the first and second lenses, the first and second photodetector arrays, and the signal processing circuitry are attached to the housing, and wherein the first and second lenses may be positioned independently with respect to the associated photodetector arrays.
In certain embodiments, the first photodetector array samples the intensity of light of the first wavelength (which is associated with the first color) and the intensity of light of the third wavelength (which is associated with the third color), and the second photodetector array samples the intensity of light of the second wavelength (which is associated with the second color). Here, each photodetector of the first array may include: a first semiconductor portion, in which the intensity of the first-wavelength light is sampled; and a second semiconductor portion, in which the intensity of the third-wavelength light is sampled. In addition, each photodetector of the second array may include a semiconductor portion in which the intensity of the second-wavelength light is sampled. In this embodiment, the first and second semiconductor portions of each photodetector of the first array and the semiconductor portion of each photodetector of the second array are located at different depths relative to one another and relative to the surface of each photodetector.
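A brief illustration of why semiconductor portions at different depths can separate wavelengths: silicon absorbs short wavelengths near the surface and long wavelengths deeper. The absorption lengths below are rough textbook values for crystalline silicon, not figures from this disclosure, and the junction depths are hypothetical:

```python
import math

# Rough illustrative absorption lengths in silicon, in microns.
ABSORPTION_LENGTH_UM = {"blue_450nm": 0.4, "green_550nm": 1.5, "red_650nm": 3.3}

def fraction_absorbed(absorption_length_um, depth_from_um, depth_to_um):
    """Fraction of incident photons absorbed between two depths (Beer-Lambert)."""
    a = absorption_length_um
    return math.exp(-depth_from_um / a) - math.exp(-depth_to_um / a)

# A shallow portion (0-0.5 um) captures most of the blue light, while very
# little blue light survives to a deep portion (1.5-4 um).
shallow_blue = fraction_absorbed(ABSORPTION_LENGTH_UM["blue_450nm"], 0.0, 0.5)
deep_blue = fraction_absorbed(ABSORPTION_LENGTH_UM["blue_450nm"], 1.5, 4.0)
```

The same shallow portion absorbs far less red light, which is what lets stacked portions at different depths sample different wavelengths within one photodetector.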
In addition, in one or more of these embodiments, the first lens may pass the first- and third-wavelength light and filter the second-wavelength light. Additionally or alternatively, a filter may be disposed in and associated with the optical path of the first photodetector array, wherein the filter passes the first- and third-wavelength light and filters the second-wavelength light.
Moreover, the first photodetector array may sample the intensity of the first-wavelength light for a first integration time and the intensity of the third-wavelength light for a third integration time, and the second photodetector array may sample the intensity of the second-wavelength light for a second integration time.
The signal processing circuitry of the digital camera may generate a first image using data representative of the light intensity sampled by the first photodetector array and a second image using data representative of the light intensity sampled by the second photodetector array. Thereafter, the signal processing circuitry may generate the composite image using the first image and the second image.
The digital camera may also include memory to store (i) data representative of the light intensity sampled by the first photodetector array and (ii) data representative of the light intensity sampled by the second photodetector array. The memory, the first photodetector array, the second photodetector array, and the signal processing circuitry may be integrated on or in the same semiconductor substrate.
In addition, timing and control logic may be included to provide timing and control information to the signal processing circuitry, the first photodetector array, and/or the second photodetector array. Communication circuitry (wired, wireless, and/or optical communication circuitry) may also be included to output data representative of the composite image. The communication circuitry, memory, first photodetector array, second photodetector array, and signal processing circuitry may be integrated on or in the same semiconductor substrate.
The signal processing circuitry may include first signal processing circuitry and second signal processing circuitry, wherein the first signal processing circuitry is coupled to and associated with the first photodetector array and the second signal processing circuitry is coupled to and associated with the second photodetector array. In addition, the signal processing circuitry may include first analog signal logic and second analog signal logic, wherein the first analog signal logic is coupled to and associated with the first photodetector array and the second analog signal logic is coupled to and associated with the second photodetector array. Moreover, the signal processing circuitry may include first black-level logic and second black-level logic, wherein the first black-level logic is coupled to and associated with the first photodetector array and the second black-level logic is coupled to and associated with the second photodetector array. Notably, the signal processing circuitry may include first exposure control circuitry and second exposure control circuitry, wherein the first exposure control circuitry is coupled to and associated with the first photodetector array and the second exposure control circuitry is coupled to and associated with the second photodetector array.
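The per-array black-level and exposure-control logic described above can be sketched as two independent processing chains. The function name, black-level values, and integration times below are illustrative assumptions, not elements of this disclosure:

```python
# Hypothetical per-channel processing sketch: each photodetector array gets
# its own black-level correction and its own exposure normalization,
# mirroring the per-array logic units described above.

def process_channel(raw_samples, black_level, integration_time):
    """Subtract the channel's black level, then normalize by its exposure."""
    corrected = [max(s - black_level, 0) for s in raw_samples]
    return [s / integration_time for s in corrected]

# Two arrays, each with an independent black level and integration time:
first = process_channel([120, 500, 1020], black_level=20, integration_time=5)
second = process_channel([80, 260, 560], black_level=60, integration_time=2)
```

Keeping the chains separate lets each array's dark offset and exposure be tuned to its own wavelength band before the channels are merged into a composite.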
The digital camera may include a frame, wherein the first and second photodetector arrays, the signal processing circuitry, and the first and second lenses are fixed to the frame.
In any of the embodiments described above, the first photodetector array may include a first surface area and the second photodetector array a second surface area, wherein the first surface area is different from the second surface area. Moreover, the photodetectors of the first array may include a first active surface area and the photodetectors of the second array a second active surface area, wherein the first active surface area is different from the second active surface area.
In addition, in any of the embodiments described above, the first photodetector array may include a first surface area and the second photodetector array a second surface area, wherein the first surface area is substantially the same as the second surface area. The photodetectors of the first array may include a first active surface area and the photodetectors of the second array a second active surface area, wherein the first active surface area is different from the second active surface area.
Again, there are many inventions described and illustrated herein. This Summary of the Invention is not exhaustive of the scope of the present invention. Moreover, this Summary of the Invention is not intended to limit the invention and should not be interpreted in that manner. Accordingly, while certain embodiments have been described and/or outlined in this Summary of the Invention, it should be understood that the present invention is not limited to those embodiments, descriptions, and/or outlines. Indeed, many other and different, and/or similar, embodiments of those presented in this Summary of the Invention will become apparent from the description, illustrations, and/or claims that follow.
Moreover, although various features, attributes, and advantages are described in this Summary of the Invention and/or are apparent in light thereof, it should be understood that such features, attributes, and advantages are not required, whether in one, some, or all of the embodiments of the present invention, and, unless specifically stated, need not be present in any aspect and/or embodiment of the present invention.
Various objects, features, and/or advantages of one or more aspects and/or embodiments of the present invention will be apparent from the following description and the accompanying drawings, in which like reference numerals denote like parts. It should be understood, however, that any such objects, features, and/or advantages are optional and, unless specifically stated, need not be present in any aspect and/or embodiment of the present invention.
It should be understood that aspects and embodiments of the present invention that do not appear in the claims below are reserved for presentation in one or more divisional/continuation patent applications.
Description of the Drawings
The following detailed description refers to the accompanying drawings. These drawings illustrate different aspects and embodiments of the present invention and, where appropriate, reference numerals illustrating like structures, components, materials, and/or elements in different figures are labeled similarly. It should be understood that various combinations of the structures, components, materials, and/or elements, other than those specifically shown, are contemplated and fall within the scope of the present invention.
Figure 1A illustrates a prior art digital camera and its major components;
Figures 1B-1D are schematic diagrams of the image capture element of the prior art digital camera of Figure 1A;
Figure 1E illustrates the operation of the lens assembly of the prior art camera of Figure 1A in a retracted mode;
Figure 1F illustrates the operation of the lens assembly of the prior art camera of Figure 1A in an optical zoom mode;
Figure 2 illustrates a digital camera and its major components, including a digital camera subsystem (DCS), according to an embodiment of an aspect of the present invention;
Figures 3A-3B are schematic diagrams of a digital camera subsystem (DCS);
Figure 4 illustrates a digital camera subsystem having a three-array/lens configuration;
Figures 5A-5C are schematic diagrams of image capture using the digital camera subsystem (DCS) of Figures 2-3;
Figure 6A illustrates an alternative digital camera subsystem (DCS) having four arrays;
Figure 6B is a flowchart for the alternative digital camera subsystem (DCS) of Figure 6A;
Figures 7A-7C are schematic diagrams of the four lens assemblies employed in the DCS of Figure 3;
Figure 8 is a schematic diagram of a digital camera apparatus according to another embodiment of an aspect of the present invention;
Figure 9A is a cross-sectional view of optics that may be employed in a digital camera apparatus according to an embodiment of the present invention;
Figures 9B-9D are cross-sectional views of optics that may be employed in a digital camera apparatus according to other embodiments of the present invention;
Figures 10A-10H are schematic diagrams of optics portions that may be employed in a digital camera apparatus according to other embodiments of the present invention;
Figures 11A-11B are a schematic diagram and a side view, respectively, of lenses suitable for transmitting red light or a red band of light, for use in an optics portion of, for example, a red camera channel, according to another embodiment of the present invention;
Figures 12A-12B are a schematic diagram and a side view, respectively, of lenses suitable for transmitting green light or a green band of light, for use in an optics portion of, for example, a green camera channel, according to another embodiment of the present invention;
Figures 13A-13B are a schematic diagram and a side view, respectively, of lenses suitable for transmitting blue light or a blue band of light, for use in an optics portion of, for example, a blue camera channel, according to another embodiment of the present invention;
Figure 14 is a schematic diagram of lenses suitable for transmitting red light or a red band of light, for use in an optics portion of, for example, a red camera channel, according to another embodiment of the present invention;
Figures 15A-15F are schematic diagrams of lenses that may be employed in a digital camera apparatus according to other embodiments of the present invention;
Figure 16 A is according to the sensor array that can use in digital camera devices of other embodiments of the invention and the schematic diagram of connected circuit;
Figure 16 B is the schematic diagram of pixel of the sensor array of Figure 16 A;
Figure 16 C is the schematic diagram according to the circuit that can use in the pixel of Figure 16 B of one embodiment of the present of invention;
Figure 16 D-16E illustrates the parameter that can be used for sensor array according to other embodiments of the invention;
Figure 17 A is the schematic diagram according to the part of the sensor array of another embodiment of the present invention;
Figure 17 B-17K is the cross-sectional view according to each embodiment of one or more pixels of other embodiments of the invention; Such pixel embodiment can here describe and/or illustrated any embodiment in implement;
Figure 17 F is the schematic diagram according to the sensor array of other embodiments of the invention;
Figure 18 A-18B illustrates the image of catching according to the part by sensor array of one embodiment of the present of invention;
Figure 19 A-19B illustrates the image of catching according to the part by sensor array of another embodiment of the present invention;
Figure 20 A-20B is the schematic diagram according to the relative positioning that provides for optics part and respective sensor array of other embodiments of the invention;
Figure 21 is the schematic diagram according to the relative positioning that can provide for 4 optics part and 4 sensor arraies of one embodiment of the present of invention;
Figure 22 A-22B is respectively according to the plane graph of the image device of one embodiment of the present of invention and cross-sectional view;
Figure 23 A-23B is respectively according to the plane graph of the image device of another embodiment of the present invention and cross-sectional view;
Figure 24 A-24B is respectively according to the plane graph of the image device of another embodiment of the present invention and cross-sectional view;
Figure 25 A-25B is respectively according to the plane graph of the image device of another embodiment of the present invention and cross-sectional view;
Figure 26 A-26B is respectively according to the plane graph of the image device of another embodiment of the present invention and cross-sectional view;
Figure 27 A-27B is respectively according to the plane graph of the image device of another embodiment of the present invention and cross-sectional view;
Figure 28 A is the strutting piece and the perspective view that can be installed in optics part wherein according to one embodiment of the present of invention;
Figure 28 B is the plane graph of amplification of the strutting piece of Figure 28 A;
Figure 28 C is the cross-sectional view of amplification of the strutting piece of Figure 28 A of looking along the A-A direction of Figure 28 B;
Figure 28 D is the exploded cross section views of amplification of a part of the strutting piece of Figure 28 A of looking along the A-A direction of Figure 28 B; And can be installed in wherein lens;
Figure 29 A is the strutting piece and the cross-sectional view that can be installed in optics part wherein according to an alternative embodiment of the invention;
Figure 29 B is the strutting piece and the cross-sectional view that can be installed in optics part wherein according to an alternative embodiment of the invention;
Figure 30 A is the strutting piece and the cross-sectional view that can be installed in optics part wherein according to an alternative embodiment of the invention;
Figure 30 B is the plane graph of the strutting piece of Figure 30 A;
Figure 30 C is the cross-sectional view of the strutting piece of Figure 30 A of looking along the A-A direction of Figure 30 B;
Figure 30 D is the cross-sectional view of the strutting piece of Figure 30 A of looking along the A-A direction of Figure 30 B; And can be installed in wherein lens;
Figure 31 A is the strutting piece and the perspective view that can be installed in optics part wherein according to an alternative embodiment of the invention;
Figure 31 B is the plane graph of the strutting piece of Figure 31 A;
Figure 31 C is the cross-sectional view of the strutting piece of Figure 31 A of looking along the A-A direction of Figure 31 B;
Figure 31 D is the cross-sectional view of the strutting piece of Figure 31 A of looking along the A-A direction of Figure 31 B; And can be installed in wherein lens;
Figure 32 is a cross-sectional view of a digital camera apparatus according to an embodiment of the present invention and of a printed circuit board on which the digital camera apparatus may be mounted;
Figures 33A-33F illustrate an embodiment for assembling and mounting the digital camera apparatus of Figure 32;
Figure 33G is a perspective view of a digital camera apparatus according to another embodiment of the present invention;
Figures 33H-33K are front views of mounting and electrical connector configurations that may be used in conjunction with a digital camera apparatus according to other embodiments of the present invention;
Figure 34 is a cross-sectional view of a support that may be used to support, at least in part, the optics portions of Figures 11A-11B and 13A-13B according to another embodiment of the present invention;
Figures 35A-35C illustrate an embodiment for assembling the three lenslets of the optics portions into the support.
Figure 36 is a cross-sectional view of a digital camera apparatus, including the support of Figure 34 and the optics portions of Figures 11A-11B and 13A-13B, according to an embodiment of the present invention, and of a printed circuit board on which the digital camera apparatus may be mounted;
Figure 37 is a cross-sectional view of another support that may be used to support, at least in part, the optics portions of Figures 11A-11B and 13A-13B according to another embodiment of the present invention;
Figure 38 is a cross-sectional view of another support that may be used to support, at least in part, the optics portions of Figures 11A-11B and 13A-13B according to another embodiment of the present invention;
Figure 39 is a cross-sectional view of a digital camera apparatus, including the support of Figure 37 and the optics portions of Figures 11A-11B and 13A-13B, according to an embodiment of the present invention, and of a printed circuit board on which the digital camera apparatus may be mounted;
Figure 40 is a cross-sectional view of a digital camera apparatus, including the support of Figure 38 and the optics portions of Figures 11A-11B and 13A-13B, according to an embodiment of the present invention, and of a printed circuit board on which the digital camera apparatus may be mounted;
Figures 41A-41D are cross-sectional views of mounting configurations that may be used to support, at least in part, the lenses of Figures 15A-15D, respectively, in a digital camera apparatus according to other embodiments of the present invention;
Figures 42-44 are cross-sectional views of supports that employ the mounting configurations of Figures 41B-41D, respectively, and that may be used to support, at least in part, the lenses illustrated in Figures 15B-15D, respectively, according to other embodiments of the present invention;
Figure 45 is a cross-sectional view of a digital camera apparatus including the support of Figure 42 according to an embodiment of the present invention and of a printed circuit board on which the digital camera apparatus may be mounted;
Figure 46 is a cross-sectional view of a digital camera apparatus including the support of Figure 43 according to an embodiment of the present invention and of a printed circuit board on which the digital camera apparatus may be mounted;
Figure 47 is a cross-sectional view of a digital camera apparatus including the support of Figure 44 according to an embodiment of the present invention and of a printed circuit board on which the digital camera apparatus may be mounted;
Figure 48 is a schematic diagram of a digital camera apparatus according to another embodiment of the present invention;
Figure 49 is a cross-sectional view of a printed circuit board of a digital camera according to another embodiment of the present invention and of a digital camera apparatus that may be mounted on the printed circuit board;
Figures 50A-50F illustrate an embodiment for assembling and mounting the digital camera apparatus of Figure 49;
Figure 51 is a schematic diagram of a digital camera apparatus including a spacer according to another embodiment of the present invention;
Figure 52 is a schematic diagram of a digital camera apparatus including a spacer according to another embodiment of the present invention;
Figure 53 is a cross-sectional view of a printed circuit board of a digital camera according to another embodiment of the present invention and of a digital camera apparatus that may be mounted on the printed circuit board;
Figures 54A-54F illustrate such an embodiment for assembling and mounting the digital camera apparatus of Figure 53;
Figure 55 is a schematic diagram of a digital camera apparatus including a second device and a spacer according to another embodiment of the present invention;
Figure 56 is a cross-sectional view of a printed circuit board of a digital camera according to another embodiment of the present invention and of a digital camera apparatus that may be mounted on the printed circuit board;
Figures 57A-57F illustrate such an embodiment for assembling and mounting the digital camera apparatus of Figure 56;
Figures 58-62 are cross-sectional views of printed circuit boards of digital cameras according to other embodiments of the present invention and of digital camera apparatus that may be mounted on the printed circuit boards;
Figures 63-67 are cross-sectional views of printed circuit boards of digital cameras according to other embodiments of the present invention and of digital camera apparatus that may be mounted on the printed circuit boards;
Figures 68-72 are cross-sectional views of printed circuit boards of digital cameras according to other embodiments of the present invention and of digital camera apparatus that may be mounted on the printed circuit boards;
Figures 73A-73B are, respectively, a front view and a cross-sectional view of a support, in accordance with another embodiment of the present invention;
Figure 74 is a cross-sectional view of a support, in accordance with another embodiment of the present invention;
Figure 75 is a plan view of a support, in accordance with another embodiment of the present invention;
Figure 76A is a schematic view of a digital camera device including one or more output devices, in accordance with another embodiment of the present invention;
Figures 76B-76C are, respectively, a front view and a rear view of a display device that may be employed in the digital camera device of Figure 76A, in accordance with one embodiment of the present invention;
Figures 76D-76F are schematic views of digital camera devices including one or more output devices, in accordance with other embodiments of the present invention;
Figure 77A is a schematic view of a digital camera device including one or more input devices, in accordance with another embodiment of the present invention;
Figures 77B-77C are, respectively, an enlarged front perspective view and an enlarged rear view of an input device that may be employed in the digital camera device of Figure 77A, in accordance with one embodiment of the present invention;
Figures 77D-77L are schematic views of digital camera devices including one or more input devices, in accordance with other embodiments of the present invention;
Figures 77M-77N are, respectively, a plan view and a cross-sectional view of a support, in accordance with another embodiment of the present invention;
Figures 77O-77P are cross-sectional views of printed circuit boards of digital cameras and of digital camera devices employing the support of Figures 77M-77N and mountable on those printed circuit boards, in accordance with other embodiments of the present invention;
Figure 78A is a schematic view of a digital camera device including one or more illumination devices, in accordance with another embodiment of the present invention;
Figures 78B-78C are, respectively, an enlarged front perspective view and an enlarged rear view of an illumination device that may be employed in the digital camera device of Figure 78A, in accordance with one embodiment of the present invention;
Figures 78D-78L are perspective views of digital camera devices including one or more illumination devices, in accordance with other embodiments of the present invention;
Figures 78M-78N are schematic views of digital camera devices including one or more illumination devices, in accordance with other embodiments of the present invention;
Figures 79A-79C are perspective views of digital camera devices including one or more input devices and one or more output devices, in accordance with other embodiments of the present invention;
Figures 80A-80F are perspective views of digital camera devices including one or more input devices, one or more display devices, and one or more illumination devices, in accordance with other embodiments of the present invention;
Figure 81A is a schematic view of a digital camera device including a molded plastic housing, in accordance with one embodiment of the present invention;
Figures 81B-81C are exploded perspective views of the digital camera device of Figure 81A;
Figure 82 is an enlarged front perspective view of a digital camera device, in accordance with another embodiment of the present invention;
Figures 83A-83C are front views of sensor array and processor configurations, in accordance with other embodiments of the present invention;
Figures 84A-84E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 85A-85E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 86A-86E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 87A-87B are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 88A-88E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 89A-89E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 90A-90B, 91A-91B, 92A-92B, 93A-93B, 94A-94B, 95A-95B, and 96A-96B are, respectively, plan views and cross-sectional views of several embodiments of imaging devices;
Figures 90A-90B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with one embodiment of the present invention;
Figures 91A-91B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with another embodiment of the present invention;
Figures 92A-92B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with another embodiment of the present invention;
Figures 93A-93B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with another embodiment of the present invention;
Figures 94A-94B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with another embodiment of the present invention;
Figures 95A-95B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with another embodiment of the present invention;
Figures 96A-96B are, respectively, a plan view and a cross-sectional view of an imaging device, in accordance with another embodiment of the present invention;
Figure 97A is a plan view of a support and of optics portions mountable therein, in accordance with one embodiment of the present invention;
Figure 97B is an enlarged cross-sectional view of the support of Figure 97A, taken along direction A-A of Figure 97A;
Figure 97C is an exploded cross-sectional view of a portion of the support of Figure 97A and of a lens mountable therein;
Figures 99A-99D are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 100A-100D are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figure 101A is a front perspective view of an imaging device, in accordance with another embodiment of the present invention;
Figure 101B is a schematic view of a sensor array, and circuitry connected thereto, that may be employed in the imaging device of Figure 101A, in accordance with one embodiment of the present invention;
Figure 101C is a schematic view of a pixel of the sensor array of Figure 101B;
Figure 101D is a schematic view of a sensor array, and circuitry connected thereto, that may be employed in the imaging device of Figure 101A, in accordance with one embodiment of the present invention;
Figure 101E is a schematic view of a pixel of the sensor array of Figure 101D;
Figure 101F is a schematic view of a sensor array, and circuitry connected thereto, that may be employed in the imaging device of Figure 101A, in accordance with one embodiment of the present invention;
Figure 101G is a schematic view of a pixel of the sensor array of Figure 101F;
Figure 102A is a front perspective view of an imaging device, in accordance with another embodiment of the present invention;
Figure 102B is a schematic view of a sensor array, and circuitry connected thereto, that may be employed in the imaging device of Figure 102A, in accordance with one embodiment of the present invention;
Figure 102C is a schematic view of a pixel of the sensor array of Figure 102B;
Figure 102D is a schematic view of a sensor array, and circuitry connected thereto, that may be employed in the imaging device of Figure 102A, in accordance with one embodiment of the present invention;
Figure 102E is a schematic view of a pixel of the sensor array of Figure 102D;
Figure 102F is a schematic view of a sensor array, and circuitry connected thereto, that may be employed in the imaging device of Figure 102A, in accordance with one embodiment of the present invention;
Figure 102G is a schematic view of a pixel of the sensor array of Figure 102F;
Figures 103A-103E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 104A-104E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 105A-105E are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 106A-106C are perspective views of a system having a plurality of digital camera devices, in accordance with another embodiment of the present invention;
Figure 107A is a perspective view of a system including a plurality of digital camera devices, in accordance with another embodiment of the present invention;
Figure 107B is a front view of an imaging device that may be employed in the system of Figure 107A;
Figures 108A-108B are schematic views of digital camera devices, in accordance with other embodiments of the present invention;
Figures 109A-109E are block diagrams illustrating configurations of digital camera devices, in accordance with embodiments of the present invention;
Figure 110A is a block diagram of a processor, in accordance with one embodiment of the present invention;
Figure 110B is a block diagram of a channel processor that may be employed in the processor of Figure 110A, in accordance with one embodiment of the present invention;
Figure 110C is a block diagram of an image pipeline that may be employed in the processor of Figure 110A, in accordance with one embodiment of the present invention;
Figure 110D is a block diagram of a preprocessor that may be employed in the processor of Figure 110A, in accordance with one embodiment of the present invention;
Figure 110E is a block diagram of the system control and other portions of a digital camera device, in accordance with one embodiment of the present invention;
Figure 110F is a representation of a command format, in accordance with one embodiment of the present invention;
Figure 111A is a block diagram of a channel processor, in accordance with another embodiment of the present invention;
Figure 111B is a graphical representation of adjacent pixel values;
Figure 111C is a flow chart illustrating operations employed in one embodiment of a double sampler;
Figure 111D is a flow chart illustrating operations employed in one embodiment of a defective pixel identifier;
Figure 111E is a block diagram of an image pipeline, in accordance with another embodiment of the present invention;
Figure 111F is a block diagram of an image plane integrator, in accordance with one embodiment of the present invention;
Figure 111G is an explanatory representation of a multi-phase clock that may be employed in the image plane integrator of Figure 111F;
Figures 111H-111J are explanatory views illustrating representations of images produced by three camera channels, in accordance with one embodiment of the present invention;
Figures 111K-111Q are explanatory views illustrating representations of a process carried out on the images of Figures 111H-111J by an automatic image alignment portion, in accordance with one embodiment of the present invention;
Figure 111R is a block diagram of an automatic exposure control, in accordance with one embodiment of the present invention;
Figure 111S is a block diagram of a zoom controller, in accordance with one embodiment of the present invention;
Figures 111T-111V are explanatory views of a process carried out by the zoom controller of Figure 111S, in accordance with one embodiment of the present invention;
Figure 111W is a graphical representation illustrating an example of the operation of a gamma correction portion, in accordance with one embodiment of the present invention;
Figure 111X is a block diagram of a gamma correction portion employed in accordance with one embodiment of the present invention;
Figure 111Y is a block diagram of a color correction portion, in accordance with one embodiment of the present invention;
Figure 111Z is a block diagram of an edge enhancer/sharpener, in accordance with one embodiment of the present invention;
Figure 111AA is a block diagram of a chroma noise reduction portion, in accordance with one embodiment of the present invention;
Figure 111AB is an explanatory view illustrating a representation of a process carried out by a white balance portion, in accordance with one embodiment of the present invention;
Figure 111AC is a block diagram of a color enhancement portion, in accordance with one embodiment of the present invention;
Figure 111AD is a block diagram of a scaling portion, in accordance with one embodiment of the present invention;
Figure 111AE is an explanatory view illustrating a representation of up-scaling, in accordance with one embodiment of the present invention;
Figure 111AF is a flow chart of operations that may be employed in an alignment portion, in accordance with another embodiment of the present invention;
Figure 112 is a block diagram of a channel processor, in accordance with another embodiment of the present invention;
Figure 113 is a block diagram of a channel processor, in accordance with another embodiment of the present invention;
Figure 114A is a block diagram of an image pipeline, in accordance with another embodiment of the present invention;
Figure 114B is a block diagram of an image pipeline, in accordance with another embodiment of the present invention;
Figure 114C is a block diagram of a chroma noise reduction portion, in accordance with another embodiment of the present invention;
Figures 115A-115L are explanatory views illustrating examples of parallax;
Figure 115M is an explanatory view illustrating the overlap of an image viewed through a first camera channel and an image viewed through a second camera channel with parallax eliminated, in accordance with one embodiment of the present invention;
Figures 115N-115R are explanatory views illustrating examples of reducing parallax;
Figures 115S-115X are explanatory views illustrating examples of increasing parallax;
Figure 116 is a flow chart of operations that may be employed in generating an estimate of the distance to an object or a portion thereof, in accordance with another embodiment of the present invention;
Figure 117 is a block diagram of a portion of a distance measuring apparatus, in accordance with one embodiment of the present invention;
Figure 118 is a block diagram of a locator portion of a distance measuring apparatus, in accordance with one embodiment of the present invention;
Figures 119A-119C are explanatory views illustrating an example of 3D imaging;
Figure 120 is an explanatory view of another type of 3D imaging;
Figures 121-122 are flow charts of operations that may be employed in 3D imaging, in accordance with another embodiment of the present invention;
Figure 123 is a block diagram of a 3D effect generator, in accordance with one embodiment of the present invention;
Figure 124 is a block diagram of a 3D effect generator, in accordance with one embodiment of the present invention;
Figure 125 is a flow chart of operations that may be employed in an image recognition process, in accordance with another embodiment of the present invention;
Figures 126A-126B are flow charts of operations that may be employed in an image recognition process, in accordance with another embodiment of the present invention;
Figure 127 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 128 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 129 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 130 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 131 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 132 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 133 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 134 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 135 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 136 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 137 is a block diagram of one or more portions of a digital camera device, in accordance with another embodiment of the present invention;
Figure 138 illustrates one or more portions of a digital camera device implementing one or more aspects/techniques/embodiments of spectral optimization, in accordance with other embodiments of the present invention; such aspects/techniques/embodiments may be implemented in any of the embodiments described and/or illustrated herein.
Embodiment
In Figure 1A, a prior-art digital camera 1 generally includes the main image-capture elements: an image sensor 150, a color filter 160, and a series of lenses 170 (forming a lens assembly). Additional electronic components typically include a circuit board 110, peripheral user-interface electronics 120 (shown here as a shutter button, but which may also include a display, settings devices, controllers, and the like), a power supply 130, and an electronic image storage medium 140.
Digital camera 1 also includes a housing (comprising housing portions 173, 174, 175, 176, 177, 178) and a shutter assembly (not shown), which controls aperture 180 and the passage of light into digital camera 1. A mechanical frame 181 holds the components of the lens assembly together. The lens assembly includes lenses 170 and one or more electromechanical devices 182 that move lenses 170 along axis 183. The mechanical frame 181 and the one or more electromechanical devices 182 may consist of many parts and/or complex assemblies.
Color filter 160 has a color filter array arranged in a Bayer pattern. A Bayer pattern uses red, green, blue, and typically a second green filter (for example, a 2x2 color matrix with alternating red and green in one row and alternating green and blue in the other row), although other colors may also be used and the pattern may be varied to suit the user's needs. The Bayer pattern repeats across the entire color filter array 112, as shown in Figures 1A-1D.
Image sensor 150 includes a plurality of identical photodetectors (sometimes referred to as "picture elements" or "pixels") arranged in a matrix. The number of photodetectors typically ranges from hundreds of thousands to millions. The lens assembly spans the diagonal of the array.
Color filter array 160 is placed on top of image sensor 150 so that each filter in color filter 160 sits over a corresponding photodetector in image sensor 150; each photodetector in the image sensor thereby receives a specific band of visible light (for example red, green, or blue).
Figures 1B-1D show the photon-capture process used by prior-art digital camera 1 to produce a color image. The full spectrum of visible light 184 strikes the lens assembly, which passes substantially the complete spectrum. The full spectrum then strikes the filters of color filter 160, each of which passes its specific band to its specific pixel. This process is repeated for each pixel. Each pixel provides a signal indicating the intensity of the color it received. Signal processing circuitry (not shown) receives the alternating color signals from the photodetectors, processes them in groups of four pixels each (red/green/blue/green, or variants thereof) so that each group is integrated into a single full-color pixel, and finally outputs a color image.
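The 2x2 grouping described above can be sketched in a few lines of pure Python. This is only an illustrative sketch of the general Bayer-binning idea, not the patent's circuitry; the rule of keeping R and B and averaging the two green samples is a common convention and is assumed here.

```python
def bayer_cell_to_rgb(cell):
    """Combine one 2x2 Bayer cell [[R, G], [G, B]] into a single
    full-color (R, G, B) pixel: keep R and B, average the two greens."""
    r, g1 = cell[0]
    g2, b = cell[1]
    return (r, (g1 + g2) / 2.0, b)

def mosaic_to_image(mosaic):
    """Walk a Bayer mosaic (even height/width) in 2x2 steps,
    producing one RGB pixel per cell - a quarter-resolution output."""
    rgb = []
    for y in range(0, len(mosaic), 2):
        row = []
        for x in range(0, len(mosaic[0]), 2):
            cell = [mosaic[y][x:x + 2], mosaic[y + 1][x:x + 2]]
            row.append(bayer_cell_to_rgb(cell))
        rgb.append(row)
    return rgb

# A 2x4 mosaic -> two full-color pixels.
mosaic = [[10, 20, 30, 40],
          [24, 50, 36, 70]]
print(mosaic_to_image(mosaic))  # [[(10, 22.0, 50), (30, 38.0, 70)]]
```

Note that each full-color output pixel consumes four photodetectors; this is the resolution cost the multi-array approach described later avoids.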
Figure 1E illustrates operation of the lens assembly in the retracted mode (sometimes referred to as normal mode or the near-focus setting). The lens assembly is shown focused on a distant object 186 (depicted as a lightning bolt). For ease of reference, a representation of image sensor 150 is included. The field of view of camera 1 is bounded by reference lines 188, 190; its width might be, for example, 50 millimeters (mm). To achieve this field of view 188, 190, the one or more electromechanical devices 182 position lenses 170 relatively close together. The lens assembly passes the field of view through lenses 170 to image sensor 150, as shown by reference lines 192, 194. The image of the object (shown at 196) appears on image sensor 150 at the same ratio as that of the actual field of view 188, 190 to the width of the actual object 186.
Figure 1F illustrates operation of lens assembly 110 in the optical zoom mode (sometimes referred to as the far-focus setting). In this mode, the one or more electromechanical devices 182 of the lens assembly reposition lenses 170 to narrow the field of view 188, 190 over the same image area, making object 186 appear closer (i.e., larger). One advantage of such a lens assembly is that its resolution in zoom mode typically equals its resolution in the retracted mode. A disadvantage, however, is that the lens assembly may be costly and complex. Moreover, providing a lens with zoom capability reduces light sensitivity and therefore increases the F-stop of the lens, making the lens less efficient in low-light conditions.
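The trade-off between focal length and field of view described above follows from simple geometry. As a hedged illustration (the thin-lens relation below and the sensor-width/focal-length numbers are textbook assumptions, not values from the patent):

```python
import math

def field_of_view_deg(sensor_width_mm, focal_length_mm):
    """Angular field of view of an ideal thin lens:
    FOV = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Doubling the focal length (2x optical zoom) roughly halves the
# angle seen over the same sensor width - the object appears larger.
wide = field_of_view_deg(6.4, 5.0)   # retracted / normal mode
tele = field_of_view_deg(6.4, 10.0)  # zoom / far-focus mode
print(round(wide, 1), round(tele, 1))  # 65.2 35.5
```

This is why repositioning the lenses to lengthen the effective focal length narrows the field of view 188, 190 while the image area stays fixed.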
Some other disadvantages associated with traditional digital camera 1 are described below.
A traditional digital camera employs one large array on the image sensor, and therefore also a lens that must cover the entire array. This creates two problems related to physical size: 1) a lens covering a large array (e.g., 3 megapixels) will be physically larger, in both diameter and thickness, than a lens covering a smaller array (e.g., 1 megapixel); and 2) a large lens/array combination may have a long focal length, which increases the height of the lens.
Furthermore, because conventional lenses must resolve the complete spectrum of visible wavelengths, they are complex, usually having 3-8 separate elements. This adds to the optical stack height, complexity, and cost.
In addition, because a conventional lens must pass colors of all bandwidths, it must be a colorless, transparent lens (with no color filtering). The required color filtering described above is accomplished by depositing a sheet of tiny color filters below the lens, on top of the image sensor. For example, an image sensor with one million pixels requires a sheet of one million individual filters. This color filter array technology is expensive (non-standard integrated-circuit processing), limits the shrinking of pixel size (color crosstalk between pixels in the image sensor), and attenuates the in-band photon flux through the array (i.e., reduces light sensitivity), because the in-band transmission of the color filter array material is below 100%.
In addition, because the lenses must move back and forth relative to the image sensor, extra time and energy are required. This is an undesirable aspect of a digital camera, because it produces long capture-response delays and reduces battery capacity.
One or more of the above disadvantages associated with traditional digital cameras may be resolved by one or more embodiments of one or more aspects of the present invention, although this is not required.
Figure 2 illustrates a digital camera 2, and example components thereof, in accordance with an embodiment of certain aspects of the present invention. The digital camera includes a digital camera subsystem 200, a circuit board 110, peripheral user-interface electronics 120 (shown here as a shutter button, but which may also include a display and/or one or more other output devices, settings controllers, and/or one or more additional input devices, etc.), a power supply 130, and an electronic image storage medium 140.
The digital camera of Figure 2 may also include a housing and a shutter assembly (not shown), which controls aperture 180 and the passage of light into digital camera 2.
Figures 3A-3B are partially exploded schematic views of an embodiment of digital camera subsystem 200. In this embodiment, the digital camera subsystem includes an image sensor 210, a frame 220 (Figures 7A-7C), and lenses 230A-230D. Image sensor 210 generally comprises a semiconductor integrated circuit, or "chip," with several high-level features, including a plurality of arrays 210A-210D and signal processing circuits 212, 214. Each of arrays 210A-210D captures photons and outputs electrical signals. In some embodiments, signal processing circuit 212 processes the signals for each of the arrays 210. Signal processing circuit 214 may combine the outputs of the processing in 212 into a data output format (usually that of a recombined full-color image). Each array and its associated signal processing circuitry may preferably be customized to handle a specific band of the visible spectrum.
Each of lenses 230A-230D may advantageously be customized for the respective wavelength of the respective array. A lens is generally about the same size as the array beneath it, and therefore the lenses differ from one another in size and shape depending on the dimensions of the underlying arrays. Of course, a fixed-focus lens is not required to cover all of, or only, the array beneath it. In alternative embodiments, a lens may cover only part of an array, and may extend beyond the array. The lenses may comprise any one or more suitable materials, including for example glass and plastic. The lenses may be doped in any suitable manner, such as to impart a color-filtering, polarizing, or other characteristic. The lenses may be rigid or flexible.
Frame 220 (Figures 7A-7C) is used to mount lenses 230A-230D to image sensor 210.
In this exemplary embodiment, each lens, array, and signal processing circuit constitutes an image-generating subsystem for one band of the visible spectrum (e.g., red, blue, green, etc.). These separate images are then combined by additional signal processing circuitry within the semiconductor chip to form a complete image for output.
Those skilled in the art will understand that, although digital camera subsystem 200 is described with a four-array/lens configuration, the digital camera subsystem may also be used with array/lens configurations of any number and shape.
Figure 4 depicts a digital camera subsystem having a three-array/lens configuration.
In Figures 5A-5C, the digital camera subsystem employs separate arrays, e.g., arrays 210A-210D, on one image sensor, in place of the prior-art method (which employs a Bayer pattern, or variants thereof, working across the array one pixel at a time, and integrates groups of four pixels each, e.g., red/green/green/blue or variants thereof, into single full-color pixels). Each such array is dedicated to a specific band of visible light. Each array can therefore be tuned so that its capture and processing of images of that particular color is more efficient. Each lens (230A-D) can be customized for the spectral band of its array; each lens only needs to pass that one color (184A-184D) to the image sensor. The traditional color filter sheet is eliminated. Each array outputs signals to signal processing circuitry, and the signal processing circuitry for each of these arrays is likewise customized for its band of visible light. In effect, a separate image is produced by each of these arrays. After this process, the images are combined to form one full-color or black/white image. By customizing each array and its associated signal processing circuitry, an image can be produced with higher image quality than that produced by a conventional image sensor of a similar pixel count.
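The final combination step, merging one single-color image per array into a full-color image, can be sketched as follows. This is a minimal sketch assuming the three planes are already the same size and aligned (the patent's alignment and parallax handling are discussed elsewhere); the function name is illustrative, not from the patent.

```python
def combine_planes(red, green, blue):
    """Merge three same-size single-color planes (one per array)
    into one full-color image: each output pixel takes its R, G,
    and B values from the corresponding position in each array."""
    assert len(red) == len(green) == len(blue)
    return [
        [(r, g, b) for r, g, b in zip(rrow, grow, brow)]
        for rrow, grow, brow in zip(red, green, blue)
    ]

red   = [[100, 110], [120, 130]]
green = [[ 50,  60], [ 70,  80]]
blue  = [[ 10,  20], [ 30,  40]]
print(combine_planes(red, green, blue))
# [[(100, 50, 10), (110, 60, 20)], [(120, 70, 30), (130, 80, 40)]]
```

Unlike the Bayer approach, every output pixel here has a measured value for each color, rather than two measured values and two interpolated ones.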
Figures 6A-6B illustrate some of the many processing operations that may advantageously be employed. As noted above, each array outputs signals to signal processing circuits 212, in which each array is processed separately so that the processing suits the corresponding band. Several functions take place:
The row logic (212.1A-D) is the portion of the signal processing circuitry that reads the signals from the pixels. For example, row logic 212.1A reads the signals from the pixels of array 210A, row logic 212.1B reads from the pixels of array 210B, row logic 212.1C from the pixels of array 210C, and row logic 212.1D from the pixels of array 210D.
Because each array is dedicated to a specific wavelength, multiple wavelengths, a wavelength band, or multiple wavelength bands, the row logic can use a different integration time for each array to enhance dynamic range and/or color selectivity. And because the logic need not switch between extreme color shifts, the signal processing circuit complexity of each array can be reduced considerably.
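One way per-array integration times can enhance dynamic range is to integrate a weak band longer and then rescale its counts to a common reference before the planes are merged. The sketch below is an assumption about one plausible scheme, not the patent's circuit; the function name and the 10 ms reference are illustrative.

```python
def normalize_exposure(raw_counts, integration_time_ms, reference_ms=10.0):
    """Scale raw pixel counts from an array read with its own
    integration time back onto a common reference exposure, so
    planes captured with different times can be compared and merged."""
    scale = reference_ms / integration_time_ms
    return [count * scale for count in raw_counts]

# The blue array was integrated 4x longer (40 ms) to lift a weak
# signal above the noise; after normalization its counts align
# with the red plane captured at the reference time.
red_plane  = normalize_exposure([200.0, 400.0], integration_time_ms=10.0)
blue_plane = normalize_exposure([800.0, 1600.0], integration_time_ms=40.0)
print(red_plane, blue_plane)  # [200.0, 400.0] [200.0, 400.0]
```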
The analog signal logic (ASL) of each array can be specific to one color (212.2A-D). The ASL therefore processes a single color and can be optimized for gain, noise, dynamic range, linearity, and so on. Because the color signals are separated, the logic does not require the significant shifts, and associated settling time, of a traditional Bayer-pattern design, in which the amplifiers and logic change pixel by pixel (color to color).
The black level logic (212.3A-D) assesses the noise level in the signal and filters it out. Because each array is dedicated to a narrower band of the visible spectrum than a conventional image sensor, the black level logic can be tuned more finely to eliminate noise.
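A common form of black level correction, shown here as a hedged sketch rather than the patent's implementation, estimates the dark floor from optically shielded reference pixels and subtracts it. Running this per array lets each color band use its own, more finely tuned, noise floor.

```python
def subtract_black_level(pixels, dark_pixels):
    """Estimate the black (dark) level as the mean of optically
    shielded reference pixels, then subtract it from the active
    pixels, clamping at zero."""
    black = sum(dark_pixels) / len(dark_pixels)
    return [max(0.0, p - black) for p in pixels]

print(subtract_black_level([12.0, 64.0, 7.0], dark_pixels=[8.0, 10.0, 12.0]))
# [2.0, 54.0, 0.0]
```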
Exposure control (212.4A-D) measures the total amount of light captured by the array and adjusts the capture time for image quality. A traditional camera must make this determination globally (for all colors). The invention enables exposure control to be carried out for each array, and thus differently for each wavelength band.
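Per-array exposure control can be pictured as one independent control loop per color band. The proportional rule and the target/limit values below are illustrative assumptions, not taken from the patent:

```python
def next_exposure_ms(current_ms, mean_level, target_level=128.0,
                     min_ms=0.1, max_ms=100.0):
    """One step of a simple proportional auto-exposure loop, run
    independently per array: scale the capture time so the mean
    pixel level moves toward the target, within hardware limits."""
    if mean_level <= 0:
        return max_ms
    proposed = current_ms * target_level / mean_level
    return min(max_ms, max(min_ms, proposed))

# A dim green plane (mean 32) gets 4x more capture time; a
# too-bright red plane (mean 256) gets half - each band settles
# on its own exposure, which a global control cannot do.
print(next_exposure_ms(10.0, 32.0))   # 40.0
print(next_exposure_ms(10.0, 256.0))  # 5.0
```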
Give second group of signal processing circuit 214 with treated image transfer then.At first, image processing logical block 214.1 is integrated into single coloured image with a plurality of planes of color.This image is adjusted in correction at saturation, acutance, intensity, tone, the removal of pseudo-shadow and defectiveness pixel.IPL also provides arithmetic automatic focus, zoom, window (windowing), combination of pixels (pixel binning) and camera function.
The final two operations encode the signal into a standard protocol 214.2, such as MPEG or JPEG, before passing it to a standard output interface 214.3, such as USB.

Although the signal processing circuits 212, 214 are shown located in specific regions of the image sensor, they may be placed anywhere on the chip and partitioned in any way. The actual signal processing circuitry may be distributed across multiple locations.

As noted above, the image sensor 210 (Figs. 3A-3B) generally comprises a semiconductor chip with several high-level features, including multiple arrays (210A-210D) and signal processing circuitry 212, where each array and its associated signal processing circuitry are preferably customized to handle a specific band of the visible spectrum. As mentioned above, the image sensor may be configured with any number and shape of arrays.

The image sensor 210 can be built with any suitable technology, including silicon and germanium technologies. The pixels can be formed in any suitable manner, sized and dimensioned as desired, and distributed in any desired pattern. Pixels that are not distributed in any regular pattern may even be used.

Any portion of the visible spectrum can be assigned to each array, depending on the user's particular interests. In addition, an infrared array can serve as one of the array/lens combinations to give the sensor low-light capability.

As noted above, the arrays 210A-D can have virtually any size or shape. Fig. 3 shows the arrays as separate, discrete portions of the image sensor. The arrays could also touch one another. There could also be one large array configured so that it is subdivided into sections, each section dedicated to one band, thereby producing the same effect as separate arrays on the same chip.

Although the well depth of the photodetectors across each individual array (denoted 210A-D) can be the same (the well being, for example, the region or portion of the photodetector that captures, collects, responds to, detects, and/or senses, for example, the intensity of incident light; in certain embodiments the well depth is the distance from the photodetector surface to the doped region, see, e.g., Figs. 17B-E), the well depth of any given array may differ from the well depth of one or more, or all, of the other arrays in the sensor subsystem. The choice of an appropriate well depth can depend on several factors, most likely including the targeted band of the visible spectrum. Since an entire array may be dedicated to one band of the visible spectrum (e.g., red), the well depth can be designed to capture that wavelength and ignore others (e.g., blue, green).
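The wavelength dependence of well depth can be made concrete with a back-of-the-envelope absorption calculation. The penetration depths below are approximate textbook figures for silicon (not values from the patent), and the Beer-Lambert model is a simplification; the sketch only illustrates why a red-dedicated array benefits from a deeper well than a blue-dedicated one.

```python
# Illustrative only: light penetrates silicon to a wavelength-dependent
# depth (here, the approximate depth at which intensity falls to 1/e).
import math

# Rough, representative 1/e penetration depths in silicon (micrometers).
PENETRATION_DEPTH_UM = {"blue_450nm": 0.4, "green_550nm": 1.5, "red_650nm": 3.3}

def absorbed_fraction(depth_um, penetration_um):
    """Fraction of incident photons absorbed within depth_um (Beer-Lambert)."""
    return 1.0 - math.exp(-depth_um / penetration_um)

# A shallow 1 um well captures most blue light but only part of the red.
blue = absorbed_fraction(1.0, PENETRATION_DEPTH_UM["blue_450nm"])
red = absorbed_fraction(1.0, PENETRATION_DEPTH_UM["red_650nm"])
```

Under these assumed numbers, a 1 µm well absorbs over 90% of blue photons but well under half of red photons, which is consistent with the text's point that a red-only array would be designed with a deeper well.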
Within a color-specific array, doping of the semiconductor material can further be used to enhance the selectivity of photon absorption for the wavelengths of the particular color.

In Figs. 7A-7C, the frame 220 is a thin plate that is drilled so as to carry each lens (denoted 230A, 230C) above its array. The lenses can be secured to the frame in a variety of ways: bonding, interference (press) fit, electronic engagement, and so on. The mounting holes can have a small "seat" depth at the base to control the lens position. This depth can differ for each lens, being the result of the particular focal length customized for the particular lens of each array.

The frame shown in Figs. 7A-7C is a robust unit that offers many choices of fabrication, material, mounting, size, and shape. Of course, other suitable frames can readily be designed, and all fall within the scope of the present invention.

Although the drawings show each lens of each array fitted into the frame, the lenses can also be fabricated such that all of the lenses of an image sensor are formed as a single molding or part. Moreover, that integral structure can itself serve as the frame mounted to the image sensor.

The lens and frame concepts can be applied to traditional image sensors (without the traditional color filter sheet) to obtain physical size, cost, and performance benefits.

As shown in Figs. 7A-7C, the digital camera subsystem can have multiple independent arrays on a single image sensor, each array having its own lens (denoted 230A, 230C). The simple geometry of the smaller multiple arrays permits smaller lenses (in diameter, thickness, and focal length), which in turn permits a reduction in the stack height of the digital camera.

Each array can advantageously target one visible and/or detectable band. In particular, each lens can be tuned for passage of that specific wavelength band. Because each lens does not need to pass the entire spectrum, the number of lens elements can be reduced, for example to one or two.

Furthermore, since each lens targets a single bandwidth, each lens can be colored for its corresponding bandwidth during manufacture (for example, the lens for the array targeting the red band of the visible spectrum can be dyed red). Alternatively, a single color filter can be applied to each lens. This approach dispenses with the traditional color filter sheet (per-pixel filters), thereby reducing cost, improving signal strength, and removing an obstacle to pixel-size reduction.

Eliminating the color filter sheet allows the physical size of the pixels to be reduced, further reducing the size of the overall DCS assembly.
Although Figs. 2, 3A-3B, and 5A-5C show four-array/lens structures and Fig. 4 depicts a three-array/lens configuration, any number of arrays/lenses, in various combinations, is possible.

The apparatus can comprise any suitable number of combinations, from as few as two arrays/lenses up to wider arrays. Examples include:

Two arrays/lenses: red/green and blue

Two arrays/lenses: red and blue/green

Three arrays/lenses: red, green, blue

Four arrays/lenses: red, blue, green, emerald (for color enhancement)

Four arrays/lenses: red, blue, green, infrared (for low-light conditions)

Eight arrays/lenses: any of the above configurations doubled, for added pixel count and image quality.
Although Fig. 2 depicts a digital camera, the camera is intended to be representative of common devices that include a digital camera subsystem. Fig. 2 should therefore be interpreted as representative of still and video cameras, cell phones, other personal communication devices, surveillance equipment, automotive applications, computers, manufacturing and inspection equipment, toys, and a variety of other, ever-expanding applications. Of course, these alternative interpretations of the figure may or may not include the specific parts shown in Fig. 2. For example, the circuit board need not exist solely for camera functions; in a cell phone, for instance, the digital camera subsystem may be an add-on to an existing circuit board.

It should therefore be understood that any or all of the methods and/or apparatus disclosed herein can be employed in any type of device or process, including but not limited to still and video cameras, cell phones, other personal communication devices, surveillance equipment, automotive applications, computers, manufacturing and inspection equipment, toys, and a variety of other, ever-expanding applications.
Unless the context requires otherwise, the following terms as used herein are to be interpreted as described below.

"Array" means a group of photodetectors, also known as pixels, that work together to produce one image. The array captures photons and converts the data to an electronic signal. The array outputs this raw data to signal processing circuitry, which generates the image sensor's image output.

"Digital camera" means a single assembly that receives photons, converts them to electrical signals on a semiconductor device ("image sensor"), and processes those signals into an output that yields a photographic image. The digital camera includes any necessary lenses, the image sensor, shutter, flash, signal processing circuitry, storage, user interface features, power supply, and any mechanical structure (e.g., circuit board, housing) to hold these parts. A digital camera may be a stand-alone product or may be embedded in other devices, such as cell phones, computers, or the countless other imaging platforms now available or that may be created in the future, such as those made practicable by the present invention.

"Digital camera subsystem" (DCS) means a single assembly that receives photons, converts them to electrical signals on a semiconductor device ("image sensor"), and processes those signals into an output that yields a photographic image. The digital camera subsystem includes at least any necessary lenses, the image sensor, signal processing circuitry, shutter, flash, and a frame to hold the parts as may be needed. A power supply, storage, and any mechanical structure are not necessarily included.

"Electronic media" means that images are captured, processed, and stored electronically, as contrasted with the use of film.

"Frame" or "thin plate" means the part of the DCS that holds the lenses and is mounted to the image sensor.

"Image sensor" means the semiconductor device comprising the photodetectors ("pixels"), processing circuitry, and output channels. The input is photons; the output is image data.

"Lens" means a single lens, or a series of stacked lenses (one lens layered above another), that focuses light onto an individual array. When multiple stacks are employed over different arrays, they are referred to as "lenses."

"Package" means the box or frame on or in which an image sensor (or any semiconductor chip) is mounted, which protects the imager and provides a sealed enclosure. "Packageless" refers to a semiconductor chip that can be mounted directly to a circuit board without the need for a package.

"Photodetector" and "pixel" mean the electronic device that senses and captures photons and converts them to electrical signals. These extremely small devices are used in large numbers (hundreds of thousands to millions) in a matrix, capturing an image much as film does.

"Semiconductor chip" means a discrete electronic device fabricated on a silicon or similar substrate, of the kind commonly used in virtually all electronic equipment.

"Signal processing circuitry" means the hardware and software within the image sensor that converts the photon input information into electronic signals and ultimately into an image output signal.
The subject matter of the present invention can provide numerous advantages in particular applications. For example, conventional color filters have a limited temperature range, which limits end-user manufacturing flexibility. Wave soldering, a low-cost, high-volume production soldering process, cannot be used because of the temperature limitations of the color filters. At least some embodiments of the present subject matter have no such limitation. Indeed, one, some, or all of the embodiments described and illustrated herein need not forgo wave soldering or other low-cost, high-volume production soldering processes.

Moreover, once the image sensor, frame, and lenses are assembled together, the assembly can be a sealed device. The device requires no "package" and, if desired, can then be mounted directly to a circuit board, saving parts and manufacturing cost.

Because multiple images are created from separated positions (even though the distance between arrays on the same image sensor is small), parallax is produced. This parallax can be eliminated in the signal processing circuitry, or exploited/enhanced for many purposes, including, for example, measuring distance to an object and providing 3-D effects.
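The distance-measurement use of parallax mentioned above follows the standard pinhole stereo relation Z = f · B / d. The function below is a hedged sketch of that relation, not an implementation from the patent; all numbers are hypothetical, and all quantities are expressed in the same units (millimeters).

```python
# Sketch of estimating object distance from the parallax (disparity)
# between images formed by two arrays on the same image sensor.

def distance_from_disparity(focal_len_mm, baseline_mm, disparity_mm):
    """Pinhole stereo model: Z = f * B / d."""
    return focal_len_mm * baseline_mm / disparity_mm

# f = 2 mm lens, arrays 3 mm apart, feature shifted 0.006 mm between images:
z_mm = distance_from_disparity(2.0, 3.0, 0.006)
# z_mm == 1000.0 mm, i.e. the object is about 1 m away
```

Note the trade-off the text implies: the small inter-array baseline limits ranging precision at long distances, but the same disparity that enables ranging must otherwise be removed when the planes are merged into one image.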
Although each array and its associated signal processing circuitry are preferably customized to handle a specific band of the visible spectrum, and each lens can be tuned for passage of that specific wavelength band, it should be understood that it is not required that each such array and associated signal processing circuitry be customized to process a specific band of the visible spectrum. Nor is it required that each lens be tuned for passage of a specific wavelength band, or that the arrays be located on one semiconductor device. Indeed, the embodiments described and illustrated herein include features that need not employ specific wavelengths. For example, the arrays and/or signal processing circuitry need not be customized to process a specific wavelength or wavelength band.

It should be noted that in certain embodiments, some features may be customized to process a specific wavelength or wavelength band while other portions of the embodiment are not. For example, a lens and/or an array may be customized to process a specific wavelength or wavelength band while the associated signal processing circuitry is not. In still other embodiments, one or more lenses (of the same or different optical channels) may be customized to process a specific wavelength or wavelength band while the associated array and signal processing circuitry are not. All such permutations and combinations are intended to fall within the scope of the present invention. For the sake of brevity, all of these permutations and combinations are not discussed in detail herein.

Furthermore, although the digital camera subsystem includes any necessary lenses, the image sensor, signal processing circuitry, shutter, flash, and any frame that may be needed to hold the parts, certain digital camera subsystems may not require one or more of these. For example, certain digital camera subsystems may not require a shutter, a flash, and/or a frame to hold the parts. In addition, certain digital camera subsystems may not require an image sensor that includes each of the detectors, processing circuitry, and output channels. For example, in certain embodiments, one or more portions of one or more of the detectors (or portions thereof), the processing circuitry, and/or the output channels may be included in a separate device and/or disposed in a separate location. All such permutations and combinations are intended to fall within the scope of the present invention. For the sake of brevity, all of these permutations and combinations are not discussed in detail herein.
Fig. 8 is an exploded perspective view of a digital camera apparatus 300 according to another embodiment of the present invention. The digital camera apparatus 300 includes one or more sensor arrays, for example four sensor arrays 310A-310D; one or more optics portions, for example four optics portions 330A-330D; and a processor 340. Each of the one or more optics portions, e.g., optics portions 330A-330D, may include, such as but not limited to, a lens, and may be associated with a corresponding one of the one or more sensor arrays, e.g., sensor arrays 310A-310D. In certain embodiments, a support 320 (see, e.g., Figs. 28A-28D), such as but not limited to a frame, is provided to support, at least in part, the one or more optics portions, e.g., optics portions 330A-330D. Each sensor array and its corresponding optics portion may define a camera channel. For example, camera channel 350A may be defined by optics portion 330A and sensor array 310A; camera channel 350B by optics portion 330B and sensor array 310B; camera channel 350C by optics portion 330C and sensor array 310C; and camera channel 350D by optics portion 330D and sensor array 310D. The optics portions of the one or more camera channels are referred to collectively herein as an optics subsystem. The sensor arrays of the one or more camera channels are referred to collectively herein as a sensor subsystem. Two or more of the sensor arrays may be integrated in or disposed on a common substrate, referred to hereinafter as an image device, may be disposed on separate substrates, or any combination thereof (for example, where the system includes three or more sensor arrays, two or more sensor arrays may be integrated in a first substrate and one or more other sensor arrays may be integrated in or disposed on a second substrate).

Thus, with continued reference to Fig. 8, the one or more sensor arrays, e.g., sensor arrays 310A-310D, may or may not be disposed on a common substrate. For example, in certain embodiments, two or more of the sensor arrays are disposed on a common substrate. In certain embodiments, however, one or more of the sensor arrays are not disposed on the same substrate as one or more other sensor arrays.

The one or more camera channels may or may not be identical to one another. For example, in certain embodiments, the camera channels are identical to one another. In other embodiments, one or more of the camera channels differ from one or more of the other camera channels in one or more respects. In some of the latter embodiments, each camera channel may detect a color (or color band) and/or light band that differs from the color (or color band) and/or light band detected by the other camera channels.

In certain embodiments, one of the camera channels, e.g., camera channel 350A, detects red light; one, e.g., camera channel 350B, detects green light; and one, e.g., camera channel 350C, detects blue light. In some such embodiments, one of the camera channels, e.g., camera channel 350D, detects infrared light, cyan light, or emerald light. In certain other embodiments, one of the camera channels, e.g., camera channel 350A, detects cyan light; one, e.g., camera channel 350B, detects yellow light; one, e.g., camera channel 350C, detects magenta light; and one, e.g., camera channel 350D, detects clear light (black and white). Any other combination of wavelengths or wavelength bands (whether visible or invisible) may also be employed.

The processor 340 is connected to the one or more sensor arrays, e.g., sensor arrays 310A-310D, by one or more communication links, e.g., communication links 370A-370D, respectively. A communication link may be any kind of communication link, including but not limited to, for example, wired (e.g., conductors, fiber optic cables) or wireless (e.g., acoustic links, electromagnetic links, or any combination thereof, including but not limited to microwave links, satellite links, infrared links), and combinations thereof, each of which may be public or private (as with a network), dedicated, and/or shared. A communication link may employ, for example, circuit switching or packet switching, or a combination thereof. Other examples of communication links include dedicated point-to-point systems, wired networks, and cellular telephone systems. A communication link may employ any protocol or combination of protocols, including but not limited to the Internet Protocol.

A communication link may transmit any type of information. The information may have any form, including but not limited to analog and/or digital (e.g., a sequence of binary values, i.e., a bit string). The information may or may not be divided into blocks. If divided into blocks, the amount of information in a block may be predetermined or determined dynamically, and/or may be fixed (e.g., uniform) or variable.
As will be further described below, the processor may include one or more channel processors, each of which is coupled to a respective one (or more) of the camera channels and generates an image based at least in part on the signals received from the respective camera channel, although this is not required. In certain embodiments, one or more of the channel processors are adapted to their respective camera channels, as described herein. For example, where one of the camera channels is dedicated to a specific wavelength or color (or wavelength band or color band), the corresponding channel processor may be adapted, or suited, to that wavelength or color (or wavelength band or color band). For example, the gain, noise reduction, dynamic range, linearity, and/or any other characteristic of the processor, or combination of such characteristics, may be adapted to improve and/or optimize the processor for that wavelength or color (or wavelength band or color band). Tailoring the channel processing to the respective camera channel can help generate images of a quality higher than the image quality produced by conventional image sensors of like pixel count. In addition, providing a dedicated channel processor for each camera channel can help reduce or simplify the amount of logic in the channel processors, since a channel processor need not accommodate extreme shifts in color or wavelength, for example from a color (or color band) or wavelength (or wavelength band) at one extreme to a color (or color band) or wavelength (or wavelength band) at another extreme.
In operation, the optics portion of a camera channel receives light within its field of view and transmits one or more portions of that light, for example in the form of an image at an image plane. The sensor array receives one or more portions of the light transmitted by the optics portion and provides one or more output signals representative thereof. The one or more output signals from the sensor array are supplied to the processor. In certain embodiments, the processor generates one or more output signals based at least in part on the one or more signals from the sensor array. For example, in certain embodiments, each camera channel is dedicated to a color (or color band) or wavelength (or wavelength band) different from those of the other camera channels, and the processor generates a separate image for each such camera channel. In certain other embodiments, the processor may generate a combined image based at least in part on the images from two or more such camera channels. For example, in certain embodiments, each camera channel is dedicated to a color (or color band) or wavelength (or wavelength band) different from those of the other camera channels, and the processor combines the images from two or more camera channels to provide a partial-color or full-color image.

Although the processor 340 is shown separated from the one or more sensor arrays, e.g., sensor arrays 310A-310D, the processor 340, or portions thereof, may have any configuration and may be disposed in one or more locations. In certain embodiments, one, some, or all portions of the processor 340 are integrated in or disposed on the same substrate or substrates as one or more of the sensor arrays, e.g., one or more of sensor arrays 310A-310D. In certain embodiments, one, some, or all portions of the processor are disposed on one or more substrates separate from (and possibly remote from) the one or more substrates on which one or more of the sensor arrays, e.g., one or more of sensor arrays 310A-310D, may be disposed. For example, some operations of the processor may be allocated to, or performed by, circuitry integrated in or disposed on the same substrate or substrates as one or more of the sensor arrays, while other operations of the processor are allocated to, or performed by, circuitry integrated in or disposed on one or more substrates different from those with which the one or more sensor arrays are integrated (whether or not such different substrates are physically located within the camera).

The digital camera apparatus 300 may or may not include a shutter, a flash, and/or a frame to hold the parts together.
Fig. 9A is an exploded schematic representation of an embodiment of an optics portion, e.g., optics portion 330A. In this embodiment, the optics portion 330A includes one or more lenses, e.g., a complex aspherical lens module 380; one or more color coatings, e.g., color coating 382; one or more masks, e.g., autofocus mask 384; and one or more IR coatings, e.g., IR coating 386.

The lenses may comprise any one or more suitable materials, including, for example, glass and plastic. The lenses can be doped in any suitable manner, such as to impart a color filtering, polarization, or other characteristic. The lenses may be rigid or flexible. Thus, certain embodiments employ a lens (or lenses) with a dye coating, a dye diffused in the optical medium (e.g., one or more lenses), a uniform color filter, and/or any other filtering technique through which light substantially passes to the underlying array.

The color coating 382 helps the optics portion filter out (i.e., substantially attenuate) one or more wavelengths or wavelength bands. The autofocus mask 384 may define one or more interference patterns that help the digital camera apparatus perform one or more autofocus functions. The IR coating 386 helps the optics portion 330A filter out a wavelength or wavelength band in the IR portion of the spectrum.

The one or more color coatings, e.g., color coating 382, the one or more masks, e.g., mask 384, and the one or more IR coatings, e.g., IR coating 386, may have any size, shape, and/or configuration. In certain embodiments, one or more of the color coatings, e.g., color coating 382, are disposed on top of one or more of the optics portions (see, e.g., Fig. 9B). Some embodiments of the optics portion (and/or parts thereof) may or may not include the one or more color coatings, the one or more masks, and the one or more IR coatings, and may or may not include features in addition to or in place of them. For example, in certain embodiments, one or more of the color coatings, e.g., color coating 382, are replaced by one or more filters 388 disposed in the optics portion, for example below the lens (see, e.g., Fig. 9C). In other embodiments, the one or more color coatings are replaced by one or more dyes diffused in the lens (see, e.g., Fig. 9D).
The one or more optics portions, e.g., optics portions 330A-330D, may or may not be identical to one another. For example, in certain embodiments, the optics portions are identical to one another. In certain other embodiments, one or more of the optics portions differ from one or more of the other optics portions in one or more respects. For example, in certain embodiments, one or more characteristics of one or more optics portions (such as but not limited to the type of elements, size, response, and/or performance) are adapted to the respective sensor array and/or help achieve a desired result. For example, if a particular camera channel is dedicated to a particular color (or color band) or wavelength (or wavelength band), the optics portion for that camera channel may be adapted to transmit only that particular color (or color band) or wavelength (or wavelength band) to the sensor array of that camera channel, and/or to filter out one or more other colors or wavelengths. In some such embodiments, the design of the optics portion is optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated. It should be understood, however, that any other configuration may be employed. Each of the one or more optics portions may have any configuration.

In certain embodiments, each of the optics portions, e.g., optics portions 330A-330D, includes a single lens element or a stack of lens elements (or lenslets), although as stated above the present invention is not limited thereto. For example, in certain embodiments, single lens elements, multiple lens elements, and/or compound lenses, with or without one or more filters, prisms, and/or masks, are employed.

An optics portion may also include other optical features required for digital camera functionality and/or performance. These may be features such as electrically tunable filters, polarizers, wavefront coding, spatial filters (masks), and other features not yet anticipated. Some of these features (other than lenses) may be operated electrically (such as a tunable filter) or moved mechanically with a MEMS mechanism.

With reference to Figs. 10A-10F, an optics portion, e.g., optics portion 330A, may include, for example, any number of lens elements, optical coatings, wavelength filters, optical polarizers, and/or combinations thereof. Other optical elements may be included in the optical stack to produce a desired optical characteristic. Fig. 10A is a schematic representation of an embodiment of optics portion 330A in which the optics portion 330A includes a single lens element 390. Fig. 10B is a schematic representation of another embodiment of optics portion 330A in which the optics portion 330A includes two or more lens elements, e.g., lens elements 392A, 392B. The parts of an optics portion may be separate from one another, integral with one another, and/or any combination thereof. Thus, for example, the two lens elements 392A, 392B represented in Fig. 10B may be separate from one another or integral with one another.

Figs. 10C-10F show schematic representations of exemplary embodiments of optics portion 330A in which the optics portion 330A has one or more lens elements, e.g., lens elements 394A, 394B, and one or more filters, e.g., filter 394C. The one or more lens elements and the desired optical features and/or elements may be separate from one another, integral with one another, and/or any combination thereof. Further, the one or more lens element features and/or elements may be provided in any configuration and/or order, for example a lens-filter order (see, e.g., Fig. 10C), a lens-coding order (see, e.g., Fig. 10D), a lens-polarizer order (see, e.g., Fig. 10E), a lens-filter-coding-polarizer order (see, e.g., Fig. 10F), and combinations and/or variations thereof.

In certain embodiments, the filter 394C shown in Fig. 10C is a color filter made in the lens, deposited in the optical system, or deposited on a lens surface as a separate layer on a support structure. The filter may be a single-bandpass filter or a multiple-bandpass filter. The coding 396C (Fig. 10D) may be applied to or formed on a lens, and/or provided as a separate optical element. In certain embodiments, the coding 396C serves to modify the optical wavefront, to enable improved imaging capability with additional post image processing. The optical polarizer 400E (Fig. 10E) may be of any type used to improve image quality, such as glare reduction. The polarizer 400E may be applied to or formed on one or more optical surfaces and/or provided as a dedicated optical element.
Figure 10 G-10H is schematically illustrating according to the optics part of other embodiment of the present invention.
As mentioned above, the part of optics part can be separated from each other, integrate each other and/or according to any combination of this dual mode.If these parts are separated, then these parts can separate each other, contact with each other or according to any combination of this dual mode.For example, two or more lens that separate can separate each other, contact with each other or according to any combination of this dual mode.Therefore, some embodiment of optics part shown in Figure 10 G can implement with the lens element 402A-402C that separates each other, as schematically showing among Figure 10 I, perhaps implement, as schematically showing among Figure 10 I with two or more lens element 402A-402C that contact with each other.In addition, for example the filter of 402D can for example be embodied as independent element 402D, as schematically showing among Figure 10 G, perhaps for example is embodied as the coating 402D that is arranged on the lens surface, as schematically showing among Figure 10 J.This coating can have any suitable thickness, and can be for example thin than lens thickness, as schematically showing among Figure 10 K.Similarly, some embodiment of optics part shown in Figure 10 H can implement with the lens element 404A-404D that separates each other, as schematically showing among Figure 10 H, perhaps implement, as schematically showing among Figure 10 L with two or more lens element 404A-404D that contact with each other.For example the filter of 404E can for example be embodied as independent element 404E, as schematically showing among Figure 10 H, perhaps for example is embodied as the coating 404E that is arranged on the lens surface, as schematically showing among Figure 10 M.This coating can have any suitable thickness, and can be for example thin than lens thickness, as schematically showing among Figure 10 N.
It should be appreciated that these techniques may be used in combination with any of the embodiments disclosed herein; however, for the sake of brevity, such combinations may or may not be separately illustrated and/or discussed herein.
In addition, as with each embodiment disclosed herein, it should be understood that any of the embodiments of Figures 10A-10N may be used in combination with any other embodiment disclosed herein, or portions thereof. Thus, for example, the embodiments of the optics portions shown in Figures 10G-10N may further include coding elements and/or polarizers.
One or more of the camera channels, for example channels 350A-350D, may employ an optics portion that transmits a relatively narrow wavelength band (as compared to a broadband), for example R, G or B, which helps simplify the optical design in some embodiments. For example, in some embodiments, image sharpness and focus are easier to achieve using optics portions with narrow pass bands than with a traditional digital camera that uses a single optical assembly and a Bayer color filter array. In some embodiments, using multiple camera channels to detect different color bands allows the number of optical elements in each camera channel to be reduced. Additional optical means, such as diffractive and aspheric surfaces, may allow a further reduction in optical elements. In addition, in some embodiments, the narrow-band transmission of the optics portion allows the color filters employed to be incorporated directly into the optical material or applied as a coating. In some embodiments, the transmittance within each band is greater than that of the color filters traditionally used in an on-chip color filter array. Moreover, the transmitted light does not exhibit the pixel-to-pixel variation observed with the color filter array approach. Further, in some embodiments, the use of multiple optics portions and corresponding sensor arrays helps simplify the optical design and reduce the number of elements, because chromatic aberration is much smaller within a narrow wavelength band than across a broadband.
In some embodiments, each optics portion transmits a single color or color band, multiple colors or color bands, or a broadband. In some embodiments, one or more polarizers that polarize the light may be used to enhance image quality.
In some embodiments, if an optics portion transmits multiple color bands or a broadband, a color filter array (for example, one with a Bayer pattern) may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor array capable of separating those colors or color bands.
In some embodiments, the optics portion itself may be adapted to provide a color separation capability similar to that provided by a color filter array (for example, a Bayer pattern or variants thereof).
In some embodiments, many kinds of optical materials may be selected for the optics portion, including but not limited to molded glass and plastics.
In some embodiments, one or more photochromic materials are employed in one or more of the optics portions. The one or more materials may be incorporated into an optical lens element, or into another feature in the optical path, for example, above one or more of the sensor arrays. In some embodiments, a photochromic material may be incorporated into a cover glass at the camera entrance (a common aperture shared by all of the camera channels) of the entire optics assembly, or into a lens of one or more of the camera channels, or into one or more other optical features included in the optical path of the optics portion above any of the sensor arrays.
Some embodiments employ an optical design with a single lens element. Some other embodiments employ a lens with multiple lens elements (for example, two or more elements). A lens with multiple lens elements may be used, for example, to help provide optical performance superior to that achievable over a wide wavelength band (as in a conventional digital imager having a color filter array on the sensor array). For example, some multi-element lens assemblies employ combinations of individual elements to help minimize total aberration. Because some lens elements have positive aberration while others have negative aberration, the total aberration can be reduced. The lens elements may be made of different materials, may have different shapes, and/or may define different surface curvatures. In this way a predetermined response can be obtained. The process of determining a suitable and/or optimal lens configuration is typically carried out by a lens designer with the aid of suitable computer software.
Some embodiments employ an optics portion having three lens elements, or lenslets. The three lenslets may be arranged in a stack of any configuration and spaced apart from one another, with each lenslet defining two surface profiles, so that the stack defines a total of six surface curvatures and two (inter-lenslet) air spaces. In some embodiments, a lens with three lenslets provides enough degrees of freedom to allow the designer to correct all third-order aberrations and two chromatic aberrations, and to provide the lens with an effective focal length, although this is not a requirement of every embodiment, nor of every embodiment employing three lenslets arranged in a stack.
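As a rough illustration of the degree-of-freedom argument above, one can tally the design variables of a three-lenslet stack against the correction targets. This is a sketch under simplifying assumptions (it counts only the curvatures and air spaces named in the text; a real designer may also free material choices, thicknesses and aspheric terms), not a statement of the actual design procedure:

```python
# Degree-of-freedom tally for a stack of lenslets (illustrative only).
# Assumption: each lenslet contributes 2 surface curvatures, and adjacent
# lenslets contribute one air space each, as described in the text.
def design_variables(num_lenslets):
    curvatures = 2 * num_lenslets     # two surface profiles per lenslet
    air_spaces = num_lenslets - 1     # gaps between adjacent lenslets
    return curvatures + air_spaces

# Targets: the five third-order (Seidel) aberrations, two chromatic
# aberrations, and one constraint fixing the effective focal length.
SEIDEL = 5
CHROMATIC = 2
FOCAL_LENGTH = 1
targets = SEIDEL + CHROMATIC + FOCAL_LENGTH   # 8 constraints

variables = design_variables(3)               # 6 curvatures + 2 spaces = 8
print(variables, targets)                     # 8 8
```

Under this bookkeeping the three-lenslet stack has just enough variables to meet the targets, which is consistent with the claim that three lenslets suffice for the corrections named above.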
Thus, Figures 11A-11B are, respectively, a schematic diagram and an end view of a lens 410, according to another embodiment of the present invention, suitable for use in an optics portion that transmits red light or a red band, for example for a red camera channel. In this embodiment, the lens 410 comprises three lenslets arranged in a stack 418, namely a first lenslet 412, a second lenslet 414 and a third lenslet 416. The lens 410 receives light within a field of view and transmits and/or shapes at least a portion of that light to produce an image in an image region of an image plane 419. Specifically, the first lenslet 412 receives light within the field of view and transmits and/or shapes at least a portion of that light. The second lenslet 414 receives at least a portion of the light transmitted and/or shaped by the first lenslet, and transmits and/or shapes a portion of that light. The third lenslet 416 receives at least a portion of the light transmitted and/or shaped by the second lenslet, and transmits and/or shapes a portion of that light to produce an image in the image region of the image plane 419.

Figures 12A-12B are, respectively, a schematic diagram and an end view of a lens 420, according to another embodiment of the present invention, suitable for use in an optics portion that transmits green light or a green band, for example for a green camera channel. In this embodiment, the lens 420 comprises three lenslets arranged in a stack 428, namely a first lenslet 422, a second lenslet 424 and a third lenslet 426. The stack 428 receives light within a field of view and transmits and/or shapes at least a portion of that light to produce an image in an image region of an image plane 429. Specifically, the first lenslet 422 receives light within the field of view and transmits and/or shapes at least a portion of that light. The second lenslet 424 receives at least a portion of the light transmitted and/or shaped by the first lenslet, and transmits and/or shapes a portion of that light. The third lenslet 426 receives at least a portion of the light transmitted and/or shaped by the second lenslet, and transmits and/or shapes a portion of that light to produce an image in the image region of the image plane 429.

Figures 13A-13B are, respectively, a schematic diagram and an end view of a lens 430, according to another embodiment of the present invention, suitable for use in an optics portion that transmits blue light or a blue band, for example for a blue camera channel. In this embodiment, the lens 430 comprises three lenslets arranged in a stack 438, namely a first lenslet 432, a second lenslet 434 and a third lenslet 436. The lens 430 receives light within a field of view and transmits and/or shapes at least a portion of that light to produce an image in an image region of an image plane 439. Specifically, the first lenslet 432 receives light within the field of view and transmits and/or shapes at least a portion of that light. The second lenslet 434 receives at least a portion of the light transmitted and/or shaped by the first lenslet, and transmits and/or shapes a portion of that light. The third lenslet 436 receives at least a portion of the light transmitted and/or shaped by the second lenslet, and transmits and/or shapes a portion of that light to produce an image in the image region of the image plane 439.

Figure 14 is a schematic diagram of a lens 440, according to another embodiment of the present invention, suitable for use in an optics portion that transmits red light or a red band, for example for a red camera channel. The lens 440 of this embodiment may be characterized by a 60 degree full field of view. In this embodiment, the lens 440 comprises three lenslets arranged in a stack 448, namely a first lenslet 442, a second lenslet 444 and a third lenslet 446. The lens 440 receives light within the field of view and transmits and/or shapes at least a portion of that light to produce an image in an image region of an image plane 449. Specifically, the first lenslet 442 receives light within the field of view and transmits and/or shapes at least a portion of that light. The second lenslet 444 receives at least a portion of the light transmitted and/or shaped by the first lenslet, and transmits and/or shapes a portion of that light. The third lenslet 446 receives at least a portion of the light transmitted and/or shaped by the second lenslet, and transmits and/or shapes a portion of that light to produce an image in the image region of the image plane 449.
Figures 15A-15F are schematic illustrations of some other types of lenses that may be employed. Specifically, Figures 15A-15E are schematic illustrations of other lenses 450-458, each comprising a stack of three lenslets 450A-450C, 452A-452C, 454A-454C, 456A-456C and 458A-458C, respectively. Figure 15F is a schematic illustration of a lens 460 having only a single lens element. It should be appreciated, however, that an optics portion may have any number of components and any configuration.
Figures 16A-16C are representations of an embodiment of a sensor array, for example sensor array 310A, and of circuitry, for example circuitry 470-476, connected to the sensor array. The purpose of a sensor array, for example sensor array 310A, is to capture light and convert it into one or more signals (for example, electrical signals) representative of that light, which are supplied to one or more circuits connected to the sensor array, for example as described below. Referring to Figure 16A, the sensor array comprises a plurality of sensor elements, for example a plurality of identical photodetectors (sometimes referred to as "picture elements" or "pixels"), for example pixels 480(1,1)-480(n,m). The photodetectors, for example photodetectors 480(1,1)-480(n,m), are arranged in an array, for example a matrix-type array. The number of pixels in the array may range, for example, from hundreds of thousands to millions. The pixels may, for example, be arranged in a two-dimensional array configuration having a plurality of rows and a plurality of columns, for example 640x480, 1280x1024, and so on. The pixels may, however, be of any size and dimension as desired, and may be distributed in any desired pattern; pixels that are not distributed in any regular pattern may even be used. Referring to Figure 16B, a pixel, for example pixel 480(1,1), can be considered to have dimensions, for example x and y dimensions, although it will be appreciated that the photon capture portion of a pixel may or may not occupy the entire area of the pixel, and may or may not have a regular shape. In some embodiments, the sensor elements are disposed in a plane, referred to herein as a sensor plane. The sensor may have orthogonal sensor reference axes, including for example an x axis, a y axis and a z axis, and may be configured so that the sensor plane is parallel to the XY plane and directed toward the optics portion of the camera channel. Each camera channel has a field of view corresponding to an expanse viewable by its sensor array. Each sensor element may, for example, be associated with a respective portion of that field of view.
The sensor arrays may employ any type of technology, including but not limited to MOS pixel technologies (meaning that one or more portions of the sensor are implemented in "metal oxide semiconductor" technology), charge coupled device (CCD) pixel technologies, or a combination (hybrid) of the two, and may comprise any one or more suitable materials, including for example silicon, germanium, and/or combinations thereof. The sensor elements, or pixels, may be formed in any suitable manner.
In operation, a sensor array, for example sensor array 310A, is exposed to light, for example row by row (similar to a scanner) or all at once (similar to the exposure of a traditional film camera). After being exposed to light for a given period (the exposure time), the pixels, for example pixels 480(1,1)-480(n,m), may be read out, for example row by row.
In some embodiments, circuitry sometimes referred to as row logic, for example row logic 470, is used to read the signals from the pixels, for example pixels 480(1,1)-480(n,m). Referring to Figure 16C, which is a schematic illustration of a pixel circuit, in some such embodiments the sensor elements, for example pixel 480(1,1), may be accessed one row at a time by asserting one of the word lines, for example word lines 482, which run horizontally through the sensor array, for example sensor array 310A. Data may be passed into and/or out of the sensor elements, for example pixel 480(1,1), via bit lines, which run vertically through the sensor array, for example sensor array 310A.
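The row-by-row readout described above can be sketched in software. This is a deliberately simplified model for illustration (the array, word-line and bit-line behavior here are abstractions, not the circuit of Figure 16C):

```python
# Simplified model of word-line/bit-line readout: asserting a word line
# selects one row of pixels; each column's bit line then carries that
# pixel's value out of the array.
def read_out(pixel_array):
    """Read a 2D pixel array one row at a time, as row logic would."""
    frame = []
    for row in pixel_array:          # assert one word line per iteration
        frame.append(list(row))      # each column's bit line yields one value
    return frame

sensor = [[10, 20], [30, 40], [50, 60]]   # 3 rows x 2 columns
frame = read_out(sensor)
print(frame)   # [[10, 20], [30, 40], [50, 60]]
```

The point of the model is only the access pattern: one word line at a time selects a row, and the bit lines deliver that row's values in parallel.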
It will be appreciated that the pixels are not limited to the configuration shown in Figures 16A-16C. As stated above, each of the one or more sensor arrays may have any configuration (for example, size, shape, pixel design).
The sensor arrays, for example sensor arrays 310A-310D, may be the same as or different from one another. For example, in some embodiments the sensor arrays are identical to one another, while in some other embodiments one or more of the sensor arrays differ in one or more respects from one or more of the other sensor arrays. For example, in some embodiments, one or more characteristics of one or more sensor arrays (such as, but not limited to, the type of elements, their size (for example, surface area) and/or their performance) are adapted to the corresponding optics portion and/or help achieve a desired result. For example, if a particular camera channel is dedicated to a particular color (or color band) or wavelength (or wavelength band), the sensor array for that camera channel may be adapted to have a higher sensitivity to that particular color (or color band) or wavelength (or wavelength band) than to other colors or wavelengths, and/or to sense only that particular color (or color band) or wavelength (or wavelength band). In some such embodiments, the pixel shape (for example, the shape of the active region of the pixel (the photosensitive surface area of the pixel)), the pixel design, the operation, the array size (for example, the surface area of the active portion of the array) and/or the pixel size (for example, the effective area of the pixel surface) of the sensor array are determined, selected, tailored and/or optimized for the particular wavelength or wavelength band to which the camera channel is dedicated. It should be appreciated, however, that any other configuration may be used; each of the one or more sensor arrays may have any configuration (for example, size and shape).
As described herein, each sensor array may, for example, be dedicated to a specific band of light (visible and/or invisible), for example one color or color band. If so, each sensor array may be tuned to capture and/or process one or more images within its specific band more efficiently.
In this embodiment, the well depth of the photodetectors is the same across each individual array, although in some other embodiments the well depths may differ. For example, the well depth of any given array can readily be manufactured to differ from that of the other arrays of the sensor subsystem. The choice of a suitable well depth may depend on several factors, most likely including the visible spectral band that the array is aimed at. Because an entire array may be dedicated to one visible spectral band (for example, red), the well depth can be designed to capture that wavelength and to disregard others (for example, blue, green).
Within an array for a particular color, doping of the semiconductor material can enhance the selectivity of photon absorption for the wavelengths of that particular color.
In some embodiments, a pixel may respond to one specific color or color band (that is, one wavelength or wavelength band). For example, in some such embodiments, the optics portion may comprise lenses and/or filters that transmit only that particular color or color band and/or attenuate the wavelengths or wavelength bands associated with other colors or color bands. In some other such embodiments, color filters and/or a color filter array are disposed above and/or on one or more portions of one or more of the sensor arrays. In some other embodiments, no color filter or color filter array is disposed on any of the sensor arrays. In some embodiments, the sensor array itself separates the colors or color bands. In some such embodiments, the sensor array may be provided with pixels having a multi-band sensing capability, for example for two or three colors. For example, each pixel may comprise two or three photodiodes, wherein a first photodiode is adapted to detect a first color or color band, a second photodiode is adapted to detect a second color or color band, and a third photodiode is adapted to detect a third color or color band. One way to accomplish this is to give the photodiodes different structures/characteristics that make them selective, so that the first photodiode is more sensitive to the first color or color band than to the second, and the second photodiode is more sensitive to the second color or color band than to the first. Another way is to position the photodiodes at different depths within the pixel, exploiting the fact that different colors or color bands have different penetration and absorption characteristics. For example, blue and the blue band penetrate less deeply (and are therefore absorbed at a shallower depth) than green and the green band, and green and the green band in turn penetrate less deeply (and are likewise absorbed at a shallower depth) than red and the red band. In some embodiments, such a sensor array is used even where the pixels may see only one particular color or color band, for example to adapt the sensor array to that particular color or color band. Indeed, a layer of material that attenuates specific wavelengths while passing others may be disposed on, or integrated into, the photodiode surface. In this way, each pixel can act as a plurality of photodiodes adapted to sense a plurality of wavelengths.
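The depth ordering described above (blue absorbed shallowest, red deepest) can be illustrated numerically. The absorption lengths below are approximate textbook values for silicon, not figures from this disclosure (actual values vary with temperature and doping); the fraction of photons absorbed within a given depth then follows the Beer-Lambert law:

```python
import math

# Approximate 1/e absorption lengths in silicon (micrometers); these are
# illustrative textbook values, not data from this disclosure.
ABSORPTION_LENGTH_UM = {"blue_450nm": 0.4, "green_550nm": 1.5, "red_650nm": 3.3}

def fraction_absorbed(depth_um, absorption_length_um):
    """Beer-Lambert: fraction of photons absorbed within depth_um."""
    return 1.0 - math.exp(-depth_um / absorption_length_um)

# At a shallow 0.5 um junction, blue is mostly collected while most red
# photons pass through to greater depths.
for band, length in ABSORPTION_LENGTH_UM.items():
    print(band, round(fraction_absorbed(0.5, length), 2))
# blue_450nm 0.71
# green_550nm 0.28
# red_650nm 0.14
```

This is why a junction buried deep in the semiconductor favors red collection, while a near-surface junction favors blue, as in the tuned pixels of Figures 17C-17F discussed below.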
Figure 17A is a plan view of a portion of a sensor array, for example sensor array 310A, according to an embodiment of the present invention. This portion of the array comprises six unit cells, for example unit cells 490(i,j)-490(i+2,j+1). Each unit cell has a pixel region; for example, unit cell 490(i+2,j+1) has pixel region 492(i+2,j+1). The pixel region may, for example, be, but is not limited to, a p-type implant region. The sensor elements, for example pixels 492(i,j)-492(i+2,j+1), may be accessed one row at a time, for example by asserting one of the word lines, for example word lines 494, which run horizontally through the sensor array, for example sensor array 310A. Power may be supplied on power lines, for example power line 496, which may, for example, run vertically through the sensor array. Data may be passed into and/or out of the sensor elements, for example pixels 492(i,j)-492(i+2,j+1), via bit lines, for example bit line 498, which may, for example, run vertically through the sensor array, for example sensor array 310A.
In some embodiments, each sensor array has 1.3M pixels. In such embodiments, three camera channels can provide an effective resolution of about 4M pixels, and four camera channels can provide an effective resolution of about 5.2M pixels.

In some other embodiments, each sensor array has 2M pixels. In such embodiments, three camera channels can provide an effective resolution of about 6M pixels, and four camera channels can provide an effective resolution of about 8M pixels.
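The effective-resolution figures above appear to be the per-array pixel count multiplied by the number of camera channels, since each channel carries its own full-resolution array; a quick arithmetic check:

```python
def effective_resolution(pixels_per_array, num_channels):
    """Effective pixel count when each camera channel has its own array."""
    return pixels_per_array * num_channels

assert effective_resolution(1_300_000, 3) == 3_900_000   # about 4M
assert effective_resolution(1_300_000, 4) == 5_200_000   # 5.2M
assert effective_resolution(2_000_000, 3) == 6_000_000   # 6M
assert effective_resolution(2_000_000, 4) == 8_000_000   # 8M
```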
It will be appreciated that the sensor arrays are not limited to the design shown in Figure 17A. As stated above, each of the one or more sensor arrays may have any configuration (for example, size, shape, pixel design).
Figure 17B is a cross-sectional view of the implant portion of a pixel having a single well for capturing all wavelengths.

By contrast, Figure 17C is a cross-sectional view of the implant portion of a pixel having a well formed "deep" in the semiconductor (for example, silicon), so that the implant depth is suited or adapted to improve the capture or collection of light of wavelengths in the range associated with, for example, red. Thus, the embodiment shown in Figure 17C includes a deep junction implant to produce a highly efficient red detector, in which photons are collected, detected or captured deep in the semiconductor. In this embodiment, it may be advantageous to employ a color filter, or otherwise optically filter the light before it is incident on the pixel, to substantially attenuate light of wavelengths associated with colors other than red (that is, photons of wavelengths outside the range associated with red).

The well depth of a pixel or photodetector can be predetermined, selected and/or designed to tune its response as appropriate. Thus, referring to Figure 17D, a pixel is shown that is "tuned" to capture, collect or respond to photons of wavelengths in the range associated with blue. The cross-sectional view of the implant portion of the pixel includes a well formed "near the surface" of the semiconductor (for example, silicon), so that the implant depth is suited or adapted to improve the capture or collection of light of wavelengths in the range associated with blue. A shallow junction is thus formed in the semiconductor, optimized to collect, detect or capture, near the detector surface, wavelengths in the range associated with blue. In this embodiment, because the collection region is selectively confined to a certain depth, a filter may be omitted. That is, filter material may be unnecessary because green and red photons pass through the collection region, which primarily collects, detects or captures the blue signal (photons of wavelengths in the range associated with blue).
Referring to Figure 17E, a pixel or photodetector may be "tuned" to capture, collect or respond to photons of wavelengths primarily in the range associated with red. Here, the well region is formed and/or confined at a depth primarily associated with red wavelengths.

Referring to Figure 17F, a pixel or photodetector may be "tuned" to capture, collect or respond to photons of wavelengths primarily in the range associated with green. Here, the well region is formed and/or confined at a depth primarily associated with green wavelengths.
It should be noted that a pixel or photodetector may be "tuned" to capture, collect or respond to photons of wavelengths primarily in the range associated with any color. Thus, the well region of the pixel or photodetector is formed and/or confined at a depth primarily associated with the wavelengths of the color to be captured or collected. In these embodiments, a given collection region can be formed by burying a junction in the semiconductor base material. In this case, wavelength selectivity can be achieved simply by varying the depth and shape of the buried junction. Together with the optical path, this further selectivity and wavelength responsivity can allow single or multiple bands to reach the detector.
A pixel or photodetector may also be "tuned" to capture, collect or respond to photons of wavelengths primarily in the ranges associated with multiple colors. For example, referring to Figure 17G, a first pixel (on the left) includes well regions formed and/or confined at the depths primarily associated with the wavelengths of red (deeper) and blue (shallower). In this way, the pixel or photodetector is "tuned" to capture or collect incident photons of wavelengths primarily in the ranges associated with two colors. The pixel on the right includes a well region formed and/or confined at the depth associated with the wavelengths of a single color, here green. A sensor array may include one, some or all pixels of either type (left or right), and may also include a pattern of the two pixel types.
It should be noted that a pixel or photodetector may be "tuned" to capture, collect or respond to photons of wavelengths primarily in the ranges associated with any two or more colors (provided such colors are sufficiently separated to permit suitable sensing). (See, for example, Figure 17H: blue and green sensed by the pixel on the left, green and red sensed by the pixel on the right.)
There are many embodiments involving tuning of the well depths and/or regions of the pixels or photodetectors, for example:

- λ3/λ2/λ1 (e.g. R/G/B) color filter array on each pixel
- λ3/λ2/λ1 (e.g. R/G/B) photodiodes in each pixel
- λ3/λ1 (e.g. R/B) photodiodes in one pixel, λ2 (e.g. G) in another pixel
- λ3/λ2/λ1 (e.g. R/G/B) photodiodes in one pixel
- λ4/λ2 (e.g. R/G1) photodiodes in one pixel, λ3/λ1 (e.g. G2/B) in another pixel
- λ4/λ3/λ2/λ1 (e.g. R/G2/G1/B) color filter array on each pixel
- λ4/λ3/λ2/λ1 (e.g. R/G2/G1/B) photodiodes in one pixel
- λ4/λ3/λ2/λ1 (e.g. R/G2/G1/B) photodiodes in each pixel

Note: λ1 through λ4 represent wavelength bands of increasing wavelength, and may range from UV to IR (for example, 200-1100 nm for silicon photodiodes).
All embodiments involving tuning of the well depths and/or regions of the pixels or photodetectors are intended to fall within the scope of the present invention, and may accordingly be implemented in any of the embodiments described and illustrated herein.
Generally speaking, because each photodetector array is separate from the others (unlike a traditional array, whose adjacent photodetectors are so close together that they can only be processed in a similar manner), a variety of implant and junction configurations can be realized with the present invention. A variety of photodetector topologies can be realized using one or more of the above techniques and/or embodiments, in combination with detectors or filters for specific wavelengths.
The configuration of a sensor array (for example, the number, shape, size, type and layout of its sensor elements) can influence the characteristics of the sensed image. For example, Figures 18A-18B are illustrative representations of an image captured by a portion of a sensor array, for example sensor array 310A. Specifically, Figure 18A is an illustration of an image of an object (a lightning bolt) striking the portion of the sensor array. In this illustration, the photon capture portions (or active areas) of the sensor elements, for example photon capture portion 502, are represented generally by circles, although in practice a pixel may have any shape, including for example an irregular shape. In this example, photons that strike the photon capture portion or active area of a pixel or photodetector (for example, photons impinging within one of the circles) are sensed and/or captured by that photon capture portion or active area. Figure 18B illustrates the portion of the photons captured by the sensor in this embodiment, for example portion 504. Photons that do not strike a sensor element (for example, photons impinging outside the circles) are not sensed or captured.
Figures 19A-19B are schematic illustrations of an image captured by a portion of a sensor, for example sensor array 310A, in which the sensor portion provides more sensor elements, spaced more closely together, than the sensor of Figure 18A. Specifically, Figure 19A illustrates an image of the object (the lightning bolt) striking this sensor. In this example, photons that strike a photon capture portion, for example photon capture portion 506, are sensed and/or captured by that photon capture portion. Figure 19B illustrates the portion of the photons captured by the sensor in this example, for example portion 508. Note that the sensor of Figure 19A captures more of the photons than the sensor of Figure 18A.
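The effect illustrated above can be approximated as a fill-factor calculation. This sketch assumes circular capture areas on a square pixel grid, which the text notes need not hold in practice; the fraction of incident photons captured grows as the capture circles cover more of the sensor surface:

```python
import math

def fill_factor(pixel_pitch_um, capture_radius_um):
    """Fraction of the sensor surface covered by circular capture areas,
    assuming one circle per square pixel of the given pitch."""
    circle_area = math.pi * capture_radius_um ** 2
    pixel_area = pixel_pitch_um ** 2
    return circle_area / pixel_area

# Coarse grid (Figure 18A-like) vs. a denser grid whose capture areas are
# relatively larger (Figure 19A-like): the denser sensor captures more photons.
coarse = fill_factor(pixel_pitch_um=10.0, capture_radius_um=3.0)
dense = fill_factor(pixel_pitch_um=5.0, capture_radius_um=2.0)
print(round(coarse, 2), round(dense, 2))   # 0.28 0.5
```

The pitch and radius values here are hypothetical; the comparison, not the absolute numbers, is the point.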
Figures 20A-20B are schematic diagrams of the relative positioning provided, in some embodiments, between an optics portion, for example optics portion 330A, and a corresponding sensor array, for example sensor array 310A. In this regard, although Figures 20A-20B show an optics portion having an axis, for example axis 510A, aligned with an axis of the sensor array, for example axis 512A, it should be appreciated that some embodiments may not employ such alignment. Moreover, in some embodiments, the optics portion and/or the sensor array may not have an axis.
Figure 21 is a schematic diagram of the relative positioning provided, in some embodiments, between four optics portions, for example optics portions 330A-330D, and four sensor arrays, for example sensor arrays 310A-310D. Although Figure 21 shows each optics portion, for example optics portion 330B, having an axis, for example axis 510B, aligned with an axis of the corresponding sensor array, for example axis 512B of sensor array 310B, it should be understood that some embodiments may not employ such alignment. Moreover, in some embodiments, the one or more optics portions and/or one or more sensor arrays may not have an axis.
In some embodiments, an optics portion is approximately the same overall size as the corresponding sensor array, and thus, depending on the dimensions of the underlying arrays, the optics portions may differ from one another in size and shape. A given optics portion is not, however, required to cover all of, or only, the underlying array. In some alternative embodiments, an optics portion may cover only a portion of the array and/or may extend beyond the array.
Figures 22A-22B are, respectively, a plan view and a cross-sectional view of one embodiment of an imager device 520, and of the image regions of the corresponding optics portions, for example optics portions 330A-330D, according to an embodiment of the present invention; one or more sensor arrays, for example sensor arrays 310A-310D, may be disposed and/or integrated in or on the imager device 520. In this embodiment, the imager device 520 has first and second major surfaces 522, 524 and an outer perimeter defined by edges 526, 528, 530, 532. The imager device 520 defines one or more regions, for example regions 534A-534D, for the active areas of the one or more sensor arrays, for example sensor arrays 310A-310D, respectively. The imager device also defines one or more regions, for example regions 536A-536D and 538A-538D, for the buffers and/or logic associated with the one or more sensor arrays, for example sensor arrays 310A-310D, respectively.
This image device can also limit near the circumference that is arranged on this image device (for example along and adjacent to one, two, three of this image device or four edges extending) and/or the one or more additional district that is used between the district of sensor array for example distinguishes 540,542,544,546.In certain embodiments, the pad of one or more conductions for example fills up 550,552,554,556, one or more parts of processor, one or more parts of additional memory and/or the circuit or the feature of any other type can be arranged in one or more these districts or its part.One or more such pads can be used for providing one or more signals of telecommunication and/or the one or more circuit from this image device to offer one or more other circuit that are positioned on this image device or leave this image device.
In certain embodiments, main outer surface is defined for for example one or more stayed surfaces of one or more parts of strutting piece 320 of supported part.This stayed surface can be arranged on any district and for example distinguish in 540-546 or its part, but in certain embodiments, advantageously, stayed surface is positioned at outside the effective coverage of sensor array, thereby does not disturb pixel in these zones to the seizure of photon.
One or more optics portions, e.g., optics portions 330A-330D, produce image areas, e.g., image areas 560A-560D, respectively, on an image plane.

The image device, the sensor arrays, and the image areas may each have any size and shape. In some embodiments, each image area is approximately the same overall size as the corresponding sensor array and thus, depending on the dimensions of the underlying sensor array, the image areas may differ from one another in size and shape. Of course, an image area is not required to cover all of, or only, the underlying array. In alternative embodiments, an image area may cover only a portion of the array and/or may extend beyond the array.

In this embodiment, the image areas, e.g., image areas 560A-560D, extend beyond the outer perimeters of the sensor arrays, e.g., sensor arrays 310A-310D, respectively. The image device has a generally square shape, with a first dimension 562 approximately equal to 10 mm and a second dimension 564 approximately equal to 10 mm, and each quadrant has a first dimension 566 equal to 5 mm and a second dimension 568 equal to 5 mm. Each image area has a generally circular shape with a width or diameter 570 approximately equal to 5 millimeters (mm). Each active area has a generally rectangular shape, with a first dimension 572 approximately equal to 4 mm and a second dimension 574 approximately equal to 3 mm. The active area may define, for example, a matrix of 1200 x 900 pixels (i.e., 1200 columns by 900 rows).
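As a rough arithmetic check of these example figures (a back-of-the-envelope sketch, not part of the specification), the pixel pitch implied by a 4 mm x 3 mm active area holding a 1200 x 900 matrix can be computed directly:

```python
# Pixel pitch implied by the stated active-area dimensions and pixel matrix.
# Dimensions 572 (4 mm) and 574 (3 mm) are from the text; the resulting
# pitch is a derived figure, not a value stated in the specification.
ACTIVE_W_MM, ACTIVE_H_MM = 4.0, 3.0   # first dimension 572, second dimension 574
COLS, ROWS = 1200, 900                # stated pixel matrix (columns x rows)

pitch_x_um = ACTIVE_W_MM * 1000 / COLS   # microns per column
pitch_y_um = ACTIVE_H_MM * 1000 / ROWS   # microns per row

print(f"pitch: {pitch_x_um:.2f} um x {pitch_y_um:.2f} um")  # square pixels
print(f"pixels per array: {COLS * ROWS:,}")                 # 1,080,000
```

Both directions give the same pitch (about 3.33 microns), consistent with square pixels.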
Figures 23A-23B are a plan view and a cross-sectional view, respectively, of the image device and image areas according to another embodiment. In this embodiment, the image device 520 has one or more pads, e.g., pads 550-556, arranged in a configuration different from that of the one or more pads of the embodiment illustrated above. The image device 520, the sensor arrays, and the image areas, e.g., image areas 560A-560D, may, for example, have the same shapes and dimensions set forth above for the embodiment of the image device shown in Figures 22A-22B.

Figures 24A-24B are a plan view and a cross-sectional view, respectively, of the image device 520 and image areas according to another embodiment. In this embodiment, the image device 520 has a vertically extending region disposed between the sensor arrays that is narrower than the corresponding vertically extending region of the embodiment of the image device shown in Figures 22A-22B. The horizontally extending regions 542, 546 disposed along the perimeter are wider than the horizontally extending regions 542, 546 disposed along the perimeter of the image device 520 shown in Figures 22A-22B. The image device 520 may, for example, have the same shapes and dimensions set forth above for the embodiment shown in Figures 22A-22B.

Figures 25A-25B are a plan view and a cross-sectional view, respectively, of the image device 520 and image areas, e.g., image areas 560A-560D, according to another embodiment. In this embodiment, none of the image areas, e.g., image areas 560A-560D, extends beyond the outer perimeter of the respective sensor array, e.g., sensor arrays 310A-310D. The image device 520 and the sensor arrays may, for example, have the same shapes and dimensions set forth above for the embodiment of the image device 520 shown in Figures 22A-22B.
Figures 26A-26B are a plan view and a cross-sectional view, respectively, of the image device and image areas according to another embodiment. In this embodiment, the regions 540-546 disposed between the sensor arrays and the edges of the image device are wider than the regions 540-546 disposed between the sensor arrays and the edges of the image device in the embodiment of Figures 22A-22B. Such regions may be used, for example, for one or more pads, one or more portions of a processor, as a mounting region and/or seat for a support, and/or any combination thereof.

In addition, in this embodiment, the horizontally extending region disposed between the sensor arrays is wider than the horizontally extending region 546 disposed between the sensor arrays in the embodiment of Figures 22A-22B. Such a region 546 may be used, for example, for one or more pads, one or more portions of a processor, as a mounting region and/or seat for a support, and/or any combination thereof. The image device and the sensor arrays may have the same shapes and dimensions, for example, as set forth above.

Each embodiment disclosed herein may be used alone or in combination with one or more other embodiments disclosed herein, or portions thereof.

To that end, for example, Figures 27A-27B are a plan view and a cross-sectional view, respectively, of an image device 520 and image areas 560A-560D according to another embodiment. This embodiment of the image device 520 and image areas 560A-560D is similar to the embodiment of the image device and image areas shown in Figures 26A-26B, except that the image areas, e.g., image areas 560A-560D, do not extend beyond the outer perimeters of the respective sensor arrays, e.g., sensor arrays 310A-310D.
Figure 28A is a perspective view of a support 320 according to another embodiment of the present invention. The support 320 may have any configuration and may comprise, for example but not limited to, a frame. Figures 28B-28D are enlarged cross-sectional views of the support 320. Referring to Figures 28A-28D, the optics portions of one or more camera channels, e.g., optics portions 330A-330D, are supported by one or more supports, e.g., support 320, which at least in part positions each optics portion in registration with the corresponding sensor array. In this embodiment, for example, optics portion 330A is positioned in registration with sensor array 310A, optics portion 330B is positioned in registration with sensor array 310B, optics portion 330C is positioned in registration with sensor array 310C, and optics portion 330D is positioned in registration with sensor array 310D.

In some embodiments, the support 320 may also help limit, minimize, and/or eliminate optical "cross talk" between the camera channels and/or help limit, minimize, and/or eliminate light "leaking" in from outside the digital camera apparatus.

In some embodiments, the support 320 defines one or more support portions, e.g., four support portions 600A-600D, each of which supports and/or helps position a corresponding one of the one or more optics portions. For example, in this embodiment, support portion 600A supports and positions optics portion 330A in registration with sensor array 310A, support portion 600B supports and positions optics portion 330B in registration with sensor array 310B, support portion 600C supports and positions optics portion 330C in registration with sensor array 310C, and support portion 600D supports and positions optics portion 330D in registration with sensor array 310D.

In this embodiment, each support portion, e.g., support portions 600A-600D, defines an aperture 616 and a seat 618. The aperture 616 defines a light transmission path for the corresponding camera channel. The seat 618 is adapted to receive a corresponding optics portion (or a portion thereof) and to support and/or position, at least in part, the corresponding optics portion. To that end, the seat 618 may include one or more surfaces (e.g., surfaces 620, 622) adapted to abut one or more surfaces of the optics portion so as to support and/or position the optics portion, at least in part, relative to the support portion and/or one or more of the sensor arrays 310A-310D. In this embodiment, surface 620 is disposed about the perimeter of the optics portion to support and help position the optics portion in the x and y directions. Surface 622 (sometimes referred to as a "stop" surface) positions, or helps position, the optics portion in the z direction.
The position and/or orientation of the stop surface 622 may be adapted to position and/or orient the optics portion at a particular distance (or range of distances) relative to the corresponding sensor array. Thus, the depth of the seat 618 controls the position (e.g., installed position) of the lens in the support 320. This depth may be different for each lens and is based at least in part on the focal length of the lens. For example, if a camera channel is dedicated to a specific color (or band of colors), then the one or more lenses used for that camera channel may have a focal length particularly suited to the color (or band of colors) to which the camera channel is dedicated. If each camera channel is dedicated to a color (or band of colors) different from the colors (or bands of colors) of the other camera channels, then each lens may have a different focal length, e.g., to adapt the lens to the corresponding sensor array, and each seat may have a different depth.
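The point that per-color channels may need different seat depths follows from ordinary chromatic dispersion: a simple lens focuses shorter at blue wavelengths than at red because the refractive index varies with wavelength. The sketch below illustrates this with the thin-lens lensmaker's equation and a Cauchy index approximation; the Cauchy coefficients (roughly those of a BK7-like glass) and the lens curvatures are illustrative assumptions, not values from this specification.

```python
# Illustrative sketch: why a blue-dedicated and a red-dedicated camera
# channel would want different seat depths for otherwise identical lenses.
# Assumptions (not from the patent text): thin lens in air, Cauchy index
# model n(lambda) = A + B/lambda^2 with BK7-like coefficients.

def cauchy_index(wavelength_um, A=1.5046, B=0.00420):
    """Approximate refractive index via the Cauchy relation."""
    return A + B / wavelength_um**2

def thin_lens_focal_mm(n, r1_mm, r2_mm):
    """Lensmaker's equation for a thin lens in air: 1/f = (n-1)(1/R1 - 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm))

R1, R2 = 5.0, -5.0  # biconvex singlet, 5 mm radii of curvature (illustrative)
f_blue = thin_lens_focal_mm(cauchy_index(0.45), R1, R2)  # ~450 nm band
f_red = thin_lens_focal_mm(cauchy_index(0.65), R1, R2)   # ~650 nm band

# The blue channel focuses shorter, so its seat would sit at a different
# depth than the red channel's seat to keep each image plane on its array.
print(f"f(blue) = {f_blue:.3f} mm, f(red) = {f_red:.3f} mm")
print(f"focal shift between channels ~ {(f_red - f_blue) * 1000:.0f} um")
```

With these assumed values the blue/red focal difference is on the order of 0.1 mm, which is why the text has the seat depths, rather than the lenses alone, absorb the per-channel focus difference.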
Each optics portion may be secured in its corresponding seat 618 in any suitable manner, for example but not limited to mechanically (e.g., an interference fit or a physical stop), chemically (e.g., an adhesive), electrically (e.g., an electrical bond), and/or combinations thereof. The seat 618 may have dimensions adapted to provide an interference fit for the corresponding optics portion.

An aperture (or portion thereof) may have any configuration (e.g., shape and/or size), including, for example, cylindrical, conical, rectangular, irregular, and/or any combination thereof. The configuration may be based, for example, on the desired configuration of the optical path, the configuration of the corresponding optics portion, the configuration of the corresponding sensor array, and/or any combination thereof.

It should be understood that the support 320 may or may not have exactly four support portions, e.g., support portions 600A-600D. For example, in some embodiments the support includes fewer than four support portions (e.g., one, two, or three support portions). In other embodiments the support includes more than four support portions. Although the support portions are shown as identical to one another, this is not required. In addition, in some embodiments, one or more support portions may be at least partially isolated from one or more other support portions. For example, the support 320 may further define one or more gaps or spaces that partially isolate inner support portions from one or more other support portions.

The support 320 may comprise any type of material and may have any configuration and/or construction. For example, in some embodiments the support 320 comprises silicon, semiconductor, glass, ceramic, plastic, or metallic materials, and/or combinations thereof. If the support 320 has more than one portion, the portions may be fabricated separately from one another, integrally with one another, and/or by any combination of the two. If the support defines more than one support portion, each such support portion, e.g., support portions 600A-600D, may be coupled to one, some, or all of the other support portions, as shown, or may be completely isolated from the other support portions. The support may be, but need not be, a solid unit, which offers a wide choice of fabrication methods and materials. For example, in some embodiments the support 320 comprises a plate (e.g., a thin plate) defining the one or more support portions, in which the apertures and seats are formed by machining (e.g., drilling) or any other suitable method. In other embodiments, the support 320 is fabricated as a casting in which the apertures are defined (e.g., using a mold with protrusions that define the apertures and seats of the one or more support portions).
In some embodiments, the lenses and the support are fabricated as a single molded part. In some embodiments, the lenses may be fabricated with tabs that can be used to form the support.

In some embodiments, the support 320 is directly or indirectly coupled and/or attached to the image device. For example, the support 320 may be directly coupled and attached to the image device (e.g., using an adhesive), or indirectly coupled and/or attached to the image device through an intermediate support assembly (not shown).

The x and y dimensions of the support 320 may, for example, be approximately the same as those of the image device (in one or more dimensions), approximately the same as the layout of the optics portions 330A-330D (in one or more dimensions), and/or approximately the same as the layout of the sensor arrays 310A-310D (in one or more dimensions). An advantage of dimensioning the support in this way is that it helps keep the x and y dimensions of the digital camera apparatus as small as possible.
In some embodiments, it may be advantageous to provide the seat 618 with a height A equal to the height of the portion of the optics that will abut the surface 620. It may be advantageous to dispose the stop surface 622 at a height B (e.g., the distance between the stop surface 622 and the base of the support portion) at least equal to the height needed to allow the seat 618 to provide a firm hold on the optics portion (e.g., a lens) to be installed above it. The width or diameter C of the portion of the aperture 616 disposed above the height of the stop surface 622 may be based, for example, on the width or diameter of the optics portion (e.g., a lens) to be installed therein and on the method used to attach and/or retain the optics portion in the seat 618. The width of the stop surface 622 is preferably large enough to help provide a firm hold on the optics portion (e.g., a lens), yet small enough to minimize unwanted blocking of light transmitted through the optics portion. It may be desirable for the width or diameter D of the portion of the aperture 616 disposed below the height of the stop surface 622 to be large enough to help minimize unwanted blocking of light transmitted through the optics portion. In view of the above considerations, it may be desirable to provide the support with a height equal to the smallest dimension E needed to produce one or more supports rigid enough to support the optics portions to be installed therein, and it may be advantageous to make the spacing F between the one or more support portions 600A-600D, or apertures 616A-616D, as small as possible yet large enough to make the support rigid enough to support the optics portions to be installed therein. The support may have a length J and a width K.

In some embodiments, it may be desirable to provide the seat 618 with a height A equal to 2.2 mm, to provide the stop surface 622 at a height B in the range of 0.25 mm to 3 mm, to make the width or diameter C of the portion of the aperture above the height B of the stop surface approximately equal to 3 mm, to make the width or diameter D of the bottom of the aperture approximately equal to 2.8 mm, to provide the support portions with a height E in the range of 2.45 mm to 5.2 mm, and to space the apertures apart by a distance F of at least 1 mm. In some such embodiments, it may be desirable to provide a support with a length J equal to 10 mm and a width K equal to 10 mm. In other embodiments, it may be desirable to provide a support with a length J equal to 10 mm and a width K equal to 8.85 mm.
In some embodiments, one or more optics portions, e.g., optics portion 330A, comprise a cylindrical-type lens, for example the NT45-090 lens manufactured by Edmund Optics, although this is not required. Such a lens has a cylindrical portion with a diameter G of up to 3 millimeters (mm) and a height H of 2.19 mm. In these embodiments, it may be desirable to employ a support having the dimensions and ranges set forth in the preceding paragraph.

In some embodiments, the length J of the support equals 10 mm and the width K equals 10 mm. In other embodiments, it may be desirable to provide a support with a length J equal to 10 mm and a width K equal to 8.85 mm.
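The example dimensions above can be checked geometrically. Assuming, as a simplification not stated explicitly in the text, that one 3 mm aperture is centered in each 5 mm x 5 mm quadrant of a 10 mm x 10 mm support, a quick calculation confirms the stated minimum aperture spacing F of 1 mm is met:

```python
# Sanity check of the example support dimensions. The "one aperture
# centered per quadrant" layout is an illustrative assumption.
SUPPORT_J_MM = 10.0     # length J
SUPPORT_K_MM = 10.0     # width K
APERTURE_C_MM = 3.0     # upper aperture width/diameter C
QUADRANT_MM = 5.0       # each quadrant of the 10 mm plate is 5 mm x 5 mm

# Centers of adjacent apertures sit one quadrant apart, so the
# edge-to-edge spacing F between apertures is:
spacing_f_mm = QUADRANT_MM - APERTURE_C_MM
print(f"aperture spacing F = {spacing_f_mm} mm")  # 2.0 mm, >= 1 mm minimum

# Each aperture also stays inside its quadrant with this margin per side:
margin_mm = (QUADRANT_MM - APERTURE_C_MM) / 2
print(f"margin to quadrant edge = {margin_mm} mm")  # 1.0 mm
```

Under that assumed layout the 3 mm apertures leave a 2 mm web between channels, comfortably above the 1 mm minimum called out in the text.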
Figure 29A is a cross-sectional view of a support 320 according to another embodiment and the optics portions installed therein, e.g., optics portions 330A-330D. In this embodiment, the optics portions have an orientation opposite that of the optics portions of the embodiment of Figures 7A-7C.

Figure 29B is a cross-sectional view of a support according to another embodiment and the optics portions installed therein, e.g., optics portions 330A-330D. In this embodiment, each optics portion comprises a single lens element having a shank portion 702A-702D, respectively. The support 320 has an orientation opposite that of the support of the embodiment of Figures 6A-6C, such that the optics portions are installed on the stop surfaces 622A-622D, respectively, facing away from the sensor arrays (not shown).

It should be understood that the features of the various embodiments described herein may be used alone and/or in any combination.
Figures 30A-30D illustrate a support 320 having four support portions 600A-600D, each of which defines an aperture, e.g., apertures 616A-616D, for a corresponding optics portion, wherein the seat defined by one or more of the support portions, e.g., seat 618A of support portion 600A, is disposed at a depth 710A that is not equal to the depth of the seat of one or more other support portions, e.g., depth 710C of seat 618C, for example so as to adapt the one or more support portions to the focusing of the corresponding optics portions. As stated above, the position and/or orientation of the stop surface 622 may be adapted to position and/or orient the optics portion at a particular distance (or range of distances) relative to the corresponding sensor array; thus, the depth of the seat 618 controls the position (e.g., installed position) of the lens in the support 320. In some embodiments, one of the optics portions is adapted for blue light or a blue band of light and another of the optics portions is adapted for red light or a red band of light, although other configurations may also be used.

Figures 31A-31D illustrate a support 320 having four support portions 600A-600D, each of which defines an aperture 616A-616D and a seat 618A-618D, respectively, for a corresponding optics portion, wherein the diameter of the aperture of one or more of the support portions, e.g., diameter 714A of aperture 616A of support portion 600A, is smaller than the diameter of the aperture of one or more other support portions, e.g., diameter 714C of aperture 616C of support portion 600C.

Each embodiment disclosed herein may be used alone or in combination with one or more other embodiments disclosed herein, or portions thereof. Thus, in some embodiments, as in the embodiment of the support shown in Figures 30A-30D, the seats defined by one or more of the support portions are positioned at a different depth than the seats of the other support portions, so as to adapt such one or more support portions to the focal lengths of the corresponding optics portions.

In some embodiments, one of the optics portions is adapted for blue light or a blue band of light and another of the optics portions is adapted for red light or a red band of light, although other configurations may also be used.
Figure 32 is a cross-sectional view of a digital camera apparatus 300 and a printed circuit board 720 of a digital camera, according to one embodiment of the invention; the digital camera apparatus 300 may be mounted on the printed circuit board 720. In this embodiment, one or more optics portions, e.g., optics portions 330A-330D, are installed in and/or attached to the support 320. The support 320 overlies a first bonding layer 722, which in turn overlies an image device, e.g., image device 520, in or on which one or more sensor portions, e.g., sensor portions 310A-310D, are disposed and/or integrated. The image device 520 overlies a second bonding layer 724, which in turn overlies the printed circuit board 720.

The printed circuit board includes a major outer surface 730 defining a mounting region for the imager unit. The major outer surface 730 may also define one or more additional mounting regions (not shown) on which one or more additional devices used in the digital camera may be mounted. One or more pads 732 are provided on the major outer surface 730 of the printed circuit board for connection to one or more of the devices mounted on the major outer surface.

The image device 520 includes one or more sensor arrays, e.g., sensor arrays 310A-310D, and one or more conductive layers. In some embodiments, the image device also includes one, some, or all portions of the processor of the digital camera apparatus. The image device 520 also includes a major outer surface 740 defining a mounting region on which the support 320 is mounted.

The one or more conductive layers may be patterned to define one or more pads 742 and one or more traces (not shown) connecting the one or more pads with one or more of the one or more sensor arrays. The pads 742 are disposed, for example, near the perimeter of the image device 520, e.g., along one, two, three, or four sides of the image device. The one or more conductive layers may comprise, for example, copper, copper foil, and/or any other suitable conductive material.
A plurality of conductors 750 may connect one or more of the pads 742 on the image device 520 with one or more of the pads 732 on the circuit board 720. The conductors 750 may be used, for example, to connect one or more circuits of the image device with one or more circuits of the printed circuit board.

The first and second bonding layers 722, 724 may comprise any suitable material, for example but not limited to an adhesive, and may have any suitable configuration. The first and second bonding layers 722, 724 may comprise the same material, although this is not required. As used herein, a bonding layer may be continuous or discontinuous. For example, a conductive layer may be an etched printed circuit board layer. Moreover, a bonding layer may or may not be flat, or even substantially flat. For example, a conformal bonding layer on a non-planar surface is non-flat.

A plurality of optics portions, e.g., optics portions 330A-330D, are installed in and/or attached to the support.
In some embodiments, the digital camera apparatus 300 has dimensions of approximately 2.5 mm x 6 mm x 6 mm. For example, the thickness may equal approximately 2.5 mm, the length may equal approximately 6 mm, and the width may equal approximately 6 mm. In some such embodiments, the digital camera apparatus has one or more sensor arrays with a combined total of 1.3M pixels, although other configurations (e.g., different thicknesses, widths, lengths, and numbers of pixels) may also be employed.
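The 1.3M-pixel figure is a combined total across the sensor arrays; the per-array resolution is not specified here. One hypothetical partition consistent with that total, used purely for illustration, would be four equal arrays of 650 x 500 pixels:

```python
# Hypothetical example only: one way four equal sensor arrays could sum
# to the ~1.3M-pixel combined total mentioned for the 2.5 x 6 x 6 mm
# apparatus. The 650 x 500 per-array resolution is an assumption, not a
# value from the specification.
ARRAYS = 4
COLS, ROWS = 650, 500

total_pixels = ARRAYS * COLS * ROWS
print(f"total: {total_pixels:,} pixels ({total_pixels / 1e6:.1f}M)")
```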
In some embodiments, one or more circuits of the image device 520 may communicate with one or more devices over one or more wireless communication links. In some such embodiments, the image device 520 may define one or more mounting regions for one or more circuits used in the wireless communication link and/or for one or more discrete devices used in the wireless communication link.

The digital camera apparatus 300 may be assembled and mounted in any manner. Figures 33A-33F illustrate one embodiment of assembling and mounting the digital camera apparatus. Referring to Figure 33A, initially the image device 520 is provided. Referring to Figure 33B, a first bonding layer 722 is provided in one or more regions of one or more surfaces of the image device 520; these regions define one or more mounting regions for the support. Referring to Figure 33C, the support 320 is thereafter positioned on the bonding layer 722. In some embodiments, force may be applied to help flush out any air trapped between the image device and the support. In some embodiments, heat and/or force may be applied to provide conditions that activate and/or cure the bonding layer, thereby forming a bond between the image device 520 and the support 320. Referring to Figure 33D, one or more optics portions, e.g., optics portions 330A-330D, may thereafter be installed in and/or attached to the support 320. Referring to Figure 33E, a bonding layer 724 is provided in one or more regions of one or more surfaces of the printed circuit board 720; such regions define one or more mounting regions for the digital camera apparatus 300. Referring to Figure 33F, the digital camera apparatus 300 is thereafter positioned on the bonding layer 724. One or more conductors 750 may be installed to connect one or more of the pads 742 on the image device with one or more of the pads 732 on the circuit board.

In some embodiments, electrical interconnections between component layers may be formed by photolithography and metallization, bump bonding, or other methods. Organic or inorganic bonding methods may be used to couple the component layers. A layered packaging process begins with a "host" wafer having the electronics for the entire camera and/or for each camera channel. Another wafer, or individual chips, are then aligned with and bonded to the host wafer. The transferred wafer or chips may have bumps for making the electrical interconnections, or the connections may be made after bonding and thinning. The support substrate from the second wafer or chips is removed, leaving only a few microns of material thickness, containing the transferred electronics, attached to the host wafer. Electrical connections between the bonded wafers or dies and the host wafer or die are then made (if needed) using standard integrated circuit processes. This process may be repeated multiple times.
Figures 33G-33K are schematic views of digital camera apparatus, mechanical mountings, and electrical connections employed by other embodiments of the present invention. Specifically, Figure 33G is a perspective view of the digital camera apparatus 300.

Figure 33H is a front view of the digital camera apparatus 300 mounted to the major lower surface of a printed circuit board 720. The support 320 is disposed in a through hole defined by the printed circuit board. One or more conductors 750 connect pads 732 on the printed circuit board 720 with pads on the major outer surface of the image device 520.
Figure 33I is a front view of the digital camera apparatus 300 mounted to the major lower surface of a printed circuit board 720. The support 320 is disposed in a through hole defined by the printed circuit board. Bump bonds 752 connect one or more pads 742 on the surface 740 of the image device 520 with pads 732 on the major lower surface of the printed circuit board 720.

Figure 33J is a front view of the digital camera apparatus 300 mounted to the major upper surface of a printed circuit board 720. One or more conductors 750 connect pads 732 on the printed circuit board 720 with pads 742 on the major outer surface 740 of the image device 520.

Figure 33K is a front view of the digital camera apparatus 300 mounted to the major lower surface of a printed circuit board 720. The support 320 is disposed in a through hole defined by the printed circuit board. Bump bonds 752 connect one or more pads on the major lower surface of the image device 520 with pads on the major upper surface of the printed circuit board 720.
In some embodiments, fabrication of the image sensor and the optical stack is performed on a single wafer, or they are assembled on separate wafers (possibly up to two wafers: one for the IC and one for the optics) and bonded together at wafer level. Pick-and-place methods and equipment may also be employed to attach the optical assembly to the IC wafer, or to assemble individual image sensor dies or other components.

In embodiments employing MEMS, fabrication of the optical stack, the MEMS, and the image sensor may be performed on a single wafer, or they may be assembled on separate wafers (possibly up to three wafers: one for the IC, one for the MEMS, and one for the optical stack) and bonded together at wafer level. Pick-and-place methods and equipment may also be employed to attach the optical assembly and the MEMS to the IC wafer, or to assemble individual image sensor dies or other components (MEMS and optical stack).
Figure 34 is a cross-sectional view of a support according to another embodiment of the present invention that may be used to support one or more lenses having three lens elements, e.g., lenses 410, 430 (Figures 11A-11B, 13A-13B), and to position those lenses at least in part in registration with the respective sensor arrays. In this embodiment, the support 320 defines one or more support portions, e.g., four support portions 600A-600D, each of which supports and/or helps position a corresponding one of the one or more optics portions.

In some embodiments, this support may also help limit, minimize, and/or eliminate optical "cross talk" between the camera channels and/or may also help limit, minimize, and/or eliminate light "leaking" in from outside the digital camera apparatus.

Each of the support portions 600A-600D defines an aperture 616 and a plurality of seats 618-1 to 618-3. Specifically, support portion 600A defines aperture 616A and seats 618-1A to 618-3A. Support portion 600B defines aperture 616B and seats 618-1B to 618-3B. Support portion 600C defines aperture 616C and seats 618-1C to 618-3C. Support portion 600D defines aperture 616D and seats 618-1D to 618-3D. Referring, for example, to support portion 600A, the aperture 616A defines a light transmission path for the corresponding camera channel. Each of the plurality of seats 618-1A to 618-3A is adapted to receive a corresponding one of the lenslets of the corresponding optics portion (or a portion thereof) and to support and/or position, at least in part, that corresponding lenslet. To that end, each of the seats 618-1A to 618-3A may include one or more surfaces (e.g., surfaces 620-1A to 620-3A and surfaces 622-1A to 622-3A) adapted to abut one or more surfaces of the corresponding lenslet so as to support and/or position the lenslet, at least in part, relative to the support portion and/or one or more of the sensor arrays 310A-310D. In this embodiment, each of the surfaces 620-1A to 620-3A is disposed about the perimeter of the corresponding lenslet to support and help position the lenslet in the x and y directions. Each of the surfaces 622-1A to 622-3A (sometimes referred to as "stop" surfaces) positions, or helps position, the corresponding lenslet in the z direction.
The position of stop surfaces 622-1A to 622-3A and/or orientation can be suitable for this corresponding lenslet be positioned at locate and/or be orientated with respect to the specific range (or distance range) of respective sensor array on.Thus, base 618-1A to 618-3A controls each lenslet location (for example installing) degree of depth in strutting piece.This degree of depth may be different to each lenslet, and to the focal length of small part based on lens.For example, if the camera passage is exclusively used in specific color (or colour band), the one or more lens that then are used for this camera passage can have the focal length that is particularly suitable for the color (or colour band) that this camera passage is exclusively used in.If each camera passage is exclusively used in the color (or colour band) of the color (or colour band) that is different from other camera passage, then each lens can have different focal lengths, for example making lens be suitable for corresponding sensor array, and each base has the different degree of depth.
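The relationship described above, in which the seat depth places each channel's lens at a spacing matched to its focal length, can be sketched numerically. The focal-length values and the simple single-spacing geometry below are invented for illustration and are not taken from this disclosure:

```python
# Hypothetical sketch: per-channel seat ("stop surface") depth.
# Focal lengths are invented illustrative values, not from the patent.
CHANNEL_FOCAL_LENGTH_MM = {"red": 1.60, "green": 1.55, "blue": 1.50}

def seat_depth_mm(channel: str, support_height_mm: float) -> float:
    """Depth, measured from the top of the support, at which a lenslet's
    stop surface sits so that the lens-to-sensor-array spacing equals
    the channel's focal length (sensor array at the support's bottom)."""
    f = CHANNEL_FOCAL_LENGTH_MM[channel]
    if f > support_height_mm:
        raise ValueError("focal length exceeds support height")
    # A deeper seat moves the lenslet closer to the sensor array.
    return support_height_mm - f

for ch in ("red", "green", "blue"):
    print(ch, round(seat_depth_mm(ch, support_height_mm=2.0), 2))
```

Under these assumed numbers, the channel with the longest focal length gets the shallowest seat, which is consistent with the text's point that seat depths differ per lenslet based on focal length.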
In this embodiment, each support section includes an extension adapted to help position the corresponding optics portion at a desired distance from the respective sensor array. The extension extends in the axial direction and defines a wall 760, which in turn defines the bottom of the respective aperture and helps limit, minimize and/or eliminate light "crosstalk" between the camera channels and/or light "entering" from outside the digital camera apparatus.

In some embodiments, a spacer assembled separately from the support sections is provided and is adapted to be disposed between the support sections and one or more of the sensor arrays, to help position the one or more optics portions at desired distances from those sensor arrays. In some such embodiments, the spacer and the support together define one or more light-transmission passages that help limit, minimize and/or eliminate light "crosstalk" between the camera channels and/or light "entering" from outside the digital camera apparatus.
The support 320 may comprise any type of material and may have any configuration and/or construction. For example, in some embodiments the support 320 comprises silicon, semiconductor, glass, ceramic, plastic or metallic materials, and/or combinations thereof. If the support 320 has more than one part, those parts may be fabricated separately from one another, integrally with one another, and/or in any combination of the two. If the support defines more than one support section, each such support section, for example support sections 600A-600D, may be coupled to one, some or all of the other support sections as shown, or may be entirely separate from the other support sections.

The support 320 may be a solid unit, which offers a wide choice of fabrication methods and materials, although other forms may also be employed. For example, in some embodiments the support 320 comprises a plate (e.g., a thin plate) that defines the one or more support sections, with the apertures and seats formed by machining (e.g., drilling) or any other suitable method. In other embodiments, the support 320 is fabricated as a casting in which the apertures are defined (e.g., using a mold having projections that define the apertures and seats of the one or more support sections).

Each optics portion, for example optics portions 330A-330D, may be secured in its respective seats in any suitable manner, including but not limited to mechanically (e.g., an interference fit or a physical stop), chemically (e.g., an adhesive), electrically (e.g., an electrical bond), and/or combinations thereof. In some embodiments, each of the seats 618-1A to 618-3A has dimensions adapted to provide an interference fit for the corresponding lenslet.

It should be noted that the lenslets of the optics portions may be assembled into the support in any suitable manner.
Figures 35A-35C illustrate one embodiment for assembling the lenslets of the optics portions into the support. Referring to Figure 35A, in this embodiment the support 320 is inverted, and the bottom lenslets 410C, 430C of the lenses 410, 430 are each inserted through the bottom of the respective aperture, seated in the respective seat 618-3 and, if desired, adhered in place. Referring to Figure 35B, the support 320 is thereafter turned right side up, and the middle lenslets 410B, 430B of the lenses 410, 430 are each inserted through the top of the respective aperture, seated in the respective seat 618-2 and, if desired, adhered in place. Referring to Figure 35C, the top lenslets 410A, 430A of the lenses 410, 430 are thereafter each inserted through the top of the respective aperture, seated in the respective seat 618-1 and, if desired, adhered in place. In some embodiments, the top and middle lenslets are configured as an assembly and inserted together.

In this particular example, it may be advantageous to insert the bottom lenslets through the bottom of the apertures, because the stop surfaces of the bottom lenslets face the bottom of the apertures. Similarly, it may be advantageous to insert the top and middle lenslets through the top of the apertures, because the stop surfaces of both the top and middle lenslets face the top of the apertures.

It should be understood, however, that any suitable configuration may be employed. For example, in some embodiments the stop surface of the middle lenslet may face the bottom of the aperture, so that the middle lenslet can be inserted into the support section, for example through the bottom of the aperture, before the bottom lenslet is inserted into the support. In other embodiments, all of the stop surfaces may face the same direction, so that all of the lenslets are inserted through the same end of the aperture.
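The rule suggested by the preceding paragraphs, that a lenslet is most conveniently inserted through the end of the aperture that its stop surface faces, can be sketched as a small helper. The example input is the configuration of Figures 35A-35C; the function and its names are illustrative, not part of the disclosure:

```python
# Sketch (assumed rule from the text): group lenslets by the aperture end
# ("top" or "bottom") that each lenslet's stop surface faces, since that is
# the end through which insertion lets the stop surface seat against its seat.
def insertion_plan(stop_faces: dict[str, str]) -> dict[str, list[str]]:
    """Map each aperture end to the lenslets inserted through it."""
    plan: dict[str, list[str]] = {"top": [], "bottom": []}
    for lenslet, face in stop_faces.items():
        plan[face].append(lenslet)
    return plan

# Figures 35A-35C: bottom lenslet's stop surface faces the aperture bottom;
# the middle and top lenslets' stop surfaces face the aperture top.
plan = insertion_plan({"top": "top", "middle": "top", "bottom": "bottom"})
print(plan)  # {'top': ['top', 'middle'], 'bottom': ['bottom']}
```

Changing the assumed stop-surface orientations reproduces the alternatives described above, e.g. all faces set to one direction yields insertion of every lenslet through the same end.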
In some embodiments, the lenses and the support are fabricated as a single molded part. In some embodiments, the lenses may be fabricated with tabs that can be used to form the support.

In some embodiments, the support 320 is directly or indirectly coupled and/or attached to the imaging device. For example, the support 320 may be directly coupled and/or attached to the imaging device (e.g., with an adhesive), or indirectly coupled and/or attached to the imaging device through an intermediate support member (not shown).

The x and y dimensions of the support 320 may, for example, be approximately the same (in one or more dimensions) as those of the imaging device, the layout of the optics portions 330A-330D, and/or the layout of the sensor arrays 310A-310D. An advantage of such dimensioning is that it helps keep the x and y dimensions of the digital camera apparatus as small as possible.

In some embodiments, the support may have dimensions similar to one or more dimensions of the support embodiments shown in Figures 28A-28D.
Figure 36 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 32, except that the present embodiment employs the support 320 and lens elements 410, 430 shown in Figures 35A-35C.

In some embodiments, the digital camera apparatus 300 has dimensions of approximately 2.5 mm x 6 mm x 6 mm. For example, the thickness may be about 2.5 mm, the length about 6 mm, and the width about 6 mm. In some such embodiments, the digital camera apparatus has one or more sensor arrays totaling 1.3M pixels, although other configurations (e.g., different thicknesses, widths, lengths and pixel counts) may be employed.
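As a rough check of the figures above, the stated module envelope and an assumed even split of the 1.3M total pixels across four camera channels work out as follows (the four-array split is an assumption for illustration; the disclosure says only "one or more" arrays):

```python
# Back-of-envelope arithmetic on the stated ~2.5 mm x 6 mm x 6 mm module
# with sensor arrays totaling about 1.3M pixels.
THICKNESS_MM, LENGTH_MM, WIDTH_MM = 2.5, 6.0, 6.0
TOTAL_PIXELS = 1_300_000
NUM_ARRAYS = 4  # assumed: one array per camera channel (e.g., 310A-310D)

volume_mm3 = THICKNESS_MM * LENGTH_MM * WIDTH_MM
pixels_per_array = TOTAL_PIXELS // NUM_ARRAYS

print(volume_mm3)        # 90.0
print(pixels_per_array)  # 325000
```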
The digital camera apparatus 300 may be assembled and mounted on the printed circuit board in any manner. In some embodiments, the digital camera apparatus is assembled and mounted on the printed circuit board 720 in a manner similar to that set forth above for the embodiment of the digital camera apparatus 300 and printed circuit board 720 shown in Figures 33A-33F, except that the bottom lenslets 410C, 430C may be installed in the support and, if desired, adhered in place before the support is positioned on the second bonding layer. The middle and top lenslets of the lenses may be installed in the support and, if desired, adhered in place after the support is positioned on the second bonding layer 724.
Figure 37 is a cross-sectional view, in accordance with another embodiment of the present invention, of an alternative support 320 that may be used to support the lenses 410, 430 of Figures 11A-11B and 13A-13B and to position those lenses at least partially in registration with respective sensor arrays. The support 320 of this embodiment is similar to the embodiment of the support 320 shown in Figure 34, except that the support 320 of this embodiment defines outer walls 760A-760D that are wider than the outer walls 760A-760D defined by the support embodiment shown in Figure 34.

Each optics portion, for example optics portions 330A-330D, may be assembled and secured in its respective seats in any suitable manner, including but not limited to the manner set forth above for the embodiment of the support and optics portions shown in Figures 35A-35C.

Figure 38 is a cross-sectional view, in accordance with another embodiment of the present invention, of an alternative support 320 that may be used to support the lenses 410, 430 of Figures 11A-11B and 13A-13B and to position those lenses at least partially in registration with respective sensor arrays. The support 320 of this embodiment is similar to the embodiment of the support 320 shown in Figure 34, except that the support 320 of this embodiment defines outer and inner walls 760A-760D that are wider than the outer and inner walls 760A-760D defined by the support 320 embodiment shown in Figure 34.

Each optics portion may be assembled and secured in its respective seats in any suitable manner, including but not limited to the manner set forth above for the embodiment of the support and optics portions shown in Figures 35A-35C.
Figure 39 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus 300 and printed circuit board 720 shown in Figure 36, except that the present embodiment employs the support 320 and lens elements 410, 430 shown in Figure 37.

The digital camera apparatus may be assembled and mounted on the printed circuit board in any manner. For example, in some embodiments the digital camera apparatus is assembled and mounted on the printed circuit board in a manner similar to that set forth above for the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36.

Figure 40 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus 300 and printed circuit board shown in Figure 36, except that the present embodiment employs the support 320 and lens elements 410, 430 shown in Figure 38.

The digital camera apparatus 300 may be assembled and mounted on the printed circuit board 720 in any manner. For example, in some embodiments the digital camera apparatus 300 is assembled and mounted on the printed circuit board 720 in a manner similar to that set forth above for the embodiment of the digital camera apparatus 300 and printed circuit board 720 shown in Figure 36.
Figures 41A-41D are cross-sectional views of seat configurations 770-776, in accordance with other embodiments, that may be used to support and position the lenses of Figures 15A-15B and 15D-15E.

In the seat configuration shown in Figure 41A, the top lenslet 450A, middle lenslet 450B and bottom lenslet 450C are inserted one at a time, and/or as an assembly, through the bottom of the aperture (or alternatively through the top of the aperture).

In the seat configuration shown in Figure 41B, the top lenslet 452A, middle lenslet 452B and bottom lenslet 452C are inserted one at a time, and/or as an assembly, through the top of the aperture (or alternatively through the bottom of the aperture).

In the seat configuration of Figure 41C, the top lenslet 456A may be inserted, for example, through the top of the aperture. The middle lenslet 456B and bottom lenslet 456C may be inserted one at a time through the bottom of the aperture or, alternatively, may be configured as an assembly and inserted together.

In the seat configuration of Figure 41D, the middle lenslet 458B and top lenslet 458A are inserted one at a time through the top of the aperture or, alternatively, may be configured as an assembly and inserted together. The bottom lenslet 458C is inserted through the bottom of the aperture.

As with each embodiment disclosed herein, these embodiments may be employed alone or in combination with one or more of the other embodiments disclosed and/or illustrated herein (or portions thereof).
Thus, Figures 42-44 are cross-sectional views of supports 320, in accordance with other embodiments, that employ the seat configurations of Figures 41B-41D, respectively, to support the lenses 452A-452C, 456A-456C and 458A-458C shown in Figures 15B-15D, respectively, and to position those lenses at least partially in registration with respective sensor arrays.
Figure 45 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus 300 and printed circuit board shown in Figure 36, except that the present embodiment employs the support 320 and lens elements shown in Figure 42.

The digital camera apparatus 300 may be assembled and mounted on the printed circuit board in any manner. For example, in some embodiments the digital camera apparatus 300 is assembled and mounted on the printed circuit board 720 in a manner similar to that set forth above for the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36, although it may be advantageous to assemble the lenslets into the support using the seat configuration shown in Figure 41B, in a manner similar to that set forth above.

Figure 46 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36, except that the present embodiment employs the support and lens elements shown in Figure 43.

The digital camera apparatus 300 may be assembled and mounted on the printed circuit board 720 in any manner. For example, in some embodiments the digital camera apparatus 300 is assembled and mounted on the printed circuit board 720 in a manner similar to that set forth above for the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36, although it may be advantageous to assemble the lenslets into the support using the seat configuration shown in Figure 41C, in a manner similar to that set forth above.

Figure 47 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus 300 and printed circuit board 720 shown in Figure 36, except that the present embodiment employs the support and lens elements shown in Figure 44.

The digital camera apparatus 300 may be assembled and mounted on the printed circuit board 720 in any manner. For example, in some embodiments the digital camera apparatus 300 is assembled and mounted on the printed circuit board 720 in a manner similar to that set forth above for the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36, although it may be advantageous to assemble the lenslets into the support using the seat configuration shown in Figure 41D, in a manner similar to that set forth above.
In some embodiments, the digital camera apparatus 300 includes one or more additional structures and/or devices, including but not limited to one or more additional integrated circuits, one or more output devices and/or one or more input devices. The one or more output devices may comprise output devices of any type or types, including but not limited to one or more display devices, one or more speakers, and/or any combination thereof. The one or more input devices may comprise input devices of any type or types, including but not limited to one or more microphones. The additional structures and/or devices may be disposed in any suitable location, including but not limited to adjacent to the imaging device.

The additional structures and/or devices may comprise any type of material and may have any configuration and/or construction. For example, in some embodiments the additional structures and/or devices comprise silicon, semiconductor, glass, ceramic, plastic or metallic materials, and/or combinations thereof. The one or more additional structures and/or devices may be fabricated separately from one another, integrally with one another, and/or in any combination of the two. They may likewise be fabricated separately from the camera channels, integrally with the camera channels, and/or in any combination of the two. The one or more additional structures and/or devices may or may not be physically connected to the processor, to one or more of the camera channels, or to any portion thereof, and may or may not be electrically connected to the processor and/or to one or more of the camera channels or portions thereof.
Figure 48 is a schematic representation, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 that includes a second device 780. The second device 780 may comprise, for example but not limited to, an integrated circuit that includes one or more circuits of any type, including but not limited to one or more portions of the processor, one or more portions of memory or additional memory, one or more portions of a processor (e.g., one or more portions of a pre-processor), and/or circuitry of any other type.

For example, in some embodiments the digital camera apparatus 300 includes a memory section that holds and/or stores one, some or all of the images and/or other information generated or used by the digital camera apparatus, and/or any other information, from any source, that is desired to be stored for any period of time. The memory section may supply one or more such images and/or such other information to one or more other devices and/or to one or more portions of the processor, for example for further processing and/or for delivery to one or more other devices. The memory section may be integrated into (e.g., as a discrete component), or disposed on, the same substrate as one, some or all of the sensor arrays, or a different substrate. The memory section may, for example, be part of the processor, or be integrated into the processor (which may itself be integrated into, or disposed on, the same substrate as one, some or all of the sensor arrays, or a different substrate, e.g., as a discrete component), and/or be coupled to one or more portions of the processor by one or more communication links. In some embodiments, the memory section is also coupled by one or more communication links to one or more other devices. In such embodiments, the memory section may supply one or more of the stored images and/or other information directly (i.e., without passing through any other portion of the processor) to one or more of the other devices over one or more of the communication links.
The second device 780 may be disposed in any suitable location. In some embodiments, however, the second device 780 is disposed generally adjacent to, or in the vicinity of, the associated imaging device, for example imaging device 520, or the associated processor.

The one or more circuits of the second device 780 may be connected, for example by one or more communication links, to one or more portions of the processor 340, to one or more of the camera channels, to one or more other devices, and/or to any combination thereof. In some embodiments, the one or more communication links comprise one or more pads on the second device 780 and on the imaging device, together with one or more electrical connectors having one or more conductive members that connect the one or more pads on the imaging device to the one or more pads on the second device. In some embodiments, the one or more communication links comprise one or more bump bonds that electrically connect one or more circuits on the imaging device to one or more circuits on the second device.

The second device 780 may have any size and shape, and may or may not have the same configuration as the imaging device. In some embodiments, the length and width of the second device 780 are respectively less than or equal to the length and width of the optics assembly, the sensor subassembly and/or the imaging device. In other embodiments, the length and width of the second device 780 are respectively greater than the length and width of the optics assembly, the sensor subassembly and/or the imaging device.

Although the processor is shown separate from the imaging device and the second device, it should be understood that the processor may have any configuration and that the processor, or portions thereof, may be disposed in any location. In some embodiments, one, some or all portions of the processor are disposed on, or integrated into, the imaging device. In some embodiments, one, some or all portions of the processor are disposed on, or integrated into, the second device. In some such embodiments, one or more portions of the processor are disposed on the imaging device and one or more portions of the processor are disposed on, or integrated into, the second device. For example, particular tasks of the processor may be assigned to, and carried out by, circuitry that is integrated into, or disposed on, one or more of the same substrates as one or more of the sensor arrays, while other particular tasks of the processor are assigned to, and carried out by, circuitry that is integrated into, or disposed on, one or more substrates different from the substrate(s) into which, or on which, the one or more sensor arrays are integrated or disposed (whether or not such one or more different substrates are physically located within the camera).

In some embodiments, the digital camera apparatus may also include one or more additional integrated circuit devices, for example a third integrated circuit device (not shown). The one or more additional integrated circuit devices may have any size and shape, and may or may not have the same configuration as the other integrated circuit devices, the imaging device or the second device. In some embodiments, the length and width of the third integrated circuit device are respectively equal to the length and width of the optics assembly, the sensor subassembly and/or the imaging device. In other embodiments, the length or width of the third integrated circuit device is respectively greater or less than the length or width of the optics assembly, the sensor subassembly and/or the imaging device.
Figure 49 is a cross-sectional view, in accordance with another embodiment of the present invention, of a digital camera apparatus 300 and a printed circuit board 720 on which the digital camera apparatus 300 may be mounted. This embodiment is similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36, except that the present embodiment includes a second device 780, for example as shown in Figure 48. The second device 780 overlies a third bonding layer 782, which in turn overlies the printed circuit board.

The third bonding layer 782 may comprise any suitable material, including but not limited to an adhesive, and may have any suitable configuration. The third bonding layer 782 may comprise the same material as the first and/or second bonding layers 722, 724, although this is not required.

In some embodiments, the digital camera apparatus 300 has dimensions of approximately 2.5 mm x 6 mm x 6 mm. For example, the thickness may be about 2.5 mm, the length about 6 mm, and the width about 6 mm. In some such embodiments, the digital camera apparatus has one or more sensor arrays totaling 1.3M pixels, although other configurations (e.g., different thicknesses, widths, lengths and pixel counts) may also be employed.
The digital camera apparatus 300 may be assembled and/or mounted in any manner. Figures 50A-50H illustrate one embodiment for assembling and mounting the digital camera apparatus. Referring to Figure 50A, initially the second device is provided. Referring to Figure 50B, a bonding layer 724 is provided in one or more regions of one or more surfaces of the second device. These regions define one or more mounting regions for the imaging device. Referring to Figure 50C, the imaging device 520 is thereafter positioned on the bonding layer 724. In some embodiments, force may be applied to help flush out any air trapped between the second device 780 and the imaging device 520. In some embodiments, heat may be applied and/or pressure applied to provide conditions that activate and/or cure the bonding layer, thereby forming a bond between the second device and the imaging device. Referring to Figure 50D, a bonding layer 722 is provided in one or more regions of one or more surfaces of the imaging device 520. These regions define one or more mounting regions for the support 320. Referring to Figure 50E, the support 320 is thereafter positioned on the bonding layer 722. In some embodiments, force may be applied to help flush out any air trapped between the support 320 and the imaging device 520. In some embodiments, heat may be applied and/or pressure applied to provide conditions that activate and/or cure the bonding layer, thereby forming a bond between the support and the imaging device. Referring to Figure 50F, one or more optics portions, for example optics portions 330A-330D, may thereafter be installed in and/or attached to the support 320. Referring to Figure 50G, a bonding layer 782 is provided in one or more regions of one or more surfaces of the printed circuit board 720. Such regions define one or more mounting regions for the digital camera apparatus 300. Referring to Figure 50H, the digital camera apparatus is thereafter positioned on the bonding layer 782. One or more conductors 750 may be installed to connect one or more pads 742 on the imaging device 520 to one or more pads 732 on the circuit board. One or more conductors 790 may be installed to connect one or more pads 792 on the imaging device to one or more pads 794 on the second device.
Figure 51 is a schematic representation, in accordance with another embodiment of the present invention, of an exemplary digital camera apparatus 300 that includes a spacer 800 disposed between the support 320 and the imaging device 520. In some embodiments, the spacer 800 helps position the optics portions, for example optics portions 330A-330D, at desired distances from the respective sensor arrays, for example sensor arrays 310A-310D. In this embodiment, the spacer 800 extends in the axial direction and defines walls 802, which in turn define light-transmission apertures, for example apertures 804A-804D (e.g., for camera channels 350A-350D, respectively), that help limit, minimize and/or eliminate light "crosstalk" between the camera channels and help limit, minimize and/or eliminate light "entering" from outside the digital camera apparatus.

The spacer 800 may comprise any type of material and may have any configuration and/or construction. For example, in some embodiments the spacer 800 comprises silicon, semiconductor, glass, ceramic, plastic or metallic materials, and/or combinations thereof. If the spacer has more than one part, those parts may be fabricated separately from one another, integrally with one another, and/or in any combination of the two.

The spacer 800 may be fabricated separately from, and/or integrally with, the support 320 or the support sections 600A-600D.

The spacer 800 may be a solid unit, which offers a wide choice of fabrication methods and materials, although other forms may also be employed. For example, in some embodiments the spacer comprises a plate (e.g., a thin plate) that defines the walls and apertures of the spacer. The apertures, for example apertures 804A-804D, may be formed by machining (e.g., drilling) or any other suitable method. In some embodiments, the spacer is fabricated as a casting in which the apertures are defined (e.g., using a mold having projections that define the apertures of the spacer).
Separate with image device although processor is shown, be to be understood that processor can have any configuration and processor or its part and can be arranged on any one or a plurality of position.In certain embodiments, one of processor, some or all of part are arranged on the image device.
For each embodiment disclosed herein, this embodiment can use separately or be used in combination with and illustrated one or more other embodiment open at this or its part.
Figure 52 is a schematic illustration of a digital camera apparatus 300 according to another embodiment of the present invention, in which the digital camera apparatus 300 includes a spacer 800 disposed between the support 320 and the image device 520. This embodiment of the spacer 800 is similar to the embodiment of the spacer shown in Figure 51, except that the spacer 800 in this embodiment defines only a single aperture 804 for transmitting light, and may not help limit, minimize and/or eliminate light "crosstalk" between the camera channels, e.g., camera channels 350A-350D.
Figure 53 is a cross-sectional view of a digital camera apparatus 300 and a printed circuit board 720 of a digital camera according to another embodiment of the present invention; the digital camera apparatus 300 may be mounted on the printed circuit board 720. This embodiment is similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 36, except that this embodiment includes a spacer, e.g., the spacer 800 shown in Figure 51. The spacer 800 overlies the bonding layer 782, which overlies the image device 520.
The bonding layer 782 may comprise any suitable material, such as, but not limited to, an adhesive, and may have any suitable configuration. The bonding layer may, but need not, comprise the same material as the other bonding layers.
In some embodiments, the digital camera apparatus has dimensions of approximately 2.5 mm x 6 mm x 6 mm. For example, the thickness may equal approximately 2.5 mm, the length may equal approximately 6 mm, and the width may equal approximately 6 mm. In some such embodiments, the digital camera apparatus has one or more sensor arrays with a total of 1.3M pixels, although other configurations (e.g., different thicknesses, widths, lengths and numbers of pixels) may also be employed.
The digital camera apparatus may be assembled and/or mounted in any manner. Figures 54A-54F illustrate one embodiment for assembling and mounting the digital camera apparatus 300. Referring to Figure 54A, initially, the image device 520 is provided. Referring to Figure 54B, a bonding layer 782 is provided in one or more regions on one or more surfaces of the image device. These regions define one or more mounting regions for the spacer 800. Referring to Figure 54C, the spacer 800 is thereafter positioned on the bonding layer 782. In some embodiments, force may be applied to help flush out any air trapped between the spacer 800 and the image device 520. In some embodiments, heat and/or force may be applied to activate and/or cure the bonding layer, thereby forming a bond between the spacer and the image device.

Referring to Figures 54D-54E, a bonding layer 722 is provided in one or more regions on one or more surfaces of the spacer 800. These regions define one or more mounting regions for one or more support sections of the support 320; the support 320 is thereafter positioned on the bonding layer 722. In some embodiments, force may be applied to help flush out any air trapped between the spacer 800 and the one or more support sections of the support 320. In some embodiments, heat and/or force may be applied to activate and/or cure the bonding layer, thereby forming a bond between the spacer and the one or more support sections of the support. Referring to Figure 54F, one or more optics portions, e.g., optics portions 330A-330D, may thereafter be mounted in and/or attached to the support 320. Referring to Figure 54G, a bonding layer 724 is provided in one or more regions on one or more surfaces of the printed circuit board 720. Such regions define one or more mounting regions for the digital camera apparatus 300. Referring to Figure 54H, the digital camera apparatus is thereafter positioned on the bonding layer 724. One or more conductors 750 may be installed to connect one or more pads 742 on the image device with one or more pads 732 on the circuit board.
As with each embodiment disclosed herein, this embodiment may be employed alone or in combination with one or more of the other embodiments disclosed and illustrated herein, or portions thereof.
For example, Figure 55 is a schematic illustration of a digital camera apparatus 300 including a second device and a spacer 800 according to another embodiment of the present invention.
Although the processor is shown separate from the image device and the second device, it should be understood that the processor 340 may have any configuration, and that the processor, or portions thereof, may be disposed in any one or more locations. In some embodiments, one, some or all portions of the processor are disposed on the image device. In some embodiments, one, some or all portions of the processor are disposed on the second device. In some such embodiments, one or more portions of the processor are disposed on the image device and one or more portions of the processor are disposed on the second device.
Figure 56 is a cross-sectional view of a digital camera apparatus 300 and a printed circuit board 720 of a digital camera according to another embodiment of the present invention; the digital camera apparatus 300 may be mounted on the printed circuit board 720. This embodiment is similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 53, except that this embodiment includes a second device 780. The second device 780 overlies the bonding layer 808, which overlies the printed circuit board 720.
The bonding layer 808 may comprise any suitable material, such as, but not limited to, an adhesive, and may have any suitable configuration. The bonding layer may, but need not, comprise the same material as the other bonding layers.
In some embodiments, the digital camera apparatus has dimensions of approximately 2.5 mm x 6 mm x 6 mm. For example, the thickness may equal approximately 2.5 mm, the length may equal approximately 6 mm, and the width may equal approximately 6 mm. In some such embodiments, the digital camera apparatus has one or more sensor arrays with a total of 1.3M pixels, although other configurations (e.g., different thicknesses, widths, lengths and numbers of pixels) may also be employed.
The digital camera apparatus 300 may be assembled and/or mounted in any manner. Figures 57A-57F illustrate one embodiment for assembling and mounting the digital camera apparatus. Referring to Figure 57A, initially, the second device 780 is provided. Referring to Figure 57B, a bonding layer 724 is provided in one or more regions on one or more surfaces of the second device 780. These regions define one or more mounting regions for the image device 520. Referring to Figure 57C, the image device is thereafter positioned on the bonding layer 724. In some embodiments, force may be applied to help flush out any air trapped between the second device 780 and the image device. In some embodiments, heat and/or force may be applied to activate and/or cure the bonding layer, thereby forming a bond between the second device 780 and the image device. Referring to Figure 57D, a bonding layer 782 is provided in one or more regions on one or more surfaces of the image device. These regions define one or more mounting regions for the spacer 800. Referring to Figure 57E, the spacer 800 is thereafter positioned on the bonding layer 782. In some embodiments, force may be applied to help flush out any air trapped between the spacer 800 and the image device. In some embodiments, heat and/or force may be applied to activate and/or cure the bonding layer, thereby forming a bond between the spacer 800 and the image device.

Referring to Figures 57E-57G, a bonding layer 722 is provided in one or more regions on one or more surfaces of the spacer 800. Such regions define one or more mounting regions for one or more support sections of the support 320; the support 320 is thereafter positioned on the bonding layer 722. In some embodiments, force may be applied to help flush out any air trapped between the spacer 800 and the one or more support sections of the support 320. In some embodiments, heat and/or force may be applied to activate and/or cure the bonding layer, thereby forming a bond between the spacer 800 and the one or more support sections of the support 320. One or more optics portions, e.g., optics portions 330A-330D, may thereafter be mounted in and/or attached to the support. Referring to Figure 57G, a bonding layer 808 is provided in one or more regions on one or more surfaces of the printed circuit board 720. Such regions define one or more mounting regions for the digital camera apparatus 300. Referring to Figure 57H, the digital camera apparatus is thereafter positioned on the bonding layer 808. One or more conductors 750 may be installed to connect one or more pads 742 on the image device with one or more pads 732 on the circuit board. One or more conductors 790 may be installed to connect one or more pads 742 on the image device with one or more pads on the second device 780.
As noted above, each embodiment disclosed herein may be employed alone or in combination with one or more of the other embodiments disclosed and illustrated herein, or portions thereof.
For example, in some embodiments, one or more of the supports shown in Figures 37-38 and 42-44 are employed in one or more embodiments of the digital camera apparatus shown in Figures 48-57.
For example, Figures 58-62 are cross-sectional views of digital camera apparatus and printed circuit boards of digital cameras according to other embodiments of the present invention; the digital camera apparatus may be mounted on the printed circuit boards. These embodiments are similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 49, except that the supports and optics portions have configurations similar to those of the supports and optics portions shown in Figures 37-38 and 42-44, respectively.
Figures 63-67 are cross-sectional views of digital camera apparatus and printed circuit boards of digital cameras according to other embodiments of the present invention; the digital camera apparatus may be mounted on the printed circuit boards. These embodiments are similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 53, except that the supports and optics portions have configurations similar to those of the supports and optics portions shown in Figures 37-38 and 42-44, respectively.
In some embodiments herein, one or more electrical, electronic or mechanical devices are disposed in the support and/or on the spacer. In some such embodiments, one or more conductors may connect one or more such devices with one or more circuits on the image device and/or on another device, for example to supply power, control signals and/or data signals to and/or from the one or more such devices. One or more such conductors may, but need not, take the form of electrical connectors. The conductors may run through one or more portions of the digital camera apparatus, e.g., one or more portions of the support, the spacer, the image device (if present), or combinations thereof, and/or may be disposed on one or more outer surfaces thereof. For example, in some embodiments, one or more conductors, e.g., conductors 810, 812 (Figures 63-72), are provided over one or more surfaces of the support or through one or more portions of the support (e.g., over or through one or more support sections, e.g., support sections 600A-600D), and/or over one or more surfaces of the spacer or through one or more portions of the spacer (e.g., over or through one or more walls, e.g., wall 602), so as to connect to one or more circuits in or on the image device or another device.
Figures 68-72 are cross-sectional views of digital camera apparatus and printed circuit boards of digital cameras according to other embodiments of the present invention; the digital camera apparatus may be mounted on the printed circuit boards. These embodiments are similar to the embodiment of the digital camera apparatus and printed circuit board shown in Figure 56, except that the supports and optics portions have configurations similar to those of the supports and optics portions shown in Figures 37-38 and 42-44, respectively.
Figures 73A-73B are a front view and a cross-sectional view, respectively, of a support according to another embodiment of the present invention. In this embodiment, one or more support sections are separated from one another and/or isolated from one another by one or more gaps or spaces, e.g., gap 816.
Figure 74 is a cross-sectional view of a support according to another embodiment of the present invention. In this embodiment, the support comprises one or more support sections that overlie (e.g., are disposed on or above) one or more other support sections. In some such embodiments, the support sections may be separated from one another and/or isolated from one another, e.g., in the z direction, by a gap or space, e.g., gap 816.
As noted above, it should be understood that each of the above embodiments may be employed alone or in combination with any other embodiment disclosed herein or known to those skilled in the art, or portions thereof.
For example, in some embodiments, the support is adapted to receive one or more optics portions of a first size and shape and one or more optics portions of a second size and shape different from the first size and/or the first shape. In some embodiments, optics portions of other sizes and shapes may also be received, e.g., a third size and shape, a fourth size and shape, a fifth size and shape, and so on.
Referring to Figure 75, in some embodiments, one or more of the supports disclosed herein are provided with one or more curved portions, e.g., curved portions 818A-818D. Such an aspect may be advantageous, for example, in some embodiments in which it is desired to reduce and/or minimize the dimensions of the digital camera apparatus.
Figures 76A-76C are schematic diagrams of a digital camera apparatus including one or more output devices 820 according to another embodiment of the present invention. Figure 76A is a perspective view of one embodiment of a digital camera apparatus including one or more output devices. Figures 76B-76C are a front view and a rear view, respectively, of the output device 820 according to one embodiment of the present invention.
In some embodiments, the one or more output devices 820 take the form of one or more display devices, although other types of output devices may also be employed. In some embodiments, the one or more display devices take the form of one or more micro-displays.
The one or more display devices may be disposed in any suitable location or locations. In some embodiments, it may be advantageous to collect light on one side of the digital camera assembly (i.e., for the one or more camera channels) and to provide one or more of the output displays, e.g., one or more of the output displays 820, on another side of the digital camera assembly. In the illustrated embodiment, the digital camera apparatus generally has first and second sides opposite one another. The one or more camera channels are positioned to receive light through the first side of the digital camera apparatus. One or more of the display devices are positioned to emit light (e.g., to display one or more images) from the second side of the digital camera apparatus. In some embodiments, such a configuration may make it possible to provide an extremely thin digital camera apparatus (e.g., extremely thin in the z direction). Other configurations may also be employed. In some embodiments, one or more of the display devices are generally located adjacent to the image device, although this is not required.
The one or more display devices may be connected to the processor, to one or more camera channels, or to any combination thereof, by one or more communication links. In some embodiments, the one or more communication links comprise one or more pads on the image device and on the one or more display devices, together with one or more electrical connectors having one or more conductive members that connect the one or more pads on the image device with the one or more pads on the one or more display devices. In some embodiments, the one or more communication links comprise one or more bump bonds that connect one or more circuits on the image device with one or more circuits on the one or more display devices.
The one or more display devices may have any size and shape, and may or may not have configurations identical to one another (e.g., type, size, shape, resolution). In some embodiments, the length and width of one or more of the display devices are less than or equal to the length and width, respectively, of the optics assembly, the sensor subassembly and/or the image device. In some embodiments, the length and width of one or more of the display devices are greater than the length and width, respectively, of the optics assembly, the sensor subassembly and/or the image device. In some embodiments, each camera channel is connected to its own display device. In other embodiments, two or more camera channels, e.g., camera channels 350A-350B, are connected to a first display device, and one or more other camera channels, e.g., camera channels 350C-350D, are connected to a second display device. In some embodiments, one of the one or more display devices is connected to the processor so as to display a composite image based at least in part on the images from the respective camera channels.
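The composite-image idea above can be sketched in code. The following is a minimal illustration only, not the patented implementation: it assumes, hypothetically, that each camera channel delivers a single-band grayscale plane (e.g., one channel per color band), and that the processor forms the composite simply by stacking the planes; the function name and the per-channel assignment are invented for the example.

```python
import numpy as np


def composite_from_channels(red, green, blue):
    """Stack per-channel planes (e.g., hypothetically from camera
    channels 350A-350C) into one H x W x 3 composite image suitable
    for a display device. The patent only states that a composite is
    formed at least in part from each channel's image; this stacking
    is one simple possibility."""
    planes = [np.asarray(p, dtype=np.uint8) for p in (red, green, blue)]
    if len({p.shape for p in planes}) != 1:
        raise ValueError("channel images must share the same dimensions")
    return np.stack(planes, axis=-1)


# Tiny 2x2 example: three single-band planes become one RGB composite.
r = np.full((2, 2), 200, dtype=np.uint8)
g = np.full((2, 2), 100, dtype=np.uint8)
b = np.zeros((2, 2), dtype=np.uint8)
img = composite_from_channels(r, g, b)
print(img.shape)  # (2, 2, 3)
```

In a real multi-channel device the channels would also need registration (alignment) before combination; that step is omitted here for brevity.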
As with each embodiment disclosed herein, the above embodiments may be employed alone or in combination with one or more other embodiments disclosed herein, or portions thereof.
Thus, in some embodiments, the digital camera apparatus 300 further includes a spacer 800 (see, e.g., Figure 76D) and/or one or more conductors 822 for connecting one or more circuits of one or more output devices, e.g., output device 820, with one or more circuits of one or more other portions of the subsystem 300, e.g., one or more circuits of the processor 340, which may be disposed on the image device 520 (see, e.g., Figure 76E).
In addition, in some embodiments, the digital camera apparatus may also include one or more lighting devices and/or a support having one or more actuators (e.g., MEMS actuators, such as comb-type MEMS actuators) for moving one or more optics portions of the camera channels (such a support may comprise, for example, a frame having one or more actuators). In some further embodiments, the digital camera apparatus includes one or more lighting devices and/or a support having one or more actuators (e.g., MEMS actuators) (such a support may comprise, for example, a frame having one or more actuators).
Figures 77A-77C are schematic diagrams of a digital camera apparatus including one or more input devices 830 according to another embodiment of the present invention. Specifically, Figure 77A is a perspective view of one embodiment of a digital camera apparatus including one or more input devices. Figures 77B-77C are an enlarged front view and rear view, respectively, of an input device according to one embodiment of the present invention. In this embodiment, the one or more input devices take the form of one or more audio input devices, e.g., one or more microphones, although other types of input devices may also be employed. In some embodiments, the one or more microphones take the form of one or more silicon microphones.
The one or more audio input devices may be disposed in any suitable location or locations. In some embodiments, it may be advantageous to collect light on one side of the digital camera assembly (i.e., for the one or more camera channels) and to collect sound from the same side of the digital camera subassembly. In the illustrated embodiment, the digital camera apparatus generally has first and second sides opposite one another. The one or more camera channels are positioned to receive light through the first side of the digital camera apparatus. One or more of the audio input devices may be positioned to receive audio input (e.g., sound) from the first side of the digital camera apparatus. In some embodiments, such a configuration may make it possible to provide an extremely thin digital camera apparatus (e.g., extremely thin in the z direction). Other configurations may also be employed. In some embodiments, one or more of the audio input devices are disposed on one or more portions of the support and/or integrally with one or more portions of the support, although this is not required.
The one or more audio input devices may be connected to the processor by one or more communication links. In some embodiments, the one or more communication links comprise one or more pads on the image device and on the one or more audio input devices, together with one or more electrical connectors having one or more conductive members that connect the one or more pads on the image device with the one or more pads on the audio input devices. In some embodiments, the one or more communication links comprise one or more bump bonds that electrically connect one or more circuits on the image device with one or more circuits on the one or more audio input devices.
The one or more audio input devices may have any size and shape, and may or may not have configurations identical to one another (e.g., type, size, shape, resolution). In some embodiments, the length and width of one or more of the audio input devices are less than or equal to the length and width, respectively, of the optics assembly, the sensor subassembly and/or the image device. In some embodiments, the length or width of one or more of the audio input devices is greater than the length or width, respectively, of the optics assembly, the sensor subassembly and/or the image device.
Figures 77G-77L are perspective views of digital camera apparatus according to other embodiments. The input devices of these embodiments have configurations and/or arrangements different from the configuration and/or arrangement of the input device shown in Figure 77A. Other configurations and/or arrangements may also be used.
As with each embodiment disclosed herein, the above embodiments may be employed alone or in combination with one or more other embodiments disclosed herein, or portions thereof.
Thus, in some embodiments, the digital camera apparatus 300 further includes a spacer 800, one or more conductors 822 for connecting one or more circuits of one or more input devices, e.g., input device 830, with one or more circuits of one or more other portions of the subsystem 300, one or more circuits of the processor 340 that may be disposed on the image device 520, and/or one or more additional devices, e.g., one or more output devices 820.
For example, Figure 77D is a perspective view of one embodiment of a digital camera apparatus including an input device and a spacer 800. Figure 77E is a perspective view of one embodiment of a digital camera apparatus 300 including a spacer 800 and one or more additional devices, e.g., one or more output devices 820. The image device is shown having one or more pads connected with one or more circuits disposed on or in the image device. Figure 77F is a perspective view of one embodiment of a digital camera apparatus including an input device, a spacer and an additional device (e.g., a display and/or a second integrated circuit device adjacent to the image device). The image device is shown having one or more pads connected with one or more circuits disposed on or in the image device.
In addition, in some embodiments, the digital camera apparatus may also include a support having one or more actuators (e.g., MEMS actuators, such as comb-type MEMS actuators) for moving one or more optics portions of the camera channels (such a support may comprise, for example, a frame having one or more actuators), one or more display devices, and/or one or more lighting devices (e.g., one or more high-output-intensity light emitting diodes (LEDs)). In some further embodiments, the digital camera apparatus includes one or more audio input devices, a support having one or more actuators (e.g., MEMS actuators, such as comb-type MEMS actuators) for moving one or more optics portions of the camera channels (such a support may comprise, for example, a frame having one or more actuators), one or more display devices, and/or one or more lighting devices.
The figure is a schematic illustration of a digital camera apparatus that includes one or more audio input devices, one or more display devices and one or more lighting devices.
The digital camera apparatus may be assembled and/or mounted in any manner, such as, but not limited to, a manner similar to that employed in one or more of the embodiments disclosed herein.
Any embodiment of the present invention may include one or more lighting units to improve and/or enhance image acquisition by the one or more camera channels (specifically, by the one or more sensor arrays), to assist in range detection of an object, and to assist in shape detection of an object and in converted imaging (i.e., imaging of what is not observable by the human eye).
Figures 78A-78B are block diagrams of digital camera apparatus having one or more lighting units, e.g., lighting unit 840, according to other embodiments of the present invention. The lighting unit may provide passive lighting (e.g., no illumination), active lighting (e.g., constant illumination), and active constant and/or gated lighting (e.g., predetermined pulsed illumination, processor-controlled preset pulsed illumination, and/or user/operator-programmable pulsed illumination). The one or more lighting units may be disposed on, or integrated in, the substrate of the sensor arrays and/or the support frame. Indeed, the one or more lighting units may be disposed on, or integrated in, any element or component of the one or more camera channels.
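As a rough illustration of the lighting modes listed above (passive, constant active, and gated/pulsed), the sketch below models a hypothetical processor-controlled lighting unit. All names, the class structure, and the timing values are invented for the example; the patent does not specify an implementation.

```python
from enum import Enum


class LightingMode(Enum):
    PASSIVE = "passive"  # no illumination provided
    ACTIVE = "active"    # constant illumination
    GATED = "gated"      # pulsed illumination on a programmable schedule


class LightingUnit:
    """Hypothetical model of a lighting unit such as lighting unit 840."""

    def __init__(self, mode=LightingMode.PASSIVE, period_ms=0, pulse_ms=0):
        self.mode = mode
        self.period_ms = period_ms  # pulse repetition period (gated mode)
        self.pulse_ms = pulse_ms    # on-time within each period (gated mode)

    def is_on(self, t_ms):
        """Return whether the illuminator is lit at time t_ms."""
        if self.mode is LightingMode.PASSIVE:
            return False
        if self.mode is LightingMode.ACTIVE:
            return True
        # Gated mode: lit during the first pulse_ms of each period.
        return (t_ms % self.period_ms) < self.pulse_ms


# A 2 ms pulse every 10 ms, as a user-programmable gated schedule might set.
unit = LightingUnit(LightingMode.GATED, period_ms=10, pulse_ms=2)
print([unit.is_on(t) for t in (0, 1, 2, 9, 10)])  # [True, True, False, False, True]
```

The same schedule could, for instance, be synchronized with sensor-array exposure windows, which is one motivation for gated illumination in range detection.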
Figures 78C-78P are schematic diagrams of a digital camera apparatus including one or more output devices according to another embodiment of the present invention. Specifically, Figure 78C is a perspective view of one embodiment of a digital camera apparatus including one or more output devices. In some embodiments, the one or more output devices take the form of one or more lighting devices, e.g., lighting device 850, although other types of output devices may also be employed. Figures 78C-78D are an enlarged front view and rear view, respectively, of a lighting device 850 according to one embodiment of the present invention. In some embodiments, the one or more lighting devices take the form of one or more LEDs (e.g., one or more high-power LEDs).
The one or more lighting devices may be disposed in any suitable location or locations. In some embodiments, it may be advantageous to collect light on one side of the digital camera assembly (i.e., for the one or more camera channels) and to provide lighting from the same side of the digital camera assembly. In the illustrated embodiment, the digital camera apparatus generally has first and second sides opposite one another. The one or more camera channels are positioned to receive light through the first side of the digital camera apparatus. One or more of the lighting devices may be positioned to provide lighting (e.g., to provide light) from the same side of the digital camera apparatus. In some embodiments, such a configuration may make it possible to provide an extremely thin digital camera apparatus (e.g., extremely thin in the z direction). Other configurations may also be employed. In some embodiments, one or more of the lighting devices are disposed on one or more portions of the support and/or integrally with one or more portions of the support, although this is not required.
The one or more lighting devices may be connected to the processor by one or more communication links. In some embodiments, the one or more communication links comprise one or more pads on the image device and on the one or more lighting devices, together with one or more electrical connectors having one or more conductive members that connect the one or more pads on the image device with the one or more pads on the lighting devices. In some embodiments, the one or more communication links comprise one or more bump bonds that electrically connect one or more circuits on the image device with one or more circuits on the one or more lighting devices.
The one or more lighting devices may have any size and shape, and may or may not have configurations identical to one another (e.g., type, size, shape, resolution). In some embodiments, the length and width of one or more of the lighting devices are less than or equal to the length and width, respectively, of the optics assembly, the sensor subassembly and/or the image device. In some embodiments, the length or width of one or more of the lighting devices is greater than the length or width, respectively, of the optics assembly, the sensor subassembly and/or the image device.
Figures 78H-78M are perspective views of digital camera apparatus according to other embodiments. The one or more lighting devices of these embodiments have configurations and/or arrangements different from the configuration and/or arrangement of the lighting device shown in Figure 78C. Other configurations and/or arrangements may also be used.
As with each embodiment disclosed herein, the above embodiments may be employed alone or in combination with one or more other embodiments disclosed herein, or portions thereof.
For example, Figure 78F is a perspective view of one embodiment of a digital camera apparatus including an output device and a spacer. Figure 78G is a perspective view of another embodiment of a digital camera apparatus including an output device and a spacer; here the image device is shown having one or more pads connected with one or more circuits disposed on or in the image device. Figure 77H is a perspective view of one embodiment of a digital camera apparatus including an output device, a spacer and an additional device (e.g., a display and/or a second device 780 adjacent to the image device). The image device is shown having one or more pads connected with one or more circuits disposed on or in the image device.
In addition, in some embodiments, the digital camera apparatus may also include a support having one or more actuators (e.g., MEMS actuators, such as comb-type MEMS actuators) for moving one or more optics portions of the camera channels (such a support may comprise, for example, a frame having one or more actuators), one or more display devices, and/or one or more audio input devices. In some further embodiments, the digital camera apparatus includes one or more audio input devices, a support having one or more actuators (e.g., MEMS actuators, such as comb-type MEMS actuators) for moving one or more optics portions of the camera channels (such a support may comprise, for example, a frame having one or more actuators), one or more display devices, and/or one or more lighting devices.
This digital camera devices can be according to such as but not limited to the assembling of the similar any way of the mode that adopts in one or more embodiment disclosed herein and/or install.
Figure 78R is a plan view of the underside of the support 320 (for example, the major outer surface that faces the one or more sensor arrays) according to one embodiment of the present invention. In this embodiment, one or more devices 850 are disposed on and/or in the support 320 and receive/provide power, control signals, and/or data signals through pads 852 disposed on a surface of the support 320. A plurality of conductors (see, for example, Figures 63-72) may connect the one or more pads on the support 320 with one or more circuits disposed elsewhere in the digital camera apparatus 300.
In some embodiments, an integrated circuit 854 may be disposed on the support 320 to provide any one or more circuits, such as but not limited to circuits that help interface with (for example, control or otherwise communicate with) the devices disposed on the support 320. A plurality of conductive traces 856 (some of which are shown) may connect outputs of the integrated circuit 854 with one or more of the devices mounted on the support 320. Although the traces are shown on the surface, it should be understood that one, some, or all of such traces may be disposed within the support 320 rather than on its outer surface.
Figures 79A-79C are perspective views of digital camera apparatus according to other embodiments of the present invention that include one or more input devices 830, for example one or more audio input devices (such as silicon microphones), and one or more output devices 820, for example one or more display devices (such as micro-displays).
Figures 80A-80F are perspective views of digital camera apparatus according to other embodiments of the present invention that include one or more input devices 830, for example one or more audio input devices (such as silicon microphones), and one or more output devices 820, for example one or more display devices (such as micro-displays), wherein one or more of these devices include one or more illumination devices (for example, high-luminance LEDs).
The digital camera apparatus may be assembled and/or mounted in any manner, such as but not limited to a manner similar to those employed in one or more of the embodiments disclosed herein.
As stated above, the digital camera apparatus may have any number of camera channels, and each camera channel may have any configuration. Referring to Figures 81A-81C, in some embodiments the digital camera apparatus includes a housing, such as but not limited to a package. One or more portions of the housing may be defined by one or more of the structures described herein, for example: one or more of the optics portions, one or more portions of the frame, one or more portions of the image device, and/or combinations thereof.
In some embodiments, one or more portions of the housing are defined by a plastic material, a ceramic material, and/or any combination thereof.
Figure 81A is a perspective view of a digital camera apparatus that includes a housing 300 according to one embodiment of the present invention. Figures 81B-81C are exploded perspective views of the digital camera apparatus 300. The housing may comprise a molded plastic package, although this is not required. In the illustrated embodiment, the digital camera apparatus includes a first housing portion (for example, a molded plastic base or bottom) that supports the image sensor. The image device may include one or more sensor portions and may further include one or more portions of the processor. A second housing portion (for example, a molded plastic top or cover) defines a frame having one or more frame portions for receiving and positioning the one or more optics portions. One or more terminals, for example terminals 860, may be provided; the terminals 860 may, for example, be disposed on one or more outer surfaces of the molded plastic package. One or more conductive members, for example bond wires, may electrically connect the one or more terminals with one or more circuits on the image device, for example one or more circuits of one or more portions of the processor. In some embodiments, the first housing portion, the second housing portion, and the one or more optics portions define substantial portions of the housing, such as but not limited to a package. In some embodiments, the upper surfaces of the one or more optics portions are generally flush with one or more portions of the major outer surface of the second housing portion (for example, the molded plastic top).
The digital camera apparatus may be assembled in any manner. In some embodiments, the image device, the terminals, and the conductive members are supported on a major outer surface of the first housing portion (for example, the molded plastic base). The second housing portion (for example, the molded plastic top) may be provided thereafter. Heat, pressure, and/or a bonding material may be employed before, during, and/or after the assembly process. The bonding material may be any one or more types of bonding material, such as but not limited to one or more sealing bond materials.
The molded plastic package may make the digital camera subassembly easier to disassemble and/or mount, for example to facilitate repair and/or upgrade, although this is not required. A molded plastic package may also be advantageous for digital camera apparatus employed, for example, in wearable sensors, such as a badge or brooch that has no display but transmits data to a base station. The molded plastic package may be used in combination with one or more of the embodiments disclosed herein.
Other configurations may also be used. For example, in some embodiments the first housing portion and/or the second housing portion are formed of any type of sealing material, such as but not limited to a ceramic material. The use of a ceramic package may be advantageous in harsh environments and/or in applications where outgassing from plastics is a concern (for example, vacuum systems), although this is not required. The ceramic package may be used in combination with one or more of the embodiments disclosed herein.
Referring to Figure 81D, an exploded perspective view of a digital camera apparatus that includes a molded plastic package according to another embodiment of the present invention is shown. In some embodiments, two digital camera apparatus are disposed within a single housing. For example, in some embodiments the first housing portion (for example, the base) defines a frame having one or more frame portions for receiving and positioning a second group of one or more optics portions that face opposite the one or more optics portions mounted in the second housing portion. A second group of one or more sensor arrays may be associated with the second group of one or more optics portions and may, for example, be disposed on the image device, or on a second image device that may also be disposed within the housing.
In some embodiments, one of the camera channels, for example camera channel 350A, is dedicated to two or more separate colors or two or more separate color bands (for example, blue or a blue band and red or a red band). In some such embodiments, the optics portion itself may have the ability to provide color separation, for example similar to the color separation provided by a color filter array (for example, a Bayer pattern or a variant thereof) (see, for example, Figure 82).
Figure 82 is a perspective view of a digital camera apparatus according to one embodiment of the present invention having one or more optics portions with the ability to provide color separation. In some such embodiments, one or more of the optics portions, for example optics portion 330C, includes a color filter array, such as but not limited to a Bayer pattern. In some such embodiments, one or more of the optics portions, for example optics portion 330C, has the ability to provide color separation similar to that provided by a color filter array.
In some embodiments, the lens and/or filter of the camera channel may transmit both colors or color bands, and elsewhere in the camera channel the camera channel may include one or more mechanisms to separate the two colors or two color bands. For example, a color filter array may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor capable of separating the colors or color bands. In other embodiments, the sensor array may be provided with pixels having multi-band capability (for example, two or three colors). For example, each pixel may include two or three photodiodes, wherein a first photodiode is adapted to detect a first color or first color band, a second photodiode is adapted to detect a second color or second color band, and a third photodiode is adapted to detect a third color or third color band. One way to accomplish this is to provide the photodiodes with different structures/characteristics that make them selective, such that the first photodiode is more sensitive to the first color or first color band than to the second color or second color band, and the second photodiode is more sensitive to the second color or second color band than to the first color or first color band. Another way is to dispose the photodiodes at different depths within the pixel, taking advantage of the different penetration and absorption characteristics of the different colors or color bands. For example, blue and blue bands penetrate less far (and are therefore absorbed at a shallower depth) than green and green bands, and green and green bands penetrate less far (and are likewise absorbed at a shallower depth) than red and red bands. In some embodiments, such a sensor array is used even though each pixel may see only one particular color or color band, for example so that the sensor array is adapted to that particular color or color band.
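The depth-based color separation described above can be illustrated numerically. The following sketch is an editorial illustration, not part of the patent disclosure: the absorption depths are rough assumed values for silicon, and a simple Beer-Lambert model is used to show that photodiodes stacked at increasing depth respond preferentially to blue, green, and red.

```python
# A minimal numerical sketch (illustrative assumptions, not design data) of
# depth-based color separation: shorter wavelengths are absorbed nearer the
# surface, so stacked photodiodes at increasing depth favor blue, green, red.
import math

# Assumed 1/e absorption depths in silicon, in micrometers (illustrative only).
ABSORPTION_DEPTH_UM = {"blue": 0.3, "green": 1.0, "red": 3.0}

def fraction_absorbed(color: str, top_um: float, bottom_um: float) -> float:
    """Fraction of incident light of `color` absorbed between two depths,
    using a simple Beer-Lambert (exponential attenuation) model."""
    d = ABSORPTION_DEPTH_UM[color]
    return math.exp(-top_um / d) - math.exp(-bottom_um / d)

# Three stacked photodiode layers at increasing depth (micrometers, assumed).
layers = [("shallow", 0.0, 0.5), ("middle", 0.5, 1.8), ("deep", 1.8, 6.0)]

for name, top, bottom in layers:
    response = {c: fraction_absorbed(c, top, bottom) for c in ABSORPTION_DEPTH_UM}
    strongest = max(response, key=response.get)
    print(f"{name:7s} layer responds most strongly to {strongest}")
```

Under these assumed depths, the shallow layer responds most strongly to blue, the middle layer to green, and the deep layer to red, which is the selectivity the stacked-photodiode pixel relies on.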
Four camera channels
In some embodiments, the digital camera apparatus includes four or more camera channels, for example camera channels 350A-350D. In some such embodiments, a first camera channel, for example camera channel 350A, is dedicated to a first color or first color band (for example, red or a red band); a second camera channel, for example camera channel 350B, is dedicated to a second color or second color band (for example, blue or a blue band) different from the first color or first color band; a third camera channel, for example camera channel 350C, is dedicated to a third color or third color band (for example, green or a green band) different from the first and second colors or color bands; and a fourth camera channel, for example camera channel 350D, is dedicated to a fourth color or fourth color band (for example, green or a green band) different from the first, second, and third colors or color bands. In some embodiments, one or more camera channels employ: a sensor array whose integration time and/or other electrical characteristics are adapted to improve or optimize the performance of the respective camera channel, a pixel size matched to the color blur of the optics of the respective camera channel, and/or a pixel circuit and photodiode design/layout adapted to improve or maximize the sensitivity of the respective camera channel. In some embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
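One common use of a red/blue/green/green channel assignment such as the one above is to merge the four single-color outputs into one full-color image. The sketch below is a hypothetical editorial illustration (the function name, array layout, and averaging of the two green channels are assumptions, not taken from the patent).

```python
# Hypothetical sketch: merge four same-sized single-color planes (red, blue,
# and two green channels, as in an R/B/G/G channel assignment) into rows of
# (r, g, b) tuples, averaging the two green planes.
def merge_rbgg(red, blue, green1, green2):
    """Combine four equal-sized 2-D planes into a 2-D image of (r, g, b)."""
    rgb = []
    for r_row, b_row, g1_row, g2_row in zip(red, blue, green1, green2):
        rgb.append([(r, (g1 + g2) / 2.0, b)
                    for r, b, g1, g2 in zip(r_row, b_row, g1_row, g2_row)])
    return rgb

# Tiny 2x2 example planes with made-up intensity values.
red    = [[10, 20], [30, 40]]
blue   = [[1, 2], [3, 4]]
green1 = [[100, 110], [120, 130]]
green2 = [[102, 112], [122, 132]]
image = merge_rbgg(red, blue, green1, green2)
print(image[0][0])  # (10, 101.0, 1)
```

In a real device the four sensor arrays are physically offset, so a merge step would also need parallax correction and registration; the sketch ignores that to show only the color combination.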
The sensor arrays of the camera channels may or may not have the same field of view as one another. In some embodiments, each of the sensor arrays has the same field of view. In other embodiments, one or more sensor arrays have a field of view different from the field of view of one or more other camera channels.
In some embodiments, one of the camera channels, for example camera channel 350A, is dedicated to two or more separate colors or two or more separate color bands (for example, blue or a blue band and red or a red band). In some such embodiments, the optics portion itself may have the ability to provide color separation, for example similar to the color separation provided by a color filter array (for example, a Bayer pattern or a variant thereof) (see, for example, Figure 82). In some embodiments, the lens and/or filter of the camera channel may transmit both colors or color bands, and elsewhere in the camera channel the camera channel may include one or more mechanisms to separate the two colors or two color bands. For example, a color filter array may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor capable of separating the colors or color bands. In other embodiments, the sensor array may be provided with pixels having multi-band capability (for example, two or three colors). For example, each pixel may include two or three photodiodes, wherein a first photodiode is adapted to detect a first color or first color band, a second photodiode is adapted to detect a second color or second color band, and a third photodiode is adapted to detect a third color or third color band. One way to accomplish this is to provide the photodiodes with different structures/characteristics that make them selective, such that the first photodiode is more sensitive to the first color or first color band than to the second color or second color band, and the second photodiode is more sensitive to the second color or second color band than to the first color or first color band. Another way is to dispose the photodiodes at different depths within the pixel, taking advantage of the different penetration and absorption characteristics of the different colors or color bands. For example, blue and blue bands penetrate less far (and are therefore absorbed at a shallower depth) than green and green bands, and green and green bands penetrate less far (and are likewise absorbed at a shallower depth) than red and red bands. In some embodiments, such a sensor array is used even though each pixel may see only one particular color or color band, for example so that the sensor array is adapted to that particular color or color band.
In other embodiments, the second camera channel, for example camera channel 350B, is also dedicated to two or more separate colors or color bands. For example, the first camera channel may be dedicated to red or a red band and to green or a green band (for example, G1), and the second camera channel may be dedicated to blue or a blue band and to green or a green band (for example, G2). In still other embodiments, the second camera channel, for example camera channel 350B, is dedicated to a single color or single color band (for example, green or a green band) different from the color(s) or color band(s) to which the first camera channel is dedicated, and a third camera channel, for example camera channel 350C, is dedicated to a color or color band different from those to which the first and second camera channels are dedicated.
The camera channels may or may not have the same configuration as one another (for example, size, shape, resolution, or sensitivity or sensitivity range). For example, in some embodiments each camera channel has the same size, shape, resolution, and/or sensitivity or sensitivity range as the other camera channels. In other embodiments, one or more camera channels have a size, shape, resolution, and/or sensitivity or sensitivity range different from one or more of the other camera channels. Thus, in some embodiments each of the camera channels, for example camera channels 350A-350D, has the same resolution as the others. In other embodiments, one or more camera channels have a resolution less than the resolution of one or more other camera channels. For example, for similar portions of the field of view, the sensor array of one or more camera channels, for example camera channel 350A, may have fewer pixels than the sensor array of one or more other camera channels, for example camera channel 350B. For example, in one embodiment, for similar portions of the field of view, the number of pixels in one of the camera channels is 44% greater than the number of pixels in another camera channel. In another embodiment, for similar portions of the field of view, the number of pixels in one of the camera channels is 36% greater than the number of pixels in the other camera channels.
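The 44% and 36% figures quoted above are consistent with scaling the linear pixel density by roughly 20% and 17% per axis, since pixel count over the same field of view grows with the square of the linear density. The sketch below is an editorial illustration of that arithmetic, not part of the patent.

```python
# Editorial arithmetic check: if a channel's pixel density grows by a linear
# factor k in each of x and y over the same field of view, its pixel count
# grows by k*k, i.e. a fractional increase of k*k - 1.
def extra_pixel_fraction(linear_factor: float) -> float:
    """Fractional increase in pixel count when linear pixel density in each
    of the x and y directions grows by `linear_factor`."""
    return linear_factor * linear_factor - 1.0

print(round(extra_pixel_fraction(1.20) * 100))  # 20% per axis -> 44% more pixels
print(round(extra_pixel_fraction(1.17) * 100))  # 17% per axis -> ~37% more pixels
```

The 17%-per-axis case gives 36.9%, matching the text's 36% figure to rounding.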
In some embodiments, the sensor array of one or more camera channels may have a size different from the size of the sensor arrays of one or more other camera channels. In some such embodiments, the optics portions of those camera channels may have an f-number and/or focal length different from the f-number and/or focal length of one or more other camera channels.
In some embodiments, one or more camera channels are dedicated to a wavelength or wavelength band, and the sensor arrays and/or optics portions of such camera channels are optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated. In some embodiments, the design, operation, array size, and/or pixel size of each sensor array is optimized for the respective wavelength or wavelength band to which the camera channel is dedicated. In some embodiments, the design of each optics portion is optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated.
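One concrete way a per-wavelength optimization can work (an editorial illustration, not the patent's stated method, with assumed wavelengths and f-number) is via the diffraction limit: the Airy spot diameter scales with wavelength, so a channel dedicated to red light can tolerate a larger pixel pitch than a channel dedicated to blue light.

```python
# Editorial illustration: diffraction-limited spot size per color band.
# The Airy disk diameter (first null to first null) is 2.44 * lambda * f/#,
# so longer-wavelength channels have larger spots and can use larger pixels.
def airy_spot_diameter_um(wavelength_nm: float, f_number: float) -> float:
    """Airy disk diameter in micrometers for a given wavelength and f-number."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

# Assumed representative band-center wavelengths and an assumed f/2.8 optic.
for band, wavelength_nm in [("blue", 470), ("green", 530), ("red", 620)]:
    print(f"{band}: ~{airy_spot_diameter_um(wavelength_nm, 2.8):.2f} um spot at f/2.8")
```

Under these assumptions the red channel's spot is roughly a third larger than the blue channel's, which is one rationale for choosing a different pixel size per dedicated channel.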
It should be understood, however, that any other configuration may be employed.
The four or more camera channels may be arranged in any manner. In some embodiments, the four or more camera channels are arranged in a 2x2 matrix, to help provide compactness and symmetry of light collection.
In some embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any layout, including for example a layout identical or similar to one or more of the layouts described herein (see, for example, Figures 83A-83C). In some embodiments, the processor may have one or more portions that are not integrated on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on the same integrated circuit as the sensor arrays (see, for example, Figures 83D-83E).
As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
In some embodiments, the camera channels, for example camera channels 350A-350D, are connected to one or more displays through one or more communication links. In some such embodiments, each camera channel is connected to its own display. The displays may or may not have the same characteristics as one another. In other embodiments, the four camera channels, for example camera channels 350A-350D, are each connected to the same display.
Four camera channels in a Y configuration
Figures 84A-84E are schematic representations of digital camera apparatus 300 according to other embodiments of the present invention. In each such embodiment, the digital camera apparatus includes four or more camera channels, for example camera channels 350A-350D, wherein four of the camera channels, for example camera channels 350A-350D, are arranged in a "Y" configuration.
In some embodiments, one of the camera channels, for example camera channel 350C, is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In some embodiments, a first of the camera channels, for example camera channel 350A, is dedicated to a first color or first color band (for example, red or a red band); a second of the camera channels, for example camera channel 350B, is dedicated to a second color or second color band (for example, blue or a blue band) different from the first color or first color band; and a third of the camera channels, for example camera channel 350D, is dedicated to a third color or third color band (for example, green or a green band) different from the first and second colors or color bands. In some such embodiments, another camera channel, for example camera channel 350C, is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel, for example one employing a color filter array having a Bayer pattern.
In some embodiments, one of the camera channels, for example camera channel 350C, is dedicated to two or more colors or two or more color bands. In some such embodiments, the optics portion itself may have the ability to provide color separation, for example similar to the color separation provided by a color filter array (for example, a Bayer pattern or a variant thereof) (see, for example, Figure 84B). In some embodiments, the lens and/or filter of the camera channel may transmit both colors or color bands, and elsewhere in the camera channel the camera channel may include one or more mechanisms to separate the two colors or two color bands. For example, a color filter array may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor capable of separating the colors or color bands. In other embodiments, the sensor array may be provided with pixels having multi-band capability (for example, two or three colors). For example, each pixel may include two or three photodiodes, wherein a first photodiode is adapted to detect a first color or first color band, a second photodiode is adapted to detect a second color or second color band, and a third photodiode is adapted to detect a third color or third color band. One way to accomplish this is to provide the photodiodes with different structures/characteristics that make them selective, such that the first photodiode is more sensitive to the first color or first color band than to the second color or second color band, and the second photodiode is more sensitive to the second color or second color band than to the first color or first color band. Another way is to dispose the photodiodes at different depths within the pixel, taking advantage of the different penetration and absorption characteristics of the different colors or color bands. For example, blue and blue bands penetrate less far (and are therefore absorbed at a shallower depth) than green and green bands, and green and green bands penetrate less far (and are likewise absorbed at a shallower depth) than red and red bands. In some embodiments, such a sensor array is used even though each pixel may see only one particular color or color band, for example so that the sensor array is adapted to that particular color or color band.
In some embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any layout, including for example a layout identical or similar to one or more of the layouts described herein (see, for example, Figures 84C-84E). In some embodiments, one, some, or all portions of the processor are not disposed on the same integrated circuit as the sensor arrays (see, for example, Figure 84A).
As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
Four channels, with two channels smaller than the other two
Figures 85A-85E are schematic representations of digital camera apparatus 300 according to other embodiments of the present invention. In each such embodiment, the digital camera apparatus includes four or more camera channels, for example camera channels 350A-350D. The sizes of two of the camera channels, for example camera channels 350A, 350C, are each less than the sizes of the other two camera channels, for example camera channels 350B, 350D.
In some embodiments, the resolutions of the smaller camera channels, for example camera channels 350A, 350C, are each less than the resolutions of the larger camera channels, for example camera channels 350B, 350D, although in these embodiments the smaller camera channels may or may not have the same resolution as one another, and the larger camera channels may or may not have the same resolution as one another. For example, for similar portions of the field of view, the sensor array of each smaller camera channel may have fewer pixels than are provided in the sensor array of each larger camera channel. For example, in one embodiment, for similar portions of the field of view, the number of pixels in the one or more larger camera channels is 44% greater than the number of pixels in the one or more smaller camera channels. In another embodiment, for similar portions of the field of view, the number of pixels in the one or more larger camera channels is 36% greater than the number of pixels in the one or more smaller camera channels. It should be understood, however, that any other sizes and/or architectures may be used.
In other embodiments, the resolution of the one or more smaller camera channels equals the resolution of the one or more larger camera channels. For example, for similar portions of the field of view, the sensor arrays of one or more smaller camera channels, for example camera channels 350A, 350C, may have the same number of pixels as the sensor arrays provided in the larger camera channels, for example camera channels 350B, 350D. For example, in one embodiment, the pixels in the larger camera channels are 44% larger than the pixels in the smaller camera channels (for example, 20% larger in the x direction and 20% larger in the y direction). In another embodiment, the pixels in the larger camera channels are 36% larger than the pixels in the smaller camera channels (for example, 17% larger in the x direction and 17% larger in the y direction). It should be understood, however, that any other sizes and/or architectures may be used.
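In the equal-resolution case just described, the benefit of the larger channel is per-pixel light collection rather than pixel count. The sketch below is an editorial illustration under the simple assumption that photon collection scales with pixel area, so a pixel that is 20% wider in each of x and y collects about 44% more light.

```python
# Editorial illustration (assumption: light collected per pixel scales with
# pixel area): scaling pixel pitch by (x_scale, y_scale) while keeping pixel
# count fixed increases per-pixel light collection by x_scale*y_scale - 1.
def relative_light_gain(x_scale: float, y_scale: float) -> float:
    """Fractional extra light per pixel for a pitch scaled by (x_scale, y_scale)."""
    return x_scale * y_scale - 1.0

print(f"{relative_light_gain(1.20, 1.20):.0%}")  # 20% per axis -> 44%
print(f"{relative_light_gain(1.17, 1.17):.0%}")  # 17% per axis -> ~37%
```

This is the same square-law geometry as the pixel-count case, viewed from the sensitivity side: equal-count channels trade size for per-pixel signal.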
In some embodiments, the size of the sensor array of one or more camera channels may differ from the size of the sensor arrays of one or more other camera channels. In some such embodiments, the optics portions of those camera channels may have an f-number and/or focal length different from the f-number and/or focal length of one or more other camera channels.
In some embodiments, one of the smaller camera channels, for example camera channel 350A, is dedicated to a first color or first color band (for example, red or a red band); one of the larger camera channels, for example camera channel 350B, is dedicated to a second color or second color band (for example, blue or a blue band) different from the first color or first color band; and another camera channel, for example camera channel 350D, is dedicated to a third color or third color band (for example, green or a green band) different from the first and second colors or color bands. In some such embodiments, the smaller camera channel, for example camera channel 350A, has a resolution equal to the resolution of the two larger camera channels, for example camera channels 350B, 350D.
In some embodiments, one of the smaller camera channels, for example camera channel 350C, is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In some embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any layout, including for example a layout identical or similar to one or more of the layouts described herein (see, for example, Figures 85C-85E). In some embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on the same integrated circuit as the sensor arrays (see, for example, Figure 85B).
As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
In some embodiments, the four or more camera channels, for example camera channels 350A-350D, are connected to one or more displays through one or more communication links. In some such embodiments, each camera channel is connected to its own display; the displays may or may not have the same characteristics as one another. In other embodiments, two or more camera channels, for example camera channels 350A-350B, 350D, are connected to a first display, and one or more other camera channels, for example camera channel 350C, are connected to a second display. The first and second displays may or may not have the same characteristics. In some such embodiments, the first display has a resolution equal to the resolution of the one or more camera channels connected to it, and the second display may have a resolution equal to the resolution of the one or more camera channels connected to it. For example, in some embodiments one or more camera channels have a resolution lower than the resolution of one or more other camera channels; in such embodiments, the one or more displays connected to the lower-resolution camera channels may have a resolution lower than the resolution of the one or more displays connected to the other camera channels. Other resolutions may also be employed.
Four channels, with three channels smaller than the fourth
Figures 86A-86E are schematic representations of digital camera apparatus 300 according to other embodiments of the present invention. In each such embodiment, the digital camera apparatus includes four or more camera channels, for example camera channels 350A-350D. The sizes of three of the camera channels, for example camera channels 350A-350C, are each less than the size of the other camera channel, for example camera channel 350D.
In some embodiments, the resolutions of the smaller camera channels, for example camera channels 350A-350C, are each less than the resolution of the larger camera channel, for example camera channel 350D, although in these embodiments the smaller camera channels may or may not have the same resolution as one another. For example, for similar portions of the field of view, the sensor array of each smaller camera channel may have fewer pixels than are provided in the sensor array of the larger camera channel. For example, in one embodiment, for similar portions of the field of view, the number of pixels in the larger camera channel is 44% greater than the number of pixels in the one or more smaller camera channels. In another embodiment, for similar portions of the field of view, the number of pixels in the larger camera channel is 36% greater than the number of pixels in the one or more smaller camera channels. It should be understood, however, that any other sizes and/or architectures may be employed.
In other embodiments, the resolution of one or more of the smaller camera channels equals the resolution of the larger camera channel. For example, for similar portions of the field of view, the sensor arrays of one or more smaller camera channels, for example camera channels 350A-350C, may have the same number of pixels as the sensor array provided in the larger camera channel, for example camera channel 350D. For example, in one embodiment, the pixels in the larger camera channel are 44% larger than the pixels in the smaller camera channels (for example, 20% larger in the x direction and 20% larger in the y direction). In another embodiment, the pixels in the larger camera channel are 36% larger than the pixels in the smaller camera channels (for example, 17% larger in the x direction and 17% larger in the y direction). It should be understood, however, that any other sizes and/or architectures may be employed.
In certain embodiments, one of the camera channels, e.g., camera channel 350D, is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, a first one of the camera channels, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or a red band), a second one of the camera channels, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., blue or a blue band) different from the first color or color band, and a third one of the camera channels, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., green or a green band) different from the first and second colors or color bands. In some such embodiments, another one of the camera channels, e.g., camera channel 350D, is a broadband camera channel, for example one employing a color filter array with a Bayer pattern.
In certain embodiments, the digital camera device employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements identical or similar to one or more of the arrangements described herein (see, e.g., Figures 86C-86E). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on the same integrated circuit as the sensor arrays (see, e.g., Figure 86B).
As stated above, each of the embodiments described above may be used alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
In certain embodiments, the four or more camera channels, e.g., camera channels 350A-350D, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to its own display. The displays may or may not have the same characteristics as one another. In other embodiments, three or more of the camera channels, e.g., camera channels 350A-350C, are connected to a first display, and another one of the camera channels, e.g., camera channel 350D, is connected to a second display. The first and second displays may or may not have the same characteristics as one another. For example, in certain embodiments, one or more of the camera channels have a resolution lower than that of one or more of the other camera channels. In such embodiments, the one or more displays connected to the one or more lower-resolution camera channels may have a resolution lower than that of the one or more displays connected to the one or more other camera channels. In certain embodiments, the first display has a resolution equal to the resolution of the one or more camera channels to which it is connected. The second display may have a resolution equal to the resolution of the one or more camera channels to which it is connected. Other resolutions may also be employed, however.
Four Elliptical Channels
Figures 87A-87B schematically illustrate a digital camera device 300 according to other embodiments of the present invention. In each such embodiment, the digital camera device includes four or more camera channels, e.g., camera channels 350A-350D, one or more of which have optics portions each having an elliptical or other non-circular shape, e.g., optics portions 330A-330D.
In certain embodiments, a first one of the camera channels, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or a red band), a second one of the camera channels, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., blue or a blue band) different from the first color or color band, and a third one of the camera channels, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., green or a green band) different from the first and second colors or color bands. In some such embodiments, another one of the camera channels, e.g., camera channel 350D, is a broadband camera channel (for example one employing a color filter array with a Bayer pattern), an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In other embodiments, a first one of the camera channels, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or a red band), a second one of the camera channels, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., green or a green band) different from the first color or color band, a third one of the camera channels, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., blue or a blue band) different from the first and second colors or color bands, and a fourth one of the camera channels, e.g., camera channel 350D, is dedicated to a color or color band (e.g., green or a green band) different from the first and third colors or color bands.
In still other embodiments, a first one of the camera channels, e.g., camera channel 350A, is dedicated to red or a red band, a second one of the camera channels, e.g., camera channel 350B, is dedicated to blue or a blue band, a third one of the camera channels, e.g., camera channel 350C, is dedicated to green 1 or a green band 1, and a fourth one of the camera channels, e.g., camera channel 350D, is dedicated to green 2 or a green band 2.
In certain embodiments, one of the camera channels, e.g., camera channel 350C, is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, the digital camera device employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements identical or similar to those described herein. In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on the same integrated circuit as the sensor arrays (see, e.g., Figures 87A-87B).
As stated above, each of the embodiments described above may be used alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
Three Camera Channels
Figures 88A-88E and 89A-89E schematically illustrate a digital camera device 300 according to other embodiments of the present invention. In each such embodiment, the digital camera device includes three or more camera channels, e.g., camera channels 350A-350C.
In certain embodiments, a first camera channel, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or a red band), a second camera channel, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., blue or a blue band) different from the first color or color band, and a third camera channel, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., green or a green band) different from the first and second colors or color bands. In certain embodiments, one or more of the camera channels employ: a pixel size matched to the color optics blur of the respective camera channel; integration times and/or other electrical characteristics of the sensor array adapted to improve or optimize the performance of the respective camera channel; and/or pixel circuit and photodiode designs/layouts adapted to improve or maximize the sensitivity of the respective camera channel.
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, one of the camera channels, e.g., camera channel 350A, is dedicated to two or more separate colors or two or more separate color bands (e.g., blue or a blue band and red or a red band). In some such embodiments, the optics portion itself may have a color-separation capability, for example a capability similar to that provided by a color filter array (e.g., a Bayer pattern or a variant thereof). In certain embodiments, the lens and/or filter of the camera channel may transmit both colors or color bands, and elsewhere in the camera channel the camera channel may include one or more mechanisms for separating the two colors or two color bands. For example, a color filter array may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor capable of separating the colors or color bands. In some of the latter embodiments, the sensor array may be provided with pixels having a multi-band capability (e.g., two or three colors). For example, each pixel may include two or three photodiodes, wherein a first photodiode is adapted to detect a first color or first color band, a second photodiode is adapted to detect a second color or second color band, and a third photodiode is adapted to detect a third color or third color band. One way to accomplish this is to provide the photodiodes with different structures/characteristics that make them selective, such that the first photodiode is more sensitive to the first color or first color band than to the second color or second color band, and the second photodiode is more sensitive to the second color or second color band than to the first color or first color band. Another way is to dispose the photodiodes at different depths within the pixel, which takes advantage of the different penetration and absorption characteristics of the different colors or color bands. For example, blue and blue bands penetrate less deeply (and are therefore absorbed at a shallower depth) than green and green bands, and green and green bands penetrate less deeply (and are likewise absorbed at a shallower depth) than red and red bands. In certain embodiments, even though a pixel may see only one particular color or color band, such a sensor array is used anyway, for example by adapting the sensor array to that particular color or color band.
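As an illustrative aside, the depth-dependent absorption described above can be sketched with the Beer-Lambert law. The absorption coefficients below are rough, illustrative textbook-order values for crystalline silicon; they are assumptions for the sketch and do not come from this specification:

```python
import math

# Illustrative absorption coefficients for silicon, in 1/um
# (rough order-of-magnitude values; assumed, not from this document).
ALPHA = {"blue_450nm": 2.5, "green_550nm": 0.7, "red_650nm": 0.25}

def penetration_depth_um(alpha_per_um: float) -> float:
    """Depth at which intensity falls to 1/e of its surface value."""
    return 1.0 / alpha_per_um

def absorbed_fraction(alpha_per_um: float, depth_um: float) -> float:
    """Fraction of light absorbed within the first depth_um microns
    (Beer-Lambert: I(z) = I0 * exp(-alpha * z))."""
    return 1.0 - math.exp(-alpha_per_um * depth_um)

# Blue is absorbed shallowest, red deepest, matching the ordering
# described in the text (blue < green < red penetration).
for band, a in ALPHA.items():
    print(band, round(penetration_depth_um(a), 2),
          round(absorbed_fraction(a, 1.0), 2))
```

This ordering is what allows stacked photodiodes at increasing depths to preferentially collect blue, green, and red, respectively.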
In some of the latter embodiments, the second camera channel, e.g., camera channel 350B, may also be dedicated to two or more separate colors or two or more separate color bands. For example, the first camera channel may be dedicated to red or a red band and green or a green band (e.g., G1), and the second camera channel may be dedicated to blue or a blue band and green or a green band (e.g., G2). In other of the latter embodiments, the second camera channel, e.g., camera channel 350B, may be dedicated to a single color or single color band (e.g., green or a green band) different from the colors or color bands to which the first camera channel is dedicated, and the third camera channel, e.g., camera channel 350C, may be dedicated to a single color or single color band different from the colors or color bands to which the first and second camera channels are dedicated.
The three or more camera channels may or may not have the same configuration as one another (e.g., size, shape, resolution, and/or sensitivity or sensitivity range). In certain embodiments, each camera channel has the same size, shape, resolution, and/or sensitivity or sensitivity range as the other camera channels. In other embodiments, one or more of the camera channels have a size, shape, resolution, and/or sensitivity or sensitivity range different from that of one or more of the other camera channels. For example, for a similar portion of the field of view, the sensor arrays of one or more of the camera channels may have fewer pixels than the sensor arrays of one or more of the other camera channels.
In certain embodiments, the sensor arrays of one or more of the camera channels may have a size different from the size of the sensor arrays of one or more of the other camera channels. In some such embodiments, the optics portions of the one or more camera channels may have an f-number and/or focal length different from the f-number and/or focal length of one or more of the other camera channels.
In certain embodiments, one or more of the camera channels are dedicated to a wavelength or wavelength band, and the sensor arrays and/or optics portions of such one or more camera channels are optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated. In certain embodiments, the design, operation, array size, and/or pixel size of each sensor array are optimized for the respective wavelength or wavelength band to which the camera channel is dedicated. In certain embodiments, the design of each optics portion is optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated.
It should be understood, however, that any other configurations may be employed.
The three or more camera channels may be arranged in any manner. In certain embodiments, the three or more camera channels are arranged in a triangle, as shown, to help provide compactness and symmetry of light collection.
In certain embodiments, the digital camera device employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements identical or similar to those described herein (see, e.g., Figures 98A-98B). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on the same integrated circuit as the sensor arrays.
As stated above, each of the embodiments described above may be used alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
In certain embodiments, the camera channels, e.g., camera channels 350A-350C, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to its own display. The displays may or may not have the same characteristics as one another. In other embodiments, the three camera channels, e.g., camera channels 350A-350C, are each connected to the same display.
Figures 90A, 91A-91B, 92A-92B, 93A-93B, 94A-94B, 95A-95B, and 96A-96B are, respectively, plan and cross-sectional views of certain embodiments of an image device 520 that may be employed in association with a digital camera device having three or more camera channels. In these embodiments, the image device has first and second major surfaces and an outer perimeter defined by edges. The image device defines one or more regions for the active areas of one or more sensor arrays. The image device also defines one or more regions for buffers and/or logic associated with the one or more sensor arrays.
The image device, the sensor arrays, and the image regions may have any size and shape. In certain embodiments, each image region is generally about the same size as the respective sensor array; consequently, depending on the dimensions of the underlying sensor arrays, the image regions may differ from one another in size and shape. Of course, an image region is not required to cover all of, or only, the underlying array. In alternative embodiments, an image region may cover only a portion of the array, or may extend beyond the array.
The shape of the image device 520 is generally rectangular, with a first side having a dimension equal to about 10 mm and a second side having a dimension equal to about 8.85 mm. The shape of each image region is generally circular, with a width or diameter equal to about 5 mm. The shape of each active area is generally rectangular, with a first dimension equal to about 4.14 mm and a second dimension equal to about 3.27 mm. The active area may define, for example, a matrix of 1200 x 900 pixels (i.e., 1200 columns, 900 rows).
In certain embodiments, the shape of the image device 520 is generally square, with each side having a dimension equal to about 10 mm, such that in each quadrant each side has a dimension of 5 mm. The shape of each image region is generally circular, with a width or diameter equal to about 5 millimeters (mm). The shape of each active area is generally rectangular, with a first dimension equal to about 4 mm and a second dimension equal to about 3 mm. The active area may define, for example, a matrix of 1200 x 900 pixels (i.e., 1200 columns, 900 rows).
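As an illustrative aside, the pixel pitch implied by the active-area dimensions and pixel counts given above can be checked with a short computation (the figures are the approximate values stated in the text):

```python
def pixel_pitch_um(dim_mm: float, pixels: int) -> float:
    """Pixel pitch in micrometers implied by an active-area
    dimension (in mm) and the number of pixels along it."""
    return dim_mm * 1000.0 / pixels

# 4.14 mm x 3.27 mm active area with 1200 x 900 pixels
print(round(pixel_pitch_um(4.14, 1200), 2))  # 3.45 (um, x direction)
print(round(pixel_pitch_um(3.27, 900), 2))   # 3.63 (um, y direction)

# 4 mm x 3 mm active area with 1200 x 900 pixels -> square ~3.33 um pixels
print(round(pixel_pitch_um(4.0, 1200), 2))   # 3.33
print(round(pixel_pitch_um(3.0, 900), 2))    # 3.33
```

Note that the 4.14 x 3.27 mm variant implies slightly non-square pixels, while the 4 x 3 mm variant implies square pixels.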
Referring to Figures 97A-97D, the optics portions of the three or more camera channels are supported by one or more supports, e.g., support 320, which positions each optics portion at least in partial registration with the respective sensor array. For example, in this embodiment, optics portion 330A is positioned in registration with sensor array 310A, optics portion 330B is positioned in registration with sensor array 310B, optics portion 330C is positioned in registration with sensor array 310C, and optics portion 330D is positioned in registration with sensor array 310D. In certain embodiments, the support also helps to limit, minimize, and/or eliminate optical "crosstalk" between the camera channels.
In this embodiment, the support 320 comprises a support defining one or more support portions, e.g., four support portions 600A-600D, each of which supports and/or helps position a corresponding one of the one or more optics portions. For example, in this embodiment, support portion 600A is positioned in registration with sensor array 310A and supports optics portion 330A; support portion 600B is positioned in registration with sensor array 310B and supports optics portion 330B; support portion 600C is positioned in registration with sensor array 310C and supports optics portion 330C; and support portion 600D is positioned in registration with sensor array 310D and supports optics portion 330D. In this embodiment, the support also helps to limit, minimize, and/or eliminate optical "crosstalk" between the camera channels.
Each support portion 600A-600D defines an aperture 616 and a seat 618. The aperture 616 defines a path for optical transmission for the corresponding camera channel. The seat 618 is adapted to receive the corresponding optics portion (or a portion thereof) and to at least partially support and/or position that optics portion. To this end, the seat 618 may include one or more surfaces (e.g., surfaces 620, 622) adapted to abut one or more surfaces of the optics portion, so as to at least partially support and/or position the optics portion relative to the support portion and/or one or more of the sensor arrays 310A-310C. In this embodiment, the surface 620 is disposed about the circumference of the optics portion, to help support and position it in the x and y directions. The surface 622 (sometimes referred to as a "stop" surface) positions, or helps position, the optics portion in the z direction.
The position and/or orientation of the stop surface 622 may be adapted to position and/or orient the optics portion at a particular distance (or range of distances) relative to the corresponding sensor array. To this end, the depth of the seat 618 controls the position at which the lens is seated (e.g., mounted) within the support 320. This depth may differ for each lens, based at least in part on the focal length of the lens. For example, if a camera channel is dedicated to a particular color (or color band), the one or more lenses used in that camera channel may have a focal length particularly suited to the color (or color band) to which the camera channel is dedicated. If each camera channel is dedicated to a color (or color band) different from the colors (or color bands) of the other camera channels, then each lens may have a different focal length, for example so that each lens is suited to the corresponding sensor array, and each seat may have a different depth.
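As an illustrative aside, the per-color focal-length difference that motivates a different seat depth per channel can be sketched with the thin-lens lensmaker's equation. The refractive indices below are illustrative values for a generic crown-type glass at three wavelengths; they are assumptions for the sketch, not values from this specification:

```python
# Illustrative refractive indices for a generic crown-type glass
# (assumed values; not from this document).
N_GLASS = {"blue_486nm": 1.522, "green_546nm": 1.518, "red_656nm": 1.514}

def thin_lens_focal_mm(n: float, r1_mm: float, r2_mm: float) -> float:
    """Lensmaker's equation for a thin lens in air:
    1/f = (n - 1) * (1/R1 - 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm))

# Identical lens geometry, different wavelengths -> different focal
# lengths (blue focuses shortest), hence a different seat depth for
# each color-dedicated channel.
for band, n in N_GLASS.items():
    f = thin_lens_focal_mm(n, r1_mm=5.0, r2_mm=-5.0)
    print(band, round(f, 3))
```

In a single-color channel, by contrast, the lens need only be corrected and seated for its one band, which is part of the design freedom the text describes.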
Each optics portion may be secured in the corresponding seat 618 in any suitable manner, such as, but not limited to, mechanically (e.g., an interference fit or a physical stop), chemically (e.g., an adhesive), electrically (e.g., an electrical bond), and/or combinations thereof. In certain embodiments, the seat 618 has dimensions adapted to provide an interference fit for the corresponding optics portion.
The aperture (or a portion thereof) may have any configuration (e.g., shape and/or size), including, for example, cylindrical, conical, rectangular, irregular, and/or any combination thereof. The configuration may be based on, for example, the desired configuration of the optical path, the configuration of the respective optics portion, the configuration of the respective sensor array, and/or any combination thereof.
The support 320 may comprise any type of material and may have any configuration and/or construction. For example, in certain embodiments, the support 320 comprises silicon, semiconductor, glass, ceramic, plastic, or metallic materials, and/or combinations thereof. If the support 320 has more than one portion, those portions may be fabricated separately from one another, fabricated integrally with one another, and/or fabricated by any combination of the two. If the support defines more than one support portion, then each such support portion, e.g., support portions 600A-600D, may be coupled to one, some, or all of the other support portions, as shown, or may be completely separate from the other support portions. If the support 320 is a single integral component, then each of the one or more support portions defines one or more portions of that integral component. Moreover, while this positioner may be a solid unit, which offers a wide choice of fabrication methods and materials, other forms of device may also be employed. For example, in certain embodiments, the support 320 comprises a plate (e.g., a thin plate) that defines the support and the one or more support portions, wherein the apertures and seats are formed by machining (e.g., drilling) or any other suitable method. In other embodiments, the support 320 is fabricated as a casting in which the apertures are defined (e.g., using a mold having projections that define the apertures and seats of the one or more support portions).
In certain embodiments, the lenses and the support are fabricated as a single molded part. In certain embodiments, the lenses may be fabricated with tabs that can be used to form the support.
In certain embodiments, the support 320 is directly or indirectly coupled and/or attached to the image device. For example, the support 320 may be directly coupled and attached to the image device (e.g., using an adhesive), or indirectly coupled and/or attached to the image device through an intermediate support assembly (not shown).
The x and y dimensions of the support 320 may be, for example, approximately the same as those of the image device (in one or more dimensions), approximately the same as the layout of the optics portions 330A-330D (in one or more dimensions), and/or approximately the same as the layout of the sensor arrays 310A-310D (in one or more dimensions). An advantage of dimensioning the support in this manner is that it helps keep the x and y dimensions of the digital camera device as small as possible.
In certain embodiments, it may be advantageous to provide the seat 618 with a height A equal to the height of the portion of the optics portion that will abut the surface 620. It may be advantageous to dispose the stop surface 622 at a height B (e.g., the distance between the stop surface 622 and the base of the support portion) at least equal to the height needed to allow the seat 618 to provide firm retention for the optics portion (e.g., a lens) to be mounted above it. The width or diameter C of the portion of the aperture 616 disposed above the height of the stop surface 622 may be based, for example, on the width or diameter of the optics portion (e.g., a lens) to be mounted therein and on the method used to attach and/or retain the optics portion in the seat 618. The width of the stop surface 622 is preferably large enough to help provide firm retention for the optics portion (e.g., a lens), but small enough to minimize unwanted blocking of the light transmitted through the optics portion. It may be desirable to make the width or diameter D of the portion of the aperture 616 disposed below the height of the stop surface 622 large enough to help minimize unwanted blocking of the light transmitted through the optics portion. In view of the above considerations, it may be desirable to provide the support with a height equal to the minimum dimension E needed to yield one or more supports firm enough to support the optics portions to be mounted therein, and it may be advantageous to make the spacing F between the support portions 600A-600D, or between the apertures 616A-616D, as small as possible while still large enough that the support is firm enough to support the optics portions to be mounted therein. The support may have a length J and a width K.
In certain embodiments, it may be desirable to provide the seat 618 with a height A equal to 2.2 mm, to provide the stop surface 622 at a height B in the range of 0.25 mm to 3 mm, to make the width or diameter C of the portion of the aperture above the height B of the stop surface 622 approximately equal to 3 mm, to make the width or diameter D of the bottom of the aperture approximately equal to 2.8 mm, to provide the support portions with a height E in the range of 2.45 mm to 5.2 mm, and to space the apertures apart by a distance F of at least 1 mm. In some such embodiments, it may be desirable to provide a support having a length J equal to 10 mm and a width K equal to 10 mm. In other embodiments, it may be desirable to provide a support having a length J equal to 10 mm and a width K equal to 8.85 mm.
In certain embodiments, one or more of the optics portions comprise cylindrical-type lenses, for example the NT45-090 lenses manufactured by Edmund Optics. Such a lens has a cylindrical portion with a diameter G of up to 3 millimeters (mm) and a height H of 2.19 mm. In these embodiments, it may be desirable to employ a support having the dimensions and ranges set forth in the preceding paragraph.
In certain embodiments, the length J of the support equals 10 mm and the width K equals 10 mm. In other embodiments, it may be desirable to provide a support having a length J equal to 10 mm and a width K equal to 8.85 mm.
Three Channels, Two Smaller Than the Third
Figures 99A-99D schematically illustrate a digital camera device 300 according to other embodiments of the present invention. In each such embodiment, the digital camera device includes three or more camera channels, e.g., camera channels 350A-350C. Two of the camera channels, e.g., camera channels 350A-350B, are each smaller in size than the third camera channel, e.g., camera channel 350C. The smaller camera channels may or may not have the same size as one another.
In certain embodiments, the smaller camera channels, e.g., camera channels 350A-350B, each have a resolution lower than that of the larger camera channel, e.g., camera channel 350C, although in these embodiments the smaller camera channels may or may not have the same resolution as one another. For example, for a similar portion of the field of view, the sensor array of each smaller camera channel, e.g., camera channels 350A-350B, may have fewer pixels than are provided in the sensor array of the larger camera channel, e.g., camera channel 350C. For example, in one embodiment, for a similar portion of the field of view, the number of pixels in the larger camera channel is 44% greater than the number of pixels in one or more of the smaller camera channels. In another embodiment, for a similar portion of the field of view, the number of pixels in the larger camera channel is 36% greater than the number of pixels in one or more of the smaller camera channels. It should be understood, however, that any other sizes and/or architectures may be employed.
In other embodiments, the resolution of one or more of the smaller camera channels equals the resolution of the larger camera channel. For example, for a similar portion of the field of view, the sensor arrays of one or more of the smaller camera channels, e.g., camera channels 350A-350B, may have the same number of pixels as the sensor array provided in the larger camera channel, e.g., camera channel 350C. For example, in one embodiment, the pixels in the larger camera channel are 44% larger in size than the pixels in the smaller camera channels (e.g., 20% larger in the x direction and 20% larger in the y direction). In another embodiment, the pixels in the larger camera channel are 36% larger in size than the pixels in the smaller camera channels (e.g., 17% larger in the x direction and 17% larger in the y direction). It should be understood, however, that any other sizes and/or architectures may be employed.
In certain embodiments, a first camera channel, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or a red band), a second camera channel, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., blue or a blue band) different from the first color or color band, and a third camera channel, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., green or a green band) different from the first and second colors or color bands. In some such embodiments, the two smaller camera channels, e.g., camera channels 350A-350B, each have a resolution lower than that of the third camera channel, e.g., camera channel 350C.
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, the digital camera device employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements identical or similar to those described herein (see, e.g., Figures 99B-99D). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on the same integrated circuit as the sensor arrays (see, e.g., Figure 99A).
As stated above, each of the embodiments described above may be used alone or in combination with any other embodiment, or portion thereof, disclosed herein or known to those skilled in the art.
In certain embodiments, the camera channels, e.g., camera channels 350A-350C, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to its own display. The displays may or may not have the same characteristics as one another. In other embodiments, the smaller camera channels, e.g., camera channels 350A-350B, are connected to a first display, and the larger camera channel, e.g., camera channel 350C, is connected to a second display. The first and second displays may or may not have the same characteristics. In certain embodiments, the first display has a resolution equal to the resolution of the one or more camera channels to which it is connected. The second display may have a resolution equal to the resolution of the one or more camera channels to which it is connected. Other resolutions may also be employed, however.
Three camera channels of mutually different sizes
Figures 100A-100D are schematic representations of a digital camera apparatus 300 according to further embodiments of the present invention. In each such embodiment, the digital camera apparatus includes three or more camera channels, e.g., camera channels 350A-350C. A first camera channel, e.g., camera channel 350A, is smaller than a second camera channel, e.g., camera channel 350B, which in turn is smaller than a third camera channel, e.g., camera channel 350C.
With reference to Figures 101A-101G, in certain embodiments, the smallest camera channel, e.g., camera channel 350A, has a resolution lower than that of the second camera channel, e.g., camera channel 350B, and the second camera channel has a resolution lower than that of the largest camera channel, e.g., camera channel 350C. For example, for a like portion of the field of view, the sensor array of the smallest camera channel, e.g., camera channel 350A, may be provided with fewer pixels than the sensor array of the second camera channel, e.g., camera channel 350B, and for a like portion of the field of view, the sensor array of the second camera channel may be provided with fewer pixels than the sensor array of the largest camera channel, e.g., camera channel 350C. For example, in one embodiment, for a like portion of the field of view, the second camera channel, e.g., camera channel 350B, has 44% more pixels than the smallest camera channel, e.g., camera channel 350A, and for a like portion of the field of view, the largest camera channel, e.g., camera channel 350C, has 36% more pixels than the second camera channel, e.g., camera channel 350B. It should be understood, however, that any other sizes and/or architectures may be employed.
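The 44% and 36% figures above follow from squaring per-axis sampling factors. The sketch below is illustrative only (assuming per-axis factors of 1.2 and 1.17, consistent with the pixel-size figures given elsewhere in this description); the function name is invented:

```python
# Illustrative sketch, not from the patent: for a like portion of the field
# of view, scaling the linear sampling density by a factor k in both x and y
# multiplies the pixel count by roughly k**2, so per-axis factors of 1.2 and
# 1.17 yield the ~44% and ~36% totals named in the text.

def relative_pixel_count(linear_scale: float) -> float:
    """Pixel-count multiplier for a per-axis sampling-density factor."""
    return linear_scale ** 2

b_over_a = relative_pixel_count(1.2)   # channel 350B vs. 350A
c_over_b = relative_pixel_count(1.17)  # channel 350C vs. 350B

print(f"350B has {(b_over_a - 1):.0%} more pixels than 350A")  # ~44%
print(f"350C has {(c_over_b - 1):.0%} more pixels than 350B")  # ~37%, rounded to 36% in the text
```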
With reference to Figures 102A-102G, in other embodiments, one or more of the smaller camera channels, e.g., camera channels 350A-350B, have a resolution equal to that of the larger camera channel, e.g., camera channel 350C. For example, for a like portion of the field of view, the sensor arrays of the one or more smaller camera channels, e.g., camera channels 350A-350B, may have the same number of pixels as the sensor array of the larger camera channel, e.g., camera channel 350C.
For example, in one embodiment, the pixels of the second camera channel, e.g., camera channel 350B, are 44% larger than those of the smallest camera channel, e.g., camera channel 350A (e.g., 20% larger in the x direction and 20% larger in the y direction). In another embodiment, the pixels of the largest camera channel, e.g., camera channel 350C, are 36% larger than those of the second camera channel, e.g., camera channel 350B (e.g., 17% larger in the x direction and 17% larger in the y direction). It should be understood, however, that any other sizes and/or architectures may be used.
In certain embodiments, a first camera channel, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or the red band); a second camera channel, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., blue or the blue band) different from the first color or color band; and a third camera channel, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., green or the green band) different from the first and second colors or color bands. In some such embodiments, the two smaller camera channels, e.g., camera channels 350A-350B, each have a resolution lower than that of the third camera channel, e.g., camera channel 350C. In other embodiments, the camera channels, e.g., camera channels 350A-350C, all have the same resolution.
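Where the red and blue channels run at lower resolution than the green channel, the per-channel images must at some point be combined into one full-color output. The sketch below is a hypothetical illustration of one such fusion step, using nearest-neighbor upsampling of the smaller planes to the green plane's grid; the array sizes, values, and function names are invented for illustration and are not from the patent.

```python
# Hypothetical sketch: fusing three single-color channels -- lower-resolution
# red and blue arrays and a higher-resolution green array -- into one
# full-color image by nearest-neighbor upsampling of the smaller planes.

def upsample_nn(plane, out_h, out_w):
    """Nearest-neighbor upsample of a 2-D list to (out_h, out_w)."""
    in_h, in_w = len(plane), len(plane[0])
    return [[plane[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)] for r in range(out_h)]

def fuse_rgb(red, blue, green):
    """Combine three color planes into (R, G, B) triples at green's grid."""
    h, w = len(green), len(green[0])
    r = upsample_nn(red, h, w)
    b = upsample_nn(blue, h, w)
    return [[(r[y][x], green[y][x], b[y][x]) for x in range(w)]
            for y in range(h)]

red   = [[10, 20], [30, 40]]          # 2x2 red channel (illustrative)
blue  = [[1, 2], [3, 4]]              # 2x2 blue channel (illustrative)
green = [[5] * 4 for _ in range(4)]   # 4x4 green channel (illustrative)
image = fuse_rgb(red, blue, green)
print(image[0][0])   # -> (10, 5, 1)
print(image[3][3])   # -> (40, 5, 4)
```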
In certain embodiments, the number of pixels in the sensor array of a camera channel and/or the pixel design is matched to the wavelength or wavelength band of incident light to which that camera channel is dedicated.
In certain embodiments, the sensor array size and/or the optics design (e.g., f/# and focal length) of one or more camera channels is adapted to provide the desired field of view and/or sensitivity for that camera channel.
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, one or more camera channels are dedicated to a wavelength or wavelength band, and the sensor array and/or the optics portion of each such camera channel is optimized for the respective wavelength or wavelength band to which that camera channel is dedicated. In certain embodiments, the design, operation, array size and/or pixel size of each sensor array is optimized for the respective wavelength or wavelength band to which the camera channel is dedicated. In certain embodiments, the design of each optics portion is optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated.
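One concrete way such per-wavelength optimization is sometimes reasoned about (an assumption introduced here for illustration, not a criterion stated in the patent) is matching pixel pitch to the diffraction-limited spot size, roughly 2.44 · wavelength · f-number, which shrinks for shorter wavelengths; the wavelengths and f-number below are illustrative:

```python
# Hypothetical sketch: a channel dedicated to a shorter-wavelength band can,
# in principle, justify a finer pixel pitch, because the diffraction-limited
# Airy-disk diameter ~2.44 * lambda * (f/#) is smaller for blue than for red.

def airy_diameter_um(wavelength_nm: float, f_number: float) -> float:
    """Approximate Airy-disk diameter in micrometers."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

F_NUMBER = 2.8  # illustrative f/#, not from the patent
for band, wavelength_nm in (("blue", 470), ("green", 530), ("red", 620)):
    print(band, round(airy_diameter_um(wavelength_nm, F_NUMBER), 2), "um")
```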
In certain embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements the same as or similar to one or more of those described herein (see, e.g., Figures 100B-100D). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on that integrated circuit (see, e.g., Figure 100A).
As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment disclosed herein or otherwise known to those skilled in the art, or portions thereof.
In certain embodiments, the camera channels, e.g., camera channels 350A-350C, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to its own display. The displays may or may not have the same characteristics. For example, in certain embodiments, one or more camera channels have a resolution lower than that of one or more other camera channels, and the one or more displays connected to those lower-resolution camera channels have a resolution lower than that of the one or more displays connected to the other camera channels.
Three elliptical camera channels of mutually different sizes
Figures 103A-103E are schematic representations of a digital camera apparatus 300 according to further embodiments of the present invention. In each such embodiment, the digital camera apparatus includes one or more camera channels, e.g., camera channels 350A-350C, one or more of which have an optics portion, e.g., optics portions 330A-330C, of elliptical or other non-circular shape.
In certain embodiments, one or more camera channels, e.g., camera channels 350A-350B, are smaller than a third camera channel, e.g., camera channel 350C. In some such embodiments, the one or more smaller camera channels, e.g., camera channels 350A-350B, may each have a resolution lower than that of the larger camera channel, e.g., camera channel 350C, although in such embodiments the smaller camera channels may or may not have the same resolution as one another. For example, for a like portion of the field of view, the sensor array of each smaller camera channel, e.g., camera channels 350A-350B, may have fewer pixels than the sensor array of the larger camera channel, e.g., camera channel 350C. For example, in one embodiment, for a like portion of the field of view, the larger camera channel has 44% more pixels than the one or more smaller camera channels. In another embodiment, for a like portion of the field of view, the larger camera channel has 36% more pixels than the one or more smaller camera channels. It should be understood, however, that any other sizes and/or architectures may be employed.
If one or more camera channels, e.g., camera channels 350A-350B, are smaller than a third camera channel, e.g., camera channel 350C, the one or more smaller camera channels may nonetheless have a resolution equal to that of the larger camera channel. For example, for a like portion of the field of view, the sensor arrays of the one or more smaller camera channels, e.g., camera channels 350A-350B, may have the same number of pixels as the sensor array of the larger camera channel, e.g., camera channel 350C. For example, in one embodiment, the pixels of the larger camera channel are 44% larger than those of the smaller camera channels (e.g., 20% larger in the x direction and 20% larger in the y direction). In another embodiment, the pixels of the larger camera channel are 36% larger than those of the smaller camera channels (e.g., 17% larger in the x direction and 17% larger in the y direction). It should be understood, however, that any other sizes and/or architectures may be employed.
In certain embodiments, a first camera channel, e.g., camera channel 350A, is dedicated to a first color or first color band (e.g., red or the red band); a second camera channel, e.g., camera channel 350B, is dedicated to a second color or second color band (e.g., blue or the blue band) different from the first color or color band; and a third camera channel, e.g., camera channel 350C, is dedicated to a third color or third color band (e.g., green or the green band) different from the first and second colors or color bands. In some such embodiments, the two smaller camera channels, e.g., camera channels 350A-350B, each have a resolution lower than that of the third camera channel, e.g., camera channel 350C.
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements the same as or similar to one or more of those described herein (see, e.g., Figures 103B-103E). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on that integrated circuit (see, e.g., Figure 103A).
As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment disclosed herein or otherwise known to those skilled in the art, or portions thereof.
In certain embodiments, one or more camera channels, e.g., camera channels 350A-350C, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to its own display. The displays may or may not have the same characteristics as one another. For example, if the camera channels have mutually different resolutions, the displays may also have mutually different resolutions. For example, in certain embodiments, the smaller channels, e.g., camera channels 350A-350B, have a resolution lower than that of the larger channel, e.g., camera channel 350C, and the displays connected to the smaller channels have a resolution lower than that of the display connected to the larger camera channel. In other embodiments, the two smaller camera channels, e.g., camera channels 350A-350B, are connected to a first display, and the larger camera channel, e.g., camera channel 350C, is connected to a second display. The first and second displays may or may not have the same characteristics as one another. For example, in certain embodiments, the smaller camera channels, e.g., camera channels 350A-350B, have a resolution lower than that of the larger camera channel, e.g., camera channel 350C, and the display connected to the smaller camera channels has a resolution lower than that of the display connected to the larger camera channel.
Two camera channels
Figures 104A-104E are schematic representations of a digital camera apparatus 300 according to further embodiments of the present invention. In each such embodiment, the digital camera apparatus includes two or more camera channels, e.g., camera channels 350A-350B.
In certain embodiments, a first camera channel, e.g., camera channel 350A, is dedicated to a single color or single color band (e.g., red or the red band), and a second camera channel, e.g., camera channel 350B, is dedicated to a single color or single color band (e.g., green or the green band) different from the color or color band to which the first camera channel is dedicated. In certain embodiments, one or more camera channels employ: a pixel size matched to the color optical blur of the respective camera channel; a sensor-array integration time and/or other electrical characteristics adapted to improve or optimize the performance of the respective camera channel; and/or a pixel-circuit and photodiode design/layout adapted to improve or maximize the sensitivity of the respective camera channel.
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In other embodiments, the first camera channel, e.g., camera channel 350A, is dedicated to two or more separate colors or two or more separate color bands (e.g., blue or the blue band, and red or the red band). In some such embodiments, the optics portion itself may have a color-separation capability, for example a capability similar to that provided by a color filter array (e.g., a Bayer pattern or a variant thereof). In certain embodiments, the lens and/or filter of the camera channel may transmit both colors or color bands, and the camera channel may include, elsewhere in the channel, one or more mechanisms to separate the two colors or color bands. For example, a color filter array may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor capable of separating the colors or color bands. In some of the latter embodiments, the sensor array may be provided with pixels having multi-band capability (e.g., two or three colors). For example, each pixel may include two or three photodiodes, wherein the first photodiode is adapted to detect a first color or first color band, the second photodiode is adapted to detect a second color or second color band, and the third photodiode is adapted to detect a third color or third color band. One way to accomplish this is to give the photodiodes different structures/characteristics that make them selective, such that the first photodiode is more sensitive to the first color or first color band than to the second color or second color band, and the second photodiode is more sensitive to the second color or second color band than to the first color or first color band. Another way is to position the photodiodes at different depths within the pixel, exploiting the different penetration and absorption characteristics of different colors or color bands. For example, blue and the blue band penetrate less deeply (and are therefore absorbed at a shallower depth) than green and the green band, and green and the green band penetrate less deeply (and are likewise absorbed at a shallower depth) than red and the red band. In certain embodiments, such a sensor array is used even where the pixels may see only one particular color or color band, for example with the sensor array adapted to that particular color or color band.
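The depth-dependent absorption just described can be sketched numerically. The absorption depths below are rough, order-of-magnitude figures for silicon chosen for illustration; they are not values from the patent, and the dictionary and function names are invented.

```python
# Hypothetical sketch of depth-based color separation: in silicon, shorter
# wavelengths are absorbed nearer the surface, so stacked photodiodes at
# increasing depths preferentially collect blue, then green, then red.

ABSORPTION_DEPTH_UM = {  # rough illustrative 1/e absorption depths in silicon
    "blue":  0.4,
    "green": 1.5,
    "red":   3.0,
}

def band_collected_at(depth_um: float) -> str:
    """Band whose characteristic absorption depth is closest to a photodiode's depth."""
    return min(ABSORPTION_DEPTH_UM,
               key=lambda band: abs(ABSORPTION_DEPTH_UM[band] - depth_um))

# Three stacked photodiodes in one pixel, shallow to deep:
print([band_collected_at(d) for d in (0.3, 1.4, 3.5)])  # -> ['blue', 'green', 'red']
```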
In some such embodiments, the second camera channel, e.g., camera channel 350B, is dedicated to a single color or single color band (e.g., green or the green band) different from the colors or color bands to which the first camera channel is dedicated. In other such embodiments, the second camera channel, e.g., camera channel 350B, is likewise dedicated to two or more separate colors or two or more separate color bands. For example, the first camera channel may be dedicated to red or the red band and green or the green band (e.g., G1), and the second camera channel may be dedicated to blue or the blue band and green or the green band (e.g., G2).
The two or more camera channels may or may not have the same configuration as one another (e.g., size, shape, resolution, sensitivity, or sensitivity range). For example, in certain embodiments, the camera channels all have the same size, shape, resolution and/or sensitivity or sensitivity range. In other embodiments, one or more camera channels have a size, shape, resolution and/or sensitivity or sensitivity range different from that of one or more other camera channels. For example, for a like portion of the field of view, the sensor arrays of one or more camera channels may have fewer pixels than the sensor arrays of one or more other camera channels.
In certain embodiments, the sensor array of one camera channel may have a size different from the size of the sensor array of the other camera channel. In some such embodiments, the optics portion of that camera channel may have an f/# and/or focal length different from the f/# and/or focal length of one or more other camera channels.
In certain embodiments, one or more camera channels are dedicated to a wavelength or wavelength band, and the sensor array and/or the optics portion of each such camera channel is optimized for the respective wavelength or wavelength band to which that camera channel is dedicated. In certain embodiments, the design, operation, array size and/or pixel size of each sensor array is optimized for the respective wavelength or wavelength band to which the camera channel is dedicated. In certain embodiments, the design of each optics portion is optimized for the respective wavelength or wavelength band to which the respective camera channel is dedicated.
It should be understood, however, that any other configurations may be employed.
The two or more camera channels may be arranged in any manner. In certain embodiments, the two or more camera channels are arranged in a linear array, as shown.
In certain embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements the same as or similar to one or more of those described herein (see, e.g., Figures 104C-104E). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on that integrated circuit (see, e.g., Figure 104B).
As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment disclosed herein or otherwise known to those skilled in the art, or portions thereof.
In certain embodiments, one or more camera channels, e.g., camera channels 350A-350B, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to a different display, i.e., the smaller camera channel, e.g., camera channel 350A, is connected to a first display, and the larger camera channel, e.g., camera channel 350B, is connected to a second display. The first and second displays may or may not have the same characteristics. In certain embodiments, the resolution of the first display equals the resolution of the camera channel connected to it. The resolution of the second display may equal the resolution of the camera channel connected to it, although other resolutions may also be employed.
Two channels, one smaller than the other
Figures 105A-105E are schematic representations of a digital camera apparatus 300 according to further embodiments of the present invention. In each such embodiment, the digital camera apparatus includes two or more camera channels, e.g., camera channels 350A-350B. A first camera channel, e.g., camera channel 350A, is smaller than a second camera channel, e.g., camera channel 350B.
In certain embodiments, the smaller camera channel, e.g., camera channel 350A, has a resolution lower than that of the larger camera channel, e.g., camera channel 350B. For example, for a like portion of the field of view, the sensor array of the smaller camera channel, e.g., camera channel 350A, may have fewer pixels than the sensor array of the larger camera channel, e.g., camera channel 350B. For example, in one embodiment, for a like portion of the field of view, the larger camera channel has 44% more pixels than the smaller camera channel. In another embodiment, for a like portion of the field of view, the larger camera channel has 36% more pixels than the smaller camera channel. It should be understood, however, that any other sizes and/or architectures may be employed.
In other embodiments, the smaller camera channel, e.g., camera channel 350A, has a resolution equal to that of the larger camera channel, e.g., camera channel 350B. For example, for a like portion of the field of view, the sensor array of the smaller camera channel, e.g., camera channel 350A, may have the same number of pixels as the sensor array of the larger camera channel, e.g., camera channel 350B, but smaller ones. For example, in one embodiment, the pixels of the larger camera channel are 44% larger than those of the smaller camera channel (e.g., 20% larger in the x direction and 20% larger in the y direction). In another embodiment, the pixels of the larger camera channel are 36% larger than those of the smaller camera channel (e.g., 17% larger in the x direction and 17% larger in the y direction). It should be understood, however, that any other sizes and/or architectures may be employed.
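A hypothetical back-of-the-envelope view of the equal-resolution case: with the same pixel count, the larger channel's pixels have more area, and (idealizing light gathering as proportional to pixel area, an assumption made here for illustration) correspondingly higher per-pixel signal. The 20% and 17% per-axis figures are the ones given in the text; the function name is invented.

```python
# Hypothetical sketch: with equal pixel counts, the larger channel's pixels
# have a larger pitch, and photon collection per pixel scales roughly with
# pixel area.  Treating sensitivity as proportional to area is an
# idealization, not a claim from the patent.

def area_gain(pitch_gain_x: float, pitch_gain_y: float) -> float:
    """Relative per-pixel area (and idealized light-gathering) gain."""
    return (1 + pitch_gain_x) * (1 + pitch_gain_y) - 1

print(f"{area_gain(0.20, 0.20):.0%}")  # -> 44%
print(f"{area_gain(0.17, 0.17):.0%}")  # -> 37% (the text rounds this to 36%)
```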
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel, or an ultraviolet (UV) camera channel.
In certain embodiments, the first camera channel, i.e., the smaller camera channel, is dedicated to a single color or single color band (e.g., red or the red band), and the second camera channel, i.e., the larger camera channel, is dedicated to a single color or single color band (e.g., green or the green band) different from the color or color band to which the first camera channel is dedicated.
In other embodiments, one of the camera channels, e.g., the smaller camera channel, is dedicated to two or more separate colors or two or more separate color bands (e.g., blue or the blue band, and red or the red band). In some such embodiments, the lens and/or filter of the camera channel may transmit both colors or color bands, and the camera channel may include, elsewhere in the channel, one or more mechanisms to separate the two colors or color bands. For example, a color filter array may be disposed between the lens and the sensor array, and/or the camera channel may employ a sensor capable of separating the colors or color bands. In some of the latter embodiments, the sensor may be provided with sensor elements or pixels that each include two photodiodes, wherein the first photodiode is adapted to detect a first color or first color band and the second photodiode is adapted to detect a second color or second color band. One way to accomplish this is to give the photodiodes different structures/characteristics that make them selective, such that the first photodiode is more sensitive to the first color or first color band than to the second color or second color band, and the second photodiode is more sensitive to the second color or second color band than to the first color or first color band. Another way is to position the photodiodes at different depths within the pixel, exploiting the different penetration and absorption characteristics of different colors or color bands. For example, blue and the blue band penetrate less deeply (and are therefore absorbed at a shallower depth) than green and the green band, and green and the green band penetrate less deeply (and are likewise absorbed at a shallower depth) than red and the red band.
In certain embodiments, the digital camera apparatus employs a processor disposed on the same integrated circuit as the sensor arrays. The processor may have any arrangement, including, for example, arrangements the same as or similar to those described herein (see, e.g., Figures 105C-105E). In certain embodiments, the processor may have one or more portions that are not disposed on the same integrated circuit as the sensor arrays, and/or may have no portion disposed on that integrated circuit (see, e.g., Figure 105B). As stated above, each of the embodiments described above may be employed alone or in combination with any other embodiment disclosed herein or otherwise known to those skilled in the art, or portions thereof.
To this end, in certain embodiments, the smaller camera channel is dedicated to two or more separate colors (or two or more separate color bands), and its resolution is lower than that of the larger camera channel.
In certain embodiments, the camera channels, e.g., camera channels 350A-350B, are connected to one or more displays by one or more communication links. In some such embodiments, each camera channel is connected to a different display, i.e., the smaller camera channel, e.g., camera channel 350A, is connected to a first display, and the larger camera channel, e.g., camera channel 350B, is connected to a second display different from the first display. The first and second displays may or may not have the same characteristics. In certain embodiments, the resolution of the first display equals the resolution of the camera channel connected to it. The resolution of the second display may equal the resolution of the camera channel connected to it, although other resolutions may also be employed.
Image sensor chip set
Figures 106A-106C are perspective views of a system according to another embodiment of the present invention having a plurality of digital camera apparatuses, e.g., two digital camera apparatuses. The plurality of digital camera apparatuses may be arranged in any desired manner. In certain embodiments, it may be desirable to acquire images from opposite directions. In certain embodiments, the digital camera apparatuses are mounted back to back, as shown; some such embodiments allow simultaneous imaging in opposite directions.
In certain embodiments, the one or more optics portions of the first camera subsystem face in a direction opposite to the direction faced by the one or more optics portions of the second digital camera apparatus. For example, in the illustrated embodiment, the system generally has first and second sides opposite one another. A first of the digital camera apparatuses may be positioned to receive light through the first side of the system. A second of the digital camera apparatuses may be positioned to receive light through the second side of the system. Other configurations may also be employed.
In certain embodiments, each subsystem has its own set of sensor arrays, filters and optics, and the subsystems may or may not have the same applications and/or configurations as one another. For example, in certain embodiments, one subsystem may be a color system and the other a monochrome system; one subsystem may have a first field of view and the other a different field of view; or one subsystem may provide video imaging and the other still imaging.
The plurality of digital camera subassemblies may have any size and shape and may or may not have the same configuration as one another (e.g., type, size, shape, resolution). In the illustrated embodiment, the length and width of one subsystem equal the length and width, respectively, of the other subsystem, although this is not required.
In certain embodiments, one or more sensor portions of the second digital camera apparatus are disposed on the same device (e.g., the same imaging device) as one or more sensor portions of the first digital camera apparatus. In certain embodiments, one or more sensor portions of the second digital camera apparatus are disposed on a second device (e.g., a second imaging device), which may, for example, be disposed adjacent to the imaging device on which one or more sensor portions of the first digital camera apparatus are disposed.
In certain embodiments, two or more of the digital camera apparatuses share a processor or a portion thereof. In other embodiments, each digital camera apparatus has its own dedicated processor, separate from the processors of the other digital camera apparatuses.
In certain embodiments, the system is specially packaged, although this is not required.
As with each embodiment disclosed herein, this embodiment of the present invention may be employed alone or in combination with one or more other embodiments disclosed herein, or portions thereof.
The digital camera apparatus may be assembled and/or mounted in any manner, including but not limited to manners similar to those employed in one or more of the embodiments disclosed herein.
Figures 107A-107B are schematic representations of another embodiment. This embodiment includes a plurality of imaging devices. In this embodiment, each imaging device has one or more sensor arrays for one or more camera channels. In certain embodiments, the imaging devices may or may not be similar to one another.
Although many of the figures herein show the digital camera subassembly in the form of a stacked assembly, it should be understood that the digital camera subassembly may or may not have such a configuration. Indeed, the one or more camera channels of the digital camera subassembly may have any configuration. Thus, some embodiments may take the form of a stacked assembly, and other embodiments may not.
For example, Figures 108A-108B are schematic representations of digital camera subassemblies according to further embodiments of the present invention. Each digital camera subassembly employs one or more of the embodiments described herein, or portions thereof, but may or may not take the form of a stacked assembly.
In this embodiment, the digital camera assembly comprises one or more camera passages.This camera passage can have any configuration and can or can not have mutually the same configuration.
In certain embodiments, each camera channel comprises a 2-megapixel narrow-band camera, for example a red camera channel, a blue camera channel and a green camera channel.
In certain embodiments, each camera channel comprises a 1.3-megapixel narrow-band camera, for example a red camera channel, a blue camera channel and a green camera channel.
In certain embodiments, one of the camera channels is a broadband camera channel, an infrared (IR) camera channel or an ultraviolet (UV) camera channel.
However, as noted above, any other configuration may also be employed.
As with each embodiment disclosed herein, this embodiment of the present invention may be used alone or in combination with one or more other embodiments disclosed herein, or portions thereof.
Movable support
In certain embodiments, the position of each optics portion is fixed relative to the respective sensor portion. In some alternative embodiments, however, one or more actuators may be provided to move one or more of the optics portions, or portions thereof, and/or one or more of the sensor arrays, or portions thereof. In certain embodiments, one or more such actuators are provided in a support (such a support may comprise, for example, a frame on which one or more actuators are disposed).
For example, it may be desirable to provide relative movement between an optics portion (or one or more parts thereof) and a sensor array (or one or more parts thereof), including but not limited to relative movement in the x and/or y directions, movement in the z direction, tilting, rotation (for example rotation of less than, greater than and/or equal to 360 degrees) and/or combinations thereof. Such relative movement may be used to provide any of the features and/or applications disclosed herein, including but not limited to increased resolution (for example, increased detail), zoom, 3D enhancement, image stabilization, image alignment, lens alignment, masking, image discrimination, auto focus, mechanical shutter, mechanical iris, hyperspectral imaging, snapshot mode, range finding and/or combinations thereof.
It should be noted that the inventions described and illustrated in U.S. Provisional Application 60/695,946, filed July 1, 2005, entitled "Method and Apparatus for use in Camera and Systems Employing Same", may be used in conjunction with the present invention. For the sake of brevity, those discussions will not be repeated here. It is expressly noted, however, that the entire contents of the aforementioned U.S. provisional application, including, for example, the features, attributes, alternatives, materials, techniques and/or advantages of all of its inventions/embodiments, are incorporated herein by reference.
Such movement may be provided, for example, using actuators, for example MEMS actuators, by applying appropriate control signals to one or more of the actuators so that the one or more actuators move, expand and/or contract, thereby moving the associated optics portion. It may be advantageous for the amount of movement to equal a small distance, for example 2 microns (2 μm), which may be sufficient for many applications. For example, in certain embodiments the amount of movement may be as small as one half of the width of a sensor element on one of the sensor arrays (for example, one half of the width of a pixel). In certain embodiments, the amount of movement may equal the width of one sensor element, or twice the width of one sensor element.
In certain embodiments, the relative movement takes the form of 1/3-pixel by 1/3-pixel pitch displacements in a 3 x 3 pattern. In other embodiments, the relative movement takes the form of dithering. In some dithering systems, it may be desirable to employ a reduced optical fill factor. In certain embodiments, snapshot integration is employed. Some embodiments provide the ability to read signals out while integrating.
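The 1/3-pixel by 1/3-pixel displacement scheme described above can be illustrated with a short simulation. The sketch below, in Python with NumPy, models each shifted capture as sampling a finer scene grid at a different sub-pixel offset, and then re-interleaves the nine frames onto a grid with three times the sampling density in each direction. The function names and the toy 6 x 6 scene are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def capture_dithered_frames(scene, factor=3):
    # Model the 3x3 dither: each capture samples the scene grid at a
    # different (dy, dx) sub-pixel offset, one ninth of the samples each.
    frames = {}
    for dy in range(factor):
        for dx in range(factor):
            frames[(dy, dx)] = scene[dy::factor, dx::factor]
    return frames

def interleave(frames, factor=3):
    # Resolution-enhancement step: place the nine shifted frames back
    # onto a grid with three times the sampling density in each direction.
    h, w = frames[(0, 0)].shape
    out = np.zeros((h * factor, w * factor), dtype=frames[(0, 0)].dtype)
    for (dy, dx), frame in frames.items():
        out[dy::factor, dx::factor] = frame
    return out

scene = np.arange(36, dtype=float).reshape(6, 6)  # stand-in for scene detail
frames = capture_dithered_frames(scene)
restored = interleave(frames)  # nine coarse frames recover the finer grid
```

In this idealized model the nine shifted low-resolution frames exactly tile the finer sampling grid; a real system would also have to contend with optical blur and the sensor's fill factor.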
In certain embodiments, the digital camera device employs relative movement of its own, either in place of or in combination with one or more of the embodiments disclosed herein, to provide any of the features and/or applications disclosed herein, including but not limited to increased resolution (for example, increased detail), zoom, 3D effects, image stabilization, image alignment, lens alignment, masking, image discrimination, auto focus, auto exposure, mechanical shutter, mechanical iris, hyperspectral imaging, snapshot mode, range finding and/or combinations thereof.
In addition, it should be understood that such relative movement can be used to provide any other features and/or applications now known or later developed, and, if desired, may be used with any methods and/or apparatus now known or later developed.
Figures 109A-109D are block diagrams illustrating configurations employed in some embodiments of the invention.
Although some of the figures herein show the processor separate from the sensor arrays, the processor, or portions thereof, may have any configuration and may be disposed in any one or more locations. In certain embodiments, one, some or all portions of the processor are disposed on the same substrate or substrates as one or more of the sensor arrays, for example one or more of sensor arrays 310A-310D. In certain embodiments, however, one, some or all portions of the processor are disposed on one or more substrates that are separate from (and possibly remote from) the one or more substrates on which one or more of the sensor arrays, for example one or more of sensor arrays 310A-310D, are disposed.
In certain embodiments, one or more portions of the digital camera device include circuitry that facilitates wired, wireless and/or optical communication to, from and/or within subsystems. Such circuitry may have any form. In certain embodiments, one or more portions of such circuitry may be part of processor 340, and may be disposed on the same integrated circuit as one or more other portions of processor 340, and/or may be separate, in discrete form, from processor 340 or other portions thereof.
Figure 110A is a block diagram of a processor 340 according to one embodiment of the present invention. In this embodiment, processor 340 includes one or more channel processors, one or more image pipelines and/or one or more image post processors. Each channel processor is coupled to a respective camera channel and generates an image based, at least in part, on the signals received from that camera channel. In certain embodiments, processor 340 generates a composite image based, at least in part, on the images from two or more camera channels. In certain embodiments, one or more of the channel processors are adapted to their respective camera channels, for example as described herein. For example, if one of the camera channels is dedicated to a particular wavelength or color (or band of wavelengths or band of colors), the corresponding channel processor may likewise be adapted to that wavelength or color (or band of wavelengths or band of colors). Any other embodiments described herein, or combinations thereof, may also be employed.
For example, the gain, noise reduction, dynamic range, linearity and/or any other characteristic of the processor, or combinations of such characteristics, may be adapted to improve and/or optimize the processor with respect to that wavelength or color (or band of wavelengths or band of colors). Adapting the channel processing to the respective camera channel can help produce an image of higher quality than that produced by a conventional image sensor of similar pixel count. In such embodiments, providing a dedicated channel processor for each camera channel can help reduce or simplify the amount of logic in the channel processors, because a channel processor may not need to accommodate extreme shifts in color or wavelength, for example moving from a color (or band of colors) or wavelength (or band of wavelengths) at one extreme to a color (or band of colors) or wavelength (or band of wavelengths) at another extreme.
The images generated by the channel processors (and/or data representative thereof) are provided to the image pipeline, which may combine the images to form a full-color or black/white image. The output of the image pipeline is provided to the post processor, which generates output data in accordance with one or more output formats.
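The dataflow just described, with channel processors feeding an image pipeline whose output is formatted by a post processor, can be sketched as follows. This is a hypothetical Python illustration only: the per-channel gains, the tuple-packing pipeline and all function names are assumptions made for the example, not the disclosed implementation.

```python
def channel_processor(raw, gain):
    # Per-channel processing, adapted to the channel's wavelength band
    # (reduced here to a single gain stage for illustration).
    return [gain * v for v in raw]

def image_pipeline(channel_images):
    # Color plane integration: merge the per-channel planes into one
    # full-color image, represented as a list of (r, g, b) tuples.
    return list(zip(*channel_images))

def post_processor(image, fmt="JPEG"):
    # Format the pipeline output for delivery (real encoding is elided).
    return {"format": fmt, "pixels": image}

raw = {"red": [10, 20], "green": [11, 21], "blue": [12, 22]}
gains = {"red": 1.0, "green": 2.0, "blue": 0.5}  # assumed per-channel gains
planes = [channel_processor(raw[c], gains[c]) for c in ("red", "green", "blue")]
out = post_processor(image_pipeline(planes))
```

Because each channel processor sees only its own channel's data, its processing (here, the gain) can differ per channel without any cross-channel coordination, which is the point made in the text above.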
Figure 110B shows one embodiment of a channel processor. In this embodiment, the channel processor includes column logic, analog signal logic, black level control and exposure control. The column logic is coupled to the sensor and reads the signals from the pixels. If the channel processor is coupled to a camera channel dedicated to a specific wavelength (or band of wavelengths), it may be advantageous to adapt the column logic to that wavelength (or band of wavelengths). For example, the column logic may employ an integration time or times adapted to provide a particular dynamic range in response to the wavelength (or band of wavelengths) to which the color channel is dedicated. Accordingly, it may be advantageous for the column logic in one of the channel processors to employ an integration time or times that differ from the integration time or times employed by the column logic in one or more of the other channel processors.
The analog signal logic receives the output of the column logic. If the channel processor is coupled to a camera channel dedicated to a specific wavelength or color (or band of wavelengths or band of colors), it may be advantageous to specifically adapt the analog signal logic to that wavelength or color (or band of wavelengths or band of colors). In that way, the analog signal logic can, if desired, be optimized with respect to gain, noise, dynamic range and/or linearity, etc. For example, if the channel processor is dedicated to a specific wavelength or color (or band of wavelengths or band of colors), significant shifting of the logic and settling time may not be needed, because each of the sensor elements in the camera channel is dedicated to the same wavelength or color (or band of wavelengths or band of colors). By contrast, such optimization may not be possible if the camera channel must handle all wavelengths and colors and employs, for example, a Bayer arrangement in which adjacent sensor elements are dedicated to different colors, for example red-blue, red-green or blue-green.
The output of the analog signal logic is provided to the black level logic, which determines the level of noise within the signal and filters out some or all of such noise. If the sensor coupled to the channel processor targets a narrower band of the visible spectrum than a conventional image sensor, the black level logic can be more finely tuned to eliminate noise. Here again, if the channel processor is coupled to a camera channel dedicated to a specific wavelength or color (or band of wavelengths or band of colors), it may be advantageous to specifically adapt the black level logic to that wavelength or color (or band of wavelengths or band of colors).
The output of the black level logic is provided to the exposure control, which measures the overall volume of light captured by the array and adjusts the capture time for image quality. A traditional camera makes this determination globally (for all colors). If the sensor coupled to the channel processor is dedicated to a specific color (or band of colors), the exposure control can be specifically adapted to the wavelength (or band of wavelengths) targeted by that sensor. Each channel processor can therefore provide a capture time specifically adapted to its sensor and/or the specific color (or band of colors) targeted by that sensor, and that capture time may differ from the capture times provided by one or more of the other channel processors for one or more of the other camera channels.
Figure 110C shows one embodiment of the image pipeline. In this embodiment, the image pipeline has two portions. The first portion includes a color plane integrator and an image adjustor. The color plane integrator receives the output from each of the channel processors and integrates the multiple color planes into a single color image. The output of the color plane integrator, representing the single color image, is provided to the image adjustor, which adjusts the single color image for saturation, sharpness, intensity and hue. The adjustor also adjusts the image to remove artifacts and any undesired effects related to bad pixels in the one or more color channels. The output of the image adjustor is provided to the second portion of the pipeline, which provides auto focus, zoom, windowing, pixel binning and camera functions.
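One way to picture the color plane integrator and image adjustor is the NumPy sketch below, which stacks three per-channel planes into a single color image and applies a simple intensity adjustment with clipping. The brightness scaling stands in for the adjustor's saturation/sharpness/intensity/hue processing and is an assumption of the example.

```python
import numpy as np

def integrate_color_planes(red, green, blue):
    # Color plane integrator: stack the three per-channel planes into a
    # single H x W x 3 color image.
    return np.stack([red, green, blue], axis=-1)

def adjust_image(img, brightness=1.0):
    # Stand-in for the image adjustor: scale intensity and clip to the
    # 8-bit range (saturation, sharpness and hue processing are elided).
    return np.clip(img * brightness, 0, 255).astype(np.uint8)

r = np.full((2, 2), 100, dtype=np.uint8)
g = np.full((2, 2), 150, dtype=np.uint8)
b = np.full((2, 2), 200, dtype=np.uint8)
color = adjust_image(integrate_color_planes(r, g, b), brightness=1.5)
```

Note how the blue plane saturates at the 8-bit ceiling after the brightness adjustment; a real adjustor would also correct bad pixels and artifacts before this stage.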
Figure 110D shows one embodiment of the image post processor. In this embodiment, the image post processor includes an encoder and an output interface. The encoder receives the output signal from the image pipeline and encodes the output signal in accordance with one or more standard protocols (for example, MPEG and/or JPEG).
The output of the encoder is provided to the output interface, which provides the encoded output signal in accordance with a standard output interface, for example a universal serial bus (USB) interface.
Figure 110E shows one embodiment of the system control portion. In this embodiment, the system control portion includes a serial interface, configuration registers, power management, voltage regulation and control, timing and control, and a camera control interface. In certain embodiments, the camera control interface includes an interface that processes signals in the form of high-level language (HLL) instructions. The following paragraphs describe such an embodiment of the system control. It should be understood, however, that the camera interface is not limited to such embodiments and may have any configuration. In certain embodiments, the camera interface includes an interface that processes control signals in the form of low-level language (LLL) instructions and/or control signals of any other form now known or later developed. Some embodiments may process both HLL instructions and LLL instructions.
In operation of this embodiment, communication occurs through the serial interface, which is connected to a serial port. For example, signals representing instructions (for example, HLL camera control commands), desired settings, operations and/or data are supplied through the serial port to the serial interface and control portion. If a signal does not represent an HLL camera control command (that is, an HLL instruction pertaining to the camera), the signals representing the desired settings, operations and/or data are supplied to the configuration registers for storage therein. If a signal represents an HLL camera control command, the HLL instruction is supplied to the HLL camera control interface. The HLL camera control interface decodes the instruction to generate signals representing the settings, operations and/or data desired (by the user or another device), and those signals are supplied to the configuration registers for storage therein.
The signals representing the desired settings, operations and/or data are supplied, as needed, to the power management, the sensor timing and control portion, the channel processors, the image pipeline and the image post processor. The power management portion receives the supplied signals and, in response at least thereto, provides control signals to the voltage regulation, power and control portion, which in turn is connected to circuitry in the digital camera device. The sensor timing and control portion receives the supplied signals and, in response at least thereto, provides control signals to the sensor arrays to control their operation. The channel processors receive the supplied signals (via lines), further receive one or more signals from one or more of the sensor arrays, and perform one or more channel processor operations in response at least thereto.
The image pipeline receives the signals, further receives one or more signals from one or more of the channel processors, and performs one or more image pipeline operations in response at least thereto.
The image post processor receives the signals, further receives one or more signals from the image pipeline, and performs one or more image post processor operations in response at least thereto.
Figure 110F shows an example of a high-level language camera control command according to one embodiment of the present invention. The command format has an op code, for example COMBINE, which in this case identifies the command as a camera control command of a type that requests the digital camera device to generate a composite image. The command format also has one or more operand fields, for example channel id1, channel id2, which identify, at least in part, the camera channels to be used in generating the composite image.
As used herein, the term "composite image" means an image based, at least in part, on information captured by two or more camera channels. A composite image may be generated in any manner. Exemplary camera channels include but are not limited to camera channels 350A-350D.
An example of an HLL camera control command that uses the command format of Figure 110F is "COMBINE 1,2" - this instruction requests an output image based, at least in part, on information captured by the camera channel designated "camera channel 1", for example camera channel 350A, and the camera channel designated "camera channel 2", for example camera channel 350B.
Another example is "COMBINE 1,2,3,4" - this instruction requests an output image based, at least in part, on information captured by the camera channels designated "camera channel 1" through "camera channel 4", for example camera channels 350A, 350B, 350C and 350D, respectively.
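A decoder for HLL camera control commands such as "COMBINE 1,2" could, under the command format described above, be sketched as follows. The parsing rules and the per-pixel averaging used to form the composite image are illustrative assumptions; the disclosure does not prescribe how the composite is actually computed.

```python
def decode_hll_command(text):
    # Split an HLL command such as "COMBINE 1,2" into its op code and
    # the list of camera-channel operands.
    parts = text.strip().split(None, 1)
    opcode = parts[0].upper()
    operands = [int(tok) for tok in parts[1].split(",")] if len(parts) > 1 else []
    return opcode, operands

def execute(command, channel_images):
    opcode, channels = decode_hll_command(command)
    if opcode == "COMBINE":
        # Form a composite image from the requested channels; a simple
        # per-pixel average stands in for the real combining logic.
        selected = [channel_images[c] for c in channels]
        return [sum(px) / len(selected) for px in zip(*selected)]
    raise ValueError("unknown op code: " + opcode)

channel_images = {1: [10, 20], 2: [30, 40], 3: [0, 0], 4: [4, 4]}
composite = execute("COMBINE 1,2", channel_images)
```

The point of such an interface is that "COMBINE 1,2" reads like a request in human terms, while the decode step translates it into concrete register settings and operations.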
The availability of COMBINE and other HLL instructions provides the digital camera device with instructions that are closer in form to a human language than to machine language and/or assembly language, thereby helping to make programs implemented for the digital camera device easier to write, read and/or maintain.
It will be appreciated that the invention is not limited to the COMBINE instruction and the HLL command format shown in Figure 110F; other command formats, including for example other HLL command formats, may also be used.
For example, in certain embodiments the instruction does not specify the camera channels, which are instead implied, for example, by the op code. In such embodiments, the digital camera device may, for example, be configured to automatically generate a composite image based, at least in part, on a predetermined group of camera channels whenever a COMBINE instruction is supplied. Alternatively, for example, a plurality of different COMBINE or other HLL instructions, each with a different op code, may be supported. The different op codes may explicitly identify the particular camera channels of interest. For example, the instruction "COMBINE12" may request an output image based, at least in part, on information captured by the camera channel designated "camera channel 1", for example camera channel 350A, and the camera channel designated "camera channel 2", for example camera channel 350B.
The instruction "COMBINE1234" may request an output image based, at least in part, on information captured by the camera channels designated "camera channel 1" through "camera channel 4", for example camera channels 350A, 350B, 350C and 350D, respectively.
In certain embodiments, a single COMBINE instruction causes more than one composite image to be generated. The camera channels of interest for the additional composite image or images may be implied by the op code (as described above). Alternatively, for example, the camera channels of interest for the additional composite image or images may be implied by the operands that are supplied.
Figure 110G shows high-level language instructions according to other embodiments of the present invention.
In certain embodiments, one or more of the instructions may cause the camera interface to initiate the operation indicated by the instruction. For example, if the instruction "white balance manual" is received, the camera interface may instruct the white balance control to operate in manual mode and/or initiate signals that ultimately cause the camera to operate in that mode.
Such instructions may be used, for example, to control camera settings and/or the operating state of one or more aspects of the camera that have two or more states, for example "on/off" and/or "manual/auto".
Some embodiments include one, some or all of the instructions of Figure 110G. Other embodiments may not employ any of the instructions listed in Figure 110G.
Figure 110H shows high-level language instructions according to other embodiments of the present invention.
In certain embodiments, one or more of the instructions may cause the camera interface to initiate the operation indicated by the instruction. For example, if the instruction "single frame capture" is received, the camera interface may initiate the capture of a single frame.
Some embodiments include one, some or all of the instructions of Figure 110H. Other embodiments may not employ any of the instructions listed in Figure 110H. Some embodiments may include one or more of the instructions of Figure 110G and one or more of the instructions of Figure 110H, individually and/or in combination with signals of any other form.
In certain embodiments, the camera interface may be configured to provide limited access to low-level commands, to provide specific, limited user functionality.
The signal forms used by the camera interface may be predetermined, adaptively determined and/or user-determined. For example, in certain embodiments a user may define an instruction set and/or format for the interface.
As stated above, the camera interface is not limited to embodiments employing an HLL camera control interface. The camera interface may have any configuration. In certain embodiments, the camera interface includes an interface that processes control signals in the form of low-level language (LLL) instructions and/or control signals of any other form now known or later developed. Some embodiments may process both HLL instructions and LLL instructions.
It should be understood that processor 340 is not limited to the portions and/or operations described above. For example, processor 340 may include portions of any type, or combinations thereof, and/or may perform any operations.
It should also be understood that processor 340 may be implemented in any manner. For example, processor 340 may be programmable or non-programmable, general purpose or special purpose, dedicated or non-dedicated, distributed or non-distributed, shared or not shared, and/or any combination thereof. If processor 340 has two or more distributed portions, the two or more portions may communicate over one or more communication links. A processor may include, but is not limited to, hardware, software, firmware, hardwired circuits and/or any combination thereof. In certain embodiments, one or more portions of processor 340 may be implemented in the form of one or more ASICs. Processor 340 may or may not execute one or more computer programs having one or more subroutines, or modules, each of which may include a plurality of instructions, and may or may not perform tasks in addition to those described herein. If a computer program includes more than one module, the modules may be parts of one computer program, or may be parts of separate computer programs. As used herein, the term "module" is not limited to a subroutine, but rather may include, for example, hardware, software, firmware, hardwired circuits and/or any combination thereof.
In certain embodiments, processor 340 includes circuitry that facilitates wired, wireless and/or optical communication to and/or from the digital camera device. Such circuitry may have any form. In certain embodiments, one or more portions of such circuitry are disposed on the same integrated circuit as other portions of processor 340. In certain embodiments, one or more portions of such circuitry are separate, in discrete form, from the integrated circuit on which other portions of processor 340, or portions thereof, are disposed.
In certain embodiments, processor 340 includes at least one processing unit connected to a memory system through an interconnection mechanism (for example, a data bus). The memory system may include a computer-readable and writable recording medium. The medium may or may not be non-volatile. Examples of non-volatile media include but are not limited to magnetic disk, magnetic tape, non-volatile optical media and non-volatile integrated circuits (for example, read-only memory and flash memory). A disk may be removable, for example a floppy disk, or permanent, for example a hard disk. Examples of volatile memory include but are not limited to random access memory, for example dynamic random access memory (DRAM) or static random access memory (SRAM), which may or may not be of a type that uses one or more integrated circuits to store information.
If processor 340 executes one or more computer programs, the one or more computer programs may be implemented as a computer program product tangibly embodied in a machine-readable storage medium or device for execution by a computer. Further, if processor 340 is a computer, such computer is not limited to a particular computer platform, particular processor or particular programming language. Computer programming languages may include but are not limited to procedural programming languages, object-oriented programming languages and combinations thereof.
A computer may or may not execute a program called an operating system, which may or may not control the execution of other computer programs and provide scheduling, debugging, input/output control, accounting, compilation, storage assignment, data management, communication control and/or related services. A computer may, for example, be programmable using a computer language such as C, C++, Java or another language, such as a scripting language or even assembly language. The computer system may also be specially programmed, special purpose hardware, or an application-specific integrated circuit (ASIC).
Examples of output devices include but are not limited to displays (for example, cathode ray tube (CRT) devices, liquid crystal displays (LCD), plasma displays and other video output devices), printers, communication devices such as modems, storage devices such as disk or tape, audio output, and devices that produce output on light-transmissive films or similar substrates. An output device may include one or more interfaces that facilitate communication with the output device. The interface may be of any type, for example proprietary or non-proprietary (for example, universal serial bus (USB) or micro USB), standard or custom, or any combination thereof.
Examples of input devices include but are not limited to buttons, knobs, switches, keyboards, keypads, track balls, mice, pens and tablets, light pens, touch screens, and data input devices such as audio and video capture devices. An input device may include one or more interfaces that facilitate communication with the input device. The interface may be of any type, including but not limited to proprietary or non-proprietary (for example, universal serial bus (USB) or micro USB), standard or custom, or any combination thereof.
Further, as stated above, it should be understood that the features disclosed herein can be employed in any combination.
Input signals to processor 340 may have any form and may be supplied from any source, including but not limited to one or more sources within the digital camera device (for example, a peripheral user interface on the digital camera) and/or one or more other devices. For example, in certain embodiments the peripheral user interface includes one or more input devices by which a user can indicate one or more preferences regarding one or more desired operating modes (for example, resolution, manual exposure control), and the peripheral user interface generates one or more signals representing such one or more preferences. In certain embodiments, one or more portions of processor 340 generate one or more signals representing the one or more desired operating modes. In certain embodiments, one or more portions of processor 340 generate one or more such signals in response to one or more inputs from the peripheral user interface.
In certain embodiments, one or more portions of the digital camera device include circuitry that facilitates wired, wireless and/or optical communication to, from and/or within subsystems. Such circuitry may have any form. In certain embodiments, one or more portions of such circuitry may be part of processor 340, and may be disposed on the same integrated circuit as one or more other portions of processor 340, and/or may be separate, in discrete form, from processor 340 or other portions thereof.
In certain embodiments, the digital camera device includes a memory section that holds and/or stores one, some or all of the images and/or other information generated or used by the digital camera device, and/or any other information, from any source, that is desired to be stored for any duration. The memory section may supply one or more such images and/or such other information to one or more other devices and/or to one or more portions of the processor, for example for further processing and/or for supply to one or more other devices. The memory section may, for example, be part of processor 340 and/or may be coupled to one or more portions of processor 340 through one or more communication links. In certain embodiments, the memory section is also coupled to one or more other devices through one or more communication links. In such embodiments, the memory section can supply one or more of the stored images and/or other information to one or more of the other devices directly (that is, without passing through any other portion of processor 340) over one or more of the communication links, although this is not required.
Figure 111A shows another embodiment of a channel processor. In this embodiment, the channel processor includes a double sampler, an analog-to-digital converter, a black level clamp and deviant pixel correction.
An image may be represented as a plurality of picture element (pixel) magnitudes. Each pixel magnitude represents the image intensity (relative darkness or relative lightness) at the associated image location. A relatively low pixel magnitude represents a relatively low image intensity (that is, a relatively dark location). Conversely, a relatively high pixel magnitude represents a relatively high image intensity (that is, a relatively light location). Pixel magnitudes are selected from a range that depends on the resolution of the sensor.
Figure 111B is a graphical representation of neighboring pixel magnitudes. Figure 111B also shows a plurality of defined spatial directions, namely a first defined spatial direction (for example, the horizontal direction), a second defined spatial direction (for example, the vertical direction), a third defined spatial direction (for example, a first diagonal direction) and a fourth defined spatial direction (for example, a second diagonal direction). Pixel P22 neighbors pixels P12, P21, P32 and P23. Pixel P22 is offset from pixel P32 in the horizontal direction. Pixel P22 is offset from pixel P23 in the vertical direction. Pixel P22 is offset from pixel P11 in the first diagonal direction. Pixel P22 is offset from pixel P31 in the second diagonal direction.
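The directional relationships of Figure 111B can be expressed as coordinate offsets, reading each designation Pxy as (column x, row y). The helper below is purely illustrative; the direction names are assumptions chosen to match the text.

```python
def directional_neighbors(col, row):
    # Offsets of a pixel along the four defined spatial directions,
    # consistent with the P22 examples in the text.
    return {
        "horizontal": (col + 1, row),           # P22 -> P32
        "vertical": (col, row + 1),             # P22 -> P23
        "first_diagonal": (col - 1, row - 1),   # P22 -> P11
        "second_diagonal": (col + 1, row - 1),  # P22 -> P31
    }

n = directional_neighbors(2, 2)
```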
The double sampler determines the amount by which the value of each pixel changed during the exposure period, thereby effectively providing an estimate of the amount of light received by each pixel during that period. For example, a pixel may have a first value, Vstart, before the exposure period; Vstart may or may not equal zero. The same pixel may have a second value, Vend, after the exposure period. The difference between the first value and the second value, i.e., Vstart - Vend, represents the amount of light received by that pixel.

Figure 111C is a flowchart of operations employed in this double-sampling embodiment.

The values of the pixels in the sensor array are reset to an initial state before the exposure period begins. The value of each pixel is sampled before the exposure period begins. The value of each pixel is sampled again after the exposure period, and signals representing these values are supplied to the double sampler. The double sampler generates, for each pixel, a signal representing the difference between the initial value and the final value of that pixel.

As stated above, the magnitude of each difference signal represents the amount of light received at the corresponding sensor array position. A difference signal with a relatively low magnitude indicates that a relatively low amount of light was received at the corresponding sensor array position. A difference signal with a relatively high magnitude indicates that a relatively high amount of light was received at the corresponding sensor array position.
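A minimal sketch of this double-sampling computation, assuming pixel values are simple numbers sampled before and after the exposure period (all names here are illustrative, not from the patent):

```python
def double_sample(v_start, v_end):
    """Produce one difference signal per pixel, Vstart - Vend.

    Each difference effectively estimates the amount of light the pixel
    received during the exposure period: a larger difference corresponds
    to a brighter position, and a pixel that received no light ideally
    yields a difference of zero.
    """
    return [start - end for start, end in zip(v_start, v_end)]

# Values sampled before and after the exposure period:
before = [100, 100, 100]
after = [90, 40, 100]
print(double_sample(before, after))  # [10, 60, 0]
```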
Referring again to Figure 111A, the difference signals produced by the double sampler are supplied to the analog-to-digital converter, which samples each such signal and in response produces a sequence of multi-bit digital signals, each multi-bit digital signal representing a corresponding difference signal.

The multi-bit digital signals are supplied to the black level clamp, which compensates for drift in the sensor array of the camera channel.
A difference signal should have a value of zero unless the pixel was exposed to light. However, because of sensor imperfections (e.g., leakage current), the value of a pixel may change (e.g., increase) even when it is not exposed to light. For example, a pixel may have a first value, Vstart, before the exposure period and a second value, Vend, after the exposure period. If drift is present, the second value may differ from the first even though the pixel was never exposed to light. The black level clamp compensates for such drift.

To accomplish this, in certain embodiments a permanent cover is applied over one or more portions (e.g., one or more rows and/or one or more columns) of the sensor array to prevent light from reaching those portions. The cover may be applied, for example, during manufacture of the sensor array. The difference signals of the pixels in the covered portion can be used to estimate the magnitude (and direction) of the drift in the sensor array.

In this embodiment, the black level clamp produces a reference value (representing an estimate of the drift in the sensor array) equal to the mean of the difference signals of the pixels in the covered portion. Thereafter, the black level clamp compensates for the estimated drift by producing a compensated difference signal for each pixel, where the value of each compensated difference signal equals the value of the corresponding uncompensated difference signal minus the reference value (which, as stated above, represents the estimate of the drift).
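The clamp described above can be sketched as follows, assuming the difference signals arrive as a flat list and the covered pixels are identified by index (a simplification of the covered rows/columns in the text):

```python
def black_level_clamp(diff_signals, covered_indices):
    """Compensate for sensor drift using permanently covered pixels.

    The reference value is the mean difference signal of the covered
    pixels (which would be zero absent drift); subtracting it from
    every difference signal yields the compensated signals.
    """
    reference = sum(diff_signals[i] for i in covered_indices) / len(covered_indices)
    return [d - reference for d in diff_signals]

# Covered pixels (indices 0 and 1) read 2 despite receiving no light,
# so the estimated drift (reference value) is 2.
print(black_level_clamp([2, 2, 12, 22], [0, 1]))  # [0.0, 0.0, 10.0, 20.0]
```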
The output of the black level clamp is supplied to the deviant pixel identifier, which identifies defective pixels and helps reduce their effects.

In this embodiment, a defective pixel is defined as a pixel whose value, difference signal, and/or compensated difference signal fails to satisfy one or more criteria, in which case one or more measures are taken to help reduce the effect of that pixel. For example, in this embodiment a pixel is defective if the value of its compensated difference signal falls outside a reference range (i.e., is less than a first reference value or greater than a second reference value). The reference range may be predetermined, adaptively determined, and/or any combination thereof.

If the value of a compensated difference signal falls outside the range, it is set to a value based at least in part on the compensated difference signals of one or more pixels adjacent to the defective pixel, for example the mean of the pixel offset in the positive x direction and the pixel offset in the negative x direction.

Figure 111D is a flowchart of operations employed in this embodiment of the defective pixel identifier. The value of each compensated difference signal is compared with the reference range. If the value falls outside the reference range, the pixel is deemed defective and the value of its difference signal is set as described above.
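A sketch of the deviant-pixel rule for a single row, under the assumption that out-of-range values are replaced by the mean of the two horizontal neighbors (an edge pixel reuses its single available neighbor — a detail not specified in the text):

```python
def correct_deviant_pixels(row, low, high):
    """Replace each value outside [low, high] with the mean of its
    neighbors in the positive and negative x directions."""
    corrected = list(row)
    for i, value in enumerate(row):
        if value < low or value > high:
            left = row[i - 1] if i > 0 else row[i + 1]
            right = row[i + 1] if i < len(row) - 1 else row[i - 1]
            corrected[i] = (left + right) / 2
    return corrected

print(correct_deviant_pixels([10, 200, 12], 0, 100))  # [10, 11.0, 12]
```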
Figure 111E illustrates another embodiment of the image pipeline.

In this embodiment, the image pipeline comprises an image plane integrator, image plane alignment and stitching, exposure control, focus control, zoom control, gamma correction, color correction, edge enhancement, chroma noise reduction, white balance, color enhancement, image scaling, and color space conversion.
The output of a channel processor is a data set representing a compensated version of the image captured by the camera channel. This data set may be output as a data stream. For example, the output of the channel processor of camera channel A represents a compensated version of the image captured by camera channel A and may take the form of a data stream PA1, PA2, ... PAn. The output of the channel processor of camera channel B represents a compensated version of the image captured by camera channel B and may take the form of a data stream PB1, PB2, ... PBn. The output of the channel processor of camera channel C represents a compensated version of the image captured by camera channel C and may take the form of a data stream PC1, PC2, ... PCn. The output of the channel processor of camera channel D represents a compensated version of the image captured by camera channel D and may take the form of a data stream PD1, PD2, ... PDn.

The image plane integrator receives data from each of two or more channel processors and combines the data sets into a single data set, for example PA1, PB1, PC1, PD1, PA2, PB2, PC2, PD2, PA3, PB3, PC3, PD3, ... PAn, PBn, PCn, PDn. Figure 111F illustrates an embodiment of the image plane integrator.

In this embodiment, the image plane integrator comprises a multiplexer and a multi-phase clock.

The multiplexer has a plurality of inputs in0, in1, in2, in3, each adapted to receive a stream (or sequence) of multi-bit digital signals. The data stream PA1, PA2, ... PAn is supplied to input in0 from the channel processor of camera channel A. The data stream PB1, PB2, ... PBn is supplied to input in1 from the channel processor of camera channel B. The data stream PC1, PC2, ... PCn is supplied to input in2 from the channel processor of camera channel C. The data stream PD1, PD2, ... PDn is supplied to input in3 from the channel processor of camera channel D. The multiplexer has an output that provides a multi-bit output signal out. Note that in certain embodiments the multiplexer comprises a plurality of 4-input multiplexers, each one bit wide.

The multi-phase clock has an input that receives an enable signal. The multi-phase clock has outputs c0, c1 that are supplied to select inputs s0, s1 of the multiplexer. In this embodiment the multi-phase clock has four phases, as shown in Figure 111G.
The image plane integrator operates as follows. The integrator has two states: a wait state and a multiplexing state. The selection of the operating state is controlled by the logic state of the enable signal supplied to the multi-phase clock. The multiplexing state has four phases, corresponding to the four phases of the multi-phase clock. In phase 0, clock signals c1, c0 are both deasserted, causing the multiplexer to output one of the multi-bit signals from camera channel A, e.g., PA1. In phase 1, clock signal c0 is asserted, causing the multiplexer to output one of the multi-bit signals from camera channel B, e.g., PB1. In phase 2, clock signal c1 is asserted, causing the multiplexer to output one of the multi-bit signals from camera channel C, e.g., PC1. In phase 3, clock signals c1 and c0 are both asserted, causing the multiplexer to output one of the multi-bit signals from camera channel D, e.g., PD1.

Thereafter, the clock returns to phase 0, causing the multiplexer to output another multi-bit signal from camera channel A, e.g., PA2. Then, in phase 1, the multiplexer outputs another multi-bit signal from camera channel B, e.g., PB2. In phase 2, the multiplexer outputs another multi-bit signal from camera channel C, e.g., PC2. In phase 3, the multiplexer outputs another multi-bit signal from camera channel D, e.g., PD2.

This operation is repeated until the multiplexer has output the last multi-bit signal from each camera channel, e.g., PAn, PBn, PCn, PDn.
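Functionally, the integrator's round-robin multiplexing amounts to interleaving the four channel streams; a sketch (ignoring the wait state and the clock details):

```python
def integrate_image_planes(a, b, c, d):
    """Interleave four equal-length channel streams in the order
    A, B, C, D -- one element per clock phase -- into a single stream."""
    merged = []
    for pa, pb, pc, pd in zip(a, b, c, d):
        merged.extend([pa, pb, pc, pd])
    return merged

streams = (["PA1", "PA2"], ["PB1", "PB2"], ["PC1", "PC2"], ["PD1", "PD2"])
print(integrate_image_planes(*streams))
# ['PA1', 'PB1', 'PC1', 'PD1', 'PA2', 'PB2', 'PC2', 'PD2']
```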
The output of the image plane integrator is supplied to the image plane alignment and stitching portion. The purpose of this portion is to determine how the images should be registered so that a target captured by the different camera channels is aligned at the same position in the respective images, for example to help ensure that a target captured by the different camera channels appears at the same position in each camera channel image.

The human eye is a good example of a two-channel image plane system. If you hold a pencil about one foot in front of your eyes, close the left eye, and look at the pencil with the right eye, the pencil appears at a particular position that differs from the position seen when the right eye is closed and the pencil is viewed with the left eye. This is because the brain receives only one image at a time and cannot associate that image with another image received at a different time. When both eyes are opened and you look at the pencil again (held in the same position as in the previous experiment), the brain receives the two pencil images simultaneously. In that case, the brain automatically attempts to align the two images of the same pencil, and we perceive a single pencil image ahead of us; indeed, the image becomes stereoscopic.

In the case of the digital camera apparatus, the automatic image plane registration and stitching portion determines how 2, 3, 4, 5, or more image channels should be aligned.
Figures 111H-111J are schematic representations of images produced by three camera channels, e.g., camera channels 350A, 350B, 350C, arranged in a triangular constellation according to one embodiment of the present invention and employed in an embodiment of the automatic image plane registration and stitching portion.

Each image has a plurality of pixels arranged in a plurality of rows. Specifically, the image of the first camera channel, e.g., camera channel 350A, has rows 1-n. The image of the second camera channel, e.g., camera channel 350B, has rows 1-n. The image of the third camera channel, e.g., camera channel 350C, has rows 1-n. A reference line identifies a horizontal reference point (e.g., the midpoint) in the first camera channel image. A reference line identifies a horizontal reference point (e.g., the midpoint) in the second camera channel image. A reference line identifies a horizontal reference point (e.g., the midpoint) in the third camera channel image.

An object appears in each of the three images. In this embodiment, the object appears at a different position in each image, for example because of the spatial offsets between the camera channels. For example, the object has two edges that intersect at a vertex. In the image of the first camera channel, e.g., camera channel 350A, the vertex appears in row 2, coincident with the horizontal reference point. In the image of the second camera channel, e.g., camera channel 350B, the vertex appears in row 3, to the left of the horizontal reference point. In the image of the third camera channel, e.g., camera channel 350C, the vertex appears in row 3, to the right of the horizontal reference point.
Figures 111K-111Q are schematic representations of a process carried out by the automatic image alignment portion of a system having three camera channels, according to one embodiment of the present invention. In this embodiment, the automatic image alignment performs vertical and horizontal alignment.

Vertical alignment may be performed first, although either order may be employed. This portion uses one of the images (e.g., the image of the first camera channel, such as camera channel 350A) as a reference image against which the other images are compared. The automatic image alignment portion may initially compare row 1 of the reference image with row 1 of each of the other images and determine whether those rows define similar edge features. In this example, none of the images has an edge feature in its first row, so there are no similar edge features in those rows. This portion therefore outputs the data corresponding to those rows (i.e., row 1 of each of the three images) to the image scaling portion. In the next compare operation, the automatic image alignment portion compares row 1 of the first image with row 2 of the other images. In this example, none of those rows has an edge feature, so there are no similar edge features in those rows. This portion therefore outputs the data corresponding to those rows (i.e., row 1 of the first image and row 2 of each of the other images) to the image scaling portion. In the next compare operation, the automatic image alignment portion compares row 1 of the first image with row 3 of the other images. Although row 3 of each of the second and third channel images has an edge feature, row 1 of the first channel image has no edge.

The maximum number of compare operations in which a particular row (of the reference image) is used may be selected based on the physical separation between the camera channels. In this embodiment, for example, a particular row of the reference image is used in at most three compare operations. In the subsequent compare operation, the automatic image alignment therefore uses row 2 of the first camera channel image rather than row 1 of the first channel image. In the next compare operation, the automatic image alignment portion compares row 2 of the first image with row 2 of the other images. Although row 2 of the first camera channel image has an edge, row 2 of the other camera channels has no edge. In the next compare operation, the automatic image alignment portion compares row 2 of the first image with row 3 of the other images. In this example, each of those rows has a similar edge feature. The automatic image alignment portion takes this as an indication of overlapping images (or portions thereof).

Horizontal alignment is then performed. This portion determines the magnitude and direction of the translation that should be applied to the images of the second and third channels to align the edge features in those images with the edge feature in the first camera channel image, and determines the width of the image overlap (e.g., the degree to which the images overlap in the horizontal direction).

In the next compare operation, this portion compares the next row (e.g., row 3) of the reference image with the next row (e.g., row 4) of the other images, and repeats the operations described above to determine the minimum width of the image overlap.

The images may be cropped according to the vertical overlap and the minimum horizontal overlap. The output of the automatic image alignment portion is the cropped, aligned images, which are supplied to the image scaling portion. In certain embodiments, the image scaling portion enlarges (e.g., up-samples) the cropped, aligned images to produce images of the same size as the original images.
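The row-by-row vertical search can be sketched by representing each image as one flag per row marking whether the row contains an edge feature (a simplification of the feature comparison in the text; the cap on compare operations per reference row appears as the illustrative `max_offset` parameter):

```python
def find_vertical_offset(ref_edge_rows, other_edge_rows, max_offset=3):
    """Find the row offset at which edge-bearing rows of the reference
    image line up with edge-bearing rows of another image.

    Each argument is a list of booleans, one per row, True where the row
    contains an edge feature. Returns the first offset (0..max_offset)
    at which an edge row in the reference matches an edge row in the
    other image, or None if no match is found.
    """
    for offset in range(max_offset + 1):
        for r, has_edge in enumerate(ref_edge_rows):
            other = r + offset
            if has_edge and other < len(other_edge_rows) and other_edge_rows[other]:
                return offset
    return None

# The reference image has its edge feature one row higher than the other
# image, so the vertical offset is 1.
print(find_vertical_offset([False, True, False], [False, False, True]))  # 1
```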
Some embodiments employ other alignment methods, either alone or in combination with any of the methods described herein. For example, in certain embodiments the method described above is used when objects are relatively far from the camera, and other methods are used when objects are relatively near.

Figure 111AF is a flowchart of operations that may be employed in the alignment portion according to another embodiment of the present invention. This alignment embodiment may be used, for example, for images that include one or more near objects.
In this embodiment, edges are extracted in one of the planes. A pixel neighborhood (kernel) is defined for each edge pixel. Thereafter, the kernel of each edge pixel can be matched against pixels in the other color planes, for example by translating the kernel in the direction in which the other color plane is located relative to it. One or more determinations may be made as to the degree of match between the kernel of each edge pixel and the pixels in another color plane. A matching cost function may be employed to quantify the degree of match between the kernel of each edge pixel and the pixels in another color plane. In determining the best position of each edge in the next plane, the relative positions of the edges may be checked to confirm that they still retain the same structure after translation according to the best match.

After the final positions of the edges have been set, the intervals between the edges can be mapped, for example using linear mapping and/or translation. The translation amounts may be post-processed to confirm that there are no outliers with respect to the surrounding pixels (i.e., no abrupt translations).

The initial match between the first two color planes can be used as a reference for the amount of translation expected at each pixel in the other color planes.

The operations described above may be applied, for example, between the initial color plane and all of the other color planes.
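The matching cost function is not specified in the text; a sum-of-absolute-differences cost is one common choice and is used here purely as an illustration:

```python
def sad_cost(kernel, patch):
    """Sum of absolute differences between an edge-pixel kernel and a
    candidate patch in another color plane (lower is a better match)."""
    return sum(abs(k - p) for k, p in zip(kernel, patch))

def best_shift(kernel, plane_row, max_shift):
    """Return the horizontal shift whose patch minimizes the cost."""
    size = len(kernel)
    return min(range(max_shift + 1),
               key=lambda s: sad_cost(kernel, plane_row[s:s + size]))

# The kernel [1, 5, 1] matches the other plane best at a shift of 1.
print(best_shift([1, 5, 1], [0, 1, 5, 1, 0], max_shift=2))  # 1
```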
It should be understood that the automatic image alignment portion is not limited to the embodiments described above. For example, in certain embodiments, fewer than 3 or more than 3 camera channels are aligned. Moreover, any other technique for aligning two or more images may be employed.

The alignment carried out by the automatic image alignment portion may be predetermined, processor controlled, and/or user controlled. In certain embodiments, the automatic alignment portion has the ability to align fewer than all of the camera channels (e.g., any two or more). In such embodiments, one or more signals may be supplied to the automatic image alignment portion to indicate the camera channels to be aligned, and the automatic image alignment portion may align the indicated camera channels based at least in part on such one or more signals. The one or more signals may be predetermined or adaptively determined, processor controlled, and/or user controlled.

It should be understood that automatic image alignment may not be required in every embodiment.
The output of the image plane alignment and stitching is supplied to the exposure control, the purpose of which is to help ensure that captured images are neither overexposed nor underexposed. An overexposed image is too bright; an underexposed image is too dark.

Figure 111R illustrates an embodiment of the automatic exposure control. In certain embodiments, the automatic exposure control produces a brightness value representing the brightness of the image supplied to the exposure control. The automatic exposure control compares the brightness value with one or more reference values, for example two values, the first representing a minimum desired brightness and the second representing a maximum desired brightness. The minimum and/or maximum brightness may be predetermined, processor controlled, and/or user controlled. For example, in certain embodiments the minimum desired brightness and maximum desired brightness values are supplied by the user, so that images produced by the digital camera apparatus are neither too bright nor too dark in the view of that user.

If the brightness value is between the minimum desired brightness and the maximum desired brightness (i.e., greater than or equal to the minimum desired brightness and less than or equal to the maximum desired brightness), the automatic exposure control does not change the exposure time. If the brightness value is less than the minimum desired brightness, the automatic exposure control provides a control signal that increases the exposure time until the brightness is greater than or equal to the minimum desired brightness. If the brightness value is greater than the maximum brightness value, the automatic exposure control provides a control signal that decreases the exposure time until the brightness is less than or equal to the maximum brightness value. Once the brightness value is between the minimum and maximum brightness values (i.e., greater than or equal to the minimum brightness value and less than or equal to the maximum brightness value), the automatic exposure control provides a signal that enables a capture mode, in which the user can press the capture button to initiate capture of an image, with the exposure time set such that it results in a brightness level (of the captured image) within the user's preferred range. In certain embodiments, the digital camera apparatus provides the user with the ability to adjust the exposure time manually, similar to adjusting the iris on a traditional film camera.
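The feedback loop described above can be sketched as follows; `measure_brightness` stands in for whatever produces the brightness value, and the step size and iteration cap are illustrative assumptions:

```python
def auto_expose(exposure, measure_brightness, b_min, b_max,
                step=1, max_iters=100):
    """Adjust the exposure time until brightness lies in [b_min, b_max].

    Returns the final exposure time and a flag indicating whether the
    capture mode was enabled (i.e., whether brightness converged).
    """
    for _ in range(max_iters):
        brightness = measure_brightness(exposure)
        if brightness < b_min:
            exposure += step       # too dark: lengthen the exposure
        elif brightness > b_max:
            exposure -= step       # too bright: shorten the exposure
        else:
            return exposure, True  # in range: enable capture mode
    return exposure, False

# Toy sensor model: brightness proportional to exposure time.
print(auto_expose(1, lambda e: 10 * e, b_min=40, b_max=60))  # (4, True)
```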
In certain embodiments, the digital camera apparatus employs relative movement between the optics portion (or one or more parts thereof) and the sensor array (or one or more parts thereof) to provide a mechanical iris for the automatic exposure control and/or manual exposure control. As stated above, such movement may be provided, for example, using actuators, e.g., MEMS actuators, by applying appropriate control signals to one or more of the actuators to cause them to move, expand, and/or contract and thereby move the associated optics portion.

As with each embodiment disclosed herein, the embodiments above may be employed alone or in combination with one or more of the other embodiments disclosed herein, or portions thereof.

In addition, it should be understood that the embodiments disclosed herein may also be employed in combination with one or more other methods and/or apparatus, now known or later developed.

As stated above, the inventions described and illustrated in U.S. Provisional Application 60/695,946, filed July 1, 2005, entitled "Method and Apparatus for use in Camera and Systems Employing Same", may be employed in conjunction with the present invention. For the sake of brevity, those discussions will not be repeated. It is expressly noted that the entire contents of the aforementioned U.S. Provisional Application, including, for example, the features, attributes, alternatives, materials, techniques, and/or advantages of all of the inventions/embodiments thereof, are incorporated by reference herein.
The output of the exposure control is supplied to the automatic/manual focus control portion, which helps cause objects within the field of view (e.g., the target of the image) to appear in focus. Generally, if an image is over-focused or under-focused, objects in the image appear blurry. When the lens is in focus, the image exhibits peak sharpness. In certain embodiments, the autofocus control portion detects the amount of blur in the image, for example while the digital camera apparatus is in a preview mode, and provides a control signal that moves the lens assembly back and forth until the autofocus control portion determines that the lens is in focus. Many digital cameras currently available use a mechanism of this type.

In certain embodiments, the automatic/manual focus portion is adapted to help increase the depth of focus of the digital camera apparatus. The depth of focus can be thought of as a measure of how far forward or backward an in-focus object in the field of view can move before it becomes "out of focus". The depth of focus is based, at least in part, on the lenses employed in the optics portion. Some embodiments employ one or more filters in combination with one or more algorithms to increase the depth of focus. The one or more filters may be conventional filters used to increase depth of focus and may overlie the top of the lens (on or above it), although this is not required; filters of any type and position may be employed. Similarly, the one or more algorithms may be traditional wavefront coding algorithms, although this is not required; one or more algorithms of any type may be employed. In certain embodiments, the autofocus mechanism increases the depth of focus by a factor of 10 (e.g., the depth of focus with the autofocus mechanism is 10 times the depth of focus of the lens alone, without the autofocus mechanism), thereby making the system less sensitive, or insensitive, to the position of objects in the field of view. In certain embodiments, the autofocus mechanism increases the depth of focus by a factor of 20 or more (e.g., the depth of focus with the autofocus mechanism is 20 times the depth of focus of the lens alone), thereby further reducing sensitivity to the position of objects in the field of view and/or making the system insensitive to that position.
In certain embodiments, the digital camera apparatus can provide the user with the ability to adjust the focus manually.

In certain embodiments, the digital camera apparatus employs relative movement between the optics portion (or one or more parts thereof) and the sensor array (or one or more parts thereof) to help provide automatic focus and/or manual focus. As stated above, such movement may be provided, for example, using actuators, e.g., MEMS actuators, by applying appropriate control signals to one or more of the actuators to cause them to move, expand, and/or contract and thereby move the associated optics portion. (See, for example, U.S. Provisional Application 60/695,946, filed July 1, 2005, entitled "Method and Apparatus for use in Camera and Systems Employing Same", which is again incorporated by reference.)

Automatic/manual focus is not limited to the embodiments described above. Indeed, any other type of automatic/manual focus, now known or later developed, may be employed.

Moreover, as with each embodiment disclosed herein, the embodiments above may be employed alone or in combination with one or more of the other embodiments disclosed herein, or portions thereof.

It should be understood that the embodiments disclosed herein may also be employed in combination with one or more other methods and/or apparatus, now known or later developed.

It should be understood that automatic focus and manual focus are not required. Further, the focus portion may provide automatic focus with or without the ability to provide manual focus; similarly, the focus portion may provide manual focus with or without the ability to provide automatic focus.
The output of the autofocus control is supplied to the zoom controller.

Figure 111S is a schematic block diagram of an embodiment of the zoom controller, which may, for example, help provide "optical zoom" and/or "digital zoom" capability. The optical zoom may be any type of optical zoom, now known or later developed; an example of traditional optical zoom (moving one or more lens elements back and forth) is described above. Similarly, the digital zoom may be any type of digital zoom, now known or later developed. Note that the determination of the desired zoom window may be predetermined, processor controlled, and/or user controlled.

One disadvantage of digital zoom is a phenomenon called aliasing. For example, when a television news presenter wears a striped necktie, the television image of the striped necktie sometimes contains color artifacts that do not appear on the actual necktie. Such aliasing is common when a system lacks sufficient resolution to accurately represent one or more features of an object in the field of view. In the example above, the television camera lacks sufficient resolution to accurately capture the stripe pattern on the necktie.
In certain embodiments, the digital camera apparatus employs relative movement between the optics portion (or one or more parts thereof) and the sensor array (or one or more parts thereof) to help improve resolution, thereby helping to reduce and/or minimize the aliasing that may arise from digital zoom. As stated above, such movement may be provided, for example, using actuators, e.g., MEMS actuators, by applying appropriate control signals to one or more of the actuators to cause them to move, expand, and/or contract and thereby move the associated optics portion.

For example, in certain embodiments an image is captured, and the optics portion is then moved in the x direction by a distance equal to 1/2 the width of a pixel. An image is captured with the optics in the new position. The captured images can be combined to improve the effective resolution. In certain embodiments, the optics portion is moved in the y direction rather than the x direction. In other embodiments, the optics portion is moved in both the x direction and the y direction, and an image is captured in that position. In still other embodiments, images are captured in all four positions (i.e., not moved, moved in the x direction, moved in the y direction, moved in both the x and y directions), and the images are combined to further improve resolution and further help reduce, minimize, or eliminate the aliasing produced by zoom. For example, by doubling the resolution, the image can be magnified by a factor of two without significantly increasing aliasing.

In certain embodiments, the relative movement takes the form of 1/3-pixel by 1/3-pixel pitch displacements in a 3x3 pattern. In certain embodiments, it may be desirable to employ a reduced optical fill factor. In certain embodiments, one or more sensor arrays provide enough resolution to allow the digital camera apparatus to perform digital zoom without producing excessive aliasing. For example, if an embodiment requires 640x480 pixels per image, with or without zoom, one or more sensor arrays of 1280x1024 pixels may be provided. In such embodiments, the sensor portion has enough pixels to provide the resolution required for the digital camera apparatus to zoom down to 1/4 of the image while still providing the required resolution of 640x480 pixels (e.g., 1/2 x 1280 = 640, 1/2 x 1024 = 512).
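The x-direction case can be sketched as interleaving two frames column-wise, under the assumption that the half-pixel shift makes the second frame sample the scene midway between the columns of the first:

```python
def combine_x_shifted(frame, frame_shifted):
    """Interleave a frame with a second frame captured after moving the
    optics half a pixel in x, doubling the horizontal sample count."""
    combined = []
    for row, row_shifted in zip(frame, frame_shifted):
        merged = []
        for a, b in zip(row, row_shifted):
            merged.extend([a, b])
        combined.append(merged)
    return combined

# One row of two columns per frame -> one row of four columns combined.
print(combine_x_shifted([[10, 30]], [[20, 40]]))  # [[10, 20, 30, 40]]
```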
Figures 111T-111V are schematic representations of a process carried out by the zoom portion of the digital camera apparatus according to such an embodiment of the present invention. In certain embodiments, when not in a zoom mode, this subsystem may use only 1/4 of the pixels (e.g., 1/2 x 1280 = 640, 1/2 x 1024 = 512), or may employ down-sampling to reduce the pixel count. In other such embodiments, the digital camera apparatus outputs all of the pixels, e.g., 1280x1024, even when not in a zoom mode. The determination of how many pixels to use and how many pixels to output when not in a zoom mode may be predetermined, processor controlled, and/or user controlled.
The output of the zoom controller is supplied to the gamma correction portion, which helps map the values received from the camera channels to values that more closely match the dynamic range characteristics of a display device (for example, an LCD or a cathode ray tube). The values from the camera channels are based, at least in part, on the dynamic range characteristics of the sensor, which typically do not match the dynamic range characteristics of the display device. The mapping provided by the gamma correction portion helps compensate for this mismatch.
Figure 111W is a graphical representation illustrating an example of the operation of the gamma correction portion.
Figure 111X illustrates one embodiment of the gamma correction portion. In this embodiment, the gamma correction portion employs a conventional transfer function to provide gamma correction. The transfer function can be of any type, including linear transfer functions, nonlinear transfer functions and/or combinations thereof. The transfer function can take any suitable form, including but not limited to one or more equations, a look-up table and/or combinations thereof, and can be predetermined, adaptively determined and/or a combination thereof.
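A minimal sketch of the look-up-table form of the transfer function mentioned above: 8-bit sensor values are mapped through a power-law curve toward the display's response. The gamma value of 2.2 is an assumption for illustration, not taken from the patent.

```python
import numpy as np

GAMMA = 2.2  # assumed display gamma
# Precompute the transfer function once as a 256-entry look-up table.
LUT = np.array([round(255 * (v / 255) ** (1 / GAMMA)) for v in range(256)],
               dtype=np.uint8)

def gamma_correct(channel_values):
    """Map 8-bit camera-channel values through the gamma LUT."""
    return LUT[np.asarray(channel_values, dtype=np.uint8)]
```

Precomputing the table makes the per-pixel work a single indexed load, which is why look-up tables are a common hardware-friendly form of this transfer function.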
The output of the gamma correction portion is supplied to the color correction portion, which helps map the camera output to a form that matches the color preferences of the user.
In this embodiment, the color correction portion uses a correction matrix containing a plurality of reference values (for example, a parameter set defined by the user and/or the manufacturer of the digital camera) to produce corrected color values implementing the desired color preferences:
[Rc]   [Rr Gr Br]   [R]
[Gc] = [Rg Gg Bg] x [G]
[Bc]   [Rb Gb Bb]   [B]
such that:
Rc = (Rr x R) + (Gr x G) + (Br x B)
Gc = (Rg x R) + (Gg x G) + (Bg x B)
Bc = (Rb x R) + (Gb x G) + (Bb x B)
where R, G and B are the uncorrected values and Rc, Gc and Bc are the corrected values,
and where:
Rr is a value representing the relationship between the output value from the red camera channel and the amount of red light to be emitted by the display device in response to that output value,
Gr is a value representing the relationship between the output value from the green camera channel and the amount of red light to be emitted by the display device in response to that output value,
Br is a value representing the relationship between the output value from the blue camera channel and the amount of red light to be emitted by the display device in response to that output value,
Rg is a value representing the relationship between the output value from the red camera channel and the amount of green light to be emitted by the display device in response to that output value,
Gg is a value representing the relationship between the output value from the green camera channel and the amount of green light to be emitted by the display device in response to that output value,
Bg is a value representing the relationship between the output value from the blue camera channel and the amount of green light to be emitted by the display device in response to that output value,
Rb is a value representing the relationship between the output value from the red camera channel and the amount of blue light to be emitted by the display device in response to that output value,
Gb is a value representing the relationship between the output value from the green camera channel and the amount of blue light to be emitted by the display device in response to that output value,
Bb is a value representing the relationship between the output value from the blue camera channel and the amount of blue light to be emitted by the display device in response to that output value.
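A sketch of the correction step defined above: the uncorrected R, G, B values are multiplied by the 3 x 3 matrix of reference values. The matrix entries below are assumed example numbers (identity plus mild cross-coupling), not values from the patent.

```python
import numpy as np

# Rows are (Rr, Gr, Br), (Rg, Gg, Bg), (Rb, Gb, Bb); values are illustrative.
CORRECTION = np.array([
    [ 1.2, -0.1, -0.1],
    [-0.1,  1.2, -0.1],
    [-0.1, -0.1,  1.2],
])

def color_correct(rgb):
    """Apply Rc = Rr*R + Gr*G + Br*B (and likewise for Gc, Bc)."""
    return CORRECTION @ np.asarray(rgb, dtype=float)
```

Note the rows of this example matrix sum to 1.0, so neutral gray inputs pass through unchanged, a common constraint on such matrices.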
Figure 111Y illustrates one embodiment of the color correction portion. In this embodiment, the color correction portion includes a red correction circuit, a green correction circuit and a blue correction circuit.
The red correction circuit includes three multipliers. The first multiplier receives a red value (for example, P An) and the transfer characteristic Rr, and produces a first signal representing their product. The second multiplier receives a green value (for example, P Bn) and the transfer characteristic Gr, and produces a second signal representing their product. The third multiplier receives a blue value (for example, P Cn) and the transfer characteristic Br, and produces a third signal representing their product. The first, second and third signals are supplied to an adder, which produces a sum representing the corrected red value (for example, P An corrected).
The green correction circuit includes three multipliers. The first multiplier receives a red value (for example, P An) and the transfer characteristic Rg, and produces a first signal representing their product. The second multiplier receives a green value (for example, P Bn) and the transfer characteristic Gg, and produces a second signal representing their product. The third multiplier receives a blue value (for example, P Cn) and the transfer characteristic Bg, and produces a third signal representing their product. The first, second and third signals are supplied to an adder, which produces a sum representing the corrected green value (for example, P Bn corrected).
The blue correction circuit includes three multipliers. The first multiplier receives a red value (for example, P An) and the transfer characteristic Rb, and produces a first signal representing their product. The second multiplier receives a green value (for example, P Bn) and the transfer characteristic Gb, and produces a second signal representing their product. The third multiplier receives a blue value (for example, P Cn) and the transfer characteristic Bb, and produces a third signal representing their product. The first, second and third signals are supplied to an adder, which produces a sum representing the corrected blue value (for example, P Cn corrected).
The output of the color correction portion is supplied to the edge enhancer/sharpener, which helps enhance features that may appear in the image.
Figure 111Z illustrates one embodiment of the edge enhancer/sharpener. In this embodiment, the edge enhancer/sharpener includes a high-pass filter that is applied to extract detail and edge information, and the extracted information is applied back to the original image.
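A minimal sketch of that scheme: a high-pass kernel extracts detail, and a scaled copy of the detail is added back to the original (an unsharp-mask-style sharpen). The particular Laplacian kernel and the gain of 0.5 are assumptions for illustration.

```python
import numpy as np

# Assumed 3x3 high-pass (Laplacian) kernel; sums to zero, so flat regions
# produce no detail signal.
HIGH_PASS = np.array([[ 0, -1,  0],
                      [-1,  4, -1],
                      [ 0, -1,  0]], dtype=float)

def sharpen(image, gain=0.5):
    """Add gain * (high-pass detail) back to a 2-D grayscale image."""
    h, w = image.shape
    padded = np.pad(image.astype(float), 1, mode="edge")
    detail = np.zeros((h, w), dtype=float)
    for dy in range(3):                      # correlate with the kernel
        for dx in range(3):
            detail += HIGH_PASS[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return image + gain * detail
```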
The output of the edge enhancer/sharpener is supplied to the random noise reduction portion, which reduces random noise in the image. Random noise reduction may comprise, for example, linear or nonlinear low-pass filters with adaptive, edge-preserving characteristics. Such noise reduction examines the local neighborhood of the pixel under consideration. Near an edge, low-pass filtering can be performed along the edge direction so as to prevent blurring of the edge. Some embodiments may apply an adaptive mechanism. For example, a low-pass filter (linear and/or nonlinear) with a larger neighborhood can be used in smooth regions, while near edges a low-pass filter (linear and/or nonlinear) with a smaller neighborhood can be employed so as, for example, not to blur such edges.
If desired, other random noise reduction portions can be employed, alone or in combination with one or more of the embodiments disclosed herein. In certain embodiments, random noise reduction is performed in the channel processor, for example after deviant pixel correction. Such noise reduction can replace or supplement any random noise reduction performed in the image pipeline.
The output of the random noise reduction portion is supplied to the chrominance noise reduction portion, which reduces color noise.
Figure 111AA illustrates one embodiment of the chrominance noise reduction portion. In this embodiment, the chrominance noise reduction portion includes an RGB-to-YUV converter, first and second low-pass filters, and a YUV-to-RGB converter. The output of the random noise reduction portion, a signal in the form of RGB values, is supplied to the RGB-to-YUV converter, which in response produces a sequence of YUV values, each representing a corresponding RGB value.
The Y value or component (representing the brightness of the image) is supplied to the YUV-to-RGB converter. The U and V values or components (representing the color components of the image) are supplied to the first and second low-pass filters, respectively, which reduce the color noise on the U and V components. The outputs of the filters are supplied to the YUV-to-RGB converter, which in response produces a sequence of RGB values, each representing a corresponding YUV value.
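That path can be sketched as follows, under assumptions: BT.601 full-range conversion matrices stand in for the converters, and a 3-tap box filter stands in for the two low-pass filters. Y passes through untouched; only U and V are smoothed.

```python
import numpy as np

# Assumed BT.601 full-range RGB -> YUV matrix; its inverse converts back.
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.147, -0.289,  0.436],
                    [ 0.615, -0.515, -0.100]])
YUV2RGB = np.linalg.inv(RGB2YUV)

def reduce_chroma_noise(rgb_row):
    """rgb_row: (N, 3) array of pixels along a line; returns filtered RGB."""
    yuv = rgb_row @ RGB2YUV.T
    # Low-pass U (column 1) and V (column 2); Y (column 0) is unchanged.
    for c in (1, 2):
        padded = np.pad(yuv[:, c], 1, mode="edge")
        yuv[:, c] = (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0
    return yuv @ YUV2RGB.T
```

Smoothing only the chroma planes blurs color speckle while leaving luminance detail, which the eye is far more sensitive to, intact.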
The output of the chrominance noise reduction portion is supplied to the automatic/manual white balance portion, which helps ensure that white objects appear white rather than red, green or blue.
Figure 111AB is a schematic diagram illustrating the process performed by the white balance portion in one embodiment. Specifically, Figure 111AB depicts a rectangular coordinate plane having an R/G axis and a B/G axis. The plane has three regions: a red region, a white region and a blue region. A first reference line defines the color temperature separating the red region from the white region; a second reference line defines the color temperature separating the white region from the blue region. The first reference line is set, for example, at a color temperature of 4700K; the second reference line is set, for example, at a color temperature of 7000K. In this embodiment, the automatic white balance portion determines the positions, in the coordinate plane defined by the R/G and B/G axes, of a plurality of pixels of the original image. These pixel positions are regarded as a cluster of points on the plane. The automatic white balance portion determines the center of the cluster, and a change that can be applied to the R, G, B pixel values of the original image to effectively move the cluster center into the white region of the plane, for example to a color temperature of 6500K. The output of the automatic white balance portion is an output image in which the pixel values are based on the corresponding original image pixel values with the determined change applied, so that the cluster center of the output image lies in the white region of the plane, for example at a color temperature of 6500K.
The desired color temperature can be predetermined, processor controlled and/or user controlled. For example, in certain embodiments, a reference value representing the desired color temperature is supplied by the user, so that the images provided by the digital camera apparatus have the color temperature characteristics desired by the user. In such embodiments, manual white balance can be performed by determining a change that moves the cluster center of the original image to the color temperature corresponding to the reference value supplied by the user.
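A sketch of the cluster-shift idea above: pixel positions in the R/G vs. B/G plane form a cluster, and per-channel gains move its center onto a target white point. The target of R/G = B/G = 1.0 is an assumed stand-in for the 6500K point; a real implementation would map the color-temperature curve into this plane.

```python
import numpy as np

def auto_white_balance(rgb, target=(1.0, 1.0)):
    """rgb: (N, 3) array of pixels. Returns a gain-corrected copy."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    center_rg = np.mean(r / g)          # cluster center on the R/G axis
    center_bg = np.mean(b / g)          # cluster center on the B/G axis
    out = rgb.astype(float).copy()
    out[:, 0] *= target[0] / center_rg  # red gain moves cluster along R/G
    out[:, 2] *= target[1] / center_bg  # blue gain moves cluster along B/G
    return out
```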
The white balance strategy can use one or more conventional algorithms, for example, now known or later developed.
It should be understood that the white balance portion is not limited to the techniques set forth above. Indeed, the white balance portion can employ any white balance technique now known or later developed. It should also be understood that white balance is not required. The output of the white balance portion is supplied to the automatic/manual color enhancement portion.
Figure 111AC is a block diagram of one embodiment of the color enhancement portion. In this embodiment, the color enhancement portion adjusts brightness, contrast and/or saturation to enhance the color rendering according to one or more enhancement strategies. The process is in some respects similar to adjusting the color settings of a television set or computer monitor. Some embodiments can also adjust hue. The enhancement strategy can use one or more conventional color enhancement algorithms, for example, now known or later developed.
Referring to Figure 111AC, data representing the image is supplied to the brightness enhancement portion, which further receives an adjustment value and, in accordance with that value, produces output data representing a brightness-adjusted image. In this embodiment, each pixel value in the output image equals the sum of the adjustment value and the corresponding pixel in the input image. The adjustment value can be predetermined, processor controlled and/or user controlled. For example, in certain embodiments, the adjustment value is supplied by the user, so that the images provided by the digital camera apparatus have the characteristics desired by the user. In certain embodiments, an adjustment value of positive magnitude makes the output image appear brighter than the input image, and an adjustment value of negative magnitude makes the output image appear darker.
The output of the brightness enhancement portion is supplied to the contrast enhancement portion, which further receives an adjustment value and, in accordance with that value, produces a contrast-adjusted output image. In this embodiment, the contrast adjustment can be viewed as "stretching" the distance between dark (e.g., represented by pixel values of small magnitude) and bright (e.g., represented by pixel values of large magnitude). An adjustment value of positive magnitude makes dark regions of the input image appear darker in the output image and bright regions appear brighter; an adjustment value of negative magnitude can have the opposite effect. One or more conventional algorithms, for example, now known or later developed, can be employed. The adjustment value can be predetermined, processor controlled and/or user controlled. For example, in certain embodiments, the adjustment value is supplied by the user, so that the images provided by the digital camera apparatus have the characteristics desired by the user.
The output of the contrast enhancement portion is supplied to the saturation enhancement portion, which further receives an adjustment value and, in accordance with that value, produces a saturation-adjusted output image. In this embodiment, the saturation adjustment can be viewed as "stretching" the distance between the R, G and B components of a pixel (similar in some respects to the contrast adjustment). An adjustment value of positive magnitude makes the colors of the input image appear more saturated in the output image; an adjustment value of negative magnitude can have the opposite effect. One or more conventional algorithms, for example, now known or later developed, can be employed. The technique can use, for example, a color correction matrix similar to that employed by the color correction portion described above. The adjustment value can be predetermined, processor controlled and/or user controlled. For example, in certain embodiments, the adjustment value is supplied by the user, so that the images provided by the digital camera apparatus have the characteristics desired by the user.
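The three stages can be sketched together. Assumptions: 8-bit data, a contrast midpoint of 128, saturation stretched about each pixel's component mean, and a final clip to [0, 255]; none of these specifics come from the patent.

```python
import numpy as np

def enhance(rgb, brightness=0.0, contrast=1.0, saturation=1.0):
    """Apply brightness, contrast, then saturation to float RGB pixels."""
    out = np.asarray(rgb, dtype=float)
    out = out + brightness                      # positive -> brighter
    out = (out - 128.0) * contrast + 128.0      # >1 stretches dark vs. bright
    mean = out.mean(axis=-1, keepdims=True)
    out = (out - mean) * saturation + mean      # >1 stretches R, G, B apart
    return np.clip(out, 0.0, 255.0)
```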
It should be understood that the color enhancement portion is not limited to the enhancement techniques described above. Indeed, the color enhancement portion can employ any enhancement technique now known or later developed. It should also be understood that color enhancement is not required.
The output of the automatic/manual color enhancement portion is supplied to the image scaling portion, which is used to adjust the image size, for example shrinking or enlarging the image by removing or adding pixels.
The image scaling portion receives data representing the image to be scaled (e.g., enlarged or reduced). The amount of scaling can be predetermined or default, processor controlled, or manually operated. In certain embodiments, a signal (if present) representing the desired amount of scaling is received. If the signal representing the desired amount of scaling indicates that the image should be enlarged, the scaling portion performs up-scaling. If the signal indicates that the image should be reduced, the scaling portion performs down-scaling.
Figures 111AD and 111AE are, respectively, a block diagram and an explanatory diagram of up-scaling according to one embodiment. Specifically, Figure 111AE depicts an image portion to be enlarged and the image portion formed therefrom. In this example, the image portion to be enlarged comprises nine pixels, denoted P11-P33 for purposes of explanation, shown arranged in an array having three rows and three columns. The image portion formed therefrom comprises twenty-five pixels, denoted A-Y for purposes of explanation, shown arranged in an array having five rows and five columns. (Note that the image portion formed could alternatively be denoted P11-P55.)
In this embodiment, the image scaling portion employs an up-scaling strategy in which the pixel values at the intersections of odd columns and odd rows, namely A, C, E, K, M, O, U, W and Y, are taken from the pixel values of the image portion to be enlarged. For example:
A = P11
C = P21
E = P31
K = P12
M = P22
O = P32
U = P13
W = P23
Y = P33
The other pixel values, i.e., the pixel values located in an even column or an even row, namely B, D, F, G, H, I, J, L, N, P, Q, R, S, T, V and X, are produced by interpolation. Each such pixel value is produced based on two or more adjacent pixel values, for example:
B=(A+C)/2
D=(C+E)/2
F=(A+K)/2
H=(C+M)/2
J=(E+O)/2
L=(K+M)/2
N=(M+O)/2
P=(K+U)/2
R=(M+W)/2
T=(O+Y)/2
V=(U+W)/2
X=(W+Y)/2
G=(B+L)/2
I=(D+N)/2
Q=(L+V)/2
S=(N+X)/2
In certain embodiments, up-scaling increases the pixel count from 640 x 480 to 1280 x 1024, although up-scaling of any amount can be employed. In certain embodiments, the digital camera apparatus provides the user with the ability to determine whether to perform up-scaling and, if so, the amount of up-scaling.
In certain embodiments, the scaling portion employs one or more of the techniques described herein for the zoom controller, with or without cropping.
It should be understood that the scaling portion is not limited to the up-scaling strategy described above. Indeed, the scaling portion can employ any up-scaling technique now known or later developed. It should also be understood that up-scaling is not required.
The scaling portion can have the ability to perform down-scaling, whether or not it has the ability to perform up-scaling. In certain embodiments, down-scaling reduces the pixel count from 1280 x 1024 to 640 x 480, although down-scaling of any amount can be employed. In certain embodiments, the digital camera apparatus provides the user with the ability to determine whether to perform down-scaling and, if so, the amount of down-scaling.
It should be understood that any down-scaling technique now known or later developed can be employed. It should also be understood that down-scaling is not required.
The output of the image scaling portion is supplied to the color space conversion portion, which is used to convert the color format from RGB to YCrCb or YUV for compression. In this embodiment, the conversion is accomplished using the following equations:
Y=(0.257*R)+(0.504*G)+(0.098*B)+16
Cr=V=(0.439*R)-(0.368*G)-(0.071*B)+128
Cb=U=-(0.148*R)-(0.291*G)+(0.439*B)+128
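The three equations above (the ITU-R BT.601 "studio swing" RGB to YCbCr conversion) transcribe directly:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert one 8-bit RGB pixel using the equations from the text."""
    y  = (0.257 * r) + (0.504 * g) + (0.098 * b) + 16
    cr = (0.439 * r) - (0.368 * g) - (0.071 * b) + 128   # Cr = V
    cb = -(0.148 * r) - (0.291 * g) + (0.439 * b) + 128  # Cb = U
    return y, cb, cr
```

Note the studio-swing offsets: black maps to Y = 16 rather than 0, and neutral colors map to Cb = Cr = 128.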
The output of the color space conversion portion is supplied to the image compression portion of the post-processor. Image compression is used to reduce the size of the image file. This can be accomplished, for example, using off-the-shelf JPEG, MPEG and/or WMV compression algorithms provided by the Joint Photographic Experts Group, the Moving Picture Experts Group and Microsoft, respectively.
The output of the image compression portion is supplied to the image transport formatter, which is used to format the image data stream, over a parallel or serial 8-16 bit interface, in conformance with formats such as YUV422 and RGB565.
Figure 112 illustrates another embodiment of the channel processor. In this embodiment, the two samplers receive the output of the analog-to-digital converter rather than the output of the sensor array.
Figures 113 and 114A illustrate another embodiment of the channel processor and the image pipeline, respectively. In this embodiment, the deviant pixel corrector is located in the image pipeline rather than in the channel processor, and receives the output of the image plane alignment and stitching portion rather than the output of the black level clamp.
Figure 114B is a block diagram of an image pipeline according to another embodiment of the present invention.
Figure 114C is a schematic block diagram of a chrominance noise reduction portion that can be employed, for example, in the image pipeline of Figure 114B. In this embodiment, the U and V values or components (representing the color components of the image) are supplied to first and second low-pass filters, respectively, which reduce the color noise on the U and V components.
It should be understood that the channel processor, the image pipeline and/or the post-processor can have any configuration. For example, in other embodiments, the image pipeline employs fewer than all of the portions shown in Figures 110C, 110E and/or 114A, with or without other portions now known or later developed, and in any order.
Parallax
If the digital camera apparatus has more than one camera channel, the camera channels must be spatially offset from one another (even if only by a small distance). This spatial offset can introduce parallax between the camera channels, i.e., an apparent change in the position of an object caused by the change in the position from which the object is observed.
Figures 115A-115E illustrate an example of parallax in the digital camera apparatus. Specifically, Figure 115A shows an object (a lightning bolt) and a digital camera apparatus having two camera channels that are spatially offset from one another by a distance. The first camera channel has a sensor and a first field of view centered about a first axis. The second camera channel has a sensor and a second field of view centered about a second axis and spatially offset from the first field of view. The offset between the two fields of view causes the position of the object in the first field of view to differ from the position of the object in the second field of view.
Figure 115B is a representation of the image of the object as viewed by the first camera channel and striking the sensor of the first camera channel. The sensor has a plurality of sensor elements, shown schematically as circles.
Figure 115C is a representation of the image of the object as viewed by the second camera channel and striking the sensor of the second camera channel. The sensor has a plurality of sensor elements, shown schematically as circles.
Figure 115D shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel. In this example, the parallax is in the x direction.
Figure 115E shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel with the parallax eliminated.
Figures 115F-115H illustrate an example of parallax in the y direction. Figure 115I shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel with the parallax eliminated.
Figures 115J-115L illustrate an example of parallax having both an x component and a y component. Figure 115M shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel with the parallax eliminated.
Figure 115N illustrates an object (a lightning bolt) and a digital camera apparatus having two camera channels that are spatially offset from one another by a distance. The first camera channel has a sensor and a first field of view centered about a first axis. The second camera channel has a sensor and a second field of view centered about a second axis and spatially offset from the first field of view. The offset between the fields of view causes the position of the object in the first field of view to differ from the position of the object in the second field of view.
Figure 115O is a representation of the image of the object as viewed by the first camera channel and striking the sensor of the first camera channel. The sensor has a plurality of sensor elements, shown schematically as circles.
Figure 115P is a representation of the image of the object as viewed by the second camera channel and striking the sensor of the second camera channel. The sensor has a plurality of sensor elements, shown schematically as circles.
Figure 115Q shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel. In this example, the parallax is in the x direction.
Figure 115R shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel with the parallax eliminated.
Figure 115S illustrates an object (a lightning bolt) and a digital camera apparatus having two camera channels that are spatially offset from one another by a distance. The first camera channel has a sensor and a first field of view centered about a first axis. The second camera channel has a sensor and a second field of view centered about a second axis and spatially offset from the first field of view. The offset between the fields of view causes the position of the object in the first field of view to differ from the position of the object in the second field of view.
Figure 115R is a representation of the image of the object as viewed by the first camera channel and striking the sensor of the first camera channel. The sensor has a plurality of sensor elements, shown schematically as circles.
Figure 115P is a representation of the image of the object as viewed by the second camera channel and striking the sensor of the second camera channel. The sensor has a plurality of sensor elements, shown schematically as circles.
Figure 115Q shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel. In this example, the parallax is in the x direction.
Figure 115R shows the superposition of the image viewed by the first camera channel and the image viewed by the second camera channel with the parallax eliminated.
Range finding
In certain embodiments, it may be desirable to be able to estimate the distance to an object in the field of view. This capability is sometimes referred to as "range finding".
One method of estimating the distance to an object is to employ parallax.
Figure 116 is a flow chart of operations that can be employed in estimating the distance to an object, or a portion thereof, according to another embodiment of the present invention.
The system receives one or more signals indicating the required amount of parallax and/or one or more movements.
The system identifies one or more movements to provide, or help provide, the required amount of parallax.
The system initiates one, some or all of the one or more movements.
An image is captured from each camera channel to be used in estimating the distance to the object (or portion thereof). For example, if two camera channels are used in the estimation, an image is captured from the first camera channel and an image is captured from the second camera channel.
In certain embodiments, the system receives one or more signals indicating the position of the object in each image, or determines the position of the object in each image. For example, if two camera channels are used in estimating the distance to the object, the system can receive one or more signals indicating the position of the object in the image from the first camera channel and the position of the object in the image from the second camera channel. In other embodiments, the system determines the position of the object in each image, for example the position of the object in the image from the first camera channel and the position of the object in the image from the second camera channel.
The system produces a signal representing the difference between the positions in the images. For example, if two camera channels are employed, the system produces a signal representing the difference between the position of the object in the image from the first camera channel and the position of the object in the image from the second camera channel.
The system estimates the distance to the object (or portion thereof) based, at least in part, on: (1) the signal representing the difference between the position of the object in the image from the first camera channel and the position of the object in the image from the second camera channel; (2) a signal representing the relative positioning of the first camera channel and the second camera channel; and (3) data relating (a) the difference between the position of the object in the image from the first camera channel and the position of the object in the image from the second camera channel, and (b) the relative positioning of the first camera channel and the second camera channel, to (c) the distance to the object.
Figure 117 is a block diagram of a portion of one embodiment of a range finding apparatus. In this embodiment, the range finding apparatus includes a differencer and an estimator. The differencer has one or more inputs that receive one or more signals indicating the position of the object in the first image and the position of the object in the second image. The differencer also has one or more outputs that provide a difference signal, Difference, representing the difference between the position of the object in the first image and the position of the object in the second image.
The difference signal is supplied to the estimator, which also receives a signal indicating the relative positioning of the camera channel that provided the first image and the camera channel that provided the second image. In response to these signals, the estimator provides an output signal, Estimate, representing an estimate of the distance to the object (or portion thereof).
To accomplish this, the estimator includes data representing the relationship between (a) the difference between the position of the object in the first image and the position of the object in the second image, (b) the relative positioning of the camera channel that produced the first image and the camera channel that produced the second image, and (c) the distance to the object.
The data can take any form, including but not limited to a mapping of the relationship between the inputs (e.g., (a) the difference between the position of the object in the first image and the position of the object in the second image, and (b) the relative positioning of the camera channel that produced the first image and the camera channel that produced the second image) and the output (the estimate of the distance to the object).
The mapping can have any of various forms known to those skilled in the art, including but not limited to a formula and/or a look-up table, and can be implemented in hardware, software, firmware or any combination thereof.
The mapping is preferably generated by placing an object at a known distance from the digital camera apparatus, capturing two or more images with two or more camera channels having a known relative positioning, and determining the difference between the position of the object in the image from the first camera channel and the position of the object in the image from the second camera channel.
This process may be repeated to cover various combinations of known object distance and camera channel relative positioning. It may be advantageous to cover the entire range of interest (of known distances and relative positionings), although, as described below, it is usually unnecessary to cover every conceivable combination. Each combination of known object distance, camera channel relative positioning, and difference between the position of the object in the image from the first camera channel and its position in the image from the second camera channel represents one data point in the overall input/output relation.
The data points may be used to build a look-up table that provides the associated output for each of multiple combinations of input values. Alternatively, the data points may be supplied to a statistical package to generate a formula that computes the output from the inputs. Such a formula can generally provide a suitable output for any input combination within the sensor input range of interest, including combinations for which no data point was generated.
A look-up table embodiment may employ interpolation to determine a suitable output for input combinations not present in the table.
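A minimal sketch of such a look-up table with linear interpolation between calibration points; the disparity/distance pairs below are invented for illustration, not measured data:

```python
import bisect

# Hypothetical calibration data points: (disparity in pixels, distance in meters),
# sorted by disparity. Each pair is one data point of the input/output relation.
TABLE = [(2.0, 5.0), (5.0, 2.0), (10.0, 1.0), (20.0, 0.5)]

def lookup_distance(disparity):
    """Return the distance estimate for a disparity, interpolating
    linearly between table entries and clamping at the table edges."""
    xs = [d for d, _ in TABLE]
    ys = [z for _, z in TABLE]
    if disparity <= xs[0]:
        return ys[0]
    if disparity >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_left(xs, disparity)
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    t = (disparity - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

For example, a disparity of 7.5 pixels, which has no table entry, interpolates halfway between the 5-pixel and 10-pixel entries.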
The differencer may be any type of differencer adapted to provide one or more difference signals representing the difference between the position of the object in the first image and the position of the object in the second image. For example, in this embodiment the differencer comprises an absolute-value subtracter that produces a difference signal equal to the absolute value of the difference between the position of the object in the first image and the position of the object in the second image. In other embodiments, the differencer may be a ratiometric differencer that produces a ratiometric difference signal representing that difference.
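The two differencer styles mentioned here can be sketched as follows; this is a simplification in which positions are scalar pixel coordinates, and the ratiometric normalization chosen (difference over sum) is one common convention, not necessarily the embodiment's:

```python
def absolute_difference(pos1, pos2):
    """Absolute-value subtracter: output equals |pos1 - pos2|."""
    return abs(pos1 - pos2)

def ratiometric_difference(pos1, pos2):
    """Ratiometric differencer: the difference normalized by the sum,
    giving a dimensionless signal in the range (-1, 1)."""
    return (pos1 - pos2) / (pos1 + pos2)
```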
The signal representing the relative position of the camera channels may have any form. For example, the signal may take the form of a single signal that directly represents the positional difference between the camera channels. The signal may also take the form of multiple signals, for example two or more signals that each represent the position of a respective camera channel, so that the multiple signals indirectly indicate the relative position of the camera channels.
Although the distance measuring apparatus is illustrated with the differencer positioned before the estimator, the distance measuring apparatus is not limited to this arrangement. For example, the differencer may be embodied within the estimator, and/or the difference signal may be provided or produced in some other manner. In certain embodiments, the estimator may respond to absolute quantities rather than to a difference signal.
In addition, although the disclosed embodiment includes three inputs and one output, the distance measuring apparatus is not limited thereto; a distance measuring apparatus having any number of inputs and outputs may be employed.
Ranging may also be carried out using only one camera channel. For example, one of the camera channels may be positioned with a first view of the object and an image may be captured. Thereafter, one or more movements may be applied to one or more portions of that camera channel so as to provide the camera channel with a second view of the object (the second view being different from the first view); such movement may be provided by a positioning system. A second image may then be captured with the second view of the object. Thereafter, the first and second images may be processed by the distance measuring apparatus, using the operations described above, to produce an estimate of the distance to the object (or a portion thereof).
Figure 118 is a block diagram of the positioner portion of one embodiment of the distance measuring apparatus.
3D imaging
Referring to Figures 119A-119D, in certain embodiments it may be desirable to generate images that provide one or more 3D effects, sometimes referred to as "3D imaging".
One type of 3D imaging is called stereoscopic vision. Stereoscopic vision is based, at least in part, on the ability to provide two views of an object (for example, one for the right eye and one for the left eye). In certain embodiments, the views are combined into a single stereo image. For example, in one embodiment the view for the right eye may be blue and the view for the left eye may be red, in which case a person wearing suitable glasses (for example, a blue lens before the left eye and a red lens before the right eye) will see the appropriate view in each eye (i.e., the right view in the right eye and the left view in the left eye). In another embodiment, the view for the right eye may be polarized in a first direction and the view for the left eye may be polarized in a second direction different from the first direction, in which case a person wearing suitable glasses (for example, a lens polarized in the second direction before the left eye and a lens polarized in the first direction before the right eye) will see the appropriate view in each eye (i.e., the right view in the right eye and the left view in the left eye).
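As a toy illustration of combining the two views into a single stereo (anaglyph-style) image, the sketch below takes the red channel from the left-eye view and the green/blue channels from the right-eye view; images are nested lists of (R, G, B) tuples, and this channel assignment is one common convention, not necessarily the embodiment's:

```python
def make_anaglyph(left_rgb, right_rgb):
    """Combine a left-eye view and a right-eye view into one anaglyph:
    red channel from the left view, green and blue from the right view,
    per pixel. Both images must have the same dimensions."""
    out = []
    for row_l, row_r in zip(left_rgb, right_rgb):
        out.append([(r_l, g_r, b_r)
                    for (r_l, _, _), (_, g_r, b_r) in zip(row_l, row_r)])
    return out
```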
Referring to Figure 120, another type of 3D imaging is called 3D graphics, which is based at least in part on the ability to give an image an appearance of depth.
It may be desirable to employ parallax when generating images intended to provide a 3D effect.
Increasing the amount of parallax can help improve one or more characteristics of the 3D imaging.
Figures 121A-121B are flowcharts of operations that may be employed in providing 3D imaging according to another embodiment of the present invention.
The system receives one or more signals representing a desired amount of parallax and/or movement.
The system identifies one or more movements to provide, or to help provide, the desired amount of parallax.
The system initiates one, some, or all of the one or more identified movements.
The system generates one or more images having the desired 3D effect.
An image is captured from each camera channel to be used in the 3D imaging. For example, if two camera channels are used in the 3D imaging, an image is captured from the first camera channel and an image is captured from the second camera channel.
The system determines whether stereoscopic vision or 3D graphics is desired. If stereoscopic vision is desired, the image captured from the first camera channel and the image captured from the second camera channel are both supplied to a formatter, which produces two images, one suitable for presentation to one eye and the other suitable for presentation to the other eye. For example, in one embodiment the view for the right eye may be blue and the view for the left eye may be red, in which case a person wearing suitable glasses will see the appropriate view in each eye (i.e., the right view in the right eye and the left view in the left eye). In another embodiment, the view for the right eye may be polarized in a first direction and the view for the left eye may be polarized in a second direction different from the first direction, in which case a person wearing suitable glasses will see the appropriate view in each eye.
The two images may be combined into a single stereo image.
If 3D graphics rather than stereoscopic vision is desired, the system characterizes the images using one or more characterization criteria. For example, in one embodiment the characterization criteria include identifying one or more features (e.g., edges) in the image and estimating the distance to one or more portions of those features. The distance measuring apparatus described above may be employed to estimate the distance to a feature or a portion thereof. The system then generates a 3D graphics image having an appearance of depth, based at least in part on (1) the characterization data and (2) a 3D rendering criterion.
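One way to picture turning per-feature distance estimates into an appearance of depth is a simple shading ramp: nearer features are rendered brighter. This sketch is purely illustrative — the range limits and the linear mapping are assumptions, not the patent's 3D rendering criterion:

```python
def depth_shade(distances, near, far):
    """Map estimated feature distances (same units as near/far) to
    8-bit intensities: distance == near -> 255, distance == far -> 0,
    linearly in between, clamped at the range limits."""
    shades = []
    for z in distances:
        z = min(max(z, near), far)      # clamp to [near, far]
        shades.append(round(255 * (far - z) / (far - near)))
    return shades
```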
The characterization criteria and the 3D graphics criteria may be predetermined, adaptively determined, or a combination thereof.
It should be understood that 3D imaging may also be carried out using only one camera channel. For example, one of the camera channels may be positioned with a first view of the object and an image may be captured. Thereafter, one or more movements may be applied to one or more portions of that camera channel so as to provide the camera channel with a second view of the object (the second view being different from the first view); such movement may be provided by a positioning system. A second image may then be captured with the second view of the object. Thereafter, the first and second images may be processed, using the operations described above, to produce an image having the desired 3D effect.
Figure 123 is a block diagram representation of one embodiment for generating an image having a 3D effect.
Figure 124 is a block diagram representation of another embodiment for generating an image having a 3D effect.
Image recognition
Figure 125 is a flowchart of operations that may be employed in providing image recognition according to another embodiment of the present invention.
Figures 126A-126B are flowcharts of operations that may be employed in providing image recognition according to another embodiment of the present invention.
Other applications
In certain embodiments, the number, size, and/or type of image sensors can be selected based on the requirements of the application. Three examples are described below to illustrate how such requirements influence the camera components, and how features/operation may be optimized if desired. It should be understood that any of the foregoing embodiments, or portions thereof, may be employed in implementing any of the following examples.
#1: Hyperspectral digital camera with simultaneous imaging
Hyperspectral imagers employ data in up to 100 or more discrete color bands. This can be accomplished with electrically tunable or mechanically selected narrow-band filters. One problem with such tunable or selectable filter methods is that the color bands in the image are selected sequentially in time: acquiring a complete hyperspectral image (called a data volume), with 3D pixel identifiers x, y, and color, requires an expensive sequence of data frames. In many system applications the entire hyperspectral data volume must be obtained simultaneously, in a single data frame.
The disclosed multiple optics/imager approach can be used to obtain all color bands simultaneously, using a separate narrow band-pass color filter in each sensor optical path. An example is 64 separate sensors (each with a customized optical assembly, an optional MEMS mechanical dithering mechanism, and an optimized monochrome or multi-color image sensor) arranged, for example, as 8 x 8 or 1 x 64 or in some other sensor arrangement. This would provide a hyperspectral capability of 64 separate color bands. Each sensor has a suitable number of image sensor pixels to cover the required field of view (for example, 256 x 256 pixels on the imager array with a 3 um pixel pitch).
Each image sensor can have a different pixel pitch and/or array size, with the imager integrated circuit (IC) optimized for the incident color or color band. If the frame rate of each sensor is 60 frames per second, data in all 64 separate color bands can be obtained in one frame time (16.67 ms). For many hyperspectral imaging applications, this or a similar capability is desirable.
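The timing claim above can be checked with simple arithmetic: because all bands are captured in parallel, the whole data volume arrives in one frame time, whereas a sequential tunable-filter design would need one frame per band. A sketch (band count and frame rate taken from the example):

```python
def hyperspectral_frame(bands=64, fps=60):
    """Return (parallel acquisition time, sequential acquisition time)
    in milliseconds for a given band count and per-sensor frame rate."""
    frame_time_ms = 1000.0 / fps            # one frame: ~16.67 ms
    sequential_time_ms = bands * frame_time_ms  # tunable-filter approach
    return frame_time_ms, sequential_time_ms
```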
#2: Multi-color active digital camera for object (threat) detection and identification
Some camera systems acquire data over a wide field of view (WFOV) to detect objects of interest, and then quickly frame the object in a narrow field of view (NFOV) with multi-color imaging capability and higher spatial resolution, in order to identify the object.
The WFOV sensor can, for example, have a 128 x 128 array size with a 20 um pixel pitch, to locate an object of interest within one or more pixels of the WFOV. The pixels and optics of the 128 x 128 array can be broadband visible, capable of high sensitivity.
The entire digital camera (with the WFOV image sensor and multiple NFOV image sensors) can be aimed by a gimbal mechanism. Data from the WFOV image sensor can be used to adjust the gimbal pointing direction so that the detected object is at the center of all fields of view.
The NFOV image sensors, whose resolution is higher than that of the WFOV image sensor, can image and identify the object. Meanwhile, the WFOV image sensor can continue WFOV imaging to detect other objects of interest.
There can be multiple NFOV image sensors. The NFOV can be selected by pixel size, number of pixels (in the x and y directions), and focal length of the optics. For example, the camera may include six NFOV image sensors, each imaging a region 1/100 the area of that of the WFOV image sensor. If the focal lengths of the WFOV and NFOV optics are the same, then, for example, a 128 x 128 pixel array with a 2.0 um pixel pitch can provide the desired NFOV.
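The 2.0 um pitch in this example follows from the geometry: with equal focal lengths, an imaged area 1/100 that of the WFOV corresponds to 1/10 the linear extent, so the same 128 x 128 pixel count must fit into 1/10 the array width. A sketch of that arithmetic:

```python
import math

def nfov_pitch(wfov_pitch_um, wfov_pixels, nfov_pixels, area_ratio):
    """Pixel pitch (um) needed for the NFOV array, given equal focal
    lengths, so that nfov_pixels span area_ratio of the WFOV area."""
    linear_ratio = math.sqrt(area_ratio)        # 1/100 area -> 1/10 extent
    wfov_extent = wfov_pitch_um * wfov_pixels   # 20 um * 128 = 2560 um
    nfov_extent = wfov_extent * linear_ratio    # 256 um
    return nfov_extent / nfov_pixels            # 256 / 128 = 2.0 um
```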
The six NFOV image sensors can differ from one another. An example is image sensors optimized for ultraviolet (UV), blue, green, broadband visible, 880 nm CW laser illumination, and 880 nm pulsed laser illumination. The six sensors can be optimized in pixel size and array size to match the NFOV. In general, the pixel pitch will increase for longer wavelengths, to match the optical circle of confusion. The pulsed 880 nm laser array can have special circuitry in each pixel to measure the amplitude and time of arrival of the laser pulse reflected from the object; this capability, called LADAR, provides the distance to the object, the reflected signal amplitude, and in some cases 3D information about the object's shape.
The WFOV image sensor and the other six NFOV image sensors can be processed on a single integrated circuit. The positions of these image sensors on the integrated circuit can be chosen to minimize integrated circuit area or for other considerations. Each sensor is optimized for its required operation. The optical stack above each sensor provides the required color transmission and other required optical characteristics. If desired, the optical stack, or portions thereof, can be mechanically dithered by a MEMS mechanism to achieve higher spatial resolution or to provide other functions (such as image stabilization or image focusing).
The NFOV image sensors can employ windowed readout of a reduced FOV (perhaps 32 x 32 pixels at a very fast frame rate). Data from the NFOV sensors can be used to aim the gimbal so as to keep the object of interest at the center of the reduced FOV.
#3: Very wide dynamic range color digital camera
A digital camera has a maximum light signal storage capacity that limits the dynamic range of a particular system. Light-generated charge is stored on a capacitor in the pixel region. The charge-handling capacity is limited by the maximum voltage swing in the integrated circuit and the storage capacitance within the pixel. The amount of integrated photo-charge is directly related to the time over which the image sensor gathers and integrates signal from the scene; this is known as the integration time. Weak signals require a long integration time, so that more photo-charge is integrated in the pixel and the signal-to-noise ratio of the digital camera is improved.
Once the maximum charge capacity is reached, the sensor can no longer resolve how much brighter the image becomes. This creates an imaging dilemma when a single integration time is set for the entire field of view: the integration time of the digital camera can be set either to image the low brightness levels, saturating the bright signals, or to image the bright levels, failing to detect the low brightness levels (because the photo-charge integrated from the low brightness levels falls below the noise floor of the sensor).
Using multiple optics and image sensors on a single IC, all viewing the same field of view simultaneously and each with a different integration time, solves the dynamic range problem described above. Such a digital camera can, for example, have a 3 x 3 image sensor assembly, perhaps three sensors for each color (R, G, B), with a different integration time for each; for example, each color can have three different values (perhaps 0.1, 1, and 10 ms). The data from the sensors of each color can be digitally combined to provide a much larger dynamic range within a single digital camera data frame. Although it is difficult to display such wide dynamic range imagery without compression, the raw digital camera data can be used by digital signal processing of the scene, and the digital data can also be stored and displayed to show low-brightness or high-brightness features as desired.
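A minimal sketch of digitally combining exposures of different integration times into one wide-dynamic-range value per pixel. The 12-bit full-well count and the "longest unsaturated exposure" selection policy are assumptions for illustration, not the patent's combination method:

```python
def merge_exposures(samples, full_well=4095):
    """samples: list of (raw_value, integration_time_ms) for one pixel
    of the same scene captured with different integration times.
    Pick the longest exposure that is not saturated and normalize it
    to a common radiance scale (counts per millisecond)."""
    usable = [(raw, t) for raw, t in samples if raw < full_well]
    if not usable:  # all saturated: fall back to the shortest exposure
        raw, t = min(samples, key=lambda s: s[1])
        return raw / t
    raw, t = max(usable, key=lambda s: s[1])
    return raw / t
```

With the example integration times (0.1, 1, and 10 ms), a dim pixel is read from the 10 ms sensor while a bright pixel that saturates the longer exposures is still resolved by the 0.1 ms sensor, extending the usable range by roughly the exposure ratio (100x).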
The optical stack can also include other optical features needed for digital camera functionality and/or performance. These can be, but are not limited to, filters (such as electrically tunable filters), polarizers, wavefront coding, spatial filters (masks), and other features not yet anticipated. Some of these features (besides the lenses) can be operated electrically (like a tunable filter) or moved mechanically by a MEMS mechanism.
Fabrication of the image sensors and optical stacks can be carried out on a single wafer, or assembled on separate wafers (perhaps up to two wafers: one for the ICs and one for the optics) and bonded together at the wafer level. Pick-and-place methods and equipment can also be employed to attach the optical assemblies to the IC wafer, or to assemble individual image sensor dies or other components separately.
In embodiments employing MEMS, fabrication of the optical stacks, MEMS, and image sensors can be carried out on a single wafer, or assembled on separate wafers (perhaps up to three wafers: one for the ICs, one for the MEMS, and one for the optical stacks) and bonded together at the wafer level. Pick-and-place methods and equipment can also be employed to attach the optical assemblies and MEMS to the IC wafer, or to assemble individual image sensor dies or other components (MEMS and optical stacks) separately.
It should also be understood that although the digital camera apparatus 210 is shown for use in digital camera 200, the invention is not limited thereto. Indeed, the digital camera apparatus, and/or any of the methods and/or apparatus that may be used therein, may be used alone or in any type of device, including but not limited to cameras and video cameras, cell phones, other personal communication devices, surveillance equipment, automotive applications, computers, manufacturing and inspection equipment, toys, and a wide and continually expanding range of other applications. Moreover, other devices employing the digital camera apparatus, and/or any of the methods and/or apparatus employed therein, may or may not include the housing, circuit board, peripheral user interface, power supply, electronic image storage media, and aperture shown in Figure 2 (for example, the circuit board need not exist solely for the camera function; in a cell phone, the digital camera subsystem may be an addition to an existing circuit board), and may or may not employ the methods and/or devices shown in Figure 2.
The digital camera can be a standalone product, or can be embedded in other devices, such as cell phones, computers, or the countless other imaging platforms now available or yet to be created, such as those made practical by the present invention.
One or more embodiments of one or more aspects of the present invention may have one or more of the following advantages. A device according to the present invention can have multiple separate arrays on a single image sensor, each array with its own lens. The simpler geometry of the smaller multiple arrays allows smaller lenses (in diameter, thickness, and focal length), which in turn allows the stack height of the digital camera to be reduced.
Each array can advantageously be dedicated to one visible and/or detectable band, and each lens can be tuned for passage of that specific wavelength band. Because each lens does not need to pass the entire spectrum, the number of lens elements can be reduced, for example, to one or two.
In addition, since each lens is dedicated to a band, each lens can be colored for its respective band during manufacture (for example, the lens for the array dedicated to the red visible band can be dyed red). Alternatively, a single color filter can be applied to each lens. This approach eliminates the conventional color filter sheet (in which each pixel has its own filter), thereby reducing cost, improving signal strength, and removing an obstacle to pixel-size reduction.
In certain embodiments, once the integrated circuit die having the sensor arrays (and possibly one or more portions of the processor) is assembled, the assembly is in the form of a sealed device. Such a device therefore does not require "packaging" and, if desired, can be mounted directly on a circuit board, which in certain embodiments saves component and/or manufacturing cost. As stated above, the methods and apparatus of the present invention are not limited to use in digital camera systems, but can be used in any type of system, including but not limited to any type of information system.
It should be understood that the features disclosed herein can be used in any combination.
Note that, unless specifically stated otherwise, terms such as "comprise", "have", "include", and all forms thereof are considered open-ended, and thus do not exclude additional elements and/or features. In addition, unless stated otherwise, terms such as "in response to" and "based on" mean "at least in response to" and "at least based on", respectively, and thus do not exclude being responsive to and/or based on more than one thing.
As used herein, identifying, determining, and generating include, respectively, identifying, determining, and generating in any manner, including but not limited to computing, accessing stored data and/or mappings (for example, in a look-up table), and/or combinations thereof.
While various embodiments have been shown and described, it will be understood by those skilled in the art that the present invention is not limited to the embodiments provided herein by way of example, and that various changes and modifications may be made without departing from the scope of the invention.

Claims (67)

1. A digital camera comprising:
a plurality of photodetector arrays, including:
a first photodetector array to sample the intensity of light; and
a second photodetector array to sample the intensity of light;
signal processing circuitry, coupled to said first and second photodetector arrays, to generate a composite image using (i) data representative of the intensity of light sampled by said first photodetector array and (ii) data representative of the intensity of light sampled by said second photodetector array; and
wherein said first photodetector array, said second photodetector array and said signal processing circuitry are integrated on or in the same semiconductor substrate.
2. The digital camera according to claim 1, wherein:
said first photodetector array samples the intensity of light of a first wavelength; and
said second photodetector array samples the intensity of light of a second wavelength.
3. The digital camera according to claim 2, further including a third photodetector array to sample the intensity of light of a third wavelength, and wherein said signal processing circuitry is coupled to said third photodetector array and generates the composite image using (i) data representative of the intensity of light sampled by said first photodetector array, (ii) data representative of the intensity of light sampled by said second photodetector array, and (iii) data representative of the intensity of light sampled by said third photodetector array.
4. The digital camera according to claim 3, wherein said first, second and third photodetector arrays are arranged in a triangular configuration relative to one another.
5. The digital camera according to claim 4, wherein said first photodetector array, said second photodetector array, said third photodetector array and said signal processing circuitry are integrated on the same semiconductor substrate.
6. The digital camera according to claim 5, wherein said first wavelength is associated with a first color, said second wavelength is associated with a second color, and said third wavelength is associated with a third color.
7. The digital camera according to claim 3, wherein:
said first photodetector array samples the intensity of light of said first wavelength for a first integration time;
said second photodetector array samples the intensity of light of said second wavelength for a second integration time; and
said third photodetector array samples the intensity of light of said third wavelength for a third integration time.
8. The digital camera according to claim 3, wherein said first, second and third photodetector arrays are arranged relative to one another in an isosceles, obtuse, acute or right triangle configuration.
9. The digital camera according to claim 2, wherein said first wavelength is associated with a first color and said second wavelength is associated with a second color.
10. The digital camera according to claim 1, wherein:
said first photodetector array samples the intensity of light for a first integration time; and
said second photodetector array samples the intensity of light for a second integration time.
11. The digital camera according to claim 1, wherein:
each photodetector of said first array includes a semiconductor portion in which the intensity of light is sampled; and
each photodetector of said second array includes a semiconductor portion in which the intensity of light is sampled; and wherein the semiconductor portions of the photodetectors of said first array and the semiconductor portions of the photodetectors of said second array are located at different depths relative to the surfaces of the respective photodetectors.
12. The digital camera according to claim 1, wherein said first photodetector array and said second photodetector array are disposed on the same image plane.
13. The digital camera according to claim 1, further including a first lens disposed in, and associated with, the optical path of said first photodetector array, and a second lens disposed in, and associated with, the optical path of said second photodetector array.
14. The digital camera according to claim 13, further including a substantially uniform color filter sheet disposed in the optical path of said first photodetector array.
15. The digital camera according to claim 1, further including a first colored lens disposed in, and associated with, the optical path of said first photodetector array.
16. The digital camera according to claim 1, further including a first lens disposed in, and associated with, the optical path of said first photodetector array, wherein:
said first lens passes light of a first wavelength and filters light of a second wavelength;
said first photodetector array samples the intensity of light of the first wavelength; and
said second photodetector array samples the intensity of light of the second wavelength.
17. The digital camera according to claim 1, wherein:
said first photodetector array samples the intensity of light of a first wavelength and the intensity of light of a second wavelength;
said second photodetector array samples the intensity of light of a third wavelength; and
said first wavelength is associated with a first color, said second wavelength is associated with a second color, and said third wavelength is associated with a third color.
18. The digital camera according to claim 17, wherein:
each photodetector of said first array includes a first semiconductor portion in which the intensity of light of said first wavelength is sampled, and a second semiconductor portion in which the intensity of light of said second wavelength is sampled;
each photodetector of said second array includes a semiconductor portion in which the intensity of light of said third wavelength is sampled; and
said first and second semiconductor portions of each photodetector of said first array and the semiconductor portion of each photodetector of said second array are located at different depths relative to one another and relative to the surfaces of the respective photodetectors.
19. The digital camera according to claim 17, further including a first lens disposed in, and associated with, the optical path of said first photodetector array, and a second lens disposed in, and associated with, the optical path of said second photodetector array, wherein said first lens passes light of said first and second wavelengths and filters light of said third wavelength.
20. The digital camera according to claim 17, further including a filter disposed in, and associated with, the optical path of said first photodetector array, wherein the filter passes light of said first and second wavelengths and filters light of said third wavelength.
21. The digital camera according to claim 17, wherein:
said first photodetector array samples the intensity of light of said first wavelength for a first integration time;
said first photodetector array samples the intensity of light of said second wavelength for a second integration time; and
said second photodetector array samples the intensity of light of said third wavelength for a third integration time.
22. The digital camera according to claim 1, wherein said signal processing circuitry:
generates a first image using data representative of the intensity of light sampled by said first photodetector array, and
generates a second image using data representative of the intensity of light sampled by said second photodetector array.
23. The digital camera according to claim 22, wherein said signal processing circuitry generates said composite image using said first image and said second image.
24. The digital camera according to claim 1, further including memory to store (i) data representative of the intensity of light sampled by said first photodetector array and (ii) data representative of the intensity of light sampled by said second photodetector array.
25. The digital camera according to claim 24, wherein said memory, said first photodetector array, said second photodetector array and said signal processing circuitry are integrated on or in the same semiconductor substrate.
26. The digital camera according to claim 25, further including timing and control logic to provide timing and control information to said signal processing circuitry, said first photodetector array and/or said second photodetector array.
27. The digital camera according to claim 24, further including communications circuitry to output data representative of said composite image.
28. The digital camera according to claim 27, wherein said communications circuitry includes at least one of wired, wireless or optical communications circuitry.
29. The digital camera according to claim 24, wherein said communications circuitry, said memory, said first photodetector array, said second photodetector array and said signal processing circuitry are integrated on or in the same semiconductor substrate.
30. The digital camera according to claim 1, wherein said first photodetector array includes a first surface area and said second photodetector array includes a second surface area, and wherein the first surface area is different from the second surface area.
31. The digital camera according to claim 30, wherein the photodetectors of said first array include a first active surface area and the photodetectors of said second array include a second active surface area, and wherein the first active surface area is different from the second active surface area.
32. The digital camera according to claim 1, wherein said first photodetector array includes a first surface area and said second photodetector array includes a second surface area, and wherein the first surface area is substantially the same as the second surface area.
33. The digital camera according to claim 32, wherein the photodetectors of said first array include a first active surface area and the photodetectors of said second array include a second active surface area, and wherein the first active surface area is different from the second active surface area.
34. A digital camera comprising:
a plurality of photodetector arrays, including:
a first photodetector array to sample the intensity of first wavelength light, and
a second photodetector array to sample the intensity of second wavelength light;
a first lens disposed in the optical path of said first photodetector array, wherein said first lens includes a predetermined optical response to said first wavelength light;
a second lens disposed in the optical path of said second photodetector array, wherein said second lens includes a predetermined optical response to said second wavelength light;
signal processing circuitry, coupled to said first and second photodetector arrays, to generate a composite image using (i) data representative of the intensity of light sampled by said first photodetector array and (ii) data representative of the intensity of light sampled by said second photodetector array; and
wherein said first photodetector array, said second photodetector array and said signal processing circuitry are integrated on or in the same semiconductor substrate.
35. The digital camera according to claim 34, wherein:
said first lens passes said first wavelength light onto an image plane of the photodetectors of said first array; and
said second lens passes said second wavelength light onto an image plane of the photodetectors of said second array.
36. The digital camera according to claim 35, wherein:
said first lens filters said second wavelength light; and
said second lens filters said first wavelength light.
37. The digital camera according to claim 34, further comprising:
a third photodetector array to sample the intensity of third wavelength light;
a third lens disposed in the optical path of said third photodetector array, wherein said third lens includes a predetermined optical response to said third wavelength light; and
wherein said signal processing circuitry is coupled to said third photodetector array and generates the composite image using (i) data representative of the intensity of light sampled by said first photodetector array, (ii) data representative of the intensity of light sampled by said second photodetector array and (iii) data representative of the intensity of light sampled by said third photodetector array.
38. The digital camera according to claim 37, wherein:
said first lens filters said second and third wavelength light,
said second lens filters said first and third wavelength light, and
said third lens filters said first and second wavelength light.
39. The digital camera according to claim 37, wherein said first, second and third photodetector arrays are arranged in a triangular arrangement relative to one another.
40. The digital camera according to claim 39, wherein said first photodetector array, said second photodetector array, said third photodetector array and said signal processing circuitry are integrated on the same semiconductor substrate.
41. The digital camera according to claim 40, wherein said first wavelength is associated with a first color, said second wavelength is associated with a second color, and said third wavelength is associated with a third color.
42. The digital camera according to claim 37, wherein:
said first photodetector array samples the intensity of said first wavelength light for a first integration time;
said second photodetector array samples the intensity of said second wavelength light for a second integration time; and
said third photodetector array samples the intensity of said third wavelength light for a third integration time.
43. The digital camera according to claim 34, wherein said first wavelength is associated with a first color and said second wavelength is associated with a second color.
44. The digital camera according to claim 43, wherein:
said first photodetector array samples light intensity for a first integration time; and
said second photodetector array samples light intensity for a second integration time.
45. The digital camera according to claim 34, further comprising a housing, wherein said first and second lenses, said first and second photodetector arrays and said signal processing circuitry are attached to the housing, and wherein said first and second lenses are independently positionable relative to their associated photodetector arrays.
46. The digital camera according to claim 34, wherein:
said first photodetector array samples the intensities of said first wavelength light and said third wavelength light;
said second photodetector array samples the intensity of said second wavelength light; and
said first wavelength is associated with a first color, said second wavelength is associated with a second color, and said third wavelength is associated with a third color.
47. The digital camera according to claim 46, wherein:
each photodetector of said first array includes (i) a first semiconductor portion, in which the intensity of said first wavelength light is sampled, and (ii) a second semiconductor portion, in which the intensity of said third wavelength light is sampled;
each photodetector of said second array includes a semiconductor portion in which the intensity of said second wavelength light is sampled; and
the first and second semiconductor portions of each photodetector of said first array and the semiconductor portion of each photodetector of said second array are located at different depths relative to one another and relative to the surface of each photodetector.
48. The digital camera according to claim 46, wherein said first lens passes said first and third wavelength light and filters said second wavelength light.
49. The digital camera according to claim 46, further comprising a filter disposed in and associated with the optical path of said first photodetector array, wherein the filter passes said first and third wavelength light and filters said second wavelength light.
50. The digital camera according to claim 46, wherein:
said first photodetector array samples the intensity of said first wavelength light for a first integration time;
said second photodetector array samples the intensity of said second wavelength light for a second integration time; and
said first photodetector array samples the intensity of said third wavelength light for a third integration time.
51. The digital camera according to claim 34, wherein said signal processing circuitry:
generates a first image using data representative of the intensity of light sampled by said first photodetector array, and
generates a second image using data representative of the intensity of light sampled by said second photodetector array.
52. The digital camera according to claim 51, wherein said signal processing circuitry generates said composite image using said first image and said second image.
53. The digital camera according to claim 34, further comprising memory to store (i) data representative of the intensity of light sampled by said first photodetector array and (ii) data representative of the intensity of light sampled by said second photodetector array.
54. The digital camera according to claim 53, wherein said memory, said first photodetector array, said second photodetector array and said signal processing circuitry are integrated on or in the same semiconductor substrate.
55. The digital camera according to claim 53, further comprising communication circuitry to output data representative of said composite image.
56. The digital camera according to claim 55, wherein said communication circuitry includes at least one of wireless, wired or optical communication circuitry.
57. The digital camera according to claim 53, wherein said communication circuitry, said memory, said first photodetector array, said second photodetector array and said signal processing circuitry are integrated on or in the same semiconductor substrate.
58. The digital camera according to claim 34, wherein said signal processing circuitry includes first signal processing circuitry and second signal processing circuitry, wherein the first signal processing circuitry is coupled to and associated with said first photodetector array, and the second signal processing circuitry is coupled to and associated with said second photodetector array.
59. The digital camera according to claim 34, wherein said signal processing circuitry includes first analog signal logic and second analog signal logic, wherein the first analog signal logic is coupled to and associated with said first photodetector array, and the second analog signal logic is coupled to and associated with said second photodetector array.
60. The digital camera according to claim 34, wherein said signal processing circuitry includes first black level logic and second black level logic, wherein the first black level logic is coupled to and associated with said first photodetector array, and the second black level logic is coupled to and associated with said second photodetector array.
61. The digital camera according to claim 34, wherein said signal processing circuitry includes first exposure control circuitry and second exposure control circuitry, wherein the first exposure control circuitry is coupled to and associated with said first photodetector array, and the second exposure control circuitry is coupled to and associated with said second photodetector array.
62. The digital camera according to claim 34, further comprising a frame, wherein said first and second photodetector arrays, said signal processing circuitry and said first and second lenses are fixed to the frame.
63. The digital camera according to claim 34, further comprising timing and control logic to provide timing and control information to said signal processing circuitry, said first photodetector array and/or said second photodetector array.
64. The digital camera according to claim 34, wherein said first photodetector array includes a first surface area and said second photodetector array includes a second surface area, and wherein the first surface area is different from the second surface area.
65. The digital camera according to claim 64, wherein the photodetectors of said first array include a first active surface area and the photodetectors of said second array include a second active surface area, and wherein the first active surface area is different from the second active surface area.
66. The digital camera according to claim 34, wherein said first photodetector array includes a first surface area and said second photodetector array includes a second surface area, and wherein the first surface area is substantially the same as the second surface area.
67. The digital camera according to claim 66, wherein the photodetectors of said first array include a first active surface area and the photodetectors of said second array include a second active surface area, and wherein the first active surface area is different from the second active surface area.
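The claims above all turn on the same signal-processing step: each photodetector array samples one wavelength band, possibly for its own integration time, and signal processing circuitry fuses the per-array data into a composite image. The following is a minimal numerical sketch of that fusion step only; the scene values, integration times, and helper names are illustrative assumptions, not anything specified in the patent:

```python
import numpy as np

def sample_array(intensity, integration_time, full_scale=1.0):
    """Model one photodetector array: the accumulated signal grows with
    intensity x integration time and saturates at full scale."""
    return np.clip(intensity * integration_time, 0.0, full_scale)

def make_composite(samples, times):
    """Fuse the per-array images into one composite image by normalizing
    each channel by its integration time and stacking the channels."""
    return np.stack([s / t for s, t in zip(samples, times)], axis=-1)

# A toy 2x2 scene: one intensity plane per wavelength band (hypothetical values).
scene = [np.array([[0.1, 0.2], [0.3, 0.4]]),   # first wavelength (e.g. red)
         np.array([[0.5, 0.6], [0.7, 0.8]]),   # second wavelength (e.g. green)
         np.array([[0.2, 0.3], [0.4, 0.5]])]   # third wavelength (e.g. blue)

# A distinct integration time per array, as in claims 21, 42 and 50
# (hypothetical values, in seconds).
times = [0.010, 0.008, 0.012]

samples = [sample_array(ch, t) for ch, t in zip(scene, times)]
rgb = make_composite(samples, times)
print(rgb.shape)  # (2, 2, 3): one composite image from three separate arrays
```

Normalizing by integration time before stacking reflects why per-array exposure control matters in this architecture: each band can be exposed independently and still be combined on a common intensity scale.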
CN2005800323740A 2004-08-25 2005-08-25 Apparatus for multiple camera devices and method of operating same Active CN101427372B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US60485404P 2004-08-25 2004-08-25
US60/604,854 2004-08-25
US69594605P 2005-07-01 2005-07-01
US60/695,946 2005-07-01
PCT/US2005/030256 WO2006026354A2 (en) 2004-08-25 2005-08-25 Apparatus for multiple camera devices and method of operating same

Publications (2)

Publication Number Publication Date
CN101427372A true CN101427372A (en) 2009-05-06
CN101427372B CN101427372B (en) 2012-12-12

Family

ID=36000574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2005800323740A Active CN101427372B (en) 2004-08-25 2005-08-25 Apparatus for multiple camera devices and method of operating same

Country Status (4)

Country Link
US (7) US20060054782A1 (en)
EP (1) EP1812968B1 (en)
CN (1) CN101427372B (en)
WO (1) WO2006026354A2 (en)

Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102131044A (en) * 2010-01-20 2011-07-20 鸿富锦精密工业(深圳)有限公司 Camera module
CN102170571A (en) * 2010-06-22 2011-08-31 上海盈方微电子有限公司 Digital still camera framework for supporting two-channel CMOS (Complementary Metal Oxide Semiconductor) sensor
CN102790849A (en) * 2011-05-20 2012-11-21 英属开曼群岛商恒景科技股份有限公司 Image sensor module
CN102857699A (en) * 2011-06-29 2013-01-02 全友电脑股份有限公司 Image capturing system and method
CN103004218A (en) * 2011-05-19 2013-03-27 松下电器产业株式会社 Three-dimensional imaging device, imaging element, light transmissive portion, and image processing device
CN103458162A (en) * 2012-06-01 2013-12-18 全视科技有限公司 Lens array for partitioned image sensor
CN103516962A (en) * 2012-06-19 2014-01-15 全友电脑股份有限公司 Image capturing system and method
CN103581533A (en) * 2012-08-07 2014-02-12 联想(北京)有限公司 Method and electronic equipment for collecting image information
CN103870805A (en) * 2014-02-17 2014-06-18 北京释码大华科技有限公司 Mobile terminal biological characteristic imaging method and device
CN104185808A (en) * 2011-10-11 2014-12-03 派力肯影像公司 Lens stack arrays including adaptive optical elements
CN105579902A (en) * 2013-09-23 2016-05-11 Lg伊诺特有限公司 Camera module and manufacturing method for same
CN105629426A (en) * 2014-10-31 2016-06-01 高准精密工业股份有限公司 Zooming lens module and zooming camera module group
US9374512B2 (en) 2013-02-24 2016-06-21 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
CN105704465A (en) * 2016-01-20 2016-06-22 海信电子科技(深圳)有限公司 Image processing method and terminal
US9412206B2 (en) 2012-02-21 2016-08-09 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US9485496B2 (en) 2008-05-20 2016-11-01 Pelican Imaging Corporation Systems and methods for measuring depth using images captured by a camera array including cameras surrounding a central camera
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US9521319B2 (en) 2014-06-18 2016-12-13 Pelican Imaging Corporation Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor
US9536166B2 (en) 2011-09-28 2017-01-03 Kip Peli P1 Lp Systems and methods for decoding image files containing depth maps stored as metadata
US9578237B2 (en) 2011-06-28 2017-02-21 Fotonation Cayman Limited Array cameras incorporating optics with modulation transfer functions greater than sensor Nyquist frequency for capture of images used in super-resolution processing
CN106537890A (en) * 2014-07-16 2017-03-22 索尼公司 Compound-eye imaging device
CN106716486A (en) * 2014-06-24 2017-05-24 弗劳恩霍夫应用研究促进协会 Device and method for positioning a multi-aperture optical unit with multiple optical channels relative to an image sensor
CN106768325A (en) * 2016-11-21 2017-05-31 清华大学 Multispectral light-field video acquisition device
US9706132B2 (en) 2012-05-01 2017-07-11 Fotonation Cayman Limited Camera modules patterned with pi filter groups
US9733486B2 (en) 2013-03-13 2017-08-15 Fotonation Cayman Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
US9749568B2 (en) 2012-11-13 2017-08-29 Fotonation Cayman Limited Systems and methods for array camera focal plane control
US9749547B2 (en) 2008-05-20 2017-08-29 Fotonation Cayman Limited Capturing and processing of images using camera array incorperating Bayer cameras having different fields of view
CN107193095A (en) * 2016-03-14 2017-09-22 比亚迪股份有限公司 The method of adjustment of optical filter, apparatus and system
US9774789B2 (en) 2013-03-08 2017-09-26 Fotonation Cayman Limited Systems and methods for high dynamic range imaging using array cameras
US9794476B2 (en) 2011-09-19 2017-10-17 Fotonation Cayman Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures
US9800859B2 (en) 2013-03-15 2017-10-24 Fotonation Cayman Limited Systems and methods for estimating depth using stereo array cameras
US9800856B2 (en) 2013-03-13 2017-10-24 Fotonation Cayman Limited Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
US9807382B2 (en) 2012-06-28 2017-10-31 Fotonation Cayman Limited Systems and methods for detecting defective camera arrays and optic arrays
US9813616B2 (en) 2012-08-23 2017-11-07 Fotonation Cayman Limited Feature based high resolution motion estimation from low resolution images captured using an array source
US9813617B2 (en) 2013-11-26 2017-11-07 Fotonation Cayman Limited Array camera configurations incorporating constituent array cameras and constituent cameras
CN107431746A (en) * 2015-11-24 2017-12-01 索尼半导体解决方案公司 Camera model and electronic equipment
US9858673B2 (en) 2012-08-21 2018-01-02 Fotonation Cayman Limited Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints
US9888194B2 (en) 2013-03-13 2018-02-06 Fotonation Cayman Limited Array camera architecture implementing quantum film image sensors
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
US9924092B2 (en) 2013-11-07 2018-03-20 Fotonation Cayman Limited Array cameras incorporating independently aligned lens stacks
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
US9955070B2 (en) 2013-03-15 2018-04-24 Fotonation Cayman Limited Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
US9986224B2 (en) 2013-03-10 2018-05-29 Fotonation Cayman Limited System and methods for calibration of an array camera
CN108140247A (en) * 2015-10-05 2018-06-08 谷歌有限责任公司 Use the camera calibrated of composograph
US10009538B2 (en) 2013-02-21 2018-06-26 Fotonation Cayman Limited Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
US10091405B2 (en) 2013-03-14 2018-10-02 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US10089740B2 (en) 2014-03-07 2018-10-02 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
WO2018176493A1 (en) * 2017-04-01 2018-10-04 SZ DJI Technology Co., Ltd. Low-profile multi-band hyperspectral imaging for machine vision
US10119808B2 (en) 2013-11-18 2018-11-06 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US10127682B2 (en) 2013-03-13 2018-11-13 Fotonation Limited System and methods for calibration of an array camera
CN109155814A (en) * 2016-05-27 2019-01-04 索尼半导体解决方案公司 Processing unit, imaging sensor and system
US10218889B2 (en) 2011-05-11 2019-02-26 Fotonation Limited Systems and methods for transmitting and receiving array camera image data
US10250871B2 (en) 2014-09-29 2019-04-02 Fotonation Limited Systems and methods for dynamic calibration of array cameras
CN109644258A (en) * 2016-08-31 2019-04-16 华为技术有限公司 Multicamera system for zoom shot
US10261219B2 (en) 2012-06-30 2019-04-16 Fotonation Limited Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors
US10306120B2 (en) 2009-11-20 2019-05-28 Fotonation Limited Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps
US10366472B2 (en) 2010-12-14 2019-07-30 Fotonation Limited Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US10390005B2 (en) 2012-09-28 2019-08-20 Fotonation Limited Generating images from light fields utilizing virtual viewpoints
US10412314B2 (en) 2013-03-14 2019-09-10 Fotonation Limited Systems and methods for photometric normalization in array cameras
US10455168B2 (en) 2010-05-12 2019-10-22 Fotonation Limited Imager array interfaces
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
CN113273183A (en) * 2019-01-07 2021-08-17 Lg伊诺特有限公司 Camera module
CN113924517A (en) * 2019-06-06 2022-01-11 应用材料公司 Imaging system and method for generating composite image
US11270110B2 (en) 2019-09-17 2022-03-08 Boston Polarimetrics, Inc. Systems and methods for surface modeling using polarization cues
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11302012B2 (en) 2019-11-30 2022-04-12 Boston Polarimetrics, Inc. Systems and methods for transparent object segmentation using polarization cues
US11525906B2 (en) 2019-10-07 2022-12-13 Intrinsic Innovation Llc Systems and methods for augmentation of sensor systems and imaging systems with polarization
US11580667B2 (en) 2020-01-29 2023-02-14 Intrinsic Innovation Llc Systems and methods for characterizing object pose detection and measurement systems
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US11797863B2 (en) 2020-01-30 2023-10-24 Intrinsic Innovation Llc Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images
US11953700B2 (en) 2021-05-27 2024-04-09 Intrinsic Innovation Llc Multi-aperture polarization optical systems using beam splitters

Families Citing this family (306)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7262799B2 (en) * 2000-10-25 2007-08-28 Canon Kabushiki Kaisha Image sensing apparatus and its control method, control program, and storage medium
US7813634B2 (en) 2005-02-28 2010-10-12 Tessera MEMS Technologies, Inc. Autofocus camera
EP1549080B1 (en) * 2002-07-18 2011-03-02 Sony Corporation Imaging data processing method, imaging data processing device, and computer program
US8896725B2 (en) 2007-06-21 2014-11-25 Fotonation Limited Image capture device with contemporaneous reference image capture mechanism
US8593542B2 (en) 2005-12-27 2013-11-26 DigitalOptics Corporation Europe Limited Foreground/background separation using reference images
US7652685B2 (en) * 2004-09-13 2010-01-26 Omnivision Cdm Optics, Inc. Iris image capture devices and associated systems
US7433042B1 (en) * 2003-12-05 2008-10-07 Surface Optics Corporation Spatially corrected full-cubed hyperspectral imager
US8049806B2 (en) * 2004-09-27 2011-11-01 Digitaloptics Corporation East Thin camera and associated methods
US8953087B2 (en) * 2004-04-08 2015-02-10 Flir Systems Trading Belgium Bvba Camera system and associated methods
US7564019B2 (en) 2005-08-25 2009-07-21 Richard Ian Olsen Large dynamic range cameras
US8124929B2 (en) * 2004-08-25 2012-02-28 Protarius Filo Ag, L.L.C. Imager module optical focus and assembly method
WO2006026354A2 (en) 2004-08-25 2006-03-09 Newport Imaging Corporation Apparatus for multiple camera devices and method of operating same
US7916180B2 (en) * 2004-08-25 2011-03-29 Protarius Filo Ag, L.L.C. Simultaneous multiple field of view digital cameras
US7795577B2 (en) * 2004-08-25 2010-09-14 Richard Ian Olsen Lens frame and optical focus assembly for imager module
US8320641B2 (en) * 2004-10-28 2012-11-27 DigitalOptics Corporation Europe Limited Method and apparatus for red-eye detection using preview or other reference images
JP4534756B2 (en) * 2004-12-22 2010-09-01 ソニー株式会社 Image processing apparatus, image processing method, imaging apparatus, program, and recording medium
US7769284B2 (en) * 2005-02-28 2010-08-03 Silmpel Corporation Lens barrel assembly for a camera
JP2007003699A (en) * 2005-06-22 2007-01-11 Fuji Xerox Co Ltd Image display device
US20070102622A1 (en) * 2005-07-01 2007-05-10 Olsen Richard I Apparatus for multiple camera devices and method of operating same
US20070258006A1 (en) * 2005-08-25 2007-11-08 Olsen Richard I Solid state camera optics frame and assembly
US7566855B2 (en) * 2005-08-25 2009-07-28 Richard Ian Olsen Digital camera with integrated infrared (IR) response
US7964835B2 (en) 2005-08-25 2011-06-21 Protarius Filo Ag, L.L.C. Digital cameras with direct luminance and chrominance detection
EP2328006B1 (en) * 2005-09-19 2014-08-06 OmniVision CDM Optics, Inc. Task-based imaging systems
WO2007060847A1 (en) * 2005-11-22 2007-05-31 Matsushita Electric Industrial Co., Ltd. Imaging device
JP4147273B2 (en) * 2006-01-20 2008-09-10 松下電器産業株式会社 Compound eye camera module and manufacturing method thereof
US7684612B2 (en) * 2006-03-28 2010-03-23 Pitney Bowes Software Inc. Method and apparatus for storing 3D information with raster imagery
US20070236591A1 (en) * 2006-04-11 2007-10-11 Tam Samuel W Method for mounting protective covers over image capture devices and devices manufactured thereby
US8081207B2 (en) * 2006-06-06 2011-12-20 Point Grey Research Inc. High accuracy stereo camera
KR100871564B1 (en) * 2006-06-19 2008-12-02 삼성전기주식회사 Camera module
KR100772910B1 (en) * 2006-06-26 2007-11-05 삼성전기주식회사 Digital camera module
EP1874034A3 (en) * 2006-06-26 2011-12-21 Samsung Electro-Mechanics Co., Ltd. Apparatus and method of recovering high pixel image
US8242426B2 (en) * 2006-12-12 2012-08-14 Dolby Laboratories Licensing Corporation Electronic camera having multiple sensors for capturing high dynamic range images and related methods
US20100140461A1 (en) * 2006-12-13 2010-06-10 Georgia Tech Research Corporation Systems and methods for real time multispectral imaging
US20080165257A1 (en) * 2007-01-05 2008-07-10 Micron Technology, Inc. Configurable pixel array system and method
US8319846B2 (en) * 2007-01-11 2012-11-27 Raytheon Company Video camera system using multiple image sensors
US8456560B2 (en) * 2007-01-26 2013-06-04 Digitaloptics Corporation Wafer level camera module and method of manufacture
JP4999494B2 (en) * 2007-02-28 2012-08-15 オンセミコンダクター・トレーディング・リミテッド Imaging device
US8594387B2 (en) * 2007-04-23 2013-11-26 Intel-Ge Care Innovations Llc Text capture and presentation device
CA2685080A1 (en) * 2007-04-24 2008-11-06 Flextronics Ap Llc Small form factor modules using wafer level optics with bottom cavity and flip-chip assembly
US20090015706A1 (en) * 2007-04-24 2009-01-15 Harpuneet Singh Auto focus/zoom modules using wafer level optics
US7936377B2 (en) * 2007-04-30 2011-05-03 Tandent Vision Science, Inc. Method and system for optimizing an image for improved analysis of material and illumination image features
US7812869B2 (en) * 2007-05-11 2010-10-12 Aptina Imaging Corporation Configurable pixel array system and method
US7909253B2 (en) * 2007-05-24 2011-03-22 Northrop Grumman Systems Corporation Image detection system and methods
CN100583956C (en) * 2007-06-25 2010-01-20 鸿富锦精密工业(深圳)有限公司 Image forming apparatus and camera light strength attenuation and compensation method
US8300083B2 (en) * 2007-07-20 2012-10-30 Hewlett-Packard Development Company, L.P. Position relationships associated with image capturing devices
US20090033755A1 (en) * 2007-08-03 2009-02-05 Tandent Vision Science, Inc. Image acquisition and processing engine for computer vision
SG150414A1 (en) * 2007-09-05 2009-03-30 Creative Tech Ltd Methods for processing a composite video image with feature indication
US20090118600A1 (en) * 2007-11-02 2009-05-07 Ortiz Joseph L Method and apparatus for skin documentation and analysis
US9118850B2 (en) * 2007-11-27 2015-08-25 Capso Vision, Inc. Camera system with multiple pixel arrays on a chip
US20090159799A1 (en) * 2007-12-19 2009-06-25 Spectral Instruments, Inc. Color infrared light sensor, camera, and method for capturing images
WO2009085305A1 (en) * 2007-12-27 2009-07-09 Google Inc. High-resolution, variable depth of field image device
JP4413261B2 (en) * 2008-01-10 2010-02-10 シャープ株式会社 Imaging apparatus and optical axis control method
US7745779B2 (en) * 2008-02-08 2010-06-29 Aptina Imaging Corporation Color pixel arrays having common color filters for multiple adjacent pixels for use in CMOS imagers
US8115825B2 (en) * 2008-02-20 2012-02-14 Apple Inc. Electronic device with two image sensors
US9118825B2 (en) * 2008-02-22 2015-08-25 Nan Chang O-Film Optoelectronics Technology Ltd. Attachment of wafer level optics
US8135237B2 (en) * 2008-02-25 2012-03-13 Aptina Imaging Corporation Apparatuses and methods for noise reduction
CA2720612C (en) * 2008-04-07 2017-01-03 Mirion Technologies, Inc. Dosimetry apparatus, systems, and methods
CN101557453B (en) * 2008-04-09 2010-09-29 鸿富锦精密工业(深圳)有限公司 Image acquisition device and picture arrangement method thereof
JP4654264B2 (en) * 2008-04-10 2011-03-16 シャープ株式会社 Optical communication device and electronic equipment
US7675024B2 (en) * 2008-04-23 2010-03-09 Aptina Imaging Corporation Method and apparatus providing color filter array with non-uniform color filter sizes
US20090278929A1 (en) * 2008-05-06 2009-11-12 Flir Systems Inc Video camera with interchangeable optical sensors
EP2133726B1 (en) 2008-06-10 2011-06-01 Thomson Licensing Multi-image capture system with improved depth image resolution
JP4582205B2 (en) * 2008-06-12 2010-11-17 トヨタ自動車株式会社 Electric vehicle
TW201007162A (en) * 2008-08-04 2010-02-16 Shanghai Microtek Technology Co Ltd Optical carriage structure of inspection apparatus and its inspection method
US20100110259A1 (en) * 2008-10-31 2010-05-06 Weistech Technology Co., Ltd Multi-lens image sensor module
WO2010052593A1 (en) * 2008-11-04 2010-05-14 Ecole Polytechnique Federale De Lausanne (Epfl) Camera design for the simultaneous capture of near-infrared and visible images
US9621825B2 (en) * 2008-11-25 2017-04-11 Capsovision Inc Camera system with multiple pixel arrays on a chip
US8587639B2 (en) * 2008-12-11 2013-11-19 Alcatel Lucent Method of improved three dimensional display technique
US20100165088A1 (en) * 2008-12-29 2010-07-01 Intromedic Apparatus and Method for Displaying Capsule Endoscope Image, and Record Media Storing Program for Carrying out that Method
US9494771B2 (en) 2009-01-05 2016-11-15 Duke University Quasi-monocentric-lens-based multi-scale optical system
US8830377B2 (en) 2010-01-04 2014-09-09 Duke University Monocentric lens-based multi-scale optical systems and methods of use
US9395617B2 (en) 2009-01-05 2016-07-19 Applied Quantum Technologies, Inc. Panoramic multi-scale imager and method therefor
US9635253B2 (en) 2009-01-05 2017-04-25 Duke University Multiscale telescopic imaging system
US9432591B2 (en) * 2009-01-05 2016-08-30 Duke University Multiscale optical system having dynamic camera settings
US10725280B2 (en) 2009-01-05 2020-07-28 Duke University Multiscale telescopic imaging system
US8816460B2 (en) * 2009-04-06 2014-08-26 Nokia Corporation Image sensor
DE112009004707T5 (en) * 2009-04-22 2012-09-13 Hewlett-Packard Development Co., L.P. Spatially varying spectral response calibration data
US20100321511A1 (en) * 2009-06-18 2010-12-23 Nokia Corporation Lenslet camera with rotated sensors
US8134115B2 (en) * 2009-06-23 2012-03-13 Nokia Corporation Color filters for sub-diffraction limit-sized light sensors
US8179457B2 (en) * 2009-06-23 2012-05-15 Nokia Corporation Gradient color filters for sub-diffraction limit sensors
US8198578B2 (en) * 2009-06-23 2012-06-12 Nokia Corporation Color filters for sub-diffraction limit-sized light sensors
GB0912970D0 (en) * 2009-07-27 2009-09-02 St Microelectronics Res & Dev Improvements in or relating to a sensor and sensor system for a camera
US9419032B2 (en) * 2009-08-14 2016-08-16 Nanchang O-Film Optoelectronics Technology Ltd Wafer level camera module with molded housing and method of manufacturing
JP4886016B2 (en) * 2009-10-08 2012-02-29 シャープ株式会社 Imaging lens, imaging module, imaging lens manufacturing method, and imaging module manufacturing method
CN102422417A (en) * 2009-11-11 2012-04-18 松下电器产业株式会社 Solid-state image pickup device and method for manufacturing same
US8633968B2 (en) * 2009-12-11 2014-01-21 Dish Network L.L.C. Three-dimensional recording and display system using near- and distal-focused images
US9726487B2 (en) * 2009-12-18 2017-08-08 Vito Nv Geometric referencing of multi-spectral data
US8634596B2 (en) 2009-12-22 2014-01-21 Honeywell International Inc. Three-dimensional multilayer skin texture recognition system and method
GB0922603D0 (en) * 2009-12-24 2010-02-10 Touch Emas Ltd Skin colour determining apparatus and method
CN102143346B (en) * 2010-01-29 2013-02-13 广州市启天科技股份有限公司 Cruise shooting storage method and system
JP2011216701A (en) * 2010-03-31 2011-10-27 Sony Corp Solid-state imaging apparatus and electronic device
US20110242355A1 (en) 2010-04-05 2011-10-06 Qualcomm Incorporated Combining data from multiple image sensors
US8896668B2 (en) 2010-04-05 2014-11-25 Qualcomm Incorporated Combining data from multiple image sensors
FR2959903B1 (en) * 2010-05-04 2012-07-27 Astrium Sas POLYCHROME IMAGING METHOD
US8576293B2 (en) * 2010-05-18 2013-11-05 Aptina Imaging Corporation Multi-channel imager
US8970672B2 (en) 2010-05-28 2015-03-03 Qualcomm Incorporated Three-dimensional image processing
JPWO2011155136A1 (en) * 2010-06-07 2013-08-01 コニカミノルタ株式会社 Imaging device
JPWO2011155135A1 (en) * 2010-06-07 2013-08-01 コニカミノルタ株式会社 Imaging device
US8729478B2 (en) * 2010-06-09 2014-05-20 Carestream Health, Inc. Dual screen radiographic detector with improved spatial sampling
US8681217B2 (en) * 2010-07-21 2014-03-25 Olympus Corporation Inspection apparatus and measurement method
DE102010041569B4 (en) * 2010-09-28 2017-04-06 Leica Geosystems Ag Digital camera system, color filter element for digital camera system, method for determining deviations between the cameras of a digital camera system and image processing unit for digital camera system
CN102438153B (en) * 2010-09-29 2015-11-25 华为终端有限公司 Multi-camera image correction method and equipment
JP5528976B2 (en) * 2010-09-30 2014-06-25 株式会社メガチップス Image processing device
US20140192238A1 (en) 2010-10-24 2014-07-10 Linx Computational Imaging Ltd. System and Method for Imaging and Image Processing
US9143668B2 (en) * 2010-10-29 2015-09-22 Apple Inc. Camera lens structures and display structures for electronic devices
US9137503B2 (en) * 2010-11-03 2015-09-15 Sony Corporation Lens and color filter arrangement, super-resolution camera system and method
EP2461198A3 (en) 2010-12-01 2017-03-08 BlackBerry Limited Apparatus, and associated method, for a camera module of electronic device
US9167138B2 (en) 2010-12-06 2015-10-20 Apple Inc. Pattern projection and imaging using lens arrays
EP2652524B1 (en) 2010-12-15 2019-11-06 Mirion Technologies, Inc. Dosimetry system, methods, and components
JP5976676B2 (en) * 2011-01-14 2016-08-24 ソニー株式会社 Imaging system using longitudinal chromatic aberration of lens unit and operation method thereof
WO2012098599A1 (en) * 2011-01-17 2012-07-26 パナソニック株式会社 Imaging device
US8581995B2 (en) * 2011-01-25 2013-11-12 Aptina Imaging Corporation Method and apparatus for parallax correction in fused array imaging systems
KR101829777B1 (en) * 2011-03-09 2018-02-20 삼성디스플레이 주식회사 Optical sensor
US9030528B2 (en) * 2011-04-04 2015-05-12 Apple Inc. Multi-zone imaging sensor and lens array
US20120274811A1 (en) * 2011-04-28 2012-11-01 Dmitry Bakin Imaging devices having arrays of image sensors and precision offset lenses
US20120281113A1 (en) * 2011-05-06 2012-11-08 Raytheon Company USING A MULTI-CHIP SYSTEM IN A PACKAGE (MCSiP) IN IMAGING APPLICATIONS TO YIELD A LOW COST, SMALL SIZE CAMERA ON A CHIP
JP2014521117A (en) 2011-06-28 2014-08-25 ペリカン イメージング コーポレイション Optical array for use with array cameras
JP6080343B2 (en) * 2011-07-29 2017-02-15 ソニーセミコンダクタソリューションズ株式会社 Image sensor and manufacturing method thereof
GB201114264D0 (en) 2011-08-18 2011-10-05 Touch Emas Ltd Improvements in or relating to prosthetics and orthotics
JP6034197B2 (en) * 2011-08-25 2016-11-30 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Image processing apparatus, three-dimensional imaging apparatus, image processing method, and image processing program
US9337949B2 (en) 2011-08-31 2016-05-10 Cablecam, Llc Control system for an aerially moved payload
US9477141B2 (en) 2011-08-31 2016-10-25 Cablecam, Llc Aerial movement system having multiple payloads
JPWO2013051186A1 (en) * 2011-10-03 2015-03-30 パナソニックIpマネジメント株式会社 IMAGING DEVICE, SYSTEM USING IMAGING DEVICE, AND RANGING DEVICE
US20130088603A1 (en) * 2011-10-11 2013-04-11 Thomas D. Pawlik Compact viewer for invisible indicia
US9036059B2 (en) * 2011-11-01 2015-05-19 Sony Corporation Imaging apparatus for efficiently generating multiple forms of image data output by an imaging sensor
US20130120621A1 (en) * 2011-11-10 2013-05-16 Research In Motion Limited Apparatus and associated method for forming color camera image
US9223138B2 (en) 2011-12-23 2015-12-29 Microsoft Technology Licensing, Llc Pixel opacity for augmented reality
US8917453B2 (en) 2011-12-23 2014-12-23 Microsoft Corporation Reflective array waveguide
US8941750B2 (en) * 2011-12-27 2015-01-27 Casio Computer Co., Ltd. Image processing device for generating reconstruction image, image generating method, and storage medium
US8638498B2 (en) 2012-01-04 2014-01-28 David D. Bohn Eyebox adjustment for interpupillary distance
US9606586B2 (en) 2012-01-23 2017-03-28 Microsoft Technology Licensing, Llc Heat transfer device
WO2013116253A1 (en) * 2012-01-30 2013-08-08 Scanadu Incorporated Spatial resolution enhancement in hyperspectral imaging
US9779643B2 (en) * 2012-02-15 2017-10-03 Microsoft Technology Licensing, Llc Imaging structure emitter configurations
US9726887B2 (en) 2012-02-15 2017-08-08 Microsoft Technology Licensing, Llc Imaging structure color conversion
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9297996B2 (en) 2012-02-15 2016-03-29 Microsoft Technology Licensing, Llc Laser illumination scanning
CN103297665A (en) * 2012-02-22 2013-09-11 庄佑华 Image acquisition system
US9578318B2 (en) 2012-03-14 2017-02-21 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US20150085080A1 (en) * 2012-04-18 2015-03-26 3Shape A/S 3d scanner using merged partial images
EP2845167A4 (en) * 2012-05-01 2016-01-13 Pelican Imaging Corp CAMERA MODULES PATTERNED WITH pi FILTER GROUPS
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US9754989B2 (en) * 2012-05-24 2017-09-05 Steven Huang Method for reading out multiple SRAM blocks with different column sizing in stitched CMOS image sensor
US8989535B2 (en) 2012-06-04 2015-03-24 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US8978981B2 (en) * 2012-06-27 2015-03-17 Honeywell International Inc. Imaging apparatus having imaging lens
US8988538B2 (en) * 2012-07-02 2015-03-24 Canon Kabushiki Kaisha Image pickup apparatus and lens apparatus
WO2014006514A2 (en) * 2012-07-04 2014-01-09 Opera Imaging B.V. Image processing in a multi-channel camera
WO2014043641A1 (en) 2012-09-14 2014-03-20 Pelican Imaging Corporation Systems and methods for correcting user identified artifacts in light field images
US9451745B1 (en) * 2012-09-21 2016-09-27 The United States Of America, As Represented By The Secretary Of Agriculture Multi-band photodiode sensor
US9766121B2 (en) * 2012-09-28 2017-09-19 Intel Corporation Mobile device based ultra-violet (UV) radiation sensing
US9398264B2 (en) 2012-10-19 2016-07-19 Qualcomm Incorporated Multi-camera system using folded optics
US9191587B2 (en) * 2012-10-26 2015-11-17 Raytheon Company Method and apparatus for image stacking
US8805115B2 (en) * 2012-11-02 2014-08-12 Raytheon Company Correction of variable offsets relying upon scene
WO2014083489A1 (en) 2012-11-28 2014-06-05 Corephotonics Ltd. High-resolution thin multi-aperture imaging systems
US20140160253A1 (en) * 2012-12-10 2014-06-12 Microsoft Corporation Hyperspectral imager
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
GB201302025D0 (en) 2013-02-05 2013-03-20 Touch Emas Ltd Improvements in or relating to prosthetics
WO2014124743A1 (en) * 2013-02-18 2014-08-21 Sony Corporation Electronic device, method for generating an image and filter arrangement
US9638883B1 (en) 2013-03-04 2017-05-02 Fotonation Cayman Limited Passive alignment of array camera modules constructed from lens stack arrays and sensors based upon alignment information obtained during manufacture of array camera modules using an active alignment process
US9521416B1 (en) 2013-03-11 2016-12-13 Kip Peli P1 Lp Systems and methods for image data compression
US9633442B2 (en) 2013-03-15 2017-04-25 Fotonation Cayman Limited Array cameras including an array camera module augmented with a separate camera
US9497370B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Array camera architecture implementing quantum dot color filters
WO2014160819A1 (en) * 2013-03-27 2014-10-02 Bae Systems Information And Electronic Systems Integration Inc. Multi field-of-view multi sensor electro-optical fusion-zoom camera
US9547231B2 (en) * 2013-06-12 2017-01-17 Avago Technologies General Ip (Singapore) Pte. Ltd. Device and method for making photomask assembly and photodetector device having light-collecting optical microstructure
JP6139713B2 (en) 2013-06-13 2017-05-31 コアフォトニクス リミテッド Dual aperture zoom digital camera
CN108549119A (en) 2013-07-04 2018-09-18 核心光电有限公司 Small-sized focal length lens external member
US9127891B2 (en) * 2013-07-10 2015-09-08 Honeywell International, Inc. Furnace visualization
CN109246339B (en) 2013-08-01 2020-10-23 核心光电有限公司 Dual aperture digital camera for imaging an object or scene
US9473708B1 (en) * 2013-08-07 2016-10-18 Google Inc. Devices and methods for an imaging system with a dual camera architecture
US10178373B2 (en) 2013-08-16 2019-01-08 Qualcomm Incorporated Stereo yaw correction using autofocus feedback
JP6403369B2 (en) 2013-09-18 2018-10-10 ローム株式会社 Photodetector and sensor package
KR102071325B1 (en) * 2013-09-27 2020-04-02 매그나칩 반도체 유한회사 Optical sensor sensing illuminance and proximity
EP3054664B1 (en) 2013-09-30 2022-06-29 Nikon Corporation Electronic device, method for controlling electronic device, and control program
US8917327B1 (en) * 2013-10-04 2014-12-23 icClarity, Inc. Method to use array sensors to measure multiple types of data at full resolution of the sensor
KR102241706B1 (en) * 2013-11-13 2021-04-19 엘지전자 주식회사 3 dimensional camera and method for controlling the same
DE102013226196A1 (en) * 2013-12-17 2015-06-18 Volkswagen Aktiengesellschaft Optical sensor system
EP3102158B1 (en) 2014-02-04 2019-06-12 Rehabilitation Institute of Chicago Modular and lightweight prosthesis components
US20160018720A1 (en) * 2014-02-19 2016-01-21 Gil BACHAR Magnetic shielding between voice coil motors in a dual-aperture camera
GB201403265D0 (en) 2014-02-25 2014-04-09 Touch Emas Ltd Prosthetic digit for use with touchscreen devices
JP6422224B2 (en) * 2014-03-17 2018-11-14 キヤノン株式会社 Compound eye optical equipment
US20150281601A1 (en) * 2014-03-25 2015-10-01 INVIS Technologies Corporation Modular Packaging and Optical System for Multi-Aperture and Multi-Spectral Camera Core
US9383550B2 (en) 2014-04-04 2016-07-05 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
US9374516B2 (en) 2014-04-04 2016-06-21 Qualcomm Incorporated Auto-focus in low-profile folded optics multi-camera system
US9247117B2 (en) 2014-04-07 2016-01-26 Pelican Imaging Corporation Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array
CN204795370U (en) * 2014-04-18 2015-11-18 菲力尔系统公司 Monitoring system and contain its vehicle
KR102269599B1 (en) 2014-04-23 2021-06-25 삼성전자주식회사 Image pickup apparatus including lens elements having different diameters
US9300877B2 (en) * 2014-05-05 2016-03-29 Omnivision Technologies, Inc. Optical zoom imaging systems and associated methods
GB201408253D0 (en) 2014-05-09 2014-06-25 Touch Emas Ltd Systems and methods for controlling a prosthetic hand
US10057509B2 (en) 2014-05-30 2018-08-21 Flir Systems, Inc. Multiple-sensor imaging system
US10013764B2 (en) 2014-06-19 2018-07-03 Qualcomm Incorporated Local adaptive histogram equalization
US9541740B2 (en) 2014-06-20 2017-01-10 Qualcomm Incorporated Folded optic array camera using refractive prisms
US9294672B2 (en) * 2014-06-20 2016-03-22 Qualcomm Incorporated Multi-camera system using folded optics free from parallax and tilt artifacts
US9386222B2 (en) 2014-06-20 2016-07-05 Qualcomm Incorporated Multi-camera system using folded optics free from parallax artifacts
US9819863B2 (en) 2014-06-20 2017-11-14 Qualcomm Incorporated Wide field of view array camera for hemispheric and spherical imaging
US9516295B2 (en) * 2014-06-30 2016-12-06 Aquifi, Inc. Systems and methods for multi-channel imaging based on multiple exposure settings
WO2016013977A1 (en) * 2014-07-25 2016-01-28 Heptagon Micro Optics Pte. Ltd. Optoelectronic modules including an image sensor having regions optically separated from one another
US9304235B2 (en) 2014-07-30 2016-04-05 Microsoft Technology Licensing, Llc Microfabrication
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
CN104111552B (en) * 2014-08-08 2017-02-01 深圳市华星光电技术有限公司 Multi-primary-color liquid crystal display and driving method thereof
US9392188B2 (en) 2014-08-10 2016-07-12 Corephotonics Ltd. Zoom dual-aperture camera with folded lens
US9225889B1 (en) 2014-08-18 2015-12-29 Entropix, Inc. Photographic image acquisition device and method
GB201417541D0 (en) 2014-10-03 2014-11-19 Touch Bionics Ltd Wrist device for a prosthetic limb
US9832381B2 (en) 2014-10-31 2017-11-28 Qualcomm Incorporated Optical image stabilization for thin cameras
US9857569B2 (en) * 2014-10-31 2018-01-02 Everready Precision Ind. Corp. Combined lens module and image capturing-and-sensing assembly
US9581696B2 (en) * 2014-12-22 2017-02-28 Google Inc. Image sensor and light source driver integrated in a same semiconductor package
US20160182846A1 (en) * 2014-12-22 2016-06-23 Google Inc. Monolithically integrated rgb pixel array and z pixel array
US9615013B2 (en) * 2014-12-22 2017-04-04 Google Inc. Image sensor having multiple output ports
WO2016108093A1 (en) 2015-01-03 2016-07-07 Corephotonics Ltd. Miniature telephoto lens module and a camera utilizing such a lens module
US9519979B1 (en) * 2015-01-23 2016-12-13 The United States Of America As Represented By The Secretary Of The Navy Ladar range data video color rendering
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US20160255323A1 (en) 2015-02-26 2016-09-01 Dual Aperture International Co. Ltd. Multi-Aperture Depth Map Using Blur Kernels and Down-Sampling
CN107407849B (en) 2015-04-02 2018-11-06 核心光电有限公司 Double magazine double voice coil coil motor structures of optical module
KR102088603B1 (en) 2015-04-16 2020-03-13 코어포토닉스 리미티드 Auto focus and optical imagestabilization in a compact folded camera
KR20230008893A (en) 2015-04-19 2023-01-16 포토내이션 리미티드 Multi-baseline camera array system architectures for depth augmentation in vr/ar applications
US9743007B2 (en) * 2015-04-23 2017-08-22 Altek Semiconductor Corp. Lens module array, image sensing device and fusing method for digital zoomed images
US10326981B2 (en) * 2015-05-15 2019-06-18 Semyon Nisenzon Generating 3D images using multi-resolution camera set
WO2016189455A1 (en) 2015-05-28 2016-12-01 Corephotonics Ltd. Bi-directional stiffness for optical image stabilization and auto-focus in a dual-aperture digital camera
CN112672022B (en) 2015-08-13 2022-08-02 核心光电有限公司 Dual aperture zoom camera with video support and switching/non-switching dynamic control
CN109901342B (en) 2015-09-06 2021-06-22 核心光电有限公司 Auto-focus and optical image stabilization with roll compensation in compact folded cameras
WO2017087542A1 (en) 2015-11-18 2017-05-26 The Board Of Trustees Of The Leland Stanford Junior University Method and systems for measuring neural activity
KR102291525B1 (en) 2015-12-29 2021-08-19 코어포토닉스 리미티드 Dual-aperture zoom digital camera with automatic adjustable tele field of view
US10451548B2 (en) * 2016-01-15 2019-10-22 The Mitre Corporation Active hyperspectral imaging system
KR102002718B1 (en) 2016-05-30 2019-10-18 코어포토닉스 리미티드 Rotary Ball-Guided Voice Coil Motor
KR101785458B1 (en) * 2016-06-07 2017-10-16 엘지전자 주식회사 Camera module and mobile terminal having the same
EP4020958B1 (en) 2016-06-19 2023-10-25 Corephotonics Ltd. Frame synchronization in a dual-aperture camera system
WO2018007951A1 (en) 2016-07-07 2018-01-11 Corephotonics Ltd. Dual-camera system with improved video smooth transition by image blending
CN107924064B (en) 2016-07-07 2020-06-12 核心光电有限公司 Linear ball guided voice coil motor for folded optical devices
US11102467B2 (en) * 2016-08-25 2021-08-24 Facebook Technologies, Llc Array detector for depth mapping
US11185426B2 (en) 2016-09-02 2021-11-30 Touch Bionics Limited Systems and methods for prosthetic wrist rotation
WO2018042215A1 (en) 2016-09-02 2018-03-08 Touch Bionics Limited Systems and methods for prosthetic wrist rotation
US10297034B2 (en) 2016-09-30 2019-05-21 Qualcomm Incorporated Systems and methods for fusing images
US10026014B2 (en) * 2016-10-26 2018-07-17 Nxp Usa, Inc. Method and apparatus for data set classification based on generator features
KR102269547B1 (en) 2016-12-28 2021-06-25 코어포토닉스 리미티드 Folded camera structure with extended light-folding-element scanning range
JP7057364B2 (en) 2017-01-12 2022-04-19 コアフォトニクス リミテッド Compact flexible camera
KR101963547B1 (en) 2017-02-23 2019-03-28 코어포토닉스 리미티드 Folded camera lens design
KR102530535B1 (en) 2017-03-15 2023-05-08 코어포토닉스 리미티드 Cameras with panoramic scanning range
CN107426471B (en) 2017-05-03 2021-02-05 Oppo广东移动通信有限公司 Camera module and electronic device
EP3568729A4 (en) * 2017-05-26 2020-02-26 SZ DJI Technology Co., Ltd. Method and system for motion camera with embedded gimbal
KR102301232B1 (en) * 2017-05-31 2021-09-10 삼성전자주식회사 Method and apparatus for processing multiple-channel feature map images
CN107277352A (en) * 2017-06-30 2017-10-20 维沃移动通信有限公司 The method and mobile terminal of a kind of shooting
US10969275B2 (en) * 2017-08-02 2021-04-06 Nanolamda Korea On-chip spectrometer employing pixel-count-modulated spectral channels and method of manufacturing the same
US10567636B2 (en) 2017-08-07 2020-02-18 Qualcomm Incorporated Resolution enhancement using sensor with plural photodiodes per microlens
US10499020B1 (en) 2017-08-17 2019-12-03 Verily Life Sciences Llc Lenslet based snapshot hyperspectral camera
US10904512B2 (en) 2017-09-06 2021-01-26 Corephotonics Ltd. Combined stereoscopic and phase detection depth mapping in a dual aperture camera
US10951834B2 (en) 2017-10-03 2021-03-16 Corephotonics Ltd. Synthetically enlarged camera aperture
US10930709B2 (en) 2017-10-03 2021-02-23 Lockheed Martin Corporation Stacked transparent pixel structures for image sensors
US10510812B2 (en) 2017-11-09 2019-12-17 Lockheed Martin Corporation Display-integrated infrared emitter and sensor structures
JP6806919B2 (en) 2017-11-23 2021-01-06 コアフォトニクス リミテッド Compact bendable camera structure
US10973660B2 (en) 2017-12-15 2021-04-13 Touch Bionics Limited Powered prosthetic thumb
EP3732508A1 (en) * 2017-12-27 2020-11-04 AMS Sensors Singapore Pte. Ltd. Optoelectronic modules and methods for operating the same
EP3848749A1 (en) 2018-02-05 2021-07-14 Corephotonics Ltd. Reduced height penalty for folded camera
US11616941B2 (en) 2018-02-07 2023-03-28 Lockheed Martin Corporation Direct camera-to-display system
US10652529B2 (en) 2018-02-07 2020-05-12 Lockheed Martin Corporation In-layer Signal processing
US10594951B2 (en) * 2018-02-07 2020-03-17 Lockheed Martin Corporation Distributed multi-aperture camera array
US10979699B2 (en) 2018-02-07 2021-04-13 Lockheed Martin Corporation Plenoptic cellular imaging system
US10951883B2 (en) 2018-02-07 2021-03-16 Lockheed Martin Corporation Distributed multi-screen array for high density display
US10838250B2 (en) 2018-02-07 2020-11-17 Lockheed Martin Corporation Display assemblies with electronically emulated transparency
US10690910B2 (en) 2018-02-07 2020-06-23 Lockheed Martin Corporation Plenoptic cellular vision correction
WO2019155289A1 (en) 2018-02-12 2019-08-15 Corephotonics Ltd. Folded camera with optical image stabilization
KR102507746B1 (en) * 2018-03-02 2023-03-09 삼성전자주식회사 Method for generating plural information using camera to sense plural wave bandwidth and apparatus thereof
US10694168B2 (en) 2018-04-22 2020-06-23 Corephotonics Ltd. System and method for mitigating or preventing eye damage from structured light IR/NIR projector systems
US11268829B2 (en) 2018-04-23 2022-03-08 Corephotonics Ltd Optical-path folding-element with an extended two degree of freedom rotation range
WO2019227974A1 (en) * 2018-06-02 2019-12-05 Oppo广东移动通信有限公司 Electronic assembly and electronic device
CN111316346B (en) 2018-08-04 2022-11-29 核心光电有限公司 Switchable continuous display information system above camera
TWI768103B (en) * 2018-08-16 2022-06-21 先進光電科技股份有限公司 Optical image capturing module、system and manufacturing method thereof
US11635596B2 (en) 2018-08-22 2023-04-25 Corephotonics Ltd. Two-state zoom folded camera
TWI768127B (en) * 2018-09-21 2022-06-21 先進光電科技股份有限公司 Optical image capturing module, optical image system and optical image capturing manufacture method
US10866413B2 (en) 2018-12-03 2020-12-15 Lockheed Martin Corporation Eccentric incident luminance pupil tracking
KR102558301B1 (en) * 2018-12-13 2023-07-24 에스케이하이닉스 주식회사 Image Sensing Device Having an Organic Pixel Array and an Inorganic Pixel Array
CN111919057B (en) 2019-01-07 2021-08-31 核心光电有限公司 Rotating mechanism with sliding joint
WO2020183312A1 (en) 2019-03-09 2020-09-17 Corephotonics Ltd. System and method for dynamic stereoscopic calibration
US10698201B1 (en) 2019-04-02 2020-06-30 Lockheed Martin Corporation Plenoptic cellular axis redirection
US10921450B2 (en) * 2019-04-24 2021-02-16 Aeye, Inc. Ladar system and method with frequency domain shuttering
WO2020250774A1 (en) * 2019-06-11 2020-12-17 富士フイルム株式会社 Imaging device
JP2022537117A (en) 2019-06-21 2022-08-24 ザ ガバニング カウンシル オブ ザ ユニバーシティ オブ トロント Method and system for extending image dynamic range using pixel-by-pixel encoding of pixel parameters
CN110392149A (en) 2019-07-23 2019-10-29 华为技术有限公司 Image capture display terminal
KR102365748B1 (en) 2019-07-31 2022-02-23 코어포토닉스 리미티드 System and method for creating background blur in camera panning or motion
JP7314752B2 (en) * 2019-09-30 2023-07-26 株式会社リコー PHOTOELECTRIC CONVERSION ELEMENT, READING DEVICE, IMAGE PROCESSING DEVICE, AND METHOD FOR MANUFACTURING PHOTOELECTRIC CONVERSION ELEMENT
US11659135B2 (en) 2019-10-30 2023-05-23 Corephotonics Ltd. Slow or fast motion video using depth information
US11931270B2 (en) 2019-11-15 2024-03-19 Touch Bionics Limited Prosthetic digit actuator
US11470287B2 (en) 2019-12-05 2022-10-11 Samsung Electronics Co., Ltd. Color imaging apparatus using monochrome sensors for mobile devices
US11770618B2 (en) 2019-12-09 2023-09-26 Corephotonics Ltd. Systems and methods for obtaining a smart panoramic image
US11949976B2 (en) 2019-12-09 2024-04-02 Corephotonics Ltd. Systems and methods for obtaining a smart panoramic image
CN114144898B (en) 2020-04-26 2022-11-04 核心光电有限公司 Temperature control for Hall bar sensor calibration
US11509837B2 (en) 2020-05-12 2022-11-22 Qualcomm Incorporated Camera transition blending
KR20230020585A (en) 2020-05-17 2023-02-10 코어포토닉스 리미티드 Image stitching in the presence of a full field of view reference image
US11770609B2 (en) 2020-05-30 2023-09-26 Corephotonics Ltd. Systems and methods for obtaining a super macro image
US11637977B2 (en) 2020-07-15 2023-04-25 Corephotonics Ltd. Image sensors and sensing methods to obtain time-of-flight and phase detection information
CN116125660A (en) 2020-07-15 2023-05-16 核心光电有限公司 Method for correcting viewpoint aberration of scan folding camera
EP4065934A4 (en) 2020-07-31 2023-07-26 Corephotonics Ltd. Hall sensor-magnet geometry for large stroke linear position sensing
US11917272B2 (en) * 2020-10-28 2024-02-27 Semiconductor Components Industries, Llc Imaging systems for multi-spectral imaging
WO2023154946A1 (en) * 2022-02-14 2023-08-17 Tunoptix, Inc. Systems and methods for high quality imaging using a color-splitting meta-optical computation camera
WO2023171470A1 (en) * 2022-03-11 2023-09-14 パナソニックIpマネジメント株式会社 Light detection device, light detection system, and filter array

Family Cites Families (122)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1224063A (en) * 1968-09-04 1971-03-03 Emi Ltd Improvements in or relating to static split photo-sensors
US3676317A (en) 1970-10-23 1972-07-11 Stromberg Datagraphix Inc Sputter etching process
US3806633A (en) 1972-01-18 1974-04-23 Westinghouse Electric Corp Multispectral data sensor and display system
US3971065A (en) * 1975-03-05 1976-07-20 Eastman Kodak Company Color imaging array
US4028725A (en) * 1976-04-21 1977-06-07 Grumman Aerospace Corporation High-resolution vision system
US4323925A (en) * 1980-07-07 1982-04-06 Avco Everett Research Laboratory, Inc. Method and apparatus for arraying image sensor modules
US4385373A (en) * 1980-11-10 1983-05-24 Eastman Kodak Company Device for focus and alignment control in optical recording and/or playback apparatus
US4554460A (en) 1982-07-02 1985-11-19 Kollmorgen Technologies Corp. Photodetector automatic adaptive sensitivity system
JPS6211264U (en) 1985-07-04 1987-01-23
JPS6211264A (en) 1985-07-09 1987-01-20 Fuji Photo Film Co Ltd Solid-state image pickup device
US4679068A (en) 1985-07-25 1987-07-07 General Electric Company Composite visible/thermal-infrared imaging system
US4688080A (en) 1985-09-27 1987-08-18 Ampex Corporation Multi-standard adaptive chrominance separator
GB2207020B (en) * 1987-07-08 1991-08-21 Gec Avionics Imaging system
US4751571A (en) 1987-07-29 1988-06-14 General Electric Company Composite visible/thermal-infrared imaging apparatus
JPH01161326A (en) * 1987-12-18 1989-06-26 Asahi Optical Co Ltd Lens moving mechanism for focal distance variable lens
DE58902538D1 (en) * 1988-05-19 1992-12-03 Siemens Ag METHOD FOR OBSERVING A SCENE AND DEVICE FOR IMPLEMENTING THE METHOD.
DE3927334C1 (en) * 1989-08-18 1991-01-10 Messerschmitt-Boelkow-Blohm Gmbh, 8012 Ottobrunn, De
JP3261152B2 (en) * 1991-03-13 2002-02-25 シャープ株式会社 Imaging device having a plurality of optical systems
US6347163B2 (en) * 1994-10-26 2002-02-12 Symbol Technologies, Inc. System for reading two-dimensional images using ambient and/or projected light
US5317394A (en) 1992-04-30 1994-05-31 Westinghouse Electric Corp. Distributed aperture imaging and tracking system
JPH06133191A (en) 1992-10-16 1994-05-13 Canon Inc Image pickup device
US5850479A (en) * 1992-11-13 1998-12-15 The Johns Hopkins University Optical feature extraction apparatus and encoding method for detection of DNA sequences
DE69321078T2 (en) 1992-11-20 1999-02-25 Picker Int Inc System for a panorama camera
DK45493D0 (en) * 1993-04-21 1993-04-21 Vm Acoustics Aps ADJUSTABLE SUSPENSION MOUNTING FOR WALL MOUNTING EX. FOR SPEAKERS
US5694165A (en) * 1993-10-22 1997-12-02 Canon Kabushiki Kaisha High definition image taking apparatus having plural image sensors
US6486503B1 (en) 1994-01-28 2002-11-26 California Institute Of Technology Active pixel sensor array with electronic shuttering
US5766980A (en) * 1994-03-25 1998-06-16 Matsushita Electronics Corporation Method of manufacturing a solid state imaging device
JPH08172635A (en) * 1994-12-16 1996-07-02 Minolta Co Ltd Image pickup device
US5515109A (en) 1995-04-05 1996-05-07 Ultimatte Corporation Backing color and luminance nonuniformity compensation
US5694155A (en) * 1995-04-25 1997-12-02 Stapleton; Robert E. Flat panel display with edge contacting image area and method of manufacture thereof
US5604534A (en) 1995-05-24 1997-02-18 Omni Solutions International, Ltd. Direct digital airborne panoramic camera system and method
JPH0934422A (en) 1995-07-19 1997-02-07 Sony Corp Image signal processing method and image device
US5691765A (en) * 1995-07-27 1997-11-25 Sensormatic Electronics Corporation Image forming and processing device and method for use with no moving parts camera
US5742659A (en) * 1996-08-26 1998-04-21 Universities Research Assoc., Inc. High resolution biomedical imaging system with direct detection of x-rays via a charge coupled device
US6137535A (en) * 1996-11-04 2000-10-24 Eastman Kodak Company Compact digital camera with segmented fields of view
JPH10243296A (en) * 1997-02-26 1998-09-11 Nikon Corp Image-pickup device and drive method for the same
US6366319B1 (en) 1997-07-03 2002-04-02 Photronics Corp. Subtractive color processing system for digital imaging
US6714239B2 (en) * 1997-10-29 2004-03-30 Eastman Kodak Company Active pixel sensor with programmable color balance
NO305728B1 (en) * 1997-11-14 1999-07-12 Reidar E Tangen Optoelectronic camera and method of image formatting in the same
US6381072B1 (en) * 1998-01-23 2002-04-30 Proxemics Lenslet array systems and methods
US7170665B2 (en) * 2002-07-24 2007-01-30 Olympus Corporation Optical unit provided with an actuator
US6100937A (en) 1998-05-29 2000-08-08 Conexant Systems, Inc. Method and system for combining multiple images into a single higher-quality image
JP3771054B2 (en) * 1998-07-01 2006-04-26 株式会社リコー Image processing apparatus and image processing method
US6903770B1 (en) * 1998-07-27 2005-06-07 Sanyo Electric Co., Ltd. Digital camera which produces a single image based on two exposures
KR100284306B1 (en) 1998-10-14 2001-03-02 김영환 Unit pixel driving method to improve image sensor image quality
GB2345217A (en) 1998-12-23 2000-06-28 Nokia Mobile Phones Ltd Colour video image sensor
US6611289B1 (en) * 1999-01-15 2003-08-26 Yanbin Yu Digital cameras using multiple sensors with multiple lenses
US20030029651A1 (en) 1999-02-03 2003-02-13 Palmeri Frank A. Electronically controlled tractor trailer propulsion braking and stability systems
US6366025B1 (en) * 1999-02-26 2002-04-02 Sanyo Electric Co., Ltd. Electroluminescence display apparatus
US6570613B1 (en) * 1999-02-26 2003-05-27 Paul Howell Resolution-enhancement method for digital imaging
US6727521B2 (en) * 2000-09-25 2004-04-27 Foveon, Inc. Vertical color filter detector group and array
US6859299B1 (en) * 1999-06-11 2005-02-22 Jung-Chih Chiao MEMS optical components
US6882368B1 (en) * 1999-06-30 2005-04-19 Canon Kabushiki Kaisha Image pickup apparatus
US6833873B1 (en) * 1999-06-30 2004-12-21 Canon Kabushiki Kaisha Image pickup apparatus
US6885404B1 (en) * 1999-06-30 2005-04-26 Canon Kabushiki Kaisha Image pickup apparatus
US6375075B1 (en) 1999-10-18 2002-04-23 Intermec Ip Corp. Method and apparatus for reading machine-readable symbols including color symbol elements
US6960817B2 (en) * 2000-04-21 2005-11-01 Canon Kabushiki Kaisha Solid-state imaging device
US6437335B1 (en) * 2000-07-06 2002-08-20 Hewlett-Packard Company High speed scanner using multiple sensing devices
EP1176808A3 (en) * 2000-07-27 2003-01-02 Canon Kabushiki Kaisha Image sensing apparatus
US6946647B1 (en) * 2000-08-10 2005-09-20 Raytheon Company Multicolor staring missile sensor system
US6952228B2 (en) * 2000-10-13 2005-10-04 Canon Kabushiki Kaisha Image pickup apparatus
US7139028B2 (en) * 2000-10-17 2006-11-21 Canon Kabushiki Kaisha Image pickup apparatus
US7262799B2 (en) * 2000-10-25 2007-08-28 Canon Kabushiki Kaisha Image sensing apparatus and its control method, control program, and storage medium
US7128266B2 (en) * 2003-11-13 2006-10-31 Metrologic Instruments, Inc. Hand-supportable digital imaging-based bar code symbol reader supporting narrow-area and wide-area modes of illumination and image capture
JP2002252338A (en) * 2000-12-18 2002-09-06 Canon Inc Imaging device and imaging system
JP2002209226A (en) * 2000-12-28 2002-07-26 Canon Inc Image pickup device
JP2002290793A (en) * 2001-03-28 2002-10-04 Mitsubishi Electric Corp Mobile phone with image pickup device
JP2003037757A (en) * 2001-07-25 2003-02-07 Fuji Photo Film Co Ltd Imaging unit
US7362357B2 (en) * 2001-08-07 2008-04-22 Signature Research, Inc. Calibration of digital color imagery
CN1240229C (en) 2001-10-12 2006-02-01 松下电器产业株式会社 Brightness signal/chroma signal separator and brightness signal/chroma signal separation method
US7239345B1 (en) * 2001-10-12 2007-07-03 Worldscape, Inc. Camera arrangements with backlighting detection and methods of using same
JP2003143459A (en) * 2001-11-02 2003-05-16 Canon Inc Compound-eye image pickup system and device provided therewith
US6617565B2 (en) * 2001-11-06 2003-09-09 Omnivision Technologies, Inc. CMOS image sensor with on-chip pattern recognition
US7054491B2 (en) * 2001-11-16 2006-05-30 Stmicroelectronics, Inc. Scalable architecture for corresponding multiple video streams at frame rate
JP3811403B2 (en) * 2001-12-28 2006-08-23 日本圧着端子製造株式会社 A connector with a locking member that can be mounted from either the front or back wall of the panel
US7436038B2 (en) 2002-02-05 2008-10-14 E-Phocus, Inc Visible/near infrared image sensor array
US20030151685A1 (en) 2002-02-11 2003-08-14 La Grone Marcus J. Digital video camera having only two CCDs
JP4198449B2 (en) * 2002-02-22 2008-12-17 富士フイルム株式会社 Digital camera
US6841816B2 (en) * 2002-03-20 2005-01-11 Foveon, Inc. Vertical color filter sensor group with non-sensor filter and method for fabricating such a sensor group
US7129466B2 (en) * 2002-05-08 2006-10-31 Canon Kabushiki Kaisha Color image pickup device and color light-receiving device
JP2004032172A (en) * 2002-06-24 2004-01-29 Canon Inc Fly-eye imaging device and equipment comprising the same
US20040027687A1 (en) * 2002-07-03 2004-02-12 Wilfried Bittner Compact zoom lens barrel and system
US20040012689A1 (en) * 2002-07-16 2004-01-22 Fairchild Imaging Charge coupled devices in tiled arrays
US20040012688A1 (en) * 2002-07-16 2004-01-22 Fairchild Imaging Large area charge coupled device camera
CN1234234C (en) * 2002-09-30 2005-12-28 松下电器产业株式会社 Solid-state photographic device and equipment using the photographic device
KR20040036087A (en) * 2002-10-23 2004-04-30 주식회사 하이닉스반도체 CMOS image sensor having different depth of photodiode by Wavelength of light
JP4269334B2 (en) * 2002-10-28 2009-05-27 コニカミノルタホールディングス株式会社 Imaging lens, imaging unit, and portable terminal
WO2004071069A2 (en) * 2003-02-03 2004-08-19 Goodrich Corporation Random access imaging sensor
US20040183918A1 (en) * 2003-03-20 2004-09-23 Eastman Kodak Company Producing enhanced photographic products from images captured at known picture sites
US7379104B2 (en) * 2003-05-02 2008-05-27 Canon Kabushiki Kaisha Correction apparatus
US6834161B1 (en) * 2003-05-29 2004-12-21 Eastman Kodak Company Camera assembly having coverglass-lens adjuster
US7095561B2 (en) * 2003-07-29 2006-08-22 Wavefront Research, Inc. Compact telephoto imaging lens systems
JP4113063B2 (en) * 2003-08-18 2008-07-02 株式会社リガク Method for detecting specific polymer crystals
US7115853B2 (en) * 2003-09-23 2006-10-03 Micron Technology, Inc. Micro-lens configuration for small lens focusing in digital imaging devices
US20050128509A1 (en) * 2003-12-11 2005-06-16 Timo Tokkonen Image creating method and imaging device
FI20031816A0 (en) * 2003-12-11 2003-12-11 Nokia Corp Method and device for creating an image
US7453510B2 (en) * 2003-12-11 2008-11-18 Nokia Corporation Imaging device
US7511749B2 (en) * 2003-12-18 2009-03-31 Aptina Imaging Corporation Color image sensor having imaging element array forming images on respective regions of sensor elements
US7123298B2 (en) * 2003-12-18 2006-10-17 Avago Technologies Sensor Ip Pte. Ltd. Color image sensor with imaging elements imaging on respective regions of sensor elements
US7151653B2 (en) * 2004-02-18 2006-12-19 Hitachi Global Technologies Netherlands B.V. Depositing a pinned layer structure in a self-pinned spin valve
EP1594321A3 (en) 2004-05-07 2006-01-25 Dialog Semiconductor GmbH Extended dynamic range in color imagers
EP1608183A1 (en) 2004-06-14 2005-12-21 Dialog Semiconductor GmbH Matrix circuit for imaging sensors
US7095159B2 (en) * 2004-06-29 2006-08-22 Avago Technologies Sensor Ip (Singapore) Pte. Ltd. Devices with mechanical drivers for displaceable elements
US7570809B1 (en) * 2004-07-03 2009-08-04 Hrl Laboratories, Llc Method for automatic color balancing in digital images
US7564019B2 (en) 2005-08-25 2009-07-21 Richard Ian Olsen Large dynamic range cameras
US7417674B2 (en) * 2004-08-25 2008-08-26 Micron Technology, Inc. Multi-magnification color image sensor
WO2006026354A2 (en) * 2004-08-25 2006-03-09 Newport Imaging Corporation Apparatus for multiple camera devices and method of operating same
US7280290B2 (en) * 2004-09-16 2007-10-09 Sony Corporation Movable lens mechanism
US7460160B2 (en) * 2004-09-24 2008-12-02 Microsoft Corporation Multispectral digital camera employing both visible light and non-visible light sensing on a single image sensor
US7545422B2 (en) * 2004-10-27 2009-06-09 Aptina Imaging Corporation Imaging system
US7214926B2 (en) * 2004-11-19 2007-05-08 Micron Technology, Inc. Imaging systems and methods
US7483065B2 (en) * 2004-12-15 2009-01-27 Aptina Imaging Corporation Multi-lens imaging systems and methods using optical filters having mosaic patterns
KR100597651B1 (en) 2005-01-24 2006-07-05 한국과학기술원 Image sensor, apparatus and method for changing a real image for an electric signal
US7663662B2 (en) * 2005-02-09 2010-02-16 Flir Systems, Inc. High and low resolution camera systems and methods
US20060187322A1 (en) * 2005-02-18 2006-08-24 Janson Wilbert F Jr Digital camera using multiple fixed focal length lenses and multiple image sensors to provide an extended zoom range
US7256944B2 (en) * 2005-02-18 2007-08-14 Eastman Kodak Company Compact image capture assembly using multiple lenses and image sensors to provide an extended zoom range
US7236306B2 (en) * 2005-02-18 2007-06-26 Eastman Kodak Company Digital camera using an express zooming mode to provide expedited operation over an extended zoom range
US7206136B2 (en) * 2005-02-18 2007-04-17 Eastman Kodak Company Digital camera using multiple lenses and image sensors to provide an extended zoom range
US7561191B2 (en) * 2005-02-18 2009-07-14 Eastman Kodak Company Camera phone using multiple lenses and image sensors to provide an extended zoom range
US7358483B2 (en) * 2005-06-30 2008-04-15 Konica Minolta Holdings, Inc. Method of fixing an optical element and method of manufacturing optical module including the use of a light transmissive loading jig
US20070102622A1 (en) * 2005-07-01 2007-05-10 Olsen Richard I Apparatus for multiple camera devices and method of operating same
EP1938136A2 (en) * 2005-10-16 2008-07-02 Mediapod LLC Apparatus, system and method for increasing quality of digital image capture

Cited By (141)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US9749547B2 (en) 2008-05-20 2017-08-29 Fotonation Cayman Limited Capturing and processing of images using camera array incorporating Bayer cameras having different fields of view
US9712759B2 (en) 2008-05-20 2017-07-18 Fotonation Cayman Limited Systems and methods for generating depth maps using a camera arrays incorporating monochrome and color cameras
US9576369B2 (en) 2008-05-20 2017-02-21 Fotonation Cayman Limited Systems and methods for generating depth maps using images captured by camera arrays incorporating cameras having different fields of view
US11412158B2 (en) 2008-05-20 2022-08-09 Fotonation Limited Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US9485496B2 (en) 2008-05-20 2016-11-01 Pelican Imaging Corporation Systems and methods for measuring depth using images captured by a camera array including cameras surrounding a central camera
US10142560B2 (en) 2008-05-20 2018-11-27 Fotonation Limited Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US10027901B2 (en) 2008-05-20 2018-07-17 Fotonation Cayman Limited Systems and methods for generating depth maps using a camera arrays incorporating monochrome and color cameras
US10306120B2 (en) 2009-11-20 2019-05-28 Fotonation Limited Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps
CN102131044B (en) * 2010-01-20 2014-03-26 鸿富锦精密工业(深圳)有限公司 Camera module
CN102131044A (en) * 2010-01-20 2011-07-20 鸿富锦精密工业(深圳)有限公司 Camera module
US10455168B2 (en) 2010-05-12 2019-10-22 Fotonation Limited Imager array interfaces
CN102170571A (en) * 2010-06-22 2011-08-31 上海盈方微电子有限公司 Digital still camera framework for supporting two-channel CMOS (Complementary Metal Oxide Semiconductor) sensor
US11875475B2 (en) 2010-12-14 2024-01-16 Adeia Imaging Llc Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US10366472B2 (en) 2010-12-14 2019-07-30 Fotonation Limited Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US11423513B2 (en) 2010-12-14 2022-08-23 Fotonation Limited Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US10218889B2 (en) 2011-05-11 2019-02-26 Fotonation Limited Systems and methods for transmitting and receiving array camera image data
US10742861B2 (en) 2011-05-11 2020-08-11 Fotonation Limited Systems and methods for transmitting and receiving array camera image data
CN103004218B (en) * 2011-05-19 2016-02-24 松下知识产权经营株式会社 Three-dimensional image pickup device, imaging apparatus, transmittance section and image processing apparatus
US9179127B2 (en) 2011-05-19 2015-11-03 Panasonic Intellectual Property Management Co., Ltd. Three-dimensional imaging device, imaging element, light transmissive portion, and image processing device
CN103004218A (en) * 2011-05-19 2013-03-27 松下电器产业株式会社 Three-dimensional imaging device, imaging element, light transmissive portion, and image processing device
CN102790849A (en) * 2011-05-20 2012-11-21 英属开曼群岛商恒景科技股份有限公司 Image sensor module
US9578237B2 (en) 2011-06-28 2017-02-21 Fotonation Cayman Limited Array cameras incorporating optics with modulation transfer functions greater than sensor Nyquist frequency for capture of images used in super-resolution processing
CN102857699A (en) * 2011-06-29 2013-01-02 全友电脑股份有限公司 Image capturing system and method
US9794476B2 (en) 2011-09-19 2017-10-17 Fotonation Cayman Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures
US10375302B2 (en) 2011-09-19 2019-08-06 Fotonation Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures
US10275676B2 (en) 2011-09-28 2019-04-30 Fotonation Limited Systems and methods for encoding image files containing depth maps stored as metadata
US10984276B2 (en) 2011-09-28 2021-04-20 Fotonation Limited Systems and methods for encoding image files containing depth maps stored as metadata
US10430682B2 (en) 2011-09-28 2019-10-01 Fotonation Limited Systems and methods for decoding image files containing depth maps stored as metadata
US10019816B2 (en) 2011-09-28 2018-07-10 Fotonation Cayman Limited Systems and methods for decoding image files containing depth maps stored as metadata
US9811753B2 (en) 2011-09-28 2017-11-07 Fotonation Cayman Limited Systems and methods for encoding light field image files
US9536166B2 (en) 2011-09-28 2017-01-03 Kip Peli P1 Lp Systems and methods for decoding image files containing depth maps stored as metadata
US20180197035A1 (en) 2011-09-28 2018-07-12 Fotonation Cayman Limited Systems and Methods for Encoding Image Files Containing Depth Maps Stored as Metadata
US11729365B2 (en) 2011-09-28 2023-08-15 Adeia Imaging Llc Systems and methods for encoding image files containing depth maps stored as metadata
CN104185808A (en) * 2011-10-11 2014-12-03 派力肯影像公司 Lens stack arrays including adaptive optical elements
US9754422B2 (en) 2012-02-21 2017-09-05 Fotonation Cayman Limited Systems and method for performing depth based image editing
US9412206B2 (en) 2012-02-21 2016-08-09 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US10311649B2 (en) 2012-02-21 2019-06-04 Fotonation Limited Systems and method for performing depth based image editing
US9706132B2 (en) 2012-05-01 2017-07-11 Fotonation Cayman Limited Camera modules patterned with pi filter groups
CN103458162A (en) * 2012-06-01 2013-12-18 全视科技有限公司 Lens array for partitioned image sensor
CN103458162B (en) * 2012-06-01 2018-06-26 豪威科技股份有限公司 For the lens array of sectional image sensor
CN103516962A (en) * 2012-06-19 2014-01-15 全友电脑股份有限公司 Image capturing system and method
US9807382B2 (en) 2012-06-28 2017-10-31 Fotonation Cayman Limited Systems and methods for detecting defective camera arrays and optic arrays
US10334241B2 (en) 2012-06-28 2019-06-25 Fotonation Limited Systems and methods for detecting defective camera arrays and optic arrays
US10261219B2 (en) 2012-06-30 2019-04-16 Fotonation Limited Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors
US11022725B2 (en) 2012-06-30 2021-06-01 Fotonation Limited Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors
CN103581533A (en) * 2012-08-07 2014-02-12 联想(北京)有限公司 Method and electronic equipment for collecting image information
US9858673B2 (en) 2012-08-21 2018-01-02 Fotonation Cayman Limited Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints
US10380752B2 (en) 2012-08-21 2019-08-13 Fotonation Limited Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints
US9813616B2 (en) 2012-08-23 2017-11-07 Fotonation Cayman Limited Feature based high resolution motion estimation from low resolution images captured using an array source
US10462362B2 (en) 2012-08-23 2019-10-29 Fotonation Limited Feature based high resolution motion estimation from low resolution images captured using an array source
US10390005B2 (en) 2012-09-28 2019-08-20 Fotonation Limited Generating images from light fields utilizing virtual viewpoints
US9749568B2 (en) 2012-11-13 2017-08-29 Fotonation Cayman Limited Systems and methods for array camera focal plane control
US10009538B2 (en) 2013-02-21 2018-06-26 Fotonation Cayman Limited Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
US9743051B2 (en) 2013-02-24 2017-08-22 Fotonation Cayman Limited Thin form factor computational array cameras and modular array cameras
US9374512B2 (en) 2013-02-24 2016-06-21 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
US9774831B2 (en) 2013-02-24 2017-09-26 Fotonation Cayman Limited Thin form factor computational array cameras and modular array cameras
US9917998B2 (en) 2013-03-08 2018-03-13 Fotonation Cayman Limited Systems and methods for measuring scene information while capturing images using array cameras
US9774789B2 (en) 2013-03-08 2017-09-26 Fotonation Cayman Limited Systems and methods for high dynamic range imaging using array cameras
US10958892B2 (en) 2013-03-10 2021-03-23 Fotonation Limited System and methods for calibration of an array camera
US11272161B2 (en) 2013-03-10 2022-03-08 Fotonation Limited System and methods for calibration of an array camera
US11570423B2 (en) 2013-03-10 2023-01-31 Adeia Imaging Llc System and methods for calibration of an array camera
US10225543B2 (en) 2013-03-10 2019-03-05 Fotonation Limited System and methods for calibration of an array camera
US9986224B2 (en) 2013-03-10 2018-05-29 Fotonation Cayman Limited System and methods for calibration of an array camera
US9800856B2 (en) 2013-03-13 2017-10-24 Fotonation Cayman Limited Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
US10127682B2 (en) 2013-03-13 2018-11-13 Fotonation Limited System and methods for calibration of an array camera
US9733486B2 (en) 2013-03-13 2017-08-15 Fotonation Cayman Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
US9888194B2 (en) 2013-03-13 2018-02-06 Fotonation Cayman Limited Array camera architecture implementing quantum film image sensors
US10547772B2 (en) 2013-03-14 2020-01-28 Fotonation Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US10091405B2 (en) 2013-03-14 2018-10-02 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US10412314B2 (en) 2013-03-14 2019-09-10 Fotonation Limited Systems and methods for photometric normalization in array cameras
US10182216B2 (en) 2013-03-15 2019-01-15 Fotonation Limited Extended color processing on pelican array cameras
US10638099B2 (en) 2013-03-15 2020-04-28 Fotonation Limited Extended color processing on pelican array cameras
US9800859B2 (en) 2013-03-15 2017-10-24 Fotonation Cayman Limited Systems and methods for estimating depth using stereo array cameras
US10455218B2 (en) 2013-03-15 2019-10-22 Fotonation Limited Systems and methods for estimating depth using stereo array cameras
US10542208B2 (en) 2013-03-15 2020-01-21 Fotonation Limited Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US9955070B2 (en) 2013-03-15 2018-04-24 Fotonation Cayman Limited Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
US10674138B2 (en) 2013-03-15 2020-06-02 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
CN105579902B (en) * 2013-09-23 2019-06-28 Lg伊诺特有限公司 A method of manufacture camera model
US10151859B2 (en) 2013-09-23 2018-12-11 Lg Innotek Co., Ltd. Camera module and manufacturing method for same
CN105579902A (en) * 2013-09-23 2016-05-11 Lg伊诺特有限公司 Camera module and manufacturing method for same
US10540806B2 (en) 2013-09-27 2020-01-21 Fotonation Limited Systems and methods for depth-assisted perspective distortion correction
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
US9924092B2 (en) 2013-11-07 2018-03-20 Fotonation Cayman Limited Array cameras incorporating independently aligned lens stacks
US10767981B2 (en) 2013-11-18 2020-09-08 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US10119808B2 (en) 2013-11-18 2018-11-06 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US11486698B2 (en) 2013-11-18 2022-11-01 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US10708492B2 (en) 2013-11-26 2020-07-07 Fotonation Limited Array camera configurations incorporating constituent array cameras and constituent cameras
US9813617B2 (en) 2013-11-26 2017-11-07 Fotonation Cayman Limited Array camera configurations incorporating constituent array cameras and constituent cameras
US9690970B2 (en) 2014-02-17 2017-06-27 Eyesmart Technology Ltd. Method and device for mobile terminal biometric feature imaging
CN103870805A (en) * 2014-02-17 2014-06-18 北京释码大华科技有限公司 Mobile terminal biological characteristic imaging method and device
CN103870805B (en) * 2014-02-17 2017-08-15 北京释码大华科技有限公司 A kind of mobile terminal biological characteristic imaging method and device
US10089740B2 (en) 2014-03-07 2018-10-02 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US10574905B2 (en) 2014-03-07 2020-02-25 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US9521319B2 (en) 2014-06-18 2016-12-13 Pelican Imaging Corporation Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor
US10542196B2 (en) 2014-06-24 2020-01-21 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Device and method for relative positioning of multi-aperture optics comprising several optical channels in relation to an image sensor
CN106716486A (en) * 2014-06-24 2017-05-24 弗劳恩霍夫应用研究促进协会 Device and method for positioning a multi-aperture optical unit with multiple optical channels relative to an image sensor
CN106537890A (en) * 2014-07-16 2017-03-22 索尼公司 Compound-eye imaging device
US10250871B2 (en) 2014-09-29 2019-04-02 Fotonation Limited Systems and methods for dynamic calibration of array cameras
US11546576B2 (en) 2014-09-29 2023-01-03 Adeia Imaging Llc Systems and methods for dynamic calibration of array cameras
CN105629426B (en) * 2014-10-31 2019-05-24 高准精密工业股份有限公司 Variable focus lens package and varifocal camera module
CN105629426A (en) * 2014-10-31 2016-06-01 高准精密工业股份有限公司 Zooming lens module and zooming camera module group
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
CN108140247A (en) * 2015-10-05 2018-06-08 谷歌有限责任公司 Use the camera calibrated of composograph
CN108140247B (en) * 2015-10-05 2022-07-05 谷歌有限责任公司 Method and apparatus for camera calibration using composite images
CN107431746A (en) * 2015-11-24 2017-12-01 索尼半导体解决方案公司 Camera model and electronic equipment
CN107431746B (en) * 2015-11-24 2021-12-14 索尼半导体解决方案公司 Camera module and electronic device
TWI781085B (en) * 2015-11-24 2022-10-21 日商索尼半導體解決方案公司 Fly-eye lens module and fly-eye camera module
CN105704465A (en) * 2016-01-20 2016-06-22 海信电子科技(深圳)有限公司 Image processing method and terminal
CN107193095A (en) * 2016-03-14 2017-09-22 比亚迪股份有限公司 The method of adjustment of optical filter, apparatus and system
CN107193095B (en) * 2016-03-14 2020-07-10 比亚迪股份有限公司 Method, device and system for adjusting optical filter
CN109155814A (en) * 2016-05-27 2019-01-04 索尼半导体解决方案公司 Processing unit, imaging sensor and system
CN109644258A (en) * 2016-08-31 2019-04-16 华为技术有限公司 Multicamera system for zoom shot
US10616493B2 (en) 2016-08-31 2020-04-07 Huawei Technologies Co., Ltd. Multi camera system for zoom
CN109644258B (en) * 2016-08-31 2020-06-02 华为技术有限公司 Multi-camera system for zoom photography
CN106768325A (en) * 2016-11-21 2017-05-31 清华大学 Multispectral light-field video acquisition device
CN110476118A (en) * 2017-04-01 2019-11-19 深圳市大疆创新科技有限公司 Low profile multiband high light spectrum image-forming for machine vision
CN110476118B (en) * 2017-04-01 2021-10-15 深圳市大疆创新科技有限公司 Low profile multiband hyperspectral imaging for machine vision
WO2018176493A1 (en) * 2017-04-01 2018-10-04 SZ DJI Technology Co., Ltd. Low-profile multi-band hyperspectral imaging for machine vision
US10962858B2 (en) 2017-04-01 2021-03-30 SZ DJI Technology Co., Ltd. Low-profile multi-band hyperspectral imaging for machine vision
US10818026B2 (en) 2017-08-21 2020-10-27 Fotonation Limited Systems and methods for hybrid depth regularization
US11562498B2 (en) 2017-08-21 2023-01-24 Adeia Imaging Llc Systems and methods for hybrid depth regularization
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
CN113273183B (en) * 2019-01-07 2024-03-08 Lg伊诺特有限公司 Camera module
CN113273183A (en) * 2019-01-07 2021-08-17 Lg伊诺特有限公司 Camera module
CN113924517B (en) * 2019-06-06 2024-03-22 应用材料公司 Imaging system and method for generating composite image
CN113924517A (en) * 2019-06-06 2022-01-11 应用材料公司 Imaging system and method for generating composite image
US11270110B2 (en) 2019-09-17 2022-03-08 Boston Polarimetrics, Inc. Systems and methods for surface modeling using polarization cues
US11699273B2 (en) 2019-09-17 2023-07-11 Intrinsic Innovation Llc Systems and methods for surface modeling using polarization cues
US11525906B2 (en) 2019-10-07 2022-12-13 Intrinsic Innovation Llc Systems and methods for augmentation of sensor systems and imaging systems with polarization
US11842495B2 (en) 2019-11-30 2023-12-12 Intrinsic Innovation Llc Systems and methods for transparent object segmentation using polarization cues
US11302012B2 (en) 2019-11-30 2022-04-12 Boston Polarimetrics, Inc. Systems and methods for transparent object segmentation using polarization cues
US11580667B2 (en) 2020-01-29 2023-02-14 Intrinsic Innovation Llc Systems and methods for characterizing object pose detection and measurement systems
US11797863B2 (en) 2020-01-30 2023-10-24 Intrinsic Innovation Llc Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images
US11683594B2 (en) 2021-04-15 2023-06-20 Intrinsic Innovation Llc Systems and methods for camera exposure control
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11953700B2 (en) 2021-05-27 2024-04-09 Intrinsic Innovation Llc Multi-aperture polarization optical systems using beam splitters
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers

Also Published As

Publication number Publication date
US20080030597A1 (en) 2008-02-07
US20060054787A1 (en) 2006-03-16
US20110108708A1 (en) 2011-05-12
US7199348B2 (en) 2007-04-03
US10142548B2 (en) 2018-11-27
US20140232894A1 (en) 2014-08-21
CN101427372B (en) 2012-12-12
US8664579B2 (en) 2014-03-04
US7884309B2 (en) 2011-02-08
US20160234443A1 (en) 2016-08-11
EP1812968B1 (en) 2019-01-16
WO2006026354A3 (en) 2009-05-14
US20100208100A9 (en) 2010-08-19
EP1812968A4 (en) 2010-03-31
EP1812968A2 (en) 2007-08-01
US20060054782A1 (en) 2006-03-16
US20130277533A1 (en) 2013-10-24
US8415605B2 (en) 2013-04-09
WO2006026354A2 (en) 2006-03-09
US9313393B2 (en) 2016-04-12

Similar Documents

Publication Publication Date Title
CN101427372B (en) Apparatus for multiple camera devices and method of operating same
US7566855B2 (en) Digital camera with integrated infrared (IR) response
US7714262B2 (en) Digital camera with integrated ultraviolet (UV) response
US20230132892A1 (en) Digital cameras with direct luminance and chrominance detection
US7916180B2 (en) Simultaneous multiple field of view digital cameras
US8436286B2 (en) Imager module optical focus and assembly method
US7795577B2 (en) Lens frame and optical focus assembly for imager module
US9077916B2 (en) Improving the depth of field in an imaging system
US9979941B2 (en) Imaging system using a lens unit with longitudinal chromatic aberrations and method of operating
US7773143B2 (en) Thin color camera having sub-pixel resolution
US20120274811A1 (en) Imaging devices having arrays of image sensors and precision offset lenses
EP2315448A1 (en) Thin camera having sub-pixel resolution
JP2996958B2 (en) Structure for focusing and color filtering on a semiconductor photoelectric device and method for manufacturing the structure
JP2015521411A (en) Camera module patterned using π filter group
CN103999449A (en) Image capture element

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: POLOTARISFILO CO., LTD.

Free format text: FORMER OWNER: MANSION VIEW CO., LTD.

Effective date: 20090612

C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20090612

Address after: Delaware

Applicant after: Newport Imaging Corp.

Address before: California, USA

Applicant before: Newport Imaging Corp.

C14 Grant of patent or utility model
GR01 Patent grant