CN104683774A - Techniques to reduce color artifacts in a digital image - Google Patents


Info

Publication number
CN104683774A
CN104683774A
Authority
CN
China
Prior art keywords
LSC
image
group
image processing
preprocessed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410582124.5A
Other languages
Chinese (zh)
Other versions
CN104683774B (en)
Inventor
D. Paliy
L. Lampinen
J. Nikkanen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corp
Publication of CN104683774A
Application granted
Publication of CN104683774B
Legal status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N 25/61 Noise processing, e.g. detecting, correcting, reducing or removing noise, the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/646 Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters

Abstract

Techniques for reducing color artifacts in a digital image are described. In one embodiment, for example, an apparatus may comprise logic, at least a portion of which is in hardware, the logic to determine a respective set of error values for each of a set of lens shading correction (LSC) tables, each set of error values describing error associated with application of its corresponding LSC table to a preprocessed image, determine a respective weight for each of the set of LSC tables based on the corresponding set of error values for that LSC table, and generate a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights for the set of LSC tables. Other embodiments are described and claimed.

Description

Techniques to Reduce Color Artifacts in a Digital Image
Background
Electronic devices are shrinking in form factor at an ever-increasing rate. Smartphones, tablets and wearable computers seek to miniaturize their components to make these devices more convenient to use. At the same time, ever more features and components are being integrated into these smaller form-factor devices. For example, a smartphone may include a digital camera capable of producing digital images. However, while a reduced form factor may improve ease of use, it often does so at the expense of performance or quality. In the case of digital cameras, a smaller form factor typically results in digital images exhibiting color artifacts and discoloration. It is with respect to these and other considerations that the present improvements are needed.
Brief Description of the Drawings
Fig. 1A illustrates a first perspective view of a first device.
Fig. 1B illustrates a second perspective view of the first device.
Fig. 2 illustrates an embodiment of an application subsystem.
Fig. 3A illustrates an embodiment of a first image.
Fig. 3B illustrates an embodiment of a second image.
Fig. 4 illustrates an embodiment of an imaging subsystem.
Fig. 5 illustrates an embodiment of a lens shading correction component.
Fig. 6A illustrates an embodiment of a first sample distribution.
Fig. 6B illustrates an embodiment of a second sample distribution.
Fig. 7 illustrates an embodiment of a storage medium.
Fig. 8 illustrates an embodiment of a second device.
Fig. 9 illustrates an embodiment of a logic flow.
Fig. 10 illustrates an embodiment of a system.
Fig. 11 illustrates an embodiment of a third device.
Detailed Description
Various embodiments are generally directed to image capture systems, such as digital cameras. Some embodiments are particularly directed to image capture devices integrated into multi-function electronic devices such as smartphones, tablets, wearable computers and other small form-factor devices. In one embodiment, for example, an apparatus may comprise logic, at least a portion of which is in hardware, the logic to determine a respective set of error values for each lens shading correction (LSC) table in a set of LSC tables, each set of error values describing the error associated with applying its corresponding LSC table to a preprocessed image; determine a respective weight for each LSC table based on its corresponding set of error values; and generate a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights. Other embodiments are described and claimed.
Reference is now made to the drawings, in which like reference numerals are used throughout to designate like elements. In the following description, numerous specific details are set forth for purposes of explanation in order to provide a thorough understanding of the invention. It may be evident, however, that the novel embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block-diagram form in order to facilitate their description. The intention is to cover all modifications, equivalents and alternatives consistent with the claimed subject matter.
Figs. 1A and 1B illustrate perspective views of a device 100. Device 100 may comprise any electronic device having an image capture system 120 for capturing and processing images. As shown in Fig. 1A, device 100 may comprise a small form-factor device, such as a smartphone, with a digital display 102 arranged to present a digital image 104. A more detailed block diagram of device 100 and example devices are described with reference to Fig. 8.
A multi-function electronic device such as a smartphone typically has a form factor with height (H), width (W) and depth (D) dimensions sized to fit comfortably in the palm of an average user's hand. In such cases, device 100 may have a depth (D) measured in millimeters (mm). For example, a conventional smartphone may easily have a representative depth of 7.6 mm or thinner. When a device 100 of such narrow depth includes an image capture system 120 (e.g., a digital camera), the distance between the camera optics and the image sensor must be reduced. The reduced distance causes light to strike different portions of the image sensor in a non-uniform manner, e.g., light concentrates toward the center of the image sensor and attenuates toward its outer edges. To correct such non-uniformity, the image capture system 120 may implement various lens shading correction (LSC) algorithms, which may use one or more correction tables to perform the correction.
Various embodiments attempt to correct these and other problems by using enhanced image processing techniques designed to reduce color artifacts, improve shading correction and mitigate other image quality issues in digital images, which are prerequisites for the consistent behavior of operations such as automatic white balance (AWB). More particularly, various embodiments may use an enhanced LSC algorithm that implements a multi-phase approach to image processing and correction. The enhanced LSC algorithm may implement, for example, a characterization phase that creates one or more LSC tables usable to compensate for color distortions and shading in a digital image. The enhanced LSC algorithm may further implement a correction phase that analyzes a digital image to quantify its color distortion and spatial information. The correction phase may then select one or more of the LSC tables, based on the analysis of the preprocessed digital image 104, to process and correct the digital image 104.
The enhanced LSC algorithm provides significant advantages over conventional LSC algorithms. It offers a practical and efficient solution for color distortions attributable to optical lens shading. For example, the enhanced LSC algorithm may reduce or minimize color artifacts in the digital image 104. The enhanced LSC algorithm attempts to identify and select the one or more LSC tables that will correct the digital image 104 with minimal error relative to the actual scene captured by the image capture system 120. In this manner, the embodiments deliver a significant improvement in image quality, particularly for small form-factor devices such as device 100, while applying a lower level of compensation than devices using conventional LSC algorithms. As a result, the embodiments can improve the affordability, scalability, modularity, extensibility or interoperability of an operator, device or network.
Fig. 2 illustrates a block diagram of an application subsystem 200 of the image capture system 120. In one embodiment, the application subsystem 200 may comprise a computer-implemented application subsystem 200 having a camera application 220 comprising one or more components 222-a. Although the application subsystem 200 shown in Fig. 2 has a limited number of elements in a certain topology, it may be appreciated that the application subsystem 200 may include more or fewer elements in alternate topologies as desired for a given implementation.
It is worthy to note that "a" and "b" and "c" and similar designators as used herein are intended to be variables representing any positive integer. Thus, for example, if an implementation sets a value of a = 5, then a complete set of components 222-a may include components 222-1, 222-2, 222-3, 222-4 and 222-5. The embodiments are not limited in this context.
The application subsystem 200 may comprise the camera application 220 and other components. The camera application 220 may generally be arranged to take as input a preprocessed image 210 captured by the image capture system 120 of device 100, and to output a processed image 230. The preprocessed image 210 may comprise the digital image 104 in its raw form, as captured by the image capture system 120 before any enhancements and/or corrections are applied. The processed image 230 may comprise the preprocessed image 210 after enhancements and/or corrections have been applied (e.g., by the LSC component 222-2). In one embodiment, the images 210, 230 may be color images. In one embodiment, the images 210, 230 may be achromatic images.
The camera application 220 may comprise an image signal processor (ISP) component 222-1. The ISP component 222-1 may be arranged to receive raw image information from the preprocessed image 210 as input, collect selective image data and/or statistics from the raw image information, and output processed image information to another system of device 100 (e.g., a video controller that presents the processed image 230 on the digital display 102). The ISP component 222-1 may also receive as input the identifiers of one or more LSC tables 224 selected by the LSC component 222-2.
The camera application 220 may comprise an LSC component 222-2. The LSC component 222-2 may generally be arranged to implement the enhanced LSC algorithm. The LSC component 222-2 may receive selective image data and/or statistics from the ISP component 222-1, and select one or more LSC tables 224, or a combination of LSC tables 224, to correct shading in the preprocessed image 210 based on the received image data and/or statistics. The LSC component 222-2 may output the identifiers of the selected LSC tables 224 to the ISP component 222-1, which processes and corrects the raw image information from the preprocessed image 210 to generate the processed image 230.
The camera application 220 may comprise one or more LSC tables 224-b for use by the ISP component 222-1 and/or the LSC component 222-2. The LSC tables 224 may generally be arranged to store correction data associated with one or more correlated color temperature (CCT) values of common illuminants.
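The patent does not fix a concrete data layout for the LSC tables 224. As a purely hypothetical sketch, each table can be modeled as per-channel gain grids keyed by the CCT of the illuminant it was characterized for; the grid size, `strength` parameter and radial gain profile below are invented for illustration only:

```python
import numpy as np

def make_lsc_table(cct, grid_h=4, grid_w=6, strength=0.3):
    """Build a toy LSC table: correction gains rise toward the corners,
    where lens shading attenuates light the most."""
    ys = np.linspace(-1.0, 1.0, grid_h)[:, None]
    xs = np.linspace(-1.0, 1.0, grid_w)[None, :]
    radial = ys ** 2 + xs ** 2           # 0 at the optical center, max at corners
    gain = 1.0 + strength * radial       # per-grid-point correction gain
    return {"cct": cct, "gains": {c: gain.copy() for c in ("R", "G", "B")}}

# A small characterization set covering common illuminant CCTs:
tables = [make_lsc_table(cct) for cct in (2800, 4000, 5000, 6500)]
```

In a real module the three channels would carry different characterized profiles; here they are identical only to keep the sketch short.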
Fig. 3A illustrates an image 300 processed using conventional correction techniques. In a conventional image processing system, color distortions may occur due to incorrect optical lens shading correction. Lens shading is caused primarily by unequal optical attenuation across the image sensor plane, which results from the compressed distance between the optical elements and the image sensor, which in turn results from the small form factor of electronic devices such as device 100. Light is attenuated more near the corners of the image sensor and less at its optical center. This attenuation is not constant across color planes, which leads to severe color distortions. The typical result is a digital image that is brighter in the center, darker toward the edges, and exhibits artifact colors and attenuation relative to the original scene.
The image 300 shows some examples of color distortions 302. As shown in the image 300, clearly visible distortions 302 are spread throughout the image 300. The distortions 302 are caused in part by the improper correction provided by conventional LSC algorithms.
Fig. 3B shows an image 350 illustrating lens shading distortion on a perfectly flat test area. As shown in Fig. 3B, the corner regions 370 of the image 350 are darker than the central region 360 of the image 350. This means that the optical attenuation caused by lens shading is greater in the corner regions 370 than in the central region 360. The embodiments are not limited to this example.
Referring again to Fig. 2, in order to compensate for the light attenuation, the camera application 220 may store multiple LSC tables 224. Each LSC table 224 may comprise correction factors for the image capture system 120, characterized primarily using common light sources. The task of the camera application 220 controlling the image capture system 120 is to decide, based on an analysis of the content of the digital image 104, which LSC tables 224 should be used to correct the digital image 104.
The camera application 220 provides an advanced selection among the LSC tables 224, which in turn results in fewer distortions 302. This is achieved by using enhanced techniques designed to quantify color distortion and exploit the spatial information of the analyzed image. For example, the distortions 302 in the image 300 are most noticeable, and most efficiently measured, in flat achromatic regions of the image 300. Typical scenes, however, are neither flat nor achromatic. By identifying uniform flat regions in the digital image 104, this information can be used to better select the LSC tables 224. Specifically, color distortion is typically measured as the mean square error (MSE) of hue and saturation relative to the origin of the hue/saturation plane of the hue/saturation/value (HSV) color space, computed over a block of a corrected and white-balanced image. The blocks of the image in which this error is smallest are considered relevant for classification. Weights are assigned to the LSC tables 224 that yield the minimal error in one or several image blocks.
This solution provides more robust and consistent results, while remaining computationally simple to implement. The camera application 220 automatically makes decisions in favor of the LSC tables 224 that yield less hue distortion. A typical use scenario is automatic color shading correction under low-CCT light sources, where neighboring incandescent/tungsten and fluorescent illuminants have substantially different shading profiles. Another use scenario is images of any flat or semi-flat (e.g., textured) type commonly used for image quality validation. These are merely examples, and other use scenarios exist.
In general, lens shading correction comprises two main phases: a characterization phase and a correction phase. The characterization phase is typically performed during the design and assembly of the image capture system 120 of device 100. The correction phase is typically performed in real time while the image capture system 120 is in use.
The characterization phase is the point at which the image capture system 120 is characterized. The LSC tables 224 are created to cover most of the CCT value range of common illuminants. The LSC tables 224 are stored in persistent storage and used later during the correction phase. The camera application 220 controlling the image capture system 120 determines when, and in what proportion, each LSC table 224 is to be used to compensate for shading in the digital image 104.
The correction phase is performed in real time, as data is streamed from the image capture system 120 to the ISP component 222-1. The ISP component 222-1 analyzes the content of the streamed image data and supplementary metadata. Examples of metadata may include, without limitation, autofocus, CCT estimates, timestamps, and so forth. Based on this information, the ISP component 222-1 computes weights w_l, l = 0, ..., L-1, where L is the number of characterized light sources whose tables T_l are to be fused into a single table using the following formula (1):
T_C = Σ_l T_{C,l} · w_l    (1)
where each T_{C,l} is a matrix, each w_l is a scalar, and C denotes the color (red, green or blue) channel index.
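Formula (1) is simply a per-channel weighted sum of matrices. A minimal sketch in Python/NumPy (the dict-of-channels layout and the example values are assumptions made for the sketch, not the patent's own data format):

```python
import numpy as np

def fuse_lsc_tables(tables, weights):
    """Formula (1): T_C = sum_l T_{C,l} * w_l for each color channel C.
    `tables` is a list of {channel: 2-D gain matrix}; `weights` are the scalars w_l."""
    channels = tables[0].keys()
    return {c: sum(w * t[c] for t, w in zip(tables, weights)) for c in channels}

# Two characterized illuminants fused with weights 0.75 / 0.25:
t_a = {"R": np.full((2, 3), 1.2), "G": np.full((2, 3), 1.0), "B": np.full((2, 3), 1.4)}
t_b = {"R": np.full((2, 3), 1.6), "G": np.full((2, 3), 1.0), "B": np.full((2, 3), 1.0)}
fused = fuse_lsc_tables([t_a, t_b], [0.75, 0.25])
```

With these weights the red channel of the composite table is 0.75 · 1.2 + 0.25 · 1.6 = 1.3 at every grid point, matching a term-by-term application of formula (1).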
Fig. 4 illustrates a block diagram of an imaging subsystem 400. The imaging subsystem 400 illustrates the interoperation between the image capture system 120 of device 100 and portions of the camera application 220.
The image capture system 120 may comprise, among other components, one or more optical elements 402 and one or more image sensors 404, each optical element paired with a corresponding image sensor 404. Each of the image sensors 404 may be based on any of a variety of technologies for capturing an image of a scene, including without limitation charge-coupled device (CCD) semiconductor technology. Each of the optical elements 402 is made up of one or more lenses, mirrors, prisms, shutters, filters, etc., employed to convey images of a scene to the corresponding image sensor 404, and at least partially defines that sensor's field of view. The image sensors 404 and optical elements 402 (whatever their quantity) are positioned and oriented relative to each other in such a way that each image sensor and optical element pair is intended to provide a field of view that overlaps the fields of view of one or more of the other image sensor and optical element pairs.
As shown in Fig. 4, light received through the optical elements 402 is focused onto the image sensors 404. The image sensors 404 provide raw image information in digital form as a data stream to the ISP component 222-1. The ISP component 222-1 downscales the raw image information, computes the relevant statistics, and passes them to the autofocus, auto white balance and auto exposure (3A) block 406. The 3A block 406 includes the LSC component 222-2. The 3A block 406 analyzes the information provided by the ISP component 222-1 and computes a CCT estimate and autofocus (AF) statistics. The LSC component 222-2 selects and/or generates LSC tables 224 based on the CCT estimate and possibly the AF statistics, and sends the identifiers of the selected and/or generated LSC tables 224 to the ISP component 222-1. The ISP component 222-1 retrieves correction values from the selected and/or generated LSC tables 224, applies the correction, and generates an RGB image or video stream.
A focus of the camera application 220 is to estimate the type of the original light source by analyzing the image content, and to identify the LSC tables 224 best suited for the image using formula (1) above, where the tables T_{C,l} are created during the characterization phase and the weights w_l are computed during the correction phase.
Fig. 5 illustrates a block diagram of an exemplary LSC component 222-2 of the camera application 220. The LSC component 222-2 may receive selected image data and/or statistics for the preprocessed image 210 from the ISP component 222-1. The preprocessed image 210 may be captured by the image capture system 120 of device 100. The preprocessed image 210 may be a single image or a sequence of images, such as from a video sequence. The preprocessed image 210 may be received by the LSC component 222-2 of the camera application 220 in real time from the image capture system 120, or in non-real time from a memory unit. The embodiments are not limited in this context.
The LSC component 222-2 may comprise an estimator module 502. The LSC component 222-2 receives image data from the ISP component 222-1. The estimator module 502 generates a CCT estimate. The estimator module 502 uses the sub-sampled input image data to compute the probabilities P_CCT that the illuminants from the characterization database are present. For those illuminants with a probability greater than 0, the characterization LSC tables 224 are applied globally to the input image data (light_1, ..., light_L).
The LSC component 222-2 may comprise a segmentation module 504. The segmentation module 504 receives the AF statistics and, based on them, provides an approximation of the flat, smooth areas in the scene in terms of processing regions. Without loss of generality, the segmentation module 504 then outputs block sizes based on this approximation; in the following, a more complex image segmentation is likewise represented simply as blocks. The size of a given processing block depends on the content of the scene. For a flat monotone scene, for example, the image data may be used as a single block of size H × W. For more complex scenes with many textures, several blocks may be used (e.g., 12 to 24 blocks dividing the plane). Although block sizes may differ, each block should contain a sufficient amount of image data to provide robust estimates of the deviation between the color samples (e.g., a recommended minimum of 100 samples).
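As a rough sketch of the block segmentation just described, a uniform grid is the simplest case the text allows (the helper below, its 4 × 6 default, and the divisibility assumption are all illustrative, not the patent's algorithm):

```python
import numpy as np

def split_into_blocks(img, rows=4, cols=6):
    """Split an H x W image plane into rows*cols equal blocks.
    Assumes H and W are divisible by rows and cols, respectively."""
    h, w = img.shape
    bh, bw = h // rows, w // cols
    return [img[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            for r in range(rows) for c in range(cols)]

# 24 blocks of 12 x 10 pixels each from a 48 x 60 plane:
blocks = split_into_blocks(np.zeros((48, 60)))
```

Each 12 × 10 block here holds 120 samples, satisfying the recommended minimum of 100 samples per block mentioned above.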
For each set of input image data light_1, ..., light_L, the LSC component 222-2 may comprise an LSC module 506, a grey-world automatic white balance (GW-AWB) module 508, a red/green/blue (RGB) to hue/saturation/value (HSV) module 510, and an error module 512. For each image block, the GW-AWB module 508, the RGB-to-HSV module 510 and the error module 512 process the input image block and output the processed image block to a fusion module 514. The fusion module 514 may then output one or more LSC table identifiers 520 corresponding to the selected LSC tables 224 to the ISP component 222-1, or alternatively output the actual LSC tables 224.
The error module 512 may compute, at block level, one or more errors E_{l,b} for each illuminant l and each block b resulting from the segmentation. After the GW-AWB algorithm is applied (e.g., via the GW-AWB module 508) and the data is converted to the HSV color space (e.g., via the RGB-to-HSV module 510), the error module 512 computes the error E_{l,b} relative to the origin of coordinates (zero hue and zero saturation) using the following formula (2):
E_{l,b} = (1 / (H_b · W_b)) · Σ_i h_i² · s_i²    (2)
where h is hue, s is saturation, H_b is the block size in the vertical direction, W_b is the block size in the horizontal direction, and l is the illuminant index. The value coordinate is ignored.
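Formula (2) can be checked with a few lines of standard-library Python. The per-pixel RGB-to-HSV conversion below stands in for module 510, and a flat list of pixels for one H_b × W_b block (the helper and its test pixels are invented for the sketch):

```python
import colorsys

def block_error(rgb_block):
    """Formula (2): E = (1 / (H_b * W_b)) * sum_i h_i^2 * s_i^2.
    rgb_block is a flat list of white-balanced (r, g, b) floats in [0, 1];
    the value (V) coordinate is ignored, as in the text."""
    total = 0.0
    for r, g, b in rgb_block:
        h, s, _v = colorsys.rgb_to_hsv(r, g, b)
        total += (h * h) * (s * s)
    return total / len(rgb_block)

grey_block = [(0.5, 0.5, 0.5)] * 16      # achromatic: zero saturation, zero error
tinted_block = [(0.5, 0.6, 0.5)] * 16    # residual greenish cast, non-zero error
```

With `colorsys`, hue is in [0, 1); the test pixels use a greenish cast so the hue term is non-zero. A correctly shading-corrected flat grey block scores zero error, as the text requires.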
Figs. 6A and 6B illustrate examples of the distribution of h and s data over a flat region in polar coordinates. Fig. 6A illustrates the sample distribution, in hue/saturation polar coordinates, of a block after application of an incorrect LSC table 224 as implemented by a conventional LSC component. Fig. 6B illustrates the sample distribution, in hue/saturation polar coordinates, of a block after application of the correct LSC table 224 using the enhanced LSC algorithm implemented by the LSC component 222-2. As shown in Figs. 6A and 6B, the hue h is the angle and the saturation s is the radius.
Based on the following formula (3), a larger probability is assigned to the illuminants whose distributions are concentrated around 0, i.e., that relate to a smaller error with respect to the origin:
P_{l,b} = min(1, max(0, (E_max − E_{l,b}) / (E_max − E_min)))    (3)
where E_min = min_l{E_{l,b}}, E_max = E_min · tolerance, b is the block index, and tolerance is a relative constant expressing how much error can be tolerated (e.g., a tolerance of 5% equals 1.05). The function need not be piecewise linear; a similar function may be used if computational resources allow. The weights w_{l,b} may then be computed using the following formula (4):
w_{l,b} = P_{l,b} / E_{l,b}    (4)
Referring again to Fig. 5, the LSC component 222-2 may comprise a fusion module 514. The fusion module 514 generates the final weight w_l for each LSC table 224 as a weighted average of the w_{l,b}. In one embodiment, only a limited number of blocks (e.g., from 2 to 8) whose errors E_{l,b} are smallest are taken into account. The final weights are computed using confidence levels C_{l,b}, which are based on the minimal errors E_{l,b} in each block in a manner similar to formula (3) but with a larger tolerance value (e.g., in [2, 4]), and which may further be adjusted as discussed below, as in the following formula (5):
w_l = Σ_b w_{l,b} · C_{l,b}    (5)
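Formulas (3) through (5) can be sketched together. Note that the exact form of formula (3) is described only qualitatively above; the `ramp` below is one piecewise-linear function matching that description (probability 1 at E_min, falling to 0 at E_max = E_min · tolerance), so treat it, the tolerance defaults and the epsilon guards as assumptions of this sketch:

```python
import numpy as np

def illuminant_weights(E, tolerance=1.05, conf_tolerance=3.0):
    """E[l, b]: error of illuminant l in block b, from formula (2).
    Returns normalized per-illuminant weights w_l per formulas (3)-(5)."""
    E = np.asarray(E, dtype=float)

    def ramp(tol):
        e_min = E.min(axis=0, keepdims=True)      # best illuminant per block
        e_max = e_min * tol
        span = np.maximum(e_max - e_min, 1e-12)   # guard against zero span
        return np.clip((e_max - E) / span, 0.0, 1.0)

    P = ramp(tolerance)                   # formula (3): probability per (l, b)
    w_lb = P / np.maximum(E, 1e-12)       # formula (4): per-block weights
    C_lb = ramp(conf_tolerance)           # block confidence, larger tolerance
    w = (w_lb * C_lb).sum(axis=1)         # formula (5): fused weight per table
    return w / w.sum()

# Illuminant 0 has clearly smaller errors in both blocks than illuminant 1:
w = illuminant_weights([[0.01, 0.02], [0.1, 0.2]])
```

With these errors, illuminant 0 receives essentially all of the weight, which is the intended behavior: the table producing the smallest hue/saturation error dominates the fusion of formula (1).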
In various embodiments, the reliability of the errors E_{l,b} computed in formula (2) and applied in formulas (3) and (4) may differ from block to block b of the image. For example, in some embodiments, some blocks may be affected by noise more than others. Under certain conditions (such as low-light conditions), such non-uniformity may be especially pronounced. The embodiments are not limited in this context.
Any particular image analyzed by the LSC component 222-2 may be described according to the following formula (6):
z(x) = k(x) · y(x) + σ(y(x)) · n    (6)
where x = (x_i, x_j) are the spatial coordinates on the image plane, y(x) is the true, expected, unknown image value at spatial coordinate x, k is a two-dimensional grid of attenuation coefficients, n ~ N(0, 1) is random white Gaussian noise of zero mean and unit standard deviation, and σ is a function of the standard deviation of the noise such that std{z(x)} = σ(y(x)).
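Formula (6) is a signal-plus-noise observation model. The toy simulation below generates a synthetic flat-field frame z(x); the attenuation grid, noise scale and frame size are invented for illustration and do not come from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)
h, w = 64, 64
y = np.full((h, w), 0.5)                    # true flat-field image y(x)

ys = np.linspace(-1.0, 1.0, h)[:, None]
xs = np.linspace(-1.0, 1.0, w)[None, :]
k = 1.0 / (1.0 + 0.3 * (ys ** 2 + xs ** 2))  # attenuation: 1 at center, < 1 at corners

def sigma(v):
    """Toy signal-dependent noise std, sigma(y(x))."""
    return 0.01 * np.sqrt(np.maximum(v, 0.0))

n = rng.standard_normal((h, w))             # white Gaussian noise, n ~ N(0, 1)
z = k * y + sigma(y) * n                    # formula (6): observed sensor image
```

The simulated z is darker toward the corners than at the center, reproducing the vignetting that the LSC tables (the estimates of k⁻¹) are meant to undo.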
In order to obtain the best estimate of the true image values in a subregion, a solution of the form of formula (7) is used:
ỹ(x) = F{k⁻¹(x) · z(x)} = F{y(x) + k⁻¹(x) · σ(y(x)) · n}    (7)
where F{·} is an operator denoting grey-world white balancing and color space transformation.
In order to account for the differing reliability of the values computed at each spatial location, a reliability parameter C(x) may be defined. In various embodiments, the value of C(x) may be determined based on the value of Std(x), which may be calculated according to the following formula (8):
Std(x) = std{ H{ỹ(x)} · S{ỹ(x)} }    (8)
where H{·} denotes the hue value, S{·} denotes the saturation value, and std denotes the standard deviation operator. In some cases, this dependence may be highly non-linear, and the formal computation of Std(x) may therefore be computationally intensive. In some embodiments, in order to save computational resources, C(x) may be defined using a fixed approximation of the expected value of Std(x). For example, in embodiments in which the image is divided into four rows and six columns of sub-blocks, a set C of reliability parameters C(x) in the matrix form shown in formula (9) may be used:
C = {c_b⁻¹} =
    | 8 2 2 2 2 8 |
    | 4 2 1 1 2 4 |
    | 4 2 1 1 2 4 |
    | 8 2 2 2 2 8 |    (9)
where c_b⁻¹ denotes the inverse of the coefficient to be used for the sub-block b located at the i-th row and j-th column. In various embodiments, the coefficients C_{l,b} in formula (5) may be weighted using c_b. In some embodiments, the approximations used in such a set of reliability parameters may be based on empirical results. For example, in various embodiments, such approximations may be based on the results of simulations and/or tests performed on real image sensor data. The embodiments are not limited in this context. In some embodiments, interpolation may be used for some points x to avoid abrupt changes in the weighting when the image frame changes only slightly. It is to be understood that in various embodiments, depending on the applicable attenuation factors and/or noise model, C may differ from the example in formula (9). In some embodiments, the size and shape of the blocks subject to segmentation may vary, while the reliability parameters C(x) (such as those shown in formula (9)) may be fixed and stored for a broad range of sensors and/or modules. The embodiments are not limited in this context.
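The matrix of formula (9) stores inverse coefficients. A small sketch inverts it into the per-block reliabilities c_b used to weight C_{l,b} (the 4 × 6 layout matches the example above; the `weight_confidences` helper and its row-major block ordering are assumptions of this sketch):

```python
import numpy as np

# Inverse reliability coefficients c_b^{-1} from formula (9), 4 rows x 6 columns:
C_inv = np.array([
    [8, 2, 2, 2, 2, 8],
    [4, 2, 1, 1, 2, 4],
    [4, 2, 1, 1, 2, 4],
    [8, 2, 2, 2, 2, 8],
], dtype=float)

c = 1.0 / C_inv  # reliability per sub-block: center blocks trusted 8x more than corners

def weight_confidences(C_lb):
    """Scale the per-block confidences C_{l,b} of formula (5) by c_b.
    C_lb: array of shape (L, 24), blocks in row-major order; returns a scaled copy."""
    return C_lb * c.ravel()[None, :]
```

The pattern encodes the physics of formula (6): corner blocks see the strongest attenuation k(x) and therefore the most amplified noise after the k⁻¹ correction, so their error estimates are down-weighted.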
Fig. 7 illustrates an embodiment of a storage medium 700. The storage medium 700 may comprise an article of manufacture. In one embodiment, the storage medium 700 may comprise any non-transitory computer-readable medium or machine-readable medium, such as optical, magnetic or semiconductor storage. The storage medium may store various types of computer-executable instructions, such as instructions to implement one or more of the logic and/or operations described with reference to Figs. 1-6. Examples of a computer-readable or machine-readable storage medium may include any tangible medium capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of computer-executable instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like. The embodiments are not limited in this context.
Fig. 8 illustrates an embodiment of a device 800. The device 800 may implement, for example, the device 100 and/or the storage medium 700. As shown in Fig. 8, the device 800 may comprise a radio interface 810, baseband circuitry 820 and a computing platform 860, although the embodiments are not limited to this configuration.
The device 800 may implement some or all of the structure and/or operations of the device 100 and/or the storage medium 700 in a single computing entity, such as entirely within a single device. Alternatively, the device 800 may distribute portions of the structure and/or operations of the device 100 and/or the storage medium 700 across multiple computing entities using a distributed system architecture, such as a client-server architecture, a 3-tier architecture, an N-tier architecture, a tightly-coupled or clustered architecture, a peer-to-peer architecture, a master-slave architecture, a shared-database architecture, and other types of distributed systems. The embodiments are not limited in this context.
In one embodiment, the radio interface 810 may include a component or combination of components adapted for transmitting and/or receiving single-carrier or multi-carrier modulated signals (e.g., including complementary code keying (CCK) and/or orthogonal frequency division multiplexing (OFDM) symbols), although the embodiments are not limited to any specific over-the-air interface or modulation scheme. The radio interface 810 may include, for example, a receiver 812, a transmitter 816 and/or a frequency synthesizer 814. The radio interface 810 may include bias controls, a crystal oscillator and/or one or more antennas 818-p. In another embodiment, the radio interface 810 may use external voltage-controlled oscillators (VCOs), surface acoustic wave filters, intermediate frequency (IF) filters and/or RF filters, as desired. Due to the variety of potential RF interface designs, an expansive description thereof is omitted.
Baseband circuitry 820 may communicate with radio interface 810 to process received and/or transmitted signals, and may include, for example, an analog-to-digital converter 822 for down-converting received signals and a digital-to-analog converter 824 for up-converting signals for transmission. Further, baseband circuitry 820 may include a baseband or physical layer (PHY) processing circuit 826 for PHY link layer processing of respective received/transmitted signals. Baseband circuitry 820 may include, for example, a processing circuit 828 for medium access control (MAC)/data link layer processing. Baseband circuitry 820 may include a memory controller 832 for communicating with processing circuit 828 and/or computing platform 860, for example, via one or more interfaces 834.
In some embodiments, PHY processing circuit 826 may include a frame construction and/or detection module, in combination with additional circuitry such as buffer memory, to construct and/or deconstruct communication frames (e.g., radio frames 302-e). Alternatively or in addition, MAC processing circuit 828 may share processing for certain of these functions or perform these processes independently of PHY processing circuit 826. In some embodiments, MAC and PHY processing may be integrated into a single circuit.
Computing platform 860 may provide computing functionality for device 800. As shown, computing platform 860 may include a processing component 830. In addition to, or alternatively of, baseband circuitry 820, device 800 may execute processing operations or logic for apparatus 100 and/or storage medium 700 using processing component 830. Processing component 830 (and/or PHY 826 and/or MAC 828) may comprise various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, logic devices, components, processors, microprocessors, circuits, processor circuits (e.g., processor circuits 220, 820), circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate arrays (FPGA), memory units, logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth. Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, software development programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds, and other design or performance constraints, as desired for a given implementation.
Computing platform 860 may further include other platform components 850. Other platform components 850 include common computing elements, such as one or more processors, multi-core processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components (e.g., digital displays), power supplies, and so forth. Examples of memory units may include, without limitation, various types of computer readable and machine readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, an array of devices such as Redundant Array of Independent Disks (RAID) drives, solid state memory devices (e.g., USB memory, solid state drives (SSD)), and any other type of storage media suitable for storing information.
Device 800 may be, for example, an ultra-mobile device, a mobile device, a fixed device, a machine-to-machine (M2M) device, a personal digital assistant (PDA), a mobile computing device, a smart phone, a telephone, a digital telephone, a cellular telephone, user equipment, an ebook reader, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a netbook computer, a handheld computer, a tablet computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, a multiprocessor system, a processor-based system, consumer electronics, programmable consumer electronics, a game device, a television, a digital television, a set top box, a wireless access point, a base station, a node B, an evolved node B (eNB), a subscriber station, a mobile subscriber center, a radio network controller, a router, a hub, a gateway, a bridge, a switch, a machine, or a combination thereof. Accordingly, functions and/or specific configurations of device 800 described herein may be included or omitted in various embodiments of device 800, as suitably desired. In some embodiments, device 800 may be configured to be compatible with protocols and frequencies associated with one or more of the IEEE 802.11 Standards for WLANs and/or other broadband wireless networks, the Hotspot 2.0 Standard, the 3GPP LTE Specifications, and/or the IEEE 802.16 Standards, although the embodiments are not limited in this respect.
Embodiments of device 800 may be implemented using single input single output (SISO) architectures. However, certain implementations may include multiple antennas (e.g., antennas 818-p) for transmission and/or reception using adaptive antenna techniques for beamforming or spatial division multiple access (SDMA) and/or using MIMO communication techniques.
The components and features of device 800 may be implemented using any combination of discrete circuitry, application specific integrated circuits (ASICs), logic gates, and/or single chip architectures. Further, the features of device 800 may be implemented using microcontrollers, programmable logic arrays, and/or microprocessors, or any combination of the foregoing, where suitably appropriate. It is noted that hardware, firmware, and/or software elements may be collectively or individually referred to herein as "logic" or "circuit."
It should be appreciated that the exemplary device 800 shown in the block diagram of Fig. 8 may represent one functionally descriptive example of many potential implementations. Accordingly, division, omission, or inclusion of block functions depicted in the accompanying figures does not infer that the hardware components, circuits, software, and/or elements for implementing these functions would necessarily be divided, omitted, or included in embodiments.
Operations for the above embodiments may be further described with reference to the following figures and accompanying examples. Some of the figures may include a logic flow. Although such figures presented herein may include a particular logic flow, it can be appreciated that the logic flow merely provides an example of how the general functionality as described herein can be implemented. Further, the given logic flow does not necessarily have to be executed in the order presented unless otherwise indicated. In addition, the given logic flow may be implemented by a hardware element, a software element executed by a processor, or any combination thereof. The embodiments are not limited in this context.
Fig. 9 illustrates one embodiment of a logic flow 900, which may be representative of the operations executed by one or more embodiments described herein. More particularly, logic flow 900 may be representative of operations that may be performed in various embodiments to generate a composite LSC table for application to a preprocessed image. In some embodiments, an apparatus (such as apparatus 100 of Figs. 1A-1B and/or device 800 of Fig. 8) may comprise logic for performing one or more of the operations of logic flow 900, at least a portion of which is in hardware. For example, in various embodiments, logic circuit 830 in device 800 of Fig. 8 may perform one or more of the operations of logic flow 900. In some embodiments, an apparatus may comprise a camera application, such as camera application 220 of Fig. 2, components of which may perform one or more such operations. In various embodiments, one or more storage media (such as storage medium 700 of Fig. 7) may comprise a set of instructions that, in response to being executed on a computing device, cause the computing device to perform one or more operations of logic flow 900. The embodiments are not limited in this context.
As shown in logic flow 900, at 902, a set of LSC tables to be weighted for application to a preprocessed image may be identified. For example, LSC component 222-2 of Fig. 2 may identify a set of LSC tables 224 to be weighted for application to preprocessed image 210. In some embodiments, a respective probability of presence may be calculated for each of a plurality of LSC tables, and the set of LSC tables may be determined to comprise each LSC table among the plurality whose probability of presence is greater than zero. In various such embodiments, each LSC table may correspond to a different respective illuminant, and the probability of presence for each LSC table may indicate a probability of presence of the illuminant to which that LSC table corresponds. The embodiments are not limited in this context.
At 904, a respective set of error values may be determined for each of the set of LSC tables. For example, LSC component 222-2 of Fig. 2 may determine a respective set of error values for each of the set of LSC tables 224. In some embodiments, each set of error values may describe error associated with applying its corresponding LSC table to the preprocessed image. In various embodiments, the preprocessed image may be segmented into a plurality of blocks, and the set of error values for each LSC table may comprise a set of block error values, each block error value comprising an error value for one of the plurality of blocks. The embodiments are not limited in this context.
At 906, a respective weight for each LSC table in the set may be determined based on the corresponding set of error values for that LSC table. For example, LSC component 222-2 of Fig. 2 may determine a respective weight for each of the set of LSC tables 224 based on the corresponding set of error values for each of the set of LSC tables 224. In some embodiments, a respective set of block weights may be calculated for each LSC table in the set based on the corresponding set of block error values for that LSC table, and each block weight may correspond to one of the plurality of blocks. In various embodiments, the respective weight for each LSC table in the set may then be determined as a weighted sum of the block weights for that LSC table. In some such embodiments, a set of reliability parameters may be identified for the plurality of blocks, and the weighted sum of the block weights for each LSC table may be calculated by weighting the block weights according to the reliability parameters of their corresponding blocks. In various embodiments, the reliability parameters may indicate levels of reliability of the error values for their corresponding blocks. The embodiments are not limited in this context.
At 908, a composite LSC table may be generated for the preprocessed image based on the respective weights of the set of LSC tables. For example, image signal processor (ISP) component 222-1 of Fig. 2 may generate a composite LSC table for preprocessed image 210 based on the respective weights of the set of LSC tables 224. In some embodiments, the composite LSC table may be generated as a weighted sum of the set of LSC tables, based on the respective weights. At 910, the composite LSC table may be applied to the preprocessed image to generate a processed image. For example, ISP component 222-1 of Fig. 2 may apply the composite LSC table to preprocessed image 210 to generate processed image 230. The embodiments are not limited to these examples.
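The operations 902-910 of logic flow 900 can be sketched in code. The following Python is a minimal illustration under stated assumptions, not the patented implementation: the error metric (deviation of the corrected image from a flat response), the inverse-error mapping from error values to weights, and all function and variable names are assumptions introduced here for illustration, as the description leaves these choices open.

```python
import numpy as np

def generate_composite_lsc(image, lsc_tables, presence_probs):
    """Sketch of logic flow 900: weight a set of LSC gain tables by how
    well each fits the preprocessed image, then blend them into a single
    composite table and apply it.

    image:          HxW preprocessed (shading-affected) intensity plane
    lsc_tables:     dict mapping illuminant name -> HxW gain table
    presence_probs: dict mapping illuminant name -> probability of presence
    """
    # 902: identify the set of tables to weight -- only those whose
    # illuminant has a probability of presence greater than zero.
    candidates = {k: t for k, t in lsc_tables.items() if presence_probs[k] > 0.0}

    # 904: determine error values for each candidate table. The claims do
    # not fix the error metric; as a placeholder we measure how far the
    # corrected image deviates from a flat (uniform) response.
    errors = {name: float((image * table).std())
              for name, table in candidates.items()}

    # 906: derive a weight per table from its error (lower error -> higher
    # weight), normalized so the weights sum to one.
    inv = {name: 1.0 / (e + 1e-9) for name, e in errors.items()}
    total = sum(inv.values())
    weights = {name: v / total for name, v in inv.items()}

    # 908: composite table as the weighted sum of the candidate tables.
    composite = sum(weights[name] * table for name, table in candidates.items())

    # 910: apply the composite table to produce the processed image.
    processed = image * composite
    return composite, processed, weights
```

A table that exactly inverts the lens falloff yields a flat corrected image, near-zero error, and therefore dominates the blend, which matches the intent of weighting tables by goodness of fit.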
Fig. 10 illustrates one embodiment of a system 1000. In various embodiments, system 1000 may be representative of a system or architecture suitable for use with one or more embodiments described herein, such as apparatus 100 of Figs. 1A-1B, application subsystem 200 of Fig. 2, storage medium 700 of Fig. 7, device 800 of Fig. 8, and/or logic flow 900 of Fig. 9. The embodiments are not limited in this respect.
As shown in Fig. 10, system 1000 may comprise multiple elements. One or more elements may be implemented using one or more circuits, components, registers, processors, software subroutines, modules, or any combination thereof, as desired for a given set of design or performance constraints. Although Fig. 10 shows a limited number of elements in a certain topology by way of example, it can be appreciated that more or fewer elements in any suitable topology may be used in system 1000 as desired for a given implementation. The embodiments are not limited in this context.
In various embodiments, system 1000 may be a media system, although system 1000 is not limited to this context. For example, system 1000 may be incorporated into a personal computer (PC), laptop computer, ultra-laptop computer, tablet, touch pad, portable computer, handheld computer, palmtop computer, personal digital assistant (PDA), cellular telephone, combination cellular telephone/PDA, television, smart device (e.g., smart phone, smart tablet or smart television), mobile internet device (MID), messaging device, data communication device, and so forth.
In various embodiments, system 1000 includes a platform 1001 coupled to a display 1045. Platform 1001 may receive content from a content device, such as content services device(s) 1048 or content delivery device(s) 1049 or other similar content sources. A navigation controller 1050 comprising one or more navigation features may be used to interact with, for example, platform 1001 and/or display 1045. Each of these components is described in more detail below.
In various embodiments, platform 1001 may comprise any combination of a processor circuit 1002, chipset 1003, memory unit 1004, transceiver 1044, storage 1046, applications 1051, and/or graphics subsystem 1052. Chipset 1003 may provide intercommunication among processor circuit 1002, memory unit 1004, transceiver 1044, storage 1046, applications 1051, and/or graphics subsystem 1052. For example, chipset 1003 may include a storage adapter (not depicted) capable of providing intercommunication with storage 1046.
Processor circuit 1002 may be implemented using any processor or logic device, such as a complex instruction set computer (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, an x86 instruction set compatible processor, a processor implementing a combination of instruction sets, a multi-core processor such as a dual-core processor or dual-core mobile processor, or any other microprocessor or central processing unit (CPU). Processor circuit 1002 may also be implemented as a dedicated processor, such as a controller, a microcontroller, an embedded processor, a chip multiprocessor (CMP), a co-processor, a digital signal processor (DSP), a network processor, a media processor, an input/output (I/O) processor, a media access control (MAC) processor, a radio baseband processor, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a programmable logic device (PLD), and so forth. In one embodiment, for example, processor circuit 1002 may be implemented as a general purpose processor, such as a processor made by Intel® Corporation, Santa Clara, Calif. The embodiments are not limited in this context.
Memory unit 1004 may be implemented using any machine-readable or computer-readable media capable of storing data, including both volatile and non-volatile memory. For example, memory unit 1004 may include read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, or any other type of media suitable for storing information.
Transceiver 1044 may include one or more radios capable of transmitting and receiving signals using various suitable wireless communications techniques. Such techniques may involve communications across one or more wireless networks. Exemplary wireless networks include (but are not limited to) wireless local area networks (WLANs), wireless personal area networks (WPANs), wireless metropolitan area networks (WMANs), cellular networks, and satellite networks. In communicating across such networks, transceiver 1044 may operate in accordance with one or more applicable standards in any version. In various embodiments, transceiver 1044 may comprise a radio frequency (RF) transceiver. The embodiments are not limited in this context.
Display 1045 may comprise any display device capable of displaying information received from processor circuit 1002. Examples for display 1045 may include a television, a monitor, a projector, and a computer screen. In one embodiment, for example, display 1045 may be implemented by a liquid crystal display (LCD), light emitting diode (LED), or other type of suitable visual interface. Display 1045 may comprise, for example, a touch-sensitive display screen ("touchscreen"). In various implementations, display 1045 may comprise one or more thin-film transistor (TFT) LCDs including embedded transistors. The embodiments are not limited in this context.
Storage 1046 may be implemented as a non-volatile storage device such as, but not limited to, a magnetic disk drive, optical disk drive, tape drive, an internal storage device, an attached storage device, flash memory, battery backed-up SDRAM (synchronous DRAM), and/or a network accessible storage device. In various embodiments, storage 1046 may comprise technology to increase the storage performance enhanced protection for valuable digital media when multiple hard drives are included, for example. Further examples of storage 1046 may include a hard disk, floppy disk, Compact Disk Read Only Memory (CD-ROM), Compact Disk Recordable (CD-R), Compact Disk Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of DVD devices, a tape device, a cassette device, or the like. The embodiments are not limited in this context.
Graphics subsystem 1052 may perform processing of images such as still or video for display. Graphics subsystem 1052 may be a graphics processing unit (GPU) or a visual processing unit (VPU), for example. An analog or digital interface may be used to communicatively couple graphics subsystem 1052 and display 1045. For example, the interface may be any of a High-Definition Multimedia Interface (HDMI), DisplayPort, wireless HDMI, and/or wireless HD compliant techniques. Graphics subsystem 1052 could be integrated into processor circuit 1002 or chipset 1003. Graphics subsystem 1052 could be a stand-alone card communicatively coupled to chipset 1003.
The graphics and/or video processing techniques described herein may be implemented in various hardware architectures. For example, graphics and/or video functionality may be integrated within a chipset. Alternatively, a discrete graphics and/or video processor may be used. As still another embodiment, the graphics and/or video functions may be implemented by a general purpose processor, including a multi-core processor. In a further embodiment, the functions may be implemented in a consumer electronics device.
In various embodiments, content services device(s) 1048 may be hosted by any national, international, and/or independent service and thus accessible to platform 1001 via the Internet, for example. Content services device(s) 1048 may be coupled to platform 1001 and/or to display 1045. Platform 1001 and/or content services device(s) 1048 may be coupled to a network 1053 to communicate (e.g., send and/or receive) media information to and from network 1053. Content delivery device(s) 1049 also may be coupled to platform 1001 and/or to display 1045.
In various embodiments, content services device(s) 1048 may comprise a cable television box, personal computer, network, telephone, Internet enabled device or appliance capable of delivering digital information and/or content, and any other similar device capable of unidirectionally or bidirectionally communicating content between content providers and platform 1001 and/or display 1045, via network 1053 or directly. It will be appreciated that the content may be communicated unidirectionally and/or bidirectionally to and from any one of the components in system 1000 and a content provider via network 1053. Examples of content may include any media information including, for example, video, music, medical and gaming information, and so forth.
Content services device(s) 1048 receives content such as cable television programming including media information, digital information, and/or other content. Examples of content providers may include any cable or satellite television or radio station or Internet content provider. The provided examples are not meant to limit embodiments of the disclosed subject matter.
In various embodiments, platform 1001 may receive control signals from a navigation controller 1050 having one or more navigation features. The navigation features of navigation controller 1050 may be used to interact with user interface 1054, for example. In some embodiments, navigation controller 1050 may be a pointing device, which may be a computer hardware component (specifically a human interface device) that allows a user to input spatial (e.g., continuous and multi-dimensional) data into a computer. Many systems such as graphical user interfaces (GUI), and televisions and monitors allow the user to control and provide data to the computer or television using physical gestures.
Movements of the navigation features of navigation controller 1050 may be echoed on a display (e.g., display 1045) by movements of a pointer, cursor, focus ring, or other visual indicators displayed on the display. For example, under the control of software applications 1051, the navigation features located on navigation controller 1050 may be mapped to virtual navigation features displayed on user interface 1054. In various embodiments, navigation controller 1050 may not be a separate component but rather may be integrated into platform 1001 and/or display 1045. Embodiments, however, are not limited to the elements or in the context shown or described herein.
In various embodiments, drivers (not shown) may comprise technology to enable users to instantly turn platform 1001 on and off, like a television, with the touch of a button after initial boot-up, when enabled, for example. Program logic may allow platform 1001 to stream content to media adaptors or other content services device(s) 1048 or content delivery device(s) 1049 when the platform is turned "off." In addition, chipset 1003 may comprise hardware and/or software support for 5.1 surround sound audio and/or high definition 7.1 surround sound audio, for example. Drivers may include a graphics driver for integrated graphics platforms. In various embodiments, the graphics driver may comprise a peripheral component interconnect (PCI) Express graphics card.
In various embodiments, any one or more of the components shown in system 1000 may be integrated. For example, platform 1001 and content services device(s) 1048 may be integrated, or platform 1001 and content delivery device(s) 1049 may be integrated, or platform 1001, content services device(s) 1048, and content delivery device(s) 1049 may be integrated, for example. In various embodiments, platform 1001 and display 1045 may be an integrated unit. Display 1045 and content services device(s) 1048 may be integrated, or display 1045 and content delivery device(s) 1049 may be integrated, for example. These examples are not meant to limit the disclosed subject matter.
In various embodiments, system 1000 may be implemented as a wireless system, a wired system, or a combination of both. When implemented as a wireless system, system 1000 may include components and interfaces suitable for communicating over a wireless shared media, such as one or more antennas, transmitters, receivers, transceivers, amplifiers, filters, control logic, and so forth. An example of wireless shared media may include portions of a wireless spectrum, such as the RF spectrum and so forth. When implemented as a wired system, system 1000 may include components and interfaces suitable for communicating over wired communications media, such as I/O adapters, physical connectors to connect the I/O adapter with a corresponding wired communications medium, a network interface card (NIC), disc controller, video controller, audio controller, and so forth. Examples of wired communications media may include a wire, cable, metal leads, printed circuit board (PCB), backplane, switch fabric, semiconductor material, twisted-pair wire, co-axial cable, fiber optics, and so forth.
Platform 1001 may establish one or more logical or physical channels to communicate information. The information may include media information and control information. Media information may refer to any data representing content meant for a user. Examples of content may include, for example, data from a voice conversation, videoconference, streaming video, electronic mail ("email") message, voice mail message, alphanumeric symbols, graphics, image, video, text, and so forth. Data from a voice conversation may be, for example, speech information, silence periods, background noise, comfort noise, tones, and so forth. Control information may refer to any data representing commands, instructions, or control words meant for an automated system. For example, control information may be used to route media information through a system, or instruct a node to process the media information in a predetermined manner. The embodiments, however, are not limited to the elements or in the context shown or described in Fig. 10.
As described above, system 1000 may be embodied in varying physical styles or form factors. Fig. 11 illustrates embodiments of a small form factor device 1100 in which system 1000 may be embodied. In various embodiments, for example, device 1100 may be implemented as part of a mobile computing device having wireless capabilities. A mobile computing device may refer to any device having a processing system and a mobile power source or supply, such as one or more batteries, for example.
As described above, examples of a mobile computing device may include a personal computer (PC), laptop computer, ultra-laptop computer, tablet, touch pad, portable computer, handheld computer, palmtop computer, personal digital assistant (PDA), cellular telephone, combination cellular telephone/PDA, television, smart device (e.g., smart phone, smart tablet or smart television), mobile internet device (MID), messaging device, data communication device, and so forth.
Examples of a mobile computing device also may include computers that are arranged to be worn by a person, such as a wrist computer, finger computer, ring computer, eyeglass computer, belt-clip computer, arm-band computer, shoe computer, clothing computer, and other wearable computers. In various embodiments, for example, a mobile computing device may be implemented as a smart phone capable of executing computer applications, as well as voice communications and/or data communications. Although some embodiments may be described with a mobile computing device implemented as a smart phone by way of example, it may be appreciated that other embodiments may be implemented using other wireless mobile computing devices as well. The embodiments are not limited in this context.
As shown in Fig. 11, device 1100 may comprise a display 1145, a navigation controller 1150, a user interface 1154, a housing 1155, an I/O device 1156, and an antenna 1157. Display 1145 may comprise any suitable display unit for displaying information appropriate for a mobile computing device, and may be the same as or similar to display 1045 of Fig. 10. Navigation controller 1150 may comprise one or more navigation features which may be used to interact with user interface 1154, and may be the same as or similar to navigation controller 1050 of Fig. 10. I/O device 1156 may comprise any suitable I/O device for entering information into a mobile computing device. Examples for I/O device 1156 may include an alphanumeric keyboard, a numeric keypad, a touch pad, input keys, buttons, switches, rocker switches, microphones, speakers, voice recognition devices and software, and so forth. Information also may be entered into device 1100 by way of a microphone. Such information may be digitized by a voice recognition device. The embodiments are not limited in this context.
Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate arrays (FPGA), logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth. Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds, and other design or performance constraints.
One or more aspects of at least one embodiment may be implemented by representative instructions stored on a machine-readable medium which represents various logic within a processor, which when read by a machine causes the machine to fabricate logic to perform the techniques described herein. Such representations, known as "IP cores," may be stored on a tangible machine readable medium and supplied to various customers or manufacturing facilities to load into the fabrication machines that actually make the logic or processor. Some embodiments may be implemented, for example, using a machine-readable medium or article which may store an instruction or a set of instructions that, if executed by a machine, may cause the machine to perform a method and/or operations in accordance with the embodiments. Such a machine may include, for example, any suitable processing platform, computing platform, computing device, processing device, computing system, processing system, computer, processor, or the like, and may be implemented using any suitable combination of hardware and/or software. The machine-readable medium or article may include, for example, any suitable type of memory unit, memory device, memory article, memory medium, storage device, storage article, storage medium and/or storage unit, for example, memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analog media, hard disk, floppy disk, Compact Disk Read Only Memory (CD-ROM), Compact Disk Recordable (CD-R), Compact Disk Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of Digital Versatile Disk (DVD), a tape, a cassette, or the like. The instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, encrypted code, and the like, implemented using any suitable high-level, low-level, object-oriented, visual, compiled, and/or interpreted programming language.
The following examples pertain to further embodiments:
Example 1 is an image processing apparatus, comprising logic, at least a portion of which is in hardware, the logic to determine a respective set of error values for each of a set of lens shading correction (LSC) tables, each set of error values describing errors associated with application of its corresponding LSC table to a preprocessed image, determine a respective weight for each LSC table in the set of LSC tables based on the corresponding set of error values, and generate a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights.
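The core of Example 1 — blending candidate LSC tables into one composite table using error-derived weights — can be sketched in a few lines. This is an illustrative approximation, not the patented implementation: the inverse-error weighting rule, the function name, and the toy 2x2 gain tables are all assumptions made for demonstration.

```python
def composite_lsc_table(tables, errors):
    """Blend a set of LSC gain tables into one composite table.

    tables: list of same-shaped 2D gain grids (lists of lists of floats).
    errors: one aggregate error value per table; lower error -> higher weight.
    """
    # Assumed rule: weight each table by inverse error, normalized to sum to 1.
    raw = [1.0 / (1.0 + e) for e in errors]
    total = sum(raw)
    weights = [r / total for r in raw]

    rows, cols = len(tables[0]), len(tables[0][0])
    composite = [[0.0] * cols for _ in range(rows)]
    for table, w in zip(tables, weights):
        for r in range(rows):
            for c in range(cols):
                composite[r][c] += w * table[r][c]
    return composite, weights

# Two toy 2x2 gain tables, e.g. for two candidate illuminants (assumed values).
t_daylight = [[1.0, 1.2], [1.2, 1.4]]
t_tungsten = [[1.0, 1.6], [1.6, 2.0]]
table, weights = composite_lsc_table([t_daylight, t_tungsten], errors=[0.1, 0.3])
```

The table with the lower error (here `t_daylight`) dominates the blend, which matches the intent of weighting tables by how well they fit the preprocessed image.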
In Example 2, the logic of Example 1 may optionally partition the preprocessed image into a plurality of blocks and, for each LSC table in the set of LSC tables, calculate a corresponding set of block error values, each block error value comprising an error value for one of the plurality of blocks.
In Example 3, the logic of Example 2 may optionally, for each LSC table in the set of LSC tables, calculate a corresponding set of block weights based on the set of block error values for that LSC table, each block weight comprising a weight for one of the plurality of blocks, and calculate the respective weight for each LSC table as a weighted sum of the set of block weights for that LSC table.
In Example 4, the logic of Example 3 may optionally identify a set of reliability parameters for the plurality of blocks, each reliability parameter indicating a level of reliability of error values for one of the plurality of blocks, and calculate the weighted sum of the set of block weights for each LSC table by weighting the block weights according to the reliability parameters of their associated blocks.
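Examples 2 through 4 describe a per-block refinement: each table receives block-level errors, the errors become block weights, and reliability parameters bias the final table weight toward trustworthy blocks. A minimal sketch under assumed conventions — inverse-error block weights and a reliability-weighted mean are illustrative choices, not the patent's formulas:

```python
def table_weight(block_errors, reliabilities):
    """Collapse one LSC table's block error values into a single table weight.

    block_errors: flat list, one error value per image block.
    reliabilities: one reliability parameter per block; higher = more trusted.
    """
    # Assumed rule: each block contributes an inverse-error block weight...
    block_weights = [1.0 / (1.0 + e) for e in block_errors]
    # ...and the table weight is the reliability-weighted mean of those.
    total_rel = sum(reliabilities)
    return sum(w * r for w, r in zip(block_weights, reliabilities)) / total_rel

# Four blocks: the last block's error is large, but when its reliability is
# low it barely drags the table weight down.
errs = [0.0, 0.0, 0.0, 9.0]
w_uniform = table_weight(errs, [1.0, 1.0, 1.0, 1.0])
w_biased = table_weight(errs, [1.0, 1.0, 1.0, 0.1])
```

Discounting unreliable blocks this way is one plausible reading of Example 4: a noisy corner block should not veto a table that fits the rest of the image well.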
In Example 5, the set of reliability parameters of Example 4 may optionally comprise approximations based on results of simulations or tests performed on vision sensor data.
In Example 6, the logic of any one of Examples 4 to 5 may optionally partition the preprocessed image into blocks arranged in four rows and six columns, and the set of reliability parameters may optionally comprise a reliability parameter matrix comprising four rows and six columns.
In Example 7, each LSC table of any one of Examples 1 to 6 may optionally correspond to a different respective illuminant.
In Example 8, the logic of Example 7 may optionally calculate, for each of a plurality of LSC tables, a respective probability of presence of its corresponding illuminant, and define the set of LSC tables to comprise each LSC table in the plurality of LSC tables for which the probability of presence of the corresponding illuminant is greater than zero.
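Examples 7 and 8 tie each candidate table to an illuminant and keep only tables whose illuminant might actually be present in the scene. A sketch of that selection step — the probability estimator itself (e.g., an AWB-style scene analysis) is out of scope here and stubbed with assumed fixed values; all names are hypothetical:

```python
def select_tables(tables_by_illuminant, presence_probability):
    """Keep only LSC tables whose illuminant has nonzero presence probability.

    tables_by_illuminant: dict mapping illuminant name -> LSC gain table.
    presence_probability: dict mapping illuminant name -> probability in [0, 1].
    """
    return {
        name: table
        for name, table in tables_by_illuminant.items()
        if presence_probability.get(name, 0.0) > 0.0
    }

tables = {"D65": [[1.0]], "TL84": [[1.1]], "A": [[1.3]]}
# Stub probabilities for three characterized illuminants (assumed values).
probs = {"D65": 0.7, "TL84": 0.3, "A": 0.0}
active = select_tables(tables, probs)
```

Dropping zero-probability illuminants up front keeps the later per-block error computation from wasting work on tables that cannot contribute to the composite.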
In Example 9, the logic of any one of Examples 2 to 8 may optionally select, based on content of the preprocessed image, a number of blocks into which to partition the preprocessed image, and partition the preprocessed image into the selected number of blocks.
In Example 10, the logic of any one of Examples 1 to 9 may optionally generate a plurality of composite LSC tables for the preprocessed image, and each composite LSC table may optionally correspond to a respective color.
In Example 11, the logic of any one of Examples 1 to 10 may optionally apply the composite LSC table to the preprocessed image to generate a processed image.
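Applying the composite table (Example 11) amounts to multiplying each pixel by the shading gain at its location; real pipelines would interpolate a coarse gain grid up to sensor resolution and repeat per color channel. A sketch under the simplifying assumption that the grid already matches the image size:

```python
def apply_lsc(image, gains):
    """Multiply each pixel of a single-channel image by its LSC gain.

    image, gains: equally sized 2D lists; gains > 1 brighten vignetted corners.
    """
    return [
        [pixel * gain for pixel, gain in zip(img_row, gain_row)]
        for img_row, gain_row in zip(image, gains)
    ]

# A vignetted 2x2 patch: darker corner pixels get proportionally larger gains,
# so the corrected patch comes out uniform (values chosen for illustration).
img = [[100.0, 80.0], [80.0, 50.0]]
gains = [[1.0, 1.25], [1.25, 2.0]]
corrected = apply_lsc(img, gains)
```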
In Example 12, the logic of Example 11 may optionally output the processed image to a video controller for presentation of the processed image on a digital display.
In Example 13, each LSC table of any one of Examples 1 to 12 may optionally comprise one or more correlated color temperature (CCT) values.
In Example 14, each LSC table of any one of Examples 1 to 13 may optionally be stored in persistent storage.
In Example 15, the logic of any one of Examples 1 to 14 may optionally determine the respective weights for the set of LSC tables during runtime execution of a camera application.
Example 16 is a system, comprising an image processing apparatus according to any one of Examples 1 to 15, a display, a radio frequency (RF) transceiver, and one or more RF antennas.
Example 17 is at least one non-transitory machine-readable medium comprising a set of image processing instructions that, in response to being executed on a computing device, cause the computing device to determine a respective set of error values for each of a set of lens shading correction (LSC) tables, each set of error values describing errors associated with application of its corresponding LSC table to a preprocessed image, determine a respective weight for each LSC table in the set of LSC tables based on the corresponding set of error values, and generate a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights.
In Example 18, the at least one non-transitory machine-readable medium of Example 17 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to partition the preprocessed image into a plurality of blocks and, for each LSC table in the set of LSC tables, calculate a corresponding set of block error values, each block error value comprising an error value for one of the plurality of blocks.
In Example 19, the at least one non-transitory machine-readable medium of Example 18 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to, for each LSC table in the set of LSC tables, calculate a corresponding set of block weights based on the set of block error values for that LSC table, each block weight comprising a weight for one of the plurality of blocks, and calculate the respective weight for each LSC table as a weighted sum of the set of block weights for that LSC table.
In Example 20, the at least one non-transitory machine-readable medium of Example 19 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to identify a set of reliability parameters for the plurality of blocks, each reliability parameter indicating a level of reliability of error values for one of the plurality of blocks, and calculate the weighted sum of the set of block weights for each LSC table by weighting the block weights according to the reliability parameters of their associated blocks.
In Example 21, the set of reliability parameters of Example 20 may optionally comprise approximations based on results of simulations or tests performed on vision sensor data.
In Example 22, the at least one non-transitory machine-readable medium of any one of Examples 20 to 21 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to partition the preprocessed image into blocks arranged in four rows and six columns, and the set of reliability parameters may optionally comprise a reliability parameter matrix comprising four rows and six columns.
In Example 23, each LSC table of any one of Examples 17 to 22 may optionally correspond to a different respective illuminant.
In Example 24, the at least one non-transitory machine-readable medium of Example 23 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to calculate, for each of a plurality of LSC tables, a respective probability of presence of its corresponding illuminant, and define the set of LSC tables to comprise each LSC table in the plurality of LSC tables for which the probability of presence of the corresponding illuminant is greater than zero.
In Example 25, the at least one non-transitory machine-readable medium of any one of Examples 18 to 24 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to select, based on content of the preprocessed image, a number of blocks into which to partition the preprocessed image, and partition the preprocessed image into the selected number of blocks.
In Example 26, the at least one non-transitory machine-readable medium of any one of Examples 17 to 25 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to generate a plurality of composite LSC tables for the preprocessed image, and each composite LSC table may optionally correspond to a respective color.
In Example 27, the at least one non-transitory machine-readable medium of any one of Examples 17 to 26 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to apply the composite LSC table to the preprocessed image to generate a processed image.
In Example 28, the at least one non-transitory machine-readable medium of Example 27 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to output the processed image to a video controller for presentation of the processed image on a digital display.
In Example 29, each LSC table of any one of Examples 17 to 28 may optionally comprise one or more correlated color temperature (CCT) values.
In Example 30, each LSC table of any one of Examples 17 to 29 may optionally be stored in persistent storage.
In Example 31, the at least one non-transitory machine-readable medium of any one of Examples 17 to 30 may optionally comprise image processing instructions that, in response to being executed on the computing device, cause the computing device to determine the respective weights for the set of LSC tables during runtime execution of a camera application.
Example 32 is an image processing method, comprising determining, by a processor circuit, a respective set of error values for each of a set of lens shading correction (LSC) tables, each set of error values describing errors associated with application of its corresponding LSC table to a preprocessed image, determining a respective weight for each LSC table in the set of LSC tables based on the corresponding set of error values, and generating a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights.
In Example 33, the image processing method of Example 32 may optionally comprise partitioning the preprocessed image into a plurality of blocks and, for each LSC table in the set of LSC tables, calculating a corresponding set of block error values, each block error value comprising an error value for one of the plurality of blocks.
In Example 34, the image processing method of Example 33 may optionally comprise, for each LSC table in the set of LSC tables, calculating a corresponding set of block weights based on the set of block error values for that LSC table, each block weight comprising a weight for one of the plurality of blocks, and calculating the respective weight for each LSC table as a weighted sum of the set of block weights for that LSC table.
In Example 35, the image processing method of Example 34 may optionally comprise identifying a set of reliability parameters for the plurality of blocks, each reliability parameter indicating a level of reliability of error values for one of the plurality of blocks, and calculating the weighted sum of the set of block weights for each LSC table by weighting the block weights according to the reliability parameters of their associated blocks.
In Example 36, the set of reliability parameters of Example 35 may optionally comprise approximations based on results of simulations or tests performed on vision sensor data.
In Example 37, the image processing method of any one of Examples 35 to 36 may optionally comprise partitioning the preprocessed image into blocks arranged in four rows and six columns, and the set of reliability parameters may optionally comprise a reliability parameter matrix comprising four rows and six columns.
In Example 38, each LSC table of any one of Examples 32 to 37 may optionally correspond to a different respective illuminant.
In Example 39, the image processing method of Example 38 may optionally comprise calculating, for each of a plurality of LSC tables, a respective probability of presence of its corresponding illuminant, and defining the set of LSC tables to comprise each LSC table in the plurality of LSC tables for which the probability of presence of the corresponding illuminant is greater than zero.
In Example 40, the image processing method of any one of Examples 33 to 39 may optionally comprise selecting, based on content of the preprocessed image, a number of blocks into which to partition the preprocessed image, and partitioning the preprocessed image into the selected number of blocks.
In Example 41, the image processing method of any one of Examples 32 to 40 may optionally comprise generating a plurality of composite LSC tables for the preprocessed image, and each composite LSC table may optionally correspond to a respective color.
In Example 42, the image processing method of any one of Examples 32 to 41 may optionally comprise applying the composite LSC table to the preprocessed image to generate a processed image.
In Example 43, the image processing method of Example 42 may optionally comprise outputting the processed image to a video controller for presentation of the processed image on a digital display.
In Example 44, each LSC table of any one of Examples 32 to 43 may optionally comprise one or more correlated color temperature (CCT) values.
In Example 45, each LSC table of any one of Examples 32 to 44 may optionally be stored in persistent storage.
In Example 46, the image processing method of any one of Examples 32 to 45 may optionally comprise determining the respective weights for the set of LSC tables during runtime execution of a camera application.
Example 47 is at least one machine-readable medium comprising a set of instructions that, in response to being executed on a computing device, cause the computing device to perform an image processing method according to any one of Examples 32 to 46.
Example 48 is an apparatus, comprising means for performing an image processing method according to any one of Examples 32 to 46.
Example 49 is a system, comprising an apparatus according to Example 48, a display, a radio frequency (RF) transceiver, and one or more RF antennas.
Example 50 is an image processing apparatus, comprising means for determining a respective set of error values for each of a set of lens shading correction (LSC) tables, each set of error values describing errors associated with application of its corresponding LSC table to a preprocessed image, means for determining a respective weight for each LSC table in the set of LSC tables based on the corresponding set of error values, and means for generating a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights.
In Example 51, the image processing apparatus of Example 50 may optionally comprise means for partitioning the preprocessed image into a plurality of blocks and means for calculating, for each LSC table in the set of LSC tables, a corresponding set of block error values, each block error value comprising an error value for one of the plurality of blocks.
In Example 52, the image processing apparatus of Example 51 may optionally comprise means for calculating, for each LSC table in the set of LSC tables, a corresponding set of block weights based on the set of block error values for that LSC table, each block weight comprising a weight for one of the plurality of blocks, and means for calculating the respective weight for each LSC table as a weighted sum of the set of block weights for that LSC table.
In Example 53, the image processing apparatus of Example 52 may optionally comprise means for identifying a set of reliability parameters for the plurality of blocks, each reliability parameter indicating a level of reliability of error values for one of the plurality of blocks, and means for calculating the weighted sum of the set of block weights for each LSC table by weighting the block weights according to the reliability parameters of their associated blocks.
In Example 54, the set of reliability parameters of Example 53 may optionally comprise approximations based on results of simulations or tests performed on vision sensor data.
In Example 55, the image processing apparatus of any one of Examples 53 to 54 may optionally comprise means for partitioning the preprocessed image into blocks arranged in four rows and six columns, and the set of reliability parameters may optionally comprise a reliability parameter matrix comprising four rows and six columns.
In Example 56, each LSC table of any one of Examples 50 to 55 may optionally correspond to a different respective illuminant.
In Example 57, the image processing apparatus of Example 56 may optionally comprise means for calculating, for each of a plurality of LSC tables, a respective probability of presence of its corresponding illuminant, and means for defining the set of LSC tables to comprise each LSC table in the plurality of LSC tables for which the probability of presence of the corresponding illuminant is greater than zero.
In Example 58, the image processing apparatus of any one of Examples 51 to 57 may optionally comprise means for selecting, based on content of the preprocessed image, a number of blocks into which to partition the preprocessed image, and means for partitioning the preprocessed image into the selected number of blocks.
In Example 59, the image processing apparatus of any one of Examples 50 to 58 may optionally comprise means for generating a plurality of composite LSC tables for the preprocessed image, and each composite LSC table may optionally correspond to a respective color.
In Example 60, the image processing apparatus of any one of Examples 50 to 59 may optionally comprise means for applying the composite LSC table to the preprocessed image to generate a processed image.
In Example 61, the image processing apparatus of Example 60 may optionally comprise means for outputting the processed image to a video controller for presentation of the processed image on a digital display.
In Example 62, each LSC table of any one of Examples 50 to 61 may optionally comprise one or more correlated color temperature (CCT) values.
In Example 63, each LSC table of any one of Examples 50 to 62 may optionally be stored in persistent storage.
In Example 64, the image processing apparatus of any one of Examples 50 to 63 may optionally comprise means for determining the respective weights for the set of LSC tables during runtime execution of a camera application.
Example 65 is a system, comprising an image processing apparatus according to any one of Examples 50 to 64, a display, a radio frequency (RF) transceiver, and one or more RF antennas.
Numerous specific details have been set forth herein to provide a thorough understanding of the embodiments. It will be understood by those skilled in the art, however, that the embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to obscure the embodiments. It can be appreciated that the specific structural and functional details disclosed herein may be representative and do not necessarily limit the scope of the embodiments.
Some embodiments may be described using the expressions "coupled" and "connected" along with their derivatives. These terms are not intended as synonyms for each other. For example, some embodiments may be described using the terms "connected" and/or "coupled" to indicate that two or more elements are in direct physical or electrical contact with each other. The term "coupled," however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
Unless specifically stated otherwise, it may be appreciated that terms such as "processing," "computing," "calculating," "determining," or the like refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulates and/or transforms data represented as physical quantities (e.g., electronic) within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers, or other such information storage, transmission, or display devices. The embodiments are not limited in this context.
It should be noted that the methods described herein do not have to be executed in the order described, or in any particular order. Moreover, various activities described with respect to the methods identified herein can be executed in serial or parallel fashion.
Although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. It is to be understood that the above description has been made in an illustrative fashion, and not a restrictive one. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description. Thus, the scope of various embodiments includes any other applications in which the above compositions, structures, and methods are used.
Some embodiments may be described using the expression "one embodiment" or "an embodiment" along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expressions "coupled" and "connected" along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms "connected" and/or "coupled" to indicate that two or more elements are in direct physical or electrical contact with each other. The term "coupled," however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. Furthermore, aspects or elements from different embodiments may be combined.
It is emphasized that the Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate preferred embodiment. In the appended claims, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein," respectively. Moreover, the terms "first," "second," and "third," etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (25)

1. An image processing apparatus, comprising:
logic, at least a portion of which is in hardware, the logic to determine a respective set of error values for each lens shading correction (LSC) table in a set of LSC tables, each set of error values describing errors associated with application of its corresponding LSC table to a preprocessed image, determine a respective weight for each LSC table based on the corresponding set of error values for that LSC table, and generate a composite LSC table for the preprocessed image as a weighted sum of the set of LSC tables, based on the respective weights for the set of LSC tables.
2. The image processing apparatus of claim 1, wherein the logic is to partition the preprocessed image into a plurality of blocks and, for each LSC table in the set of LSC tables, calculate a corresponding set of block error values, each block error value comprising an error value for one of the plurality of blocks.
3. The image processing apparatus of claim 2, wherein the logic is to, for each LSC table in the set of LSC tables, calculate a corresponding set of block weights based on the set of block error values for that LSC table, each block weight comprising a weight for one of the plurality of blocks, and calculate the respective weight for each LSC table as a weighted sum of the set of block weights for that LSC table.
4. The image processing apparatus of claim 3, wherein the logic is to identify a set of reliability parameters for the plurality of blocks, each reliability parameter indicating a level of reliability of error values for one of the plurality of blocks, and calculate the weighted sum of the set of block weights for each LSC table by weighting the block weights according to the reliability parameters of their associated blocks.
5. The image processing apparatus of claim 4, wherein the set of reliability parameters comprises approximations based on results of simulations or tests performed on vision sensor data.
6. The image processing apparatus of claim 4, wherein the logic is to partition the preprocessed image into blocks arranged in four rows and six columns, and the set of reliability parameters comprises a reliability parameter matrix comprising four rows and six columns.
7. The image processing apparatus of claim 1, wherein each LSC table corresponds to a different respective illuminant.
8. The image processing apparatus of claim 7, wherein the logic is to calculate, for each of a plurality of LSC tables, a respective probability of presence of its corresponding illuminant, and define the set of LSC tables to comprise each LSC table in the plurality of LSC tables for which the probability of presence of the corresponding illuminant is greater than zero.
9. The image processing apparatus of claim 2, wherein the logic is to select, based on content of the preprocessed image, a number of blocks into which to partition the preprocessed image, and partition the preprocessed image into the selected number of blocks.
10. The image processing apparatus of claim 1, wherein the logic is to generate a plurality of composite LSC tables for the preprocessed image, each composite LSC table corresponding to a respective color.
11. The image processing apparatus of claim 1, wherein the logic is to apply the composite LSC table to the preprocessed image to generate a processed image.
12. A system, comprising:
an apparatus according to any one of claims 1 to 11;
a display;
a radio frequency (RF) transceiver; and
one or more RF antennas.
13. 1 kinds of image processing methods, comprising:
By processor circuit determine one group of correcting lens shadow (LSC) show in the corresponding set of error values of each, each set of error values describes and the error shown its corresponding LSC to be applied to pretreated image and be associated;
Correspondence one set of error values shown based on each LSC in described one group of LSC table determines the respective weights that each LSC described shows; And
Respective weights based on described one group of LSC table generates the weighted sum of synthesis LSC table as described one group of LSC table of described pretreated image.
14. image processing methods as claimed in claim 13, is characterized in that, comprising:
Described pretreated Iamge Segmentation is become multiple pieces; And
Show for each LSC in described one group of LSC table, calculate a corresponding chunk error amount, each block error amount comprises the error amount of a block in described multiple pieces.
15. image processing methods as claimed in claim 14, is characterized in that, comprising:
Show for each LSC in described one group of LSC table, the described chunk error amount shown based on described LSC calculates a corresponding chunk weight, and each block weight comprises the weight of one of described multiple pieces; And
For each LSC in described one group of LSC table shows the weighted sum calculating the described chunk weight that respective weights is shown as each LSC described.
16. image processing methods as claimed in claim 15, is characterized in that, comprising:
Identify one group of dependability parameter of described multiple pieces, the reliability level of the error amount of one of described multiple pieces of each dependability parameter instruction; And
Calculate the weighted sum of a described chunk weight that each LSC shows, described calculating to be weighted block weight by the dependability parameter of the relevant block according to block weight to be carried out.
17. image processing methods as claimed in claim 13, is characterized in that, each LSC shows to correspond to different corresponding luminous elements.
18. The image processing method of claim 17, comprising:
calculating, for each of a plurality of LSC tables, a respective probability of presence of its corresponding illuminant; and
defining the set of LSC tables to comprise each LSC table in the plurality of LSC tables for which the corresponding illuminant has a probability of presence greater than zero.
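The table-selection step of claim 18 amounts to gating tables on illuminant probability. How the probabilities are estimated (for example, from auto-white-balance statistics) is outside the claim; the illuminant names used here are illustrative.

```python
def select_lsc_tables(tables_by_illuminant, illuminant_probs):
    """Keep only the LSC tables whose illuminant has a non-zero probability
    of being present in the scene."""
    return {name: table
            for name, table in tables_by_illuminant.items()
            if illuminant_probs.get(name, 0.0) > 0.0}

# One calibrated table per characterized illuminant (1x1 grids for brevity).
tables = {"D65": [[1.0]], "A": [[1.1]], "F11": [[1.2]]}
probs = {"D65": 0.7, "A": 0.3, "F11": 0.0}
active = select_lsc_tables(tables, probs)
```

Only the D65 and incandescent (A) tables survive; the fluorescent table is excluded and never contributes to the composite.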
19. The image processing method of claim 14, comprising:
selecting, based on content of the preprocessed image, a number of blocks into which to partition the preprocessed image; and
partitioning the preprocessed image into the selected number of blocks.
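Claim 19 leaves the content measure open; one plausible sketch uses global pixel variance as a proxy for scene complexity, so a busier image gets a finer grid. The variance metric, the 0.05 scale factor, and the block-count bounds are all assumptions.

```python
def choose_block_count(image, min_blocks=4, max_blocks=16):
    """Pick how many blocks to partition the image into, based on content.
    Higher pixel variance -> more blocks, clamped to [min_blocks, max_blocks]."""
    h, w = len(image), len(image[0])
    mean = sum(sum(row) for row in image) / (h * w)
    var = sum((image[r][c] - mean) ** 2
              for r in range(h) for c in range(w)) / (h * w)
    # Map variance onto the allowed range; the 0.05 scale is arbitrary.
    n = min_blocks + int(min(var / 0.05, 1.0) * (max_blocks - min_blocks))
    return n
```

A flat frame stays at the coarse minimum, while a high-contrast frame is pushed to the maximum block count.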
20. The image processing method of claim 13, comprising:
applying the composite LSC table to the preprocessed image to generate a processed image; and
outputting the processed image to a video controller for presentation of the processed image on a digital display.
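Applying the composite table, as in claim 20, is a per-pixel gain multiplication. This sketch is single-channel for brevity; real pipelines hold one gain grid per Bayer color channel and interpolate the grid up to sensor resolution.

```python
def apply_lsc(image, lsc_table):
    """Multiply each pixel by the corresponding composite-table gain to
    produce the processed image (gain grid assumed to match image size)."""
    return [[image[r][c] * lsc_table[r][c] for c in range(len(image[0]))]
            for r in range(len(image))]

# Vignetted frame, darker toward the corner, with a table that boosts the
# corner gains to compensate.
frame = [[1.0, 0.8], [0.8, 0.5]]
gains = [[1.0, 1.25], [1.25, 2.0]]
processed = apply_lsc(frame, gains)
```

The corner falloff is cancelled, leaving a uniform frame.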
21. The image processing method of claim 13, wherein each LSC table comprises one or more correlated color temperature (CCT) values.
22. The image processing method of claim 13, wherein each LSC table is stored in persistent storage.
23. The image processing method of claim 13, wherein the respective weights of the set of LSC tables are determined during runtime of a camera application.
24. At least one machine-readable medium comprising a set of instructions that, in response to being executed on a computing device, cause the computing device to perform an image processing method according to any one of claims 13 to 23.
25. An apparatus comprising means for performing an image processing method according to any one of claims 13 to 23.
CN201410582124.5A 2013-11-27 2014-10-27 Techniques to reduce color artifacts in a digital image Active CN104683774B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361909914P 2013-11-27 2013-11-27
US61/909,914 2013-11-27
US14/182,824 2014-02-18
US14/182,824 US9361537B2 (en) 2013-11-27 2014-02-18 Techniques to reduce color artifacts in a digital image

Publications (2)

Publication Number Publication Date
CN104683774A true CN104683774A (en) 2015-06-03
CN104683774B CN104683774B (en) 2018-02-23

Family

ID=51951587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410582124.5A Active CN104683774B (en) 2013-11-27 2014-10-27 Techniques to reduce color artifacts in a digital image

Country Status (4)

Country Link
US (1) US9361537B2 (en)
EP (1) EP2879375A1 (en)
KR (1) KR101586954B1 (en)
CN (1) CN104683774B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019137396A1 (en) * 2018-01-12 2019-07-18 华为技术有限公司 Image processing method and device
CN112243116A (en) * 2020-09-30 2021-01-19 格科微电子(上海)有限公司 Multichannel LSC gain adjusting method and device, storage medium and image processing equipment

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9186909B1 (en) 2014-09-26 2015-11-17 Intel Corporation Method and system of lens shading color correction using block matching
GB201516173D0 (en) * 2015-09-14 2015-10-28 Apical Ltd Adaptive shading correction
US10929945B2 (en) * 2017-07-28 2021-02-23 Google Llc Image capture devices featuring intelligent use of lightweight hardware-generated statistics
US10542243B2 (en) 2018-04-10 2020-01-21 Intel Corporation Method and system of light source estimation for image processing
CN108881725B (en) * 2018-07-19 2020-10-20 长沙全度影像科技有限公司 Panoramic camera color shadow correction method based on non-equidistant radial symmetry model

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040257454A1 (en) * 2002-08-16 2004-12-23 Victor Pinto Techniques for modifying image field data
US20090322892A1 (en) * 2008-06-25 2009-12-31 Micron Technology, Inc. Method and apparatus for calibrating and correcting shading non-uniformity of camera systems
US20100110241A1 (en) * 2008-11-04 2010-05-06 Aptina Imaging Corporation Multi illuminant shading correction using singular value decomposition
US7834925B2 (en) * 2006-06-05 2010-11-16 Core Logic Inc. Lens shading correction device and method in image sensor
US20110149112A1 (en) * 2009-12-23 2011-06-23 Nokia Corporation Lens shading correction
KR20110072213A (en) * 2009-12-22 2011-06-29 엘지이노텍 주식회사 Image correction method
JP2013198041A (en) * 2012-03-22 2013-09-30 Samsung R&D Institute Japan Co Ltd Image processing device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008277926A (en) * 2007-04-25 2008-11-13 Kyocera Corp Image data processing method and imaging device using same
US8223229B2 (en) * 2009-07-02 2012-07-17 Nethra Imaging Inc Lens shading correction for autofocus and zoom lenses
KR101672944B1 (en) * 2009-12-14 2016-11-04 엘지이노텍 주식회사 Lens shading correction method in the auto focus camera module
US8593548B2 (en) * 2011-03-28 2013-11-26 Aptina Imaging Corporation Apparataus and method of automatic color shading removal in CMOS image sensors


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019137396A1 (en) * 2018-01-12 2019-07-18 华为技术有限公司 Image processing method and device
CN110033412A (en) * 2018-01-12 2019-07-19 华为技术有限公司 Image processing method and device
CN110033412B (en) * 2018-01-12 2023-12-15 华为技术有限公司 Image processing method and device
CN112243116A (en) * 2020-09-30 2021-01-19 格科微电子(上海)有限公司 Multichannel LSC gain adjusting method and device, storage medium and image processing equipment
CN112243116B (en) * 2020-09-30 2022-05-31 格科微电子(上海)有限公司 Method and device for adjusting multichannel lens shadow compensation LSC gain, storage medium and image processing equipment

Also Published As

Publication number Publication date
KR20150061564A (en) 2015-06-04
US20150146979A1 (en) 2015-05-28
KR101586954B1 (en) 2016-01-19
CN104683774B (en) 2018-02-23
EP2879375A1 (en) 2015-06-03
US9361537B2 (en) 2016-06-07

Similar Documents

Publication Publication Date Title
CN104683774A (en) Techniques to reduce color artifacts in a digital image
CN110163806B (en) Image processing method, device and storage medium
CN106101561B (en) Camera focusing detection method and device
CN106101547A (en) Image data processing method, apparatus and mobile terminal
US20200051225A1 (en) Fast Fourier Color Constancy
CN106027787B (en) White balance method for a mobile terminal, and mobile terminal
CN113132704B (en) Image processing method, device, terminal and storage medium
JP2016535353A (en) Object detection and segmentation method, apparatus, and computer program product
CN110462617B (en) Electronic device and method for authenticating biometric data with multiple cameras
CN108391060A (en) Image processing method, image processing apparatus and terminal
CN108259746B (en) Image color detection method and mobile terminal
CN109658330A (en) Hair color adjustment method and device
CN105991982A (en) Color matching for imaging systems
CN105469357A (en) Image processing method and device, and terminal
US11954789B2 (en) System and method for sparse distributed rendering
US20170171524A1 (en) Techniques for improving stereo block matching with the pyramid method
CN104782112B (en) Device, method, system and equipment for adjusting a video camera array
CN109522869A (en) Face image processing process, device, terminal device and computer storage medium
US20230360286A1 (en) Image processing method and apparatus, electronic device and storage medium
CN108282664A (en) Image processing method, device, system and computer readable storage medium
CN110097570A (en) Image processing method and device
US20160086377A1 (en) Determining an image target's suitability for color transfer in an augmented reality environment
US8340416B2 (en) Techniques for robust color transfer
CN107209947A (en) Using features at multiple scales for color transfer in augmented reality
CN109062644A (en) Method and apparatus for processing information for a terminal

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant