CN102200881B - Image processing apparatus and image processing method - Google Patents


Info

Publication number
CN102200881B
Authority
CN
China
Prior art keywords
unit, overlapping display, image, operating body, imaging
Legal status
Active
Application number
CN201110069689.XA
Other languages
Chinese (zh)
Other versions
CN102200881A
Inventor
Tatsuki Kashitani (柏谷辰起)
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority claimed from JP2010068269A (JP2011203823A)
Priority claimed from JP2010068270A (JP2011203824A)
Application filed by Sony Corp
Publication of CN102200881A
Application granted
Publication of CN102200881B


Abstract

The invention provides an image processing apparatus, an image processing method, and a program. The image processing apparatus includes: an overlapping display position determination unit that determines, based on an environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in an input image; an overlapping display image generation unit that generates an overlapping display image by arranging overlapping display data at the position of the object determined by the overlapping display position determination unit; an image superimposing unit that superimposes the overlapping display image on the field of view of a user; an operating body recognition unit that recognizes an operating body imaged in the input image; and a processing execution unit that executes a process corresponding to an item selected based on the position of the operating body recognized by the operating body recognition unit.

Description

Image processing apparatus and image processing method
Technical field
The present invention relates to an image processing apparatus and an image processing method.
Background art
In recent years, a technology called augmented reality (AR) has been attracting attention, in which an image obtained by imaging a real space is modified by specific processing and then presented to a user. In AR technology, for example, useful information about an object in the real space shown in an input image may be inserted into the image, and the result is output as an output image. That is, in AR technology, typically, a large part of the image presented to the user shows the real space, and some part of the image may be processed according to the purpose of the application. Such a characteristic contrasts with virtual reality, in which the whole (or a large part) of the output image is composed using computer graphics (CG). By using AR technology, advantages such as easier understanding of the situation of the real space by the user, and work support based on the output image, can be provided.
Furthermore, AR technology includes not only the technique of inserting useful information about an object in the real space into an image obtained by imaging the real space, but also the technique of presenting useful information about an object in the real space in a superimposed manner on the field of view of the user viewing the real space. In the latter technique, the useful information about the object in the real space is presented to the user by optically combining it with the field of view of the user viewing the real space, using a half mirror or the like. Also when this type of AR technology is used, advantages such as easier understanding of the situation of the real space by the user, and work support, can be provided.
In AR technology, in order to present truly useful information to the user, it is important that a computer accurately understands the situation of the real space. Therefore, technologies that serve as a basis of AR technology and aim at understanding the situation of the real space have been developed. For example, Japanese Patent Application Laid-Open No. 2008-304268 discloses a method of dynamically generating an environment map representing the three-dimensional positions of objects existing in the real space by applying a technology called simultaneous localization and mapping (SLAM), which is capable of simultaneously estimating the position and posture of a camera and the positions of feature points shown in an image of the camera. Note that the basic principle of the SLAM technology using a monocular camera is disclosed in "Real-Time Simultaneous Localization and Mapping with a Single Camera" (Andrew J. Davison, Proceedings of the 9th IEEE International Conference on Computer Vision, Volume 2, 2003, pp. 1403-1410).
Summary of the invention
Meanwhile, head-mounted displays (HMDs) are in widespread use as compact display devices worn on the head. As a technology using an HMD equipped with a camera, there is, for example, a technology in which an image captured by the camera is modified by an image processing apparatus using AR technology, and the modified image is displayed by the HMD so that the user can view it. Such a function of the HMD can be realized by, for example, a video see-through HMD. There is also a technology in which, using the image captured by the camera as a source, an additional information image is generated by the image processing apparatus using AR technology, and the generated additional information image is optically combined with the field of view using a half mirror or the like so that the user can view it. Such a function of the HMD can be realized by, for example, an optical see-through HMD. By using these technologies, the user can easily understand the situation of the real space, and work support based on the output image can be provided. As a device for inputting an operation that causes the image processing apparatus to execute processing, an input device such as a keyboard or a mouse can be assumed.
However, although an advantage of using an HMD is that the trouble of the user's input operation can be saved, when an input device such as a keyboard or a mouse is used as the device for inputting operations, the trouble of the input operation is imposed on the user after all, and the advantage of using the HMD is diminished.
Accordingly, it is desirable to provide a novel and improved image processing apparatus, image processing method, and program that facilitate the user's input operation in a configuration in which information is displayed by an HMD in a manner superimposed on the field of view of the user by AR technology.
According to an embodiment of the present invention, there is provided an image processing apparatus including: a feature data storage unit that stores feature data representing features of appearances of objects; an overlapping display data storage unit that stores overlapping display data, which is the source of an image to be superimposed on the field of view of a user, and item positions, which are the positions of the items forming the overlapping display data; an environment map generation unit that generates an environment map representing the positions of one or more objects existing in a real space, based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the feature data storage unit; an overlapping display position determination unit that determines, based on the environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in the input image; an overlapping display image generation unit that generates an overlapping display image by arranging the overlapping display data at the position of the object determined by the overlapping display position determination unit; an image superimposing unit that superimposes the overlapping display image on the field of view of the user; an operating body recognition unit that recognizes an operating body imaged in the input image; and a processing execution unit that executes a process corresponding to an item selected based on the position of the operating body recognized by the operating body recognition unit.
The operating body recognition unit may recognize a foot imaged in the input image as the operating body.

The operating body recognition unit may perform matching between a shoe registration image, which is an image of a shoe registered in advance, and the input image, and when the operating body recognition unit determines that a shoe matching the shoe registration image is imaged in the input image, the operating body recognition unit may recognize the shoe as the operating body.

When the user wears the imaging device on his/her head, the operating body recognition unit may determine whether a foot imaged in the input image has entered from the side closest to the user among the sides forming the input image, and when the operating body recognition unit determines that the foot has entered from the side closest to the user, the operating body recognition unit may recognize the foot as the operating body.

The operating body recognition unit may determine whether a shoe bearing a predetermined mark registered in advance is imaged in the input image, and when the operating body recognition unit determines that the marked shoe is imaged in the input image, the operating body recognition unit may recognize the shoe as the operating body.

The processing execution unit may determine whether a touch sensor attached to the foot has detected contact, and when the touch sensor has detected contact, the processing execution unit may execute the process corresponding to the item selected based on the position of the foot.

The processing execution unit may determine whether the operating body recognized by the operating body recognition unit has stayed at substantially the same position for a predetermined period, and when the processing execution unit determines that the operating body has stayed at substantially the same position for the predetermined period, the processing execution unit may execute the process corresponding to the item selected based on the position of the operating body.
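For illustration only, the dwell-time check described above can be sketched as follows. This is a minimal Python example; the position format, the radius, and the period are assumptions made for this sketch and are not specified by the embodiment.

```python
import math
import time

class DwellSelector:
    """Fires a selection when the operating body stays near one spot.

    Sketch of the "stayed at substantially the same position for a
    predetermined period" rule; radius/period values are assumptions.
    """

    def __init__(self, radius_px=20.0, dwell_sec=1.5):
        self.radius_px = radius_px
        self.dwell_sec = dwell_sec
        self.anchor = None        # position where the current dwell started
        self.anchor_time = None

    def update(self, pos, now=None):
        """Feed the operating-body position each frame.

        Returns the dwell position once the period elapses, else None.
        """
        now = time.monotonic() if now is None else now
        if self.anchor is None or math.dist(pos, self.anchor) > self.radius_px:
            # Moved too far: restart the dwell timer at the new position.
            self.anchor, self.anchor_time = pos, now
            return None
        if now - self.anchor_time >= self.dwell_sec:
            self.anchor, self.anchor_time = None, None  # re-arm for next dwell
            return pos
        return None
```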
The overlapping display data storage unit may store other overlapping display data to be displayed after the overlapping display data. When an item is selected by the processing execution unit, the overlapping display image generation unit may generate a new overlapping display image by further arranging the other overlapping display data, and the image superimposing unit may further superimpose the other overlapping display data onto the new overlapping display image.
The operating body recognition unit may recognize a hand imaged in the input image as the operating body.

The image superimposing unit may superimpose the overlapping display image on the field of view of the user by causing a display unit to display the overlapping display image.

When the image superimposing unit detects, by using a sensor that detects the degree of tilt of the imaging device, that the user wearing the imaging device on his/her head has tilted the head downward beyond a predetermined degree, the image superimposing unit may cause the display unit to display the overlapping display image; and when the image superimposing unit does not detect, by using the sensor, that the user has tilted the head downward beyond the predetermined degree, the image superimposing unit may restrict the display of the overlapping display image performed by the display unit.

The overlapping display position determination unit may determine the position of an object having a plane extending in a substantially horizontal direction as the position of the object having the predetermined plane or curved surface.

The image processing apparatus may further include a position estimation unit that estimates the position of a floor surface or a wall surface in the real space based on the positions of points on the surfaces of the objects represented by the environment map. The overlapping display position determination unit may determine the position of the object further based on the position of the floor surface or the wall surface in the real space estimated by the position estimation unit.

The feature data may include, for one or more points on the surface of each object, data indicating whether each point is likely to come in contact with a floor surface or a wall surface in the real space. The position estimation unit may estimate the position of the floor surface or the wall surface in the real space further based on the feature data.
Furthermore, according to another embodiment of the present invention, there is provided an image processing method performed by an image processing apparatus, the image processing apparatus including a feature data storage unit that stores feature data representing features of appearances of objects, an overlapping display data storage unit that stores overlapping display data, which is the source of an image to be superimposed on the field of view of a user, and item positions, which are the positions of the items forming the overlapping display data, an environment map generation unit, an overlapping display position determination unit, an overlapping display image generation unit, an image superimposing unit, an operating body recognition unit, and a processing execution unit, the image processing method including the steps of: generating, by the environment map generation unit, an environment map representing the positions of one or more objects existing in a real space, based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the feature data storage unit; determining, by the overlapping display position determination unit, based on the environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in the input image; generating, by the overlapping display image generation unit, an overlapping display image by arranging the overlapping display data at the position of the object determined by the overlapping display position determination unit; superimposing, by the image superimposing unit, the overlapping display image on the field of view of the user; recognizing, by the operating body recognition unit, an operating body imaged in the input image; and executing, by the processing execution unit, a process corresponding to an item selected based on the position of the operating body recognized by the operating body recognition unit.
Furthermore, according to another embodiment of the present invention, there is provided a program for causing a computer to function as an image processing apparatus including: a feature data storage unit that stores feature data representing features of appearances of objects; an overlapping display data storage unit that stores overlapping display data, which is the source of an image to be superimposed on the field of view of a user, and item positions, which are the positions of the items forming the overlapping display data; an environment map generation unit that generates an environment map representing the positions of one or more objects existing in a real space, based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the feature data storage unit; an overlapping display position determination unit that determines, based on the environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in the input image; an overlapping display image generation unit that generates an overlapping display image by arranging the overlapping display data at the position of the object determined by the overlapping display position determination unit; an image superimposing unit that superimposes the overlapping display image on the field of view of the user; an operating body recognition unit that recognizes an operating body imaged in the input image; and a processing execution unit that executes a process corresponding to an item selected based on the position of the operating body recognized by the operating body recognition unit.
According to the image processing apparatus, the image processing method, and the program of the embodiments described above, the user's input operation can be facilitated in a configuration in which information is displayed by an HMD in a manner superimposed on the field of view of the user by AR technology.
Meanwhile, head-mounted displays (HMDs) are in widespread use as compact display devices worn on the head. As a technology using an HMD equipped with a camera, there is, for example, a technology in which an image captured by the camera is modified by an image processing apparatus using AR technology, and the modified image is displayed by the HMD so that the user can view it. Such a function of the HMD can be realized by, for example, a video see-through HMD. There is also a technology in which, using the image captured by the camera as a source, an additional information image is generated by the image processing apparatus using AR technology, and the generated additional information image is optically combined with the field of view using a half mirror or the like so that the user can view it. Such a function of the HMD can be realized by, for example, an optical see-through HMD. By using these technologies, the user can easily understand the situation of the real space, and work support based on the input image can be provided. As a processing technology for a captured image, there is, for example, a technology of superimposing information about a building on the building imaged in the captured image.
Here, an object such as a building usually appears near the front of the captured image, and for this reason, information about the object, such as information about a building, is usually displayed in a superimposed manner near the front of the output image displayed on the HMD. Therefore, when data that is not directly related to the objects imaged in the captured image (for example, a menu, an advertisement, a schedule, or a memo; hereinafter referred to as "overlapping display data") is displayed in a superimposed manner near the front of the captured image, the overlapping display data may overlap with, or come close to, the information related to the objects imaged in the captured image. When this happens, there arises a problem that it becomes very difficult for the user to view the superimposed image obtained by superimposing the overlapping display data on the captured image.
Accordingly, it is desirable to provide a novel and improved image processing apparatus, image processing method, and program that make it easy to view the superimposed image obtained by superimposing overlapping display data on the field of view of the user.
According to an embodiment of the present invention, there is provided an image processing apparatus including: a feature data storage unit that stores feature data representing features of appearances of objects; an overlapping display data storage unit that stores overlapping display data, which is the source of an image to be superimposed on the field of view of a user; an environment map generation unit that generates an environment map representing the positions of one or more objects existing in a real space, based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the feature data storage unit; an overlapping display position determination unit that determines, based on the environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in the input image; an overlapping display image generation unit that generates an overlapping display image by arranging the overlapping display data at the position of the object determined by the overlapping display position determination unit; and an image superimposing unit that superimposes the overlapping display image on the field of view of the user by causing a display unit to display the overlapping display image.
When the image superimposing unit detects, by using a sensor that detects the degree of tilt of the imaging device, that the user wearing the imaging device on his/her head has tilted the head downward beyond a predetermined degree, the image superimposing unit may cause the display unit to display the overlapping display image; and when the image superimposing unit does not detect, by using the sensor, that the user has tilted the head downward beyond the predetermined degree, the image superimposing unit may restrict the display of the overlapping display image performed by the display unit.

The image processing apparatus may further include a self-position detection unit that dynamically detects the position of the image processing apparatus based on the input image and the feature data. When the distance from the position of the image processing apparatus detected by the self-position detection unit to the position of the object determined by the overlapping display position determination unit exceeds a predetermined value, the image superimposing unit may restrict the display of the overlapping display image performed by the display unit.

The overlapping display position determination unit may determine the position of an object having a plane extending in a substantially horizontal direction as the position of the object having the predetermined plane or curved surface.

The overlapping display position determination unit may determine the position of at least one of a floor surface, a table top, and stairs as the position of the object having the plane extending in the substantially horizontal direction.

When the overlapping display image generation unit detects, by using a sensor that detects the rotation of the imaging device, that the user wearing the imaging device on his/her head has rotated the head in a substantially horizontal direction, the overlapping display image generation unit may move the overlapping display data arranged in the overlapping display image by changing the arrangement position of the overlapping display data according to the degree of the rotation.

The image processing apparatus may further include a position estimation unit that estimates the position of a floor surface or a wall surface in the real space based on the positions of points on the surfaces of the objects represented by the environment map. The overlapping display position determination unit may determine the position of the object further based on the position of the floor surface or the wall surface in the real space estimated by the position estimation unit.

The feature data may include, for one or more points on the surface of each object, data indicating whether each point is likely to come in contact with a floor surface or a wall surface in the real space. The position estimation unit may estimate the position of the floor surface or the wall surface in the real space further based on the feature data.
Furthermore, according to another embodiment of the present invention, there is provided an image processing method performed by an image processing apparatus, the image processing apparatus including a feature data storage unit that stores feature data representing features of appearances of objects, an overlapping display data storage unit that stores overlapping display data which is the source of an image to be superimposed on the field of view of a user, an environment map generation unit, an overlapping display position determination unit, an overlapping display image generation unit, and an image superimposing unit, the image processing method including the steps of: generating, by the environment map generation unit, an environment map representing the positions of one or more objects existing in a real space, based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the feature data storage unit; determining, by the overlapping display position determination unit, based on the environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in the input image; generating, by the overlapping display image generation unit, an overlapping display image by arranging the overlapping display data at the position of the object determined by the overlapping display position determination unit; and superimposing, by the image superimposing unit, the overlapping display image on the field of view of the user by causing a display unit to display the overlapping display image.
Furthermore, according to another embodiment of the present invention, there is provided a program for causing a computer to function as an image processing apparatus including: a feature data storage unit that stores feature data representing features of appearances of objects; an overlapping display data storage unit that stores overlapping display data, which is the source of an image to be superimposed on the field of view of a user; an environment map generation unit that generates an environment map representing the positions of one or more objects existing in a real space, based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the feature data storage unit; an overlapping display position determination unit that determines, based on the environment map, the position of an object having a predetermined plane or curved surface from among the objects imaged in the input image; an overlapping display image generation unit that generates an overlapping display image by arranging the overlapping display data at the position of the object determined by the overlapping display position determination unit; and an image superimposing unit that superimposes the overlapping display image on the field of view of the user and causes a display unit to display the overlapping display image.
According to the image processing apparatus, the image processing method, and the program of the embodiments described above, it is possible to make it easy to view the superimposed image obtained by superimposing overlapping display data on the field of view of the user.
Brief description of the drawings
Fig. 1 is a schematic diagram illustrating an image processing apparatus according to an embodiment;
Fig. 2 is an explanatory diagram illustrating an example of an input image for image processing according to an embodiment;
Fig. 3 is a block diagram illustrating a configuration example of an image processing apparatus according to a first embodiment;
Fig. 4 is a flowchart illustrating an example of the flow of a self-position detection process according to the first embodiment;
Fig. 5 is an explanatory diagram illustrating feature points set on an object;
Fig. 6 is an explanatory diagram illustrating the addition of feature points;
Fig. 7 is an explanatory diagram illustrating an example of a prediction model;
Fig. 8 is an explanatory diagram illustrating a configuration example of feature data;
Fig. 9 is a flowchart illustrating an example of the flow of an object recognition process according to the first embodiment;
Fig. 10A is a diagram illustrating an example of an output image generated by the image processing apparatus while the user wearing the imaging device faces forward;
Fig. 10B is a diagram illustrating an example of an output image generated when the user looks down;
Fig. 10C is a diagram illustrating an example of an output image generated when an item of a first hierarchy is selected;
Fig. 10D is a diagram illustrating an example of an output image immediately before an item of a second hierarchy is selected;
Fig. 10E is a diagram illustrating an example of an output image generated when an item of the second hierarchy is selected;
Fig. 11 is a diagram illustrating an example of data stored in an overlapping display data storage unit according to the first embodiment;
Fig. 12 is a flowchart illustrating an example of the flow of an output image generation process according to the first embodiment;
Fig. 13 is a flowchart illustrating an example of the flow of an item selection process according to the first embodiment;
Fig. 14 is a block diagram illustrating an example of the configuration of an image processing apparatus according to a second embodiment;
Fig. 15 is an explanatory diagram illustrating another example of the configuration of feature data;
Fig. 16 is an explanatory diagram illustrating an example of polygons related to the feature data shown in Fig. 15; and
Fig. 17 is a block diagram illustrating an example of the hardware configuration of a general-purpose computer.
Detailed description of the embodiments
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, structural elements having substantially the same function and structure are denoted by the same reference numerals, and repeated description of these structural elements is omitted.
The description will be given in the following order.
1. Overview of an image processing apparatus according to an embodiment
2. First embodiment
2-1. Imaging unit
2-2. Environment map generation unit
2-3. Output image generation unit
2-4. Summary of the first embodiment
3. Second embodiment
3-1. Environment map generation unit
3-2. Output image generation unit
3-3. Summary of the second embodiment
4. Hardware configuration
<1. Overview of an image processing apparatus according to an embodiment>
Fig. 1 is a schematic diagram illustrating an image processing apparatus according to an embodiment of the present invention. Fig. 1 shows an environment 1 according to an embodiment of the present invention, in which a user wearing an image processing apparatus 100 on his/her head is present.
Referring to Fig. 1, three objects Obj01, Obj02, and Obj03, wall surfaces W01 and W02, and a floor surface F0 exist in the environment 1. The object Obj01 is located in the corner between the wall surfaces W01 and W02. Furthermore, along the wall surface W01, the object Obj02 is placed next to the object Obj01, and the object Obj03 is placed next to the object Obj02. When the environment 1 is a room in a house, for example, the objects Obj01, Obj02, and Obj03 correspond to pieces of furniture such as chests of drawers.
The image processing apparatus 100 images the environment 1, which is a real space, and performs image processing according to the present embodiment described later. Although Fig. 1 shows, as an example of the image processing apparatus 100, an apparatus equipped with a camera mounted on the head of the user, which modifies the image captured by the camera and outputs the image to a display device D such as a head-mounted display (HMD), the image processing apparatus 100 is not limited to such an example. For example, the image processing apparatus 100 may be an information processing apparatus capable of obtaining an image from an imaging device such as a video camera, for example a personal computer (PC), a mobile terminal, or a digital home appliance. Furthermore, the camera does not necessarily have to be mounted on the head of the user. That is, although the camera has to be carried by the user, the parts of the image processing apparatus 100 other than the camera do not have to be carried by the user as shown in Fig. 1. Moreover, the environment 1 is not limited to the example shown in Fig. 1, and may be an indoor environment or an outdoor environment.
Fig. 2 shows an input image Im01 as an example of an image captured by the image processing apparatus 100 in the environment 1 of Fig. 1. The three objects Obj01, Obj02, and Obj03, the wall surfaces W01 and W02, and the floor surface F0 shown in Fig. 1 appear in the input image Im01. The image processing apparatus 100 obtains such an input image, and generates an output image by superimposing on the input image data that is not directly related to the objects imaged in the input image (hereinafter referred to as "overlapping display data"). As examples of the overlapping display data, data such as a menu, an advertisement, a schedule, or a memo can be assumed.
An object to be imaged, such as a building, usually appears near the front of the input image, and for this reason, information about the object, such as information about a building, is usually displayed in a superimposed manner near the front of the output image displayed on the display device D. Therefore, when the overlapping display data is displayed in a superimposed manner near the front of the captured image, the overlapping display data may overlap with, or come close to, the information related to the objects imaged in the captured image. When this happens, it is difficult for the user to view the output image obtained by superimposing the overlapping display data on the input image. In this specification, the following technology will therefore be described in detail: in order to make it easier to view the output image obtained by superimposing the overlapping display data on the input image, the position of the floor surface is recognized from the input image, and the overlapping display data is superimposed at the position of the floor surface in the input image. Since a floor surface exists almost everywhere, and information related to objects is unlikely to be superimposed on the floor surface, the difficulty in viewing the output image can be reduced.
Furthermore, as described above, an advantage of using an HMD is that the trouble of the user's input operation can be saved; however, when an input device such as a keyboard or a mouse is used as the device for inputting operations, the trouble of the input operation is imposed on the user, and the advantage of using the HMD is diminished. In this specification, a technology that allows the user to easily select a process that the user wants the image processing apparatus 100 to execute will therefore also be described in detail: when the user positions his/her foot on the position of a desired item while viewing the overlapping display data, the image processing apparatus 100 executes, based on the input image, the process corresponding to the item on which the foot is positioned. According to this technology, the trouble of the user's input operation can be saved.
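For illustration only, the selection of an item by foot position can be sketched as a simple screen-space hit test. The item names, rectangles, and coordinate convention below are hypothetical; in the embodiment the foot position is obtained from the input image by the operating body recognition described later.

```python
def select_item(foot_pos, items):
    """Return the item whose screen-space rectangle contains the foot.

    `items` maps an item name to (x, y, w, h) in overlay-image pixels;
    the layout is illustrative, not taken from the embodiment.
    """
    fx, fy = foot_pos
    for name, (x, y, w, h) in items.items():
        if x <= fx < x + w and y <= fy < y + h:
            return name
    return None

# Hypothetical two-item menu laid out on the floor region of the image.
menu = {"Mail": (100, 400, 120, 60), "Map": (240, 400, 120, 60)}
print(select_item((150, 430), menu))  # -> "Mail"
```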
Moreover, this technology can also be applied to an image processing apparatus 100 that displays the overlapping display data in a manner superimposed on the real space by optically combining the overlapping display data with the field of view of the user. It goes without saying that, in this case as well, the effect that it becomes easier for the user to view the combined image, and the effect that the trouble of the user's input operation can be saved, are achieved. Hereinafter, the description will be given taking as an example an image processing apparatus that displays an output image obtained by superimposing the overlapping display data on the input image.
<2. First embodiment>
Fig. 3 is a block diagram illustrating a configuration example of the image processing apparatus 100 according to the first embodiment. Referring to Fig. 3, the image processing apparatus 100 includes an imaging unit 102, an environment map generation unit 110, and an output image generation unit 180.
[2-1. Imaging unit]
The imaging unit 102 may be realized, for example, as an imaging device having an imaging element such as a charge-coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor. Although the imaging unit 102 forms part of the image processing apparatus 100 in the present embodiment, the imaging unit 102 may be provided outside the image processing apparatus 100. The imaging unit 102 outputs an image generated by imaging the real space (for example, the environment 1 shown in Fig. 1) to the environment map generation unit 110 and the output image generation unit 180 as the input image.
[2-2. Environment map generation unit]
The environment map generation unit 110 generates an environment map representing the positions and the like of one or more objects existing in the real space, based on the input image input from the imaging unit 102 and feature data of objects, described later, stored in a feature data storage unit 130. As shown in Fig. 3, in the present embodiment, the environment map generation unit 110 includes a self-position detection unit 120, the feature data storage unit 130, an image recognition unit 140, an environment map building unit 150, and an environment map storage unit 152.
(1) Self-position detection unit
The self-position detection unit 120 dynamically detects the position of the imaging device that captured the input image, based on the input image input from the imaging unit 102 and the feature data stored in the feature data storage unit 130. For example, even when the imaging device has only a monocular camera, the self-position detection unit 120 can dynamically determine, for each frame, the position and posture of the camera and the positions of feature points on the imaging plane of the camera by applying the SLAM technology described in "Real-Time Simultaneous Localization and Mapping with a Single Camera" (Andrew J. Davison, Proceedings of the 9th IEEE International Conference on Computer Vision, Volume 2, 2003, pp. 1403-1410).
First, the overall flow of the self-position detection process by the self-position detection unit 120 applying the SLAM technology will be described with reference to Fig. 4. Next, the self-position detection process will be described in detail with reference to Figs. 5 to 7.
Fig. 4 is a flowchart illustrating an example of the flow of the self-position detection process by the self-position detection unit 120 applying the SLAM technology. In Fig. 4, when the self-position detection process starts, the self-position detection unit 120 first initializes a state variable (step S102). In the present embodiment, the state variable is a vector including, as elements, the position and posture (rotation angle) of the camera, the moving speed and angular velocity of the camera, and the positions of one or more feature points. The self-position detection unit 120 then sequentially obtains input images from the imaging unit 102 (step S112). The processes from step S112 to step S118 may be repeated for each input image (that is, for each frame).
In step S114, the self-position detection unit 120 tracks the feature points appearing in the input image. For example, the self-position detection unit 120 detects, from the input image, a patch (for example, a small image of 3×3 = 9 pixels around a feature point) of each feature point stored in advance in the feature data storage unit 130. The positions of the patches detected here, that is, the positions of the feature points, are used later when the state variable is updated.
In step S116, the self-position detection unit 120 generates a predicted value of the state variable for the next frame, for example based on a predetermined prediction model. Furthermore, in step S118, the self-position detection unit 120 updates the state variable using the predicted value of the state variable generated in step S116 and observed values according to the positions of the feature points detected in step S114. The self-position detection unit 120 performs the processes in steps S116 and S118 based on the principle of an extended Kalman filter.
As a result of this processing, the value of the state variable updated for each frame is output. The contents of each of the processes of tracking the feature points (step S114), predicting the state variable (step S116), and updating the state variable (step S118) will be described more specifically below.
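For illustration only, the per-frame flow of Fig. 4 might be organized as follows in Python. The state-vector layout follows equations (1) to (3) below; the `track`, `predict`, and `update` callables stand in for the patch matching, the prediction, and the extended-Kalman update described in the following subsections, and their signatures are assumptions of this sketch.

```python
import numpy as np

def init_state(num_feature_points):
    """S102: state vector = camera position (3), posture quaternion (4),
    moving speed (3), angular velocity (3), then one 3-D position per
    feature point. The initial values here are illustrative."""
    x = np.zeros(3)                    # camera position in global coords
    omega = np.array([1.0, 0, 0, 0])   # identity quaternion for the posture
    v = np.zeros(3)                    # translational speed
    w = np.zeros(3)                    # angular velocity
    points = np.zeros(3 * num_feature_points)
    return np.concatenate([x, omega, v, w, points])

def self_position_loop(frames, track, predict, update):
    """Steps S112-S118 of Fig. 4 as a sketch."""
    state = init_state(num_feature_points=6)       # S102: initialize
    for frame in frames:                           # S112: acquire input image
        observed = track(frame)                    # S114: track feature points
        predicted = predict(state)                 # S116: predict state
        state = update(predicted, observed)        # S118: EKF update
        yield state                                # camera pose + point positions
```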
(1-1) Tracking of feature points
In the present embodiment, the feature data storage unit 130 stores in advance feature data representing features of objects corresponding to physical objects that may exist in the real space. The feature data includes, for example, small images, that is, patches, related to one or more feature points each representing a feature of the appearance of each object. A patch may be, for example, a small image composed of 3×3 = 9 pixels around a feature point.
Fig. 5 shows two examples of objects as well as examples of feature points (FPs) and patches set on each object. The object on the left side of Fig. 5 is an object representing a chest of drawers (see Fig. 6a). A plurality of feature points including a feature point FP1 are set on the object, and a patch Pth1 is defined for the feature point FP1. The object on the right side of Fig. 5 is an object representing a calendar (see Fig. 6b). A plurality of feature points including a feature point FP2 are set on the object, and a patch Pth2 is defined for the feature point FP2.
Upon obtaining an input image from the imaging unit 102, the self-position detection unit 120 matches partial images included in the input image against the patches for each feature point shown in Fig. 6, which are stored in advance in the feature data storage unit 130. The self-position detection unit 120 then specifies, as the matching result, the position of each feature point included in the input image (for example, the position of the center pixel of the detected patch).
It should be noted that, in order to track the feature points (step S114 in Fig. 4), the data on all of the feature points to be tracked does not have to be stored in the feature data storage unit 130 in advance. For example, in the example of Fig. 6, six feature points are detected in the input image at time T = t−1 (see Fig. 7a). Next, when the position or posture of the camera changes at time T = t, only two of the six feature points that appeared in the input image at time T = t−1 still appear in the input image. In this case, the self-position detection unit 120 may set new feature points at positions where characteristic pixel patterns exist in the input image, and use the new feature points in the self-position detection process for subsequent frames. For example, in the example of Fig. 6, four new feature points are set on the object at time T = t (see Fig. 7b). This is a characteristic of the SLAM technology, according to which the cost of setting all of the feature points in advance can be reduced, and the accuracy of the processing can be improved using the increased number of feature points.
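For illustration only, the patch matching and the re-seeding of feature points might be sketched with OpenCV as follows. The embodiment does not prescribe a particular matching method or corner detector, so the normalized cross-correlation, the detector, and the thresholds here are assumptions.

```python
import cv2
import numpy as np

def match_patch(gray_image, patch, threshold=0.8):
    """Locate one stored feature-point patch in the input image.

    Normalized cross-correlation stands in for the matching step;
    the method and threshold are assumptions of this sketch.
    """
    result = cv2.matchTemplate(gray_image, patch, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None                       # feature point left the view
    cx = max_loc[0] + patch.shape[1] // 2
    cy = max_loc[1] + patch.shape[0] // 2
    return (cx, cy)                       # center pixel of the found patch

def reseed_points(gray_image, max_new=4):
    """Set new feature points where characteristic pixel patterns exist,
    as done when tracked points leave the frame (the corner detector
    used here is an assumption)."""
    corners = cv2.goodFeaturesToTrack(gray_image, maxCorners=max_new,
                                      qualityLevel=0.01, minDistance=10)
    return [] if corners is None else [tuple(c.ravel()) for c in corners]
```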
(1-2) Prediction of the state variable
In the present embodiment, the self-position detection unit 120 uses a state variable X expressed by the following equation as the state variable to which the extended Kalman filter is applied.
[equation 1]

X = \begin{pmatrix} x \\ \omega \\ \dot{x} \\ \dot{\omega} \\ p_1 \\ \vdots \\ p_N \end{pmatrix} \qquad (1)
The first element of the state variable X in equation (1) represents the three-dimensional position of the camera in the global coordinate system (x, y, z), a coordinate system set in the real space, as expressed by the following equation.
[equation 2]

x = \begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} \qquad (2)
The second element of the state variable is a four-dimensional vector ω having, as elements, the quaternion corresponding to the rotation matrix representing the posture of the camera. Note that Euler angles may be used instead of a quaternion to represent the posture of the camera. The third and fourth elements of the state variable represent the moving speed and the angular velocity of the camera, respectively.
Furthermore, the fifth and subsequent elements of the state variable represent the three-dimensional position p_i of each feature point FP_i (i = 1 ... N) in the global coordinate system, as expressed by the following equation. Note that, as described above, the number N of feature points may change during the processing.
[equation 3]

p_i = \begin{pmatrix} x_i \\ y_i \\ z_i \end{pmatrix} \qquad (3)
The self-position detection unit 120 generates the predicted value of the state variable for the latest frame based on the value of the state variable X initialized in step S102 or the value of the state variable X updated in the previous frame. The predicted value of the state variable is generated according to the state equation of the extended Kalman filter with multidimensional normal distribution, as shown in the following equation.
[equation 4]

\text{predicted state variable: } \hat{X} = F(X, a) + w \qquad (4)
Here, F represents the prediction model related to the state transition of the system, and a represents a prediction condition. Furthermore, w represents Gaussian noise, and may include, for example, a model approximation error and an observation error. The mean of the Gaussian noise w is generally zero.
Fig. 6 is an explanatory diagram for describing an example of the prediction model according to the present embodiment. Referring to Fig. 6, two prediction conditions in the prediction model according to the present embodiment are illustrated. First, as the first condition, it is assumed that the three-dimensional position of a feature point in the global coordinate system does not change. That is, assuming that the three-dimensional position of the feature point FP1 at time T is p_t, the following relationship holds.
[equation 5]

p_t = p_{t-1} \qquad (5)
Next, as the second condition, it is assumed that the motion of the camera is uniform motion. That is, the following relationships hold for the speed and the angular velocity of the camera from time T = t−1 to time T = t.
[equation 6]

\dot{x}_t = \dot{x}_{t-1} \qquad (6)

\dot{\omega}_t = \dot{\omega}_{t-1} \qquad (7)
The self-position detection unit 120 generates the predicted value of the state variable for the latest frame based on this prediction model and the state equation expressed in equation (4).
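For illustration only, a prediction step under the two conditions above might look as follows in Python, using the state layout of the earlier sketch; the frame interval and noise level are assumptions, and the quaternion update is omitted for brevity.

```python
import numpy as np

def predict_state(state, dt=1.0 / 30, noise_std=0.0):
    """Predict the next state under conditions (5)-(7): feature-point
    positions stay fixed, and the camera moves uniformly. The layout
    follows init_state() above; dt and noise_std are illustrative."""
    pred = state.copy()
    x, v = state[0:3], state[7:10]
    pred[0:3] = x + v * dt            # position advances at constant speed
    # The quaternion update from the (constant) angular velocity is
    # omitted here; speeds and feature points are carried over as-is,
    # matching equations (5)-(7).
    if noise_std > 0:
        pred += np.random.normal(0.0, noise_std, size=pred.shape)  # w in (4)
    return pred
```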
(1-3) Updating of the state variable
The self-position detection unit 120 then uses an observation equation to evaluate, for example, the error between the observation information predicted from the predicted value of the state variable and the actual observation information obtained as a result of the feature point tracking. Note that v in equation (8) is the error.
[equation 7]

\text{observation information: } s = H(X) + v \qquad (8)

\text{predicted observation information: } \hat{s} = H(\hat{X}) \qquad (9)
Here, H represents an observation model. For example, the position of a feature point FP_i on the imaging plane (u-v plane) is defined as expressed by the following equation.
[equation 8]

\text{position of } \mathrm{FP}_i \text{ on the imaging plane: } \tilde{p}_i = \begin{pmatrix} u_i \\ v_i \\ 1 \end{pmatrix} \qquad (10)
Here, the position x of the camera, the posture ω of the camera, and the three-dimensional positions p_i of the feature points FP_i are all given as elements of the state variable X. The position of the feature point FP_i on the imaging plane is then derived using the following equation according to a pinhole model.
[equation 9]

\lambda \tilde{p}_i = A \, R_\omega (p_i - x) \qquad (11)
Here, λ represents a normalization parameter, A represents the camera internal parameter matrix, and R_ω represents the rotation matrix corresponding to the quaternion ω representing the posture of the camera included in the state variable X. The camera internal parameter matrix A is given in advance, according to the characteristics of the imaging device that captures the input image, as expressed by the following equation.
[equation 10]

A = \begin{pmatrix} -f \cdot k_u & f \cdot k_u \cdot \cot\theta & u_O \\ 0 & -\dfrac{f \cdot k_v}{\sin\theta} & v_O \\ 0 & 0 & 1 \end{pmatrix} \qquad (12)
Here, f represents the focal length, θ represents the orthogonality of the image axes (ideally 90 degrees), k_u represents the scale along the vertical axis of the imaging plane (the rate of change of scale from the global coordinate system to the coordinate system of the imaging plane), k_v represents the scale along the horizontal axis of the imaging plane, and (u_O, v_O) represents the center position of the imaging plane.
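For illustration only, equations (11) and (12) can be written directly in code. The functions below are a sketch; the parameter names mirror the symbols above, and R_omega is assumed to be given as a 3×3 rotation matrix.

```python
import numpy as np

def intrinsic_matrix(f, ku, kv, theta, u0, v0):
    """Camera internal parameter matrix A of equation (12)."""
    return np.array([[-f * ku, f * ku / np.tan(theta), u0],
                     [0.0, -f * kv / np.sin(theta), v0],
                     [0.0, 0.0, 1.0]])

def project(p, cam_x, R_omega, A):
    """Equation (11): project the global point p onto the imaging plane.

    Returns (u, v) after dividing out the normalization parameter λ,
    which appears as the third component of A R_ω (p − x).
    """
    q = A @ (R_omega @ (p - cam_x))
    return q[:2] / q[2]
```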
A feasible latest state variable X can therefore be obtained by searching for the state variable X that minimizes the error between the predicted observation information derived using equation (11), that is, the positions of the respective feature points on the imaging plane, and the result of the feature point tracking in step S114 of Fig. 4.
[equation 11]

\text{latest state variable: } X \leftarrow \hat{X} + \mathrm{Innov}(s - \hat{s}) \qquad (13)
The self-position detection unit 120 outputs the position x and the posture ω of the camera (imaging device), dynamically updated by applying the SLAM technology in this manner, to the environment map building unit 150 and the output image generation unit 180.
(2) Feature data storage unit
The feature data storage unit 130 stores in advance, using a storage medium such as a hard disk or a semiconductor memory, feature data representing features of objects corresponding to physical objects that may exist in the real space. Although Fig. 3 shows an example in which the feature data storage unit 130 is part of the environment map generation unit 110, the feature data storage unit 130 is not limited to such an example and may be provided outside the environment map generation unit 110. Fig. 8 is an explanatory diagram illustrating a configuration example of the feature data.
Referring to Fig. 8, feature data FD1 is shown as an example for an object Obj1. The feature data FD1 includes an object name FD11, image data FD12 captured from six directions, patch data FD13, three-dimensional shape data FD14, and ontology data FD15.
The object name FD11 is a name by which the corresponding object can be specified, such as "coffee cup A".
The image data FD12 includes, for example, six pieces of image data obtained by capturing images of the corresponding object from six directions (front, back, left, right, above, and below). The patch data FD13 is a set of small images, one for each of the one or more feature points set on each object, each centered on the corresponding feature point. The image data FD12 and the patch data FD13 may be used in the object recognition process by the image recognition unit 140 described later. Furthermore, the patch data FD13 may be used in the above-described self-position detection process by the self-position detection unit 120.
The three-dimensional shape data FD14 includes polygon information for recognizing the shape of the corresponding object and three-dimensional position information of the feature points. The three-dimensional shape data FD14 may be used in the environment map building process by the environment map building unit 150 described later, and in a CG image generation process for each object.
The ontology data FD15 is data that may be used, for example, to assist the environment map building process by the environment map building unit 150. In the example shown in Fig. 8, the ontology data FD15 indicates that the object Obj1, which is a coffee cup, is likely to come in contact with objects corresponding to a table or a dishwasher, and is unlikely to come in contact with an object corresponding to a bookshelf.
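For illustration only, the feature data FD1 might be represented by a simple record type such as the following; the field names are illustrative renderings of FD11 to FD15, not a format defined by the embodiment.

```python
from dataclasses import dataclass, field

@dataclass
class FeatureData:
    """One object's feature data (FD1 in Fig. 8), sketched as a record."""
    object_name: str                               # FD11, e.g. "coffee cup A"
    images_six_directions: list = field(default_factory=list)  # FD12
    patches: dict = field(default_factory=dict)    # FD13: feature id -> patch
    shape_3d: dict = field(default_factory=dict)   # FD14: polygons + 3-D points
    ontology: dict = field(default_factory=dict)   # FD15: likely contacts

# Hypothetical entry matching the coffee-cup example of Fig. 8.
cup = FeatureData(
    object_name="coffee cup A",
    ontology={"table": True, "dishwasher": True, "bookshelf": False},
)
```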
(3) Image recognition unit
The image recognition unit 140 specifies correspondences between the physical objects appearing in the input image and the objects, using the above-described feature data stored in the feature data storage unit 130.
Fig. 9 is a flowchart illustrating an example of the flow of the object recognition process by the image recognition unit 140. Referring to Fig. 9, the image recognition unit 140 first obtains an input image from the imaging unit 102 (step S212). Next, the image recognition unit 140 matches partial images included in the input image against the patches of the one or more feature points of each object included in the feature data, to extract the feature points included in the input image (step S214). It should be noted that the feature points used in the object recognition process by the image recognition unit 140 are not necessarily the same as the feature points used in the self-position detection process by the self-position detection unit 120. However, when common feature points are used in both processes, the image recognition unit 140 may reuse the result of the feature point tracking by the self-position detection unit 120.
Next, the image recognition unit 140 specifies the objects appearing in the input image based on the result of the feature point extraction (step S216). For example, when the feature points belonging to one object are extracted with high density in a certain area, the image recognition unit 140 may recognize that the object is present in that area. The image recognition unit 140 then outputs the object names (or identifiers) of the specified objects and the positions on the imaging plane of the feature points belonging to those objects to the environment map building unit 150 (step S218).
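For illustration only, step S216 might be sketched as follows; using a minimum point count per object stands in for the "extracted with high density" criterion and is an assumption of this sketch.

```python
def recognize_objects(matched_points, min_points=5):
    """Decide which objects appear from the extracted feature points.

    `matched_points` maps an object name to the list of (u, v) positions
    found by patch matching; objects with enough matched points are
    treated as present and passed on to the map building unit.
    """
    recognized = {}
    for name, points in matched_points.items():
        if len(points) >= min_points:
            recognized[name] = points
    return recognized

# Hypothetical matching result: the drawer is recognized, the calendar is not.
print(recognize_objects({"drawer": [(10, 20)] * 6, "calendar": [(50, 60)]}))
```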
(4) Environment map building unit
The environment map building unit 150 builds an environment map using the position and posture of the camera input from the self-position detection unit 120, the positions of the feature points on the imaging plane input from the image recognition unit 140, and the feature data stored in the feature data storage unit 130. In this specification, the environment map is a set of data representing the positions (and postures) of one or more objects existing in the real space. The environment map may include, for example, the object names of the objects, the three-dimensional positions of the feature points belonging to the objects, and the polygon information forming the shapes of the objects. The environment map may be built, for example, by obtaining the three-dimensional position of each feature point from its position on the imaging plane input from the image recognition unit 140, according to the above-described pinhole model.
By transforming the relational equation of the pinhole model expressed in equation (11), the three-dimensional position p_i of the feature point FP_i in the global coordinate system is obtained by the following equation.
[equation 12]

p_i = x + \lambda \cdot R_\omega^T A^{-1} \tilde{p}_i = x + d \cdot \frac{R_\omega^T A^{-1} \tilde{p}_i}{\lVert A^{-1} \tilde{p}_i \rVert} \qquad (14)
Here, d represents the distance between the camera and each feature point in the global coordinate system. The environment map building unit 150 can calculate this distance d based on the positions of at least four feature points on the imaging plane and the distances between those feature points for each object. The distances between the feature points are stored in advance in the feature data storage unit 130 as the three-dimensional shape data FD14 included in the feature data described with reference to Fig. 8. It should be noted that the process of calculating the distance d in equation (14) is disclosed in detail in Japanese Patent Application Laid-Open No. 2008-304268.
After the distance d is calculated, the remaining variables on the right side of equation (14) are the position and posture of the camera input from the self-position detection unit 120 and the positions of the feature points on the imaging plane input from the image recognition unit 140, all of which are known. The environment map building unit 150 then calculates, according to equation (14), the three-dimensional position in the global coordinate system of each feature point input from the image recognition unit 140. The environment map building unit 150 builds the latest environment map according to the calculated three-dimensional positions of the feature points, and causes the environment map storage unit 152 to store the built environment map. It should be noted that, at this time, the environment map building unit 150 may improve the accuracy of the data of the environment map using the ontology data FD15 included in the feature data described with reference to Fig. 8.
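For illustration only, equation (14) translates directly into code once the distance d is known. The function below is a sketch; since a rotation preserves the norm, ‖A⁻¹p̃‖ in equation (14) equals the norm of the rotated ray used here.

```python
import numpy as np

def feature_point_3d(p_tilde, d, cam_x, R_omega, A):
    """Equation (14): 3-D position of a feature point in global coords.

    p_tilde is the homogeneous image position (u, v, 1); d is the
    camera-to-point distance derived from the inter-feature-point
    distances stored as three-dimensional shape data FD14.
    """
    ray = R_omega.T @ (np.linalg.inv(A) @ p_tilde)  # R_ω^T A^{-1} p̃_i
    return cam_x + d * ray / np.linalg.norm(ray)    # rotation keeps the norm
```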
The environment map storage unit 152 stores the environment map built by the environment map building unit 150, using a storage medium such as a hard disk or a semiconductor memory.
[2-3. Output image generation unit]
The output image generation unit 180 generates an Overlapping display image by arranging Overlapping display data at the position, determined based on the environmental map, of an object having a predetermined plane or curved surface among the objects imaged in the input image, and generates an output image by superimposing the Overlapping display image on the input image. As shown in Figure 3, in the present embodiment, the output image generation unit 180 includes an Overlapping display data storage cell 181, a processing execution unit 182, an operating body recognition unit 183, an Overlapping display image generation unit 184, an Overlapping display position determination unit 186 and an image superimposition unit 188.
Figure 10A is a diagram illustrating an example of the output image generated by the image processing apparatus 100 while the user wearing the imaging device faces forward. Figure 10B is a diagram illustrating an example of the output image generated when the user looks down. Figure 10C is a diagram illustrating an example of the output image generated by selecting an item of the first stratum. Figure 10D is a diagram illustrating an example of the output image immediately before an item of the second stratum is selected. Figure 10E is a diagram illustrating an example of the output image generated by selecting an item of the second stratum. The function of each block forming the output image generation unit 180 will be described using Figures 10A to 10E.
(1) Overlapping display data storage cell
The Overlapping display data storage cell 181 stores, using a storage medium such as a hard disk or a semiconductor memory, Overlapping display data and item positions, where the Overlapping display data are the source of the image superimposed on the input image and the item positions are the positions of the items forming the Overlapping display data. A configuration example of the types of data stored in the Overlapping display data storage cell 181 will be described later with reference to Figure 11.
(2) Overlapping display position determination unit
The Overlapping display position determination unit 186 determines, based on the environmental map stored in the environmental map storage unit 152, the position of an object having a predetermined plane or curved surface among the objects imaged in the input image. For example, the Overlapping display position determination unit 186 determines the position of an object having a plane extending in a substantially horizontal direction as the position of the object having the predetermined plane or curved surface. As shown in Figure 10A, in this embodiment the Overlapping display position determination unit 186 determines the position of the ground F0 as the position of the object having a plane extending in a substantially horizontal direction; however, the object having such a plane is not limited thereto, and may be a desktop, stairs, or the like. Note that, in the output image Ima shown in Figure 10A, the ground F0, which is an example of a plane extending in a substantially horizontal direction, extends up to the boundary B0. The Overlapping display position determination unit 186 outputs the determined position to the Overlapping display image generation unit 184.
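A plane "extending in a substantially horizontal direction" can be tested, for example, by comparing the plane normal against the vertical axis of the global coordinate system. The sketch below assumes a z-up global frame and an illustrative 10-degree tolerance; neither value is prescribed by this embodiment.

```python
import numpy as np

def is_substantially_horizontal(plane_normal, tolerance_deg=10.0):
    """Return True if the plane (e.g. the ground F0, a desktop or a stair
    tread) is close enough to horizontal to host the Overlapping display."""
    n = plane_normal / np.linalg.norm(plane_normal)
    up = np.array([0.0, 0.0, 1.0])  # assumed z-up global coordinate system
    angle = np.degrees(np.arccos(np.clip(abs(n @ up), 0.0, 1.0)))
    return angle <= tolerance_deg
```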
(3) Overlapping display image generation unit
The Overlapping display image generation unit 184 generates an Overlapping display image by arranging the Overlapping display data at the position of the object determined by the Overlapping display position determination unit 186. When the position determined by the Overlapping display position determination unit 186 is the position of the ground F0, the Overlapping display image generation unit 184 generates the Overlapping display image by arranging the Overlapping display data at the position of the ground F0. Because the generated Overlapping display image is used to generate the output image, the Overlapping display image generation unit 184 outputs it to the image superimposition unit 188.
(4) image superimposition unit
The image superimposition unit 188 generates an output image by superimposing the Overlapping display image on the input image. The output image Imb shown in Figure 10B is obtained by superimposing an Overlapping display image including images such as "Email", "navigation" and "game" on an input image including an image of, for example, the ground F0. The output image Imb shown in Figure 10B also includes an image of the user's foot as an example of the user's own operating body OP. The image superimposition unit 188 then outputs the output image generated in this fashion to the display device D (or to another functional unit when needed) as the result of the processing performed by the image processing apparatus 100. The display device D is an example of a display unit and displays the output image.
The image superimposition unit 188 may output the output image at the stage where it is detected that the user looks down with a degree of tilt exceeding a predetermined value. Namely, when a sensor detecting the tilt of the imaging device detects that the user wearing the imaging device on his/her head tilts the head downward with a degree of tilt exceeding the predetermined value, the image superimposition unit 188 may cause the display device D to display the generated output image. Conversely, when the sensor does not detect that the user tilts the head downward with a degree of tilt exceeding the predetermined value, the image superimposition unit 188 may restrict the display of the generated output image by the display device D. More specifically, in that case the image superimposition unit 188 may refrain from causing the display device D to display the generated output image, and may instead cause the display device D to display the input image. In this way, when the user faces forward and does not intend to view the Overlapping display data, no output image superimposed with the Overlapping display data is generated.
When the plane of the superposition destination (for example, the ground F0) is far from the head of the user, the Overlapping display data would appear very small if superimposed on that plane, so the Overlapping display data may not be superimposed on the plane of the superposition destination. Therefore, for example, when the distance from the position of the image processing apparatus 100 detected by the self-position detecting unit 120 to the position of the object determined by the Overlapping display position determination unit 186 exceeds a predetermined value, the image superimposition unit 188 may restrict the display of the output image by the display device D. More specifically, in that case the image superimposition unit 188 may refrain from causing the display device D to display the output image, and may perform control such as causing the display device D to display the input image.
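Taken together, the two preceding paragraphs amount to a display gate on head tilt and on the distance to the superposition destination. A minimal sketch follows, with both thresholds chosen purely for illustration:

```python
def should_display_overlay(head_pitch_deg, distance_to_plane,
                           pitch_threshold_deg=30.0, max_distance_m=3.0):
    """Show the superimposed output image only when the user looks down far
    enough and the superposition-destination plane is near enough; otherwise
    the display device D would show the plain input image instead."""
    return (head_pitch_deg > pitch_threshold_deg
            and distance_to_plane <= max_distance_m)
```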
Figure 11 is a diagram illustrating an example of the data stored in the Overlapping display data storage cell 181. As shown in Figure 11, in addition to the Overlapping display data, the Overlapping display data storage cell 181 stores one or more combinations of item position information, which represents the position of each item forming the Overlapping display data, and processing content, which represents the content of the process to be performed when the item is selected; each combination of item position information and processing content is associated with the Overlapping display data.
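One way to model the associations of Figure 11 (Overlapping display data, per-item position information and per-item processing content) is sketched below; the field names and types are assumptions, not the stored format.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Item:
    name: str                      # e.g. "Email", "navigation", "game"
    offset: Tuple[float, float]    # item position information: displacement
                                   # from the Overlapping display data position
    on_select: Callable[[], None]  # processing content executed on selection

@dataclass
class OverlappingDisplayData:
    image_id: str                  # source of the image to superimpose
    items: List[Item] = field(default_factory=list)
```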
Figure 12 is a flowchart illustrating an example of the flow of the output image generating process performed by the output image generation unit 180. In Figure 12, when the output image generating process starts, first, the Overlapping display position determination unit 186 obtains the environmental map from the environmental map construction unit 150 (step S302). Then, the Overlapping display position determination unit 186 determines whether a conversion operation of the display mode has been detected (step S304). Although in the example described above the conversion operation of the display mode is performed when the user wearing the imaging device looks down, the conversion operation may also be performed when the user inputs an operation to an input device or the like. The process from step S302 to step S314 can be repeated for each input image (that is, every frame).
When the Overlapping display position determination unit 186 determines that the conversion operation of the display mode has not been detected ("No" in step S304), the process proceeds to step S314; when it determines that the conversion operation has been detected ("Yes" in step S304), the Overlapping display position determination unit 186 determines whether a plane usable for Overlapping display exists in the obtained environmental map (step S306). Although it is determined here whether a usable plane exists, it may also be determined, as described above, whether a curved surface usable for Overlapping display exists in the obtained environmental map.
When the Overlapping display position determination unit 186 determines that no plane usable for Overlapping display exists in the obtained environmental map ("No" in step S306), the process proceeds to step S314; when it determines that such a plane exists ("Yes" in step S306), the Overlapping display position determination unit 186 determines whether the area of the plane usable for Overlapping display exceeds a threshold value (step S308). When the area does not exceed the threshold value ("No" in step S308), the process proceeds to step S314; when the area exceeds the threshold value ("Yes" in step S308), the process proceeds to step S310.
In step S310, the Overlapping display image generation unit 184 obtains the Overlapping display data from the Overlapping display data storage cell 181, the image superimposition unit 188 generates the output image by superimposing the Overlapping display data on the input image (step S312), and the output image generating process completes. In step S314, the input image is used as the output image (step S314), and the output image generating process completes.
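The per-frame flow of Figure 12 can be compressed into a few lines; in this sketch the collaborators are passed in as callables because the embodiment leaves their concrete interfaces open, and the area threshold is illustrative.

```python
AREA_THRESHOLD = 0.5  # usable-plane area threshold for step S308 (illustrative)

def generate_output_image(input_image, env_map, mode_switch_detected,
                          find_overlay_plane, get_overlay_data, superimpose):
    """Steps S302-S314 of the output image generating process, per frame."""
    if mode_switch_detected:                                   # S304
        plane = find_overlay_plane(env_map)                    # S306
        if plane is not None and plane.area > AREA_THRESHOLD:  # S308 (assumes
            overlay = get_overlay_data()                       # an .area field)
            return superimpose(overlay, plane, input_image)    # S310-S312
    return input_image                                         # S314
```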
(5) operating body recognition unit
The operating body recognition unit 183 identifies the operating body OP of the user imaged in the input image. In this embodiment, the operating body recognition unit 183 identifies a foot imaged in the input image as an example of the operating body OP of the user, but a part other than the foot may be identified as the operating body OP. For example, when the Overlapping display position determination unit 186 determines the position of a desktop as the position of the object having a plane extending in a substantially horizontal direction, the operating body recognition unit 183 may recognize a hand imaged in the input image as the operating body OP of the user. This is because, when a desktop is imaged in the input image, it can be assumed that the user can easily place his/her hand on the table.
In addition, various other techniques can be assumed as techniques by which the operating body recognition unit 183 identifies the operating body OP of the user imaged in the input image. For example, the operating body recognition unit 183 may perform matching between the input image and a footwear registration image, which is an image of footwear registered in advance, and when the operating body recognition unit 183 determines that footwear matching the footwear registration image is imaged in the input image, the footwear may be identified as the operating body of the user.
In addition, when the imaging device is worn on the head by the user, the operating body recognition unit 183 may determine whether a foot imaged in the input image enters from the edge closest to the user among the edges forming the input image. In the output image Imb in Figure 10B, the foot imaged in the input image (an example of the operating body OP) is shown entering from the edge closest to the user (the edge on the bottom of Figure 10B) among the edges forming the input image. When the operating body recognition unit 183 determines that the foot enters from the edge closest to the user, the operating body recognition unit 183 may identify the foot as the operating body OP of the user.
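The edge-entry test can be approximated on a binary foot mask: for a head-worn camera looking down, the edge closest to the user is the bottom row of the image. A minimal sketch, assuming a NumPy boolean mask:

```python
import numpy as np

def enters_from_nearest_edge(foot_mask: np.ndarray) -> bool:
    """True if the detected foot region touches the bottom edge of the
    input image (the edge closest to the user, as in Figure 10B)."""
    return bool(foot_mask[-1, :].any())  # last row of an H x W boolean mask
```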
In addition, the operating body recognition unit 183 may determine whether footwear bearing a mark, to which a predetermined mark has been applied in advance, is imaged in the input image. When the operating body recognition unit 183 determines that the marked footwear is imaged in the input image, it may identify the footwear as the operating body OP of the user.
The operating body recognition unit 183 identifies the operating body OP of the user by using, for example, an ordinary image recognition technique. When the operating body OP of the user is identified, the operating body recognition unit 183 can grasp the position of the operating body OP. The operating body recognition unit 183 then outputs the position of the operating body OP of the user to the processing execution unit 182 as the result of identifying the operating body OP.
(6) processing execution unit
The processing execution unit 182 executes a process corresponding to an item selected based on the position of the operating body OP identified by the operating body recognition unit 183. Items form the Overlapping display data; in the output image Imb shown in Figure 10B, each of "Email", "navigation" and "game" represents an item. Item position information representing the position of each item is stored in the Overlapping display data storage cell 181 as shown in Figure 11, and in the output image each item is present at a position displaced from the position of the Overlapping display data by the amount represented by the item position information. The processing execution unit 182 selects, among the items present as described above, the item corresponding to the position of the operating body OP. The output image Imc shown in Figure 10C is obtained as follows: when the user moves the operating body OP to the position of the item "navigation", as the process corresponding to the item "navigation", the processing execution unit 182 superimposes on the input image the Overlapping display data for sub-items (such as "nearest station" or "convenience store").
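Resolving the operating body OP to an item can then be a nearest-neighbour test over the stored item position information. In the sketch below the offsets dictionary and the pixel hit radius are illustrative assumptions; for the menu of Figure 10B, item_offsets might be, say, {'Email': (-80, 0), 'navigation': (0, 0), 'game': (80, 0)}.

```python
def select_item(op_pos, overlay_origin, item_offsets, hit_radius=20.0):
    """Return the name of the item nearest the operating body OP, or None.
    item_offsets maps an item name (e.g. 'navigation') to its displacement
    from the Overlapping display data position, as in Figure 11."""
    best, best_dist = None, hit_radius
    for name, (dx, dy) in item_offsets.items():
        dist = ((op_pos[0] - overlay_origin[0] - dx) ** 2 +
                (op_pos[1] - overlay_origin[1] - dy) ** 2) ** 0.5
        if dist < best_dist:
            best, best_dist = name, dist
    return best
```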
Various timings can be assumed as the timing at which the processing execution unit 182 executes the process. When the user uses a foot as the operating body OP, the processing execution unit 182 may determine whether a touch sensor attached to the foot detects contact. In that case, when the touch sensor detects contact, the processing execution unit 182 may execute the process corresponding to the item selected based on the position of the foot.
In addition, the processing execution unit 182 may determine whether the operating body OP identified by the operating body recognition unit 183 has stayed at substantially the same position for a predetermined period. In that case, when the processing execution unit 182 determines that the operating body OP has stayed at substantially the same position for the predetermined period, it may execute the process corresponding to the item selected based on the position of the operating body OP. The processing execution unit 182 may also determine whether the operating body OP identified by the operating body recognition unit 183 is present in a predetermined area arranged in correspondence with an item. In that case, when the processing execution unit 182 determines that the operating body OP is present in the predetermined area, it may execute, for example, the process corresponding to the item corresponding to that area.
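The dwell-based trigger described above can be sketched as a small state machine; the dwell period and the position jitter allowance below are illustrative values, not ones fixed by the embodiment.

```python
import time

class DwellDetector:
    """Report a selection when the operating body OP stays at substantially
    the same position for a predetermined period."""

    def __init__(self, dwell_seconds=1.5, max_jitter_px=15.0):
        self.dwell_seconds = dwell_seconds
        self.max_jitter_px = max_jitter_px
        self.anchor = None   # position where the operating body settled
        self.since = None    # time at which it settled there

    def update(self, pos):
        now = time.monotonic()
        moved = (self.anchor is None or
                 ((pos[0] - self.anchor[0]) ** 2 +
                  (pos[1] - self.anchor[1]) ** 2) ** 0.5 > self.max_jitter_px)
        if moved:
            self.anchor, self.since = pos, now  # restart the dwell timer
            return False
        return now - self.since >= self.dwell_seconds
```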
In the output image Imb in Figure 10B, the Overlapping display data of the first stratum (including items such as "Email", "navigation" and "game") are superimposed, and in the output image Imc in Figure 10C, the Overlapping display data of the second stratum (including items such as "convenience store" and "nearest station") are superimposed. In this way, Overlapping display data can be hierarchically superimposed on the input image. In this case, the Overlapping display data storage cell 181 stores the other Overlapping display data displayed after the Overlapping display data, and when an item is selected by the processing execution unit 182, the Overlapping display image generation unit 184 generates a new Overlapping display image by further arranging the other Overlapping display data. Then, when the item is selected by the processing execution unit 182, the image superimposition unit 188 further superimposes the other Overlapping display data on the new Overlapping display image, thereby generating a new output image.
The output image Imd shown in Figure 10D is obtained as follows: when the user moves the operating body OP to the position of the item "nearest station", as the process corresponding to the item "nearest station", the processing execution unit 182 executes a process of running an application program for searching for a route to the nearest station. The output image Ime shown in Figure 10E illustrates an example of the search result.
In addition, a situation can also be assumed in which not all the items forming the Overlapping display data fit in the output image. In this case, a part of the Overlapping display data can be superimposed on the output image, and when the user rotates his/her head in a substantially horizontal direction, the continuation of the Overlapping display data can be superimposed on the output image. Namely, for example, when a sensor detecting the rotation of the imaging device detects that the user wearing the imaging device on the head rotates his/her head in a substantially horizontal direction, the Overlapping display image generation unit 184 may change the arrangement position of the Overlapping display data according to the degree of rotation. The Overlapping display image generation unit 184 can thereby move the Overlapping display data arranged in the Overlapping display image.
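This scrolling behaviour reduces to translating the arrangement position of the Overlapping display data by an amount proportional to the detected head rotation; in the sketch below the pixels-per-degree gain is purely illustrative.

```python
def shift_overlay_for_yaw(base_position, yaw_deg, px_per_deg=12.0):
    """Slide the arranged Overlapping display data sideways according to the
    degree of horizontal head rotation, bringing its continuation into view."""
    return (base_position[0] - yaw_deg * px_per_deg, base_position[1])
```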
Figure 13 is a flowchart illustrating an example of the flow of the item selection process performed by the output image generation unit 180. In Figure 13, when the item selection process starts, first, the operating body recognition unit 183 identifies an operating body OP (step S402). Then, the operating body recognition unit 183 determines whether the operating body OP is the operating body OP of the user (step S404). The process from step S402 to step S410 can be repeated for each input image (that is, every frame).
The operating body recognition unit 183 determines whether the identified operating body OP is the operating body OP of the user (step S404). When the operating body recognition unit 183 determines that the identified operating body OP is not the operating body OP of the user ("No" in step S404), the process returns to step S402. When the operating body recognition unit 183 determines that the identified operating body OP is the operating body OP of the user ("Yes" in step S404), the operating body recognition unit 183 specifies the position of the operating body OP (step S406).
Subsequently, the processing execution unit 182 determines whether the operating body OP has performed an operation of selecting an item (step S408). When the processing execution unit 182 determines that the operating body OP has not performed the operation of selecting an item ("No" in step S408), the process returns to step S406. When the processing execution unit 182 determines that the operating body OP has performed the operation of selecting an item ("Yes" in step S408), the processing execution unit 182 executes the process corresponding to the selected item (step S410), and the item selection process completes.
[2-4. summary of first embodiment]
According to the image processing apparatus 100 of the present embodiment, an output image obtained by superimposing Overlapping display data is generated based on an environmental map representing the three-dimensional positions of objects present in the real space. Accordingly, an output image that the user can easily view can be generated.
In addition, according to the present embodiment, the position of the operating body OP of the user can be identified, and an item forming the Overlapping display data superimposed on the input image can be selected based on the identified position. Therefore, in a configuration in which a shot image is modified by AR technology and displayed on an HMD, the operation input of the user can be facilitated.
<3. the second embodiment >
In the first embodiment, an example is described in which walls and the ground in the real space are also recognized as objects. On the other hand, when characteristic data corresponding to a wall or the ground is not defined in advance, the wall or the ground is not included in the environmental map. In this case, it is preferable to additionally recognize the wall or the ground so that the output image can be generated according to the recognition result. Therefore, in this section, a configuration example of an image processing apparatus capable of additionally recognizing a wall or the ground when they are not included in the environmental map will be described as the second embodiment of the present invention.
Figure 14 is a block diagram illustrating a configuration example of an image processing apparatus 200 according to the second embodiment. Referring to Figure 14, the image processing apparatus 200 includes an image-generating unit 102, an environmental map generation unit 210 and an output image generation unit 280.
[3-1. environmental map generation unit]
In the present embodiment, the environmental map generation unit 210 includes the self-position detecting unit 120, a characteristic storage unit 230, the image identification unit 140, the environmental map construction unit 150 and the environmental map storage unit 152.
(1) characteristic storage unit
The characteristic storage unit 230 prestores, using a storage medium such as a hard disk or a semiconductor memory, characteristic data representing the features of objects corresponding to physical objects that can be present in the real space. In the present embodiment, in addition to the data shown in Fig. 8, the characteristic data also include additional data representing whether each vertex of the polygons forming each object is likely to contact the ground or a wall. Figure 15 is an explanatory diagram illustrating a configuration example of such characteristic data.
Referring to Figure 15, the characteristic data FD2 include, as an example, an object name FD21, image data FD22 taken from six directions, patch data FD23, three-dimensional shape data FD24, ontology data FD25 and additional data FD26.
The additional data FD26 have two flags which, for each vertex of the polygons of each object defined by the polygon information included in the three-dimensional shape data FD24, represent whether the vertex is likely to contact the ground and whether the vertex is likely to contact a wall. For example, in the example shown in Figure 15, the additional data FD26 represent that the vertex A of the polygons of the object corresponding to the characteristic data FD2 is likely to contact the ground but not a wall, and that the vertex B is likely to contact a wall but not the ground. It should be noted that the vertices of the polygons may be the unique points used in the above-described processing performed by the self-position detecting unit 120 or the image identification unit 140, or may be points other than the unique points.
Figure 16 is an explanatory diagram illustrating an example of the polygons related to the characteristic data shown in Figure 15. Referring to Figure 16, three objects Obj21, Obj22 and Obj23 are shown. Among them, the object Obj21 represents a chair; among the polygon vertices corresponding to the object Obj21, the six vertices at the bottoms of the chair legs are likely to contact the ground. The object Obj22 represents a calendar; eight of the polygon vertices corresponding to the object Obj22 are likely to contact a wall. The object Obj23 represents a drawer; among the polygon vertices corresponding to the object Obj23, the four vertices located on the bottom surface of the drawer are likely to contact the ground, and the four vertices located on the rear surface of the drawer are likely to contact a wall. The additional data FD26 shown in Figure 15 define these attributes of each vertex.
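The additional data FD26 can thus be pictured as two boolean flags attached to each polygon vertex. The sketch below is one possible in-memory shape, with field names that are assumptions rather than the stored format; a chair such as Obj21 would carry six leg-bottom vertices with touches_ground set to True.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class VertexFlags:
    position: Tuple[float, float, float]  # vertex position (object frame)
    touches_ground: bool                  # 'likely to contact the ground'
    touches_wall: bool                    # 'likely to contact a wall'

@dataclass
class CharacteristicDataFD2:
    object_name: str             # FD21, e.g. "chair" (Obj21 in Figure 16)
    vertices: List[VertexFlags]  # FD24 polygon vertices with FD26 flags
```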
In the present embodiment, the characteristic storage unit 230 of the environmental map generation unit 210 stores the characteristic data including the above-described additional data, and outputs the additional data in response to a request from the location estimation unit 281.
[3-2. output image generation unit]
As shown in figure 14, in the present embodiment, output image generation unit 280 comprises location estimation unit 281, Overlapping display data storage cell 181, processing execution unit 182, operating body recognition unit 183, Overlapping display image generation unit 184, Overlapping display position determination unit 186 and image superimposition unit 188.
(1) location estimation unit
The location estimation unit 281 estimates the position of the ground or walls in the real space based on the positions of points on the surfaces of the objects represented by the environmental map and on the above-described characteristic data. In this embodiment, the points on the surfaces of the objects may be the polygon vertices corresponding to each of the above-described objects.
More specifically, for example, the location estimation unit 281 extracts, from the polygon vertices of the objects included in the environmental map input from the environmental map construction unit 150, the set of vertices indicated by the above-described characteristic data as likely to contact the ground. The location estimation unit 281 then estimates the plane corresponding to the ground based on the three-dimensional positions of the extracted set of vertices in the global coordinate system. For example, the location estimation unit 281 can estimate a plausible plane containing the set of vertices from their three-dimensional positions using the well-known Hough transform method.
Similarly, for example, the location estimation unit 281 extracts, from the polygon vertices of the objects included in the environmental map input from the environmental map construction unit 150, the set of vertices indicated by the above-described characteristic data as likely to contact a wall. The location estimation unit 281 then estimates the plane corresponding to the wall based on the three-dimensional positions of the extracted set of vertices in the global coordinate system. It should be noted that when two or more walls may be present in the real space, the location estimation unit 281 may divide the set of vertices into two or more sets according to their three-dimensional positions, and estimate the plane corresponding to a wall for each set.
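As a stand-in for the Hough-transform estimation mentioned above, a simpler least-squares plane fit over the flagged vertices illustrates the idea. The sketch assumes each environmental-map object exposes its polygon vertices with global-frame positions and the FD26 flags; none of these interfaces are prescribed by the embodiment.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points: returns (centroid, unit normal).
    The normal is the singular vector for the smallest singular value of the
    mean-centred point cloud."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def estimate_ground_plane(env_map_objects):
    """Fit one plane to every vertex flagged as likely to contact the ground;
    wall planes would be fitted analogously, after clustering the wall-flagged
    vertices into one set per wall."""
    pts = [v.position for obj in env_map_objects
           for v in obj.vertices if v.touches_ground]  # global-frame positions
    return fit_plane(pts) if len(pts) >= 3 else None
```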
The location estimation unit 281 outputs the positions of the ground and/or walls estimated in this way to the Overlapping display position determination unit 186.
[3-3. summary of second embodiment]
According to the image processing apparatus 200 of the present embodiment, an output image obtained by superimposing Overlapping display data is generated based on an environmental map representing the three-dimensional positions of objects present in the real space. Here, the position of the ground or a wall is estimated based on the positions of those points on the surfaces of the objects included in the environmental map that are likely to contact the ground or a wall.
<4. hardware configuration >
Note that it is inessential whether the series of processes according to the above-described first and second embodiments is realized by hardware or by software. When the series of processes, or a part of it, is executed by software, a program constituting the software is executed using a computer incorporated in dedicated hardware or, for example, the general-purpose computer shown in Figure 17.
In Figure 17, a central processing unit (CPU) 902 controls the overall operation of the general-purpose computer. A program or data describing a part or the entirety of the series of processes is stored in a read-only memory (ROM) 904. A program and data used by the CPU 902 when executing the processes are temporarily stored in a random access memory (RAM) 906.
The CPU 902, the ROM 904 and the RAM 906 are interconnected by a bus 910. An input/output interface 912 is also connected to the bus 910.
The input/output interface 912 is an interface connecting the CPU 902, the ROM 904 and the RAM 906 with an input device 920, a display device 922, a storage device 924, an imaging device 926 and a drive 930.
For example, the input device 920 accepts instructions and information input from the user via an input interface such as a button, a switch, a lever, a mouse or a keyboard. When the imaging device 926 is present, however, the input device 920 may be omitted. The display device 922 is composed of, for example, a cathode ray tube (CRT), a liquid crystal display, an organic light emitting diode (OLED) display or the like, and displays an image on its screen. In the above-described embodiments, the display unit incorporated in the HMD corresponds to the display device 922.
For example, the storage device 924 is composed of a hard disk drive or a semiconductor memory, and stores programs and data. The imaging device 926 corresponds to the hardware of the above-described image-generating unit 102, and images the real space using an imaging element such as a CCD or a CMOS sensor. The drive 930 is provided in the general-purpose computer as needed, and, for example, a removable medium 932 is mounted on the drive 930.
When the series of processes according to the first and second embodiments is executed by software, a program stored in, for example, the ROM 904, the storage device 924 or the removable medium 932 shown in Figure 17 is read into the RAM 906 and executed by the CPU 902.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
The present application contains subject matter related to that disclosed in Japanese Priority Patent Applications JP 2010-068269 and JP 2010-068270, both filed in the Japan Patent Office on March 24, 2010, the entire contents of which are hereby incorporated by reference.

Claims (20)

1. An image processing apparatus, comprising:
a characteristic storage unit that stores characteristic data representing features of the appearance of objects;
an Overlapping display data storage cell that stores Overlapping display data and item positions, wherein the Overlapping display data are a source of an image superimposed on a field of view of a user, and the item positions are positions of the items forming the Overlapping display data;
an environmental map generation unit that generates, based on an input image obtained by imaging a real space using an imaging device and on the characteristic data stored in the characteristic storage unit, an environmental map representing positions of one or more objects present in the real space;
a location estimation unit that estimates a position of the ground or a wall in the real space based on positions of points on the surfaces of the objects represented by the environmental map;
an Overlapping display position determination unit that determines, based on the environmental map, a position of an object having a predetermined plane or curved surface among the objects imaged in the input image, and further determines, based on the position of the ground or the wall estimated by the location estimation unit, the position of the ground as the position of the object;
an Overlapping display image generation unit that generates an Overlapping display image by arranging the Overlapping display data at the position of the ground determined by the Overlapping display position determination unit, the Overlapping display data being hierarchically superimposed on the input image;
an image superimposition unit that superimposes the Overlapping display image on the field of view of the user;
an operating body recognition unit that recognizes a foot imaged in the input image as an operating body; and
a processing execution unit that executes a process corresponding to an item selected based on a position of the operating body recognized by the operating body recognition unit.
2. The image processing apparatus according to claim 1,
wherein the operating body recognition unit performs matching between a footwear registration image, which is an image of footwear registered in advance, and the input image, and when the operating body recognition unit determines that footwear matching the footwear registration image is imaged in the input image, the operating body recognition unit recognizes the footwear as the operating body.
3. The image processing apparatus according to claim 1,
wherein, when the imaging device is worn on the head of the user, the operating body recognition unit determines whether a foot imaged in the input image enters from the edge closest to the user among the edges forming the input image, and when the operating body recognition unit determines that the foot enters from the edge closest to the user, the operating body recognition unit recognizes the foot as the operating body.
4. The image processing apparatus according to claim 1,
wherein the operating body recognition unit determines whether footwear bearing a mark, to which a predetermined mark has been applied in advance, is imaged in the input image, and when the operating body recognition unit determines that the marked footwear is imaged in the input image, the operating body recognition unit recognizes the footwear as the operating body.
5. The image processing apparatus according to claim 1,
wherein the processing execution unit determines whether a touch sensor attached to the foot detects contact, and when the touch sensor detects the contact, the processing execution unit executes the process corresponding to the item selected based on the position of the foot.
6. The image processing apparatus according to claim 1,
wherein the processing execution unit determines whether the operating body recognized by the operating body recognition unit has stayed at the same position for a predetermined period, and when the processing execution unit determines that the operating body has stayed at the same position for the predetermined period, the processing execution unit executes the process corresponding to the item selected based on the position of the operating body.
7. The image processing apparatus according to claim 1,
wherein the Overlapping display data storage cell stores other Overlapping display data displayed after the Overlapping display data,
wherein, when an item is selected by the processing execution unit, the Overlapping display image generation unit generates a new Overlapping display image by further arranging the other Overlapping display data, and
wherein, when the item is selected by the processing execution unit, the image superimposition unit further superimposes the other Overlapping display data on the new Overlapping display image.
8. The image processing apparatus according to claim 1,
wherein the image superimposition unit superimposes the Overlapping display image on the field of view of the user and causes a display unit to display the Overlapping display image.
9. The image processing apparatus according to claim 8,
wherein, when the image superimposition unit detects, by using a sensor detecting the degree of tilt of the imaging device, that the user wearing the imaging device on his/her head tilts the head downward with a degree of tilt exceeding a predetermined value, the image superimposition unit causes the display unit to display the Overlapping display image, and when the image superimposition unit does not detect, by using the sensor, that the user tilts the head downward with a degree of tilt exceeding the predetermined value, the image superimposition unit restricts the display of the Overlapping display image by the display unit.
10. The image processing apparatus according to claim 1,
wherein the Overlapping display position determination unit determines a position of an object having a plane extending in a horizontal direction as the position of the object having the predetermined plane or curved surface.
11. The image processing apparatus according to claim 1,
wherein the characteristic data include, for each of one or more points on the surface of each object, data representing whether the point contacts the ground or a wall in the real space, and
wherein the location estimation unit estimates the position of the ground or the wall in the real space further based on the characteristic data.
12. An image processing method performed by an image processing apparatus including a characteristic storage unit that stores characteristic data representing features of the appearance of objects, an Overlapping display data storage cell that stores Overlapping display data and item positions, an environmental map generation unit, a location estimation unit, an Overlapping display position determination unit, an Overlapping display image generation unit, an image superimposition unit, an operating body recognition unit and a processing execution unit, wherein the Overlapping display data are a source of an image superimposed on a field of view of a user and the item positions are positions of the items forming the Overlapping display data, the image processing method comprising the steps of:
generating, by the environmental map generation unit, based on an input image obtained by imaging a real space using an imaging device and on the characteristic data stored in the characteristic storage unit, an environmental map representing positions of one or more objects present in the real space;
estimating, by the location estimation unit, a position of the ground or a wall in the real space based on positions of points on the surfaces of the objects represented by the environmental map;
determining, by the Overlapping display position determination unit, based on the environmental map, a position of an object having a predetermined plane or curved surface among the objects imaged in the input image, and further determining, based on the position of the ground or the wall estimated by the location estimation unit, the position of the ground as the position of the object;
generating, by the Overlapping display image generation unit, an Overlapping display image by arranging the Overlapping display data at the position of the ground determined by the Overlapping display position determination unit, the Overlapping display data being hierarchically superimposed on the input image;
superimposing, by the image superimposition unit, the Overlapping display image on the field of view of the user;
recognizing, by the operating body recognition unit, a foot imaged in the input image as an operating body; and
executing, by the processing execution unit, a process corresponding to an item selected based on a position of the operating body recognized by the operating body recognition unit.
13. An image processing apparatus, comprising:
a characteristic storage unit that stores characteristic data representing features of the appearance of objects;
an Overlapping display data storage cell that stores Overlapping display data serving as a source of an image superimposed on a field of view of a user;
an environmental map generation unit that generates, based on an input image obtained by imaging a real space using an imaging device and on the characteristic data stored in the characteristic storage unit, an environmental map representing positions of one or more objects present in the real space;
a location estimation unit that estimates a position of the ground or a wall in the real space based on positions of points on the surfaces of the objects represented by the environmental map;
an Overlapping display position determination unit that determines, based on the environmental map, a position of an object having a predetermined plane or curved surface among the objects imaged in the input image, and further determines, based on the position of the ground or the wall estimated by the location estimation unit, the position of the ground as the position of the object;
an Overlapping display image generation unit that generates an Overlapping display image by arranging the Overlapping display data at the position of the ground determined by the Overlapping display position determination unit, the Overlapping display data being hierarchically superimposed on the input image; and
an image superimposition unit that superimposes the Overlapping display image on the field of view of the user and causes a display unit to display the Overlapping display image.
14. The image processing apparatus according to claim 13,
wherein, when the image superimposition unit detects, by using a sensor detecting the degree of tilt of the imaging device, that the user wearing the imaging device on his/her head tilts the head downward with a degree of tilt exceeding a predetermined value, the image superimposition unit causes the display unit to display the Overlapping display image, and when the image superimposition unit does not detect, by using the sensor, that the user tilts the head downward with a degree of tilt exceeding the predetermined value, the image superimposition unit restricts the display of the Overlapping display image by the display unit.
15. The image processing apparatus according to claim 14, further comprising
a self-position detecting unit that dynamically detects a position of the image processing apparatus based on the input image and the characteristic data,
wherein, when the distance from the position of the image processing apparatus detected by the self-position detecting unit to the position of the object determined by the Overlapping display position determination unit exceeds a predetermined value, the image superimposition unit restricts the display of the Overlapping display image by the display unit.
16. The image processing apparatus according to claim 13,
wherein the Overlapping display position determination unit determines a position of an object having a plane extending in a horizontal direction as the position of the object having the predetermined plane or curved surface.
17. The image processing apparatus according to claim 16,
wherein the Overlapping display position determination unit determines a position of at least one of the ground, a desktop and stairs as the position of the object having the plane extending in the horizontal direction.
18. The image processing apparatus according to claim 13,
wherein, when the Overlapping display image generation unit detects, by using a sensor detecting the rotation of the imaging device, that the user wearing the imaging device on his/her head rotates the head in a horizontal direction, the Overlapping display image generation unit moves the Overlapping display data arranged in the Overlapping display image by changing the arrangement position of the Overlapping display data according to the degree of the rotation.
19. The image processing apparatus according to claim 13,
wherein the characteristic data include, for each of one or more points on the surface of each object, data representing whether the point contacts the ground or a wall in the real space, and
wherein the location estimation unit estimates the position of the ground or the wall in the real space further based on the characteristic data.
20. An image processing method performed by an image processing apparatus including a characteristic storage unit that stores characteristic data representing features of the appearance of objects, an Overlapping display data storage cell that stores Overlapping display data serving as a source of an image superimposed on a field of view of a user, an environmental map generation unit, a location estimation unit, an Overlapping display position determination unit, an Overlapping display image generation unit and an image superimposition unit, the image processing method comprising the steps of:
generating, by the environmental map generation unit, based on an input image obtained by imaging a real space using an imaging device and on the characteristic data stored in the characteristic storage unit, an environmental map representing positions of one or more objects present in the real space;
estimating, by the location estimation unit, a position of the ground or a wall in the real space based on positions of points on the surfaces of the objects represented by the environmental map;
determining, by the Overlapping display position determination unit, based on the environmental map, a position of an object having a predetermined plane or curved surface among the objects imaged in the input image, and further determining, based on the position of the ground or the wall estimated by the location estimation unit, the position of the ground as the position of the object;
generating, by the Overlapping display image generation unit, an Overlapping display image by arranging the Overlapping display data at the position of the ground determined by the Overlapping display position determination unit, the Overlapping display data being hierarchically superimposed on the input image; and
superimposing, by the image superimposition unit, the Overlapping display image on the field of view of the user, and causing, by the image superimposition unit, a display unit to display the Overlapping display image.
CN201110069689.XA 2010-03-24 2011-03-17 Image processing apparatus and image processing method Active CN102200881B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2010068269A JP2011203823A (en) 2010-03-24 2010-03-24 Image processing device, image processing method and program
JP2010068270A JP2011203824A (en) 2010-03-24 2010-03-24 Image processing device, image processing method and program
JP2010-068270 2010-03-24
JP2010-068269 2010-03-24

Publications (2)

Publication Number Publication Date
CN102200881A CN102200881A (en) 2011-09-28
CN102200881B true CN102200881B (en) 2016-01-13

Family

ID=44661594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110069689.XA Active CN102200881B (en) 2010-03-24 2011-03-17 Image processing apparatus and image processing method

Country Status (1)

Country Link
CN (1) CN102200881B (en)
